Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damevski, Kostadin
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
Managing Scientific Software Complexity with Bocca and CCA
Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...
2008-01-01
In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha
2012-10-19
The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svetlana Shasharina
The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to make them more usable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Robert C.; Ray, Jaideep; Malony, A.
2003-11-01
We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and construct performance models for two of them. Both computational and message-passing performance are addressed.
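The paper's measurement infrastructure is not reproduced here; as a rough sketch of the underlying idea (per-component timing gathered by a generic instrumentation layer), the following Python fragment is illustrative only, and the decorator and component names are hypothetical:

```python
import time
from collections import defaultdict

# Accumulated wall-clock time per component (hypothetical instrumentation).
timings = defaultdict(float)

def instrument(component_name):
    """Decorator that records cumulative execution time for a component."""
    def wrap(fn):
        def timed(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                timings[component_name] += time.perf_counter() - start
        return timed
    return wrap

@instrument("integrator")
def integrate_step(state, dt):
    # Stand-in for a numerical kernel inside one component.
    return [x + dt for x in state]

state = [0.0] * 1000
for _ in range(100):
    state = integrate_step(state, 0.01)
print(dict(timings))
```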
Bonded repair of composite aircraft structures: A review of scientific challenges and opportunities
NASA Astrophysics Data System (ADS)
Katnam, K. B.; Da Silva, L. F. M.; Young, T. M.
2013-08-01
Advanced composite materials have gained popularity in high-performance structural designs such as aerospace applications that require lightweight components with superior mechanical properties in order to perform in demanding service conditions as well as provide energy efficiency. However, one of the major challenges that the aerospace industry faces with advanced composites - because of their inherent complex damage behaviour - is structural repair. Composite materials are primarily damaged by mechanical loads and/or environmental conditions. If material damage is not extensive, structural repair is the only feasible solution, as replacing the entire component is not cost-effective in many cases. Bonded composite repairs (e.g. scarf patches) are generally preferred as they provide enhanced stress transfer mechanisms, joint efficiencies and aerodynamic performance. With the increased usage of advanced composites in primary and secondary aerospace structural components, it is thus essential to have robust, reliable and repeatable structural bonded repair procedures to restore damaged composite components. But structural bonded repairs, especially with primary structures, pose several scientific challenges with existing repair technologies. In this regard, the area of structural bonded repair of composites is broadly reviewed - starting from damage assessment to automation - to identify current scientific challenges and future opportunities.
A study of Korean students' creativity in science using structural equation modeling
NASA Astrophysics Data System (ADS)
Jo, Son Mi
Through a review of creativity research I have found that studies lack certain crucial parts: (a) a theoretical framework for the study of creativity in science, (b) studies considering the unique components related to scientific creativity, and (c) studies of the interactions among key components through simultaneous analyses. The primary purpose of this study is to explore the dynamic interactions among four components (scientific proficiency, intrinsic motivation, creative competence, context supporting creativity) related to scientific creativity under the framework of scientific creativity. A total of 295 Korean middle school students participated. Well-known and commonly used measurements were selected and developed. Two scientific achievement scores and one score measured by performance-based assessment were used to measure student scientific knowledge/inquiry skills. Six items selected from the study of Lederman, Abd-El-Khalick, Bell, and Schwartz (2002) were used to assess how well students understand the nature of science. Five items were selected from the subscale of the scientific attitude inventory version II (Moore & Foy, 1997) to assess student attitude toward science. The Test of Creative Thinking-Drawing Production (Urban & Jellen, 1996) was used to measure creative competence. Eight items chosen from the 15 items of the Work Preference Inventory (1994) were applied to measure students' intrinsic motivation. To assess the level of context supporting creativity, eight items were adapted from a measurement of the work environment (Amabile, Conti, Coon, Lazenby, & Herron, 1996). To assess scientific creativity, one open-ended science problem was used, and three raters rated the level of scientific creativity through the Consensual Assessment Technique (Amabile, 1996). The results show that scientific proficiency and creative competence correlate with scientific creativity. Intrinsic motivation and context components do not predict scientific creativity. The strengths of the relationships between scientific proficiency and scientific creativity (parameter estimate = 0.43) and between creative competence and scientific creativity (parameter estimate = 0.17) are similar [χ²(1) = 0.670, p > .05]. In a specific analysis of the structural model, I found that creative competence and scientific proficiency act as partial mediators among three components (general creativity, scientific proficiency, and scientific creativity). The moderating effects of intrinsic motivation and the context component were investigated, but no moderation effects were found.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
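As a toy illustration of searching a parameter space in which some parameters affect only performance while others trade output quality for speed, the sketch below enumerates configurations under an accuracy constraint. The cost model and parameter names are invented for illustration and are not part of the paper's framework:

```python
import itertools

# Hypothetical parameter space: chunk size affects only runtime, while a
# quality knob trades output accuracy for speed.
chunk_sizes = [64, 128, 256]
quality_levels = [0.5, 0.75, 1.0]

def run_workflow(chunk, quality):
    # Stand-in cost model: larger chunks amortize per-chunk overhead,
    # lower quality shrinks the work done per element.
    runtime = 1000.0 / chunk + 50.0 * quality
    accuracy = quality
    return runtime, accuracy

best = None
for chunk, quality in itertools.product(chunk_sizes, quality_levels):
    runtime, accuracy = run_workflow(chunk, quality)
    if accuracy >= 0.75:  # quality constraint set by the user
        if best is None or runtime < best[0]:
            best = (runtime, chunk, quality)

print("fastest configuration meeting the accuracy bound:", best)
```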
UMAMI: A Recipe for Generating Meaningful Metrics through Holistic I/O Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lockwood, Glenn K.; Yoo, Wucherl; Byna, Suren
I/O efficiency is essential to productivity in scientific computing, especially as many scientific domains become more data-intensive. Many characterization tools have been used to elucidate specific aspects of parallel I/O performance, but analyzing components of complex I/O subsystems in isolation fails to provide insight into critical questions: how do the I/O components interact, what are reasonable expectations for application performance, and what are the underlying causes of I/O performance problems? To address these questions while capitalizing on existing component-level characterization tools, we propose an approach that combines on-demand, modular synthesis of I/O characterization data into a unified monitoring and metrics interface (UMAMI) to provide a normalized, holistic view of I/O behavior. We evaluate the feasibility of this approach by applying it to a month-long benchmarking study on two distinct large-scale computing platforms. We present three case studies that highlight the importance of analyzing application I/O performance in context with both contemporaneous and historical component metrics, and we provide new insights into the factors affecting I/O performance. By demonstrating the generality of our approach, we lay the groundwork for a production-grade framework for holistic I/O analysis.
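UMAMI's actual interface is not shown in the abstract; the following minimal pandas sketch only illustrates the general idea of normalizing current component-level I/O metrics against their own history to form a unified view. All metric names and values here are invented:

```python
import pandas as pd

# Hypothetical component-level metrics for one application run, alongside
# historical values of the same metrics from earlier runs.
current = {"ior_bandwidth_GBps": 2.1, "server_cpu_load": 0.82, "md_ops_per_s": 1.2e4}
history = pd.DataFrame({
    "ior_bandwidth_GBps": [3.0, 2.8, 3.2, 2.9],
    "server_cpu_load":    [0.45, 0.52, 0.40, 0.50],
    "md_ops_per_s":       [1.5e4, 1.4e4, 1.6e4, 1.5e4],
})

# Normalize each current metric as a z-score against its own history so
# that heterogeneous component metrics become comparable in one view.
summary = pd.DataFrame({
    "current": pd.Series(current),
    "hist_mean": history.mean(),
    "hist_std": history.std(),
})
summary["z_score"] = (summary["current"] - summary["hist_mean"]) / summary["hist_std"]
print(summary)
```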
A Principal Component Analysis of 39 Scientific Impact Measures
Bollen, Johan; Van de Sompel, Herbert
2009-01-01
Background The impact of scientific publications has traditionally been expressed in terms of citation counts. However, scientific activity has moved online over the past decade. To better capture scientific impact in the digital era, a variety of new impact measures have been proposed on the basis of social network analysis and usage log data. Here we investigate how these new measures relate to each other, and how accurately and completely they express scientific impact. Methodology We performed a principal component analysis of the rankings produced by 39 existing and proposed measures of scholarly impact that were calculated on the basis of both citation and usage log data. Conclusions Our results indicate that the notion of scientific impact is a multi-dimensional construct that cannot be adequately measured by any single indicator, although some measures are more suitable than others. The commonly used citation Impact Factor is not positioned at the core of this construct, but at its periphery, and should thus be used with caution. PMID:19562078
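A minimal sketch of the methodology, assuming a randomly generated stand-in matrix of journal rankings rather than the study's citation and usage data, might run the principal component analysis like this:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Stand-in data: rankings of 500 journals under 39 measures. Citation-like
# and usage-like measures are generated to be internally correlated,
# mimicking clusters of related impact measures.
base_citation = rng.random(500)
base_usage = rng.random(500)
measures = np.column_stack(
    [base_citation + 0.1 * rng.random(500) for _ in range(20)]
    + [base_usage + 0.1 * rng.random(500) for _ in range(19)]
)
rankings = measures.argsort(axis=0).argsort(axis=0)  # rank-transform each column

pca = PCA()
pca.fit(rankings)
print("variance explained by first two components:",
      pca.explained_variance_ratio_[:2].round(3))
```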
Experimental evaluation of a COTS system for space applications
NASA Technical Reports Server (NTRS)
Some, R. R.; Madeira, H.; Moreira, F.; Costa, D.; Rennels, D.
2002-01-01
The use of COTS-based systems in space missions for scientific data processing is very attractive, as the performance-to-power-consumption ratio of commercial components can be an order of magnitude greater than that of radiation-hardened components, and the price differential is even higher.
Cross-language Babel structs—making scientific interfaces more efficient
NASA Astrophysics Data System (ADS)
Prantl, Adrian; Ebner, Dietmar; Epperly, Thomas G. W.
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly.
Exploring the Utility of a Virtual Performance Assessment
ERIC Educational Resources Information Center
Clarke-Midura, Jody; Code, Jillianne; Zap, Nick; Dede, Chris
2011-01-01
With funding from the Institute of Education Sciences (IES), the Virtual Performance Assessment project at the Harvard Graduate School of Education is developing and studying the feasibility of immersive virtual performance assessments (VPAs) to assess scientific inquiry of middle school students as a standardized component of an accountability…
What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components
NASA Astrophysics Data System (ADS)
Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.
2013-12-01
Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis' in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. By developing use cases and prototypes within these teams, we have undertaken formal evaluations of software components developed by programmers with widely varying levels of expertise, and have been able to discover and characterize a number of anti-patterns. Our evaluation methodology and testbed have also enabled us to assess the efficacy of strategies to address these anti-patterns according to scientifically relevant metrics, such as ability of algorithms to perform faster than the rate of data acquisition and the accuracy of workflow component output relative to ground truth. The set of anti-patterns and solutions we have identified augments the body of better-known software engineering anti-patterns by addressing additional concerns that obtain when a software component has to function as part of a workflow assembled out of independently-developed codebases. Our experience shows that identifying and resolving these anti-patterns reduces development time and improves performance without reducing component reusability.
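The 'batch-itis' anti-pattern and its fix can be made concrete in a few lines. The sketch below is a generic Python illustration (not code from the project) of restructuring a batch loop so that one bad unit of work no longer discards the rest:

```python
import logging

def process(item):
    """Stand-in per-item analysis; fails on invalid input."""
    if item < 0:
        raise ValueError(f"bad item: {item}")
    return item ** 0.5

items = [4.0, -1.0, 9.0]

# Anti-pattern ("batch-itis"): one failure aborts the whole batch,
# discarding work on items that were never affected by the failure.
def batch_process_fragile(items):
    return [process(x) for x in items]  # raises at -1.0, never reaches 9.0

# Fix: treat each unit of work independently; a workflow engine (or this
# simple loop) can then retry or skip failed items individually.
def batch_process_robust(items):
    results = {}
    for x in items:
        try:
            results[x] = process(x)
        except ValueError:
            logging.warning("skipping failed item %r", x)
    return results

print(batch_process_robust(items))  # {4.0: 2.0, 9.0: 3.0}
```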
New project to support scientific collaboration electronically
NASA Astrophysics Data System (ADS)
Clauer, C. R.; Rasmussen, C. E.; Niciejewski, R. J.; Killeen, T. L.; Kelly, J. D.; Zambre, Y.; Rosenberg, T. J.; Stauning, P.; Friis-Christensen, E.; Mende, S. B.; Weymouth, T. E.; Prakash, A.; McDaniel, S. E.; Olson, G. M.; Finholt, T. A.; Atkins, D. E.
A new multidisciplinary effort is linking research in the upper atmospheric and space, computer, and behavioral sciences to develop a prototype electronic environment for conducting team science worldwide. A real-world electronic collaboration testbed has been established to support scientific work centered around the experimental operations being conducted with instruments from the Sondrestrom Upper Atmospheric Research Facility in Kangerlussuaq, Greenland. Such group computing environments will become an important component of the National Information Infrastructure initiative, which is envisioned as the high-performance communications infrastructure to support national scientific research.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Pugmire, David; Rogers, David
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geveci, Berk; Maynard, Robert
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators, combining their respective features into a new visualization toolkit called VTK-m.
XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank
The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.
Bill would bolster science at EPA
NASA Astrophysics Data System (ADS)
Showstack, Randy
Since its establishment in 1970, the U.S. Environmental Protection Agency (EPA) has primarily served as a regulatory agency with a significant science component. However, the agency's scientific practices and performance at times have been criticized by the U.S. General Accounting Office, the National Academy of Sciences (NAS), Congress, and EPA's own science advisory board, as well as in a number of lawsuits. New legislation introduced by Rep. Vernon Ehlers (R-Mich.), chair of the House of Representatives' Science Subcommittee on Environment, Technology, and Standards, includes measures meant to improve the agency's science component. The legislation, H.R. 64, would require the president to appoint an EPA deputy administrator for science and technology. This deputy administrator, who would rank higher than existing assistant administrators (AAs), would be responsible for the overall scientific and technical foundation of the agency's decisions, including ensuring that the agency's scientific endeavors use the best possible peer review and research planning practices.
NASA Technical Reports Server (NTRS)
Said, Magdi A; Schur, Willi W.; Gupta, Amit; Mock, Gary N.; Seyam, Abdelfattah M.; Theyson, Thomas
2004-01-01
Science and technology development from balloon-borne telescopes and experiments is a rich return on a relatively modest involvement of NASA resources. For the past three decades, the development of increasingly competitive and complex science payloads and observational programs from high altitude balloon-borne platforms has yielded significant scientific discoveries. The success and capabilities of scientific balloons are closely related to advancements in the textile and plastic industries. This paper will present an overview of scientific balloons as a viable and economical platform for transporting large telescopes and scientific instruments to the upper atmosphere to conduct scientific missions. Additionally, the paper sheds light on the problems associated with ultraviolet (UV) degradation of the high-performance textile components that are used to support the payload of the balloon and proposes future research to reduce or eliminate UV degradation in order to enable long-term scientific missions.
Theeven, Patrick J R; Hemmen, Bea; Brink, Peter R G; Smeets, Rob J E M; Seelen, Henk A M
2013-11-27
The effectiveness of microprocessor-controlled prosthetic knee joints (MPKs) has been assessed using a variety of outcome measures in a variety of health and health-related domains. However, if the patient is to receive a prosthetic knee joint that enables him to function optimally in daily life, it is vital that the clinician has adequate information about the effects of that particular component on all aspects of a person's functioning. Information concerning activities and participation is especially important, as this component of functioning closely describes the person's ability to function with the prosthesis in daily life. The present study aimed to review the outcome measures that have been utilized to assess the effects of microprocessor-controlled prosthetic knee joints (MPKs), in comparison with mechanically controlled prosthetic knee joints, and aimed to classify these measures according to the components and categories of functioning defined by the International Classification of Functioning, Disability and Health (ICF). Subsequently, the gaps in the scientific evidence regarding the effectiveness of MPKs were determined. A systematic literature search in 6 databases (i.e. PubMed, CINAHL, Cochrane Library, Embase, Medline and PsychInfo) identified scientific studies that compared the effects of using MPKs with mechanically controlled prosthetic knee joints on persons' functioning. The outcome measures that have been utilized in those studies were extracted and categorized according to the ICF framework. Also, a descriptive analysis of all studies was performed. A total of 37 studies and 72 outcome measures were identified. The majority (67%) of the outcome measures that described the effects of using an MPK on persons' actual performance with the prosthesis covered the ICF body functions component. Only 31% of the measures on persons' actual performance investigated how an MPK may affect performance in daily life. Research also typically focused on young, fit and active persons. Scientifically valid evidence regarding the performance of persons with an MPK in everyday life is limited. Future research should specifically focus on activities and participation to increase the understanding of the possible functional added value of MPKs.
Scientific expertise and the Athlete Biological Passport: 3 years of experience.
Schumacher, Yorck Olaf; d'Onofrio, Giuseppe
2012-06-01
Expert evaluation of biological data is a key component of the Athlete Biological Passport approach in the fight against doping. The evaluation consists of a longitudinal assessment of biological variables to determine the probability of the data being physiological on the basis of the athlete's own previous values (performed by an automated software system using a Bayesian model) and a subjective evaluation of the results in view of possible causes (performed by experts). The role of the expert is therefore a key component in the process. Experts should be qualified to evaluate the data regarding possible explanations related to the influence of doping products and methods, analytical issues, and the influence of exercise or pathological conditions. The evaluation provides a scientific basis for the decision taken by a disciplinary panel. This evaluation should therefore encompass and balance all possible causes for a given blood profile and provide a likelihood for potential scenarios (pathology, normal variation, doping) that might have caused the pattern. It should comply with the standards for the evaluation of scientific evidence in forensics. On the basis of their evaluation of profiles, experts might provide assistance in planning appropriate target testing schemes.
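For illustration only, the longitudinal idea can be sketched as follows. Note that the real Athlete Biological Passport uses an adaptive Bayesian model, whereas this fragment applies a plain z-score heuristic to invented baseline values:

```python
import statistics
from math import erf, sqrt

# Simplified illustration, NOT the actual Passport algorithm: flag a new
# measurement against the athlete's own historical distribution.
previous_hgb = [14.8, 15.1, 14.9, 15.0, 15.2]  # athlete's baseline (g/dL)
new_value = 17.2

mean = statistics.mean(previous_hgb)
sd = statistics.stdev(previous_hgb)
z = (new_value - mean) / sd

# Two-sided tail probability of a value this extreme if the new
# measurement came from the same (assumed normal) distribution.
p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
if p < 0.001:
    print(f"z = {z:.1f}, approx. p = {p:.2g} -> refer to expert review")
else:
    print(f"z = {z:.1f}: within expected individual variation")
```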
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability to scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
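To give a flavor of TDD applied to a numerical routine, the sketch below (a generic example, not the ice sheet component) tests a quadrature implementation against a closed-form value, with a tolerance derived from a realistic error estimate:

```python
import math
import pytest  # assumed test runner; any xUnit-style framework works

def trapezoid_integrate(f, a, b, n):
    """Numerical routine under test: composite trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * total

def test_against_closed_form():
    # Closed-form comparison value: the integral of sin(x) on [0, pi] is 2.
    approx = trapezoid_integrate(math.sin, 0.0, math.pi, 1000)
    # Realistic error estimate: the trapezoid rule's error is O(h^2).
    h = math.pi / 1000
    assert approx == pytest.approx(2.0, abs=h ** 2)
```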
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, R. G.; Mcnabb, D.; Kumar, M.
The National Nuclear Security Administration (NNSA) has recently recognized that a long-term need exists to establish a stronger scientific basis for the assessment and qualification of materials and manufacturing processes for the nuclear stockpile and other national security applications. These materials may have undergone substantial changes with age, or may represent new materials that are being introduced because of difficulties associated with reusing or recreating materials used in original stockpile components. Also, with advancements in manufacturing methods, the NNSA anticipates opportunities for an enhanced range of control over fabricated components, an enhanced pace of materials development, and enhanced functionality. The development of qualification standards for these new materials will require the ability to understand and control material characteristics that affect both mechanical and dynamic performance. A unique aspect for NNSA is that the performance requirements for materials are often set by system hydrodynamics, and these materials must perform in extreme environments and loading conditions. Thus, the scientific motivation is to understand "Matter-Radiation Interactions in Extremes (MaRIE)."
Masic, Izet
2016-01-01
The nature of performing scientific research is a process that has several different components: identifying the key research question(s), choosing the scientific approach for the study and data collection, analyzing the data, and finally reporting on the results. Generally, peer review is a series of procedures in the evaluation of a creative work or performance by other people who work in the same or a related field, with the aim of maintaining and improving the quality of work or performance in that field. The assessment of the achievement of every scientist, and thus indirectly the determination of his reputation in the scientific community, is done through publications, especially journals, via the so-called impact factor index. The impact factor predicts or estimates how many annual citations an article may receive after its publication. Evaluation of scientific productivity and assessment of the published articles of researchers and scientists can be made through the so-called H-index. The quality of the published results of scientific work largely depends on the knowledge sources that are used in its preparation, which means that attention should be paid to the purpose and relevance of the information used. Scientometrics as a field of science covers all the aforementioned issues, and scientometric analysis is obligatory for quality assessment of the scientific validity of published articles and other types of publications. PMID:26985429
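The H-index mentioned above has a simple operational definition: the largest h such that the author has h papers with at least h citations each. A short function makes this concrete:

```python
def h_index(citations):
    """Largest h such that h papers each have >= h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A scientist with papers cited [10, 8, 5, 4, 3] has an h-index of 4:
# four papers have at least four citations each, but not five with five.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```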
NASA Astrophysics Data System (ADS)
Wiwin, E.; Kustijono, R.
2018-03-01
The purpose of the study is to describe the use of a Physics practicum to train the science process skills of vocational high school students and its effect on their scientific attitudes. The components of science process skills are: observing, classifying, inferring, predicting, and communicating. The scientific attitudes examined are: curiosity, honesty, collaboration, responsibility, and open-mindedness. This is an experimental study with a one-shot case study design. The subjects are 30 Multimedia Program students of SMK Negeri 12 Surabaya. The data collection techniques used are observation and performance tests. The scores for science process skills and scientific attitudes are taken from observational and performance instruments. The data analyses used are descriptive statistics and correlation. The results show that: 1) the physics practicum can train science process skills and scientific attitudes to a good level, 2) the relationship between science process skills and the students' scientific attitudes is in the good category, and 3) student responses to the learning process using the practicum are in the good category. The research concludes that the physics practicum can train science process skills and has a significant effect on the scientific attitudes of vocational high school students.
Herrmann-Lingen, Christoph; Brunner, Edgar; Hildenbrand, Sibylle; Loew, Thomas H.; Raupach, Tobias; Spies, Claudia; Treede, Rolf-Detlef; Vahl, Christian-Friedrich; Wenz, Hans-Jürgen
2014-01-01
Objective: The evaluation of medical research performance is a key prerequisite for the systematic advancement of medical faculties, research foci, academic departments, and individual scientists’ careers. However, it is often based on vaguely defined aims and questionable methods and can thereby lead to unwanted regulatory effects. The current paper aims at defining the position of German academic medicine toward the aims, methods, and consequences of its evaluation. Methods: During the Berlin Forum of the Association of the Scientific Medical Societies in Germany (AWMF) held on 18 October 2013, international experts presented data on methods for evaluating medical research performance. Subsequent discussions among representatives of relevant scientific organizations and within three ad-hoc writing groups led to a first draft of this article. Further discussions within the AWMF Committee for Evaluation of Performance in Research and Teaching and the AWMF Executive Board resulted in the final consented version presented here. Results: The AWMF recommends modifications to the current system of evaluating medical research performance. Evaluations should follow clearly defined and communicated aims and consist of both summative and formative components. Informed peer reviews are valuable but feasible in longer time intervals only. They can be complemented by objective indicators. However, the Journal Impact Factor is not an appropriate measure for evaluating individual publications or their authors. The scientific “impact” rather requires multidimensional evaluation. Indicators of potential relevance in this context may include, e.g., normalized citation rates of scientific publications, other forms of reception by the scientific community and the public, and activities in scientific organizations, research synthesis and science communication. In addition, differentiated recommendations are made for evaluating the acquisition of third-party funds and the promotion of junior scientists. Conclusions: With the explicit recommendations presented in the current position paper, the AWMF suggests enhancements to the practice of evaluating medical research performance by faculties, ministries and research funding organizations. PMID:24971044
The women in science and engineering scholars program
NASA Technical Reports Server (NTRS)
Falconer, Etta Z.; Guy, Lori Ann
1989-01-01
The Women in Science and Engineering Scholars Program provides scientifically talented women students, including those from groups underrepresented in the scientific and technical work force, with the opportunity to pursue undergraduate studies in science and engineering in the highly motivating and supportive environment of Spelman College. It also exposes students to research training at NASA Centers during the summer. The program provides an opportunity for students to increase their knowledge of career opportunities at NASA and to strengthen their motivation through exposure to NASA women scientists and engineers as role models. An extensive counseling and academic support component to maximize academic performance supplements the instructional and research components. The program is designed to increase the number of women scientists and engineers with graduate degrees, particularly those with an interest in a career with NASA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D; Shende, Sameer
The primary goal of the University of Oregon's DOE "competitiveness" project was to create performance technology that embodies and supports knowledge of performance data, analysis, and diagnosis in parallel performance problem solving. The target of our development activities was the TAU Performance System, and the technology accomplishments reported in this and prior reports have all been incorporated in the TAU open software distribution. In addition, the project has been committed to maintaining strong interactions with the DOE SciDAC Performance Engineering Research Institute (PERI) and Center for Technology for Advanced Scientific Component Software (TASCS). This collaboration has proved valuable for translation of our knowledge-based performance techniques to parallel application development and performance engineering practice. Our outreach has also extended to the DOE Advanced CompuTational Software (ACTS) collection and project. Throughout the project we have participated in the PERI and TASCS meetings, as well as the ACTS annual workshops.
NASA Technical Reports Server (NTRS)
Truscello, V.
1972-01-01
A major concern in the integration of a radioisotope thermoelectric generator (RTG) with a spacecraft designed to explore the outer planets is the effect of the emitted radiation on the normal operation of scientific instruments. The necessary techniques and tools were developed to allow accurate calculation of the neutron and gamma spectra emanating from the RTG. The specific sources of radiation were identified and quantified. Monte Carlo techniques are then employed to perform the nuclear transport calculations. The results of these studies are presented. An extensive experimental program was initiated to measure the response of a number of scientific components to the nuclear radiation.
NASA Technical Reports Server (NTRS)
Matijevic, J. R.; Bickler, D. B.; Braun, D. F.; Eisen, H. J.; Matthies, L. H.; Mishkin, A. H.; Stone, H. W.; van Nieuwstadt, L. M.; Wen, L. C.; Wilcox, B. H.;
1996-01-01
An exciting scientific component of the Pathfinder mission is the rover, which will act as a mini-field geologist by providing us with access to samples for chemical analyses and close-up images of the Martian surface, performing active experiments to modify the surface and study the results, and exploring the landing site area.
Aeronautical Engineering: A continuing bibliography with indexes (supplement 188)
NASA Technical Reports Server (NTRS)
1985-01-01
This bibliography lists 477 reports, articles and other documents introduced into the NASA scientific and technical information system in May 1985. The coverage includes documents on the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment and systems.
Queuing Models of Tertiary Storage
NASA Technical Reports Server (NTRS)
Johnson, Theodore
1996-01-01
Large scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observation System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
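As a minimal illustration of the queuing approach, an M/M/1 model of a single tape drive (with invented service parameters, not EOSDIS measurements) yields utilization and mean response time:

```python
# M/M/1 approximation for one tape drive in a robotic storage library.
# Service time combines the robot media exchange and the file transfer;
# the numbers below are illustrative only.
arrival_rate = 10.0 / 3600          # requests per second (10 per hour)
mount_time = 30.0                   # robot exchange + load, seconds
transfer_time = 60.0                # read time per request, seconds

service_time = mount_time + transfer_time
utilization = arrival_rate * service_time          # rho = lambda * E[S]
assert utilization < 1, "drive is saturated; queue grows without bound"

# Mean response time (waiting + service) for M/M/1: W = E[S] / (1 - rho)
response_time = service_time / (1 - utilization)
print(f"utilization = {utilization:.2f}, mean response = {response_time:.0f} s")
```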
Transformation of OODT CAS to Perform Larger Tasks
NASA Technical Reports Server (NTRS)
Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean
2008-01-01
A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS ['OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'; the program is also denoted, simply, 'CAS'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by the addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, together. Other elements of the transformation included the addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
Exploration of Korean Students' Scientific Imagination Using the Scientific Imagination Inventory
NASA Astrophysics Data System (ADS)
Mun, Jiyeong; Mun, Kongju; Kim, Sung-Won
2015-09-01
This article reports on the study of the components of scientific imagination and describes the scales used to measure scientific imagination in Korean elementary and secondary students. In this study, we developed an inventory, which we call the Scientific Imagination Inventory (SII), in order to examine aspects of scientific imagination. We identified three conceptual components of scientific imagination, which were composed of (1) scientific sensitivity, (2) scientific creativity, and (3) scientific productivity. We administered the SII to 662 students (4th-8th grades) and confirmed validity and reliability using exploratory factor analysis and the Cronbach α coefficient. The characteristics of Korean elementary and secondary students' overall scientific imagination and differences across gender and grade level are discussed in the results section.
1986-01-01
physiological changes that contribute to the state of arousal upon which a smoking habit may depend. The radial muscle of the iris in the eye contracts... studied the vestibular nystagmus pattern of smokers; amplitude, frequency, speed of slow component, speed of fast component, and angular deviation of the eyes... carbon monoxide was measured before and after treatment in order to estimate the degree of inhalation, and cigarette butts were collected for analysis.
Avionics and Power Management for Low-Cost High-Altitude Balloon Science Platforms
NASA Technical Reports Server (NTRS)
Chin, Jeffrey; Roberts, Anthony; McNatt, Jeremiah
2016-01-01
High-altitude balloons (HABs) have become popular as educational and scientific platforms for planetary research. This document outlines key components for missions where low cost and rapid development are desired. As an alternative to ground-based vacuum and thermal testing, these systems can be flight tested at comparable costs. Communication, solar, space, and atmospheric sensing experiments often require environments where ground-level testing can be challenging or impossible. When performing HAB research, the ability to monitor the status of the platform and gather data is key for both the scientific and recoverability aspects of the mission. A few turnkey platform solutions are outlined that leverage rapidly evolving open-source engineering ecosystems. Rather than building custom components from scratch, these recommendations attempt to maximize the simplicity and minimize the cost of HAB platforms, making launches more accessible to everyone.
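As a sketch of the monitoring concern described above (not code from this report), a minimal telemetry loop could log position and temperature to a recoverable flight log; read_gps and read_temp_c are hypothetical stand-ins for whatever open-source drivers a given platform uses:

```python
import csv
import random
import time

def read_gps():
    # Placeholder: a real platform would query a GPS module over serial.
    return (40.0 + random.uniform(-0.01, 0.01),
            -105.0 + random.uniform(-0.01, 0.01))

def read_temp_c():
    # Placeholder for an external temperature sensor near -40 C at altitude.
    return -40.0 + random.uniform(-2.0, 2.0)

with open("flight_log.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["unix_time", "lat", "lon", "temp_c"])
    for _ in range(3):  # a real loop would run for the whole mission
        lat, lon = read_gps()
        log.writerow([int(time.time()), lat, lon, read_temp_c()])
        time.sleep(1)
```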
Teaching toward a More Scientifically Literate Society
ERIC Educational Resources Information Center
LoGiudici, Raymond; Ende, Fred
2010-01-01
To teach scientific literacy to eighth graders, the authors created a yearlong project that emphasizes the various components and skills required to be a scientifically literate citizen. This project is broken into four separate components: skeptical thinking (pseudoscience), current-event article analysis, fiction and nonfiction literature, and…
NASA Astrophysics Data System (ADS)
Rachmatullah, Arif; Diana, Sariwulan; Rustaman, Nuryani Y.
2016-02-01
Along with the development of science and technology, the basic abilities to read, write, and count are no longer enough to survive in a modern era surrounded by the products of science and technology. Scientific literacy is an ability that might be added as a basic ability for people in the modern era. Recently, Fives et al. developed a new scientific literacy assessment for students, named the SLA (Scientific Literacy Assessment). A pilot study using the SLA was conducted to investigate the scientific literacy achievement of 223 middle school students in Sumedang and to compare outcomes between genders (159 girls and 64 boys) and school accreditation levels (A and B), using a quantitative method with a descriptive school-survey design. Based on the results, the average scientific literacy achievement of Sumedang middle school students is 45.21, classified as the low category. Of the five components of scientific literacy, only one (science motivation and beliefs) falls in the medium category; the other four are in the low and very low categories. Boys show higher scientific literacy, but the difference is not statistically significant. Students' scientific literacy in A-accredited schools is higher than in B-accredited schools, and the difference is statistically significant. Recommendations for further research are: involve more research subjects, add more questions for each indicator, and conduct independent research for each component.
Enabling NVM for Data-Intensive Scientific Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carns, Philip; Jenkins, John; Seo, Sangmin
Specialized, transient data services are playing an increasingly prominent role in data-intensive scientific computing. These services offer flexible, on-demand pairing of applications with storage hardware using semantics that are optimized for the problem domain. Concurrent with this trend, upcoming scientific computing and big data systems will be deployed with emerging NVM technology to achieve the highest possible price/productivity ratio. Clearly, therefore, we must develop techniques to facilitate the confluence of specialized data services and NVM technology. In this work we explore how to enable the composition of NVM resources within transient distributed services while still retaining their essential performance characteristics. Our approach involves eschewing the conventional distributed file system model and instead projecting NVM devices as remote microservices that leverage user-level threads, RPC services, RMA-enabled network transports, and persistent memory libraries in order to maximize performance. We describe a prototype system that incorporates these concepts, evaluate its performance for key workloads on an exemplar system, and discuss how the system can be leveraged as a component of future data-intensive architectures.
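A toy sketch of the "NVM device projected as a remote microservice" idea follows, substituting the Python standard library's XML-RPC for the paper's user-level threads and RMA-enabled transports, and an in-memory dict for a persistent-memory region; none of this is the prototype's actual interface:

```python
from xmlrpc.server import SimpleXMLRPCServer

store = {}  # stand-in for a persistent-memory (NVM) region

def put(key, value):
    """Write a value into the (simulated) NVM region."""
    store[key] = value
    return True

def get(key):
    """Read a value back; empty string if absent."""
    return store.get(key, "")

# A client would call this service remotely, e.g.:
#   xmlrpc.client.ServerProxy("http://localhost:8000").put("k", "v")
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(put)
server.register_function(get)
print("serving NVM microservice sketch on port 8000")
server.serve_forever()
```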
Inquiring into Familiar Objects: An Inquiry-Based Approach to Introduce Scientific Vocabulary
ERIC Educational Resources Information Center
Hicks Pries, Caitlin; Hughes, Julie
2012-01-01
Learning science vocabulary is an often tedious but important component of many curricula. Frequently, students are expected to learn science vocabulary indirectly, but this method can hinder the success of lower-performing students (Carlisle, Fleming, and Gudbrandsen 2000). We have developed an inquiry-based vocabulary activity wherein students…
Charlton, Bruce G
2007-01-01
In scientific writing, although clarity and precision of language are vital to effective communication, it seems undeniable that content is more important than form. Potentially valuable knowledge should not be excluded from the scientific literature merely because the researchers lack advanced language skills. Given that the global scientific literature is overwhelmingly in the English language, this presents a problem for non-native speakers. My proposal is that scientists should be permitted to construct papers using a substantial number of direct quotations from the already-published scientific literature. Quotations would need to be explicitly referenced so that the original author and publication are given full credit for creating such a useful and valid description. At the extreme, this might result in a paper consisting mainly of a 'mosaic' of quotations from the already existing scientific literature, linked and extended by relatively few sentences comprising new data or ideas. This model bears some conceptual relationship to the recent trend in computing science for component-based or component-oriented software engineering, in which new programs are constructed by reusing programme components, which may be available in libraries. A new functionality is constructed by linking together many pre-existing chunks of software. I suggest that journal editors should, in their instructions to authors, explicitly allow this 'component-oriented' method of constructing scientific articles, and carefully describe how it can be accomplished in such a way that proper referencing is enforced and full credit is allocated to the authors of the reused linguistic components.
Effects of VR system fidelity on analyzing isosurface visualization of volume datasets.
Laha, Bireswar; Bowman, Doug A; Socha, John J
2014-04-01
Volume visualization is an important technique for analyzing datasets from a variety of different scientific domains. Volume data analysis is inherently difficult because volumes are three-dimensional, dense, and unfamiliar, requiring scientists to precisely control the viewpoint and to make precise spatial judgments. Researchers have proposed that more immersive (higher fidelity) VR systems might improve task performance with volume datasets, and significant results tied to different components of display fidelity have been reported. However, more information is needed to generalize these results to different task types, domains, and rendering styles. We visualized isosurfaces extracted from synchrotron microscopic computed tomography (SR-μCT) scans of beetles, in a CAVE-like display. We ran a controlled experiment evaluating the effects of three components of system fidelity (field of regard, stereoscopy, and head tracking) on a variety of abstract task categories that are applicable to various scientific domains, and also compared our results with those from our prior experiment using 3D texture-based rendering. We report many significant findings. For example, for search and spatial judgment tasks with isosurface visualization, a stereoscopic display provides better performance, but for tasks with 3D texture-based rendering, displays with higher field of regard were more effective, independent of the levels of the other display components. We also found that systems with high field of regard and head tracking improve performance in spatial judgment tasks. Our results extend existing knowledge and produce new guidelines for designing VR systems to improve the effectiveness of volume data analysis.
Activation Levels, Handling, and Storage of Activated Components in the Target Hall at FRIB
NASA Astrophysics Data System (ADS)
Georgobiani, D.; Bennett, R.; Bollen, G.; Kostin, M.; Ronningen, R.
2018-06-01
The Facility for Rare Isotope Beams (FRIB) is a major new scientific user facility under construction in the United States for nuclear science research with beams of rare isotopes. 400 kW beam operations with heavy ions ranging from oxygen to uranium will create a high radiation environment for many components, particularly for the beam line components located in the target hall, where approximately 100 kW of beam power is dissipated in the target and another 300 kW is dissipated in the beam dump. Detailed studies of component activation and of remote handling, storage, and transport have been performed to ensure safe operation levels in this environment. Levels of activation are calculated for the beam line components within the FRIB target hall.
NASA Astrophysics Data System (ADS)
Matsushima, Masaki; Tsunakawa, Hideo; Iijima, Yu-Ichi; Nakazawa, Satoru; Matsuoka, Ayako; Ikegami, Shingo; Ishikawa, Tomoaki; Shibuya, Hidetoshi; Shimizu, Hisayoshi; Takahashi, Futoshi
2010-07-01
To achieve the scientific objectives related to the lunar magnetic field measurements in a polar orbit at an altitude of 100 km, strict electromagnetic compatibility (EMC) requirements were applied to all components and subsystems of the SELENE (Kaguya) spacecraft. The magnetic cleanliness program was defined as one of the EMC control procedures, and magnetic tests were carried out for most of the engineering and flight models. The EMC performance of all components was systematically controlled and examined through a series of EMC tests. As a result, the Kaguya spacecraft was made magnetically very clean. Hence, reliable scientific data related to the magnetic field around the Moon were obtained by the LMAG (Lunar MAGnetometer) and the PACE (Plasma energy Angle and Composition Experiment) onboard the Kaguya spacecraft. These data have been available for lunar science use since November 2009.
Visualization techniques to aid in the analysis of multispectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.
Aeronautical Engineering: A Continuing Bibliography with indexes
NASA Technical Reports Server (NTRS)
1984-01-01
This bibliography lists 426 reports, articles and other documents introduced into the NASA scientific and technical information system in August 1984. Reports are cited in the area of Aeronautical Engineering. The coverage includes documents on the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment and systems.
An Overview of the Performance and Scientific Results From the Chandra X-Ray Observatory (CXO)
NASA Technical Reports Server (NTRS)
Weisskopf, M. C.; Brinkman, B.; Canizares, C.; Garmire, G.; Murray, S.; VanSpeybroeck, L. P.; Six, N. Frank (Technical Monitor)
2001-01-01
The Chandra X-Ray Observatory (CXO), the x-ray component of NASA's Great Observatories, was launched on 1999, July 23 by the Space Shuttle Columbia. After satellite systems activation, the first x-rays focused by the telescope were observed on 1999, August 12. Beginning with the initial observation it was clear that the telescope had survived the launch environment and was operating as expected. Despite an initial surprise due to the discovery that the telescope was far more efficient for concentrating CCD-damaging low-energy protons than had been anticipated, the observatory is performing well and is returning superb scientific data. Together with other space observatories, most notably XMM-Newton, it is clear that we have entered a new era of discovery in high-energy astrophysics.
Development of methodology for component testing under impact loading for space applications
NASA Astrophysics Data System (ADS)
Church, Phillip; Taylor, Nicholas; Perkinson, Marie-Claire; Wishart, Alex; Vijendran, Sanjay; Braithwaite, Chris
2017-06-01
A number of recent studies have highlighted the scientific benefits of penetrator technology in conducting exploration on other planetary bodies and moons within the solar system. Such a "hard landing" approach is cheaper and easier than the traditional "soft landing" method. However, it is necessary for the science package of such a mission to withstand the rapid decelerations that will occur upon impact. This paper outlines an approach that has been developed to simulate the loading appropriate to Europa and also to monitor component performance before, during and after the impact.
Basic Science Considerations in Primary Total Hip Replacement Arthroplasty
Mirza, Saqeb B; Dunlop, Douglas G; Panesar, Sukhmeet S; Naqvi, Syed G; Gangoo, Shafat; Salih, Saif
2010-01-01
Total Hip Replacement is one of the most common operations performed in the developed world today. An increasingly ageing population means that the number of people undergoing this operation is set to rise. There are numerous prostheses on the market, and it is often difficult to choose between them. It is therefore necessary to have a good understanding of the basic scientific principles in Total Hip Replacement and the evidence base underpinning them. This paper reviews the relevant anatomical and biomechanical principles in THA. It goes on to elaborate on the structural properties of materials used in modern implants and looks at the evidence base for different types of fixation, including cemented and uncemented components. Modern bearing surfaces are discussed in addition to the scientific basis of various surface engineering modifications in THA prostheses. The basic science considerations in component alignment and abductor tension are also discussed. A brief discussion on modular and custom designs of THR is also included. This article reviews basic science concepts and the rationale underpinning the use of the femoral and acetabular component in total hip replacement. PMID:20582240
Nutrition and health – transforming research traditions.
Hanekamp, Jaap C; Bast, Aalt; Calabrese, Edward J
2015-01-01
In this contribution, we show that current scientific methodologies used in nutrition science and by regulatory agencies, such as the randomized controlled trial, limit our understanding of nutrition and health, as they are too crude to capture the subtle pleiotropic nature of most nutrients. Thereby, regulatory agencies such as the European Food Safety Authority curb the development of scientific knowledge and industrial innovations within the nutritional field. In order to develop insights into the health impact of certain foods and food components, we need to realize that health is adaptation set within a homeostatic range. Increased health performance, i.e., the maximum stimulation of health, typically appears 30-60% greater than in the control group, with a width of no more than about a factor of ten, clarifying the difficulty of documenting responses of food-endogenous components within the homeostatic range of healthy people. A strategy for recording subtle responses to food components is the summation of percentage effects across relevant health outcomes. We illustrate this approach with the action of flavanols on vascular health, specifically endothelial function.
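A minimal reading of the summation strategy, with endpoint names and percentage values invented purely for illustration:

```python
# Hypothetical percent changes versus control for vascular-health endpoints.
effects_pct = {
    "flow-mediated dilation": 2.1,
    "diastolic blood pressure": 1.4,
    "arterial stiffness": 0.8,
}
combined = sum(effects_pct.values())
print(f"summed percentage effect across endpoints: {combined:.1f}%")
```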
Performance Characteristics of Lithium-Ion Prototype Batteries for Mars Surveyor Program 2001 Lander
NASA Technical Reports Server (NTRS)
Smart, M. C.; Ratnakumar, B. V.; Whitcanack, L.; Surampudi, S.; Byers, J.; Marsh, R. A.
2000-01-01
A viewgraph presentation outlines the scientific payload, expected launch date and tasks, and an image of the Mars Surveyor 2001 Lander components. The Lander's battery specifications are given. The program objectives for the Li-ion cells for the Lander are listed, and results of performance evaluation and cycle-life performance tests at different temperatures are outlined. Cell charge characteristics are described, and test data are presented for charge capacity at varying temperatures. Capacity retention and storage characteristics tests are described and results are shown.
A generative model for scientific concept hierarchies.
Datta, Srayan; Adar, Eytan
2018-01-01
In many scientific disciplines, each new 'product' of research (method, finding, artifact, etc.) is often built upon previous findings, leading to extension and branching of scientific concepts over time. We aim to understand the evolution of scientific concepts by placing them in phylogenetic hierarchies where scientific keyphrases from a large, longitudinal academic corpus are used as a proxy for scientific concepts. These hierarchies exhibit various important properties, including a power-law degree distribution, a power-law component size distribution, the existence of a giant component, and a lower probability of extending an older concept. We present a generative model based on preferential attachment to simulate the graphical and temporal properties of these hierarchies, which helps us understand the underlying process behind scientific concept evolution and may be useful in simulating and predicting scientific evolution.
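A minimal preferential-attachment sketch in the spirit of the model (parameters and scale are illustrative, not the paper's fitted values): each new keyphrase either starts a new hierarchy or attaches to an existing node with probability proportional to its degree.

```python
import random
from collections import Counter

def grow(n_nodes, p_new_root=0.05, seed=42):
    """Grow concept hierarchies by degree-proportional attachment."""
    rng = random.Random(seed)
    parents = {0: None}
    ends = [0]  # each node appears once per unit of degree
    for node in range(1, n_nodes):
        if rng.random() < p_new_root:
            parents[node] = None          # a brand-new concept hierarchy
        else:
            parent = rng.choice(ends)     # degree-proportional sampling
            parents[node] = parent
            ends.append(parent)
        ends.append(node)
    return parents

parents = grow(10_000)
children = Counter(p for p in parents.values() if p is not None)
print("five most-extended concepts:", children.most_common(5))
```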
The Effectiveness of Scientific Inquiry With/Without Integration of Scientific Reasoning
ERIC Educational Resources Information Center
Chen, Chun-Ting; She, Hsiao-Ching
2015-01-01
This study examines the difference in effectiveness between two scientific inquiry programs-one with an emphasis on scientific reasoning and one without a scientific reasoning component-on students' scientific concepts, scientific concept-dependent reasoning, and scientific inquiry. A mixed-method approach was used in which 115 grade 5…
ERIC Educational Resources Information Center
Yeh, Kuan-Hue; She, Hsiao-Ching
2010-01-01
The purpose of this study is to examine the difference in effectiveness between two on-line scientific learning programs--one with an argumentation component and one without an argumentation component--on students' scientific argumentation ability and conceptual change. A quasi-experimental design was used in this study. Two classes of 8th grade…
Aeronautical engineering: A continuing bibliography with indexes (supplement 284)
NASA Technical Reports Server (NTRS)
1992-01-01
This bibliography lists 974 reports, articles, and other documents introduced into the NASA scientific and technical information system in Oct. 1992. The coverage includes documents on design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
DoSSiER: Database of scientific simulation and experimental results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenzel, Hans; Yarba, Julia; Genser, Krzysztof
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER can be easily accessed via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this paper, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
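Programmatic access of the kind described might look like the following sketch; the endpoint URL, query parameters, and record fields are hypothetical placeholders, not DoSSiER's documented API:

```python
import json
from urllib.request import urlopen

# Hypothetical route; the real service's paths and parameters may differ.
url = "https://example.org/dossier/api/records?format=json"
# with urlopen(url) as resp:
#     records = json.load(resp)
records = json.loads('[{"title": "pion yield vs. momentum", "code": "Geant4"}]')
for rec in records:
    print(rec["title"], "-", rec["code"])
```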
The architecture of the High Performance Storage System (HPSS)
NASA Technical Reports Server (NTRS)
Teaff, Danny; Watson, Dick; Coyne, Bob
1994-01-01
The rapid growth in the size of datasets has caused a serious imbalance in I/O and storage system performance and functionality relative to application requirements and the capabilities of other system components. The High Performance Storage System (HPSS) is a scalable, next-generation storage system that will meet the functionality and performance requirements of large-scale scientific and commercial computing environments. Our goal is to improve the performance and capacity of storage by two orders of magnitude or more over what is available in the general or mass marketplace today. We are also providing corresponding improvements in architecture and functionality. This paper describes the architecture and functionality of HPSS.
NASA Astrophysics Data System (ADS)
Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.
2007-12-01
Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
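As a hedged illustration of the plain-text attribute-value pairing mentioned above, a minimal parser might look like this (the field names are invented examples, not SCEC's actual schema):

```python
def parse_metadata(text):
    """Parse 'key = value' lines into a dict, ignoring anything else."""
    meta = {}
    for line in text.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            meta[key.strip()] = value.strip()
    return meta

record = """\
event_id = HYPOTHETICAL_M7.8
velocity_model = CVM-S4
min_vs = 500.0
"""
print(parse_metadata(record))
```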
Enhanced-Adhesion Multi-Walled Carbon Nanotubes on Titanium Substrates for Stray Light Control
NASA Technical Reports Server (NTRS)
Hagopian, John; Getty, Stephanie; Quijada, Manuel
2012-01-01
Carbon nanotubes previously grown on silicon have extremely low reflectance, making them a good candidate for stray light suppression. Silicon, however, is not a good structural material for stray light components such as tubes, stops, and baffles. Titanium is a good structural material and can tolerate the 700 C nanotube growth process. The ability to grow carbon nanotubes on a titanium substrate that are ten times blacker than the current NASA state-of-the-art paints in the visible to near infrared spectra has been achieved. This innovation will allow significant improvement of stray light performance in scientific instruments or any other optical system. This innovation is a refinement of the utilization of multiwalled carbon nanotubes for stray light suppression in spaceflight instruments. The innovation is a process to make the surface darker and improve the adhesion to the substrate, improving robustness for spaceflight use. Bright objects such as clouds or ice scatter light off of instrument structures and components and make it difficult to see dim objects in Earth observations. A darker material to suppress this stray light has multiple benefits to these observations, including enabling scientific observations not currently possible, increasing observational efficiencies in high-contrast scenes, and simplifying instruments and lowering their cost by utilizing fewer stray light components and achieving equivalent performance. The prior art was to use commercially available black paint, which resulted in approximately 4% of the light being reflected (hemispherical reflectance or total integrated scatter, or TIS). Use of multiwalled carbon nanotubes on titanium components such as baffles, entrance aperture, tubes, and stops can decrease this scattered light by a factor of ten per bounce over the 200-nm to 2,500-nm wavelength range. This can improve system stray light performance by orders of magnitude. The purpose of the innovation is to provide an enhanced stray light control capability by making a blacker surface treatment for typical stray light control components. Since baffles, stops, and tubes used in scientific observations often undergo loads such as vibration, it was critical to develop this surface treatment on structural materials. The innovation is to optimize the carbon nanotube growth for titanium, which is a strong, lightweight structural material suitable for spaceflight use. The titanium substrate carbon nanotubes are more robust than those grown on silicon and allow for easier utilization. They are darker than current surface treatments over larger angles and a larger wavelength range. The primary advantage of titanium substrate is that it is a good structural material, and not as brittle as silicon.
Computational Simulations and the Scientific Method
NASA Technical Reports Server (NTRS)
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
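The kind of independently repeatable component test the authors advocate might look like the following sketch, in which a stand-in model (forward-Euler integration of exponential decay) is checked against its closed-form solution; the example is illustrative, not the authors' code:

```python
import math

def euler_decay(y0, k, dt, steps):
    """Forward-Euler integration of dy/dt = -k*y."""
    y = y0
    for _ in range(steps):
        y += dt * (-k * y)
    return y

def test_against_analytic_solution():
    y0, k, t = 1.0, 2.0, 1.0
    numeric = euler_decay(y0, k, dt=1e-4, steps=10_000)
    exact = y0 * math.exp(-k * t)  # analytic fixture the test compares against
    assert abs(numeric - exact) < 1e-3, (numeric, exact)

test_against_analytic_solution()
print("component test passed")
```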
Improving Scientific Research and Writing Skills through Peer Review and Empirical Group Learning
Senkevitch, Emilee; Smith, Ann C.; Marbach-Ad, Gili; Song, Wenxia
2011-01-01
Here we describe a semester-long, multipart activity called “Read and wRite to reveal the Research process” (R3) that was designed to teach students the elements of a scientific research paper. We implemented R3 in an advanced immunology course. In R3, we paralleled the activities of reading, discussion, and presentation of relevant immunology work from primary research papers with student writing, discussion, and presentation of their own lab findings. We used reading, discussing, and writing activities to introduce students to the rationale for basic components of a scientific research paper, the method of composing a scientific paper, and the applications of course content to scientific research. As a final part of R3, students worked collaboratively to construct a Group Research Paper that reported on a hypothesis-driven research project, followed by a peer review activity that mimicked the last stage of the scientific publishing process. Assessment of student learning revealed a statistically significant gain in student performance on writing in the style of a research paper from the start of the semester to the end of the semester. PMID:23653760
NASA Astrophysics Data System (ADS)
Qin, Fangcheng; Li, Yongtang; Qi, Huiping; Ju, Li
2017-01-01
Research on compact manufacturing technology for controlling both the shape and the performance of metallic components can simplify the manufacturing process and improve its reliability while satisfying macro/micro-structure requirements. It is not only a key path toward improving performance, saving material and energy, and achieving green manufacturing of components used in major equipment, but also a challenging subject at the frontier of advanced plastic forming. Providing a novel horizon for the manufacturing of such critical components is therefore significant. Focusing on high-performance large-scale components such as bearing rings, flanges, railway wheels, and thick-walled pipes, the conventional processes and their state of development are summarized. The existing problems, including multi-pass heating, wasted material and energy, high cost, and high emissions, are discussed, and it is pointed out that present approaches cannot meet the demands of manufacturing high-quality components. Thus, new techniques related to casting-rolling compound precise forming of rings, compact manufacturing of duplex-metal composite rings, compact manufacturing of railway wheels, and casting-extruding continuous forming of thick-walled pipes are introduced in detail. The corresponding research contents, such as casting ring blanks, hot ring rolling, near-solid-state pressure forming, and hot extruding, are elaborated. Some findings on through-thickness microstructure evolution and mechanical properties are also presented. The components produced by the new techniques are mainly characterized by fine and homogeneous grains. Possible directions for further development of these techniques are suggested, and the key scientific problems are proposed. All of these results and conclusions have reference value and guiding significance for the integrated control of shape and performance in advanced compact manufacturing.
Analysis of the acceptance of autonomous planetary science data collection by field of inquiry
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2015-06-01
The acceptance of autonomous control technologies in planetary science has met significant resistance. Many within this scientific community question the efficacy of autonomous technologies for making decisions regarding what data to collect and how to process it. These technologies, however, can be used to significantly increase the scientific return on mission investment by removing limitations imposed by communications bandwidth constraints and by the delays of communications and human decision making. A fully autonomous mission, in an ideal case, could be deployed, perform most of the substantive work itself (possibly relying on human assistance for dealing with any unexpected or unexplained occurrences), and return an answer to a scientific question along with data selected to allow scientists to validate software performance. This paper presents the results of a survey of planetary scientists that attempts to identify the root causes of the impediments to the use of this type of technology and to identify pathways to its acceptance. Previous work considered planetary science as a single large community. This paper contrasts the differences in acceptance between component fields of planetary science.
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
Nonlinear analysis and performance evaluation of the Annular Suspension and Pointing System (ASPS)
NASA Technical Reports Server (NTRS)
Joshi, S. M.
1978-01-01
The Annular Suspension and Pointing System (ASPS) can provide highly accurate fine pointing for a variety of solar-, stellar-, and Earth-viewing scientific instruments during Space Shuttle orbital missions. In this report, a detailed nonlinear mathematical model is developed for the ASPS/Space Shuttle system. The equations are augmented with nonlinear models of components such as magnetic actuators and gimbal torquers. Control systems and payload attitude state estimators are designed in order to obtain satisfactory pointing performance, and statistical pointing performance is predicted in the presence of measurement noise and disturbances.
NASA Astrophysics Data System (ADS)
Delen, Ibrahim
Engaging students in scientific practices is a critical component of science instruction; therefore, a number of researchers have developed software programs to help students and teachers in this hard task. The Zydeco group designed a mobile application called Zydeco, which enables students to collect data inside and outside the classroom, and then use the data to create scientific explanations by using the claim-evidence-reasoning framework. Previous technologies designed to support scientific explanations focused on how these programs improve students' scientific explanations, but ignored how scientific explanation technologies can support teacher practices. Thus, to increase our knowledge of how different scaffolds can work together, this study aimed to portray the synergy between a teacher's instructional practices (part 1) and the use of supports within mobile devices (part 2) to support students in constructing explanations. Synergy can be thought of as generic and content-specific scaffolds working together to enable students to accomplish challenging tasks, such as creating explanations that they would not normally be able to create without the scaffolds working together. Providing instruction (part 1) focused on understanding how the teacher scaffolds students' initial understanding of the claim-evidence-reasoning (CER) framework. The second component of examining synergy (part 2: using mobile devices) investigated how this teacher used mobile devices to provide feedback when students created explanations. The synergy between providing instruction and using mobile devices was investigated by analyzing a middle school teacher's practices in two different units (plants and water quality). Next, this study focused on describing how the level of synergy influenced the quality of students' scientific explanations. Finally, I investigated the role of focused teaching intervention sessions in informing the teacher about students' performance. In conclusion, the findings of this study showed that the decrease in the teacher's support for claims did not affect the quality of the students' claims. On the other hand, the quality of students' reasoning was linked to the teacher's practices. This suggests that when supporting students' explanations, focusing on components that students find challenging would benefit students' construction of explanations. To achieve synergy in this process, the collaboration between the teacher's practices, focused teaching intervention sessions, and scaffolds designed to support teachers played a crucial role in aiding students in creating explanations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan
We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of the architecture design space by combining hardware counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM BlueGene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including instruction pipeline, cache, and memory, and to achieve a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.
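Raexplore's models capture pipeline, cache, and memory interactions; as a greatly simplified flavor of counter-driven analytical modeling, a roofline-style bound can be projected from two measured counters onto a target machine's ceilings (all numbers below are illustrative, not from the paper):

```python
def predicted_time(flops, dram_bytes, peak_flops, peak_bw):
    """Lower-bound execution time from compute and memory-bandwidth ceilings."""
    return max(flops / peak_flops, dram_bytes / peak_bw)

# A kernel measured at 1e12 flops and 4e11 bytes of DRAM traffic, projected
# onto a hypothetical 1 Tflop/s, 200 GB/s processor:
t = predicted_time(1e12, 4e11, peak_flops=1e12, peak_bw=2e11)
print(f"projected kernel time: {t:.2f} s")  # memory-bound: 2.00 s
```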
ERIC Educational Resources Information Center
Ault, Marilyn; Craig-Hare, Jana; Frey, Bruce
2016-01-01
Reason Racer is an online, rate-based, multiplayer game designed to engage middle school students in the knowledge and skills related to scientific argumentation. Several game features are included as design considerations unrelated to science content or argumentation. One specific feature, a competitive racing component that occurs in between…
Aeronautical engineering: A continuing bibliography with indexes (supplement 119)
NASA Technical Reports Server (NTRS)
1980-01-01
This bibliography lists 341 reports, articles, and other documents introduced into the NASA scientific and technical information system in January 1980. Abstracts on the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems are presented. Research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles are also presented.
Aeronautical engineering: A continuing bibliography with indexes (supplement 282)
NASA Technical Reports Server (NTRS)
1992-01-01
This bibliography lists 623 reports, articles, and other documents introduced into the NASA scientific and technical information system in Aug. 1992. The coverage includes documents on the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
Aeronautical Engineering: A Continuing Bibliography with Indexes
NASA Technical Reports Server (NTRS)
1995-01-01
This bibliography lists 193 reports, journal articles, and other documents introduced in the NASA scientific and technical system in Aug. 1995. Subject coverage includes documents on the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
Aeronautical engineering: A continuing bibliography with indexes (supplement 324)
NASA Technical Reports Server (NTRS)
1995-01-01
This bibliography lists 149 reports, articles, and other documents introduced into the NASA scientific and technical information system in December 1995. Subject coverage includes engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
Aeronautical engineering: A continuing bibliography with indexes (supplement 313)
NASA Technical Reports Server (NTRS)
1995-01-01
This bibliography lists 179 reports, articles, and other documents introduced into the NASA scientific and technical information system in Jan. 1995. Subject coverage includes: engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
Aeronautical engineering: A continuing bibliography with indexes (supplement 310)
NASA Technical Reports Server (NTRS)
1994-01-01
This bibliography lists 29 reports, articles, and other documents introduced into the NASA scientific and technical information system in Nov. 1994. Subject coverage includes: engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines) and associated components, equipment, and systems. It also includes research and development in aerodynamics, aeronautics, and ground support equipment for aeronautical vehicles.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
A multi-service data management platform for scientific oceanographic products
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni
2017-02-01
An efficient, secure, and interoperable data platform solution has been developed in the TESSA project to provide fast navigation and access to the data stored in the data archive, as well as standard-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services such as decision support systems (DSS). These datasets are accessible through the following three main components: the Data Access Service (DAS), the Metadata Service, and the Complex Data Analysis Module (CDAM). The DAS allows access to data stored in the archive by providing interfaces for different protocols and services for downloading, variable selection, data subsetting, or map generation. The Metadata Service is the heart of the information system of the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS, performing on-demand complex data analysis tasks.
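A hedged sketch of the kind of subsetting request the DAS could expose; the host, path, and parameter names are invented, since the abstract does not specify protocol details:

```python
from urllib.parse import urlencode

params = urlencode({
    "variable": "sea_surface_temperature",  # variable selection
    "bbox": "12.0,38.0,20.0,42.0",          # lon/lat subsetting
    "time": "2015-06-01",
    "format": "netcdf",                     # downloading a NetCDF subset
})
url = f"https://example.org/tessa/das/subset?{params}"  # placeholder host
print(url)
# A client would then fetch `url` and save the returned file,
# e.g. with urllib.request.urlopen(url).
```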
Investigation of Space Interferometer Control Using Imaging Sensor Output Feedback
NASA Technical Reports Server (NTRS)
Leitner, Jesse A.; Cheng, Victor H. L.
2003-01-01
Numerous space interferometry missions are planned for the next decade to verify different enabling technologies towards very-long-baseline interferometry to achieve high-resolution imaging and high-precision measurements. These objectives will require coordinated formations of spacecraft separately carrying optical elements comprising the interferometer. High-precision sensing and control of the spacecraft and the interferometer-component payloads are necessary to deliver sub-wavelength accuracy to achieve the scientific objectives. For these missions, the primary scientific product of interferometer measurements may be the only source of data available at the precision required to maintain the spacecraft and interferometer-component formation. A concept is studied for detecting the interferometer's optical configuration errors based on information extracted from the interferometer sensor output. It enables precision control of the optical components, and, in cases of space interferometers requiring formation flight of spacecraft that comprise the elements of a distributed instrument, it enables the control of the formation-flying vehicles because independent navigation or ranging sensors cannot deliver the high-precision metrology over the entire required geometry. Since the concept can act on the quality of the interferometer output directly, it can detect errors outside the capability of traditional metrology instruments, and provide the means needed to augment the traditional instrumentation to enable enhanced performance. Specific analyses performed in this study include the application of signal-processing and image-processing techniques to solve the problems of interferometer aperture baseline control, interferometer pointing, and orientation of multiple interferometer aperture pairs.
Hot Spots and Hot Moments in Scientific Collaborations and Social Movements
ERIC Educational Resources Information Center
Parker, John N.; Hackett, Edward J.
2012-01-01
Emotions are essential but little understood components of research; they catalyze and sustain creative scientific work and fuel the scientific and intellectual social movements (SIMs) that propel scientific change. Adopting a micro-sociological focus, we examine how emotions shape two intellectual processes central to all scientific work:…
Web-Accessible Scientific Workflow System for Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roelof Versteeg; Trevor Rowe
2006-03-01
We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
High performance compression of science data
NASA Technical Reports Server (NTRS)
Storer, James A.; Cohn, Martin
1992-01-01
In the future, NASA expects to gather over a terabyte of data per day, requiring multiple levels of archival storage. Data compression will be a key component in systems that store this data (e.g., optical disk and tape) as well as in communications systems (both between space and Earth and between scientific locations on Earth). We propose to develop algorithms that can be a basis for software and hardware systems that compress a wide variety of scientific data with different criteria for fidelity/bandwidth tradeoffs. The algorithmic approaches we consider are specially targeted for parallel computation, where data rates of over 1 billion bits per second are achievable with current technology.
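To make the fidelity/bandwidth tradeoff concrete, the sketch below (in Python, with invented parameters; the abstract itself predates such tooling) quantizes floating-point samples to a chosen step size before applying lossless deflate compression, so that coarser quantization buys a smaller payload:

    import zlib
    import numpy as np

    def quantize_and_compress(data: np.ndarray, step: float, level: int = 6) -> bytes:
        """Quantize to multiples of `step` (lossy), then deflate (lossless)."""
        codes = np.round(data / step).astype(np.int16)  # lossy fidelity stage
        return zlib.compress(codes.tobytes(), level)    # lossless bandwidth stage

    rng = np.random.default_rng(0)
    samples = rng.normal(size=100_000)                  # stand-in for instrument data
    for step in (0.001, 0.01, 0.1):                     # coarser step -> smaller payload
        print(f"step={step}: {len(quantize_and_compress(samples, step))} bytes")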
High performance compression of science data
NASA Technical Reports Server (NTRS)
Storer, James A.; Cohn, Martin
1993-01-01
In the future, NASA expects to gather over a terabyte of data per day, requiring multiple levels of archival storage. Data compression will be a key component in systems that store this data (e.g., optical disk and tape) as well as in communications systems (both between space and Earth and between scientific locations on Earth). We propose to develop algorithms that can be a basis for software and hardware systems that compress a wide variety of scientific data with different criteria for fidelity/bandwidth tradeoffs. The algorithmic approaches we consider are specially targeted for parallel computation, where data rates of over 1 billion bits per second are achievable with current technology.
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
Principles and Guidelines for Duty and Rest Scheduling in Commercial Aviation
NASA Technical Reports Server (NTRS)
Dinges, David F.; Graeber, R. Curtis; Rosekind, Mark R.; Samel, Alexander
1996-01-01
The aviation industry requires 24-hour activities to meet operational demands. Growth in global long-haul, regional, overnight cargo, and short-haul domestic operations will continue to increase these round-the-clock requirements. Flight crews must be available to support 24-hour-a-day operations to meet these industry demands. Both domestic and international aviation can also require crossing multiple time zones. Therefore, shift work, night work, irregular work schedules, unpredictable work schedules, and time zone changes will continue to be commonplace components of the aviation industry. These factors pose known challenges to human physiology, and because they result in performance-impairing fatigue, they pose a risk to safety. It is critical to acknowledge and, whenever possible, incorporate scientific information on fatigue, human sleep, and circadian physiology into 24-hour aviation operations. Utilization of such scientific information can help promote crew performance and alertness during flight operations and thereby maintain and improve the safety margin.
PASSCLAIM - Synthesis and review of existing processes.
Richardson, David P; Affertsholt, Tage; Asp, Nils-Georg; Bruce, Ake; Grossklaus, Rolf; Howlett, John; Pannemans, Daphne; Ross, Richard; Verhagen, Hans; Viechtbauer, Volker
2003-03-01
Several approaches to the use of health claims on foods have been made around the world, and the common theme is that any health claim will require scientific validation and substantiation. There is also broad consensus that any regulatory framework should protect the consumer, promote fair trade and encourage innovation in the food industry. This paper is based on a critical evaluation of existing international approaches to the scientific substantiation of health claims, with a view to identifying common new ideas, definitions, best practice and a methodology to underpin current and future developments. There is a clear need to have uniform understanding, terminology and description of types of nutrition and health claims. Two broad categories were defined: Nutrition Claims, i.e. what the product contains, and Health Claims, i.e. relating to health, well-being and/or performance, including well-established nutrient function claims, enhanced function claims and disease risk reduction claims. Such health claims relate to what the food or food component does. The categories of health claims are closely and progressively related and are, in practice, part of a continuum. Provision is also made for "generic" or well-established, generally accepted claims and for "innovative" or "product-specific" claims. Special attention was paid to reflect the health-promoting properties of a food or food component in such a way as to facilitate the making of risk reduction claims outside the medical scope of the term prevention. The paper sets out basic principles and guidelines for communication of health claims and principles of nutritional safety. The main body of the work examines the process for the assessment of scientific support for health claims on food and emphasises an evidence-based approach consisting of: identification of all relevant studies, covering the collection of evidence, data searches, the nature of the scientific evidence, and sources of scientific data (including human intervention studies, human observational studies, animal studies and in vitro studies, and the use of biomarkers in human studies); evaluation of the quality of individual studies to ensure good experimental design and interpretation; interpretation of the totality of evidence, applying scientific judgement to weigh the evidence as a whole; and assessment of significant scientific agreement on a case-by-case basis to agree within the relevant scientific community that an association between a food or a food component and a health benefit is valid. Annexes include an international comparison of regulatory approaches to health claims, suggestions for the documentation and presentation of evidence, and a procedure for reviewing the evidence.
Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer
2016-01-01
Neuroscience continues to experience a tremendous growth in data, in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source and easy to use, provides detailed user and developer documentation, and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
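A minimal sketch of the underlying HDF5 ideas described above, using h5py; the group names, attributes, and the use of an object reference to model a relationship are illustrative assumptions, not BRAINformat's actual API:

    import h5py
    import numpy as np

    with h5py.File("session.h5", "w") as f:
        # a self-contained, self-describing "managed object"-style group
        probe = f.create_group("probe_01")
        probe.attrs["object_type"] = "ElectrodeGroup"
        raw = probe.create_dataset("voltage", data=np.zeros((1000, 32)))
        raw.attrs["unit"] = "volt"
        events = f.create_dataset("stimulus_times", data=np.array([0.5, 1.2, 3.7]))
        # relationship-attribute idea: link the recording to the stimuli it spans
        raw.attrs["recorded_during"] = events.ref  # HDF5 object reference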
NASA Astrophysics Data System (ADS)
Stisen, S.; Demirel, C.; Koch, J.
2017-12-01
Evaluation of performance is an integral part of model development and calibration, and it is of paramount importance when communicating modelling results to stakeholders and the scientific community. The hydrological modelling community has a comprehensive, well-tested toolbox of metrics to assess temporal model performance. In contrast, experience in evaluating spatial performance has not kept pace with the wide availability of spatial observations or with the sophistication of model codes simulating the spatial variability of complex hydrological processes. This study aims at making a contribution towards advancing spatial-pattern-oriented model evaluation for distributed hydrological models. This is achieved by introducing a novel spatial performance metric which provides robust pattern performance during model calibration. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multi-component approach is necessary in order to adequately compare spatial patterns. SPAEF, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are tested in a spatial-pattern-oriented model calibration of a catchment model in Denmark. The calibration is constrained by a remote-sensing-based spatial pattern of evapotranspiration and discharge time series at two stations. Our results stress that stand-alone metrics tend to fail to provide holistic pattern information to the optimizer, which underlines the importance of multi-component metrics. The three SPAEF components are independent, which allows them to complement each other in a meaningful way. This study promotes the use of bias-insensitive metrics, which allow comparison of variables that are related but may differ in units, in order to optimally exploit spatial observations made available by remote sensing platforms. We see great potential of SPAEF across environmental disciplines dealing with spatially distributed modelling.
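A minimal sketch of the metric as promoted above, assuming the published formulation SPAEF = 1 - sqrt((A-1)^2 + (B-1)^2 + (C-1)^2), with A the Pearson correlation, B the ratio of coefficients of variation, and C the histogram intersection of z-scored fields; the bin count is an assumption:

    import numpy as np

    def spaef(obs: np.ndarray, sim: np.ndarray, bins: int = 100) -> float:
        """SPAtial EFficiency of a simulated field against an observed one."""
        obs, sim = obs.ravel(), sim.ravel()
        a = np.corrcoef(obs, sim)[0, 1]                          # pattern correlation
        b = (sim.std() / sim.mean()) / (obs.std() / obs.mean())  # ratio of CVs
        zo = (obs - obs.mean()) / obs.std()                      # z-score both fields
        zs = (sim - sim.mean()) / sim.std()
        span = (min(zo.min(), zs.min()), max(zo.max(), zs.max()))
        ho, _ = np.histogram(zo, bins=bins, range=span)
        hs, _ = np.histogram(zs, bins=bins, range=span)
        c = np.minimum(ho, hs).sum() / ho.sum()                  # histogram overlap
        return 1.0 - np.sqrt((a - 1)**2 + (b - 1)**2 + (c - 1)**2)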
Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E
2018-04-21
Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high-performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao; Bao, Lei
2016-03-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both the USA and China. Students randomly received one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) have a tendency to equate non-influential variables with non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction.
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao
2015-01-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both the USA and China. Students randomly received one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) have a tendency to equate non-influential variables with non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction. PMID:26949425
De Mello Costa, Maria Fernanda; Slocombe, Ron
2012-01-01
Angiotensin II is a key regulator of blood pressure and cardiovascular function in mammals. The conversion of angiotensin into its active form is carried out by Angiotensin I-Converting Enzyme (ACE). The measurement of ACE concentration in plasma or serum, its enzymatic activity, and the correlation between an insertion/deletion (I/D) genetic polymorphism of the ACE gene have been investigated as possible indicators of superior athletic performance in humans. In this context, other indicators of superior adaptation to exercise resulting in better athletic performance (such as ventricular hypertrophy, VO2 max, and competition results) were mostly used to study the association between ACE I/D polymorphism and improved performance. Despite the fact that the existing literature presents little consensus, there is sufficient scientific evidence to warrant further investigation on the usage of ACE activity and the I/D ACE gene polymorphism as biomarkers of superior athletic performance in humans of specific ethnicities or in athletes involved in certain sports. In this sense, a biomarker would be a substance or genetic component that could be measured to provide a degree of certainty, or an indication, of the presence of a certain trait or characteristic that would be beneficial to the athlete’s performance. Difficulties in interpreting and comparing the results of scientific research on the topic arise from dissimilar protocols and variation in study design. This review aims to investigate the current literature on the use of ACE I/D polymorphism as a biomarker of performance in humans through the comparison of scientific publications. PMID:25586030
De Mello Costa, Maria Fernanda; Slocombe, Ron
2012-10-09
Angiotensin II is a key regulator of blood pressure and cardiovascular function in mammals. The conversion of angiotensin into its active form is carried out by Angiotensin I-Converting Enzyme (ACE). The measurement of ACE concentration in plasma or serum, its enzymatic activity, and the correlation between an insertion/deletion (I/D) genetic polymorphism of the ACE gene have been investigated as possible indicators of superior athletic performance in humans. In this context, other indicators of superior adaptation to exercise resulting in better athletic performance (such as ventricular hypertrophy, VO2 max, and competition results) were mostly used to study the association between ACE I/D polymorphism and improved performance. Despite the fact that the existing literature presents little consensus, there is sufficient scientific evidence to warrant further investigation on the usage of ACE activity and the I/D ACE gene polymorphism as biomarkers of superior athletic performance in humans of specific ethnicities or in athletes involved in certain sports. In this sense, a biomarker would be a substance or genetic component that could be measured to provide a degree of certainty, or an indication, of the presence of a certain trait or characteristic that would be beneficial to the athlete's performance. Difficulties in interpreting and comparing the results of scientific research on the topic arise from dissimilar protocols and variation in study design. This review aims to investigate the current literature on the use of ACE I/D polymorphism as a biomarker of performance in humans through the comparison of scientific publications.
Aeronautical engineering: A continuing bibliography with indexes (supplement 277)
NASA Technical Reports Server (NTRS)
1992-01-01
This bibliography lists 467 reports, articles, and other documents introduced into the NASA scientific and technical information system in Mar. 1992. Subject coverage includes: the engineering and theoretical aspects of design, construction, evaluation, testing, operation, and performance of aircraft (including aircraft engines); and associated aircraft components, equipment, and systems. It also includes research and development in ground support systems, theoretical and applied aspects of aerodynamics, and general fluid dynamics.
Aeronautical Engineering: A special bibliography with indexes, supplement 13
NASA Technical Reports Server (NTRS)
1972-01-01
This special bibliography lists 283 reports, articles, and other documents introduced into the NASA scientific and technical information system in December, 1971. Emphasis is placed on engineering and theoretical aspects for design, construction, evaluation, testing, operation and performance of aircraft (including aircraft engines), and associated components, equipment and systems. Also included are entries on research and development in aeronautics and aerodynamics and research and ground support for aeronautical vehicles.
Aeronautical Engineering, a special bibliography with indexes, supplement 15
NASA Technical Reports Server (NTRS)
1972-01-01
This special bibliography lists 363 reports, articles, and other documents introduced into the NASA scientific and technical information system in January 1972. Emphasis is placed on engineering and theoretical aspects for design, construction, evaluation, testing, operation and performance of aircraft (including aircraft engines) and associated components, equipment and systems. Also included are entries on research and development in aeronautics and aerodynamics and research and ground support for aeronautical vehicles.
Computational Aspects of Data Assimilation and the ESMF
NASA Technical Reports Server (NTRS)
daSilva, A.
2003-01-01
Developing advanced data assimilation applications is a daunting scientific challenge. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computing (HPC) platforms required by numerically intensive Earth system applications are complex, varied, rapidly evolving and multi-part systems themselves. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical, and to some extent the cultural, aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focusing on those capabilities useful for developing advanced data assimilation applications.
NASA Astrophysics Data System (ADS)
Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon
2018-05-01
The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, spatial performance evaluation has not kept pace with the wide availability of spatial observations or with the sophistication of model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multiple-component approach is found to be advantageous in order to achieve the complex task of comparing spatial patterns. SPAEF, its three components individually and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. Results suggest the importance of multiple-component metrics because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias insensitive metrics which further allow for a comparison of variables which are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
Basic Inferences of Scientific Reasoning, Argumentation, and Discovery
ERIC Educational Resources Information Center
Lawson, Anton E.
2010-01-01
Helping students better understand how scientists reason and argue to draw scientific conclusions has long been viewed as a critical component of scientific literacy, thus remains a central goal of science instruction. However, differences of opinion persist regarding the nature of scientific reasoning, argumentation, and discovery. Accordingly,…
Spaceborne sensors (1983-2000 AD): A forecast of technology
NASA Technical Reports Server (NTRS)
Kostiuk, T.; Clark, B. P.
1984-01-01
A technical review and forecast of space technology as it applies to spaceborne sensors for future NASA missions is presented. A format for categorization of sensor systems covering the entire electromagnetic spectrum, including particles and fields, is developed. Major generic sensor systems are related to their subsystems, components, and to basic research and development. General supporting technologies such as cryogenics, optical design, and data processing electronics are addressed where appropriate. The dependence of many classes of instruments on common components, basic R&D and support technologies is also illustrated. A forecast of important system designs and instrument and component performance parameters is provided for the 1983-2000 AD time frame. Some insight into the scientific and applications capabilities and goals of the sensor systems is also given.
A relative navigation sensor for CubeSats based on LED fiducial markers
NASA Astrophysics Data System (ADS)
Sansone, Francesco; Branz, Francesco; Francesconi, Alessandro
2018-05-01
Small satellite platforms are becoming very appealing for both scientific and commercial applications, thanks to their low cost, short development times and the availability of standard components and subsystems. The main disadvantage of such vehicles is the limited resources available to perform mission tasks. To overcome this drawback, mission concepts are under study that foresee cooperation between autonomous small satellites to accomplish complex tasks; among these, on-orbit servicing and on-orbit assembly of large structures are of particular interest, and the global scientific community is putting significant effort into the miniaturization of the critical technologies required for such innovative mission scenarios. In this work, the development and laboratory testing of an accurate relative navigation package for nanosatellites compliant with the CubeSat standard is presented. The system features a small camera and two sets of LED fiducial markers, and is conceived as a standard package that allows small spacecraft to perform mutual tracking during rendezvous and docking maneuvers. The hardware is based on off-the-shelf components assembled in a compact configuration that is compatible with the CubeSat standard. The image processing and pose estimation software was custom developed. Experimental evaluation of the system determined both its static and dynamic performance. The system can determine close-range relative position and attitude at rates above 10 samples per second, with errors always below 10 mm and 2 deg.
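The mutual-tracking measurement in a system of this kind reduces to a perspective-n-point (PnP) solve once the LED centroids are extracted from the image. The sketch below uses OpenCV; the marker geometry, detected centroids, and camera intrinsics are invented placeholders, not the authors' actual values:

    import numpy as np
    import cv2

    # 3D LED positions on the target face (meters, target body frame) -- assumed
    led_body = np.array([[-0.04, -0.04, 0.0], [0.04, -0.04, 0.0],
                         [0.04, 0.04, 0.0], [-0.04, 0.04, 0.0]])
    # matching 2D centroids detected in the camera image (pixels) -- assumed
    led_image = np.array([[310.2, 255.1], [342.8, 254.6],
                          [343.1, 287.9], [309.7, 288.4]])
    K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole camera intrinsics
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    ok, rvec, tvec = cv2.solvePnP(led_body, led_image, K, None)
    if ok:
        print("relative position [m]:", tvec.ravel())  # camera-frame translation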
Bonsai: an event-based framework for processing and controlling data streams
Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.
2015-01-01
The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861
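The dataflow style described above can be sketched in a few lines of plain Python: independent processing stages are composed into a pipeline that reacts to each incoming sample. This is only an illustration of the event-based idea, not Bonsai's actual (visual, Rx-based) API:

    from typing import Iterable, Iterator

    def threshold(stream: Iterable[float], level: float) -> Iterator[bool]:
        for sample in stream:               # e.g. an incoming voltage trace
            yield sample > level

    def rising_edges(stream: Iterable[bool]) -> Iterator[int]:
        prev = False
        for i, state in enumerate(stream):  # emit the index of each onset event
            if state and not prev:
                yield i
            prev = state

    samples = [0.1, 0.9, 0.8, 0.2, 1.1, 0.3]
    print(list(rising_edges(threshold(samples, 0.5))))  # -> [1, 4]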
Clabough, Erin B D; Clabough, Seth W
2016-01-01
Scientific writing is an important communication and learning tool in neuroscience, yet it is a skill not adequately cultivated in introductory undergraduate science courses. Proficient, confident scientific writers are produced by providing specific knowledge about the writing process, combined with a clear student understanding about how to think about writing (also known as metacognition). We developed a rubric for evaluating scientific papers and assessed different methods of using the rubric in inquiry-based introductory biology classrooms. Students were either 1) given the rubric alone, 2) given the rubric, but also required to visit a biology subject tutor for paper assistance, or 3) asked to self-grade paper components using the rubric. Students who were required to use a peer tutor had more negative attitudes towards scientific writing, while students who used the rubric alone reported more confidence in their science writing skills by the conclusion of the semester. Overall, students rated the use of an example paper or grading rubric as the most effective ways of teaching scientific writing, while rating peer review as ineffective. Our paper describes a concrete, simple method of infusing scientific writing into inquiry-based science classes, and provides clear avenues to enhance communication and scientific writing skills in entry-level classes through the use of a rubric or example paper, with the goal of producing students capable of performing at a higher level in upper level neuroscience classes and independent research.
Clabough, Erin B.D.; Clabough, Seth W.
2016-01-01
Scientific writing is an important communication and learning tool in neuroscience, yet it is a skill not adequately cultivated in introductory undergraduate science courses. Proficient, confident scientific writers are produced by providing specific knowledge about the writing process, combined with a clear student understanding about how to think about writing (also known as metacognition). We developed a rubric for evaluating scientific papers and assessed different methods of using the rubric in inquiry-based introductory biology classrooms. Students were either 1) given the rubric alone, 2) given the rubric, but also required to visit a biology subject tutor for paper assistance, or 3) asked to self-grade paper components using the rubric. Students who were required to use a peer tutor had more negative attitudes towards scientific writing, while students who used the rubric alone reported more confidence in their science writing skills by the conclusion of the semester. Overall, students rated the use of an example paper or grading rubric as the most effective ways of teaching scientific writing, while rating peer review as ineffective. Our paper describes a concrete, simple method of infusing scientific writing into inquiry-based science classes, and provides clear avenues to enhance communication and scientific writing skills in entry-level classes through the use of a rubric or example paper, with the goal of producing students capable of performing at a higher level in upper level neuroscience classes and independent research. PMID:27980476
WASP (Write a Scientific Paper) using Excel - 8: t-Tests.
Grech, Victor
2018-06-01
t-Testing is a common component of inferential statistics when comparing two means. This paper explains the central limit theorem and the concept of the null hypothesis as well as types of errors. On the practical side, this paper outlines how different t-tests may be performed in Microsoft Excel, for different purposes, both statically and dynamically, with Excel's functions.
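The paper works entirely in Excel; for contrast, the same two-sample comparison in Python with SciPy (synthetic data; an equal-variance Student's t-test, mirroring Excel's T.TEST(a, b, 2, 2)):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    group_a = rng.normal(loc=5.0, scale=1.0, size=30)  # hypothetical measurements
    group_b = rng.normal(loc=5.6, scale=1.0, size=30)
    t, p = stats.ttest_ind(group_a, group_b)           # two-tailed by default
    print(f"t = {t:.3f}, p = {p:.4f}")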
Program Components | Cancer Prevention Fellowship Program
The Annual Cancer Prevention Fellows' Scientific Symposium is held each fall. The symposium brings together senior fellows, new fellows, and the CPFP staff for a day of scientific exchange in the area of cancer prevention.
Proportional Reasoning: An Essential Component of Scientific Understanding
ERIC Educational Resources Information Center
Hilton, Annette; Hilton, Geoff
2016-01-01
In many scientific contexts, students need to be able to use mathematical knowledge in order to engage in scientific reasoning and problem-solving, and their understanding of scientific concepts relies heavily on their ability to understand and use mathematics in often new or unfamiliar contexts. Not only do science students need high levels of…
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
NASA Astrophysics Data System (ADS)
Guilyardi, E.
2003-04-01
The European Union's PRISM infrastructure project (PRogram for Integrated earth System Modelling) aims at designing a flexible environment to easily assemble and run Earth System Models (http://prism.enes.org). Europe's widely distributed modelling expertise is both a strength and a challenge. Recognizing this, the PRISM project aims at developing an efficient shared modelling software infrastructure for climate scientists, providing them with an opportunity for greater focus on scientific issues, including the necessary scientific diversity (models and approaches). The proposed PRISM system includes 1) the use - or definition - and promotion of scientific and technical standards to increase component modularity, 2) an end-to-end software environment (coupler, user interface, diagnostics) to launch, monitor and analyze complex Earth System Models built around the existing and future community models, 3) testing and quality standards to ensure HPC performance on a variety of platforms and 4) community wide inputs and requirements capture in all stages of system specifications and design through user/developers meetings, workshops and thematic schools. This science driven project, led by 22 institutes* and started December 1st 2001, benefits from a unique gathering of scientific and technical expertise. More than 30 models (both global and regional) have expressed interest to be part of the PRISM system and 6 types of components have been identified: atmosphere, atmosphere chemistry, land surface, ocean, sea ice and ocean biochemistry. Progress and overall architecture design will be presented. * MPI-Met (Coordinator), KNMI (co-coordinator), MPI-M&D, Met Office, University of Reading, IPSL, Meteo-France, CERFACS, DMI, SMHI, NERSC, ETH Zurich, INGV, MPI-BGC, PIK, ECMWF, UCL-ASTR, NEC, FECIT, SGI, SUN, CCRLE
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Govindaraju, Madhusudhan
Advanced Scientific Computing Research, Computer Science, FY 2010 Report. Center for Technology for Advanced Scientific Component Software: Distributed CCA. State University of New York, Binghamton, NY, 13902.

Summary: The overall objective of Binghamton's involvement is to work on enhancements of the CCA environment, motivated by the applications and research initiatives discussed in the proposal. This year we are working on re-focusing our design and development efforts to develop proof-of-concept implementations that have the potential to significantly impact scientific components. We worked on developing parallel implementations of non-hydrostatic code and on a model coupling interface for biogeochemical computations coded in MATLAB. We also worked on the design and implementation of modules that will be required for the emerging MapReduce model to be effective for scientific applications. Finally, we focused on optimizing the processing of scientific datasets on multi-core processors.

Research Details: We worked on the following research projects that we are applying to CCA-based scientific applications.

1. Non-Hydrostatic Hydrodynamics: Non-hydrostatic hydrodynamics are significantly more accurate at modeling internal waves that may be important in lake ecosystems. Non-hydrostatic codes, however, are significantly more computationally expensive, often prohibitively so. We have worked with Chin Wu at the University of Wisconsin to parallelize non-hydrostatic code. We have obtained a maximum speedup of about 26 times. Although this is significant progress, we hope to improve the performance further, such that it becomes a practical alternative to hydrostatic codes.

2. Model coupling for water-based ecosystems: To answer pressing questions about water resources requires that physical models (hydrodynamics) be coupled with biological and chemical models. Most hydrodynamics codes are written in Fortran, however, while most ecologists work in MATLAB. This disconnect creates a great barrier. To address this, we are working on a model coupling interface that will allow biogeochemical computations written in MATLAB to couple with Fortran codes. This will greatly improve the productivity of ecosystem scientists.

3. Low-overhead and elastic MapReduce implementation optimized for memory- and CPU-intensive applications: Since its inception, MapReduce has frequently been associated with Hadoop and large-scale datasets. Its deployment at Amazon in the cloud, and its applications at Yahoo! for large-scale distributed document indexing and database building, among other tasks, have thrust MapReduce to the forefront of the data processing application domain. The applicability of the paradigm, however, extends far beyond its use with data-intensive applications and disk-based systems, and it can also be brought to bear in processing small but CPU-intensive distributed applications. MapReduce, however, carries its own burdens. Through experiments using Hadoop in the context of diverse applications, we uncovered latencies and delay conditions potentially inhibiting the expected performance of a parallel execution in CPU-intensive applications. Furthermore, as it currently stands, MapReduce is favored for data-centric applications, and as such tends to be applied solely to disk-based applications. The paradigm falls short in bringing its novelty to diskless systems dedicated to in-memory applications, and to compute-intensive programs processing much smaller data but requiring intensive computations.

In this project, we focused both on the performance of processing large-scale hierarchical data in distributed scientific applications, and on the processing of smaller but demanding input sizes primarily used in diskless, memory-resident I/O systems. We designed LEMO-MR [1], a low-overhead, elastic MapReduce implementation, configurable for in-memory applications and offering on-demand fault tolerance, optimized for both on-disk and in-memory applications. We conducted experiments to identify not only the necessary components of this model, but also trade-offs and factors to be considered. We have initial results showing the efficacy of our implementation in terms of the potential speedup that can be achieved for representative data sets used by cloud applications. We have quantified the performance gains exhibited by our MapReduce implementation over Apache Hadoop in a compute-intensive environment.

4. Cache performance optimization for processing XML- and HDF-based application data on multi-core processors: It is important to design and develop scientific middleware libraries to harness the opportunities presented by emerging multi-core processors. Implementations of scientific middleware and applications that do not adapt to the programming paradigm when executing on emerging processors can severely impact overall performance. In this project, we focused on the utilization of the L2 cache, which is a critical shared resource on chip multiprocessors (CMP). The access pattern of the shared L2 cache, which depends on how the application schedules and assigns processing work to each thread, can either enhance or hurt the ability to hide memory latency on a multi-core processor. Therefore, while processing scientific datasets such as HDF5, it is essential to conduct fine-grained analysis of cache utilization to inform scheduling decisions in multi-threaded programming. Using the TAU toolkit for performance feedback from dual- and quad-core machines, we conducted performance analysis and made recommendations on how processing threads can be scheduled on multi-core nodes to enhance the performance of a class of scientific applications that requires processing of HDF5 data. In particular, we quantified the gains associated with our adaptations of the Cache-Affinity and Balanced-Set scheduling algorithms to improve L2 cache performance, and hence overall application execution time [2].

References:
1. Zacharia Fadika, Madhusudhan Govindaraju, "MapReduce Implementation for Memory-Based and Processing Intensive Applications", accepted in 2nd IEEE International Conference on Cloud Computing Technology and Science, Indianapolis, USA, Nov 30 - Dec 3, 2010.
2. Rajdeep Bhowmik, Madhusudhan Govindaraju, "Cache Performance Optimization for Processing XML-based Application Data on Multi-core Processors", in proceedings of The 10th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, May 17-20, 2010, Melbourne, Victoria, Australia.

Contact Information: Madhusudhan Govindaraju, Binghamton University, State University of New York (SUNY), mgovinda@cs.binghamton.edu, Phone: 607-777-4904
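The in-memory map/reduce pattern discussed in the report can be illustrated with a minimal word-count sketch in Python; this shows the paradigm only, not LEMO-MR itself:

    from collections import Counter
    from functools import reduce
    from multiprocessing import Pool

    def map_chunk(chunk):
        # map stage: each worker counts words in its in-memory chunk
        return Counter(word for line in chunk for word in line.split())

    def run(lines, workers=4):
        size = max(1, len(lines) // workers)
        chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
        with Pool(workers) as pool:
            partials = pool.map(map_chunk, chunks)               # parallel map
        return reduce(lambda a, b: a + b, partials, Counter())   # reduce/merge

    if __name__ == "__main__":
        print(run(["a b a", "b c", "a"]).most_common())  # [('a', 3), ('b', 2), ('c', 1)]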
NASA Astrophysics Data System (ADS)
Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi
2006-06-01
We explain the simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by astrometric fields in JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Component simulations are needed, of course, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object-Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.
Pang, Hanqing; Wang, Jun; Tang, Yuping; Xu, Huiqin; Wu, Liang; Jin, Yi; Zhu, Zhenhua; Guo, Sheng; Shi, Xuqin; Huang, Shengliang; Sun, Dazheng; Duan, Jin-Ao
2016-11-01
Xin-Sheng-Hua granule, a representative formula for postpartum hemorrhage, has been used clinically to treat postpartum diseases. Its main bioactive components comprise aromatic acids, phthalides, alkaloids, flavonoids, and gingerols among others. To investigate the changes in main bioactive constituents in its seven single herbs before and after compatibility, a rapid, simple, and sensitive method was developed for comparative analysis of 27 main bioactive components by using ultrahigh-performance liquid chromatography with triple quadrupole electrospray tandem mass spectrometry for the first time. The sufficient separation of 27 target constituents was achieved on a Thermo Scientific Hypersil GOLD column (100 mm × 3 mm, 1.9 μm) within 20 min under the optimized chromatographic conditions. Compared with the theoretical content, the observed content of each analyte showed remarkable differences in Xin-Sheng-Hua granule except thymine, p-coumaric acid, senkyunolide I, senkyunolide H, and ligustilide; the total contents of 27 components increased significantly, and the content variation degrees for the different components were gingerols > flavonoids > aromatic acids > alkaloids > phthalides. The results could provide a good reference for the quality control of Xin-Sheng-Hua granule and might be helpful to interpret the drug interactions based on variation of bioactive components in formulae.
An Expert Assistant for Computer Aided Parallelization
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit
2004-01-01
A prototype expert system was developed to assist the user in the computer-aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data, the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified, and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate its successful use in full-scale scientific applications.
Ballen, Cissy J; Thompson, Seth K; Blum, Jessamina E; Newstrom, Nicholas P; Cotner, Sehoya
2018-01-01
Course-based undergraduate research experiences (CUREs) are a type of laboratory learning environment associated with a science course, in which undergraduates participate in novel research. According to Auchincloss et al. (CBE Life Sci Educ 2014; 13:29-40), CUREs are distinct from other laboratory learning environments because they possess five core design components, and while national calls to improve STEM education have led to an increase in CURE programs nationally, less work has specifically focused on which core components are critical to achieving desired student outcomes. Here we use a backward elimination experimental design to test the importance of two CURE components for a population of non-biology majors: the experience of discovery and the production of data broadly relevant to the scientific or local community. We found nonsignificant impacts of either laboratory component on students' academic performance, science self-efficacy, sense of project ownership, and perceived value of the laboratory experience. Our results challenge the assumption that all core components of CUREs are essential to achieve positive student outcomes when applied at scale.
What Matters in Scientific Explanations: Effects of Elaboration and Content
Rottman, Benjamin M.; Keil, Frank C.
2011-01-01
Given the breadth and depth of available information, determining which components of an explanation are most important is a crucial process for simplifying learning. Three experiments tested whether people believe that components of an explanation with more elaboration are more important. In Experiment 1, participants read separate and unstructured components that comprised explanations of real-world scientific phenomena, rated the components on their importance for understanding the explanations, and drew graphs depicting which components elaborated on which other components. Participants gave higher importance scores for components that they judged to be elaborated upon by other components. Experiment 2 demonstrated that experimentally increasing the amount of elaboration of a component increased the perceived importance of the elaborated component. Furthermore, Experiment 3 demonstrated that elaboration increases the importance of the elaborated information by providing insight into understanding the elaborated information; information that was too technical to provide insight into the elaborated component did not increase the importance of the elaborated component. While learning an explanation, people piece together the structure of elaboration relationships between components and use the insight provided by elaboration to identify important components. PMID:21924709
Software for Planning Scientific Activities on Mars
NASA Technical Reports Server (NTRS)
Ai-Chang, Mitchell; Bresina, John; Jonsson, Ari; Hsu, Jennifer; Kanefsky, Bob; Morris, Paul; Rajan, Kanna; Yglesias, Jeffrey; Charest, Len; Maldague, Pierre
2003-01-01
Mixed-Initiative Activity Plan Generator (MAPGEN) is a ground-based computer program for planning and scheduling the scientific activities of instrumented exploratory robotic vehicles, within the limitations of available resources onboard the vehicle. MAPGEN is a combination of two prior software systems: (1) an activity-planning program, APGEN, developed at NASA's Jet Propulsion Laboratory and (2) the Europa planner/scheduler from NASA Ames Research Center. MAPGEN performs all of the following functions: Automatic generation of plans and schedules for scientific and engineering activities; Testing of hypotheses (or what-if analyses of various scenarios); Editing of plans; Computation and analysis of resources; and Enforcement and maintenance of constraints, including resolution of temporal and resource conflicts among planned activities. MAPGEN can be used in either of two modes: one in which the planner/scheduler is turned off and only the basic APGEN functionality is utilized, or one in which both component programs are used to obtain the full planning, scheduling, and constraint-maintenance functionality.
Machine learning based job status prediction in scientific clusters
Yoo, Wucherl; Sim, Alex; Wu, Kesheng
2016-09-01
Large high-performance computing systems are built with increasing numbers of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, these trends are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict the unsuccessful job statuses from the monitored ongoing job executions in 99.8% of cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy is sufficiently high that it can be used to trigger mitigation procedures for predicted failures.
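A hedged sketch of the approach described above using scikit-learn; the features and labels are synthetic placeholders, not the paper's monitored job data:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import precision_score, recall_score

    rng = np.random.default_rng(7)
    n = 5000
    X = np.column_stack([
        rng.integers(1, 512, n),       # requested cores (assumed feature)
        rng.exponential(3600.0, n),    # requested walltime in seconds (assumed)
        rng.exponential(8.0, n),       # memory per core in GB (assumed)
    ])
    y = (X[:, 2] > 16) & (rng.random(n) < 0.8)  # synthetic "unsuccessful" label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    print("recall:", recall_score(y_te, pred), "precision:", precision_score(y_te, pred))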
Machine learning based job status prediction in scientific clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Wucherl; Sim, Alex; Wu, Kesheng
Large high-performance computing systems are built with increasing numbers of components, with more CPU cores, more memory, and more storage space. At the same time, scientific applications have been growing in complexity. Together, these trends are leading to more frequent unsuccessful job statuses on HPC systems. From measured job statuses, 23.4% of CPU time was spent on unsuccessful jobs. Here, we set out to study whether these unsuccessful job statuses could be anticipated from known job characteristics. To explore this possibility, we have developed a job status prediction method for the execution of jobs on scientific clusters. The Random Forests algorithm was applied to extract and characterize the patterns of unsuccessful job statuses. Experimental results show that our method can predict the unsuccessful job statuses from the monitored ongoing job executions in 99.8% of cases, with 83.6% recall and 94.8% precision. Lastly, this prediction accuracy is sufficiently high that it can be used to trigger mitigation procedures for predicted failures.
Coronagraph observations and analyses of the ultraviolet solar corona
NASA Technical Reports Server (NTRS)
Kohl, John L.
1989-01-01
The major activities on the Spartan Ultraviolet Coronal Spectrometer project include both scientific and experimental/technical efforts. In the scientific area, a detailed analysis of the previously reported Doppler dimming of HI Ly-alpha from the July 1982 rocket flight has determined the outflow velocity at 2 solar radii from sun center to be between 153 and 251 km/s at 67 percent confidence. The technical activities include several improvements made to the instrument that will result in enhanced scientific performance or regain a capability that had deteriorated during the delay in the launch date. These include testing and characterizing the detector for OVI radiation, characterizing a serrated occulter at UV and visible wavelengths, fabricating and testing telescope mirrors with improved edges, testing and evaluating a new array detector system, and modifying the slit mask mechanism and installing a mask in the instrument to block the Ly-alpha resonance line when the electron-scattered component is being observed.
Using Authentic Data in High School Earth System Science Research - Inspiring Future Scientists
NASA Astrophysics Data System (ADS)
Bruck, L. F.
2006-05-01
Using authentic data in a science research class is an effective way to teach students the scientific process, problem solving, and communication skills. In Frederick County Public Schools, MD, a course has been developed to hone scientific research skills and inspire interest in careers in science and technology. The Earth System Science Research course provides eleventh- and twelfth-grade students an opportunity to study Earth System Science using the latest information developed through current technologies. The systems approach of this course helps students understand the complexity and interrelatedness of the Earth system. Consequently, students appreciate the dynamics of local and global environments as parts of a complex system. This course is an elective offering designed to engage students in the study of the atmosphere, biosphere, cryosphere, geosphere, and hydrosphere. It allows students to utilize skills and processes gained from previous science courses to study the physical, chemical, and biological aspects of the Earth system. The research component makes up fifty percent of course time, in which students perform independent research on the interactions within the Earth system. Students are required to produce a scientific presentation to communicate the results of their research. Posters are then presented to the scientific community. Some of these presentations have led to internships and other scientific opportunities.
NASA Astrophysics Data System (ADS)
Wang, Chia-Yu
2015-01-01
This study investigated the effects of scaffolds as cognitive prompts and as metacognitive evaluation on seventh-grade students' growth of content knowledge and construction of scientific explanations in five inquiry-based biology activities. Students' scores on multiple-choice pretest and posttest and worksheets for five inquiry-based activities were analyzed. The results show that the students' content knowledge in all conditions significantly increased from the pretest to posttest. Incorporating cognitive prompts with the explanation scaffolds better facilitated knowledge integration and resulted in greater learning gains of content knowledge and better quality evidence and reasoning. The metacognitive evaluation instruction improved all explanation components, especially claims and reasoning. This metacognitive approach also significantly reduced students' over- or underestimation during peer-evaluation by refining their internal standards for the quality of scientific explanations. The ability to accurately evaluate the quality of explanations was strongly associated with better performance on explanation construction. The cognitive prompts and metacognitive evaluation instruction address different aspects of the challenges faced by the students, and show different effects on the enhancement of content knowledge and the quality of scientific explanations. Future directions and suggestions are provided for improving the design of the scaffolds to facilitate the construction of scientific explanations.
The development of scientific thinking in elementary school: a comprehensive inventory.
Koerber, Susanne; Mayer, Daniela; Osterhaus, Christopher; Schwippert, Knut; Sodian, Beate
2015-01-01
The development of scientific thinking was assessed in 1,581 second, third, and fourth graders (8-, 9-, 10-year-olds) based on a conceptual model that posits developmental progression from naïve to more advanced conceptions. Using a 66-item scale, five components of scientific thinking were addressed, including experimental design, data interpretation, and understanding the nature of science. Unidimensional and multidimensional item response theory analyses supported the instrument's reliability and validity and suggested that the multiple components of scientific thinking form a unitary construct, independent of verbal or reasoning skills. A partial credit model gave evidence for a hierarchical developmental progression. Across each grade transition, advanced conceptions increased while naïve conceptions decreased. Independent effects of intelligence, schooling, and parental education on scientific thinking are discussed. © 2014 The Authors. Child Development © 2014 Society for Research in Child Development, Inc.
Future observations of and missions to Mercury
NASA Technical Reports Server (NTRS)
Stern, Alan S.; Vilas, Faith
1988-01-01
Key scientific objectives of Mercury exploration are discussed, and the methods by which remote observations of Mercury can be carried out from Earth and from space are examined. Attention is also given to the scientific rationale and technical concepts for missions to Mercury. It is pointed out that multiple Venus-Mercury encounter trajectories exist which, through successive gravity assists, reduce mission performance requirements to levels deliverable by available systems, such as Titan-Centaur, Atlas-Centaur, and Shuttle/TOS. It is shown that a single launch in July 1994, using a Titan-Centaur combination, could place a 1477-kg payload into orbit around Mercury. The components of a Mercury-orbiter payload designed to study surface geology and geochemistry, atmospheric composition and structure, the local particle and fields environment, and solid-body rotation dynamics are listed.
Ecological prediction with nonlinear multivariate time-frequency functional data models
Yang, Wen-Hsi; Wikle, Christopher K.; Holan, Scott H.; Wildhaber, Mark L.
2013-01-01
Time-frequency analysis has become a fundamental component of many scientific inquiries. Due to improvements in technology, the amount of high-frequency signals that are collected for ecological and other scientific processes is increasing at a dramatic rate. In order to facilitate the use of these data in ecological prediction, we introduce a class of nonlinear multivariate time-frequency functional models that can identify important features of each signal as well as the interaction of signals corresponding to the response variable of interest. Our methodology is of independent interest and utilizes stochastic search variable selection to improve model selection and performs model averaging to enhance prediction. We illustrate the effectiveness of our approach through simulation and by application to predicting spawning success of shovelnose sturgeon in the Lower Missouri River.
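The Bayesian functional models themselves are beyond a short example, but the time-frequency feature extraction step the abstract builds on can be sketched with SciPy; the sampling rate, signal, and feature reduction below are illustrative assumptions:

```python
# Illustrative only: extract time-frequency features from a high-frequency
# signal, the kind of input the abstract's functional models operate on.
# SciPy's spectrogram stands in for whatever transform the authors used.
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                       # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

f, tt, Sxx = spectrogram(sig, fs=fs, nperseg=256)
# Collapse the spectrogram into a feature vector (mean power per band),
# which could then enter a regression model for the response of interest.
features = Sxx.mean(axis=1)
print(features.shape)             # one feature per frequency bin
```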
The Intersection of Information and Science Literacy
ERIC Educational Resources Information Center
Klucevsek, Kristin
2017-01-01
To achieve higher science literacy, both students and the public require discipline-specific information literacy in the sciences. Scientific information literacy is a core component of the scientific process. In addition to teaching how to find and evaluate resources, scientific information literacy should include teaching the process of…
Scientific Culture and School Culture: Epistemic and Procedural Components.
ERIC Educational Resources Information Center
Jimenez-Aleixandre, Maria Pilar; Diaz de Bustamante, Joaquin; Duschl, Richard A.
This paper discusses the elaboration and application of "scientific culture" categories to the analysis of students' discourse while solving problems in inquiry contexts. Scientific culture means the particular domain culture of science, the culture of science practitioners. The categories proposed include both epistemic operations and…
NASA Astrophysics Data System (ADS)
Carnelli, Ian; Galvez, Andres; Mellab, Karim
2016-04-01
The Asteroid Impact Mission (AIM) is a small and innovative mission of opportunity, currently under study at ESA, intended to demonstrate new technologies for future deep-space missions while addressing planetary defense objectives and performing, for the first time, detailed investigations of a binary asteroid system. It leverages a unique opportunity provided by asteroid 65803 Didymos, set for an Earth close encounter in October 2022, to achieve a fast mission return only two years after launch in October/November 2020. AIM is also ESA's contribution to an international cooperation between ESA and NASA called the Asteroid Impact Deflection Assessment (AIDA), consisting of two mission elements: the NASA Double Asteroid Redirection Test (DART) mission and the AIM rendezvous spacecraft. The primary goals of AIDA are to test our ability to perform a spacecraft impact on a near-Earth asteroid and to measure and characterize the deflection caused by the impact. The two mission components of AIDA, DART and AIM, are each independently valuable, but when combined they provide a greatly increased scientific return. The DART hypervelocity impact on the secondary asteroid will alter the binary orbit period, which will also be measured by means of lightcurve observations from Earth-based telescopes. AIM, in turn, will perform detailed characterization before and after the impact, shedding light on the dependence of the momentum transfer on the asteroid's bulk density, porosity, and surface and internal properties. AIM will gather data describing the fragmentation and restructuring processes as well as the ejection of material, and relate them to parameters that can only be obtained from ground-based observations. Collisional events are of great importance in the formation and evolution of planetary systems, our own Solar System, and planetary rings. The AIDA scenario will provide a unique opportunity to observe a collision event directly in space, and simultaneously from ground-based optical and radar facilities. For the first time, an impact experiment at asteroid scale will be performed with accurate knowledge of the precise impact conditions and of the impact outcome, together with information on the physical properties of the target, ultimately validating at appropriate scales our knowledge of the impact process and impact simulations. AIM's important technology demonstration component includes a deep-space optical communication terminal and an inter-satellite network with two CubeSats deployed in the vicinity of the Didymos system and a lander on the surface of the secondary. To achieve a low-cost objective, AIM's technology and scientific payload are being combined to support both close-proximity navigation and scientific investigations. AIM will demonstrate the capability to achieve a small spacecraft design with a very large technological and scientific mission return.
I/O-Efficient Scientific Computation Using TPIE
NASA Technical Reports Server (NTRS)
Vengroff, Darren Erik; Vitter, Jeffrey Scott
1996-01-01
In recent years, input/output (I/O)-efficient algorithms for a wide variety of problems have appeared in the literature. However, systems specifically designed to assist programmers in implementing such algorithms have remained scarce. TPIE is a system designed to support I/O-efficient paradigms for problems from a variety of domains, including computational geometry, graph algorithms, and scientific computation. The TPIE interface frees programmers from having to deal not only with explicit read and write calls but also with the complex memory management that must be performed for I/O-efficient computation. In this paper we discuss applications of TPIE to problems in scientific computation. We discuss algorithmic issues underlying the design and implementation of the relevant components of TPIE and present performance results of programs written to solve a series of benchmark problems using our current TPIE prototype. Some of the benchmarks we present are based on the NAS parallel benchmarks while others are of our own creation. We demonstrate that the central processing unit (CPU) overhead required to manage I/O is small and that even with just a single disk, the I/O overhead of I/O-efficient computation ranges from negligible to the same order of magnitude as CPU time. We conjecture that if we use a number of disks in parallel this overhead can be all but eliminated.
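A toy illustration of the external-memory pattern TPIE supports, sorting a file in fixed-size runs and then merging streams; this is a hedged Python sketch, not TPIE's actual C++ interface:

```python
# Sort a file larger than memory: sort fixed-size runs in memory, write each
# run to a temporary file, then stream a k-way merge over the runs.
# Assumes newline-terminated records; all names here are illustrative.
import heapq
import os
import tempfile

def external_sort(path, out_path, run_size=100_000):
    runs = []
    with open(path) as f:
        while True:
            chunk = [line for _, line in zip(range(run_size), f)]
            if not chunk:
                break
            chunk.sort()                      # in-memory sort of one run
            tmp = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
            tmp.writelines(chunk)
            tmp.close()
            runs.append(tmp.name)
    with open(out_path, "w") as out:
        files = [open(r) for r in runs]
        out.writelines(heapq.merge(*files))   # streaming k-way merge
        for fh in files:
            fh.close()
    for r in runs:
        os.remove(r)
```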
The PRO Instructional Strategy in the Construction of Scientific Explanations
ERIC Educational Resources Information Center
Tang, Kok-Sing
2015-01-01
This article presents an instructional strategy called Premise-Reasoning-Outcome (PRO) designed to support students in the construction of scientific explanations. Informed by the philosophy of science and linguistic studies of science, the PRO strategy involves identifying three components of a scientific explanation: (i) premise--an accepted…
Creativity, visualization abilities, and visual cognitive style.
Kozhevnikov, Maria; Kozhevnikov, Michael; Yu, Chen Jiao; Blazhenkova, Olesya
2013-06-01
Despite the recent evidence for a multi-component nature of both visual imagery and creativity, there have been no systematic studies on how the different dimensions of creativity and imagery might interrelate. The main goal of this study was to investigate the relationship between different dimensions of creativity (artistic and scientific) and dimensions of visualization abilities and styles (object and spatial). In addition, we compared the contributions of object and spatial visualization abilities versus corresponding styles to scientific and artistic dimensions of creativity. Twenty-four undergraduate students (12 females) were recruited for the first study, and 75 additional participants (36 females) were recruited for an additional experiment. Participants were administered a number of object and spatial visualization abilities and style assessments as well as a number of artistic and scientific creativity tests. The results show that object visualization relates to artistic creativity and spatial visualization relates to scientific creativity, while both are distinct from verbal creativity. Furthermore, our findings demonstrate that style predicts corresponding dimension of creativity even after removing shared variance between style and visualization ability. The results suggest that styles might be a more ecologically valid construct in predicting real-life creative behaviour, such as performance in different professional domains. © 2013 The British Psychological Society.
The development of an intelligent user interface for NASA's scientific databases
NASA Technical Reports Server (NTRS)
Campbell, William J.; Roelofs, Larry H.
1986-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components, the development of an Intelligent User Interface (IUI). The intent of the IUI effort is to develop a friendly and intelligent user interface service that is based on expert systems and natural language processing technologies. This paper presents the design concepts, development approach and evaluation of performance of a prototype Intelligent User Interface Subsystem (IUIS) supporting an operational database.
In-Orbit Performance of the Space Telescope NINA and Galactic Cosmic-Ray Flux Measurements
NASA Astrophysics Data System (ADS)
Bidoli, V.; Canestro, A.; Casolino, M.; De Pascale, M. P.; Furano, G.; Iannucci, A.; Morselli, A.; Picozza, P.; Reali, E.; Sparvoli, R.; Bakaldin, A.; Galper, A.; Koldashov, S.; Korotkov, M.; Leonov, A.; Mikhailov, V.; Murashov, A.; Voronov, S.; Boezio, M.; Bonvicini, V.; Cirami, R.; Vacchi, A.; Zampa, N.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Ciacio, F.; Circella, M.; De Marzo, C.; Adriani, O.; Papini, P.; Piccardi, S.; Spillantini, P.; Straulino, S.; Bartalucci, S.; Mazzenga, G.; Ricci, M.; Castellini, G.
2001-02-01
The NINA apparatus, on board the Russian satellite Resurs-01 No. 4, has been in polar orbit since 1998 July 10, at an altitude of 840 km. Its main scientific task is to study the Galactic, solar, and anomalous components of cosmic rays in the energy interval 10-200 MeV/nucleon. In this paper we present a description of the instrument and its basic operating modes. Measurements of Galactic cosmic-ray spectra will also be shown.
The Common Data Acquisition Platform in the Helmholtz Association
NASA Astrophysics Data System (ADS)
Kaever, P.; Balzer, M.; Kopmann, A.; Zimmer, M.; Rongen, H.
2017-04-01
Various centres of the German Helmholtz Association (HGF) started in 2012 to develop a modular data acquisition (DAQ) platform, covering the entire range from detector readout to data transfer into parallel computing environments. This platform integrates generic hardware components like the multi-purpose HGF-Advanced Mezzanine Card or a smart scientific camera framework, adding user value with Linux drivers and board support packages. Technically, the scope comprises the DAQ chain from FPGA modules to computing servers, notably front-end electronics interfaces, microcontrollers, and GPUs with their software, plus high-performance data transmission links. The core idea is a generic and component-based approach, enabling the implementation of specific experiment requirements with low effort. This so-called DTS-platform will support standards like MTCA.4 in both hardware and software to ensure compatibility with commercial components. Its capability to deploy on other crate standards or FPGA boards with PCI Express or Ethernet interfaces remains an essential feature. Competences of the participating centres are coordinated in order to provide a solid technological basis for both research topics in the Helmholtz Programme "Matter and Technology": "Detector Technology and Systems" and "Accelerator Research and Development". The DTS-platform aims at reducing costs and development time and will ensure access to the latest technologies for the collaboration. Due to its flexible approach, it has the potential to be applied in other scientific programs.
Binary pressure-sensitive paint measurements using miniaturised, colour, machine vision cameras
NASA Astrophysics Data System (ADS)
Quinn, Mark Kenneth
2018-05-01
Recent advances in machine vision technology and capability have led to machine vision cameras becoming applicable for scientific imaging. This study aims to demonstrate the applicability of machine vision colour cameras for the measurement of dual-component pressure-sensitive paint (PSP). The presence of a second luminophore component in the PSP mixture significantly reduces its inherent temperature sensitivity, increasing its applicability at low speeds. All of the devices tested are smaller than the cooled CCD cameras traditionally used and most are of significantly lower cost, thereby increasing the accessibility of such technology and techniques. Comparisons between three machine vision cameras, a three CCD camera, and a commercially available specialist PSP camera are made on a range of parameters, and a detailed PSP calibration is conducted in a static calibration chamber. The findings demonstrate that colour machine vision cameras can be used for quantitative, dual-component, pressure measurements. These results give rise to the possibility of performing on-board dual-component PSP measurements in wind tunnels or on real flight/road vehicles.
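A hedged sketch of a dual-component (ratiometric) PSP calibration of the kind performed in a static chamber, assuming a linear Stern-Volmer-type fit; all coefficient values and data points are invented:

```python
# Hedged sketch of a Stern-Volmer-type calibration for binary PSP: the ratio
# of the reference channel to the pressure-sensitive channel is fit as a
# function of pressure. Data and coefficients here are illustrative only.
import numpy as np

# Calibration-chamber data (invented): pressure ratio P/Pref and the
# intensity ratio Iref/I taken from the two colour channels.
p_ratio = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
i_ratio = np.array([0.74, 0.88, 1.00, 1.13, 1.27])

# Fit Iref/I = A + B * (P/Pref); a quadratic term is often added in practice.
B, A = np.polyfit(p_ratio, i_ratio, 1)

def pressure_from_intensity(i_rat, p_ref):
    """Invert the linear Stern-Volmer fit to recover pressure."""
    return p_ref * (i_rat - A) / B

print(pressure_from_intensity(1.05, p_ref=101.3))  # kPa, illustrative
```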
The Satellite Telescope Nina for Nuclear and Isotopic Investigations in Space
NASA Astrophysics Data System (ADS)
Circella, M.; Bidoli, V.; Casolino, M.; de Pascale, M. P.; Morselli, A.; Furano, G.; Picozza, P.; Scoscini, A.; Sparvoli, R.; Barbiellini, G.; Bonvicini, W.; Cirami, R.; Schiavon, P.; Vacchi, A.; Zampa, N.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Ciacio, F.; de Marzo, C.; Bartalucci, S.; Giuntoli, S.; Ricci, M.; Papini, P.; Piccardi, S.; Spillantini, P.; Bakaldin, A.; Batishev, A.; Galper, A. M.; Koldashov, S.; Mikhailov, V.; Murashov, A.; Voronov, S.; Boezio, M.
2000-09-01
NINA is a satellite silicon detector designed to perform measurements of the nuclear and isotopic composition of the galactic and anomalous components of cosmic rays, as well as of the energetic particles associated with solar flares. It has been orbiting the Earth onboard the Russian satellite Resource 01 n. 4 since July 1998. It can perform nuclear discrimination from hydrogen to iron as well as isotopic analyses at least up to the beryllium isotopes in a large energy range. NINA is the first step of the wide scientific program WiZard-RIM, which includes the design and deployment of the PAMELA magnet spectrometer.
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert
2013-04-20
Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it continues to do today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort, working directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.
[Law and educational components of patient's safety in surgery].
Sazhin, V P; Karsanov, A M; Maskin, S S
2018-01-01
To evaluate the legal and educational components of patient safety (PS) in surgery. In order to analyze the complex causes of adverse outcomes in surgery, we interviewed 110 surgeons, 42 emergency physicians, and 25 health care managers. The main focus was on assessing the legal and educational components of PS. The study revealed significant professional shortcomings in the legal dimension of PS and low educational and motivational activity among physicians of all specialties. The multi-faceted nature of the PS problem requires multidisciplinary training of modern surgeons, not only in the knowledge of key risk factors for adverse outcomes but also in meeting the non-medical expectations of patients. For numerous objective reasons, the Russian surgical school should have the opportunity not to blindly copy the experience of foreign colleagues, but to scientifically substantiate the development of its own national safety system, both for surgical patients and for medical workers themselves.
Piekny, Jeanette; Maehler, Claudia
2013-06-01
According to Klahr's (2000, 2005; Klahr & Dunbar, 1988) Scientific Discovery as Dual Search model, inquiry processes require three cognitive components: hypothesis generation, experimentation, and evidence evaluation. The aim of the present study was to investigate (a) when the ability to evaluate perfect covariation, imperfect covariation, and non-covariation evidence emerges, (b) when experimentation emerges, (c) when hypothesis generation skills emerge, and (d), whether these abilities develop synchronously during childhood. We administered three scientific reasoning tasks referring to the three components to 223 children of five age groups (from age 4.0 to 13.5 years). Our results show that the three cognitive components of domain-general scientific reasoning emerge asynchronously. The development of domain-general scientific reasoning begins with the ability to handle unambiguous data, progresses to the interpretation of ambiguous data, and leads to a flexible adaptation of hypotheses according to the sufficiency of evidence. When children understand the relation between the level of ambiguity of evidence and the level of confidence in hypotheses, the ability to differentiate conclusive from inconclusive experiments accompanies this development. Implications of these results for designing science education concepts for young children are briefly discussed. © 2012 The British Psychological Society.
Predicting Cost/Performance Trade-Offs for Whitney: A Commodity Computing Cluster
NASA Technical Reports Server (NTRS)
Becker, Jeffrey C.; Nitzberg, Bill; VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)
1997-01-01
Recent advances in low-end processor and network technology have made it possible to build a "supercomputer" out of commodity components. We develop simple models of the NAS Parallel Benchmarks version 2 (NPB 2) to explore the cost/performance trade-offs involved in building a balanced parallel computer supporting a scientific workload. We develop closed form expressions detailing the number and size of messages sent by each benchmark. Coupling these with measured single processor performance, network latency, and network bandwidth, our models predict benchmark performance to within 30%. A comparison based on total system cost reveals that current commodity technology (200 MHz Pentium Pros with 100baseT Ethernet) is well balanced for the NPBs up to a total system cost of around $1,000,000.
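A generic sketch of the closed-form cost model the abstract describes, combining compute time with per-message latency and bandwidth terms; all numbers are placeholders, not the paper's measured values:

```python
# A generic sketch of a closed-form performance model of this kind:
# predicted time = compute time + per-message latency + bytes / bandwidth.
def predicted_time(flops, flop_rate, n_msgs, msg_bytes, latency, bandwidth):
    compute = flops / flop_rate
    comm = n_msgs * latency + (n_msgs * msg_bytes) / bandwidth
    return compute + comm

# Example: 10 GFLOP of work at 200 MFLOP/s, 1e4 messages of 8 KB each,
# 100 us latency, 100baseT-class bandwidth (~12.5 MB/s). Placeholder values.
t = predicted_time(flops=1e10, flop_rate=2e8,
                   n_msgs=1e4, msg_bytes=8192,
                   latency=1e-4, bandwidth=12.5e6)
print(f"{t:.1f} s predicted")
```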
Holzhauser, Thomas; Ree, Ronald van; Poulsen, Lars K; Bannon, Gary A
2008-10-01
There is detailed guidance on how to perform bioinformatic analyses and enzymatic degradation studies for genetically modified crops under consideration for approval by regulatory agencies; however, there is no consensus in the scientific community on the details of how to perform IgE serum studies. IgE serum studies are an important safety component to acceptance of genetically modified crops when the introduced protein is novel, the introduced protein is similar to known allergens, or the crop is allergenic. In this manuscript, we describe the characteristics of the reagents, validation of assay performance, and data analysis necessary to optimize the information obtained from serum testing of novel proteins and genetically modified (GM) crops and to make results more accurate and comparable between different investigations.
Falavigna, Asdrubal; Martins Filho, Délio Eulálio; Avila, José María Jiménez; Guyot, Juan Pablo; Gonzáles, Alvaro Silva; Riew, Daniel K
2015-07-01
The emancipatory nature of education requires research as its fundamental base, because physicians can only improve their skills and knowledge through enquiry. The number and quality of scientific publications by Latin-American spine surgeons found in the Medline database were low between 2000 and 2011. Nevertheless, the research bank survey of AOSpine Latin America (AOSLA) members showed that 96% of responders were very interested in and motivated to perform scientific research. The research officer of AOSLA, together with the Country Council and the AOSpine Research Commission, established a competency-based curriculum to improve understanding of what is necessary to produce research and the best methods to achieve this goal. The research curriculum was divided into four main components: (1) research educational plan, (2) performing research, (3) technical and professional support, and (4) assessment. The competences, learning outcomes, and a syllabus on research knowledge were developed to enable the participants to understand and perform investigations effectively. The eLearning module was designed to improve the competences needed to access, evaluate, and use scientific information available in the main databases efficiently. Research courses were given as an isolated activity four times in Brazil and Mexico and as precourse activities six times in Brazil, Mexico, and Peru. The result was an increased number of articles published and works presented at congresses. The project of education in research can be effectively disseminated and applied across regions, students, and specialties.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
Köver, Hania; Wirt, Stacey E; Owens, Melinda T; Dosmann, Andrew J
2014-01-01
Learning and practicing scientific inquiry is an essential component of a STEM education, but it is often difficult to teach to novices or those outside of a laboratory setting. To promote scientific thinking in a freshmen introductory neuroscience course without a lab component, we developed a series of learning activities and assignments designed to foster scientific thinking through the use of scientific grant proposals. Students wrote three short grant proposals on topics ranging from molecular to cognitive neuroscience during a 10-week class (one quarter). We made this challenging and advanced task feasible for novice learners through extensive instructional scaffolding, opportunity for practice, and frequent peer and instructor feedback. Student and instructor reports indicate that the assignments were highly intellectually engaging and that they promoted critical thinking, a deeper understanding of neuroscience material, and effective written communication skills. Here we outline the mechanics of the assignment, student and instructor impressions of learning outcomes, and the advantages and disadvantages of implementing this approach.
NASA Technical Reports Server (NTRS)
Tarbell, Theodore D.; Topka, Kenneth P.
1992-01-01
The definition phase of a scientific study of active regions on the Sun by balloon flight of a former Spacelab instrument, the Solar Optical Universal Polarimeter (SOUP), is described. SOUP is an optical telescope with image stabilization, a tunable filter, and various cameras. After the flight phase of the program was cancelled due to budgetary problems, scientific and engineering studies relevant to future balloon experiments of this type were completed. High-resolution observations of the Sun were obtained using SOUP components at the Swedish Solar Observatory in the Canary Islands. These were analyzed and published in studies of solar magnetic fields and active regions. In addition, testing of low-voltage piezoelectric transducers was performed, which showed they were appropriate for use in image stabilization on a balloon.
The Neutron Star Interior Composition Explorer (NICER): Design and Development
NASA Technical Reports Server (NTRS)
Gendreau, Keith C.; Arzoumanian, Zaven; Adkins, Phillip W.; Albert, Cheryl L.; Anders, John F.; Aylward, Andrew T.; Baker, Charles L.; Balsamo, Erin R.; Bamford, William A.; Benegalrao, Suyog S.;
2016-01-01
During 2014 and 2015, NASA's Neutron star Interior Composition Explorer (NICER) mission proceeded successfully through Phase C, Design and Development. An X-ray (0.2-12 keV) astrophysics payload destined for the International Space Station, NICER is manifested for launch in early 2017 on the Commercial Resupply Services SpaceX-11 flight. Its scientific objectives are to investigate the internal structure, dynamics, and energetics of neutron stars, the densest objects in the universe. During Phase C, flight components including optics, detectors, the optical bench, pointing actuators, electronics, and others were subjected to environmental testing and integrated to form the flight payload. A custom-built facility was used to co-align and integrate the X-ray "concentrator" optics and silicon-drift detectors. Ground calibration provided robust performance measures of the optical (at NASA's Goddard Space Flight Center) and detector (at the Massachusetts Institute of Technology) subsystems, while comprehensive functional tests prior to payload-level environmental testing met all instrument performance requirements. We describe here the implementation of NICER's major subsystems, summarize their performance and calibration, and outline the component-level testing that was successfully applied.
The Neutron star Interior Composition Explorer (NICER): design and development
NASA Astrophysics Data System (ADS)
Gendreau, Keith C.; Arzoumanian, Zaven; Adkins, Phillip W.; Albert, Cheryl L.; Anders, John F.; Aylward, Andrew T.; Baker, Charles L.; Balsamo, Erin R.; Bamford, William A.; Benegalrao, Suyog S.; Berry, Daniel L.; Bhalwani, Shiraz; Black, J. Kevin; Blaurock, Carl; Bronke, Ginger M.; Brown, Gary L.; Budinoff, Jason G.; Cantwell, Jeffrey D.; Cazeau, Thoniel; Chen, Philip T.; Clement, Thomas G.; Colangelo, Andrew T.; Coleman, Jerry S.; Coopersmith, Jonathan D.; Dehaven, William E.; Doty, John P.; Egan, Mark D.; Enoto, Teruaki; Fan, Terry W.; Ferro, Deneen M.; Foster, Richard; Galassi, Nicholas M.; Gallo, Luis D.; Green, Chris M.; Grosh, Dave; Ha, Kong Q.; Hasouneh, Monther A.; Heefner, Kristofer B.; Hestnes, Phyllis; Hoge, Lisa J.; Jacobs, Tawanda M.; Jørgensen, John L.; Kaiser, Michael A.; Kellogg, James W.; Kenyon, Steven J.; Koenecke, Richard G.; Kozon, Robert P.; LaMarr, Beverly; Lambertson, Mike D.; Larson, Anne M.; Lentine, Steven; Lewis, Jesse H.; Lilly, Michael G.; Liu, Kuochia Alice; Malonis, Andrew; Manthripragada, Sridhar S.; Markwardt, Craig B.; Matonak, Bryan D.; Mcginnis, Isaac E.; Miller, Roger L.; Mitchell, Alissa L.; Mitchell, Jason W.; Mohammed, Jelila S.; Monroe, Charles A.; Montt de Garcia, Kristina M.; Mulé, Peter D.; Nagao, Louis T.; Ngo, Son N.; Norris, Eric D.; Norwood, Dwight A.; Novotka, Joseph; Okajima, Takashi; Olsen, Lawrence G.; Onyeachu, Chimaobi O.; Orosco, Henry Y.; Peterson, Jacqualine R.; Pevear, Kristina N.; Pham, Karen K.; Pollard, Sue E.; Pope, John S.; Powers, Daniel F.; Powers, Charles E.; Price, Samuel R.; Prigozhin, Gregory Y.; Ramirez, Julian B.; Reid, Winston J.; Remillard, Ronald A.; Rogstad, Eric M.; Rosecrans, Glenn P.; Rowe, John N.; Sager, Jennifer A.; Sanders, Claude A.; Savadkin, Bruce; Saylor, Maxine R.; Schaeffer, Alexander F.; Schweiss, Nancy S.; Semper, Sean R.; Serlemitsos, Peter J.; Shackelford, Larry V.; Soong, Yang; Struebel, Jonathan; Vezie, Michael L.; Villasenor, Joel S.; Winternitz, Luke B.; Wofford, George I.; Wright, Michael R.; Yang, Mike Y.; Yu, Wayne H.
2016-07-01
During 2014 and 2015, NASA's Neutron star Interior Composition Explorer (NICER) mission proceeded successfully through Phase C, Design and Development. An X-ray (0.2-12 keV) astrophysics payload destined for the International Space Station, NICER is manifested for launch in early 2017 on the Commercial Resupply Services SpaceX-11 flight. Its scientific objectives are to investigate the internal structure, dynamics, and energetics of neutron stars, the densest objects in the universe. During Phase C, flight components including optics, detectors, the optical bench, pointing actuators, electronics, and others were subjected to environmental testing and integrated to form the flight payload. A custom-built facility was used to co-align and integrate the X-ray "concentrator" optics and silicon-drift detectors. Ground calibration provided robust performance measures of the optical (at NASA's Goddard Space Flight Center) and detector (at the Massachusetts Institute of Technology) subsystems, while comprehensive functional tests prior to payload-level environmental testing met all instrument performance requirements. We describe here the implementation of NICER's major subsystems, summarize their performance and calibration, and outline the component-level testing that was successfully applied.
NASA Astrophysics Data System (ADS)
Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.
2017-12-01
Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in the application workflows. Translation steps can introduce errors and misrepresentations of data, slow execution, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained, composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of bytes to be read as the number of computations it performs. When several components' executions are coordinated, the overall arithmetic intensity increases, leading to increased efficiency.
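A hedged sketch of the composable, streaming translation components the abstract argues for, using Python generators; the record format and component names are invented:

```python
# Each translation component is a small generator: self-contained, easy to
# unit-test, and composable. Chaining them keeps data streaming through one
# process instead of writing intermediate files.
def subset_region(records, lat_range, lon_range):
    """Keep only records inside a lat/lon box (a spatial subset step)."""
    for r in records:
        if lat_range[0] <= r["lat"] <= lat_range[1] and \
           lon_range[0] <= r["lon"] <= lon_range[1]:
            yield r

def kelvin_to_celsius(records):
    """Unit-conversion component; one minute task, trivially testable."""
    for r in records:
        yield dict(r, temp=r["temp"] - 273.15)

# Compose the pipeline; nothing is materialized until iteration.
forcing = [{"lat": 43.6, "lon": -116.2, "temp": 290.0},
           {"lat": 10.0, "lon": 0.0, "temp": 300.0}]
pipeline = kelvin_to_celsius(subset_region(forcing, (40, 49), (-125, -110)))
for rec in pipeline:
    print(rec)
```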
NASA Technical Reports Server (NTRS)
Blanchard, M. B.; Oberbeck, V. R.; Bunch, T. E.; Reynolds, R. T.; Canning, T. N.; Jackson, R. W.
1976-01-01
The feasibility of employing penetrators for exploring Mars was examined. Eight areas of interest for key scientific experiments were identified. These include: seismic activity, imaging, geochemistry, water measurement, heatflow, meteorology, magnetometry, and biochemistry. In seven of the eight potential experiment categories this year's progress included: conceptual design, instrument fabrication, instrument performance evaluation, and shock loading of important components. Most of the components survived deceleration testing with negligible performance changes. Components intended to be placed inside the penetrator forebody were tested up to 3,500 g and components intended to be placed on the afterbody were tested up to 21,000 g. A field test program was conducted using tentative Mars penetrator mission constraints. Drop tests were performed at two selected terrestrial analog sites to determine the range of penetration depths for anticipated common Martian materials. Minimum penetration occurred in basalt at Amboy, California. Three full-scale penetrators penetrated 0.4 to 0.9 m into the basalt after passing through 0.3 to 0.5 m of alluvial overburden. Maximum penetration occurred in unconsolidated sediments at McCook, Nebraska. Two full-scale penetrators penetrated 2.5 to 8.5 m of sediment. Impact occurred in two kinds of sediment: loess and layered clay. Deceleration g loads of nominally 2,000 for the forebody and 20,000 for the afterbody did not present serious design problems for potential experiments. Penetrators have successfully impacted into terrestrial analogs of the probable extremes of potential Martian sites.
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2015-12-01
The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
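A toy sketch of the stage-dependency idea behind such workflows: later stages consume earlier stages' outputs, so tasks execute in dependency order. The stage names are loosely drawn from the abstract; the real system uses Pegasus/HTCondor rather than this executor:

```python
# Run workflow stages in topological (dependency) order. Stage names are
# illustrative approximations of the CyberShake pipeline, not its actual DAG.
from graphlib import TopologicalSorter

def run(task):
    print("running", task)

dag = {
    "velocity_model": set(),
    "rupture_variations": set(),
    "sgt_simulation": {"velocity_model"},
    "seismogram_synthesis": {"sgt_simulation", "rupture_variations"},
    "hazard_curve": {"seismogram_synthesis"},
}
for task in TopologicalSorter(dag).static_order():
    run(task)
```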
The Coastal Zone: Man and Nature. An Application of the Socio-Scientific Reasoning Model.
ERIC Educational Resources Information Center
Maul, June Paradise; And Others
The curriculum model described here has been designed by incorporating the socio-scientific reasoning model with a simulation design in an attempt to have students investigate the onshore impacts of Outer Continental Shelf (OCS) gas and oil development. The socio-scientific reasoning model incorporates a logical/physical reasoning component as…
The Development of Scientific Thinking in Elementary School: A Comprehensive Inventory
ERIC Educational Resources Information Center
Koerber, Susanne; Mayer, Daniela; Osterhaus, Christopher; Schwippert, Knut; Sodian, Beate
2015-01-01
The development of scientific thinking was assessed in 1,581 second, third, and fourth graders (8-, 9-, 10-year-olds) based on a conceptual model that posits developmental progression from naïve to more advanced conceptions. Using a 66-item scale, five components of scientific thinking were addressed, including experimental design, data…
NASA Astrophysics Data System (ADS)
Agram, P. S.; Gurrola, E. M.; Lavalle, M.; Sacco, G. F.; Rosen, P. A.
2016-12-01
The InSAR Scientific Computing Environment (ISCE) provides both a modular, flexible, and extensible framework for building software components and applications that work together seamlessly as well as a toolbox for processing InSAR data into higher level geodetic image products from a diverse array of radar satellites and aircraft. ISCE easily scales to serve as the SAR processing engine at the core of the NASA JPL Advanced Rapid Imaging and Analysis (ARIA) Center for Natural Hazards as well as a software toolbox for individual scientists working with SAR data. ISCE is planned as the foundational element in processing NISAR data, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these data. ISCE in ARIA is also a SAR Foundry for development of new processing components and workflows to meet the needs of both large processing centers and individual users. The ISCE framework contains object-oriented Python components layered to construct Python InSAR components that manage legacy Fortran/C InSAR programs. The Python user interface enables both command-line deployment of workflows as well as an interactive "sand box" (the Python interpreter) where scientists can "play" with the data. Recent developments in ISCE include the addition of components to ingest Sentinel-1A SAR data (both stripmap and TOPS-mode) and a new workflow for processing the TOPS-mode data. New components are being developed to exploit polarimetric-SAR data to provide the ecosystem and land-cover/land-use change communities with rigorous and efficient tools to perform multi-temporal, polarimetric and tomographic analyses in order to generate calibrated, geocoded and mosaicked Level-2 and Level-3 products (e.g., maps of above-ground biomass or forest disturbance). ISCE has been downloaded by over 200 users under a license for WinSAR members through the Unavco.org website. Others may apply directly to JPL for a license at download.jpl.nasa.gov.
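A hedged sketch of the layering the abstract describes, a Python component managing a legacy compiled program; the executable name and parameters are invented, not ISCE's actual interfaces:

```python
# Generic pattern: a Python component holds parameters, writes a config,
# and drives a legacy Fortran/C executable. Names here are hypothetical.
import subprocess

class InsarStep:
    """Minimal component: configure and run one compiled processing step."""

    def __init__(self, executable, **params):
        self.executable = executable
        self.params = params

    def run(self, config_path="step.cfg"):
        with open(config_path, "w") as f:
            for key, value in self.params.items():
                f.write(f"{key} = {value}\n")
        # The legacy program reads its settings from the config file.
        subprocess.run([self.executable, config_path], check=True)

# Hypothetical usage (the binary and parameters are placeholders):
# step = InsarStep("resamp_slc", azimuth_looks=4, range_looks=2)
# step.run()
```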
Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.
2014-12-01
The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.
NASA Technical Reports Server (NTRS)
Schmidt, G. K.
1979-01-01
A booms and mechanisms subsystem was designed, developed, and qualified for the geostationary scientific satellite GEOS. Part of this subsystem consists of four axial booms: one pair of 1 m booms and one pair of 2.5 m booms. Each of these booms carries one bird-cage electric field sensor. Alignment accuracy requirements led to a telescopic design. Deployment is performed by pressurized nitrogen. During deployment in orbit, two of these booms showed anomalies, and one of the two deployed only to about 80%. Following this malfunction, a detailed failure investigation was performed, resulting in a design modification of some critical components, such as the release mechanism, the guide sleeves of the telescopic elements, and the pressure system.
Achieving a balance - Science and human exploration
NASA Technical Reports Server (NTRS)
Duke, Michael B.
1992-01-01
An evaluation is made of the opportunities for advancing the scientific understanding of Mars through a research program, conducted under the aegis of NASA's Space Exploration Initiative, which emphasizes the element of human exploration as well as the requisite robotic component. A Mars exploration program that involves such complementary human/robotic components will entail the construction of a closed ecological life-support system, long-duration spacecraft facilities for crews, and the development of extraterrestrial resources; these R&D imperatives will have great subsequent payoffs, both scientific and economic.
Components of the Early Apollo Scientific Experiments Package (EASEP)
1969-07-20
AS11-37-5551 (20 July 1969) --- Two components of the Early Apollo Scientific Experiments Package (EASEP) are seen deployed on the lunar surface in this view photographed from inside the Lunar Module (LM). In the far background is the Passive Seismic Experiment Package (PSEP); and to the right and closer to the camera is the Laser Ranging Retro-Reflector (LR-3). The footprints of Apollo 11 astronauts Neil A. Armstrong and Edwin E. Aldrin Jr. are very distinct in the lunar soil.
Efficient Load Balancing and Data Remapping for Adaptive Grid Calculations
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak
1997-01-01
Mesh adaption is a powerful tool for efficient unstructured-grid computations but causes load imbalance among processors on a parallel machine. We present a novel method to dynamically balance the processor workloads with a global view. This paper presents, for the first time, the implementation and integration of all major components within our dynamic load balancing strategy for adaptive grid calculations. Mesh adaption, repartitioning, processor assignment, and remapping are critical components of the framework that must be accomplished rapidly and efficiently so as not to cause a significant overhead to the numerical simulation. Previous results indicated that mesh repartitioning and data remapping are potential bottlenecks for performing large-scale scientific calculations. We resolve these issues and demonstrate that our framework remains viable on a large number of processors.
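A toy version of the processor-assignment step in such a framework, greedily placing the heaviest mesh partitions on the least-loaded processor (the LPT heuristic); the real strategy also accounts for remapping cost:

```python
# Greedy longest-processing-time assignment: repeatedly give the heaviest
# remaining partition to the least-loaded processor. Weights are illustrative
# stand-ins for post-adaption partition workloads.
import heapq

def assign_partitions(weights, n_procs):
    heap = [(0.0, p) for p in range(n_procs)]   # (current load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for part, w in sorted(enumerate(weights), key=lambda x: -x[1]):
        load, proc = heapq.heappop(heap)
        assignment[part] = proc
        heapq.heappush(heap, (load + w, proc))
    return assignment

print(assign_partitions([5.0, 3.0, 8.0, 2.0, 7.0, 4.0], n_procs=3))
```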
Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas; progress in each is described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e., workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. And workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.
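A hedged sketch of the provenance-collection idea, recording each actor's inputs, outputs, and timing as a workflow runs; Kepler's actual provenance framework is far richer than this decorator:

```python
# Wrap each workflow actor so its invocations are logged to a provenance
# record. The decorator and record format are illustrative only.
import functools
import json
import time

PROVENANCE = []

def traced(actor):
    @functools.wraps(actor)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = actor(*args, **kwargs)
        PROVENANCE.append({"actor": actor.__name__,
                           "inputs": repr((args, kwargs)),
                           "output": repr(result),
                           "seconds": time.time() - start})
        return result
    return wrapper

@traced
def convert_units(x):          # a stand-in for a real workflow actor
    return x * 2.54

convert_units(10)
print(json.dumps(PROVENANCE, indent=2))
```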
Professional development model for science teachers based on scientific literacy
NASA Astrophysics Data System (ADS)
Rubini, B.; Ardianto, D.; Pursitasari, I. D.; Permana, I.
2017-01-01
Scientific literacy is considered a benchmark of the quality of science education in a country. Teachers, as a major component of learning, are at the forefront of building students' scientific literacy skills in the classroom. The primary purpose of this study is the development of a science teacher coaching model based on scientific literacy. In this article we describe teachers' scientific literacy and profile a coaching model for science teachers based on scientific literacy, which is part of a study conducted in the first year. The instruments used in this study consisted of tests, observation sheets, and interview guides. The findings showed that the problem of low scientific literacy affects not only students: science teachers, a major component in the learning process, also performed unsatisfactorily. Teachers' understanding of science is strongly associated with their disciplinary background. Teachers were still weak when explaining scientific phenomena, mainly in material related to environmental concepts. The coaching model generated from this study consists of eight stages and assumes the teacher is an independent learner, so the coaching combines on-site and off-site methods, with off-site time devoted to designed activities.
The space telescope NINA: results of a beam test calibration
NASA Astrophysics Data System (ADS)
Bidoli, V.; Casolino, M.; Pascale, M. P. D.; Morselli, A.; Furano, G.; Picozza, P.; Scoscini, A.; Sparvoli, R.; Barbiellini, G.; Bonvicini, W.; Cirami, R.; Schiavon, P.; Vacchi, A.; Zampa, N.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Ciacio, F.; Castellano, M.; Circella, M.; Marzo, C. D.; Bartalucci, S.; Giuntoli, S.; Ricci, M.; Papini, P.; Piccardi, S.; Spillantini, P.; Bakaldin, A.; Batishev, A.; Galper, A. M.; Koldashov, S.; Korotkov, M.; Mikhailov, V.; Murashov, A.; Voronov, S.; Boezio, M.
1999-03-01
In June 1998 the telescope NINA will be launched into space on board the Russian satellite Resource-01 n.4. The main scientific objective of the mission is the study of the anomalous, galactic, and solar components of the cosmic rays in the energy interval 10-200 MeV/n. The core of the instrument is a silicon detector whose performance was tested with a particle beam at the GSI Laboratory in Germany in 1997; we report here on the results obtained during the beam calibration.
Scientific Data Management (SDM) Center for Enabling Technologies. Final Report, 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Our contributions to advancing the state of the art in scientific workflows have focused on the following areas: workflow development; generic workflow components and templates; provenance collection and analysis; and workflow reliability and fault tolerance.
State-of-the-Art for Small Satellite Propulsion Systems
NASA Technical Reports Server (NTRS)
Parker, Khary I.
2016-01-01
The NASA/Goddard Space Flight Center (NASA/GSFC) is in the business of performing world-class, space-based, scientific research on various spacecraft platforms, which now include small satellites (SmallSats). In order to perform world-class science on a SmallSat, NASA/GSFC requires that its components be highly reliable and high performing, with low power consumption, at the lowest cost possible. The Propulsion Branch (Code 597) at NASA/GSFC has conducted a survey of SmallSat propulsion systems to determine their availability and level of development. Based on publicly available information and unique features, this paper discusses some of the existing SmallSat propulsion systems. Inclusion of a system in this paper does not indicate or imply any endorsement by NASA or NASA/GSFC over those not included.
NASA Astrophysics Data System (ADS)
Peña, M.; Saha, S.; Wu, X.; Wang, J.; Tripp, P.; Moorthi, S.; Bhattacharjee, P.
2016-12-01
The next version of the operational Climate Forecast System (version 3, CFSv3) will be a fully coupled six-component system with diverse applications to earth system modeling, including weather and climate predictions. This system will couple the earth's atmosphere, land, ocean, sea ice, waves and aerosols for both data assimilation and modeling. It will also use the NOAA Environmental Modeling System (NEMS) software superstructure to couple these components. The CFSv3 is part of the next Unified Global Coupled System (UGCS), which will unify the global prediction systems that are now operational at NCEP. The UGCS is being developed through the efforts of dedicated research and engineering teams and through coordination across many CPO/MAPP and NGGPS groups. During this development phase, the UGCS is being tested for seasonal purposes and undergoes frequent revisions. Each new revision is evaluated to quickly discover, isolate and solve problems that negatively impact its performance. In the UGCS-seasonal model, components (e.g., ocean, sea ice, atmosphere) are coupled through a NEMS-based "mediator". In this numerical infrastructure, model diagnostics and forecast validation are carried out, both component by component and as a whole. The next stage, model optimization, will require enhanced performance diagnostics tools to help prioritize areas of numerical improvement. After the technical development of the UGCS-seasonal is completed, it will become the first realization of the CFSv3. All future development of this system will be carried out by the climate team at NCEP, in scientific collaboration with the groups that developed the individual components, as well as the climate community. A unique challenge in evaluating this unified weather-climate system is the large number of variables, which evolve over a wide range of temporal and spatial scales. A small set of performance measures and scorecard displays is being created, and collaboration and software contributions from research and operational centers are being incorporated. A status report on the CFSv3/UGCS-seasonal development and examples of its performance and measuring tools will be presented.
A guide for writing in the scientific forum.
Kotsis, Sandra V; Chung, Kevin C
2010-11-01
When considering the importance of scientific writing in disseminating new discoveries and ideas, it is quite remarkable that few physicians have received any formal instruction in this essential process. This article focuses on the fundamental principles of scientific writing that also include a "style and grace" component. The art of good scientific writing is to convey scientific materials in a clear and interesting way, while avoiding incomprehensible sentences that only serve to disguise marginal contents within the article. The goal of this article is to encourage authors and readers to critically examine the art of scientific writing to overcome the barrier to effective communication.
Performance of the engineering analysis and data system 2 common file system
NASA Technical Reports Server (NTRS)
Debrunner, Linda S.
1993-01-01
The Engineering Analysis and Data System (EADS) was used from April 1986 to July 1993 to support large-scale scientific and engineering computation (e.g. computational fluid dynamics) at Marshall Space Flight Center. The need for an updated system resulted in an RFP in June 1991, after which a contract was awarded to Cray Grumman. EADS II was installed in February 1993, and by July 1993 most users had been migrated. EADS II is a network of heterogeneous computer systems supporting scientific and engineering applications. The Common File System (CFS) is a key component of this system. The CFS provides a seamless, integrated environment to the users of EADS II, including both disk and tape storage. UniTree software is used to implement this hierarchical storage management system. The performance of the CFS suffered during the early months of the production system. Several of the performance problems were traced to software bugs, which have been corrected. Other problems were associated with hardware. However, the use of NFS in the UniTree UCFM software limits the performance of the system. The performance issues related to the CFS have led to a need to develop a greater understanding of the CFS organization. This paper will first describe EADS II with emphasis on the CFS. Then, a discussion of mass storage systems will be presented, and methods of measuring the performance of the Common File System will be outlined. Finally, areas for further study will be identified and conclusions will be drawn.
Dynamic file-access characteristics of a production parallel scientific workload
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1994-01-01
Multiprocessors have permitted astounding increases in computational performance, but many cannot meet the intense I/O requirements of some scientific applications. An important component of any solution to this I/O bottleneck is a parallel file system that can provide high-bandwidth access to tremendous amounts of data in parallel to hundreds or thousands of processors. Most successful systems are based on a solid understanding of the expected workload, but thus far there have been no comprehensive workload characterizations of multiprocessor file systems. This paper presents the results of a three week tracing study in which all file-related activity on a massively parallel computer was recorded. Our instrumentation differs from previous efforts in that it collects information about every I/O request and about the mix of jobs running in a production environment. We also present the results of a trace-driven caching simulation and recommendations for designers of multiprocessor file systems.
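The trace-driven caching simulation mentioned in this abstract can be illustrated with a minimal sketch. The LRU policy, block identifiers, and synthetic trace below are illustrative assumptions; the paper's actual simulator and trace format are not reproduced here.

```python
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Replay a trace of block IDs through an LRU cache and return the hit rate.

    `trace` and `capacity` are hypothetical inputs; the study's real
    trace format and cache policy details are not described here.
    """
    cache = OrderedDict()
    hits = 0
    for block in trace:
        if block in cache:
            hits += 1
            cache.move_to_end(block)       # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used block
            cache[block] = True
    return hits / len(trace) if trace else 0.0

# Example: a small synthetic trace with some block re-use.
trace = [1, 2, 3, 1, 2, 4, 1, 5, 2, 1]
print(lru_hit_rate(trace, capacity=3))
```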
NASA Technical Reports Server (NTRS)
Springfield, C. W., Jr.
1985-01-01
The space telescope contains various scientific instrument (SI) modules which are mounted to the Focal Plane Structure (FPS) in a statically determinate manner. This is accomplished by using three registration fittings per SI module: one resisting three translations, another resisting two, and the third resisting only one. Due to thermal insulating requirements, these fittings are complex devices composed of numerous pieces. The structural integrity of these fittings is of great importance to the safety of the orbiter transporting the telescope, so in addition to the stress analyses performed during the design of these components, fracture susceptibility also needs to be considered. The pieces of the registration fittings for the Radial SI Module containing the Wide Field Planetary Camera are examined to determine which would endanger the orbiter if they fractured and how likely their fracture is. The latter is stated in terms of maximum allowable initial flaw sizes in these pieces.
Pathway Towards Fluency: Using 'disaggregate instruction' to promote science literacy
NASA Astrophysics Data System (ADS)
Brown, Bryan A.; Ryoo, Kihyun; Rodriguez, Jamie
2010-07-01
This study examines the impact of Disaggregate Instruction on students' science learning. Disaggregate Instruction is the idea that science teaching and learning can be separated into conceptual and discursive components. Using randomly assigned experimental and control groups, 49 fifth-grade students received web-based science lessons on photosynthesis using our experimental approach. We supplemented quantitative statistical comparisons of students' performance on pre- and post-test questions (multiple choice and short answer) with a qualitative analysis of students' post-test interviews. The results revealed that students in the experimental group outscored their control group counterparts across all measures. In addition, students taught using the experimental method demonstrated an improved ability to write using scientific language as well as an improved ability to provide oral explanations using scientific language. This study has important implications for how science educators can prepare teachers to teach diverse student populations.
Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is used not as a method separate from or alternative to other data analysis methods, but rather in addition to them. Together with quantitative analysis of data, such as that offered by statistical analysis and image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy, as developed in Section 2, includes identification of and access to existing information, preprocessing and quantitative analysis of data, visual representation, and the user interface as major components of the software environment for astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists.
Zan, Ke; Jiao, Xing-Ping; Guo, Li-Nong; Zheng, Jian; Ma, Shuang-Cheng
2016-06-01
This study is to establish the HPLC specific chromatogram and determine four main effective components of Lamiophlomis Herba and its counterfeits. Chlorogenic acid, forsythoside B, acteoside and luteoloside were the reference substances. HPLC analysis was performed on a Waters XSelect C₁₈ column (4.6 mm × 250 mm, 5 μm). The mobile phase was acetonitrile-0.5% phosphoric acid solution (18:82) with isocratic elution. The flow rate was 1.0 mL·min⁻¹, the detection wavelength was 332 nm and the column temperature was 30 °C. The chemometrics software Chempattern was employed to analyze the research data. HPLC specific chromatograms of Lamiophlomis Herba from different samples were highly similar, but the similarity of the HPLC specific chromatograms of its counterfeits was less than 0.65. Both cluster analysis and principal component analysis can distinguish certified products from adulterants. The HPLC specific chromatogram and the contents of the four effective components can be used for the quality control of Lamiophlomis Herba and its preparations, providing a scientific basis for standardizing the use of the crude drug. Copyright© by the Chinese Pharmaceutical Association.
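The cluster and principal component analysis step described above can be sketched with generic tooling. The peak-area matrix, sample split, and use of scikit-learn below are illustrative assumptions; the study itself used the commercial Chempattern package.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical peak-area matrix: rows = samples, columns = the four
# marker compounds (chlorogenic acid, forsythoside B, acteoside, luteoloside).
X = np.array([
    [1.02, 2.11, 3.05, 0.98],   # certified samples
    [0.97, 2.04, 2.96, 1.01],
    [1.05, 2.20, 3.10, 0.95],
    [0.31, 0.52, 0.88, 0.22],   # counterfeit samples
    [0.28, 0.49, 0.91, 0.25],
])

# Project onto two principal components, then cluster the scores.
scores = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(scores)
print(labels)  # certified and counterfeit samples fall into separate clusters
```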
Repertoires: A post-Kuhnian perspective on scientific change and collaborative research.
Ankeny, Rachel A; Leonelli, Sabina
2016-12-01
We propose a framework to describe, analyze, and explain the conditions under which scientific communities organize themselves to do research, particularly within large-scale, multidisciplinary projects. The framework centers on the notion of a research repertoire, which encompasses well-aligned assemblages of the skills, behaviors, and material, social, and epistemic components that a group may use to practice certain kinds of science, and whose enactment affects the methods and results of research. This account provides an alternative to the idea of Kuhnian paradigms for understanding scientific change in the following ways: (1) it does not frame change as primarily generated and shaped by theoretical developments, but rather takes account of administrative, material, technological, and institutional innovations that contribute to change and explicitly questions whether and how such innovations accompany, underpin, and/or undercut theoretical shifts; (2) it thus allows for tracking of the organization, continuity, and coherence in research practices which Kuhn characterized as 'normal science' without relying on the occurrence of paradigmatic shifts and revolutions to be able to identify relevant components; and (3) it requires particular attention be paid to the performative aspects of science, whose study Kuhn pioneered but which he did not extensively conceptualize. We provide a detailed characterization of repertoires and discuss their relationship with communities, disciplines, and other forms of collaborative activities within science, building on an analysis of historical episodes and contemporary developments in the life sciences, as well as cases drawn from social and historical studies of physics, psychology, and medicine. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.
2005-12-01
Long-term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL-compliant web services for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending its functionality (e.g. adding novel models) is relatively straightforward. As system access requires only a standard web browser and uses intuitive functionality, stakeholders with diverse degrees of technical insight can use the system with little or no training.
González-Alcaide, Gregorio; Park, Jinseo; Huamaní, Charles; Belinchón, Isabel; Ramos, José M.
2015-01-01
Background: Although researchers have worked in collaboration since the origins of modern science and the publication of the first scientific journals in the eighteenth century, this phenomenon has acquired exceptional importance in the last several decades. Since the mid-twentieth century, new knowledge has been generated from within an ever-growing network of investigators, working cooperatively in research groups across countries and institutions. Cooperation is a crucial determinant of academic success. Objective: The aim of the present paper is to analyze the evolution of scientific collaboration at the micro level, with regard to the scientific production generated on psoriasis research. Methods: A bibliographic search in the Medline database containing the MeSH terms “psoriasis” or “psoriatic arthritis” was carried out. The search results were limited to articles, reviews and letters. After identifying the co-authorships of documents on psoriasis indexed in the Medline database (1942–2013), various bibliometric indicators were obtained, including the average number of authors per document and degree of multi-authorship over time. In addition, we performed a network analysis to study the evolution of certain features of the co-authorship network as a whole: average degree, size of the largest component, clustering coefficient, density and average distance. We also analyzed the evolution of the giant component to characterize the changing research patterns in the field, and we calculated social network indicators for the nodes, namely betweenness and closeness. Results: The main active research clusters in the area were identified, along with their authors of reference. Our analysis of 28,670 documents sheds light on different aspects related to the evolution of scientific collaboration in the field, including the progressive increase in the mean number of co-authors (which stood at 5.17 in the 2004–2013 decade), and the rise in multi-authored papers signed by many different authors (in the same decade, 25.77% of the documents had between 6 and 9 co-authors, and 10.28% had 10 or more). With regard to the network indicators, the average degree gradually increased up to 10.97 in the study period. The percentage of authors pertaining to the largest component also rose to 73.02% of the authors. The clustering coefficient, on the other hand, remained stable throughout the entire 70-year period, with values hovering around 0.9. Finally, the average distance peaked in the decades 1974–1983 (8.29) and 1984–2003 (8.12), then fell over the next two decades, down to 5.25 in 2004–2013. The construction of the co-authorship network (threshold of collaboration ≥ 10 co-authored works) revealed a giant component of 161 researchers, containing 6 highly cohesive sub-components. Conclusions: Our study reveals the existence of a growing research community in which collaboration is increasingly important. We can highlight an essential feature associated with scientific collaboration: multi-authored papers, with growing numbers of collaborators contributing to them, are becoming more and more common; therefore, the formation of research groups of increasing depth (specialization) and breadth (multidisciplinarity) is now a cornerstone of research success. PMID:26658481
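The network indicators reported here (average degree, largest-component share, clustering coefficient, average distance) are standard graph measures. The following sketch computes them on a toy co-authorship graph with the networkx library; neither the toy data nor the library choice comes from the study.

```python
import networkx as nx

# Toy co-authorship graph; nodes are authors, edges mean co-authored papers.
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("B", "C"),   # one cohesive group
    ("D", "E"),                           # a small isolated pair
])

avg_degree = sum(d for _, d in G.degree()) / G.number_of_nodes()
giant = max(nx.connected_components(G), key=len)
giant_share = len(giant) / G.number_of_nodes()
clustering = nx.average_clustering(G)
# Average distance is only defined within a connected component.
avg_distance = nx.average_shortest_path_length(G.subgraph(giant))

print(avg_degree, giant_share, clustering, avg_distance)
```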
NASA Astrophysics Data System (ADS)
Biswal, Milan; Mishra, Srikanta
2018-05-01
The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures, which will lead to a better understanding of how SFOAEs are generated. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms. The efficiency of a technique for segregating these components is critical to describing the nature of SFOAEs. Recent advancements in robust time-frequency analysis algorithms have claimed more accurate extraction of these components from composite signals buried in noise, but their potential has not been fully explored for SFOAE analysis, and insensitivity to distinct information, inherent to the nature of these techniques, may affect scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), the continuous wavelet transform (CWT), and the S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and the synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay by carefully evaluating the impact of variation in analysis parameters. The performance of HHT and SWT, from the perspective of time-frequency filtering and delay estimation, was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found STFT, CWT, and ST to be suitable for canceling multiple internal reflection components with marginal alteration of the SFOAE.
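As one hedged illustration of the linear techniques compared in this paper, the sketch below applies an STFT to a synthetic two-component signal and reads off the magnitude and time location of the dominant component. The sampling rate, signal, and analysis parameters are invented for illustration; the CWT, ST, HHT, and SWT evaluations are not shown.

```python
import numpy as np
from scipy.signal import stft

fs = 44100                        # assumed sampling rate (Hz)
t = np.arange(0, 0.05, 1 / fs)
# Synthetic two-component stand-in for an SFOAE response: a principal
# emission plus a delayed, weaker reflection component.
x = np.sin(2 * np.pi * 2000 * t) + 0.3 * np.sin(2 * np.pi * 2000 * (t - 0.005))

f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
mag = np.abs(Z)

# Frequency, time location, and magnitude of the dominant bin.
fi, ti = np.unravel_index(np.argmax(mag), mag.shape)
print(f[fi], tau[ti], mag[fi, ti])
```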
[Study on volatile components from flowers of Gymnema sylvestre].
Qiu, Qin; Zhen, Han-Shen; Huang, Pei-Qian
2013-04-01
To analyze the volatile components from flowers of Gymnema sylvestre, the volatile components were extracted by water vapor distillation, then separated and identified by GC-MS. 55 components were separated and 33 components were identified, accounting for 88.73% of the total amount. The principal volatile components are Phytol, Pentacosane, 10-Heneicosene (c,t), 3-Eicosene, (E)-, and 2-Methyl-Z-2-docosane. The research can provide a scientific basis for research on the chemical components of the flowers of Gymnema sylvestre.
OOI CyberInfrastructure - Next Generation Oceanographic Research
NASA Astrophysics Data System (ADS)
Farcas, C.; Fox, P.; Arrott, M.; Farcas, E.; Klacansky, I.; Krueger, I.; Meisinger, M.; Orcutt, J.
2008-12-01
Software has become a key enabling technology for scientific discovery, observation, modeling, and exploitation of natural phenomena. New value emerges from the integration of individual subsystems into networked federations of capabilities exposed to the scientific community. Such data-intensive interoperability networks are crucial for future scientific collaborative research, as they open up new ways of fusing data from different sources and across various domains, and of performing analysis over wide geographic areas. The recently established NSF OOI program, through its CyberInfrastructure component, addresses this challenge by providing broad access, from sensor networks for data acquisition up to computational grids for massive computations, with binding infrastructure facilitating policy management and governance of the emerging system-of-scientific-systems. We provide insight into the integration core of this effort, namely a hierarchic service-oriented architecture for a robust, performant, and maintainable implementation. We first discuss the relationship between data management and CI crosscutting concerns such as identity management, policy and governance, which define the organizational contexts for data access and usage. Next, we detail critical services including data ingestion, transformation, preservation, inventory, and presentation. To address interoperability issues between data represented in various formats, we employ a semantic framework derived from the Earth System Grid technology, a canonical representation for scientific data based on DAP/OPeNDAP, and related data publishers such as ERDDAP. Finally, we briefly present the underlying transport, based on a messaging infrastructure over the AMQP protocol, and preservation, based on a distributed file system through SDSC iRODS.
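As a hedged illustration of the AMQP-based transport mentioned above, the sketch below publishes one observation message with the third-party pika client. The host, queue name, and payload are placeholders, not OOI configuration.

```python
import json
import pika  # generic AMQP client library; its use here is an assumption

# Connection details are placeholders, not real OOI endpoints.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="ingest.observations")

# Publish one observation message to the ingestion queue.
message = {"sensor_id": "ctd-042", "variable": "sea_water_temperature", "value": 11.7}
channel.basic_publish(exchange="",
                      routing_key="ingest.observations",
                      body=json.dumps(message))
connection.close()
```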
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Mary
2014-09-19
Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.
The Economic Burden of Malnutrition in Pregnant Women and Children under 5 Years of Age in Cambodia
Moench-Pfanner, Regina; Silo, Sok; Laillou, Arnaud; Wieringa, Frank; Hong, Rathamony; Hong, Rathavuth; Poirot, Etienne; Bagriansky, Jack
2016-01-01
Malnutrition is locked in a vicious cycle of increased mortality, poor health, impaired cognitive development, slow physical growth, reduced learning capacity, inferior performance, and ultimately lower adult work performance and productivity. The consensus of global scientific evidence indicates that lowering the rates of malnutrition will be an indispensable component of any successful program to raise the quality of human capital and resources. This study used a “consequence model” to apply risk-deficit coefficients for economic losses, established in the global scientific literature, to Cambodian health, demographic, and economic data to develop a national estimate of the value of economic losses due to malnutrition. The impact of the indicators of malnutrition analyzed represents a burden to the national economy of Cambodia estimated at 266 million USD annually (1.7% of GDP). Stunting is reducing the Cambodian economic output by more than 120 million USD, and iodine deficiency disorders alone by 57 million USD. This economic burden is too high in view of Cambodia’s efforts to drive economic development. The government should rapidly expand a range of low-cost, effective nutrition interventions to break the current cycle of increased mortality, poor health and ultimately lower work performance, productivity, and earnings. PMID:27187462
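A back-of-envelope check of the figures reported in this abstract, using only the numbers stated there; the implied GDP is a derived estimate, not a value given in the paper.

```python
# Values taken from the abstract above; the implied GDP is derived.
total_loss_usd = 266e6          # total annual burden
gdp_share = 0.017               # 1.7% of GDP
implied_gdp = total_loss_usd / gdp_share
print(f"implied GDP: {implied_gdp / 1e9:.1f} billion USD")   # ~15.6 billion

stunting_loss = 120e6
iodine_loss = 57e6
print(f"stunting share of burden: {stunting_loss / total_loss_usd:.0%}")  # ~45%
print(f"iodine share of burden: {iodine_loss / total_loss_usd:.0%}")      # ~21%
```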
The Design and Evaluation of "CAPTools"--A Computer Aided Parallelization Toolkit
NASA Technical Reports Server (NTRS)
Yan, Jerry; Frumkin, Michael; Hribar, Michelle; Jin, Haoqiang; Waheed, Abdul; Johnson, Steve; Cross, Jark; Evans, Emyr; Ierotheou, Constantinos; Leggett, Pete;
1998-01-01
Writing applications for high performance computers is a challenging task. Although writing code by hand still offers the best performance, it is extremely costly and often not very portable. The Computer Aided Parallelization Tools (CAPTools) are a toolkit designed to help automate the mapping of sequential FORTRAN scientific applications onto multiprocessors. CAPTools consists of the following major components: an inter-procedural dependence analysis module that incorporates user knowledge; a 'self-propagating' data partitioning module driven via user guidance; an execution control mask generation and optimization module for the user to fine tune parallel processing of individual partitions; a program transformation/restructuring facility for source code clean up and optimization; a set of browsers through which the user interacts with CAPTools at each stage of the parallelization process; and a code generator supporting multiple programming paradigms on various multiprocessors. Besides describing the rationale behind the architecture of CAPTools, the parallelization process is illustrated via case studies involving structured and unstructured meshes. The programming process and the performance of the generated parallel programs are compared against other programming alternatives based on the NAS Parallel Benchmarks, ARC3D and other scientific applications. Based on these results, a discussion on the feasibility of constructing architectural independent parallel applications is presented.
Is There Any Scientific Basis of Hawan to be used in Epilepsy-Prevention/Cure?
Bansal, Parveen; Kaur, Ramandeep; Gupta, Vikas; Kumar, Sanjiv; Kaur, RamanPreet
2015-01-01
Epilepsy is a neuropsychiatric disorder associated with religiosity and spirituality. Nasal drug delivery systems are among the best for diseases related to the brain. In older times, RishiMuni, ancient scholars and physicians, used to recommend Hawan for mental peace and well-being. The Gayatri Mantra also says that sughandhim (aroma, fragrance) puushtivardhanam (gives rise to good health): Om triambkum yajamahe, sughandhim puushtivardhanam, urvarukmev vandhanaat, mrityu mokshay mamritaat! Hawan is a scientific experiment in which special herbs (Hawan Samagri) are offered in the fire of medicinal woods ignited in a specially designed fire pit called an agnikuñda. Hawan seems to have been designed by the ancient scholars to fight diseases of the brain. Our metadata analysis demonstrates that the components of Hawan contain a number of volatile oils that are specifically useful for epilepsy through one or another mechanism of action. Due to the high temperature of the fire, the vapors of these oils enter the central nervous system through the nasal route. The routine of performing Hawan might keep the threshold value of the therapeutic components in the body and help in preventing epilepsy. In the present manuscript the authors have tried to highlight and integrate the modern and ancient concepts for treatment and prevention of epilepsy. PMID:26819935
MODIS information, data and control system (MIDACS) level 2 functional requirements
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Sharts, B.; Folta, D.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.
1988-01-01
The MODIS Information, Data and Control System (MIDACS) Level 2 Functional Requirements Document establishes the functional requirements for MIDACS and provides a basis for the mutual understanding between the users and the designers of the EosDIS, including the requirements, operating environment, external interfaces, and development plan. In defining the requirements and scope of the system, this document describes how MIDACS will operate as an element of the EOS within the EosDIS environment. This version of the Level 2 Requirements Document follows an earlier release of a preliminary draft version. The sections on functional and performance requirements do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. Indeed, the team members have not yet been selected and the team has not yet been formed; however, it has been possible to identify many relevant requirements based on the present concept of EosDIS and through interviews and meetings with key members of the scientific community. These requirements have been grouped by functional component of the data system, and by function within each component. These requirements have been merged with the complete set of Level 1 and Level 2 context diagrams, data flow diagrams, and data dictionary.
Main Power Distribution Unit for the Jupiter Icy Moons Orbiter (JIMO)
NASA Technical Reports Server (NTRS)
Papa, Melissa R.
2004-01-01
Around the year 2011, the Jupiter Icy Moons Orbiter (JIMO) will be launched and on its way to orbit three of Jupiter's planet-sized moons. The mission goals for the JIMO project revolve heavily around gathering scientific data concerning ingredients we, as humans, consider essential: water, energy and necessary chemical elements. JIMO is an ambitious mission which will employ propulsion from an ion thruster powered by a nuclear fission reactor. Glenn Research Center is responsible for the development of the dynamic power conversion, power management and distribution, heat rejection and ion thrusters. The first test phase for the JIMO program concerns the High Power AC Power Management and Distribution (PMAD) Test Bed. The goal of this testing is to support electrical performance verification of the power systems. The test bed will incorporate a 2 kW Brayton Rotating Unit (BRU) to simulate the nuclear reactor as well as two ion thrusters. The first module of the PMAD Test Bed to be designed is the Main Power Distribution Unit (MPDU), which relays the power input to the various propulsion systems and scientific instruments. The MPDU involves circuitry design as well as mechanical design to determine the placement of the components. The MPDU consists of fourteen relays of four different variations used to convert the input power into the appropriate power output. The three-phase system uses 400 Volts(sub L-L) rms at 1000 Hertz. The power is relayed through the circuit and distributed to the scientific instruments, the ion thrusters and other controlled systems. The mechanical design requires the components to be positioned for easy electrical wiring as well as allowing adequate room for the main bus bars, individual circuit boards connected to each component, and power supplies. To accomplish a suitable design, AutoCAD was used as a drafting tool. By showing a visual layout of the components, it is easy to see where there is extra room or where the components may interfere with one another. By working with the electrical engineer who designed the circuit, the specific design requirements for the MPDU were determined and used as guidelines. Space is limited due to the size of the mounting plate, therefore each component must be strategically placed. Since the MPDU is being designed to fit into a simulated model of the spacecraft systems on the JIMO, components must be positioned where they are easily accessible to be wired to the other onboard systems. Mechanical and electrical requirements provided equally important limits which were combined to produce the best possible design of the MPDU.
A survey and assessment of the capabilities of Cubesats for Earth observation
NASA Astrophysics Data System (ADS)
Selva, Daniel; Krejci, David
2012-05-01
In less than a decade, Cubesats have evolved from purely educational tools to a standard platform for technology demonstration and scientific instrumentation. The use of COTS (Commercial-Off-The-Shelf) components and the ongoing miniaturization of several technologies have already led to scattered instances of missions with promising scientific value. Furthermore, advantages in terms of development cost and development time with respect to larger satellites, as well as the possibility of launching several dozens of Cubesats with a single rocket launch, have brought forth the potential for radically new mission architectures consisting of very large constellations or clusters of Cubesats. These architectures promise to combine the temporal resolution of GEO missions with the spatial resolution of LEO missions, thus breaking a traditional trade-off in Earth observation mission design. This paper assesses the current capabilities of Cubesats with respect to potential employment in Earth observation missions. A thorough review of Cubesat bus technology capabilities is performed, identifying potential limitations and their implications on 17 different Earth observation payload technologies. These results are matched to an exhaustive review of scientific requirements in the field of Earth observation, assessing the possibilities of Cubesats to cope with the requirements set for each one of 21 measurement categories. Based on this review, several Earth observation measurements are identified that can potentially be compatible with the current state-of-the-art of Cubesat technology although some of them have actually never been addressed by any Cubesat mission. Simultaneously, other measurements are identified which are unlikely to be performed by Cubesats in the next few years due to insuperable constraints. Ultimately, this paper is intended to supply a box of ideas for universities to design future Cubesat missions with high scientific payoff.
NASA Astrophysics Data System (ADS)
Ogawa, Kazunori; Shirai, Kei; Sawada, Hirotaka; Arakawa, Masahiko; Honda, Rie; Wada, Koji; Ishibashi, Ko; Iijima, Yu-ichi; Sakatani, Naoya; Nakazawa, Satoru; Hayakawa, Hajime
2017-07-01
An artificial impact experiment is scheduled for 2018-2019, in which an impactor will collide with asteroid 162173 Ryugu (1999 JU3) during the asteroid rendezvous phase of the Hayabusa2 spacecraft. The small carry-on impactor (SCI) will shoot a 2-kg projectile at 2 km/s to create a crater 1-10 m in diameter, with an expected subsequent ejecta curtain on a 100-m scale on an ideal sandy surface. A miniaturized deployable camera (DCAM3) unit will separate from the spacecraft at about 1 km from the impact and simultaneously conduct optical observations of the experiment. We designed and developed a camera system (DCAM3-D) in the DCAM3, specialized for scientific observations of the impact phenomenon, in order to clarify the subsurface structure, construct theories of impact applicable in a microgravity environment, and identify the impact point on the asteroid. The DCAM3-D system consists of a miniaturized camera with wide-angle and high-focusing performance, high-speed radio communication devices, and control units with large data storage on both the DCAM3 unit and the spacecraft. These components were successfully developed under severe constraints of size, mass and power, and the whole DCAM3-D system has passed all tests verifying functions, performance, and environmental tolerance. Results indicated sufficient potential to conduct the scientific observations during the SCI impact experiment. An operation plan was carefully considered along with the configuration and time schedule of the impact experiment, and pre-programmed into the control unit before the launch. In this paper, we describe details of the system design concept, specifications, and the operating plan of the DCAM3-D system, focusing on the feasibility of scientific observations.
NASA Astrophysics Data System (ADS)
Frailis, M.; Maris, M.; Zacchei, A.; Morisset, N.; Rohlfs, R.; Meharga, M.; Binko, P.; Türler, M.; Galeotta, S.; Gasparo, F.; Franceschi, E.; Butler, R. C.; D'Arcangelo, O.; Fogliani, S.; Gregorio, A.; Lowe, S. R.; Maggio, G.; Malaspina, M.; Mandolesi, N.; Manzato, P.; Pasian, F.; Perrotta, F.; Sandri, M.; Terenzi, L.; Tomasi, M.; Zonca, A.
2009-12-01
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment which has to strictly adhere to the project schedule in order to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
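The housekeeping validation idea described above, injecting known parameter values and comparing them with the processed timelines, can be sketched as follows. The dictionary-based timeline format and tolerance parameter are illustrative assumptions; the real Level 1 packet structures are not reproduced.

```python
def validate_hk_timeline(injected, extracted, tolerance=0.0):
    """Compare injected housekeeping values with the processed timeline.

    `injected` and `extracted` are hypothetical {time: value} mappings;
    the actual Level 1 packet and timeline formats are not shown here.
    """
    mismatches = []
    for t, expected in injected.items():
        actual = extracted.get(t)
        if actual is None or abs(actual - expected) > tolerance:
            mismatches.append((t, expected, actual))
    return mismatches

injected = {0: 1.50, 1: 1.52, 2: 1.55}
extracted = {0: 1.50, 1: 1.52, 2: 1.61}
print(validate_hk_timeline(injected, extracted, tolerance=0.01))  # flags t=2
```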
NASA Astrophysics Data System (ADS)
Ramkilowan, A.; Griffith, D. J.
2017-10-01
Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.
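As a hedged illustration of the Monte-Carlo style of analysis described here, the sketch below estimates a toy 'Detect' probability by randomizing two scene parameters. The uniform distributions and detection criterion are invented stand-ins; MORTICIA's actual radiometric models are far more detailed.

```python
import random

def detection_probability(n_trials=100_000, threshold=0.4):
    """Crude Monte-Carlo estimate of a 'Detect' probability.

    The uniform draws stand in for randomized scene and atmosphere
    parameters; the criterion below is a toy, not a DRI model.
    """
    detections = 0
    for _ in range(n_trials):
        transmittance = random.uniform(0.2, 1.0)   # atmospheric transmittance
        contrast = random.uniform(0.0, 1.0)        # target/background contrast
        if transmittance * contrast > threshold:   # toy detection criterion
            detections += 1
    return detections / n_trials

print(detection_probability())
```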
Project Report on Development of a Safeguards Approach for Pyroprocessing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert Bean
The Idaho National Laboratory has undertaken an effort to develop a standard safeguards approach for international commercial pyroprocessing facilities. This report details progress for the fiscal year 2010 effort. A component-by-component diversion pathway analysis has been performed, and has led to insight on the mitigation needs and equipment development needed for a valid safeguards approach. The effort to develop an in-hot-cell detection capability led to the digital cloud chamber and, more importantly, the significant potential scientific breakthrough of the inverse spectroscopy algorithm, including the ability to identify the energy and spatial location of gamma-ray-emitting sources with a single, non-complex, stationary radiation detector system. Curium measurements were performed on historical and current samples at the FCF to attempt to determine the utility of using gross neutron counting for accountancy measurements. A solid cost estimate of equipment installation at FCF has been developed to guide proposals and cost allocations to use FCF as a test bed for safeguards measurement demonstrations. A combined MATLAB and MCNPX model has been developed to perform detector placement calculations around the electrorefiner. Early harvesting has occurred wherein the project team has been requested to provide pyroprocessing technology and safeguards short courses.
The Literacy Component of Mathematical and Scientific Literacy
ERIC Educational Resources Information Center
Yore, Larry D.; Pimm, David; Tuan, Hsiao-Lin
2007-01-01
This opening article of the Special Issue makes an argument for parallel definitions of scientific literacy and mathematical literacy that have shared features: importance of general cognitive and metacognitive abilities and reasoning/thinking and discipline-specific language, habits-of-mind/emotional dispositions, and information communication…
NASA Astrophysics Data System (ADS)
Price, Aaron
2010-01-01
Citizen Sky is a new three-year astronomical citizen science project launched in June 2009 with funding from the National Science Foundation. This paper reports on early results of an assessment delivered to 1000 participants when they first joined the project. The goal of the assessment, based on the Nature of Scientific Knowledge Scale (NSKS), is to characterize participants' attitudes towards the nature of scientific knowledge. The NSKS components of the assessment achieved high levels of reliability, and both the reliability and the overall scores fall within the range reported in other NSKS studies in the literature. Correlation analysis with other components of the assessment reveals that some factors, such as age and understanding of scientific evidence, may be reflected in scores on subscales of NSKS items. Further work will be done using online discourse analysis and interviews. Overall, we find that the NSKS can be used as an entrance assessment for an online citizen science project.
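Scale reliability of the kind reported for the NSKS items is commonly summarized with Cronbach's alpha; a minimal sketch, with hypothetical Likert responses, follows. The paper does not state which reliability coefficient was used, so the choice of alpha here is an assumption.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert responses (rows = participants, columns = NSKS items).
scores = np.array([
    [4, 5, 4, 4],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
])
print(cronbach_alpha(scores))
```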
Program Components | Cancer Prevention Fellowship Program
Annual Cancer Prevention Fellows' Scientific Symposium The Annual Cancer Prevention Fellows’ Scientific Symposium is held each fall. The symposium brings together senior fellows, new fellows, and the CPFP staff for a day of scientific exchange in the area of cancer prevention. The event provides an opportunity for fellows to discuss their projects, ideas, and potential future collaborations. Fellows plan the symposium, including developing the program agenda and special workshops, and selecting invited speakers.
New Challenges in Tribology: Wear Assessment Using 3D Optical Scanners
Valigi, Maria Cristina; Logozzo, Silvia; Affatato, Saverio
2017-01-01
Wear is a significant mechanical and clinical problem. To acquire further knowledge on the tribological phenomena that involve freeform mechanical components or medical prostheses, wear tests are performed on biomedical and industrial materials in order to solve or reduce failures or malfunctions due to material loss. Scientific and technological advances in the field of optical scanning allow the application of innovative devices for wear measurements, leading to improvements that were unimaginable until a few years ago. It is therefore important to develop techniques, based on new instrumentations, for more accurate and reproducible measurements of wear. The aim of this work is to discuss the use of innovative 3D optical scanners and an experimental procedure to detect and evaluate wear, comparing this technique with other wear evaluation methods for industrial components and biomedical devices. PMID:28772905
New Challenges in Tribology: Wear Assessment Using 3D Optical Scanners.
Valigi, Maria Cristina; Logozzo, Silvia; Affatato, Saverio
2017-05-18
Wear is a significant mechanical and clinical problem. To acquire further knowledge on the tribological phenomena that involve freeform mechanical components or medical prostheses, wear tests are performed on biomedical and industrial materials in order to solve or reduce failures or malfunctions due to material loss. Scientific and technological advances in the field of optical scanning allow the application of innovative devices for wear measurements, leading to improvements that were unimaginable until a few years ago. It is therefore important to develop techniques, based on new instrumentations, for more accurate and reproducible measurements of wear. The aim of this work is to discuss the use of innovative 3D optical scanners and an experimental procedure to detect and evaluate wear, comparing this technique with other wear evaluation methods for industrial components and biomedical devices.
Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...
2008-01-01
Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
Graphene as a long-term metal oxidation barrier: worse than nothing.
Schriver, Maria; Regan, William; Gannett, Will J; Zaniewski, Anna M; Crommie, Michael F; Zettl, Alex
2013-07-23
Anticorrosion and antioxidation surface treatments such as paint or anodization are a foundational component in nearly all industries. Graphene, a single-atom-thick sheet of carbon with impressive impermeability to gases, seems to hold promise as an effective anticorrosion barrier, and recent work supports this hope. We perform a complete study of the short- and long-term performance of graphene coatings for Cu and Si substrates. Our work reveals that although graphene indeed offers effective short-term oxidation protection, over long time scales it promotes more extensive wet corrosion than that seen for an initially bare, unprotected Cu surface. This surprising result has important implications for future scientific studies and industrial applications. In addition to informing any future work on graphene as a protective coating, the results presented here have implications for graphene's performance in a wide range of applications.
Jarosova, Darja; Gurkova, Elena; Ziakova, Katarina; Nedvedova, Daniela; Palese, Alvisa; Godeas, Gloria; Chan, Sally Wai-Chi; Song, Mi Sook; Lee, Jongwon; Cordeiro, Raul; Babiarczyk, Beata; Fras, Malgorzata
2017-03-01
There is a considerable amount of empirical evidence to indicate a positive association between an employee's subjective well-being and workplace performance and job satisfaction. Compared with nursing research, there is a relative lack of consistent scientific evidence concerning midwives' subjective well-being and its determinants related to domains of job satisfaction. The purpose of the study was to examine the association between the domains of job satisfaction and components of subjective well-being in hospital midwives. This cross-sectional descriptive study involved 1190 hospital midwives from 7 countries. Job satisfaction was measured by the McCloskey/Mueller Satisfaction Scale. Subjective well-being was conceptualized in the study as comprising 2 components, one affective and one cognitive. The affective component of subjective well-being (ie, emotional well-being) was assessed by the Positive and the Negative Affect Scale. The cognitive component of subjective well-being (ie, life satisfaction) was measured by the Personal Well-Being Index. Pearson correlations and multiple regression analyses were used to determine associations between variables. Findings from correlation and regression analyses indicated an overall weak association between the domains of job satisfaction and components of subjective well-being. Satisfaction with extrinsic rewards, coworkers, and interaction opportunities accounted for only 13% of variance in the cognitive component (life satisfaction). The affective component (emotional well-being) was weakly associated with satisfaction with control and responsibility. The low amount of variance explained suggests that neither component of subjective well-being is strongly influenced by the domains of job satisfaction. Further studies should focus on identifying other predictors of subjective well-being among midwives. A better understanding of how specific job facets are related to the subjective well-being of midwives might assist employers in the design of counseling and intervention programs for subjective well-being of midwives in the workplace and workplace performance. © 2016 by the American College of Nurse-Midwives.
Scientific reasoning profile of junior secondary school students on the concept of static fluid
NASA Astrophysics Data System (ADS)
Mariana, N.; Siahaan, P.; Utari, S.
2018-05-01
Scientific reasoning is one of the most important abilities. This study aims to determine the profile of scientific reasoning of junior high school students with respect to the concept of static fluid. The research uses a descriptive method with a quantitative approach to characterize the scientific reasoning of students at Satu Atap (One Roof) Junior Secondary School Kotabaru Reteh in Riau Province. Data were collected with a scientific reasoning test. The scientific reasoning assessment refers to Furtak's EBR (Evidence-Based Reasoning) framework, which comprises the components of claims, data, evidence, and rules. The results obtained for each element of scientific reasoning are 35% for claim, 23% for data, 21% for evidence, and 17% for rule. The conclusion of this research is that the scientific reasoning of Satu Atap Junior Secondary School students at Kotabaru Reteh, Riau Province, is still in the low category.
Scientific Elitism and the Information System of Science
ERIC Educational Resources Information Center
Amick, Daniel James
1973-01-01
Scientific elitism must be viewed as a multidimensional phenomenon. Ten variables of elitism are considered, and a principal components factor analysis is used to scale this multivariate domain. Two significant dimensions of elitism were found: one in basic science and one in applied science. (20 references) (Author)
How Do Primary School Students Acquire the Skill of Making Hypothesis
ERIC Educational Resources Information Center
Darus, Faridah Binti; Saat, Rohaida Mohd
2014-01-01
Science education in Malaysia emphasizes three components: knowledge; scientific skills, which include science process skills and manipulative skills; and scientific attitudes and noble values. The science process skills are important in enhancing students' cognitive development and in facilitating students' active participation during the…
Satellite image-based maps: Scientific inference or pretty pictures?
Ronald E. McRoberts
2011-01-01
The scientific method has been characterized as having two distinct components, Discovery and Justification. Discovery emphasizes ideas and creativity, focuses on conceiving hypotheses and constructing models, and is generally regarded as lacking a formal logic. Justification begins with the hypotheses and models and ends with a...
Science Games and the Development of Scientific Possible Selves
ERIC Educational Resources Information Center
Beier, Margaret E.; Miller, Leslie M.; Wang, Shu
2012-01-01
Serious scientific games, especially those that include a virtual apprenticeship component, provide players with realistic experiences in science. This article discusses how science games can influence learning about science and the development of science-oriented possible selves through repeated practice in professional play and through social…
Worlds of wonder: Sensation and the Victorian scientific performance.
Morus, Iwan Rhys
2010-12-01
Performances of various kinds were central to the strategies adopted by Victorian natural philosophers to constitute their authority. Appealing to the senses of their audience through spectacular effects or ingenious demonstrations of skill was key to the success of these performances. If we want to understand the politics and practice of Victorian science, and science more generally, we need to pay particular attention to these sorts of performances. We need to understand the ingredients that went into them and the relationships between scientific performers and their publics. In particular, we need to investigate the self-conscious nature of Victorian scientific performances. Looking at science as performance provides us with a new set of tools for understanding the politics of knowledge, the relationship between producers and consumers of scientific knowledge, and the construction and constitution of scientific authority.
Exploring Two Approaches for an End-to-End Scientific Analysis Workflow
NASA Astrophysics Data System (ADS)
Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba
2015-12-01
The scientific discovery process can be advanced by the integration of independently developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on the resources best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
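As a rough illustration of the second, Python-based approach, the sketch below composes analysis stages as plain callables, each tagged with the computing resource best suited to run it, and reports status as components execute. The stage names and driver are hypothetical; this is not the DESC or Galaxy implementation.

```python
# Minimal workflow driver: stages are callables tagged with a target resource.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    name: str
    run: Callable[[dict], dict]
    resource: str                     # e.g. "local", "cluster", "gpu"

def execute(workflow: list, data: dict) -> dict:
    for stage in workflow:
        print(f"[{stage.resource}] running {stage.name}")  # status monitoring
        data = stage.run(data)        # intermediate output passed downstream
    return data

# Hypothetical cosmology stages
workflow = [
    Stage("select_galaxies", lambda d: {**d, "sample": "selected"}, "local"),
    Stage("fit_cosmology", lambda d: {**d, "fit": "done"}, "cluster"),
]
print(execute(workflow, {"catalog": "survey_sim"}))
```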
NASA Astrophysics Data System (ADS)
Corrie, Brian; Zimmerman, Todd
Scientific research is fundamentally collaborative in nature, and many of today's complex scientific problems require domain expertise in a wide range of disciplines. In order to create research groups that can effectively explore such problems, research collaborations are often formed that involve colleagues at many institutions, sometimes spanning a country and often spanning the world. An increasingly common manifestation of such a collaboration is the collaboratory (Bos et al., 2007), a “…center without walls in which the nation's researchers can perform research without regard to geographical location — interacting with colleagues, accessing instrumentation, sharing data and computational resources, and accessing information from digital libraries.” In order to bring groups together on such a scale, a wide range of components need to be available to researchers, including distributed computer systems, remote instrumentation, data storage, collaboration tools, and the financial and human resources to operate and run such a system (National Research Council, 1993). Media Spaces, as both a technology and a social facilitator, have the potential to meet many of these needs. In this chapter, we focus on the use of scientific media spaces (SMS) as a tool for supporting collaboration in scientific research. In particular, we discuss the design, deployment, and use of a set of SMS environments deployed by WestGrid and one of its collaborating organizations, the Centre for Interdisciplinary Research in the Mathematical and Computational Sciences (IRMACS) over a 5-year period.
Chandra monitoring, trends, and response
NASA Astrophysics Data System (ADS)
Spitzbart, Brad D.; Wolk, Scott J.; Isobe, Takashi
2002-12-01
The Chandra X-ray Observatory was launched in July 1999 and has yielded extraordinary scientific results. Behind the scenes, our Monitoring and Trends Analysis (MTA) system has proven to be a valuable resource. With three years' worth of on-orbit data, we have available a vast array of both telescope diagnostic information and analysis of scientific data to assess Observatory performance. As part of Chandra's Science Operations Team (SOT), the primary goal of MTA is to provide tools for effective decision making leading to the most efficient production of quality science output from the Observatory. We occupy a middle ground between flight operations, chiefly concerned with the health and safety of the spacecraft, and validation and verification, concerned with the scientific validity of the data taken and whether or not they fulfill the observer's requirements. In that role we provide and receive support from systems engineers, instrument experts, operations managers, and scientific users. MTA tools, products, and services include real-time monitoring and alert generation for the most mission-critical components, long-term trending of all spacecraft systems, detailed analysis of various subsystems for life expectancy or anomaly resolution, and creating and maintaining a large SQL database of relevant information. This is accomplished through the use of a wide variety of input data sources and flexible, accessible programming and analysis techniques. This paper will discuss the overall design of the system, its evolution, and the resources available.
Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram
2013-01-01
The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof-of-concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
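To make the binning-and-inversion idea concrete, here is a minimal sketch (illustrative only, not the authors' implementation): doubles are keyed by their most significant bytes, and the resulting bins are inverted into an index over row positions. Note that the byte-prefix ordering used in the range scan is monotonic only for the positive values in the demo.

```python
# Unique-value binning on the most significant bytes of doubles, plus an
# inverted index for range queries. Sketch only; positive values assumed.
import struct
from collections import defaultdict

def msb_bin(value: float, n_bytes: int = 2) -> bytes:
    """Bin key: sign, exponent, and leading mantissa bits (big-endian)."""
    return struct.pack(">d", value)[:n_bytes]

def build_inverted_index(column):
    index = defaultdict(list)                 # bin key -> row ids
    for row_id, v in enumerate(column):
        index[msb_bin(v)].append(row_id)
    return index

def range_query(column, index, lo, hi):
    """Row ids with value in [lo, hi]; candidate bins are refined against
    the stored values (the reduced column in the paper's terminology)."""
    lo_key, hi_key = msb_bin(lo), msb_bin(hi)
    hits = [r for key, rows in index.items()
            if lo_key <= key <= hi_key          # bin may overlap the range
            for r in rows if lo <= column[r] <= hi]
    return sorted(hits)

data = [1.5, 2.75, 3.1, 1024.0, 2.8]
idx = build_inverted_index(data)
print(range_query(data, idx, 2.0, 3.5))       # -> [1, 2, 4]
```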
Launch in orbit of the telescope NINA for cosmic ray observations: preliminary results
NASA Astrophysics Data System (ADS)
Sparvoli, R.; Bidoli, V.; Canestro, A.; Casolino, M.; de Pascale, M. P.; Furano, G.; Iannucci, A.; Morselli, A.; Picozza, P.; Bakaldin, A.; Galper, A.; Koldashov, S.; Korotkov, M.; Leonov, A.; Mikhailov, V.; Murashov, A.; Voronov, S.; Bonvicini, V.; Cirami, R.; Vacchi, A.; Zampa, N.; Ambriola, M.; Bellotti, R.; Cafagna, F.; Ciacio, F.; Circella, M.; de Marzo, C.; Bartalucci, S.; Ricci, M.; Adriani, O.; Papini, P.; Piccardi, S.; Spillantini, P.; Boezio, M.; Castellini, G.
2000-05-01
On July 10, 1998, the telescope NINA was launched into space on board the Russian satellite Resurs-01 n.4. The scientific task of the mission is the study of the galactic, solar, and anomalous components of the cosmic rays in the energy interval 10-200 MeV/n for contained particles. The core of NINA is a segmented silicon detector mounted onto the satellite so as to point to the zenith. In this paper we report on the cosmic ray measurements performed by the telescope during its first 6 months of operation.
NASA Astrophysics Data System (ADS)
Carter, Frances D.
2011-12-01
Low participation and performance in science, technology, engineering, and mathematics (STEM) fields by U.S. citizens are widely recognized as major problems with substantial economic, political, and social ramifications. Studies of collegiate interventions designed to broaden participation in STEM fields suggest that participation in undergraduate research is a key program component that enhances such student outcomes as undergraduate GPA, graduation, persistence in a STEM major, and graduate school enrollment. However, little is known about the mechanisms that are responsible for these positive effects. The current study hypothesizes that undergraduate research participation increases scientific self-efficacy and scientific research proficiency. This hypothesis was tested using data obtained from a survey of minority students from several STEM intervention programs that offer undergraduate research opportunities. Students were surveyed both prior to and following the summer of 2010. Factor analysis was used to examine the factor structure of participants' responses on scientific self-efficacy and scientific research proficiency scales. Difference-in-difference analysis was then applied to the resulting factor score differences to estimate the relationship of summer research participation with scientific self-efficacy and scientific research proficiency. Factor analytic results replicate and further validate previous findings of a general scientific self-efficacy construct (Schultz, 2008). While the factor analytic results for the exploratory scientific research proficiency scale suggest that it was also a measureable construct, the factor structure was not generalizable over time. Potential reasons for the lack of generalizability validity for the scientific research proficiency scale are explored and recommendations for emerging scales are provided. Recent restructuring attempts within federal science agencies threaten the future of STEM intervention programs. Causal estimates of the effect of undergraduate research participation on specific and measurable benefits can play an important role in ensuring the sustainability of STEM intervention programs. Obtaining such estimates requires additional studies that, inter alia, incorporate adequate sample sizes, valid measurement scales, and the ability to account for unobserved variables. Political strategies, such as compromise, can also play an important role in ensuring the sustainability of STEM intervention programs.
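The difference-in-difference logic described above can be stated compactly. The following sketch uses synthetic data and an assumed layout (it is not the study's code or data): the estimate is the pre-to-post change in factor scores for summer research participants minus the corresponding change for non-participants.

```python
# Difference-in-difference on (synthetic) factor scores.
import numpy as np

def diff_in_diff(pre_treat, post_treat, pre_ctrl, post_ctrl):
    """(post - pre) change for participants minus the change for controls."""
    return (np.mean(post_treat) - np.mean(pre_treat)) - \
           (np.mean(post_ctrl) - np.mean(pre_ctrl))

# Hypothetical scientific self-efficacy factor scores, pre and post summer
rng = np.random.default_rng(0)
pre_t = rng.normal(0.0, 1.0, 120)
post_t = pre_t + rng.normal(0.4, 0.5, 120)   # research participants
pre_c = rng.normal(0.0, 1.0, 80)
post_c = pre_c + rng.normal(0.1, 0.5, 80)    # non-participants
print(f"DiD estimate: {diff_in_diff(pre_t, post_t, pre_c, post_c):.2f}")
```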
Qu, Cheng; Pu, Zong-Jin; Zhou, Gui-Sheng; Wang, Jun; Zhu, Zhen-Hua; Yue, Shi-Jun; Li, Jian-Ping; Shang, Li-Li; Tang, Yu-Ping; Shi, Xu-Qin; Liu, Pei; Guo, Jian-Ming; Sun, Jing; Tang, Zhi-Shu; Zhao, Jing; Zhao, Bu-Chang; Duan, Jin-Ao
2017-09-01
A sensitive, reliable, and powerful ultra-high performance liquid chromatography coupled to triple quadrupole tandem mass spectrometry method was developed, for the first time, for the simultaneous quantification of the 15 main bioactive components, including phenolic acids and flavonoids, within 13 min. The method was validated, showing good linearity (r² > 0.9975), limits of detection (1.12-7.01 ng/mL), limits of quantification (3.73-23.37 ng/mL), intra- and inter-day precisions (RSD ≤ 1.92%, RSD ≤ 2.45%), stability (RSD ≤ 5.63%), repeatability (RSD ≤ 4.34%), recovery (96.84-102.12%), and matrix effects (0.92-1.02). The established analytical methodology was successfully applied to comparative analysis of the main bioactive components in the herb pair Danshen-Honghua and its single herbs. Compared to the single herbs, the content of most flavonoid glycosides was remarkably increased in the herb pair, while the main phenolic acids were decreased. The content changes of the main components in the herb pair support the reported synergistic effects on promoting blood circulation and removing blood stasis. The results provide a scientific basis and reference for the quality control of the Danshen-Honghua herb pair and for studying drug interactions based on variation of bioactive components in herb pairs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Individual and Team Performance in Team-Handball: A Review
Wagner, Herbert; Finkenzeller, Thomas; Würth, Sabine; von Duvillard, Serge P.
2014-01-01
Team handball is a complex sport game that is determined by the individual performance of each player as well as tactical components and interaction of the team. The aim of this review was to specify the elements of team-handball performance based on scientific studies and practical experience, and to convey perspectives for practical implication. Scientific studies were identified via the databases PubMed, Web of Knowledge, SPORT Discus, Google Scholar, and Hercules. A total of 56 articles met the inclusion criteria. In addition, we supplemented the review with 13 additional articles, proceedings and book sections. It was found that the specific characteristics of team-handball with frequent intensity changes, team-handball techniques, hard body confrontations, mental skills and social factors specify the determinants of coordination, endurance, strength and cognition. Although we found comprehensive studies examining individual performance in team-handball players of different experience level, sex or age, there is a lack of studies, particularly for team-handball specific training, as well as cognition and social factors. Key Points The specific characteristics of team-handball with frequent intensity changes, specific skills, hard body confrontations, mental skills and social factors define the determinants of coordination, endurance, strength and cognition. To increase individual and team performance in team-handball, specific training based on these determinants has been suggested. Although comprehensive studies examining individual performance in team-handball players of different experience levels, sex, or age have been published, there is a lack of training studies, particularly for team-handball specific techniques and endurance, as well as cognition and social factors. PMID:25435773
Graph processing platforms at scale: practices and experiences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Lee, Sangkeun; Brown, Tyler C
2015-01-01
Graph analysis unveils hidden associations of data in many phenomena and artifacts, such as road networks, social networks, genomic information, and scientific collaboration. Unfortunately, a wide diversity in the characteristics of graphs and graph operations makes it challenging to find the right combination of tools and implementation of algorithms to discover desired knowledge from the target data set. This study presents an extensive empirical study of three representative graph processing platforms: Pegasus, GraphX, and Urika. Each system represents a combination of options in data model, processing paradigm, and infrastructure. We benchmarked each platform using three popular graph operations, degree distribution, connected components, and PageRank, over a variety of real-world graphs. Our experiments show that each graph processing platform shows different strengths, depending on the type of graph operation. While Urika performs the best in non-iterative operations like degree distribution, GraphX outperforms the others on iterative operations like connected components and PageRank. In addition, we discuss challenges in optimizing the performance of each platform over large-scale real-world graphs.
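For readers who want to reproduce the three benchmark operations on a small graph, the sketch below runs them with the networkx library on a synthetic random graph (a stand-in; not the benchmarked platforms or data sets).

```python
# Degree distribution, connected components, and PageRank with networkx.
from collections import Counter
import networkx as nx

G = nx.erdos_renyi_graph(n=1000, p=0.01, seed=42)   # stand-in for a real graph

degree_distribution = Counter(d for _, d in G.degree())   # non-iterative
components = list(nx.connected_components(G))             # iterative
ranks = nx.pagerank(G, alpha=0.85)                        # iterative

print(f"{len(components)} connected components")
print(f"top-ranked node: {max(ranks, key=ranks.get)}")
```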
Straight from the Professional Development Classroom: A Practical Experience
ERIC Educational Resources Information Center
Koul, Anjni
2017-01-01
This article presents an instructional strategy called Premise-Reasoning-Outcome (PRO) designed to support students in the construction of scientific explanations. Informed by the philosophy of science and linguistic studies of science, the PRO strategy involves identifying three components of a scientific explanation: (i) premise--an accepted…
Teaching Information Literacy and Scientific Process Skills: An Integrated Approach.
ERIC Educational Resources Information Center
Souchek, Russell; Meier, Marjorie
1997-01-01
Describes an online searching and scientific process component taught as part of the laboratory for a general zoology course. The activities were designed to be gradually more challenging, culminating in a student-developed final research project. Student evaluations were positive, and faculty indicated that student research skills transferred to…
Designing a Technology-Enhanced Learning Environment to Support Scientific Modeling
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Hsu, Ying-Shao; Hwang, Fu-Kwun
2010-01-01
Modeling of a natural phenomenon is of value in science learning and increasingly emphasized as an important component of science education. However, previous research has shown that secondary school students encounter difficulties when engaging in modeling activities and need substantial support in order to create meaningful scientific models.…
Reinforcing the Afrocentric Paradigm: A Theoretical Project
ERIC Educational Resources Information Center
Sams, Timothy E.
2010-01-01
Thomas Kuhn's groundbreaking 1962 work, "The Structure of Scientific Revolutions," established the process for creating, and the components of, a disciplinary paradigm. This "scientific revolution" has evolved to become the standard for determining a field's claim to disciplinary status. In 2001 and 2003, Ama Mazama used Kuhn's model to establish the…
A framework for integration of scientific applications into the OpenTopography workflow
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C.; Baru, C.
2012-12-01
The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service-oriented architecture that is comprised of an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g., the portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines developed and maintained by the community to be integrated into the OpenTopography system so that the wider earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions with the help of a back-end database, and allows monitoring of job and system status by providing charting tools. All core components in OpenTopography have been developed, maintained, and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most new scientific applications will have their own active development teams performing regular updates, maintenance, and other improvements. It would be optimal to have each application co-located where its developers can continue to actively work on it while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system. This will be accomplished by virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. This involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an Opal-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs, along with metadata annotations encoded in XML, can be utilized to automate the generation of user interfaces, with appropriate tool tips and user help features, and the generation of other internal software. The OpenTopography Opal toolkit will also include the customizations that enable security authentication, authorization, and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information could then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable application developers to continue to work on enhancing their applications while making the latest iteration available in a timely manner to the earth sciences community. This will also help us establish an overall framework that other scientific application providers will be able to use going forward.
Getting beyond technical rationality in developing health behavior programs with youth.
Perry, Cheryl L
2004-01-01
To explore 2 major components of health behavior research: etiologic research and action research. To argue that action research is both an artistic and a scientific process. Review of the development process of effective health behavior programs with youth. Review of literature on art as part of the scientific process, especially in the field of education. Intervention programs that included explicitly creative components demonstrated success in reducing alcohol use and increasing healthful eating and activity patterns. Health behavior researchers might involve art and creativity in action research to enhance program retention and outcomes.
Abstracted Workflow Framework with a Structure from Motion Application
NASA Astrophysics Data System (ADS)
Rossi, Adam J.
In scientific and engineering disciplines, from academia to industry, there is an increasing need for the development of custom software to perform experiments, construct systems, and develop products. The natural mindset initially is to shortcut and bypass all overhead and process rigor in order to obtain an immediate result for the problem at hand, with the misconception that the software will simply be thrown away at the end. In a majority of cases, it turns out the software persists for many years, and likely ends up in production systems for which it was not initially intended. In the current study, a framework that can be used in both industry and academic applications mitigates underlying problems associated with developing scientific and engineering software. This results in software that is much more maintainable, documented, and usable by others, specifically allowing new users to extend the capabilities of components already implemented in the framework. There is a multi-disciplinary need in the fields of imaging science, computer science, and software engineering for a unified implementation model, which motivates the development of an abstracted software framework. Structure from motion (SfM) has been identified as one use case where the abstracted workflow framework can improve research efficiencies and eliminate implementation redundancies in scientific fields. The SfM process begins by obtaining 2D images of a scene from different perspectives. Features from the images are extracted and correspondences are established. This provides a sufficient amount of information to initialize the problem for fully automated processing. Transformations are established between views, and 3D points are established via triangulation algorithms. The parameters of the camera models for all views/images are solved through bundle adjustment, establishing a highly consistent point cloud. The initial sparse point cloud and camera matrices are used to generate a dense point cloud through patch-based techniques or densification algorithms such as Semi-Global Matching (SGM). The point cloud can be visualized or exploited by both humans and automated techniques. In some cases the point cloud is "draped" with original imagery in order to enhance the 3D model for a human viewer. The SfM workflow can be implemented in the abstracted framework, making it easy for multiple users to leverage and extend. Like many processes in scientific and engineering domains, the workflow described for SfM is complex and requires many disparate components to form a functional system, often utilizing algorithms implemented by many users in different languages/environments and without knowledge of how each component fits into the larger system. In practice, this generally leads to issues interfacing the components, building the software for desired platforms, understanding its concept of operations, and how it can be manipulated in order to fit the desired function for a particular application. In addition, other scientists and engineers instinctively wish to analyze the performance of the system, establish new algorithms, optimize existing processes, and establish new functionality based on current research. This requires a framework whereby new components can be easily plugged in without affecting the currently implemented functionality. The need for a universal programming environment establishes the motivation for the development of the abstracted workflow framework.
This software implementation, named Catena, provides base classes from which new components must derive in order to operate within the framework. The derivation mandates that requirements be satisfied in order to provide a complete implementation. Additionally, the developer must provide documentation of the component in terms of its overall function and inputs. The interface input and output values corresponding to the component must be defined in terms of their respective data types, and the implementation uses mechanisms within the framework to retrieve and send the values. This process requires the developer to componentize their algorithm rather than implement it monolithically. Although the requirements placed on the developer are slightly greater, the benefits realized from using Catena far outweigh the overhead and result in extensible software. This thesis provides a basis for the abstracted workflow framework concept and the Catena software implementation. The benefits are also illustrated using a detailed examination of the SfM process as an example application.
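A minimal sketch of the component pattern described here (hypothetical class and method names; Catena's actual base classes are not reproduced): a component derives from a framework base class, declares typed inputs and outputs, and implements a single execute step, so the framework can move values between stages of, say, the SfM workflow.

```python
# Hypothetical component base class in the spirit of the framework described.
from abc import ABC, abstractmethod

class Component(ABC):
    inputs: dict = {}     # input name -> expected type
    outputs: dict = {}    # output name -> produced type

    def __init__(self):
        self._values = {}

    def set_input(self, name, value):
        assert isinstance(value, self.inputs[name])   # framework-side type check
        self._values[name] = value

    @abstractmethod
    def execute(self) -> dict:
        """Consume declared inputs and return the declared outputs."""

class FeatureExtractor(Component):
    """Stage 1 of an SfM workflow: 2D features per image."""
    inputs = {"image_paths": list}
    outputs = {"features": dict}

    def execute(self):
        # Placeholder: a real component would run a feature detector here.
        return {"features": {p: [] for p in self._values["image_paths"]}}

stage = FeatureExtractor()
stage.set_input("image_paths", ["img0.png", "img1.png"])
print(stage.execute())
```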
Undergraduate Medical Academic Performance is Improved by Scientific Training
ERIC Educational Resources Information Center
Zhang, Lili; Zhang, Wei; Wu, Chong; Liu, Zhongming; Cai, Yunfei; Cao, Xingguo; He, Yushan; Liu, Guoxiang; Miao, Hongming
2017-01-01
The effect of scientific training on course learning in undergraduates is still controversial. In this study, we investigated the academic performance of undergraduate students with and without scientific training. The results show that scientific training improves students' test scores in general medical courses, such as biochemistry and…
Evaluation of Low-Voltage Distribution Network Index Based on Improved Principal Component Analysis
NASA Astrophysics Data System (ADS)
Fan, Hanlu; Gao, Suzhou; Fan, Wenjie; Zhong, Yinfeng; Zhu, Lei
2018-01-01
In order to evaluate the development level of the low-voltage distribution network objectively and scientifically, a hierarchy analysis method is utilized to construct an evaluation index model of the low-voltage distribution network. Based on principal component analysis and the logarithmic distribution characteristic of the index data, a logarithmic centralization method is adopted to improve the principal component analysis algorithm. The algorithm can decorrelate and reduce the dimensions of the evaluation model, and the comprehensive score shows a better degree of dispersion. Because the comprehensive scores of the courts are concentrated, a clustering method is adopted to analyse them, realizing a stratified evaluation of the courts. An example is given to verify the objectivity and scientificity of the evaluation method.
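The improved-PCA pipeline lends itself to a compact illustration. The sketch below is an assumed reading of the method, run on synthetic, log-normally distributed index data (not the paper's code): log-transform and center the data, apply PCA to decorrelate and reduce dimensions, form a variance-weighted comprehensive score, and cluster the scores into evaluation tiers.

```python
# Log-centered PCA with clustering of the comprehensive score (sketch).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(200, 8))   # log-distributed indices

X_log = np.log(X)                                 # logarithmic transform
X_centered = X_log - X_log.mean(axis=0)           # centralization

pca = PCA(n_components=3).fit(X_centered)
scores = pca.transform(X_centered)                # decorrelated, reduced
composite = scores @ pca.explained_variance_ratio_   # variance-weighted score

tiers = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    composite.reshape(-1, 1))                     # stratified evaluation
print(np.bincount(tiers))                         # tier sizes
```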
NASA Astrophysics Data System (ADS)
Nativi, S.; Santoro, M.
2009-12-01
Currently, one of the major challenges for the scientific community is the study of climate change effects on life on Earth. To achieve this, it is crucial to understand how climate change will impact biodiversity and, in this context, several application scenarios require modeling the impact of climate change on the distribution of individual species. In the context of GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot, Phase 2), the Climate Change & Biodiversity thematic Working Group developed three significant user scenarios. Two of them make use of a GEOSS-based framework to study the impact of climate change factors on regional species distribution. The presentation introduces and discusses this framework, which provides an interoperability infrastructure to loosely couple standard services and components to discover and access climate and biodiversity data, and to run forecast and processing models. The framework is comprised of the following main components and services: a) GEO Portal: through this component the end user is able to search, find, and access the services needed for the scenario execution; b) Graphical User Interface (GUI): this component provides user interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e., a typical climate change and biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components (exposing them through a unique CSW standard interface); federated components publish climate, environmental, and biodiversity datasets; e) Ecological Niche Model Server: this component is able to run one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access Transaction server: this component publishes the model outputs. The framework was successfully tested in two use scenarios of the GEOSS AIP-2 Climate Change and Biodiversity WG aiming to predict species distribution changes due to climate change factors, with the scientific patronage of the University of Colorado and the University of Alaska. The first scenario dealt with the regional distribution of the pika species in the Great Basin area (North America), while the second concerned the modeling of the Arctic food chain species in the North Pole area; the relationship between different environmental parameters and polar bear distribution was analyzed. Results are published on the GEOSS AIP-2 web site: http://www.ogcnetwork.net/AIP2develop .
Enabling a Scientific Cloud Marketplace: VGL (Invited)
NASA Astrophysics Data System (ADS)
Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.
2013-12-01
The Virtual Geophysics Laboratory (VGL) provides a flexible, web-based environment where researchers can browse data and use a variety of scientific software packaged into toolkits that run in the Cloud. Both data and toolkits are published by multiple researchers and registered with the VGL infrastructure, forming a data and application marketplace. The VGL provides the basic workflow of Discovery and Access to the disparate data sources and a Library for toolkits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the workflow and can be published alongside the results, allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or workflow, knowing the VGL framework will provide the other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse-grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources of the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org
Cultural and Technological Issues and Solutions for Geodynamics Software Citation
NASA Astrophysics Data System (ADS)
Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.
2014-12-01
Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.
Kim, Min Kyoung; Yun, Kwang Jun; Lim, Da Hae; Kim, Jinju; Jang, Young Pyo
2016-01-01
The chemical components and biological activity of Camellia mistletoe, Korthalsella japonica (Loranthaceae), are relatively unknown compared to other mistletoe species. Therefore, we investigated the phytochemical properties and biological activity of this parasitic plant to provide essential preliminary scientific evidence to support and encourage its further pharmaceutical research and development. The major plant components were chromatographically isolated using high-performance liquid chromatography and their structures were elucidated using tandem mass spectrometry and nuclear magnetic resonance analysis. Furthermore, the anti-inflammatory activity of the 70% ethanol extract of K. japonica (KJ) and its isolated components was evaluated using a nitric oxide (NO) assay and western blot analysis for inducible NO synthase (iNOS) and cyclooxygenase (COX)-2. Three flavone di-C-glycosides, lucenin-2, vicenin-2, and stellarin-2, were identified as major components of KJ for the first time. KJ significantly inhibited NO production and reduced iNOS and COX-2 expression in lipopolysaccharide-stimulated RAW 264.7 cells at 100 μg/mL, while similar activity was observed with the isolated flavone C-glycosides. In conclusion, KJ has a simple secondary metabolite profile, including flavone di-C-glycosides as major components, and has strong potential for further research and development as a source of therapeutic anti-inflammatory agents. PMID:27302962
Borodulin, V I; Gliantsev, S P
2017-07-01
The article considers particular key methodological aspects of the problem of the scientific clinical school in national medicine. These aspects have to do with the notion of a school, its profile, the issues of pedagogues, teachings and followers, subsidiary schools, and the issue of the ethical component of a scientific school. The article is a polemical one; hence, one will find no definite answers to the specified questions. The reader is instead invited to ponder the answers independently, weighing examples pro and contra. The conclusion is made about the necessity of studying scientific schools in other areas of medicine and further elaboration of the problem.
[Construction of automatic elucidation platform for mechanism of traditional Chinese medicine].
Zhang, Bai-xia; Luo, Si-jun; Yan, Jing; Gu, Hao; Luo, Ji; Zhang, Yan-ling; Tao, Ou; Wang, Yun
2015-10-01
Aiming at two problems in the field of traditional Chinese medicine (TCM) mechanism elucidation, namely the lack of detailed biological process information and the low efficiency of constructing network models, we constructed an auxiliary elucidation system for TCM mechanisms that realizes the automatic establishment of biological network models. This study used Entity Grammar Systems (EGS) as the theoretical framework; integrated the data of formulae, herbs, chemical components, targets of components, biological reactions, signaling pathways, and disease-related proteins; established the formal models; wrote the reasoning engine; and constructed the auxiliary elucidation system for TCM mechanism elucidation. The platform provides an automatic modeling method for biological network models of TCM mechanisms. It will benefit in-depth research on the TCM theory of natures and combinations and provides a scientific reference for the R&D of TCM.
Bittorf, A.; Diepgen, T. L.
1996-01-01
The World Wide Web (WWW) is becoming the major way of acquiring information in all scientific disciplines as well as in business. It is very well suited for fast distribution and exchange of up-to-date teaching resources. However, to date most teaching applications on the Web do not use its full power by integrating interactive components. We have set up a computer-based training (CBT) framework for Dermatology, which consists of dynamic lecture scripts, case reports, an atlas, and a quiz system. All these components rely heavily on an underlying image database that permits the creation of dynamic documents. To achieve better performance and avoid the overhead of starting CGI processes, we used a daemon process that keeps the database open and can be accessed using HTTP. The result of our evaluation was very encouraging. PMID:8947625
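The performance idea here, a long-lived process that holds the database open behind HTTP rather than paying CGI startup costs per request, can be sketched with modern stand-ins (Python and SQLite, with a hypothetical schema; the original 1996 system predates these tools).

```python
# A persistent "daemon" HTTP server reusing one open database connection.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

conn = sqlite3.connect("images.db", check_same_thread=False)   # opened once
conn.execute("CREATE TABLE IF NOT EXISTS images (path TEXT, diagnosis TEXT)")

class QueryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        q = parse_qs(urlparse(self.path).query)
        diagnosis = q.get("diagnosis", [""])[0]
        rows = conn.execute(                      # no per-request process start
            "SELECT path FROM images WHERE diagnosis = ?", (diagnosis,)
        ).fetchall()
        body = "\n".join(r[0] for r in rows).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), QueryHandler).serve_forever()
```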
1993 Earth Observing System reference handbook
NASA Technical Reports Server (NTRS)
Asrar, Ghassem (Editor); Dokken, David Jon (Editor)
1993-01-01
Mission to Planet Earth (MTPE) is a NASA-sponsored concept that uses space- and ground-based measurement systems to provide the scientific basis for understanding global change. The space-based components of MTPE will provide a constellation of satellites to monitor the Earth from space. Sustained observations will allow researchers to monitor climate variables over time to determine trends; however, space-based monitoring alone is not sufficient. A comprehensive data and information system, a community of scientists performing research with the data acquired, and extensive ground campaigns are all important components. Brief descriptions of the various elements that comprise the overall mission are provided. The Earth Observing System (EOS) - a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans - is the centerpiece of MTPE. The elements comprising the EOS mission are described in detail.
What are the Benefits of Exercise for Alzheimer's Disease? A Systematic Review of the Past 10 Years.
Hernández, Salma S S; Sandreschi, Paula F; da Silva, Franciele C; Arancibia, Beatriz A V; da Silva, Rudney; Gutierres, Paulo J B; Andrade, Alexandro
2015-10-01
To identify and characterize the scientific literature on the effects of exercise on Alzheimer's disease, research was conducted in the following databases: MEDLINE, CINAHL, Web of Science, and Scopus. The MeSH terms "exercise", "motor activity", "physical fitness", and "Alzheimer disease", and their synonyms in English, were used in the initial search to locate studies published between 2003 and 2013. After the 12 final articles were read in their entirety, two additional articles found by a manual search were included. Of these, 13 reported beneficial results of exercise in Alzheimer's disease. Given the results discussed here, exercise may be important for the improvement of functionality and performance of daily life activities, neuropsychiatric disturbances, cardiovascular and cardiorespiratory fitness, functional capacity components (flexibility, agility, balance, strength), and improvements in some cognitive components such as sustained attention, visual memory, and frontal cognitive function in patients with AD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foust, O J
1978-01-01
The handbook is intended for use by present and future designers in the Liquid Metals Fast Breeder Reactor (LMFBR) Program and by the engineering and scientific community performing other types of investigation and experimentation requiring high-temperature sodium and NaK technology. The arrangement of subject matter progresses from a technological discussion of sodium and sodium-potassium alloy (NaK) to discussions of various categories and uses of hardware in sodium and NaK systems. Emphasis is placed on sodium and NaK as heat-transport media. Sufficient detail is included for a basic understanding of sodium and NaK technology and of the technical aspects of sodium and NaK components and instrument systems. The information presented is considered adequate for use in feasibility studies and conceptual design, sizing components and systems, developing preliminary component and system descriptions, identifying technological limitations and problem areas, and defining basic constraints and parameters.
Chaste: An Open Source C++ Library for Computational Physiology and Biology
Mirams, Gary R.; Arthurs, Christopher J.; Bernabeu, Miguel O.; Bordas, Rafel; Cooper, Jonathan; Corrias, Alberto; Davit, Yohan; Dunn, Sara-Jane; Fletcher, Alexander G.; Harvey, Daniel G.; Marsh, Megan E.; Osborne, James M.; Pathmanathan, Pras; Pitt-Francis, Joe; Southern, James; Zemzemi, Nejib; Gavaghan, David J.
2013-01-01
Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials. PMID:23516352
Augestad, K M; Han, H; Paige, J; Ponsky, T; Schlachta, C M; Dunkin, B; Mellinger, J
2017-10-01
Surgical telementoring (ST) was introduced in the sixties, promoting videoconferencing to enhance surgical education across large distances. Widespread use of ST in the surgical community is lacking. Despite numerous surveys assessing ST, there remains a lack of high-level scientific evidence demonstrating its impact on mentorship and surgical education. Despite this, there is an ongoing paradigm shift involving remote presence technologies and their application to skill development and technique dissemination in the international surgical community. Factors facilitating this include improved access to ST technology, including ease of use and data transmission, and affordability. Several international research initiatives have commenced to strengthen the scientific foundation documenting the impact of ST in surgical education and performance. International experts on ST were invited to the SAGES Project Six Summit in August 2015. Two experts in surgical education prepared relevant questions for discussion and organized the meeting (JP and HH). The questions were open-ended, and the discussion continued until no new item appeared. The transcripts of interviews were recorded by a secretary from SAGES. In this paper, we present a summary of the work performed by the SAGES Project 6 Education Working Group. We summarize the existing evidence regarding education in ST, identify and detail conceptual educational frameworks that may be used during ST, and present a structured framework for an educational curriculum in ST. The educational impact and optimal curricular organization of ST programs are largely unexplored. We outline the critical components of a structured ST curriculum, including prerequisites, teaching modalities, and key curricular components. We also detail research strategies critical to its continued evolution as an educational tool, including randomized controlled trials, establishment of a quality registry, qualitative research, learning analytics, and development of a standardized taxonomy.
Structural analysis of health-relevant policy-making information exchange networks in Canada.
Contandriopoulos, Damien; Benoît, François; Bryant-Lukosius, Denise; Carrier, Annie; Carter, Nancy; Deber, Raisa; Duhoux, Arnaud; Greenhalgh, Trisha; Larouche, Catherine; Leclerc, Bernard-Simon; Levy, Adrian; Martin-Misener, Ruth; Maximova, Katerina; McGrail, Kimberlyn; Nykiforuk, Candace; Roos, Noralou; Schwartz, Robert; Valente, Thomas W; Wong, Sabrina; Lindquist, Evert; Pullen, Carolyn; Lardeux, Anne; Perroux, Melanie
2017-09-20
Health systems worldwide struggle to identify, adopt, and implement, in a timely and system-wide manner, the best evidence-informed policy-level practices. Yet there is still only limited evidence about individual and institutional best practices for fostering the use of scientific evidence in policy-making processes. The present project is the first national-level attempt to (1) map and structurally analyze, quantitatively, health-relevant policy-making networks that connect evidence production, synthesis, interpretation, and use; (2) qualitatively investigate the interaction patterns of a subsample of actors with high centrality metrics within these networks to develop an in-depth understanding of evidence circulation processes; and (3) combine these findings in order to assess a policy network's "absorptive capacity" regarding scientific evidence and integrate them into a conceptually sound and empirically grounded framework. The project is divided into two research components. The first component is based on quantitative analysis of the ties (relationships) that link nodes (participants) in a network. Network data will be collected through a multi-step snowball sampling strategy and analyzed structurally using social network mapping and analysis methods. The second component is based on qualitative interviews with a subsample of the Web survey participants having central, bridging, or atypical positions in the network. Interviews will focus on the process through which evidence circulates and enters practice. Results from both components will then be integrated through an assessment of the network's and subnetworks' effectiveness in identifying, capturing, interpreting, sharing, reframing, and recodifying scientific evidence in policy-making processes. Knowledge developed from this project has the potential both to strengthen the scientific understanding of how policy-level knowledge transfer and exchange function and to provide significantly improved advice on how to ensure evidence plays a more prominent role in public policies.
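As a toy illustration of the first component's structural step (hypothetical actors and ties; not project data), the sketch below builds a tie network, computes a centrality metric, and flags the high-centrality actors who would be invited for the qualitative interviews.

```python
# Betweenness centrality to select the interview subsample (toy network).
import networkx as nx

edges = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "E"), ("C", "E")]  # ties
G = nx.Graph(edges)

betweenness = nx.betweenness_centrality(G)
subsample = sorted(betweenness, key=betweenness.get, reverse=True)[:2]
print("interview subsample:", subsample)   # bridging/central actors first
```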
Contextual Shifting: Teachers Emphasizing Students' Academic Identity to Promote Scientific Literacy
ERIC Educational Resources Information Center
Reveles, John M.; Brown, Bryan A.
2008-01-01
This research presents a case study of two teachers' emphasis on students' academic identity as a means of facilitating their science literacy development. These cases support a theoretical position that deconstructs the notion of normative science literacy into its constitutive components: (a) being scientific and (b) appropriating its literate…
Fostering Scientific Literacy and Critical Thinking in Elementary Science Education
ERIC Educational Resources Information Center
Vieira, Rui Marques; Tenreiro-Vieira, Celina
2016-01-01
Scientific literacy (SL) and critical thinking (CT) are key components of science education aiming to prepare students to think and to function as responsible citizens in a world increasingly affected by science and technology (S&T). Therefore, students should be given opportunities in their science classes to be engaged in learning…
Chapter 01: Wood identification and pattern recognition
Alex Wiedenhoeft
2011-01-01
Wood identification is a combination of art and science. Although the bulk of this manual focuses on the scientific characteristics used to make accurate field identifications of wood, the contribution of the artistic component to the identification process should be neither overlooked nor understated. Though the accumulation of scientific knowledge and experience is...
Teachers Describe Epistemologies of Science Instruction through Q Methodology
ERIC Educational Resources Information Center
Barnes, Caitlin; Angle, Julie; Montgomery, Diane
2015-01-01
Creating scientifically literate students is a common goal among educational stakeholders. An understanding of nature of science is an important component of scientific literacy in K-12 science education. Q methodology was used to investigate the opinions of preservice and in-service teachers on how they intend to teach or currently teach science.…
The Efficacy of Weight-Loss Clinics: An Issue in Consumer Health Education.
ERIC Educational Resources Information Center
Thomas, Susan E.
1988-01-01
Weight loss clinics based on scientific fact and containing diet therapy, exercise therapy, and behavior modification components can be effective vehicles for weight loss among the mildly to moderately obese. Health educators are called on to disseminate the information necessary to establish scientifically based criteria and program evaluation…
How Might Research Inform Scientific Literacy in Schools?
ERIC Educational Resources Information Center
Jenkins, Edgar
2010-01-01
Scientific literacy is now seen as an essential component of informed citizenship and a key curriculum goal in many parts of the world. The relevant literature is vast and replete with a variety of definitions, descriptions, prescriptions, slogans and theoretical perspectives. It addresses not only formal education but also fields as diverse as…
MULTIDISCIPLINARY SCIENTIFIC AND ENGINEERING APPROACHES TO ASSESSING DIESEL EXHAUST TOXICITY
Based on epidemiology reports, diesel exhaust (DE) containing particulate matter (PM) may play a role in increasing cardiopulmonary mortality and morbidity, such as lung infection and asthma symptoms. DE gas-phase components may modify the PM effects. DE components vary depending...
There is no such thing as a biocompatible material.
Williams, David F
2014-12-01
This Leading Opinion Paper discusses a very important matter concerning the use of a single word in biomaterials science. This might be considered as being solely concerned with semantics, but it has implications for the scientific rationale for biomaterials selection and the understanding of their performance. That word is the adjective 'biocompatible', which is often used to characterize a material property. It is argued here that biocompatibility is a perfectly acceptable term, but that it subsumes a variety of mechanisms of interaction between biomaterials and tissues or tissue components and can only be considered in the context of the characteristics of both the material and the biological host within which it is placed. De facto it is a property of a system and not of a material. It follows that there can be no such thing as a biocompatible material. It is further argued that in those situations where it is considered important, or necessary, to use a descriptor of biocompatibility, as in a scientific paper, a regulatory submission or in a legal argument, the phrase 'intrinsically biocompatible system' would be the most appropriate. The rationale for this linguistic restraint is that far too often it has been assumed that some materials are 'universally biocompatible' on the basis of acceptable clinical performance in one situation, only for entirely unacceptable performance to ensue in quite different clinical circumstances.
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Wolpert, David
2004-01-01
Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing and designing such systems is determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey of the emerging science of collectives.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
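The local-data-center versus cloud trade-off discussed above reduces, at first order, to comparing amortized annual costs against usage; here is a back-of-envelope sketch of that comparison in Python. All figures are hypothetical placeholders, not numbers from the BNL/Amazon EC2 analysis.

```python
# Toy cost comparison: sustained local HPC vs. on-demand cloud.
# Every value below is an assumed placeholder for illustration only.
core_hours_per_year = 50_000_000       # assumed sustained compute demand
cloud_usd_per_core_hour = 0.05         # assumed on-demand cloud rate
local_annual_cost_usd = 2_000_000      # assumed amortized hardware+power+staff

cloud_annual_cost_usd = core_hours_per_year * cloud_usd_per_core_hour
cheaper = "local" if local_annual_cost_usd < cloud_annual_cost_usd else "cloud"
print(f"cloud: ${cloud_annual_cost_usd:,.0f}  "
      f"local: ${local_annual_cost_usd:,.0f}  -> {cheaper} wins")
```

The general pattern such analyses surface is that steady, high-utilization workloads favor a local center, while bursty workloads favor the cloud.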
NASA Astrophysics Data System (ADS)
He, Zhi-Ping; Wang, Bin-Yong; Lü, Gang; Li, Chun-Lai; Yuan, Li-Yin; Xu, Rui; Liu, Bin; Chen, Kai; Wang, Jian-Yu
2014-12-01
The Visible and Near-Infrared Imaging Spectrometer (VNIS), using two acousto-optic tunable filters as dispersive components, consists of a VIS/NIR imaging spectrometer (0.45-0.95 μm), a shortwave IR spectrometer (0.9-2.4 μm), and a calibration unit with dust-proofing functionality. The VNIS was utilized to detect the spectrum of the lunar surface and achieve in-orbit calibration, which satisfied the requirements for scientific detection. Mounted at the front of the Yutu rover, the VNIS detects lunar objects at a 45° visual angle, obtaining spectra and geometrical data in order to analyze the mineral composition of the lunar surface. After landing successfully on the Moon, the VNIS performed several explorations and calibrations, and obtained several spectral images and spectral reflectance curves of the lunar soil in the region of Mare Imbrium. This paper describes the working principle and detection characteristics of the VNIS and provides a reference for data processing and scientific applications.
Polymer-Based Nanocomposites: An Internship Program for Deaf and Hard of Hearing Students
NASA Astrophysics Data System (ADS)
Cebe, Peggy; Cherdack, Daniel; Seyhan Ince-Gunduz, B.; Guertin, Robert; Haas, Terry; Valluzzi, Regina
2007-03-01
We report on our summer internship program in Polymer-Based Nanocomposites, for deaf and hard of hearing undergraduates who engage in classroom and laboratory research work in polymer physics. The unique attributes of this program are its emphasis on: 1. Teamwork; 2. Performance of a start-to-finish research project; 3. Physics of materials approach; and 4. Diversity. Students of all disability levels have participated in this program, including students who neither hear nor voice. The classroom and laboratory components address the materials chemistry and physics of polymer-based nanocomposites, crystallization and melting of polymers, the interaction of X-rays and light with polymers, mechanical properties of polymers, and the connection between thermal processing, structure, and ultimate properties of polymers. A set of Best Practices is developed for accommodating deaf and hard of hearing students into the laboratory setting. The goal is to bring deaf and hard of hearing students into the larger scientific community as professionals, by providing positive scientific experiences at a formative time in their educational lives.
Using Jupyter Notebooks for Interactive Space Science Simulations
NASA Astrophysics Data System (ADS)
Schmidt, Albrecht
2016-04-01
Jupyter Notebooks can be used as an effective means to communicate scientific ideas through Web-based visualisations and, at the same time, give a user more than a pre-defined set of options to manipulate the visualisations. To some degree, even computations can be done without much knowledge of the underlying data structures and infrastructure, in order to discover novel aspects of the data or tailor views to users' needs. Here, we show how to combine Jupyter Notebooks with other open-source tools to provide rich and interactive views on space data, especially the visualisation of spacecraft operations. Topics covered are orbit visualisation, spacecraft orientation, instrument timelines, and performance analysis of mission segments. Technically, the re-use and integration of existing components is also shown, both on the code level and on the visualisation level, so that the effort put into the development of new components could be reduced. Another important aspect is the bridging of the gap between operational data and the scientific exploitation of the payload data, for which a way forward is also shown. A lesson learned from the implementation and use of a prototype is the synergy between the team who provisions the notebooks and the consumers, who share access to the same code base, if not resources; this often simplifies communication and deployment.
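As an illustration of the interactive orbit views described above, here is a minimal notebook-cell sketch using matplotlib and ipywidgets; the circular orbit, radius value, and slider are invented stand-ins for real ephemerides, not code from the cited work.

```python
# Minimal sketch of an interactive orbit view in a Jupyter Notebook cell.
# A circular two-body orbit stands in for real spacecraft ephemerides.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact  # renders a slider when run in a notebook

def plot_orbit(phase_deg=0.0):
    theta = np.linspace(0, 2 * np.pi, 200)
    r = 7000.0  # illustrative orbit radius in km
    plt.figure(figsize=(4, 4))
    plt.plot(r * np.cos(theta), r * np.sin(theta), "b-", label="orbit")
    phi = np.radians(phase_deg)
    plt.plot(r * np.cos(phi), r * np.sin(phi), "ro", label="spacecraft")
    plt.gca().set_aspect("equal")
    plt.legend()
    plt.show()

# The slider lets a user move the spacecraft along its orbit interactively.
interact(plot_orbit, phase_deg=(0.0, 360.0))
```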
Is the clinical use of cannabis by oncology patients advisable?
Bar-Sela, Gil; Avisar, Adva; Batash, Ron; Schaffer, Moshe
2014-06-01
The use of the cannabis plant for various medical indications by cancer patients has been rising significantly in the past few years in several European countries, the US and Israel. The increase in use comes from public demand for the most part, and not due to a scientific basis. Cannabis chemistry is complex, and the isolation and extraction of the active ingredient remain difficult. The active agent in cannabis is unique among psychoactive plant materials, as it contains no nitrogen and, thus, is not an alkaloid. Alongside inconclusive evidence of increased risks of lung and head and neck cancers from prolonged smoking of the plant produce, laboratory evidence of the anti-cancer effects of plant components exists, but with no clinical research in this direction. The beneficial effects of treatment with the plant, or treatment with medicine produced from its components, are related to symptoms of the disease: pain, nausea and vomiting, loss of appetite and weight loss. The clinical evidence of the efficacy of cannabis for these indications is only partial. However, recent scientific data from studies with THC and cannabidiol combinations report the first clinical indication of cancer-related pain relief. The difficulties of performing research into products that are not medicinal, such as cannabis, have not allowed a true study of the cannabis plant extract although, from the public point of view, such studies are greatly desirable.
Quantifying the Ease of Scientific Discovery
Arbesman, Samuel
2012-01-01
It has long been known that scientific output proceeds on an exponential increase, or more properly, a logistic growth curve. The interplay between effort and discovery is clear, and the nature of the functional form has been thought to be due to many changes in the scientific process over time. Here I show a quantitative method for examining the ease of scientific progress, another necessary component in understanding scientific discovery. Using examples from three different scientific disciplines – mammalian species, chemical elements, and minor planets – I find the ease of discovery to conform to an exponential decay. In addition, I show how the pace of scientific discovery can be best understood as the outcome of both scientific output and ease of discovery. A quantitative study of the ease of scientific discovery in the aggregate, such as done here, has the potential to provide a great deal of insight into both the nature of future discoveries and the technical processes behind discoveries in science. PMID:22328796
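The quantitative method the abstract describes amounts to fitting an exponential decay to ease-of-discovery data; the following is a minimal sketch of such a fit with scipy, where the yearly values are invented for illustration rather than taken from the paper's datasets.

```python
# Minimal sketch: fit an exponential decay to "ease of discovery" values.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([0, 10, 20, 30, 40, 50], dtype=float)
ease = np.array([1.00, 0.62, 0.40, 0.24, 0.15, 0.09])  # invented, normalized

def decay(t, a, k):
    # a: initial ease; k: decay rate per year
    return a * np.exp(-k * t)

(a, k), _ = curve_fit(decay, years, ease, p0=(1.0, 0.05))
print(f"fitted decay rate k = {k:.3f} per year")
```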
Understanding the Performance and Potential of Cloud Computing for Scientific Applications
Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...
2015-02-19
Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with a reasonable performance per money spent. This work studies the performance of public clouds and places this performance in the context of price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network, and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of the cloud when running scientific applications. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS, and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications between public clouds, private clouds, or hybrid clouds.
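The paper's "performance per money spent" framing reduces to dividing a measured benchmark score by an hourly price; here is a minimal Python sketch of that computation. The instance names, GFLOPS figures, and prices below are placeholders, not the paper's measured AWS results.

```python
# Toy performance-per-dollar comparison across compute options.
# All numbers are invented placeholders for illustration only.
instances = {
    "cloud-instance-a": {"gflops": 180.0, "usd_per_hour": 0.90},
    "cloud-instance-b": {"gflops": 420.0, "usd_per_hour": 2.40},
    "local-hpc-node":   {"gflops": 500.0, "usd_per_hour": 1.10},  # amortized
}

for name, d in instances.items():
    ratio = d["gflops"] / d["usd_per_hour"]
    print(f"{name}: {ratio:.1f} GFLOPS per dollar-hour")
```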
NASA Astrophysics Data System (ADS)
Werkheiser, W. H.
2016-12-01
10 Years of Scientific Integrity Policy at the U.S. Geological Survey
The U.S. Geological Survey implemented its first scientific integrity policy in January 2007. Following the 2009 and 2010 executive memoranda aimed at creating scientific integrity policies throughout the federal government, USGS' policy served as a template to inform the U.S. Department of Interior's policy set forth in January 2011. Scientific integrity policy at the USGS and DOI continues to evolve as best practices come to the fore and the broader Federal scientific integrity community evolves in its understanding of a vital and expanding endeavor. We find that scientific integrity is best served by: formal and informal mechanisms through which to resolve scientific integrity issues; a well-communicated and enforceable code of scientific conduct that is accessible to multiple audiences; an unfailing commitment to the code on the part of all parties; awareness through mandatory training; robust protection to encourage whistleblowers to come forward; and outreach with the scientific integrity community to foster consistency and share experiences.
Madhugiri, Venkatesh S
2015-01-01
Scientific publications are a reflection of the quality of the clinical and academic work being carried out in an institute. Training in research and scientific writing are important components of the residency curriculum. The publication performance and research output of institutes training residents in neurology and neurosurgery were evaluated. Internet-based study. This study was based on the data available on the websites of the Medical Council of India and the National Board of Examinations. The PubMed search interface was used to determine the publication output of institutes over the past 5 years (2010-2014). Google Scholar was used to determine the citation performance of each paper. The publication parameters were normalized to the number of faculty members in each institute as listed on the institutional web page. The normalized publication performance for an institute was computed by comparing the figures for that institute with the national average. Institutes could be ranked on several criteria. There was a high degree of clustering of output from the top 5% of the institutes. About 13% of the neurology intake and 30.9% of the neurosurgery intake over the past 5 years has been into institutes that have not published a single paper during this period. This evaluation of the publication performance and research output of neurology and neurosurgery training institutes will serve as baseline data for future evaluations and comparisons. The absence of any publication and research output from several training institutes is a matter of concern.
Position of the American Dietetic Association: functional foods.
Hasler, Clare M; Brown, Amy C
2009-04-01
All foods are functional at some physiological level, but it is the position of the American Dietetic Association (ADA) that functional foods that include whole foods and fortified, enriched, or enhanced foods have a potentially beneficial effect on health when consumed as part of a varied diet on a regular basis, at effective levels. ADA supports research to further define the health benefits and risks of individual functional foods and their physiologically active components. Health claims on food products, including functional foods, should be based on the significant scientific agreement standard of evidence and ADA supports label claims based on such strong scientific substantiation. Food and nutrition professionals will continue to work with the food industry, allied health professionals, the government, the scientific community, and the media to ensure that the public has accurate information regarding functional foods and thus should continue to educate themselves on this emerging area of food and nutrition science. Knowledge of the role of physiologically active food components, from plant, animal, and microbial food sources, has changed the role of diet in health. Functional foods have evolved as food and nutrition science has advanced beyond the treatment of deficiency syndromes to reduction of disease risk and health promotion. This position paper reviews the definition of functional foods, their regulation, and the scientific evidence supporting this evolving area of food and nutrition. Foods can no longer be evaluated only in terms of macronutrient and micronutrient content alone. Analyzing the content of other physiologically active components and evaluating their role in health promotion will be necessary. The availability of health-promoting functional foods in the US diet has the potential to help ensure a healthier population. However, each functional food should be evaluated on the basis of scientific evidence to ensure appropriate integration into a varied diet.
Exploring Two Approaches for an End-to-End Scientific Analysis Workflow
Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...
2015-12-23
The advance of the scientific discovery process is accomplished by the integration of independently developed programs, run on disparate computing facilities, into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Linda
The objective of the proposal was to develop graduate student training in materials and engineering research relevant to the development of particle accelerators. Many components used in today's accelerators or storage rings are at the limit of performance. The path forward in many cases requires the development of new materials or fabrication techniques, or a novel engineering approach. Often, accelerator-based laboratories find it difficult to get top-level engineers or materials experts with the motivation to work on these problems. The three years of funding provided by this grant were used to support development of accelerator components through a multidisciplinary approach that cut across the disciplinary boundaries of accelerator physics, materials science, and surface chemistry. The following results were achieved: (1) significant scientific results on fabrication of novel photocathodes, (2) application of surface science and superconducting materials expertise to accelerator problems through faculty involvement, (3) development of instrumentation for fabrication and characterization of materials for accelerator components, (4) student involvement with problems at the interface of materials science and accelerator physics.
Dietary bioactives: establishing a scientific framework for recommended intakes
USDA-ARS?s Scientific Manuscript database
Research has shown that numerous dietary bioactive components that are not considered essential may still be beneficial to health. The dietary reference intake (DRI) process has been applied to nonessential nutrients, such as fiber, yet the majority of bioactive components await a recommended intake...
Aloe vera: a valuable ingredient for the food, pharmaceutical and cosmetic industries--a review.
Eshun, Kojo; He, Qian
2004-01-01
Scientific investigations on Aloe vera have gained more attention over the last several decades due to its reputable medicinal properties. Some publications have appeared in reputable scientific journals that have made appreciable contributions to the discovery of the functions and utilizations of Aloe, "nature's gift." Chemical analysis reveals that Aloe vera contains various carbohydrate polymers, notably glucomannans, along with a range of other organic and inorganic components. Although many physiological properties of Aloe vera have been described, it still remains uncertain which of the components is responsible for these physiological properties. Further research needs to be done to unravel the myth surrounding the biological activities and the functional properties of A. vera. Appropriate processing techniques should be employed during the stabilization of the gel in order to extend its field of utilization.
Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.; Silva, Claudio
2013-09-30
For the past three years, a large analysis and visualization effort, funded by the Department of Energy's Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA), has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.
Sun, Li-Qiong; Wang, Shu-Yao; Li, Yan-Jing; Wang, Yong-Xiang; Wang, Zhen-Zhong; Huang, Wen-Zhe; Wang, Yue-Sheng; Bi, Yu-An; Ding, Gang; Xiao, Wei
2016-01-01
The present study was designed to determine the relationships between the performance of ethanol precipitation and seven process parameters in the ethanol precipitation process of Re Du Ning Injections, including concentrate density, concentrate temperature, ethanol content, flow rate and stir rate in the addition of ethanol, precipitation time, and precipitation temperature. Under the experimental and simulated production conditions, a series of precipitated resultants were prepared by changing these variables one by one, and then examined by HPLC fingerprint analyses. Different from the traditional evaluation model based on single or a few constituents, the fingerprint data of every parameter fluctuation test was processed with Principal Component Analysis (PCA) to comprehensively assess the performance of ethanol precipitation. Our results showed that concentrate density, ethanol content, and precipitation time were the most important parameters that influence the recovery of active compounds in precipitation resultants. The present study would provide some reference for pharmaceutical scientists engaged in research on pharmaceutical process optimization and help pharmaceutical enterprises adapt a scientific and reasonable cost-effective approach to ensure the batch-to-batch quality consistency of the final products. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models, and EO data together to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three earth scientific models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters), and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distributions of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 April 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental result shows that this newly automated and integrated framework can be used to give advance near real-time warning of dust storms, for both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
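Since the framework exposes its models through OGC WPS, a client triggers them over plain HTTP; the following is a minimal sketch of a standard KVP GetCapabilities request using the requests library. The endpoint URL and the process name mentioned in the comments are hypothetical, not the study's actual service.

```python
# Minimal sketch: querying an OGC WPS endpoint over HTTP.
import requests

WPS_URL = "http://example.org/wps"  # placeholder endpoint, not the real service

# Standard key-value-pair GetCapabilities request listing available processes.
caps = requests.get(WPS_URL, params={
    "service": "WPS",
    "request": "GetCapabilities",
})
print(caps.status_code, caps.headers.get("Content-Type"))

# A subsequent Execute request would name a process (e.g. a hypothetical
# "DustStormDetection") and pass EO data references as inputs, typically
# by POSTing an XML Execute document to the same endpoint.
```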
MIMA, a miniaturized Fourier spectrometer for Mars ground exploration: Part II. Optical design
NASA Astrophysics Data System (ADS)
Fonti, S.; Marzo, G. A.; Politi, R.; Bellucci, G.; Saggin, B.
2007-10-01
The Mars Infrared MApper (MIMA) is an FT-IR miniaturised spectrometer being developed for the ESA ExoMars Pasteur mission. MIMA will be mounted on the rover mast and so must be compact and lightweight. The scientific goals and the thermo-mechanical design are presented in two companion papers [1] and [2]. In this work the optical design is reviewed and the results of the tests performed on some optical components are presented. The design has faced challenging constraints, mainly linked to the requirement of keeping the performance good enough to fulfil the scientific objectives of the mission while, at the same time, keeping the overall size and weight within the allocated resources. In addition, the instrument must be able to operate in the very harsh environment of the Martian surface and to withstand, without permanent damage, even harsher conditions, as well as the severe dynamic loads expected at landing on Mars. The chosen solution is a single-channel double-pendulum interferometer covering the spectral range between 2 and 25 micron, crucial for the scientific interpretation of the recorded spectra, with a resolution variable between 10 and 5 cm-1. Since the spectral range is too wide to be covered by a single detector, it was decided to use two different detectors, mounted side by side in a customised case. Such an innovative solution obviously has pros and cons, and the optical design has been driven by the need to reduce the inconveniences while maintaining the advantages.
ERIC Educational Resources Information Center
Schwichow, Martin; Christoph, Simon; Boone, William J.; Härtig, Hendrik
2016-01-01
The so-called control-of-variables strategy (CVS) incorporates the important scientific reasoning skills of designing controlled experiments and interpreting experimental outcomes. As CVS is a prominent component of science standards appropriate assessment instruments are required to measure these scientific reasoning skills and to evaluate the…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... tools research and development by organizing and implementing joint engineering and scientific research... components in the engineering and scientific areas of electronic systems, hardware design, packaging and... Civil Enforcement, Antitrust Division.
Scientific-Chemical Viewpoints regarding Smoking: A Science Laboratory for All
ERIC Educational Resources Information Center
Blonder, Ron
2008-01-01
This article describes laboratory activity that examines the chemical process of smoking and the components of smoke, of both cigarettes and water pipes (narghiles also known as "hookah"). The aim of this activity is to expose adolescents to the scientific aspects of smoking; and to present the relevance of chemistry in everyday life. (Contains 3…
ERIC Educational Resources Information Center
Piekny, Jeanette; Maehler, Claudia
2013-01-01
According to Klahr's (2000, 2005; Klahr & Dunbar, 1988) Scientific Discovery as Dual Search model, inquiry processes require three cognitive components: hypothesis generation, experimentation, and evidence evaluation. The aim of the present study was to investigate (a) when the ability to evaluate perfect covariation, imperfect covariation,…
ERIC Educational Resources Information Center
Holding, Matthew L.; Denton, Robert D.; Kulesza, Amy E.; Ridgway, Judith S.
2014-01-01
A fundamental component of science curricula is the understanding of scientific inquiry. Although recent trends favor using student inquiry to learn concepts through hands-on activities, it is often unclear to students where the line is drawn between the content and the process of science. This activity explicitly introduces students to the…
ERIC Educational Resources Information Center
Tan, Aik-Ling; Lee, Peter Peng Foo; Cheah, Yin Hong
2017-01-01
This study examines the verbal interactions among a group of pre-service teachers as they engaged in scientific discussions in a medicinal chemistry course. These discussions were part of the course that encompassed an explicit instruction of scientific argumentation structures as well as an applied component, whereby the pre-service teachers…
ERIC Educational Resources Information Center
Sadler, Troy D.
2004-01-01
Science educators have appropriated many meanings for the phrase "scientific literacy" (Champagne & Lovitts, 1989). This paper advances an argument that in order to maintain the usefulness of such a malleable phrase, its users must explicitly address the context of its use. Based on the vision of science education articulated in standards…
Getting Beyond Technical Rationality in Developing Health Behavior Programs With Youth
ERIC Educational Resources Information Center
Perry, Cheryl L.
2004-01-01
Objective: To explore 2 major components of health behavior research, etiologic research and action research. To argue that action research is both an artistic as well as scientific process. Methods: Review of the development process of effective health behavior programs with youth. Review of literature on art as part of the scientific process,…
ERIC Educational Resources Information Center
Mancuso, Vincent J.
2010-01-01
Students' scientific investigations have been identified in national standards and related reform documents as a critical component of students' learning experiences in school, yet it is not easy to implement them in science classrooms. Could science demonstrations help science teachers put this recommendation into practice? While demonstrations…
NASA Technical Reports Server (NTRS)
Nguyen, Daniel H.; Skladany, Lynn M.; Prats, Benito D.; Griffin, Thomas J. (Technical Monitor)
2001-01-01
The Hubble Space Telescope (HST) is one of NASA's most productive astronomical observatories. Launched in 1990, the HST continues to gather scientific data to help scientists around the world discover amazing wonders of the universe. To maintain HST at the forefront of scientific discoveries, NASA has routinely conducted servicing missions to refurbish older equipment as well as to replace existing scientific instruments with better, more powerful instruments. In early 2002, NASA will conduct its fourth servicing mission to the HST. This servicing mission is named Servicing Mission 3B (SM3B). During SM3B, one of the major refurbishment efforts will be to install new rigid-panel solar arrays as a replacement for the existing flexible-foil solar arrays. This is necessary in order to increase electrical power availability for the new scientific instruments. Prior to installing the new solar arrays on HST, the HST project must be certain that the new solar arrays will not cause any performance degradations to the observatory. One of the major concerns is any disturbance that can cause pointing Loss of Lock (LOL) for the telescope. While in orbit, the solar-array temperature transitions quickly from sun to shadow. The resulting thermal expansion and contraction can cause a "mechanical disturbance" which may result in LOL. To better characterize this behavior, a test was conducted at the European Space Research and Technology Centre (ESTEC) in the Large Space Simulator (LSS) thermal-vacuum chamber. In this test, the Sun simulator was used to simulate on-orbit effects on the solar arrays. This paper summarizes the thermal performance of the Solar Array-3 (SA3) during the Disturbance Verification Test (DVT). The test was conducted between 26 October 2000 and 30 October 2000. Included in this paper are: (1) a brief description of the SA3's components and its thermal design; (2) a summary of the on-orbit temperature predictions; (3) pretest thermal preparations; (4) a description of the chamber and thermal monitoring sensors; and (5) presentation of test thermal data results versus flight predictions.
Exploitation of Cytotoxicity of Some Essential Oils for Translation in Cancer Therapy
Russo, Rossella; Corasaniti, Maria Tiziana; Bagetta, Giacinto; Morrone, Luigi Antonio
2015-01-01
Essential oils are complex mixtures of several components endowed with a wide range of biological activities, including antiseptic, anti-inflammatory, spasmolytic, sedative, analgesic, and anesthetic properties. A growing body of scientific reports has recently focused on the potential of essential oils as anticancer treatment in the attempt to overcome the development of multidrug resistance and important side effects associated with the antitumor drugs currently used. In this review we discuss the literature on the effects of essential oils in in vitro and in vivo models of cancer, focusing on the studies performed with the whole phytocomplex rather than single constituents. PMID:25722735
Software aspects of the Geant4 validation repository
NASA Astrophysics Data System (ADS)
Dotti, Andrea; Wenzel, Hans; Elvira, Daniel; Genser, Krzysztof; Yarba, Julia; Carminati, Federico; Folger, Gunter; Konstantinov, Dmitri; Pokorski, Witold; Ribon, Alberto
2017-10-01
The Geant4, GeantV and GENIE collaborations regularly perform validation and regression tests for simulation results. DoSSiER (Database of Scientific Simulation and Experimental Results) is being developed as a central repository to store the simulation results as well as the experimental data used for validation. DoSSiER is easily accessible via a web application. In addition, a web service allows for programmatic access to the repository to extract records in JSON or XML exchange formats. In this article, we describe the functionality and the current status of various components of DoSSiER as well as the technology choices we made.
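The abstract notes that DoSSiER exposes its records programmatically in JSON or XML; here is a minimal sketch of the kind of client call that implies, using the requests library. The base URL, path, and field names are hypothetical placeholders, not the actual DoSSiER API.

```python
# Minimal sketch: fetching validation records from a repository web service.
import requests

BASE = "https://example.org/dossier/api"  # placeholder, not the real endpoint

resp = requests.get(f"{BASE}/records", params={"format": "json"})
resp.raise_for_status()

# Field names below ("id", "experiment") are assumed for illustration.
for record in resp.json():
    print(record.get("id"), record.get("experiment"))
```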
Innovative divertor concept development on DIII-D and EAST
Guo, H. Y.; Allen, S.; Canik, J.; ...
2016-06-02
A critical issue facing the design and operation of next-step high-power steady-state fusion devices is the control of heat fluxes and erosion at the plasma-facing components, in particular the divertor target plates. A new initiative has been launched on DIII-D to develop and demonstrate innovative boundary plasma-materials interface solutions. The central purposes of this new initiative are to advance scientific understanding in this critical area and to develop an advanced divertor concept for application to next-step fusion devices. DIII-D will also leverage strong collaborative efforts on the EAST superconducting tokamak for extending integrated high-performance advanced divertor solutions to true steady state.
Preparing a scientific manuscript in Linux: Today's possibilities and limitations.
Tchantchaleishvili, Vakhtang; Schmitto, Jan D
2011-10-22
An increasing number of scientists are enthusiastic about using free, open-source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open-source software packages would allow them to prepare a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open-source software. This letter to the editor describes the key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system and discusses the necessary software components. This manuscript was created using Linux and open-source programs for Linux.
[SciELO: method for electronic publishing].
Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C
2001-01-01
It describes the SciELO (Scientific Electronic Library Online) methodology for electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles which founded the methodology's development, its application in the building of the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO methodology is an efficient, flexible, and comprehensive solution for scientific electronic publishing.
Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younge, Andrew J.; Pedretti, Kevin; Grant, Ryan
While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputers needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, evaluating both single-node performance and weak scaling of a 32-node virtual cluster. Overall, we find single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
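The abstract states that VMs are managed through libvirt and QEMU; as a rough illustration of that mechanism, here is a minimal sketch using the libvirt Python bindings. The domain name and bare-bones XML are invented, and a real guest would additionally need disk and network devices (and, in the paper's setting, the Ethernet-over-Aries bridge).

```python
# Minimal sketch: booting a transient KVM guest via libvirt/QEMU.
import libvirt  # requires the libvirt-python bindings

# Bare-bones domain definition; a usable guest also needs <devices> with
# a disk image and a network interface, omitted here for brevity.
domain_xml = """
<domain type='kvm'>
  <name>vcluster-node0</name>
  <memory unit='GiB'>4</memory>
  <vcpu>4</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.createXML(domain_xml, 0)     # create and start a transient domain
print("started:", dom.name())
```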
MPAS-Ocean NESAP Status Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petersen, Mark Roger; Arndt, William; Keen, Noel
NESAP performance improvements on MPAS-Ocean have resulted in a 5% to 7% speed-up on each of the examined systems, including Cori-KNL, Cori-Haswell, and Edison. These tests were configured to emulate a production workload by using 128 nodes and a high-resolution ocean domain. Overall, the gap between standard and many-core architecture performance has been narrowed, but Cori-KNL remains considerably under-performing relative to Edison. NESAP code alterations affected 600 lines of code, and most of these improvements will benefit other MPAS codes (sea ice, land ice) that are also components within ACME. Modifications are fully tested within MPAS. Testing in ACME across many platforms is underway, and must be completed before the code is merged. In addition, a ten-year production ACME global simulation was conducted on Cori-KNL in late 2016 with the pre-NESAP code in order to test readiness and configurations for scientific studies. Next steps include assessing performance across a range of nodes, threads per node, and ocean resolutions on Cori-KNL.
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
Neutron resonance spin-echo upgrade at the three-axis spectrometer FLEXX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groitl, F., E-mail: felix.groitl@psi.ch; Quintero-Castro, D. L.; Habicht, K.
2015-02-15
We describe the upgrade of the neutron resonance spin-echo setup at the cold neutron triple-axis spectrometer FLEXX at the BER II neutron source at the Helmholtz-Zentrum Berlin. The parameters of redesigned key components are discussed, including the radio frequency (RF) spin-flip coils, the magnetic shield, and the zero field coupling coils. The RF-flippers with larger beam windows allow for an improved neutron flux transfer from the source to the sample and further to the analyzer. The larger beam cross sections permit higher coil inclination angles and enable measurements on dispersive excitations with a larger slope of the dispersion. Due to the compact design of the spin-echo units in combination with the increased coil tilt angles, the accessible momentum-range in the Larmor diffraction mode is substantially enlarged. In combination with the redesigned components of the FLEXX spectrometer, including the guide, the S-bender polarizer, the double focusing monochromator, and a Heusler crystal analyzer, the count rate increased by a factor of 15.5, and the neutron beam polarization is enhanced. The improved performance extends the range of feasible experiments, both for inelastic scattering on excitation lifetimes in single crystals, and for high-resolution Larmor diffraction. The experimental characterization of the instrument components demonstrates the reliable performance of the new neutron resonance spin-echo option, now available for the scientific community at FLEXX.
Cardiopulmonary resuscitation: a historical perspective leading up to the end of the 19th century.
Ekmektzoglou, Konstantinos A; Johnson, Elizabeth O; Syros, Periklis; Chalkias, Athanasios; Kalambalikis, Lazaros; Xanthos, Theodoros
2012-01-01
Social laws and religious beliefs throughout history underscore the leaps and bounds that the science of resuscitation has achieved from ancient times until today. The effort to resuscitate victims goes back to ancient history, where death was considered a special form of sleep or an act of God. Biblical accounts of resuscitation attempts are numerous. Resuscitation in the Middle Ages was forbidden, but later during Renaissance, any prohibition against performing cardiopulmonary resuscitation (CPR) was challenged, which finally led to the Enlightenment, where scholars attempted to scientifically solve the problem of sudden death. It was then that the various components of CPR (ventilation, circulation, electricity, and organization of emergency medical services) began to take shape. The 19th century gave way to hallmarks both in the ventilatory support (intubation innovations and the artificial respirator) and the open-and closed chest circulatory support. Meanwhile, novel defibrillation techniques had been employed and ventricular fibrillation described. The groundbreaking discoveries of the 20th century finally led to the scientific framework of CPR. In 1960, mouth-to-mouth resuscitation was eventually combined with chest compression and defibrillation to become CPR as we now know it. This review presents the scientific milestones behind one of medicine's most widely used fields.
Undergraduate medical academic performance is improved by scientific training.
Zhang, Lili; Zhang, Wei; Wu, Chong; Liu, Zhongming; Cai, Yunfei; Cao, Xingguo; He, Yushan; Liu, Guoxiang; Miao, Hongming
2017-09-01
The effect of scientific training on course learning in undergraduates is still controversial. In this study, we investigated the academic performance of undergraduate students with and without scientific training. The results show that scientific training improves students' test scores in general medical courses, such as biochemistry and molecular biology, cell biology, physiology, and even English. We classified scientific training into four levels. We found that literature reading could significantly improve students' test scores in general courses. Students who received scientific training, carried out experiments effectively, and published articles performed better than their untrained counterparts in biochemistry and molecular biology examinations. The questionnaire survey demonstrated that the trained students were more confident of their course learning, and displayed more interest, motivation, and capability in course learning. In summary, undergraduate academic performance is improved by scientific training. Our findings shed light on novel strategies in the management of undergraduate education in the medical school.
NASA Astrophysics Data System (ADS)
Washington, W. M.
2010-12-01
The development of climate and earth system models has been regarded primarily as the making of scientific tools to study the complex nature of the Earth's climate. These models have a long history, starting with very simple physical models based on fundamental physics in the 1960s; over time they have become much more complex, with atmospheric, ocean, sea ice, land/vegetation, biogeochemical, glacial and ecological components. In the 1960s and 1970s these models were not yet used as decision-making tools; rather, they were used to answer fundamental scientific questions, such as what happens when the atmospheric carbon dioxide concentration increases or is doubled. They gave insights into the various interactions and were extensively compared with observations. It was realized that models of the earlier time periods could only give first-order answers to many of the fundamental policy questions. As societal concerns about climate change rose, the policy questions of anthropogenic climate change became better defined; they were mostly concerned with the climate impacts of increasing greenhouse gases, aerosols, and land cover change. In the late 1980s, the United Nations set up the Intergovernmental Panel on Climate Change to perform assessments of the published literature. Thus, the development of climate and Earth system models became intimately linked to the need not only to improve our scientific understanding but also to answer fundamental policy questions. In order to meet this challenge, the models became more complex and realistic so that they could address policy-oriented science questions such as rising sea level. The presentation will discuss the past and future development of global climate and earth system models for science and policy purposes. Also to be discussed are their interactions with economic integrated assessment models and with regional and specialized models such as river transport or ecological components. As an example of one development pathway, the NSF/Department of Energy supported Community Climate System and Earth System Models will be featured in the presentation. Computational challenges will also be part of the discussion.
The neutron star interior composition explorer (NICER): mission definition
NASA Astrophysics Data System (ADS)
Arzoumanian, Z.; Gendreau, K. C.; Baker, C. L.; Cazeau, T.; Hestnes, P.; Kellogg, J. W.; Kenyon, S. J.; Kozon, R. P.; Liu, K.-C.; Manthripragada, S. S.; Markwardt, C. B.; Mitchell, A. L.; Mitchell, J. W.; Monroe, C. A.; Okajima, T.; Pollard, S. E.; Powers, D. F.; Savadkin, B. J.; Winternitz, L. B.; Chen, P. T.; Wright, M. R.; Foster, R.; Prigozhin, G.; Remillard, R.; Doty, J.
2014-07-01
Over a 10-month period during 2013 and early 2014, development of the Neutron star Interior Composition Explorer (NICER) mission [1] proceeded through Phase B, Mission Definition. An external attached payload on the International Space Station (ISS), NICER is scheduled to launch in 2016 for an 18-month baseline mission. Its prime scientific focus is an in-depth investigation of neutron stars—objects that compress up to two Solar masses into a volume the size of a city—accomplished through observations in 0.2-12 keV X-rays, the electromagnetic band into which the stars radiate significant fractions of their thermal, magnetic, and rotational energy stores. Additionally, NICER enables the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) demonstration of spacecraft navigation using pulsars as beacons. During Phase B, substantive refinements were made to the mission-level requirements, concept of operations, and payload and instrument design. Fabrication and testing of engineering-model components improved the fidelity of the anticipated scientific performance of NICER's X-ray Timing Instrument (XTI), as well as of the payload's pointing system, which enables tracking of science targets from the ISS platform. We briefly summarize advances in the mission's formulation that, together with strong programmatic performance in project management, culminated in NICER's confirmation by NASA into Phase C, Design and Development, in March 2014.
A fiber-coupled gas cell for space application
NASA Astrophysics Data System (ADS)
Thomin, Stéphane; Bera, Olivier; Beraud, Pascal; Lecallier, Arnaud; Tonck, Laurence; Belmana, Salem
2017-09-01
An increasing number of space-borne optical instruments now include fiber components. Telecom-type components have proved their reliability and versatility for space missions. Fiber lasers are now used for various purposes, such as remote IR-sounding missions, metrology, scientific missions, and optical links (satellite-to-satellite, Earth-to-satellite).
Age and Scientific Performance.
ERIC Educational Resources Information Center
Cole, Stephen
1979-01-01
The long-standing belief that age is negatively associated with scientific productivity and creativity is shown to be based upon incorrect analysis of data. Studies reported in this article suggest that the relationship between age and scientific performance is influenced by the operation of the reward system. (Author)
The QuakeSim Project: Web Services for Managing Geophysical Data and Applications
NASA Astrophysics Data System (ADS)
Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet
2008-04-01
We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.
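The archival data services described above follow Open Geospatial Consortium conventions, so a client can retrieve features with an ordinary HTTP GetFeature request. Below is a minimal Python sketch of such a query; the endpoint URL and feature type name are hypothetical placeholders, not actual QuakeSim identifiers.

```python
# Illustrative sketch: querying an OGC Web Feature Service (WFS) endpoint of
# the kind the QuakeSim data services expose. Endpoint and typeName are
# hypothetical, not real QuakeSim identifiers.
import requests

WFS_URL = "http://example.org/quakesim/wfs"  # hypothetical endpoint

params = {
    "service": "WFS",                        # standard OGC WFS parameters
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "faults:california_segments",  # hypothetical feature type
    "outputFormat": "GML2",
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # start of the GML feature collection
```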
The hazards of hazard identification in environmental epidemiology.
Saracci, Rodolfo
2017-08-09
Hazard identification is a major scientific challenge, notably for environmental epidemiology, and is often surrounded, as the recent case of glyphosate shows, by debate arising in the first place from the inherently problematic nature of many components of the identification process. Particularly relevant in this respect are components less amenable to logical or mathematical formalization and essentially dependent on scientists' judgment. Four such potentially hazardous components that are capable of distorting the correct process of hazard identification are reviewed and discussed from an epidemiologist's perspective: (1) lexical mix-up of hazard and risk; (2) scientific questions as distinct from testable hypotheses, and implications for the hierarchy of strength of evidence obtainable from different types of study designs; (3) assumptions in prior beliefs and model choices; and (4) conflicts of interest. Four suggestions are put forward to strengthen a process that remains in several aspects judgmental, but not arbitrary, in nature.
ERIC Educational Resources Information Center
Kraemer, Sara; Thorn, Christopher A.
2010-01-01
The purpose of this exploratory study was to identify and describe some of the dimensions of scientific collaborations using high throughput computing (HTC) through the lens of a virtual team performance framework. A secondary purpose was to assess the viability of using a virtual team performance framework to study scientific collaborations using…
Mythical thinking, scientific discourses and research dissemination.
Hroar Klempe, Sven
2011-06-01
This article focuses on some principles for understanding. Taking Anna Mikulak's article "Mismatches between 'scientific' and 'non-scientific' ways of knowing and their contributions to public understanding of science" (IPBS 2011) as a point of departure, it addresses the idea of demarcation criteria for scientific and non-scientific discourses, juxtaposed with mythical thinking, which is supposed to be the most salient trait of non-scientific discourses. The author demonstrates how the most widespread demarcation criterion, the criterion of verification, is self-contradictory, not only in terms of logic, but also in its aim of isolating the natural sciences from other forms of knowledge. According to Aristotle, induction is a rhetorical device, and insofar as scientific statements are based on inductive inferences, they rely on the humanities, of which rhetoric is a part. Yet induction also has an empirical component, being based on sense-impressions, which belongs not to rhetoric but to psychology. Myths, likewise, are understood from a rhetorical (Lévi-Strauss) and a psychological (Cassirer) perspective. Thus it is argued that both scientific and non-scientific discourses can be mythical.
1972-02-01
The final version of the Marshall Space Flight Center managed Skylab consisted of four primary parts. One component was the Apollo Telescope Mount (ATM), which housed the first manned scientific telescopes in space. This picture is a view of the ATM spar, which contained the scientific instruments, as the multiple docking adapter (MDA) canister end is lowered over it. The MDA served to link the major parts of Skylab together.
ERIC Educational Resources Information Center
Stansfield, William D.
2013-01-01
Students should not graduate from high school without understanding that scientific debates are essential components of scientific methodology. This article presents a brief history of ongoing debates regarding the hypothesis that group selection is an evolutionary mechanism, and it serves as an example of the role that debates play in correcting…
ERIC Educational Resources Information Center
Leung, Jessica Shuk Ching; Wong, Alice Siu Ling; Yung, Benny Hin Wai
2015-01-01
Understandings of nature of science (NOS) are a core component of scientific literacy, and a scientifically literate populace is expected to be able to critically evaluate science in the media. While evidence has remained inconclusive on whether better NOS understandings will lead to critical evaluation of science in the media, this study aimed at…
The influence of essential oils on human attention. I: alertness.
Ilmberger, J; Heuberger, E; Mahrhofer, C; Dessovic, H; Kowarik, D; Buchbauer, G
2001-03-01
Scientific research on the effects of essential oils on human behavior lags behind the promises made by popular aromatherapy. Nearly all aspects of human behavior are closely linked to processes of attention, the basic level being that of alertness, which ranges from sleep to wakefulness. In our study we measured the influence of essential oils and components of essential oils [peppermint, jasmine, ylang-ylang, 1,8-cineole (in two different dosages) and menthol] on this core attentional function, which can be experimentally defined as speed of information processing. Substances were administered by inhalation; levels of alertness were assessed by measuring motor and reaction times in a reaction time paradigm. The performances of the six experimental groups receiving substances (n = 20 in four groups, n = 30 in two groups) were compared with those of corresponding control groups receiving water. Between-group analyses, i.e. comparisons between experimental groups and their respective control groups, mostly did not reach statistical significance. However, within-group analyses showed complex correlations between subjective evaluations of substances and objective performance, indicating that the effects of essential oils or their components on basic forms of attentional behavior are mainly psychological.
The quest to standardize hemodialysis care.
Hegbrant, Jörgen; Gentile, Giorgio; Strippoli, Giovanni F M
2011-01-01
A large global dialysis provider's core activities include providing dialysis care with excellent quality, ensuring a low variability across the clinic network and ensuring strong focus on patient safety. In this article, we summarize the pertinent components of the quality assurance and safety program of the Diaverum Renal Services Group. Concerning medical performance, the key components of a successful quality program are setting treatment targets; implementing evidence-based guidelines and clinical protocols; consistently, regularly, prospectively and accurately collecting data from all clinics in the network; processing collected data to provide feedback to clinics in a timely manner, incorporating information on interclinic and intercountry variations; and revising targets, guidelines and clinical protocols based on sound scientific data. The key activities for ensuring patient safety include a standardized approach to education, i.e. a uniform education program including control of theoretical knowledge and clinical competencies; implementation of clinical policies and procedures in the organization in order to reduce variability and potential defects in clinic practice; and auditing of clinical practice on a regular basis. By applying a standardized and systematic continuous quality improvement approach throughout the entire organization, it has been possible for Diaverum to progressively improve medical performance and ensure patient safety. Copyright © 2011 S. Karger AG, Basel.
Peterfreund, Alan R.; Xenos, Samuel P.; Bayliss, Frank; Carnal, Nancy
2007-01-01
Supplemental instruction classes have been shown in many studies to enhance performance in the supported courses and even to improve graduation rates. Generally, there has been little evidence of a differential impact on students from different ethnic/racial backgrounds. At San Francisco State University, however, supplemental instruction in the Introductory Biology I class is associated with even more dramatic gains among students from underrepresented minority populations than the gains found among their peers. These gains do not seem to be the product of better students availing themselves of supplemental instruction or other outside factors. The Introductory Biology I class consists of a team-taught lecture component, taught in a large lecture classroom, and a laboratory component where students participate in smaller lab sections. Students are expected to master an understanding of basic concepts, content, and vocabulary in biology as well as gain laboratory investigation skills and experience applying scientific methodology. In this context, supplemental instruction classes are cooperative learning environments where students participate in learning activities that complement the course material, focusing on student misconceptions and difficulties, construction of a scaffolded knowledge base, applications involving problem solving, and articulation of constructs with peers. PMID:17785403
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. In particular, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, cataloguing, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
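The workflow structures described above range from linear pipelines to directed acyclic graphs. As a rough illustration of the core idea, not of SWAMP itself, the Python sketch below represents a workflow as a DAG and executes its tasks in dependency order; the task names and the serial executor are invented for the example.

```python
# Minimal sketch: a scientific workflow as a directed acyclic graph, executed
# in dependency order. SWAMP itself targets distributed, heterogeneous
# network environments; this serial executor is illustrative only.
from graphlib import TopologicalSorter  # Python 3.9+

# each task maps to the set of tasks it depends on (hypothetical names)
workflow = {
    "acquire":    set(),                    # e.g., pull data from an instrument
    "filter":     {"acquire"},
    "simulate":   {"acquire"},
    "synthesize": {"filter", "simulate"},
    "visualize":  {"synthesize"},
}

def run_task(name: str) -> None:
    print(f"running {name}")                # placeholder for the real computation

for task in TopologicalSorter(workflow).static_order():
    run_task(task)                           # respects all dependencies
```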
Forecasting Space Weather-Induced GPS Performance Degradation Using Random Forest
NASA Astrophysics Data System (ADS)
Filjar, R.; Filic, M.; Milinkovic, F.
2017-12-01
Space weather and ionospheric dynamics have a profound effect on the positioning performance of Global Navigation Satellite Systems (GNSS). However, the quantification of that effect is still the subject of scientific activities around the world. In the latest contribution to the understanding of space weather and ionospheric effects on satellite-based positioning performance, we studied several candidate methods for forecasting space weather-induced GPS positioning performance deterioration. First, a 5-day set of experimentally collected data was established, encompassing space weather and ionospheric activity indices (including: readings of the Sudden Ionospheric Disturbance (SID) monitors, components of geomagnetic field strength, the global Kp index, the Dst index, GPS-derived Total Electron Content (TEC) samples, the standard deviation of TEC samples, and sunspot number) and observations of GPS positioning error components (northing, easting, and height positioning error) derived from the Adriatic Sea IGS reference stations' RINEX raw pseudorange files in quiet space weather periods. This data set was split into training and test sub-sets. Then, a selected set of supervised machine learning methods based on Random Forest was applied to the experimentally collected data set in order to establish appropriate regional (Adriatic Sea) forecasting models for space weather-induced GPS positioning performance deterioration. The forecasting models were developed in the R/rattle statistical programming environment. The forecasting quality of the regional models developed was assessed, and conclusions were drawn on the advantages and shortcomings of regional forecasting models for space weather-caused GNSS positioning performance deterioration.
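As a rough illustration of this forecasting setup, the sketch below fits a Random Forest regressor to space-weather indices to predict one GPS positioning error component. It uses Python's scikit-learn in place of the authors' R/rattle environment, and the data file and column names are hypothetical.

```python
# Conceptual sketch of the Random Forest forecasting setup described above.
# Feature names follow the indices listed in the abstract; the CSV file and
# its columns are hypothetical stand-ins for the authors' 5-day data set.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("space_weather_gps.csv")      # hypothetical experimental set
features = ["kp", "dst", "tec", "tec_std", "sunspot_number"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["northing_error"], test_size=0.3, random_state=42)

model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```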
Science games and the development of scientific possible selves.
Beier, Margaret; Miller, Leslie; Wang, Shu
2012-12-01
Serious scientific games, especially those that include a virtual apprenticeship component, provide players with realistic experiences in science. This article discusses how science games can influence learning about science and the development of science-oriented possible selves through repeated practice in professional play and through social influences (e.g., peer groups). We first review the theory of possible selves (Markus and Nurius 1986) and discuss the potential of serious scientific games for influencing the development of scientific possible selves. As part of our review, we present a forensic game that inspired our work. Next we present a measure of scientific possible selves and assess its reliability and validity with a sample of middle-school students (N=374). We conclude by discussing the promise of science games and the development of scientific possible selves on both the individual and group levels as a means of inspiring STEM careers among adolescents.
Science games and the development of scientific possible selves
Beier, Margaret; Miller, Leslie; Wang, Shu
2012-01-01
Serious scientific games, especially those that include a virtual apprenticeship component, provide players with realistic experiences in science. This article discusses how science games can influence learning about science and the development of science-oriented possible selves through repeated practice in professional play and through social influences (e.g., peer groups). We first review the theory of possible selves (Markus and Nurius 1986) and discuss the potential of serious scientific games for influencing the development of scientific possible selves. As part of our review, we present a forensic game that inspired our work. Next we present a measure of scientific possible selves and assess its reliability and validity with a sample of middle-school students (N=374). We conclude by discussing the promise of science games and the development of scientific possible selves on both the individual and group levels as a means of inspiring STEM careers among adolescents. PMID:23483731
Science games and the development of scientific possible selves
NASA Astrophysics Data System (ADS)
Beier, Margaret E.; Miller, Leslie M.; Wang, Shu
2012-12-01
Serious scientific games, especially those that include a virtual apprenticeship component, provide players with realistic experiences in science. This article discusses how science games can influence learning about science and the development of science-oriented possible selves through repeated practice in professional play and through social influences (e.g., peer groups). We first review the theory of possible selves (Markus and Nurius 1986) and discuss the potential of serious scientific games for influencing the development of scientific possible selves. As part of our review, we present a forensic game that inspired our work. Next we present a measure of scientific possible selves and assess its reliability and validity with a sample of middle-school students ( N = 374). We conclude by discussing the promise of science games and the development of scientific possible selves on both the individual and group levels as a means of inspiring STEM careers among adolescents.
Ji, Tao; Su, Shu-Lan; Guo, Sheng; Qian, Da-Wei; Ouyang, Zhen; Duan, Jin-Ao
2016-06-01
Column chromatography was used for the enrichment and separation of flavonoids, alkaloids and polysaccharides from extracts of Morus alba leaves; the glucose oxidase method, with sucrose as the substrate, was used to evaluate the multiple components of M. alba leaves in α-glucosidase inhibition models; the isobole method, Chou-Talalay combination index analysis and isobolographic analysis were used to evaluate the interaction effects and dose-effect characteristics of two-component combinations, providing a scientific basis for revealing the hypoglycemic mechanism of M. alba leaves. Component analysis showed that the flavonoid content was 5.3%; organic phenolic acid content was 10.8%; DNJ content was 39.4%; and polysaccharide content was 18.9%. Activity evaluation results demonstrated that the flavonoids, alkaloids and polysaccharides of M. alba leaves had significant inhibitory effects on α-glucosidase, and the inhibition rate increased with increasing concentration. Alkaloids showed the most significant inhibitory effects among the three components. Both the compatibility of alkaloids and flavonoids and the compatibility of alkaloids and polysaccharides demonstrated synergistic effects, but the compatibility of flavonoids and polysaccharides showed no obvious synergistic effects. The results confirmed the interaction of multiple components from M. alba leaves in regulating blood sugar, and provide a scientific basis for revealing the hypoglycemic effectiveness and mechanism of the multiple components of M. alba leaves. Copyright© by the Chinese Pharmaceutical Association.
Optical components damage parameters database system
NASA Astrophysics Data System (ADS)
Tao, Yizheng; Li, Xinglan; Jin, Yuquan; Xie, Dongmei; Tang, Dingyong
2012-10-01
Optical components are key elements of large-scale laser devices; their load capacity is directly related to the device's output capacity, and that load capacity depends on many factors. By digitizing the factors that determine optical component damage parameters, the database provides scientific data support for assessing the load capacity of optical components. Using business-process and model-driven approaches, a component damage parameter information model and database system were established. Application of the system shows that it meets the business-process and data-management requirements of optical component damage testing; its parameters are flexible and configurable, and the system is simple and easy to use, improving the efficiency of optical component damage testing.
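As an illustration of the kind of storage such a system needs, the sketch below creates a minimal damage-parameter table and records one test result. The table layout, field names, and values are assumptions made for the example, not the system's actual information model.

```python
# Minimal sketch of a damage-parameter store; schema and values are invented
# for illustration, not taken from the system described above.
import sqlite3

conn = sqlite3.connect("optical_damage.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS damage_test (
        component_id   TEXT,    -- which optical component was tested
        test_date      TEXT,
        wavelength_nm  REAL,
        pulse_width_ns REAL,
        fluence_J_cm2  REAL,    -- applied laser fluence
        damaged        INTEGER  -- 0/1 outcome of the damage test
    )
""")
conn.execute(
    "INSERT INTO damage_test VALUES (?, ?, ?, ?, ?, ?)",
    ("lens-001", "2012-05-14", 351.0, 3.0, 8.2, 0),  # hypothetical record
)
conn.commit()
```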
Preparing a scientific manuscript in Linux: Today's possibilities and limitations
2011-01-01
Background Increasing numbers of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need for proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246
Li, Panlin; Su, Weiwei; Yun, Sha; Liao, Yiqiu; Liao, Yinyin; Liu, Hong; Li, Peibo; Wang, Yonggang; Peng, Wei; Yao, Hongliang
2017-01-01
Since traditional Chinese medicine (TCM) is a complex mixture of multiple components, the application of methodologies for evaluating single-component Western medicines in TCM studies may have certain limitations. Appropriate strategies that recognize the integrality of TCM and connect to TCM theories remain to be developed. Here we use multiple unique approaches to study the scientific connotation of the TCM formula Dan-hong injection (DHI) without undermining its prescription integrity. The blood-circulation-improving and healing-promoting effects of DHI were assessed in a qi-stagnation blood-stasis rat model and a mouse model of laser-irradiation-induced cerebral microvascular thrombosis. By UFLC-PDA-Triple Q-TOF-MS/MS and relevance analysis between chemical characters and biological effects, 82 chemical constituents and nine core components, whose blood circulation promoting effects were found comparable to that of whole DHI, were successfully identified. Moreover, the rationality of DHI prescription compatibility is reflected not only in the maximum efficacy of the original ratio, but also in the interactions of compounds from different ingredient herbs, such as complementary activities and facilitated tissue distribution. This study provides scientific evidence in explanation of the clinical benefits of DHI, and also gives a good demonstration for the comprehensive evaluation of other TCMs. PMID:28393856
Rasmussen, Charlotte Diana Nørregaard; Højberg, Helene; Bengtsen, Elizabeth; Jørgensen, Marie Birk
2018-02-01
In a recent study, we involved all relevant stakeholders to identify practice-based implementation components for successful implementation and sustainability in work environment interventions. To understand possible knowledge gaps between evidence and practice, the aim of this paper is to investigate whether effectiveness studies of the 11 practice-based implementation components can be identified in the existing scientific literature. PubMed/MEDLINE, PsycINFO, and Web of Science were searched for relevant studies. After screening, 38 articles met the inclusion criteria. Since some of the studies describe more than one practice-based implementation concept, a total of 125 quality criteria assessments were made. The overall result is that 10 of the 11 practice-based implementation components can be found in the scientific literature, but their evaluation is poor. From this review it is clear that there are knowledge gaps between evidence and practice with respect to the effectiveness of implementation concepts. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
[Content determination of twelve major components in Tibetan medicine Zuozhu Daxi by UPLC].
Qu, Yan; Li, Jin-hua; Zhang, Chen; Li, Chun-xue; Dong, Hong-jiao; Wang, Chang-sheng; Zeng, Rui; Chen, Xiao-hu
2015-05-01
A quantitative analytical method based on ultra-high performance liquid chromatography (UPLC) was developed for simultaneously determining twelve components in the Tibetan medicine Zuozhu Daxi. SIMCA 12.0 software was used for principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) of the twelve components in 10 batches from four pharmaceutical factories. An Acquity UPLC BEH C18 column (2.1 mm x 100 mm, 1.7 µm) was adopted at a column temperature of 35 °C and eluted with acetonitrile (A)-0.05% phosphoric acid solution (B) as the mobile phase at a flow rate of 0.3 mL·min⁻¹. The injection volume was 1 µL. The detection wavelengths were set at 210 nm for alantolactone, isoalantolactone and oleanolic acid; 260 nm for strychnine and brucine; 288 nm for protopine; 306 nm for protopine, resveratrol and piperine; and 370 nm for quercetin and isorhamnetin. The results showed good separation among the index components, with a good linear relationship (R² = 0.9996) within the selected concentration range. The average sample recovery rates ranged from 99.44% to 101.8%, with RSDs of 0.37%-1.7%, indicating that the method is rapid and accurate with good repeatability and stability. The PCA and PLS-DA analysis of the sample determination results revealed great differences among samples from different pharmaceutical factories. The twelve components included in this study contribute significantly to the quantitative determination of the intrinsic quality of Zuozhu Daxi. The UPLC method established for the quantitative determination of the twelve components can provide a scientific basis for the comprehensive quality evaluation of Zuozhu Daxi.
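To illustrate the PCA step on data of this shape, the sketch below projects a batch-by-component matrix of peak areas onto its first two principal components. It uses Python's scikit-learn rather than SIMCA, and the values are random placeholders rather than the study's measurements.

```python
# Illustrative PCA sketch: rows are batches, columns are the twelve quantified
# components. Values are random placeholders; the authors used SIMCA software.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
peak_areas = rng.random((10, 12))       # 10 batches x 12 components (hypothetical)

scaled = StandardScaler().fit_transform(peak_areas)   # autoscale each component
scores = PCA(n_components=2).fit_transform(scaled)
print(scores)                            # batch coordinates on PC1 and PC2
```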
Was Hawan Designed to Fight Anxiety-Scientific Evidences?
Romana, R K; Sharma, A; Gupta, V; Kaur, R; Kumar, S; Bansal, P
2017-01-06
Anxiety is a psychiatric disorder with unknown neurobiology; however, neurotransmitters like gamma-amino butyric acid, norepinephrine and serotonin (5-HT) play crucial roles in mediating anxiety. Present drug regimens pose a dependence risk to the patient; hence, there is a great need to develop complementary therapies to fight this disorder. Aromatherapy has been employed since ancient times for a number of mental disorders. The Mahamrituanjay Mantra, "Om triambkum yajamahe, sughandhim puushtivardhanam, urvarukmev vandhanaat, mrityu mokshay mamritaat", a part of the Vedas, teaches that aroma gives rise to good health (sughandhim puushtivardhanam). Hawan is a religious practice recommended for mental peace: a process in which special herbs are offered into a fire of medicinal woods ignited in a specially designed fire pit. Analysis of the literature demonstrates that the components of Hawan contain a number of volatile oils that are specifically useful for the prevention and treatment of anxiety through certain mechanisms of action. Due to the high temperature of the fire, the vapors of these oils enter the central nervous system through the nasal route. As per modern science and ancient texts on medicine, nasal drug delivery systems are best suited for diseases related to the brain and head. The routine of performing Hawan might keep the threshold value of the therapeutic components in the body and help in preventing anxiety. In the present manuscript, the authors highlight and integrate the modern and ancient concepts for the treatment and prevention of anxiety through scientific evidence.
Schor, Nina Felice; Troen, Philip; Kanter, Steven L; Levine, Arthur S
2005-09-01
Many U.S. medical schools offer students the opportunity to undertake laboratory or clinical research or another form of scholarly project over the summer months, yet few require this as a prerequisite for graduation, and even fewer provide comprehensive didactic material in preparation for the performance of such a project as an integrated component of their curricula. The authors describe the Scholarly Project Initiative of the University of Pittsburgh School of Medicine, a novel, longitudinal, and required program. The program will aim to provide all students with structured preparatory coursework, foster critical analytical and communication skills, and introduce the breadth and depth of the research and scholarly enterprise engendered by modern academic medicine in the contexts of both the classroom and an individual, mentored experience. The initiative has two goals: encouraging an interest in academic medicine in an era marked by the continuing decline in the number of physician-investigators, and fostering the development of physicians who have confidence in their abilities to practice medicine with creativity, original and analytical thought, and relentless attention to the scientific method. Planning for the Scholarly Project Initiative began officially at the University of Pittsburgh School of Medicine's Curriculum Colloquium in May 2003. The initiative was implemented with the first-year class of July 2004 as part of the new "Scientific Reasoning and Medicine" block of the School of Medicine's curriculum. The block as a whole includes traditional lectures, small-group laboratory and problem-based sessions, and mentored independent study components.
Optical Characteristics of the Marshall Space Flight Center Solar Ultraviolet Magnetograph
NASA Technical Reports Server (NTRS)
West, Edward; Porter, Jason; Davis, John; Gary, Allen; Adams, Mitzi; Rose, M. Franklin (Technical Monitor)
2001-01-01
This paper will describe the scientific objectives of the MSFC SUMI project and the optical components that have been developed to meet those objectives. In order to test the scientific feasibility of measuring magnetic fields in the UV, a sounding rocket payload is being developed. This paper will describe the optical measurements that have been made on the SUMI telescope mirrors and polarization optics.
NASA Technical Reports Server (NTRS)
Borisenkov, Y. P.; Fedorov, O. M.
1974-01-01
A report is made on the automated system known as SIGMA-s for the measurement, collection, and processing of hydrometeorological data aboard scientific research vessels of the Hydrometeorological Service. The various components of the system and the interfacing between them are described, as well as the projects that the system is equipped to handle.
Scientific and Technical Personnel in Industry, 1960
1960-01-01
calculated on the basis of unrounded research and development figures and therefore may not correspond exactly with those indicated by the... Subject to a standard... The second type, designated as secondary estimates, were components of the primary estimates. As a result of all the exclusions described above... design, beginning with the 1959 survey; earlier ones were conducted by the Bureau of Labor Statistics. A fairly detailed two-way breakdown of scientific...
USSR and Eastern Europe Scientific Abstracts, Biomedical and Behavioral Sciences, Number 96.
1978-10-26
COMPONENTS IN THE FERMENTATION BROTH AFFECTING THE DISTRIBUTION OF OLEANDOMYCIN DURING EXTRACTION Moscow ANTIBIOTIKI in Russian No 7, 1978 pp 626... Various extraction studies were conducted with fermentation broths, which led to the conclusion that the broth contains component(s) that binds... had taken, internally, 80-150 ml of a vinegar essence. Twenty-six died in the first 46 hrs displaying decompensated shock. Studies included EKG
Ynalvez, Marcus Antonius; Ynalvez, Ruby A; Ramírez, Enrique
2017-03-04
We explored the social shaping of science at the micro-level reality of face-to-face interaction in one of the traditional places for scientific activities: the scientific lab. We specifically examined how doctoral students' perceptions of (i) their interaction with doctoral mentors (MMI) and (ii) their lab social environment (LSE) influenced productivity. Productivity, construed as the production of peer-reviewed articles, was measured using the total number of articles (TOTAL), the number of articles with an impact factor greater than or equal to 4.00 (IFGE4), and the number of first-authored articles (NFA). Via face-to-face interviews, we obtained data from n = 210 molecular biology Ph.D. students in selected universities in Japan, Singapore, and Taiwan. Additional productivity data (NFA) were obtained from online bibliometric databases. To summarize the original 13 MMI and 13 LSE semantic-differential items used to measure students' perceptions, principal component (PC) analyses were performed. The results were smaller sets of 4 MMI PCs and 4 LSE PCs. To identify which PCs influenced publication counts, we performed Poisson regression analyses. Although perceived MMI was not linked to productivity, perceived LSE was: students who perceived their LSE as intellectually stimulating reported high levels of productivity in both TOTAL and IFGE4, but not in NFA. Our findings not only highlight how students' perceptions of their training environment factor into the production of scientific output, they also carry important implications for improving mentoring programs in science. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(2):130-144, 2017.
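For readers unfamiliar with the analysis, the sketch below shows a Poisson regression of an article count on perception-derived principal component scores, mirroring the modeling strategy described above. The data file and column names are hypothetical, and Python's statsmodels stands in for whatever software the authors used.

```python
# Sketch of a Poisson regression of publication counts on perception-derived
# principal components. File and column names are hypothetical; the study's
# actual PC scores are not reproduced here.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("phd_students.csv")               # hypothetical data file
X = sm.add_constant(df[["lse_pc1", "lse_pc2", "mmi_pc1", "mmi_pc2"]])

# GLM with a Poisson family models the count outcome (total articles)
model = sm.GLM(df["total_articles"], X, family=sm.families.Poisson()).fit()
print(model.summary())                              # PC coefficients and p-values
```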
NASA Technical Reports Server (NTRS)
Webster, W., Jr.; Frawley, J. J.; Stefanik, M.
1984-01-01
Simulation studies established that the main (core), crustal, and electrojet components of the Earth's magnetic field can be observed with greater resolution, or over a longer time-base, than is presently possible by using the capabilities provided by the space station. Two systems are studied. The first, a long-lifetime magnetic monitor, would observe the main field and its time variation. The second, a remotely-piloted magnetic probe, would observe the crustal field at low altitude and the electrojet field in situ. The system design and the scientific performance of these systems are assessed. The advantages of the space station are reviewed.
SHINE, The SpHere INfrared survey for Exoplanets
NASA Astrophysics Data System (ADS)
Chauvin, G.; Desidera, S.; Lagrange, A.-M.; Vigan, A.; Feldt, M.; Gratton, R.; Langlois, M.; Cheetham, A.; Bonnefoy, M.; Meyer, M.
2017-12-01
SHINE, the SPHERE High-contrast ImagiNg survey for Exoplanets, is a large near-infrared survey of 400-600 young, nearby stars and represents a significant component of the SPHERE consortium's Guaranteed Time Observations, consisting of 200 observing nights. The scientific goals are: i) to characterize known planetary systems (architecture, orbit, stability, luminosity, atmosphere); ii) to search for new planetary systems using SPHERE's unprecedented performance; and finally iii) to determine the occurrence and the orbital and mass function properties of the wide-orbit giant planet population as a function of stellar host mass and age. Combined, the results will increase our understanding of planetary atmospheric physics and the processes of planetary formation and evolution.
Geopotential research mission, science, engineering and program summary
NASA Technical Reports Server (NTRS)
Keating, T. (Editor); Taylor, P. (Editor); Kahn, W. (Editor); Lerch, F. (Editor)
1986-01-01
This report is based upon the accumulated scientific and engineering studies pertaining to the Geopotential Research Mission (GRM). The scientific need and justification for the measurement of the Earth's gravity and magnetic fields are discussed. Emphasis is placed upon the studies and conclusions of scientific organizations and NASA advisory groups. The engineering design and investigations performed over the last 4 years are described, and a spacecraft design capable of fulfilling all scientific objectives is presented. In addition, critical features of the scientific requirements and state-of-the-art limitations of spacecraft design, mission flight performance, and data processing are discussed.
Hamedi, Azadeh; Afifi, Mehdi; Etemadfard, Hamed
2017-01-01
Hydrosol soft drinks in Persian nutrition culture are produced as side products of the essential oil industry to be used as safe remedies for treatment of some ailments. This study investigated hydrosols for women's hormonal health conditions. Detailed information was gathered by questionnaires. Chemical constituents of these mono- or poly-herbal hydrosols were identified after liquid/liquid extraction and gas chromatography-mass spectrometry. Hierarchical cluster and K-means analysis (SPSS software) were used to find their relevance. A literature survey was also performed. In most cases, thymol, carvacrol, and carvone were the major constituents, except for dill, white horehound, willow, Moderr, and yarrow hydrosols, whose major components were dill ether, menthol, phenethyl alcohol, linalool, or camphor. Based on clustering methods, some similarities could be found in their constituents, with some exceptions. None of them have been studied scientifically before. These investigations may lead to the development of some functional drinks or even new lead components. PMID:28701045
NASA Astrophysics Data System (ADS)
Anderson, D. V.; Koniges, A. E.; Shumaker, D. E.
1988-11-01
Many physical problems require the solution of coupled partial differential equations on three-dimensional domains. When the time scales of interest dictate an implicit discretization of the equations, a rather complicated global matrix system needs solution. The exact form of the matrix depends on the choice of spatial grids and on the finite element or finite difference approximations employed. CPDES3 allows each spatial operator to have 7, 15, 19, or 27 point stencils, allows for general couplings between all of the component PDEs, and automatically generates the matrix structures needed to perform the algorithm. The resulting sparse matrix equation is solved by either the preconditioned conjugate gradient (CG) method or the preconditioned biconjugate gradient (BCG) algorithm. An arbitrary number of component equations is permitted, limited only by available memory. In the sub-band representation used, we generate an algorithm that is written compactly in terms of indirect indices and is vectorizable on some of the newer scientific computers.
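As a small-scale stand-in for the solver stage, the sketch below assembles a 7-point 3-D Laplacian, the simplest of the stencils CPDES3 supports, and solves it with a Jacobi-preconditioned conjugate gradient via SciPy. A single scalar equation replaces CPDES3's coupled multi-component systems, so this is illustrative only.

```python
# Sketch: 7-point 3-D Laplacian solved with Jacobi-preconditioned CG.
# A single scalar equation stands in for CPDES3's coupled systems.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg, LinearOperator

n = 16                                            # grid points per dimension
lap1d = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
I = sp.identity(n)
A = (sp.kron(sp.kron(lap1d, I), I)                # 7-point stencil via Kronecker sums
     + sp.kron(sp.kron(I, lap1d), I)
     + sp.kron(sp.kron(I, I), lap1d)).tocsr()

b = np.ones(A.shape[0])
diag = A.diagonal()
M = LinearOperator(A.shape, matvec=lambda v: v / diag)   # Jacobi preconditioner

x, info = cg(A, b, M=M)
print("converged" if info == 0 else f"cg stopped with info={info}")
```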
NASA Astrophysics Data System (ADS)
Anderson, D. V.; Koniges, A. E.; Shumaker, D. E.
1988-11-01
Many physical problems require the solution of coupled partial differential equations on two-dimensional domains. When the time scales of interest dictate an implicit discretization of the equations, a rather complicated global matrix system needs solution. The exact form of the matrix depends on the choice of spatial grids and on the finite element or finite difference approximations employed. CPDES2 allows each spatial operator to have 5 or 9 point stencils, allows for general couplings between all of the component PDEs, and automatically generates the matrix structures needed to perform the algorithm. The resulting sparse matrix equation is solved by either the preconditioned conjugate gradient (CG) method or the preconditioned biconjugate gradient (BCG) algorithm. An arbitrary number of component equations is permitted, limited only by available memory. In the sub-band representation used, we generate an algorithm that is written compactly in terms of indirect indices and is vectorizable on some of the newer scientific computers.
A Common DPU Platform for ESA JUICE Mission Instruments
NASA Astrophysics Data System (ADS)
Aberg, Martin; Hellstrom, Daniel; Samuelsson, Arne; Torelli, Felice
2016-08-01
This paper describes the resulting hardware and software platform, based on the GR712RC [1] LEON3-FT, that Cobham Gaisler developed in accordance with the common system requirements of the ten scientific instruments on board the ESA JUICE spacecraft destined for the Jupiter system [8]. The radiation-hardened DPU platform features EDAC-protected boot, application and working memories of configurable sizes, along with SpaceWire, FPGA I/O-32/16/8, GPIO, UART and SPI I/O interfaces. The design has undergone PSA, risk, WCA, and radiation analyses, among others, to justify component and design choices, resulting in a robust design that can be used in spacecraft requiring a total dose of up to 100 krad(Si). The manufactured prototype board uses engineering models of the flight components to ensure that development is representative. Validated boot, standby and driver software accommodates the various DPU platform configurations. The boot software performs low-level DPU initialization; standby handles OBC SpaceWire communication and the loading and execution of application images, typically stored in the non-volatile application memory.
Hamedi, Azadeh; Afifi, Mehdi; Etemadfard, Hamed
2017-10-01
Hydrosol soft drinks in Persian nutrition culture are produced as side products of the essential oil industry to be used as safe remedies for treatment of some ailments. This study investigated hydrosols for women's hormonal health conditions. Detailed information was gathered by questionnaires. Chemical constituents of these mono- or poly-herbal hydrosols were identified after liquid/liquid extraction and gas chromatography-mass spectrometry. Hierarchical cluster and K-means analysis (SPSS software) were used to find their relevance. A literature survey was also performed. In most cases, thymol, carvacrol, and carvone were the major constituents, except for dill, white horehound, willow, Moderr, and yarrow hydrosols, whose major components were dill ether, menthol, phenethyl alcohol, linalool, or camphor. Based on clustering methods, some similarities could be found in their constituents, with some exceptions. None of them have been studied scientifically before. These investigations may lead to the development of some functional drinks or even new lead components.
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is the forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often the simulation processes are similar from sample to sample, and much of the data generated by each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
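The following numpy sketch conveys the idea of embedded ensemble propagation in a much simpler setting than the paper's C++/Trilinos implementation: the sample axis is carried through the whole computation, so arithmetic and memory traffic are shared across the ensemble instead of being repeated per sample. The model problem, explicit 1-D diffusion with an uncertain coefficient, is invented for the example.

```python
# Conceptual analogue of embedded ensemble propagation: every array carries a
# trailing sample axis, so one pass over the mesh updates all samples at once.
# The Trilinos implementation achieves this with C++ templates; this toy
# 1-D diffusion problem is invented for illustration.
import numpy as np

n_cells, n_samples = 1000, 32
u = np.zeros((n_cells, n_samples))                       # one state column per sample
rng = np.random.default_rng(1)
kappa = 1.0 + 0.1 * rng.random((n_cells, n_samples))     # uncertain coefficient

def step(u, kappa, dt=1e-4):
    # explicit diffusion update applied to all samples simultaneously
    flux = kappa[1:] * (u[1:] - u[:-1])
    du = np.zeros_like(u)
    du[:-1] += flux
    du[1:] -= flux
    return u + dt * du

for _ in range(100):
    u = step(u, kappa)
    u[0, :] = 1.0                                        # hold boundary forcing fixed
print(u.mean(axis=1)[:5])                                # ensemble-mean profile near boundary
```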
I/O load balancing for big data HPC applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, Arnab K.; Goyal, Arpit; Wang, Feiyi
High Performance Computing (HPC) big data problems require efficient distributed storage systems. However, at scale, such storage systems often experience load imbalance and resource contention due to two factors: the bursty nature of scientific application I/O, and the complex I/O path that lacks centralized arbitration and control. For example, the extant Lustre parallel file system, which supports many HPC centers, comprises numerous components connected via custom network topologies, and serves varying demands of a large number of users and applications. Consequently, some storage servers can be more loaded than others, which creates bottlenecks and reduces overall application I/O performance. Existing solutions typically focus on per-application load balancing, and thus are not as effective given their lack of a global view of the system. In this paper, we propose a data-driven approach to load balance the I/O servers at scale, targeted at Lustre deployments. To this end, we design a global mapper on the Lustre Metadata Server, which gathers runtime statistics from key storage components on the I/O path, and applies Markov chain modeling and a minimum-cost maximum-flow algorithm to decide where data should be placed. Evaluation using a realistic system simulator and a real setup shows that our approach yields better load balancing, which in turn can improve end-to-end performance.
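To make the placement step concrete, the toy sketch below routes units of data from a source through I/O servers to a sink using networkx's min-cost max-flow, so lightly loaded servers (modeled as lower edge costs) receive more data. Server names, capacities, and costs are invented; the paper's actual Markov-chain load model is not reproduced.

```python
# Toy min-cost max-flow placement: edge weight models current server load, so
# the cheapest (least loaded) servers absorb the data first. All names and
# numbers are invented for illustration.
import networkx as nx

G = nx.DiGraph()
G.add_edge("data", "src", capacity=15, weight=0)          # 15 units of data to place
for server, cost in [("oss1", 1), ("oss2", 3), ("oss3", 2)]:
    G.add_edge("src", server, capacity=10, weight=cost)   # cost ~ current load
    G.add_edge(server, "sink", capacity=10, weight=0)

flow = nx.max_flow_min_cost(G, "data", "sink")
print(flow["src"])   # e.g. {'oss1': 10, 'oss2': 0, 'oss3': 5}
```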
Advancing Water Science through Improved Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Koch, B. J.; Miles, B.; Rai, A.; Ahalt, S.; Band, L. E.; Minsker, B.; Palmer, M.; Williams, M. R.; Idaszak, R.; Whitton, M. C.
2012-12-01
Major scientific advances are needed to help address impacts of climate change and increasing human-mediated environmental modification on the water cycle at global and local scales. However, such advances within the water sciences are limited in part by inadequate information infrastructures. For example, cyberinfrastructure (CI) includes the integrated computer hardware, software, networks, sensors, data, and human capital that enable scientific workflows to be carried out within and among individual research efforts and across varied disciplines. A coordinated transformation of existing CI and development of new CI could accelerate the productivity of water science by enabling greater discovery, access, and interoperability of data and models, and by freeing scientists to do science rather than create and manage technological tools. To elucidate specific ways in which improved CI could advance water science, three challenges confronting the water science community were evaluated: 1) How does ecohydrologic patch structure affect nitrogen transport and fate in watersheds?, 2) How can human-modified environments emulate natural water and nutrient cycling to enhance both human and ecosystem well-being?, 3) How do changes in climate affect water availability to support biodiversity and human needs? We assessed the approaches used by researchers to address components of these challenges, identified barriers imposed by limitations of current CI, and interviewed leaders in various water science subdisciplines to determine the most recent CI tools employed. Our preliminary findings revealed four areas where CI improvements are likely to stimulate scientific advances: 1) sensor networks, 2) data quality assurance/quality control, 3) data and modeling standards, 4) high performance computing. In addition, the full potential of a re-envisioned water science CI cannot be realized without a substantial training component. In light of these findings, we suggest that CI industry-proven practices such as open-source community architecture, agile development methodologies, and sound software engineering methods offer a promising pathway to a transformed water science CI capable of meeting the demands of both individual scientists and community-wide research initiatives.
Power Management and Distribution Trades Studies for a Deep-Space Mission Scientific Spacecraft
NASA Technical Reports Server (NTRS)
Kimnach, Greg L.; Soltis, James V.
2004-01-01
As part of NASA's Project Prometheus, the Nuclear Systems Program, NASA GRC performed trade studies on the various Power Management and Distribution (PMAD) options for a deep-space scientific spacecraft which would have a nominal electrical power requirement of 100 kWe. These options included AC (1000 Hz and 1500 Hz) and DC primary distribution at various voltages. The distribution system efficiency, reliability, mass, thermal, corona, space radiation levels and technology readiness of devices and components were considered. The final proposed system consisted of two independent power distribution channels, sourced by two 3-phase, 110 kVA alternators nominally operating at half-rated power. Each alternator nominally supplies 50 kWe to one half of the ion thrusters and science modules but is capable of supplying the total power requirements in the event of loss of one alternator. This paper is an introduction to the methodology for the trades done to arrive at the proposed PMAD architecture. Any opinions expressed are those of the author(s) and do not necessarily reflect the views of Project Prometheus.
The Cassini/Huygens Doppler Wind Experiment: Results from the Titan Descent
NASA Technical Reports Server (NTRS)
Bird, M. K.; Dutta-Roy, R.; Allison, M.; Asmar, S. W.; Atkinson, D. H.; Edenhofer, P.; Plettemeier, D.; Tyler, G. L.
2005-01-01
The primary objective of the Doppler Wind Experiment (DWE), one of the six scientific investigations comprising the payload of the ESA Huygens Probe, is a determination of the wind velocity in Titan's atmosphere. Measurements of the Doppler shift of the S-band (2040 MHz) carrier signal to the Cassini Orbiter and to Earth were recorded during the Probe descent in order to deduce wind-induced motion of the Probe to an accuracy better than 1 m s⁻¹. An experiment with the same scientific goal was performed with the Galileo Probe at Jupiter. Analogous to the Galileo experience, it was anticipated that the frequency of the Huygens radio signal could be measured on Earth to obtain an additional component of the horizontal winds. Specific secondary science objectives of DWE include measurements of: (a) Doppler fluctuations to determine the turbulence spectrum and possible wave activity in the Titan atmosphere; (b) Doppler and signal level modulation to monitor Probe descent dynamics (e.g., spinrate/spinphase, parachute swing); (c) Probe coordinates and orientation during descent and after impact on Titan.
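A back-of-envelope check, not taken from the paper, shows the carrier-tracking precision this accuracy target implies: a 1 m s⁻¹ line-of-sight velocity shifts the 2040 MHz carrier by only a few hertz.

```latex
% Doppler shift of the S-band carrier for a 1 m/s line-of-sight velocity
\[
  \Delta f = \frac{v}{c}\, f_0
           = \frac{1\ \mathrm{m\,s^{-1}}}{3\times 10^{8}\ \mathrm{m\,s^{-1}}}
             \times 2040\ \mathrm{MHz}
           \approx 6.8\ \mathrm{Hz}
\]
```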
Peer review in forensic science.
Ballantyne, Kaye N; Edmond, Gary; Found, Bryan
2017-08-01
Peer review features prominently in the forensic sciences. Drawing on recent research and studies, this article examines different types of peer review, specifically: editorial peer review; peer review by the scientific community; technical and administrative review; and verification (and replication). The article reviews the different meanings of these quite disparate activities and their utility in relation to enhancing performance and reducing error. It explains how forensic practitioners should approach and use peer review, as well as how it should be described in expert reports and oral testimony. While peer review has considerable potential, and is a key component of modern quality management systems, its actual value in most forensic science settings has yet to be determined. In consequence, forensic practitioners should reflect on why they use specific review procedures and endeavour to make their actual practices and their potential value transparent to consumers; whether investigators, lawyers, jurors or judges. Claims that review increases the validity of a scientific technique or accuracy of opinions within a particular case should be avoided until empirical evidence is available to support such assertions.
Power Management and Distribution Trades Studies for a Deep-space Mission Scientific Spacecraft
NASA Astrophysics Data System (ADS)
Kimnach, Greg L.; Soltis, James V.
2004-02-01
As part of NASA's Project Prometheus, the Nuclear Systems Program, NASA GRC performed trade studies on the various Power Management and Distribution (PMAD) options for a deep-space scientific spacecraft, which would have a nominal electrical power requirement of 100 kWe. These options included AC (1000 Hz and 1500 Hz) and DC primary distribution at various voltages. The distribution system efficiency, reliability, mass, thermal, corona, space radiation levels, and technology readiness of devices and components were considered. The final proposed system consisted of two independent power distribution channels, sourced by two 3-phase, 110 kVA alternators nominally operating at half-rated power. Each alternator nominally supplies 50 kWe to one-half of the ion thrusters and science modules, but is capable of supplying the total power requirements in the event of loss of one alternator. This paper is an introduction to the methodology for the trades done to arrive at the proposed PMAD architecture. Any opinions expressed are those of the author(s) and do not necessarily reflect the views of Project Prometheus.
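A minimal sizing sketch (our own illustration, assuming a unity power factor so that kVA is comparable to kWe) shows why two 110 kVA alternators at half-rated power provide single-failure tolerance:

```python
# Hedged sizing sketch (ours) for the dual-channel PMAD concept above.
# Assumption: unity power factor, so alternator kVA rating ~ kWe capability.
TOTAL_LOAD_KWE = 100.0
ALTERNATOR_RATING_KVA = 110.0
N_ALTERNATORS = 2

nominal_load_each = TOTAL_LOAD_KWE / N_ALTERNATORS             # 50 kWe per channel
nominal_fraction = nominal_load_each / ALTERNATOR_RATING_KVA   # ~45% of rating
contingency_fraction = TOTAL_LOAD_KWE / ALTERNATOR_RATING_KVA  # ~91% after losing one

print(f"nominal loading:     {nominal_fraction:.0%} of rating")
print(f"one-failure loading: {contingency_fraction:.0%} of rating")
```

Each alternator normally runs near 45% of its rating, leaving enough margin for one machine to carry the full 100 kWe load at about 91% of rating.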
Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis
NASA Astrophysics Data System (ADS)
Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.
2015-08-01
The hybrid visualization and interaction tool EarthScape is presented here. The software can simultaneously display LiDAR point clouds, draped videos with a moving footprint, volumetric scientific data (using volume rendering, isosurfaces, and slice planes), raster data such as still satellite images, vector data, and 3D models such as buildings or vehicles. The application runs on touch-screen devices such as tablets. The software is based on open-source libraries such as OpenSceneGraph, osgEarth, and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multisource geo-referenced video streams. Once all these components are included, EarthScape will be a multi-purpose platform that provides data analysis, hybrid visualization, and complex interactions at the same time. The software is available on demand for free at france@exelisvis.com.
NASA Technical Reports Server (NTRS)
Campbell, William J.; Roelofs, Larry H.; Short, Nicholas M., Jr.
1987-01-01
The National Space Science Data Center (NSSDC) has initiated an Intelligent Data Management (IDM) research effort which has as one of its components the development of an Intelligent User Interface (IUI). The intent of the latter is to develop a friendly and intelligent user interface service based on expert systems and natural language processing technologies. The purpose is to support the large number of potential scientific and engineering users who need space- and land-related research and technical data but who have little or no experience with query languages or understanding of the information content or architecture of the databases involved. This technical memorandum presents a prototype Intelligent User Interface Subsystem (IUIS) using the Crustal Dynamics Project Database as a test bed for the implementation of CRUDDES (Crustal Dynamics Expert System). The knowledge base has more than 200 rules and represents a single application view and the architectural view. Operational performance using CRUDDES has allowed nondatabase users to obtain useful information from the database previously accessible only to an expert database user or the database designer.
An Internship Program for Deaf and Hard of Hearing Students in Polymer-Based Nanocomposites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cebe,P.; Cherdack, D.; Guertin, R.
2006-01-01
We report on our summer internship program in Polymer-Based Nanocomposites, for deaf and hard of hearing undergraduates who engage in classroom and laboratory research work in polymer physics. The unique attributes of this program are its emphasis on: 1. Teamwork; 2. Performance of a start-to-finish research project; 3. Physics of materials approach; and 4. Diversity. Students of all disability levels have participated in this program, including students who neither hear nor voice. The classroom and laboratory components address the materials chemistry and physics of polymer-based nanocomposites, crystallization and melting of polymers, the interaction of X-rays and light with polymers, mechanical properties of polymers, and the connection between thermal processing, structure, and ultimate properties of polymers. A set of Best Practices is developed for accommodating deaf and hard of hearing students into the laboratory setting. The goal is to bring deaf and hard of hearing students into the larger scientific community as professionals, by providing positive scientific experiences at a formative time in their educational lives.
The Process of Science Communications at NASA/Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Horack, John M.; Treise, Deborah
1998-01-01
The communication of new scientific knowledge and understanding is an integral component of science research, essential for its continued survival. Like any learning-based activity, science cannot continue without communication between and among peers so that skeptical inquiry and learning can take place. This communication provides necessary organic support to maintain the development of new knowledge and technology. However, communication beyond the peer community is becoming equally critical for science to survive as an enterprise into the 21st century. Therefore, scientists not only have a 'noble responsibility' to advance and communicate scientific knowledge and understanding to audiences within and beyond the peer community, but their fulfillment of this responsibility is necessary to maintain the survival of the science enterprise. Despite the critical importance of communication to the viability of science, the skills required to perform effective science communications historically have not been taught as a part of the training of scientists, and the culture of science is often averse to significant communication beyond the peer community. Thus, scientists can find themselves ill-equipped and uncomfortable with the requirements of their job in the new millennium.
Foster, Jamie S; Lemus, Judith D
2015-01-01
Scientific inquiry represents a multifaceted approach to explore and understand the natural world. Training students in the principles of scientific inquiry can help promote the scientific learning process as well as help students enhance their understanding of scientific research. Here, we report on the development and implementation of a learning module that introduces astrobiology students to the concepts of creative and scientific inquiry and provides practical exercises to build critical thinking skills. The module contained three distinct components: (1) a creative inquiry activity designed to introduce concepts regarding the role of creativity in scientific inquiry; (2) guidelines to help astrobiology students formulate and self-assess questions regarding various scientific content and imagery; and (3) a practical exercise where students watched a scientific presentation and practiced their analytical skills. Pre- and post-course surveys were used to assess the students' perceptions regarding creative and scientific inquiry and whether this activity impacted their understanding of the scientific process. Survey results indicate that the exercise helped improve students' science skills by promoting awareness regarding the role of creativity in scientific inquiry and building their confidence in formulating and assessing scientific questions. Together, the module and survey results confirm the need to include such inquiry-based activities in the higher education classroom, thereby helping students hone their critical thinking and question-asking skill set and facilitating their professional development in astrobiology.
GEOSS AIP-2 Climate Change and Biodiversity Use Scenarios: Interoperability Infrastructures
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Santoro, Mattia
2010-05-01
In recent years, the scientific community has been making great efforts to study the effects of climate change on life on Earth. In this general framework, a key role is played by the impact of climate change on biodiversity. Assessing this impact requires use scenarios that model how climatological change affects the regional distribution of biodiversity species. Designing and developing interoperability infrastructures that enable scientists to search, discover, access, and use multi-disciplinary resources (i.e., datasets, services, models, etc.) is currently one of the main research fields in Earth and Space Science Informatics. This presentation introduces and discusses an interoperability infrastructure which implements the discovery, access, and chaining of loosely coupled resources in the climatology and biodiversity domains, making it possible to set up and run forecast and processing models. The framework was successfully developed and tested in the context of the GEOSS AIP-2 (Global Earth Observation System of Systems, Architecture Implementation Pilot, Phase 2) Climate Change & Biodiversity thematic Working Group. The infrastructure comprises the following main components and services: a) GEO Portal: through this component the end user is able to search, find, and access the services needed for scenario execution; b) Graphical User Interface (GUI): this component provides user-interaction functionality and controls the workflow manager to perform the operations required for the scenario implementation; c) Use Scenario controller: this component acts as a workflow controller implementing the scenario business process, i.e., a typical climate change & biodiversity projection scenario; d) Service Broker implementing Mediation Services: this component realizes a distributed catalogue which federates several discovery and access components (exposing them through a single CSW standard interface); the federated components publish climate, environmental, and biodiversity datasets; e) Ecological Niche Model Server: this component runs one or more Ecological Niche Models (ENM) on selected biodiversity and climate datasets; f) Data Access Transaction server: this component publishes the model outputs. The framework was assessed in two use scenarios of the GEOSS AIP-2 Climate Change and Biodiversity WG, both concerning the prediction of species distributions driven by climatological change forecasts. The first scenario dealt with the regional distribution of pikas in the Great Basin area (North America); the second modeled Arctic food-chain species in the North Pole area, analyzing the relationships between different environmental parameters and polar bear distribution. Scientific patronage was provided by the University of Colorado and the University of Alaska, respectively. Results are published on the GEOSS AIP-2 web site: http://www.ogcnetwork.net/AIP2develop.
A statistical study of the relationship between surface quality and laser induced damage
NASA Astrophysics Data System (ADS)
Turner, Trey; Turchette, Quentin; Martin, Alex R.
2012-11-01
Laser-induced damage of optical components is a concern in many applications in the commercial, scientific, and military market sectors. Numerous component manufacturers supply "high laser damage threshold" (HLDT) optics to meet the needs of this market, and consumers pay a premium price for these products. While there is no question that HLDT optics are manufactured to more rigorous standards (and are therefore inherently more expensive) than conventional products, it is not clear how this added expense translates directly into better performance. This is because the standard methods for evaluating laser damage, and the underlying assumptions about the validity of traditional laser damage testing, are flawed. In particular, the surface and coating defects that generally lead to laser damage (in many laser-parameter regimes of interest) are widely distributed over the component surface with large spaces in between them. As a result, laser damage testing typically does not include enough of these defects to achieve the sample sizes necessary to make its results statistically meaningful. The result is a poor correlation between defect characteristics and damage events. This paper establishes specifically why this is the case, and provides some indication of what might be done to remedy the problem.
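The sampling argument can be made concrete with a simple Poisson model (our illustration, not the authors' analysis): if defects are sparse, the expected number of defects interrogated is the defect density times the total illuminated area, and a standard test may have only a small chance of hitting even one.

```python
import math

# Illustrative Poisson sketch (ours): probability that a damage test
# interrogates at least one sparse surface defect. All numbers are made up.
def p_at_least_one_defect(defect_density_per_mm2, spot_area_mm2, n_sites):
    lam = defect_density_per_mm2 * spot_area_mm2 * n_sites  # expected defects sampled
    return 1.0 - math.exp(-lam)

# e.g. 0.05 defects/mm^2, 0.01 mm^2 test spot, 100 test sites -> ~5% chance
print(p_at_least_one_defect(0.05, 0.01, 100))
```

With numbers like these, most test campaigns never sample a defect at all, which is consistent with the paper's claim that defect statistics, not test repeatability, limit the meaning of damage-threshold results.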
A novel dismantling process of waste printed circuit boards using water-soluble ionic liquid.
Zeng, Xianlai; Li, Jinhui; Xie, Henghua; Liu, Lili
2013-10-01
Recycling processes for waste printed circuit boards (WPCBs) have been well established in terms of scientific research and field pilots. However, current dismantling procedures for WPCBs have restricted the recycling process, due to their low efficiency and negative impacts on environmental and human health. This work aimed to develop an environmentally friendly dismantling process that uses heating with a water-soluble ionic liquid to separate electronic components and tin solder from two main types of WPCBs: cathode ray tubes and computer mainframes. The work systematically investigates the influencing factors, heating mechanism, and optimal parameters for opening solder connections on WPCBs during the dismantling process, and addresses its environmental performance and economic assessment. The results obtained demonstrate that the optimal temperature, retention time, and turbulence resulting from impeller rotation during the dismantling process were 250 °C, 12 min, and 45 rpm, respectively. Nearly 90% of the electronic components were separated from the WPCBs under the optimal experimental conditions. This novel process offers the possibility of large industrial-scale operations for separating electronic components and recovering tin solder, and of a more efficient and environmentally sound process for WPCB recycling.
Renkawitz, Tobias; Tingart, Markus; Grifka, Joachim; Sendtner, Ernst; Kalteis, Thomas
2009-09-01
This article outlines the scientific basis and state-of-the-art application of computer-assisted orthopedic surgery in total hip arthroplasty (THA) and provides a future perspective on this technology. Computer-assisted orthopedic surgery in primary THA has the potential to couple 3D simulations with real-time evaluations of surgical performance, which has brought these developments from the research laboratory all the way to clinical use. Imageless navigation systems, which require no additional pre- or intraoperative image acquisition, have been shown to significantly reduce the variability in positioning the acetabular component and to measure leg-length and offset changes precisely during THA. More recently, computer-assisted orthopedic surgery systems have opened a new frontier for accurate surgical practice in minimally invasive, tissue-preserving THA. The future generation of imageless navigation systems will switch from simple measurement tasks to true navigation tools. These software algorithms will treat the cup and stem as components of a coupled biomechanical system, navigating the orthopedic surgeon to an optimized complementary component orientation rather than to target values intraoperatively, and are expected to have a high impact on clinical practice and postoperative function in modern THA.
The BepiColombo Laser Altimeter (BELA): Scientific Performance at Mercury
NASA Astrophysics Data System (ADS)
Hussmann, H.; Steinbrügge, G.; Stark, A.; Oberst, J.; Thomas, N.; Lara, L.-M.
2018-05-01
We discuss the expected scientific performance of BELA in Mercury orbit. Based on a performance model, we present the measurement accuracy of global and local topography, surface slopes and roughness, as well as the tidal Love number h2.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
Communicating Scientific Research to Non-Specialists
NASA Astrophysics Data System (ADS)
Holman, Megan
Public outreach to effectively communicate current scientific advances is an essential component of the scientific process. The challenge in making this information accessible is forming a clear, accurate, and concise version of the information from a variety of different sources, so that the information is understandable and compelling to non-specialists in the general public. We are preparing a magazine article about planetary system formation. This article will include background information about star formation and different theories and observations of planet formation to provide context. We will then discuss the latest research and theories describing how planetary systems may be forming in different areas of the universe. We present here the original professional-level scientific work alongside our public-level explanations and original graphics to demonstrate our editorial process.
The coexistence of alternative and scientific conceptions in physics
NASA Astrophysics Data System (ADS)
Ozdemir, Omer F.
The purpose of this study was to inquire about the simultaneous coexistence of alternative and scientific conceptions in the domain of physics. This study was particularly motivated by several arguments put forward in opposition to the Conceptual Change Model. In the simplest form, these arguments state that people construct different domains of knowledge and different modes of perception in different situations. Therefore, holding different conceptualizations is unavoidable and expecting a replacement in an individual's conceptual structure is not plausible in terms of instructional practices. The following research questions were generated to inquire about this argument: (1) Do individuals keep their alternative conceptions after they have acquired scientific conceptions? (2) Assuming that individuals who acquired scientific conceptions also have alternative conceptions, how are these different conceptions nested in their conceptual structure? (3) What kind of knowledge, skills, and reasoning are necessary to transfer scientific principles instead of alternative ones in the construction of a valid model? Analysis of the data collected from the non-physics group indicated that the nature of alternative conceptions is framed by two types of reasoning: reasoning by mental simulation and semiformal reasoning. Analysis of the data collected from the physics group revealed that mental images or scenes feeding reasoning by mental simulation had not disappeared after the acquisition of scientific conceptions. The analysis of data also provided enough evidence to conclude that alternative principles feeding semiformal reasoning have not necessarily disappeared after the acquisition of scientific conceptions. However, in regard to semiformal reasoning, compartmentalization was not as clear as the case demonstrated in reasoning by mental simulation; instead semiformal and scientific reasoning are intertwined in a way that the components of semiformal reasoning can easily take their place among the components of scientific reasoning. In spite of the fact that the coexistence of multiple conceptions might obstruct the transfer of scientific conceptions in problem-solving situations, several factors stimulating the use of scientific conceptions were noticed explicitly. These factors were categorized as follows: (a) the level of individuals' domain specific knowledge in the corresponding field, (b) the level of individuals' knowledge about the process of science (how science generates its knowledge claims), (c) the level of individuals' awareness of different types of reasoning and conceptions, and (d) the context in which the problem is situated. (Abstract shortened by UMI.)
Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools
ERIC Educational Resources Information Center
Samdal, Oddrun; Rowling, Louise
2011-01-01
Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…
The Influence of Unsteadiness on the Analysis of Pressure Gain Combustion Devices
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Kaemming, Tom
2013-01-01
Pressure gain combustion (PGC) has been the object of scientific study for over a century due to its promise of improved thermodynamic efficiency. In many recent application concepts, PGC is utilized as a component in an otherwise continuous, normally steady flow system, such as a gas turbine or ramjet engine. However, PGC is inherently unsteady. Failure to account for the effects of this periodic unsteadiness can lead to misunderstanding and errors in performance calculations. This paper seeks to provide some clarity by presenting a consistent method of thermodynamic cycle analysis for a device utilizing PGC technology. The incorporation of the unsteady PGC process into the conservation equations for a continuous flow device is presented. Most importantly, the appropriate method for computing the conservation of momentum is presented. It will be shown that proper, consistent analysis of cyclic conservation principles produces representative performance predictions.
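The pitfall the paper warns about can be illustrated with a toy averaging exercise (ours, not the authors' analysis): for a pulsating exit flow, momentum flux computed from time-averaged quantities differs from the time-averaged momentum flux, because the averaging and the nonlinear conservation law do not commute.

```python
import numpy as np

# Toy sketch (ours): for periodic unsteady exit flow, the time average of
# rho*u^2 exceeds the flux computed from the time-averaged velocity, so
# averaging before applying conservation laws misstates performance.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)   # one PGC cycle (normalized)
rho = 1.0                                          # kg/m^3, held constant for clarity
u = 100.0 + 80.0 * np.sin(2 * np.pi * t)           # pulsating exit velocity, m/s

flux_of_mean = rho * u.mean() ** 2        # 10,000  (average first: wrong)
mean_of_flux = (rho * u ** 2).mean()      # ~13,200 (conserve, then average)
print(flux_of_mean, mean_of_flux)
```

The 30% discrepancy in this contrived case shows why the momentum equation must be applied to the instantaneous periodic flow and only then averaged.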
Philosophies Applied in the Selection of Space Suit Joint Range of Motion Requirements
NASA Technical Reports Server (NTRS)
Aitchison, Lindsway; Ross, Amy; Matty, Jennifer
2009-01-01
Space suits are the most important tool for astronauts working in harsh space and planetary environments; suits keep crewmembers alive and allow them to perform exploration, construction, and scientific tasks on a routine basis over a period of several months. The efficiency with which the tasks are performed is largely dictated by the mobility features of the space suit. For previous space suit development programs, the mobility requirements were written as pure functional mobility requirements that did not separate joint ranges of motion from joint torques. The Constellation Space Suit Element set the goal of writing more quantitative mobility requirements, focused on the individual components of mobility, to enable future suit designers to build and test systems more effectively. This paper details the test planning and selection process for the Constellation space suit pressure garment range-of-motion requirements.
Improvement of Productivity in TIG Welding Plant by Equipment Design in Orbit
NASA Astrophysics Data System (ADS)
Gnanavel, C.; Saravanan, R.; Chandrasekaran, M.; Jayakanth, J. J.
2017-03-01
Measurement and improvement are indispensable tasks at all levels of management. For example, at the operator level, operating parameters are measured to ensure OEE (Overall Equipment Effectiveness) and the performance of quality-critical components is measured to ensure quality; at the supervisory level, operator performance is measured to ensure labour utilization; at the managerial level, production and productivity are measured; and at the top level, capital and capacity utilization are measured. An often-accepted statement is that improvement is impossible without measurement, and measurement is often referred to as observation. This case study was conducted at a Government Boiler Factory in India. A scientific approach was followed to identify non-value-added activities. New, purpose-designed equipment was designed and installed, achieving a productivity improvement of 85% per day. The new equipment can rotate 360° about its axis, which simplifies loading and unloading procedures, reduces their times, and ensures effective use of space and time.
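OEE, cited above at the operator level, is conventionally computed as the product of the availability, performance, and quality rates; a minimal sketch (the rate values below are made up):

```python
# Minimal OEE sketch using the conventional definition
# OEE = availability x performance x quality. Example rates are hypothetical.
def oee(availability: float, performance: float, quality: float) -> float:
    return availability * performance * quality

print(f"OEE = {oee(0.90, 0.95, 0.99):.1%}")  # ~84.6%
```

Because the three factors multiply, a modest shortfall in each compounds quickly, which is why equipment redesigns that remove non-value-added loading and unloading time can move the overall figure substantially.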
The computational challenges of Earth-system science.
O'Neill, Alan; Steenman-Clark, Lois
2002-06-15
The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.
Techno-economic performance indicators of municipal solid waste collection strategies.
Bertanza, G; Ziliani, E; Menoni, L
2018-04-01
Several indicators for the evaluation of MSW collection systems have been proposed in the literature. These evaluation tools consider only some of the aspects that influence the operational efficiency of the collection service. The aim of this paper is to suggest a set of easy-to-calculate indicators that overcomes this limitation, taking into account both the characteristics of the collected waste and the operational and economic performance. The main components of the collection system (labour, vehicles, and containers) are considered separately, so that it is possible to quantify and compare their roles within the whole process. As an example of application, the proposed approach was used to compare the MSW collection strategies adopted in four towns in Northern Italy. Results are discussed and a comparison with alternative assessment methods available in the scientific literature is reported.
Parallel Polarization State Generation
NASA Astrophysics Data System (ADS)
She, Alan; Capasso, Federico
2016-05-01
The control of polarization, an essential property of light, is of wide scientific and technological interest. The general problem of generating arbitrary time-varying states of polarization (SOP) has always been mathematically formulated by a series of linear transformations, i.e., a product of matrices, imposing a serial architecture. Here we show a parallel architecture described by a sum of matrices. The theory is experimentally demonstrated by using a digital micromirror device to modulate spatially separated polarization components of a laser beam that are subsequently combined. This method greatly expands the parameter space for engineering devices that control polarization. Consequently, performance characteristics, such as speed, stability, and spectral range, are entirely dictated by the technologies of optical intensity modulation, including absorption, reflection, emission, and scattering. This opens up important prospects for polarization state generation (PSG) with unique performance characteristics, with applications in spectroscopic ellipsometry, spectropolarimetry, communications, imaging, and security.
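The serial-versus-parallel distinction can be sketched with Jones vectors (our simplified illustration, not the paper's formalism): instead of applying a product of transformations to one beam, the parallel scheme forms a weighted sum of fixed basis polarization states, with the weights set by intensity modulators.

```python
import numpy as np

# Simplified Jones-vector sketch (ours): the output SOP is a modulator-
# weighted sum of fixed, spatially separated basis states, rather than a
# product of matrices applied in series. Weights are hypothetical.
basis_states = [
    np.array([1.0, 0.0]),               # horizontal
    np.array([0.0, 1.0]),               # vertical
    np.array([1.0, 1.0]) / np.sqrt(2),  # diagonal
]
weights = [0.5, 0.3, 0.2]  # set by the intensity modulators

output = sum(w * e for w, e in zip(weights, basis_states))
output /= np.linalg.norm(output)  # normalized output state of polarization
print(output)
```

Sweeping the weights in time then traces out a time-varying SOP, with the modulation technology alone limiting speed and spectral range, as the abstract argues.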
Microprobe Analysis of Pu-Ga Standards
Wall, Angélique D.; Romero, Joseph P.; Schwartz, Daniel
2017-08-04
In order to obtain quantitative analysis using an electron scanning microprobe, it is essential to have a standard of known composition. Most elemental and multi-elemental standards can be easily obtained from suppliers like Elemental Scientific or other standards organizations that are NIST (National Institute of Standards and Technology) traceable. It is, however, more challenging to find standards for plutonium. Past work performed in our group has typically involved using the plutonium sample to be analysed as its own standard, as long as all other known components of the sample have standards to be compared to [1,2,3]. This method works well enough, but this experiment was performed in order to develop a more reliable standard for plutonium, using five samples of known chemistry of a plutonium-gallium mix that could then be used as the main plutonium and gallium standards for future experiments.
Welcome to NASA's Earth Science Enterprise. Version 3
NASA Technical Reports Server (NTRS)
2001-01-01
There are strong scientific indications that natural change in the Earth system is being accelerated by human intervention. As a result, planet Earth faces the possibility of rapid environmental changes that would have a profound impact on all nations. However, we do not fully understand either the short-term effects of our activities, or their long-term implications - many important scientific questions remain unanswered. The National Aeronautics and Space Administration (NASA) is working with the national and international scientific communities to establish a sound scientific basis for addressing these critical issues through research efforts coordinated under the U.S. Global Change Research Program, the International Geosphere-Biosphere Program, and the World Climate Research Program. The Earth Science Enterprise is NASA's contribution to the U.S. Global Change Research Program. NASA's Earth Science Enterprise will use space- and surface-based measurement systems to provide the scientific basis for understanding global change. The space-based components will provide a constellation of satellites to monitor the Earth from space. A major component of the Earth Science Enterprise is the Earth Observing System (EOS). The overall objective of the EOS Program is to determine the extent, causes, and regional consequences of global climate change. EOS will provide sustained space-based observations that will allow researchers to monitor climate variables over time to determine trends. A constellation of EOS satellites will acquire global data, beginning in 1998 and extending well into the 21st century.
Automatic sentence extraction for the detection of scientific paper relations
NASA Astrophysics Data System (ADS)
Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.
2018-03-01
Relations between scientific papers help researchers see the interconnections among papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, the performance improvements achieved to date, and the tools or data typically used in research in specific fields. Methods developed so far to detect paper relations include machine learning and rule-based methods. However, a problem remains in the extraction of sentences from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, with paper relations identified from the citation sentences. The performance of the resulting system is then compared with that of a manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
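A toy version of the kind of citation-sentence extraction described (our own simplified sketch, using a bracketed-citation pattern; real scholarly text needs more robust sentence splitting):

```python
import re

# Toy sketch (ours) of automatic citation-sentence extraction: split text
# into sentences and keep those containing a bracketed citation like [12].
CITATION = re.compile(r"\[\d+(?:,\s*\d+)*\]")

def citation_sentences(text: str):
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if CITATION.search(s)]

sample = "Prior work improved recall [3]. We use a new corpus. Our method extends [3, 7]."
print(citation_sentences(sample))
```

The extracted citation sentences are then the input from which inter-paper relations can be classified, replacing the manual selection step the abstract identifies as the bottleneck.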
Physical fitness and academic performance in youth: A systematic review.
Santana, C C A; Azevedo, L B; Cattuzzo, M T; Hill, J O; Andrade, L P; Prado, W L
2017-06-01
Physical fitness (PF) is a construct of health- and skill-related attributes which have been associated with academic performance (AP) in youth. This study aimed to review the scientific evidence on the association between components of PF and AP in children and adolescents. A systematic review of articles using the databases PubMed/Medline, ERIC, LILACS, SciELO, and Web of Science was undertaken. Cross-sectional and longitudinal studies examining the association between at least one component of PF and AP in children and adolescents, published between 1990 and June 2016, were included. Independent extraction of articles was carried out by the two authors using predefined data fields. From a total of 45 studies included, 25 report a positive association between components of PF and AP, and 20 describe a single association between cardiorespiratory fitness (CRF) and AP. According to the Strengthening the Reporting of Observational Studies in Epidemiology guidelines, 12 were classified as low risk, 32 as medium risk, and 1 as high risk of bias. Thirty-one studies reported a positive association between AP and CRF, six studies with muscular strength, three studies with flexibility, and seven studies reported a positive association between clustered PF components and AP. The magnitude of the associations is weak to moderate (β = 0.10-0.42 and odds = 1.01-4.14). There is strong evidence for a positive association of CRF and clustered PF with AP in cross-sectional studies, and evidence from longitudinal studies for a positive association between clustered PF and AP; the relationship of muscular strength and flexibility with AP remains uncertain.
Physical and Physiological Profiles of Brazilian Jiu-Jitsu Athletes: a Systematic Review.
Andreato, Leonardo Vidal; Lara, Francisco Javier Díaz; Andrade, Alexandro; Branco, Braulio Henrique Magnani
2017-12-01
Brazilian jiu-jitsu is a grappling combat sport that has intermittency as its core element; in other words, actions of high, moderate and low intensity are interspersed during matches, requiring a high level of conditioning to support optimal levels of performance for the total match time. The athletes perform from four to six matches during a day of competition, and this number may increase if the open-class competition, which is held parallel to the competition by weight class, is considered. This systematic review examined the physical and physiological profiles of Brazilian jiu-jitsu athletes. Only scientific studies dealing with the major fitness components of Brazilian jiu-jitsu athletes (i.e. body composition and somatotype, aerobic and anaerobic profiles, muscular strength and power) and using accepted methods that provided relevant practical applications for a Brazilian jiu-jitsu athlete's fitness training and/or performance were included in the current review. A computer literature search was carried out of the PubMed, ISI Web of Knowledge, SportDiscus and Scopus databases (up to January 2016). The database search generated 205 articles. After the application of inclusion and exclusion criteria, 58 studies were included in the present systematic review. A total of 1496 subjects were involved in the selected investigations. Body fat is generally low for these athletes and the mesomorphic component is predominant. The different studies showed VO2max values between 42 and 52 mL/kg/min, and it seems that aerobic fitness does not discriminate among Brazilian jiu-jitsu athletes of different competitive levels. There is a lack of scientific studies investigating anaerobic responses in both lower and upper limbs. Maximal dynamic, isometric and endurance strength can be associated with sporting success in Brazilian jiu-jitsu athletes. Although decisive actions during Brazilian jiu-jitsu matches are mainly dependent on muscular power, more specific studies are necessary to describe it. Studies involving female athletes should also be conducted. In addition, further research is needed to analyse whether there are differences between sexes, belt ranks, and competitive levels, and among the different weight categories, for different variables.
[Analysis of chemical constituents of volatile components from Jia Ga Song Tang by GC-MS].
Tan, Qing-long; Xiong, Tian-qin; Liao, Jia-yi; Yang, Tao; Zhao, Yu-min; Lin, Xi; Zhang, Cui-xian
2014-10-01
To analyze the chemical constituents of the volatile components from Jia Ga Song Tang, the volatile oils were extracted by steam distillation. The chemical components of the essential oils were analyzed by GC-MS and quantitatively determined by a normalization method. 103 components were separated and 87 identified in the volatile oil of Zingiberis Rhizoma; 58 components were separated and 38 identified in the volatile oil of Myristicae Semen; 49 components were separated and 38 identified in the volatile oil of Amomi Rotundus Fructus; and 89 components were separated and 63 identified in the volatile oil of Jia Ga Song Tang itself. Eucalyptol, β-phellandrene and other terpenes were the main compounds in the volatile oil of Jia Ga Song Tang. Changes in the kinds and content of volatile components can provide evidence for the scientific and rational compatibility of Jia Ga Song Tang.
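The area-normalization quantification mentioned above reduces to expressing each peak area as a percentage of the summed peak areas; a minimal sketch (the peak areas below are hypothetical, not from the paper):

```python
# Minimal area-normalization sketch (hypothetical GC-MS peak areas): each
# component's relative content is its peak area over the total peak area.
peaks = {"eucalyptol": 1.8e6, "beta-phellandrene": 9.5e5, "other terpenes": 2.4e6}

total = sum(peaks.values())
for name, area in peaks.items():
    print(f"{name}: {100 * area / total:.1f}%")
```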
Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach
NASA Astrophysics Data System (ADS)
Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.
2010-12-01
One of the main challenges in Earth Science Informatics is to build interoperability frameworks which allow users to discover, evaluate, and use information from different scientific domains. This requires addressing multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangements in order to develop flexible frameworks that allow a variety of loosely coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains, and to develop cross-domain ontologies to facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access, and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry, and drought. The EuroGEOSS DAC federates both semantic services (e.g., SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e., what, where, when, etc.). Two different augmented discovery styles are supported: a) automatic query expansion; b) user-assisted query expansion. In the first case, the main discovery steps are: i. the query keywords (the "what" constraint) are expanded with related concepts/terms retrieved from the set of federated semantic services (by default, the expansion covers multilingual relationships); ii. the resulting queries are submitted to the federated catalog services; iii. the DAC performs a "smart" aggregation of the query results and provides them back to the client. In the second case, the main discovery steps are: i. the user browses the federated semantic repositories and selects the concepts/terms of interest; ii. the DAC creates the set of geospatial queries based on the selected concepts/terms and submits them to the federated catalog services; iii. the DAC performs a "smart" aggregation of the query results and provides them back to the client. A Graphical User Interface (GUI) was also developed for testing and interacting with the DAC. The entire brokering framework is deployed in the context of the EuroGEOSS infrastructure and is used in two GEOSS AIP-3 use scenarios: the "e-Habitat Use Scenario" for the Biodiversity and Climate Change topic, and the "Comprehensive Drought Index Use Scenario" for the Water/Drought topic.
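The automatic query-expansion style can be sketched as follows (our own simplified illustration; the concept map and catalog interface are hypothetical stand-ins for the SKOS repositories and CSW catalogs):

```python
# Simplified sketch (ours) of the DAC's automatic query-expansion flow:
# expand the 'what' keyword via a concept map, fan the expanded queries out
# to federated catalogs, then aggregate and de-duplicate the results.
CONCEPTS = {"drought": ["aridity", "water scarcity", "sécheresse"]}  # hypothetical SKOS-like map

def expanded_discovery(keyword, catalogs, where=None, when=None):
    terms = [keyword] + CONCEPTS.get(keyword, [])
    results = []
    for catalog in catalogs:  # each catalog: callable(term, where, when) -> list of records
        for term in terms:
            results.extend(catalog(term, where, when))
    seen, merged = set(), []  # "smart" aggregation reduced here to de-duplication
    for record in results:
        if record["id"] not in seen:
            seen.add(record["id"])
            merged.append(record)
    return merged

def stub_catalog(term, where, when):
    return [{"id": f"{term}-1", "title": f"Dataset about {term}"}]

print(len(expanded_discovery("drought", [stub_catalog])))  # 4 records after de-duplication
```

Note how the multilingual expansion ("sécheresse") falls out of the same mechanism as the synonym expansion, matching the default behavior described above.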
The role of probiotics on each component of the metabolic syndrome and other cardiovascular risks.
Miglioranza Scavuzzi, Bruna; Miglioranza, Lucia Helena da Silva; Henrique, Fernanda Carla; Pitelli Paroschi, Thanise; Lozovoy, Marcell Alysson Batisti; Simão, Andréa Name Colado; Dichi, Isaias
2015-01-01
Probiotics are defined as live microorganisms that when administered in adequate amounts confer health benefits to the host. The consumption of probiotics has gained increasing recognition from the scientific community due to the promising effects on metabolic health through gut microbiota modulation. This article presents a review of scientific studies investigating probiotic species and their effects on different risk factors of the metabolic syndrome (MetS). This article also presents a summary of the major mechanisms involved with gut microbiota and the components of the MetS and raises the key issues to be considered by scientists in search of probiotics species for treatment of patients suffering from this metabolic disorder. Probiotics may confer numerous health benefits to the host through positive gut microbiota modulation. The strain selection is the most important factor for determining health effects. Further studies may consider gut microbiota as a novel target for prevention and management of MetS components and other cardiovascular risks.
Blacklock, Claire; MacPepple, Ekelechi; Kunutsor, Setor; Witter, Sophie
2016-12-01
Paying for performance is a strategy to meet the unmet need for family planning in low and middle income countries; however, rigorous evidence on effectiveness is lacking. Scientific databases and grey literature were searched from 1994 to May 2016. Thirteen studies were included. Payments were linked to units of targeted services, usually modified by quality indicators. Ancillary components and payment indicators differed between studies. Results were mixed for family planning outcome measures. Paying for performance was associated with improved modern family planning use in one study, and increased user and coverage rates in two more. Paying for performance with conditional cash transfers increased family planning use in another. One study found increased use in the upper wealth group only. However, eight studies reported no impact on modern family planning use or prevalence. Secondary outcomes of equity, financial risk protection, satisfaction, quality, and service organization were mixed. Available evidence is inconclusive and limited by the scarcity of studies and by variation in intervention, study design, and outcome measures. Further studies are warranted.
ERIC Educational Resources Information Center
Zangori, Laura; Forbes, Cory T.
2016-01-01
To develop scientific literacy, elementary students should engage in knowledge building of core concepts through scientific practice (Duschl, Schweingruber, & Shouse, 2007). A core scientific practice is engagement in scientific modeling to build conceptual understanding about discipline-specific concepts. Yet scientific modeling remains…
Using blackmail, bribery, and guilt to address the tragedy of the virtual intellectual commons
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Cook, R. B.; Wilson, B. E.; Gentry, M. J.; Horta, L. M.; McGroddy, M.; Morrell, A. L.; Wilcox, L. E.
2008-12-01
One goal of the NSF's vision for 21st Century Cyberinfrastructure is to create a virtual intellectual commons for the scientific community where advanced technologies perpetuate transformation of this community's productivity and capabilities. The metadata describing scientific observations, like the first paragraph of a news story, should answer the questions who? what? why? where? when? and how?, making them discoverable, comprehensible, contextualized, exchangeable, and machine-readable. Investigators who create good scientific metadata increase the scientific value of their observations within such a virtual intellectual commons. But the tragedy of this commons arises when investigators wish to receive without giving in return. The authors of this talk will describe how they have used combinations of blackmail, bribery, and guilt to motivate good behavior by investigators participating in two major scientific programs (NASA's component of the Large-scale Biosphere-Atmosphere Experiment in Amazonia; and the US Climate Change Science Program's North American Carbon Program).
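As a rough illustration (ours, with hypothetical field names), the who/what/why/where/when/how checklist can be enforced as required fields on a metadata record:

```python
# Minimal sketch (ours) of checking an observation record against the
# who/what/why/where/when/how checklist; field names are hypothetical.
REQUIRED = {"investigator", "variable", "purpose", "location", "time_range", "method"}

def missing_fields(record: dict) -> set:
    """Return required fields that are absent or empty."""
    return REQUIRED - {k for k, v in record.items() if v}

record = {"investigator": "J. Doe", "variable": "soil CO2 flux", "location": "BR-Ma2",
          "time_range": "1999-2005", "method": "eddy covariance", "purpose": ""}
print(missing_fields(record))  # {'purpose'}
```

Validation of this kind is the gentlest of the three levers named in the title: it lets an archive withhold acceptance (blackmail), reward complete records with better discoverability (bribery), or simply report the gaps back to the investigator (guilt).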
The Utility of Writing Assignments in Undergraduate Bioscience
Libarkin, Julie; Ording, Gabriel
2012-01-01
We tested the hypothesis that engagement in a few, brief writing assignments in a nonmajors science course can improve student ability to convey critical thought about science. A sample of three papers written by students (n = 30) was coded for presence and accuracy of elements related to scientific writing. Scores for different aspects of scientific writing were significantly correlated, suggesting that students recognized relationships between components of scientific thought. We found that students' ability to write about science topics and state conclusions based on data improved over the course of three writing assignments, while the abilities to state a hypothesis and draw clear connections between human activities and environmental impacts did not improve. Three writing assignments generated significant change in student ability to write scientifically, although our results suggest that three is an insufficient number to generate complete development of scientific writing skills. PMID:22383616
Art of reading a journal article: Methodically and effectively
Subramanyam, RV
2013-01-01
Background: Reading scientific literature is mandatory for researchers and clinicians. With an overflow of medical and dental journals, it is essential to develop a method to choose and read the right articles. Objective: To outline a logical and orderly approach to reading a scientific manuscript. By breaking down the task into smaller, step-by-step components, one should be able to attain the skills to read a scientific article with ease. Methods: The reader should begin by reading the title, abstract and conclusions first. If a decision is made to read the entire article, the key elements of the article can be perused in a systematic manner effectively and efficiently. A cogent and organized method is presented to read articles published in scientific journals. Conclusion: One can read and appreciate a scientific manuscript if a systematic approach is followed in a simple and logical manner. PMID:23798833
Novel residual-based large eddy simulation turbulence models for incompressible magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Sondak, David
The goal of this work was to develop, introduce, and test a promising computational paradigm for the development of turbulence models for incompressible magnetohydrodynamics (MHD). MHD governs the behavior of an electrically conducting fluid in the presence of an external electromagnetic (EM) field. The incompressible MHD model is used in many engineering and scientific disciplines, from the development of nuclear fusion as a sustainable energy source to the study of space weather and solar physics. Many interesting MHD systems exhibit the phenomenon of turbulence, which remains an elusive problem from all scientific perspectives. This work focuses on the computational perspective and proposes techniques that enable the study of systems involving MHD turbulence. Direct numerical simulation (DNS) is not a feasible approach for studying MHD turbulence. In this work, turbulence models for incompressible MHD were developed from the variational multiscale (VMS) formulation, wherein the solution fields were decomposed into resolved and unresolved components. The unresolved components were modeled with a term that is proportional to the residual of the resolved scales. Two additional MHD models were developed based on the VMS formulation: a residual-based eddy viscosity (RBEV) model and a mixed model that partners the VMS formulation with the RBEV model. These models are endowed with several special numerical and physical features. Included in the numerical features is the internal numerical consistency of each of the models. Physically, the new models are able to capture desirable MHD physics such as the inverse cascade of magnetic energy and the subgrid dynamo effect. The models were tested with a Fourier-spectral numerical method and the finite element method (FEM). The primary test problem was the Taylor-Green vortex. Results comparing the performance of the new models to DNS were obtained. The performance of the new models was compared to classic and cutting-edge dynamic Smagorinsky eddy viscosity (DSEV) models. The new models typically outperform the classical models.
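In outline (our paraphrase of the generic residual-based VMS closure, not the dissertation's exact equations), the velocity and magnetic fields are split into resolved and unresolved parts, and the unresolved parts are modeled as proportional to the resolved-scale residuals:

```latex
% Generic residual-based VMS sketch (not the exact model from this work):
% decompose the fields, then model the fine scales with coarse-scale residuals.
u = \bar{u} + u', \qquad b = \bar{b} + b',
\qquad
u' \approx -\tau_M \, \mathcal{R}_M(\bar{u}, \bar{b}),
\qquad
b' \approx -\tau_B \, \mathcal{R}_B(\bar{u}, \bar{b})
```

Here $\mathcal{R}_M$ and $\mathcal{R}_B$ denote the residuals of the resolved momentum and magnetic-induction equations, and $\tau_M$, $\tau_B$ are stabilization parameters; when the resolved solution satisfies the equations exactly, the modeled fine scales vanish, which is the internal consistency property the abstract highlights.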
Scientific Communication and the Nature of Science
NASA Astrophysics Data System (ADS)
Nielsen, Kristian H.
2013-09-01
Communication is an important part of scientific practice and, arguably, may be seen as constitutive to scientific knowledge. Yet, often scientific communication gets cursory treatment in science studies as well as in science education. In Nature of Science (NOS), for example, communication is rarely mentioned explicitly, even though, as will be argued in this paper, scientific communication could be treated as a central component of NOS. Like other forms of communication, scientific communication is socially and symbolically differentiated. Among other things, it encompasses technical language and grammar, lab communications, and peer reviews, all of which will be treated in this paper in an attempt to engage on an empirical and theoretical level with science as communication. Seeing science as a form of communicative action supplements the epistemological view of science that is standard to both NOS and the philosophy of science. Additions to the seven NOS aspects on Lederman's (Handbook of research on science education. Lawrence Erlbaum, Mahwah, pp. 831-879,
Position of the American Dietetic Association: Functional foods.
Hasler, Clare M; Bloch, Abby S; Thomson, Cynthia A; Enrione, Evelyn; Manning, Carolyn
2004-05-01
It is the position of the American Dietetic Association that functional foods, including whole foods and fortified, enriched, or enhanced foods, have a potentially beneficial effect on health when consumed as part of a varied diet on a regular basis, at effective levels. The Association supports research to define further the health benefits and risks of individual functional foods and their physiologically active components. Dietetics professionals will continue to work with the food industry, the government, the scientific community, and the media to ensure that the public has accurate information regarding this emerging area of food and nutrition science. Knowledge of the role of physiologically active food components, from both phytochemicals and zoochemicals, has changed the role of diet in health. Functional foods have evolved as food and nutrition science has advanced beyond the treatment of deficiency syndromes to reduction of disease risk. This position reviews the definition of functional foods, their regulation, and the scientific evidence supporting this emerging area of food and nutrition. Foods can no longer be evaluated only in terms of macronutrient and micronutrient content alone. Analyzing the content of other physiologically active components and evaluating their role in health promotion will be necessary. The availability of health-promoting functional foods in the US diet has the potential to help ensure a healthier population. However, each functional food should be evaluated on the basis of scientific evidence to ensure appropriate integration into a varied diet.
Optimized technical and scientific design approach for high performance anticoincidence shields
NASA Astrophysics Data System (ADS)
Graue, Roland; Stuffler, Timo; Monzani, Franco; Bastia, Paolo; Gryksa, Werner; Pahl, Germit
2018-04-01
This paper, "Optimized technical and scientific design approach for high performance anticoincidence shields," was presented as part of International Conference on Space Optics—ICSO 1997, held in Toulouse, France.
ERIC Educational Resources Information Center
Mukan, Nataliya; Kravets, Svitlana; Khamulyak, Nataliya
2016-01-01
In the article, the content and operational components of the continuing professional development of public school teachers in Great Britain, Canada, and the USA are characterized. The main objectives are defined as the theoretical analysis of scientific-pedagogical literature, which highlights different aspects of the problem under research;…
ERIC Educational Resources Information Center
Deemer, Eric D.; Carter, Alice P.; Lobrano, Michael T.
2010-01-01
The current research sought to extend the 2 x 2 achievement goal framework by developing and testing the Achievement Goals for Research Scale (AGRS). Participants (N = 317) consisted of graduate students in the life, physical, and behavioral sciences. A principal components analysis (PCA) extracted five components accounting for 72.59% of the…
Integrated System for Autonomous Science
NASA Technical Reports Server (NTRS)
Chien, Steve; Sherwood, Robert; Tran, Daniel; Cichy, Benjamin; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Frye, Stuart; Trout, Bruce; Shulman, Seth;
2006-01-01
The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in "The TechSat 21 Autonomous Sciencecraft Experiment" (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Orbiter 1 (EO-1) spacecraft mission and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. This software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under control by a task-execution component of the software that is capable of responding to anomalies.
Development of Probiotic Formulation for the Treatment of Iron Deficiency Anemia.
Korčok, Davor Jovan; Tršić-Milanović, Nada Aleksandar; Ivanović, Nevena Djuro; Đorđević, Brižita Ivan
2018-04-01
Probiotics are increasingly present both as functional foods and in pharmaceutical preparations, with multiple levels of action that contribute to human health. Probiotics realize their positive effects at a proper dose and by maintaining the declared number of probiotic cells until the expiration date. An important precondition for developing a probiotic product is the right choice of a clinically proven probiotic strain, the choice of other active components, and the optimization of the quantity of the active probiotic component per product dose. This paper describes the optimization of the number of probiotic cells in the formulation of a dietary supplement that contains the probiotic culture Lactobacillus plantarum 299v, iron, and vitamin C. Variations in the quantity of the active component were analyzed in development batches of the encapsulated probiotic product, categorized as a dietary supplement, with the following ingredients: probiotic culture, a sucrosomal form of iron, and vitamin C. An optimal quantity of the active component, 50 mg of L. plantarum, was selected. The purpose of this paper is to select the optimal formulation of the probiotic culture in a dietary supplement that contains iron and vitamin C, and to determine its expiration date by analysis of the number of viable probiotic cells.
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in version 2.0 (Endrizzi et al., 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use, both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point version 2.0, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering step applied to the package. To track every change, the package is published in its own GitHub repository (geotopmodel.github.io/geotop/) under the GPL v3.0 license. A continuous integration mechanism based on Travis CI has been enabled on the repository's master and main development branches. The use of the CMake configuration tool and a suite of tests (easily managed with ctest) greatly reduces the burden of installation and enhances portability across compilers and operating-system platforms. The package is also complemented by several software tools that provide web-based visualization of results based on R packages, in particular "shiny" (Chang et al., 2016) and the "geotopbricks" and "geotopOptim2" packages (Cordano et al., 2016), which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development, but our first results are promising enough to suggest we will eventually reach a robust and stable software project that manages, in a flexible way, a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.
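The benchmark-validation pattern described above lends itself to a compact sketch. The following is a minimal illustration under assumed file names, not GEOtop's actual CMake/ctest test suite: a new simulation output is compared against a published baseline run within a tolerance, the check that makes each re-engineering step verifiable.

```python
# Illustrative sketch (not GEOtop's actual test harness): compare a new
# simulation output against a published baseline within a tolerance.
import numpy as np

def check_against_baseline(new_csv, baseline_csv, rtol=1e-5, atol=1e-8):
    new = np.loadtxt(new_csv, delimiter=",")
    base = np.loadtxt(baseline_csv, delimiter=",")
    ok = new.shape == base.shape and np.allclose(new, base, rtol=rtol, atol=atol)
    print("PASS" if ok else "FAIL", new_csv)
    return ok

# Hypothetical file names for one benchmark case:
# check_against_baseline("soil_temperature_new.csv", "soil_temperature_v2.0.csv")
```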
NASA Astrophysics Data System (ADS)
Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.
2014-12-01
We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services supporting multiple data-intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000-CPU-core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of the interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or as scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open-source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include provenance, publication of results, monitoring, and workflow tools. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into the generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.
Launch of the space experiment PAMELA
NASA Astrophysics Data System (ADS)
Casolino, M.; Picozza, P.; Altamura, F.; Basili, A.; De Simone, N.; Di Felice, V.; De Pascale, M. P.; Marcelli, L.; Minori, M.; Nagni, M.; Sparvoli, R.; Galper, A. M.; Mikhailov, V. V.; Runtso, M. F.; Voronov, S. A.; Yurkin, Y. T.; Zverev, V. G.; Castellini, G.; Adriani, O.; Bonechi, L.; Bongi, M.; Taddei, E.; Vannuccini, E.; Fedele, D.; Papini, P.; Ricciarini, S. B.; Spillantini, P.; Ambriola, M.; Cafagna, F.; De Marzo, C.; Barbarino, G. C.; Campana, D.; De Rosa, G.; Osteria, G.; Russo, S.; Bazilevskaja, G. A.; Kvashnin, A. N.; Maksumov, O.; Misin, S.; Stozhkov, Yu. I.; Bogomolov, E. A.; Krutkov, S. Yu.; Nikonov, N. N.; Bonvicini, V.; Boezio, M.; Lundquist, J.; Mocchiutti, E.; Vacchi, A.; Zampa, G.; Zampa, N.; Bongiorno, L.; Ricci, M.; Carlson, P.; Hofverberg, P.; Lund, J.; Orsi, S.; Pearce, M.; Menn, W.; Simon, M.
2008-08-01
PAMELA is a satellite-borne experiment designed to study with great accuracy cosmic rays of galactic, solar, and trapped nature in a wide energy range (protons 80 MeV-700 GeV, electrons 50 MeV-400 GeV). Its main objective is the study of the antimatter component: antiprotons (80 MeV-190 GeV), positrons (50 MeV-270 GeV) and a search for antimatter with a precision of the order of 10⁻⁸. The experiment, housed on board the Russian Resurs-DK1 satellite, was launched on June 15th, 2006 into a 350 × 600 km orbit with an inclination of 70°. The detector is composed of a series of scintillator counters arranged at the extremities of a permanent-magnet spectrometer to provide charge, time-of-flight, and rigidity information. Lepton/hadron identification is performed by a silicon-tungsten calorimeter and a neutron detector placed at the bottom of the device. An anticounter system is used offline to reject false triggers coming from the satellite. In self-trigger mode the calorimeter, the neutron detector, and a shower tail catcher are capable of an independent measurement of the lepton component up to 2 TeV. In this work we describe the experiment, its scientific objectives, and its performance in the first months after launch.
Smartphones for cell and biomolecular detection.
Liu, Xiyuan; Lin, Tung-Yi; Lillehoj, Peter B
2014-11-01
Recent advances in biomedical science and technology have played a significant role in the development of new sensors and assays for cell and biomolecular detection. Generally, these efforts are aimed at reducing the complexity and costs associated with diagnostic testing so that it can be performed outside of a laboratory or hospital setting, requiring minimal equipment and user involvement. In particular, point-of-care (POC) testing offers immense potential for many important applications including medical diagnosis, environmental monitoring, food safety, and biosecurity. When coupled with smartphones, POC systems can offer portability, ease of use and enhanced functionality while maintaining performance. This review article focuses on recent advancements and developments in smartphone-based POC systems within the last 6 years with an emphasis on cell and biomolecular detection. These devices typically comprise multiple components, such as detectors, sample processors, disposable chips, batteries, and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. Researchers have demonstrated several promising approaches employing various detection schemes and device configurations, and it is expected that further developments in biosensors, battery technology and miniaturized electronics will enable smartphone-based POC technologies to become more mainstream tools in the scientific and biomedical communities.
Increasing patient engagement in rehabilitation exercises using computer-based citizen science.
Laut, Jeffrey; Cappa, Francesco; Nov, Oded; Porfiri, Maurizio
2015-01-01
Patient motivation is an important factor to consider when developing rehabilitation programs. Here, we explore the effectiveness of active participation in web-based citizen science activities as a means of increasing participant engagement in rehabilitation exercises, through the use of a low-cost haptic joystick interfaced with a laptop computer. Using the joystick, patients navigate a virtual environment representing the site of a citizen science project situated in a polluted canal. Participants are tasked with following a path on a laptop screen representing the canal. The experiment consists of two conditions: in one condition, a citizen science component where participants classify images from the canal is included; and in the other, the citizen science component is absent. Both conditions are tested on a group of young patients undergoing rehabilitation treatments and a group of healthy subjects. A survey administered at the end of both tasks reveals that participants prefer performing the scientific task, and are more likely to choose to repeat it, even at the cost of increasing the time of their rehabilitation exercise. Furthermore, performance indices based on data collected from the joystick indicate significant differences in the trajectories created by patients and healthy subjects, suggesting that the low-cost device can be used in a rehabilitation setting for gauging patient recovery.
NASA Astrophysics Data System (ADS)
Riedler, W.; Torkar, K.
1996-05-01
This issue is grouped into sections on materials, design, performance and analysis of balloons, reviews of major national and international balloon programmes, novel instrumentation and systems for scientific ballooning, and selected recent scientific observations.
Reference management: A critical element of scientific writing
Kali, Arunava
2016-01-01
With the rapid growth of medical science, the number of scientific publications contributing to the medical literature has increased significantly in recent years. Owing to the considerable variation in formatting across citation styles, strict adherence to accurate referencing done manually is labor-intensive and challenging. However, the introduction of referencing tools has greatly decreased this complexity. These software tools have advanced over time to include newer features that support effective reference management. Since scientific writing is an essential component of the medical curriculum, it is imperative for medical graduates to understand the various referencing systems so as to make effective use of these tools in their dissertations and future research. PMID:26952149
NASA Astrophysics Data System (ADS)
Bergey, Bradley W.; Ketelhut, Diane Jass; Liang, Senfeng; Natarajan, Uma; Karakus, Melissa
2015-10-01
The primary aim of the study was to examine whether performance on a science assessment in an immersive virtual environment was associated with changes in scientific inquiry self-efficacy. A secondary aim of the study was to examine whether performance on the science assessment was equitable for students with different levels of computer game self-efficacy, including whether gender differences were observed. We examined 407 middle school students' scientific inquiry self-efficacy and computer game self-efficacy before and after completing a computer game-like assessment about a science mystery. Results from path analyses indicated that prior scientific inquiry self-efficacy predicted achievement on end-of-module questions, which in turn predicted change in scientific inquiry self-efficacy. By contrast, computer game self-efficacy was neither predictive of nor predicted by performance on the science assessment. While boys had higher computer game self-efficacy compared to girls, multi-group analyses suggested only minor gender differences in how efficacy beliefs related to performance. Implications for assessments with virtual environments and future design and research are discussed.
Age 60 study, part II : airline pilot age and performance - a review of the scientific literature.
DOT National Transportation Integrated Search
1994-10-01
This review of the literature establishes the scientific foundation for subsequent studies on the Age 60 Rule research conducted under a contract with Hilton Systems, Inc. The scientific literature relevant to the two separate scientific approaches r...
Liu, Yingchun; Liu, Zhongbo; Sun, Guoxiang; Wang, Yan; Ling, Junhong; Gao, Jiayue; Huang, Jiahao
2015-01-01
A combination method of multi-wavelength fingerprinting and multi-component quantification by high-performance liquid chromatography (HPLC) coupled with a diode array detector (DAD) was developed and validated to monitor and evaluate the quality consistency of herbal medicines (HM) in the classical preparation Compound Bismuth Aluminate tablets (CBAT). The validation results demonstrated that the method met the requirements of fingerprint and quantification analysis, with suitable linearity, precision, accuracy, limits of detection (LOD) and limits of quantification (LOQ). In the fingerprint assessments, rather than using the conventional qualitative "Similarity" criterion, the simple quantified ratio fingerprint method (SQRFM) was recommended; its quantified-fingerprint character is an important advantage over the "Similarity" approach. SQRFM qualitatively and quantitatively offers scientific criteria, in terms of three parameters, for a traditional Chinese medicine (TCM)/HM quality pyramid and warning gate. To combine the comprehensive characterization of multi-wavelength fingerprints, an integrated fingerprint assessment strategy based on information entropy was set up, involving a super-information characteristic digitized parameter of fingerprints that reveals the total entropy value and absolute information amount of the fingerprints and thus offers an excellent method for fingerprint integration. The correlation between quantified fingerprints and the quantitative determination of 5 marker compounds, including glycyrrhizic acid (GLY), liquiritin (LQ), isoliquiritigenin (ILG), isoliquiritin (ILQ) and isoliquiritin apioside (ILA), indicated that multi-component quantification could be replaced by quantified fingerprints. The Fenton reaction was employed to determine the antioxidant activities of CBAT samples in vitro, and these were correlated with HPLC fingerprint components using the partial least squares regression (PLSR) method. In summary, the method of multi-wavelength fingerprints combined with antioxidant activities proved to be a feasible and scientific procedure for monitoring and evaluating the quality consistency of CBAT.
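The information-entropy ingredient can be sketched simply: treat normalized peak areas of a fingerprint as a probability distribution and compute its Shannon entropy. This is a generic illustration on invented peak areas; the paper's actual SQRFM parameters and weighting scheme are not reproduced here.

```python
# Minimal sketch of the entropy idea behind the integrated fingerprint
# assessment; peak areas below are placeholders, not the paper's data.
import numpy as np

def fingerprint_entropy(peak_areas):
    p = np.asarray(peak_areas, dtype=float)
    p = p / p.sum()                      # normalize areas to probabilities
    p = p[p > 0]                         # ignore empty channels
    return -(p * np.log(p)).sum()        # Shannon entropy of the fingerprint

print(fingerprint_entropy([12.1, 8.4, 30.2, 5.5, 43.8]))
```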
Simulation-Based e-Learning Tools for Science,Engineering, and Technology Education(SimBeLT)
NASA Astrophysics Data System (ADS)
Davis, Doyle V.; Cherner, Y.
2006-12-01
The focus of Project SimBeLT is the research, development, testing, and dissemination of a new type of simulation-based, integrated e-learning set of modules for two-year college technical and engineering curricula in the areas of thermodynamics, fluid physics, and fiber optics that can also be used in secondary schools and four-year colleges. A collection of sophisticated virtual labs is the core component of the SimBeLT modules. These labs are designed to enhance the understanding of technical concepts and the underlying fundamental principles of these topics, as well as to help students master certain performance-based skills online. SimBeLT software will help educators meet the National Science Education Standard that "learning science and technology is something that students do, not something that is done to them". A major component of Project SimBeLT is the development of multi-layered, technology-oriented virtual labs that realistically mimic workplace-like environments. Dynamic data exchange between simulations will be implemented, and links with instant instructional messages and data-handling tools will be realized. A second important goal of Project SimBeLT is to bridge technical skills and scientific knowledge by enhancing the teaching and learning of specific scientific or engineering subjects. SimBeLT builds upon research and outcomes of interactive teaching strategies and tools developed through prior NSF funding (http://webphysics.nhctc.edu/compact/index.html). (Project SimBeLT is partially supported by a grant from the National Science Foundation, DUE-0603277.)
Data Transparency in Privately Funded Scientific Research
NASA Astrophysics Data System (ADS)
Brewer, P. G.
2016-12-01
Research investigations funded by the Gulf of Mexico Research Initiative (GoMRI) have resulted in a large pulse of scientific data produced by studies ranging across the research goals of the program. These studies have produced datasets from laboratory, field, and modeling activities describing phenomena ranging from microscopic fluid dynamics to large-scale ocean currents, bacteria to marine mammals, and detailed field observations to synoptic mapping. One of GoMRI's central tenets is to ensure that all data are preserved and made publicly available. Thus, GoMRI formed the Gulf of Mexico Research Initiative Data and Information Cooperative (GRIIDC) with the mission to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem. The GoMRI Research Board's commitment to open data is exemplified in GoMRI's data program policies and management. The Research Board established a policy that research data must be publicly available as soon as possible, and no later than one year following collection or at the time of publication. GRIIDC's data specialists and computer system experts, along with a team of GoMRI-funded researchers and GoMRI Research Board members, developed a data management system and process for storing and distributing all of the scientific data generated by GoMRI researchers. Researcher compliance with the data policy is a requirement for annual funding increments, no-cost extensions, and eligibility for future funding. Since data compliance is an important element of grant performance, compliance with GoMRI data policies is actively tracked and reported to the Board. This initiative comprises an essential component of GoMRI's research independence and legacy.
Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools
Ozcan, Aydogan
2014-01-01
In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550
Motion Simulation Analysis of Rail Weld CNC Fine Milling Machine
NASA Astrophysics Data System (ADS)
Mao, Huajie; Shu, Min; Li, Chao; Zhang, Baojun
A CNC fine milling machine is a new advanced piece of equipment for rail weld precision machining, with high precision, high efficiency, low environmental pollution and other technical advantages. The motion performance of this machine directly affects its machining accuracy and stability, which makes it an important consideration in its design. Based on the design drawings, this article completed the 3D modeling of a 60 kg/m rail weld CNC fine milling machine using SolidWorks. The geometry was then imported into Adams for motion simulation analysis. The displacement, velocity, angular velocity and other kinematic parameter curves of the main components were obtained in post-processing; these provide a scientific basis for the design and development of this machine.
Adaptive infrared-reflecting systems inspired by cephalopods
NASA Astrophysics Data System (ADS)
Xu, Chengyi; Stiubianu, George T.; Gorodetsky, Alon A.
2018-03-01
Materials and systems that statically reflect radiation in the infrared region of the electromagnetic spectrum underpin the performance of many entrenched technologies, including building insulation, energy-conserving windows, spacecraft components, electronics shielding, container packaging, protective clothing, and camouflage platforms. The development of their adaptive variants, in which the infrared-reflecting properties dynamically change in response to external stimuli, has emerged as an important unmet scientific challenge. By drawing inspiration from cephalopod skin, we developed adaptive infrared-reflecting platforms that feature a simple actuation mechanism, low working temperature, tunable spectral range, weak angular dependence, fast response, stability to repeated cycling, amenability to patterning and multiplexing, autonomous operation, robust mechanical properties, and straightforward manufacturability. Our findings may open opportunities for infrared camouflage and other technologies that regulate infrared radiation.
The Classification of Ground Roasted Decaffeinated Coffee Using UV-VIS Spectroscopy and SIMCA Method
NASA Astrophysics Data System (ADS)
Yulia, M.; Asnaning, A. R.; Suhandy, D.
2018-05-01
In this work, the classification of decaffeinated and non-decaffeinated coffee samples using UV-VIS spectroscopy and the SIMCA method was investigated. A total of 200 samples of ground roasted coffee were used (100 samples of decaffeinated coffee and 100 samples of non-decaffeinated coffee). After extraction and dilution, the spectra of the coffee sample solutions were acquired using a UV-VIS spectrometer (Genesys™ 10S UV-VIS, Thermo Scientific, USA) in the range of 190-1100 nm. Multivariate analyses of the spectra were performed using principal component analysis (PCA) and soft independent modeling of class analogy (SIMCA). The SIMCA model classified decaffeinated and non-decaffeinated coffee samples with 100% sensitivity and specificity.
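For readers unfamiliar with SIMCA, a minimal class-modeling sketch is given below: one PCA model is fitted per class, and an unknown is accepted by a class if its residual distance to that class's PCA subspace is small. The component counts, acceptance band, and random placeholder spectra are illustrative assumptions, not the settings used in the paper.

```python
# Minimal SIMCA-style two-class sketch (illustrative, not the paper's model).
import numpy as np
from sklearn.decomposition import PCA

def fit_class_model(X, n_components=3):
    pca = PCA(n_components=n_components).fit(X)
    resid = X - pca.inverse_transform(pca.transform(X))
    s0 = np.sqrt((resid ** 2).sum(axis=1).mean())  # typical residual distance
    return pca, s0

def residual_distance(pca, X):
    resid = X - pca.inverse_transform(pca.transform(X))
    return np.sqrt((resid ** 2).sum(axis=1))

# Placeholder (samples x wavelengths) training spectra and unknowns:
X_decaf, X_regular = np.random.rand(100, 400), np.random.rand(100, 400)
X_new = np.random.rand(5, 400)
models = {"decaf": fit_class_model(X_decaf), "regular": fit_class_model(X_regular)}
for name, (pca, s0) in models.items():
    accept = residual_distance(pca, X_new) < 2.0 * s0   # crude acceptance band
    print(name, accept)
```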
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the Climate and Forecast (CF) metadata conventions, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This in turn allows the development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for the integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog that stores granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, the data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for the scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/ and is coupled to an online continuous integration system (e.g., Travis CI).
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high-performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offer functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability into a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
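As a rough illustration of the virtual-cluster idea, a few lines of boto3 can request and wait on a group of EC2 instances. The AMI ID and instance type below are placeholders; the actual SCC toolset automates much more than this (parallel execution, monitoring, and the I/O utilities the abstract mentions).

```python
# A minimal sketch, assuming boto3 credentials are configured; the AMI ID,
# instance type, and cluster size are placeholders, not the paper's toolset.
import boto3

def launch_virtual_cluster(ami_id, n_nodes, instance_type="c5.xlarge"):
    ec2 = boto3.resource("ec2")
    # One reservation holding all worker nodes of the virtual cluster.
    instances = ec2.create_instances(
        ImageId=ami_id, InstanceType=instance_type,
        MinCount=n_nodes, MaxCount=n_nodes,
    )
    for inst in instances:
        inst.wait_until_running()
        inst.reload()                    # refresh attributes such as DNS name
    return [inst.public_dns_name for inst in instances]

# hosts = launch_virtual_cluster("ami-0123456789abcdef0", n_nodes=4)
```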
NASA Astrophysics Data System (ADS)
Schalk, Kelly A.
The purpose of this investigation was to measure specific ways in which a student-interest, SSI-based curricular and pedagogical framework affects undergraduates' ability to reason informally. The delimited components of informal reasoning measured were undergraduates' Nature of Science conceptualizations and their ability to evaluate scientific information. The socio-scientific issues (SSI) theoretical framework used in this case study has been advocated as a means of improving students' functional scientific literacy. This investigation focused on the laboratory component of an undergraduate microbiology course in spring 2008, with 26 participants. The instruments used in this study included: (1) individual and group research projects, (2) journals, (3) laboratory write-ups, (4) a laboratory quiz, (5) anonymous evaluations, and (6) a pre/post article exercise. All instruments yielded qualitative data, which were coded using the qualitative software NVivo7. Data analyses were subjected to instrumental triangulation, inter-rater reliability, and member-checking. It was determined that undergraduates' epistemological knowledge of scientific discovery, processes, and justification matured in response to the intervention. Specifically, students realized: (1) the differences between facts, theories, and opinions; (2) that testable questions are not definitively proven; (3) that there is no stepwise scientific process; and (4) that lack of data weakens a claim. This knowledge influenced participants' beliefs and ability to reason informally. For instance, students exhibited more critical evaluations of scientific information. It was also found that undergraduates' prior opinions changed over the semester. Further, the student-interest aspect of this framework engaged learners by offering participants several opportunities to examine microbiology issues that affected their lives. The investigation provided empirically based insights into ways undergraduates' interest and functional scientific literacy can be promoted, and advanced what was known about applying SSI-based frameworks in the post-secondary learner context. Outstanding questions remain. For example, is this type of student-interest, SSI-based intervention broadly applicable (i.e., in other science disciplines and grade levels)? And what challenges would teachers in diverse contexts encounter when implementing an SSI-based theoretical framework?
Improved Ecosystem Predictions of the California Current System via Accurate Light Calculations
2011-09-30
Curtis D. Mobley, Sequoia Scientific, Inc., 2700 Richards Road, Suite 107, Bellevue, WA 98005. Cited documentation: Mobley, C. D., EcoLight-S 1.0 Users' Guide and Technical Documentation, Sequoia Scientific, Inc., Bellevue, WA, 38 pages; Mobley, C. D., 2011, Fast light calculations
Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...
2018-03-22
The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale- or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis enables us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
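The experimental loop has a simple shape, sketched below under stated assumptions: the Linux intel-rapl powercap sysfs path (which varies by machine and requires root) is used to set the package cap, and a NumPy matmul stands in for the paper's representative kernels.

```python
# A rough sketch of the experiment's shape (not the authors' code): set a
# package power cap via the Linux RAPL sysfs interface, then time a
# compute-bound kernel under it. Requires root; the path can differ.
import time
import numpy as np

RAPL = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def set_power_cap(watts):
    with open(RAPL, "w") as f:
        f.write(str(int(watts * 1e6)))   # interface takes microwatts

def time_kernel(n=4096):
    a, b = np.random.rand(n, n), np.random.rand(n, n)
    t0 = time.perf_counter()
    np.dot(a, b)                         # compute-intensive, DGEMM-like kernel
    return time.perf_counter() - t0

for cap in (120, 100, 80, 60):           # candidate caps in watts
    set_power_cap(cap)
    print(cap, "W ->", time_kernel(), "s")
```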
A Systematic Approach for Obtaining Performance on Matrix-Like Operations
NASA Astrophysics Data System (ADS)
Veras, Richard Michael
Scientific computation plays a critical role in the scientific process because it allows us to ask complex queries and test predictions that would otherwise be infeasible to perform experimentally. Because of its power, scientific computing has helped drive advances in many fields, ranging from engineering and physics to biology and sociology to economics and drug development, and even to machine learning and artificial intelligence. Common among these domains is the desire for timely computational results, so a considerable amount of human expert effort is spent obtaining performance for these scientific codes. This is no easy task, because each of these domains presents its own unique set of challenges to software developers, such as domain-specific operations, structurally complex data and ever-growing datasets. Compounding these problems are the myriad constantly changing, complex and unique hardware platforms that an expert must target. Unfortunately, an expert is typically forced to reproduce their effort across multiple problem domains and hardware platforms. In this thesis, we demonstrate the automatic generation of expert-level high-performance scientific codes for dense linear algebra (DLA), structured mesh (stencil), sparse linear algebra and graph analytics. In particular, this thesis seeks to address the issue of obtaining performance on many complex platforms for a certain class of matrix-like operations that span many scientific, engineering and social fields. We do this by automating a method used for obtaining high performance in DLA and extending it to the structured, sparse and scale-free domains. We argue that it is the use of the underlying structure found in the data from these domains that enables this process. Thus, obtaining performance for most operations does not occur in isolation from the data being operated on, but instead depends significantly on the structure of the data.
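A small example of the "structure drives the kernel" point: the same mathematical operation y = Ax specializes to a CSR sparse kernel when zeros dominate, with the loop structure mirroring the row layout of the data. This generic sketch is mine, not the thesis's generated code.

```python
# Generic CSR sparse matrix-vector multiply: the kernel's loop structure
# follows directly from the structure of the stored data.
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):              # one pass per matrix row
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# 3x3 matrix [[2,0,1],[0,3,0],[4,0,5]] in CSR form:
vals = np.array([2., 1., 3., 4., 5.])
cols = np.array([0, 2, 1, 0, 2])
rptr = np.array([0, 2, 3, 5])
print(spmv_csr(vals, cols, rptr, np.array([1., 1., 1.])))  # -> [3. 3. 9.]
```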
Teaching Scientific Communication Skills in Science Studies: Does It Make a Difference?
ERIC Educational Resources Information Center
Spektor-Levy, Ornit; Eylon, Bat-Sheva; Scherz, Zahava
2009-01-01
This study explores the impact of "Scientific Communication" (SC) skills instruction on students' performances in scientific literacy assessment tasks. We present a general model for skills instruction, characterized by explicit and spiral instruction, integration into content learning, practice in several scientific topics, and application of…
Chandra X-ray Center Science Data Systems Regression Testing of CIAO
NASA Astrophysics Data System (ADS)
Lee, N. P.; Karovska, M.; Galle, E. C.; Bonaventura, N. R.
2011-07-01
The Chandra Interactive Analysis of Observations (CIAO) is a software system developed for the analysis of Chandra X-ray Observatory observations. An important component of a successful CIAO release is the repeated testing of the tools across various platforms to ensure consistent and scientifically valid results. We describe the procedures of the scientific regression testing of CIAO and the enhancements made to the testing system to increase the efficiency of run time and result validation.
1967-01-01
FIGURE 7-5. Effects of vacuum on space components and types of vacuum pumps used in...
Scientific ballooning in India Recent developments
NASA Astrophysics Data System (ADS)
Manchanda, R. K.
Established in 1971, the National Balloon Facility operated by TIFR in Hyderabad, India, is a unique facility in the country, providing a complete solution in scientific ballooning. It is also one of its kind in the world, since it combines both in-house balloon production and complete flight support for scientific ballooning. With a large team working throughout the year to design, fabricate and launch scientific balloons, the Hyderabad facility is a unique centre of expertise where balloon design, research and development, production and launch facilities are located under one roof. Our balloons are manufactured from 100% indigenous components. Mission-specific balloon design, high-reliability control and support instrumentation, and in-house competence in tracking, telemetry, telecommand, data processing, system design and mechanics are its hallmarks. In the past few years, we have executed a major programme of upgrading the different components of balloon production, telemetry and telecommand hardware and various support facilities. This paper focuses on our increased capability for producing balloons of large sizes, up to 780,000 m³, using Antrix film, the development of high-strength balloon load tapes with a breaking strength of 182 kg, and the recent introduction of S-band telemetry and a commandable timer cut-off unit in the flight hardware. A summary of the various flights conducted in recent years is presented, along with plans for new facilities.
The PACA Project Ecology: Observing Campaigns, Outreach and Citizen Science
NASA Astrophysics Data System (ADS)
Yanamandra-Fisher, P. A.
2016-12-01
The PACA Project has three main components: observational campaigns aligned with scientific research; outreach to engage all kinds of audiences; and citizen science projects that aim to produce specific scientific results by engaging professional and amateur scientific communities and a variety of audiences. The primary observational projects are defined by specific scientific goals set by professionals, resulting in global observing campaigns involving a variety of observers and observing techniques. Some of PACA's observing campaigns have included the global characterization of comets (e.g., C/ISON, C/Siding Spring, 67P/Churyumov-Gerasimenko, Lovejoy) and planets (Jupiter, Saturn and Mars), and are currently expanding to include polarimetric exploration of solar system objects with small apertures and collaboration with CITIZEN CATE, a citizen science campaign to observe the 2017 continental America total eclipse. Our outreach campaigns leverage multiple social media platforms for at least two important reasons: (i) the immediate dissemination of observations and interaction with the global network and (ii) free or inexpensive resources for most of the participants. The use of social media is becoming prevalent in citizen science projects due to these factors. The final stage of the PACA ecosystem is the integration of these components into a publication. We highlight some of the interesting challenges and solutions of the PACA Project so far and provide a view of future projects in all three categories, with new partnerships and collaborations.
A distributed component framework for science data product interoperability
NASA Technical Reports Server (NTRS)
Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.
2000-01-01
Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally, data from science missions are archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server, as part of a distributed component framework, to allow heterogeneous data systems to communicate and share scientific results.
Manufacturing Methods and Technology Project Summary Reports
1981-06-01
a tough urethane film. The basic principle is to pump two components to a spinning disc, mixing the components just prior to depositing in a well...and check out an electronic target scoring device using developed scientific principles without drastically modifying existing commercial...equipment. The scoring device selected and installed was an Accubar Model ATS-16D using the underlying physics principle of acoustic shock wave propagation
Argumentation Key to Communicating Climate Change to the Public
NASA Astrophysics Data System (ADS)
Bleicher, R. E.; Lambert, J. L.
2012-12-01
Argumentation plays an important role in how we communicate climate change science to the public and is a key component integrated throughout the Next Generation Science Standards. A scientific argument can be described as a disagreement between explanations, with data being used to justify each position. Argumentation is a social process in which two or more individuals construct and critique arguments (Kuhn & Udell, 2003; Nussbaum, 1997). Sampson, Grooms, and Walker (2011) developed a framework for understanding the components of a scientific argument. The three components start with a claim (a conjecture, conclusion, explanation, or answer to a research question). This claim must fit the evidence (observations that show trends over time, relationships between variables or differences between groups). The evidence must be justified with reasoning (which explains how the evidence supports the explanation and why it should count as support). In a scientific argument, or debate, the controversy focuses on how data were collected, what data can or should be included, and what inferences can be made based on a set of evidence. Toulmin's model (1969) also includes rebutting, or presenting an alternative explanation supported by counter-evidence and reasoning for why the alternative is not the appropriate explanation for the question or the problem. The process of scientific argumentation should involve the construction and critique of scientific arguments, including the consideration of alternative hypotheses (Lawson, 2003). Scientific literacy depends as much on the ability to refute and recognize poor scientific arguments as it does on the ability to present an effective argument based on good scientific data (Osborne, 2010). Argument is, therefore, a core feature of science. When students learn to construct a sound scientific argument, they demonstrate critical thinking and a mastery of the science being taught. To present a convincing argument in support of climate change, students must have a sound foundation in the science underlying it. One place to lay this foundation is in the high school science classroom. For students to gain a good conceptual understanding of climate change science, teachers need a sound understanding of climate change and effective resources to teach it to students. Teacher professional development opportunities are required to provide this background, as well as to establish collaborative curriculum planning opportunities at school sites (Shulman, 2007). Various strategies for and challenges of implementing argumentation with preservice and practicing teachers will be discussed in this session, as well as ways that argumentation skills can help the broader public evaluate the claims of climate skeptics. In the field of argumentation theory, Goodwin (2010) has designed a strategy for developing the ability to make effective scientific arguments. The goal is to establish trust even when there is strong disagreement. At its core, a student fully acknowledges the uncertainty involved in the complex science underlying climate change, which has the effect of establishing some degree of trust. In other words, teachers or students trying to explain climate change to others might be perceived as more trustworthy if they openly declare that there are degrees of uncertainty in different aspects of climate change science (American Meteorological Society, 2011).
Predicting performance in competitive apnea diving, part II: dynamic apnoea.
Schagatay, Erika
2010-03-01
Part I of this series of articles identified the main physiological factors defining the limits of static apnea, while this paper reviews the factors involved when physical work is added in the dynamic distance disciplines, performed in shallow water in a swimming pool. Little scientific work has been done concerning the prerequisites and limitations of swimming with or without fins whilst breath holding to extreme limits. Apneic duration influences all competitive apnea disciplines, and can be prolonged by any means that increase gas storage or tolerance to asphyxia, or reduce metabolic rate, as reviewed in the first article. For horizontal underwater distance swimming, the main challenge is to restrict metabolism despite the work, and to direct blood flow only to areas where demand is greatest, to allow sustained function. Here, work economy, local tissue energy and oxygen stores and the anaerobic capacity of the muscles are key components. Improvements in swimming techniques and, especially in swimming with fins, equipment have already contributed to enhanced performance and may do so further. High lactate levels observed after competition swims suggest a high anaerobic component, and muscle hypoxia could ultimately limit muscle work and swimming distance. However, the frequency of syncope, especially in swimming without fins, suggests that cerebral oxygenation may often be compromised before this occurs. In these pool disciplines, safety is high and the dive can be interrupted by the competitor or safety diver within seconds. The safety routines in place during pool competitions are described.
He, Yufei; Li, Qing; Bi, Kaishun
2015-04-01
To control the quality of Rhizoma Chuanxiong, a method based on high-performance liquid chromatography coupled with diode array detection was developed, for the first time, for the quantitative analysis of six active ingredients using a single standard to determine multiple components, together with chemical fingerprint analysis. The separation was performed on an Agilent Zorbax SB-C18 column by gradient elution with methanol and an aqueous phase (containing 0.5% glacial acetic acid) at a flow rate of 1.0 mL/min. The UV wavelength was set at 274 nm. The assay was fully validated with respect to precision, repeatability, and accuracy. All calibration curves showed good linearity (R² > 0.9994) within the test ranges. The limits of detection and quantification were lower than 0.01 and 0.03 μg/mL, respectively. The relative standard deviations for repeatability and the intermediate precision of the six analytes were less than 1.6 and 2.5%, respectively, and the overall recovery was 96.1-103.1%. In addition, fingerprint chromatography using hierarchical clustering analysis and similarity analysis was performed to differentiate and classify the samples. The method described here could provide a more comprehensive and reasonable scientific assessment of the quality of Rhizoma Chuanxiong. The strategy is therefore feasible, credible, and easily and effectively adapted for the quality control of Rhizoma Chuanxiong. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
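As an illustration of the reported validation figures, the sketch below computes calibration linearity and the standard ICH limit estimates (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope) on invented data; the paper's own calibration data and choice of σ estimator are not reproduced here.

```python
# Calibration linearity and LOD/LOQ sketch on invented standards data.
import numpy as np

conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])       # ug/mL standards
area = np.array([1.1, 2.1, 10.4, 20.8, 104.0, 207.5])   # measured peak areas

slope, intercept = np.polyfit(conc, area, 1)             # linear calibration
pred = slope * conc + intercept
r2 = 1 - ((area - pred) ** 2).sum() / ((area - area.mean()) ** 2).sum()
sigma = (area - pred).std(ddof=2)        # residual std dev of the 2-param fit

print(f"R^2 = {r2:.5f}")
print(f"LOD = {3.3 * sigma / slope:.4f} ug/mL, LOQ = {10 * sigma / slope:.4f} ug/mL")
```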
Validation of WRF forecasts for the Chajnantor region
NASA Astrophysics Data System (ADS)
Pozo, Diana; Marín, J. C.; Illanes, L.; Curé, M.; Rabanus, D.
2016-06-01
This study assesses the performance of the Weather Research and Forecasting (WRF) model in representing the near-surface weather conditions and the precipitable water vapour (PWV) on the Chajnantor plateau, in the north of Chile, from 2007 April to December. The WRF model shows very good performance in forecasting the near-surface temperature and zonal wind component, although it overestimates the 2 m water vapour mixing ratio and underestimates the 10 m meridional wind component. The model represents the seasonal, intraseasonal and diurnal variation of PWV very well; however, the PWV errors increase after 12 h of simulation. Errors in the simulations are larger than 1.5 mm only during 10 per cent of the study period, do not exceed 0.5 mm during 65 per cent of the time, and are below 0.25 mm more than 45 per cent of the time, which emphasizes the good performance of the model in forecasting PWV over the region. The misrepresentation of the near-surface humidity in the region by the WRF model may have a negative impact on the PWV forecasts; thus, more accurate forecasts of humidity near the surface may yield more accurate PWV forecasts. Overall, results from this study, as well as recent ones, support the use of the WRF model to provide accurate weather forecasts for the region, particularly for PWV, which can be of great benefit to astronomers in the planning of their scientific operations and observing time.
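The verification statistics quoted above reduce to a few lines of arithmetic. This sketch, run on placeholder arrays rather than the study's data, computes bias, RMSE, and the fraction of samples whose absolute PWV error stays under each threshold:

```python
# Forecast-verification sketch: bias, RMSE, and fraction of samples within
# each error threshold; input arrays are placeholders, not the study's data.
import numpy as np

def pwv_error_stats(forecast, observed, thresholds=(0.25, 0.5, 1.5)):
    err = np.asarray(forecast) - np.asarray(observed)
    stats = {"bias": err.mean(), "rmse": np.sqrt((err ** 2).mean())}
    for t in thresholds:
        stats[f"|err|<{t}mm"] = (np.abs(err) < t).mean()  # fraction of samples
    return stats

print(pwv_error_stats(np.random.rand(100) + 1.0, np.random.rand(100) + 1.0))
```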
ArcGIS Framework for Scientific Data Analysis and Serving
NASA Astrophysics Data System (ADS)
Xu, H.; Ju, W.; Zhang, J.
2015-12-01
ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volumes. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g., raster or image) as a hypercube and enables ArcGIS to handle large-volume and near-real-time scientific data. Built on top of the geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes, and the parallel geoprocessing engine makes data ingestion fast and easy. A raster function, the definition of a raster-processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation with variables ingested in the MDMD: for example, aggregating a monthly average from daily data, computing the total rainfall of a year, calculating a heat index for forecast data, or identifying fishing-habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis, and the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. In addition, MODIS land cover, the NDFD weather service, and the HYCOM ocean model will be used to illustrate how the ArcGIS platform and the MDMD model can facilitate scientific data visualization and analytics, and how the analysis results can be shared with a wider audience through ArcGIS Online and Portal.
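The hypercube slice-and-aggregate pattern the MDMD supports can be illustrated with xarray on a plain netCDF file. This is an analogy in a different toolset, not the ArcGIS/arcpy API itself, and the file and variable names are hypothetical:

```python
# Slice-and-aggregate over a 4-D (X, Y, time, depth) hypercube with xarray,
# as a stand-in for the MDMD's "simple query" access pattern.
import xarray as xr

ds = xr.open_dataset("ocean_model.nc")            # hypothetical file name
# Select one time step and one depth level of a 4-D variable:
slab = ds["water_temp"].sel(time="2015-06-01", depth=0.0, method="nearest")
# Aggregate along the time dimension, e.g., monthly means from daily data:
monthly = ds["water_temp"].resample(time="1M").mean()
print(slab.shape, monthly.sizes)
```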
15 CFR 1180.8 - Appointment of Agency Liaison Officers.
Code of Federal Regulations, 2014 CFR
2014-01-01
... SCIENTIFIC, TECHNICAL AND ENGINEERING INFORMATION TO THE NATIONAL TECHNICAL INFORMATION SERVICE § 1180.8... instruments); (3) Appoint additional liaison officers for major units or components of an agency if the...
26 CFR 1.971-1 - Definitions with respect to export trade corporations.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., industrial, financial, technical, scientific, managerial, engineering, architectural, skilled, or other..., industrial, financial, technical, scientific, managerial, engineering, architectural, skilled, or other... performance for any person of commercial, industrial, financial, technical, scientific, managerial...
26 CFR 1.971-1 - Definitions with respect to export trade corporations.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., industrial, financial, technical, scientific, managerial, engineering, architectural, skilled, or other..., industrial, financial, technical, scientific, managerial, engineering, architectural, skilled, or other... performance for any person of commercial, industrial, financial, technical, scientific, managerial...
The Space Telescope SI C&DH system. [Scientific Instrument Control and Data Handling Subsystem]
NASA Technical Reports Server (NTRS)
Gadwal, Govind R.; Barasch, Ronald S.
1990-01-01
The Hubble Space Telescope Scientific Instrument Control and Data Handling Subsystem (SI C&DH) is designed to interface with five scientific instruments of the Space Telescope to provide ground and autonomous control and to collect health and status information using the Standard Telemetry and Command Components (STACC) multiplex data bus. It also formats high-throughput science data into packets. The packetized data are interleaved, Reed-Solomon encoded for error correction, and pseudo-randomly encoded. An inner convolutional code concatenated with the outer Reed-Solomon code provides excellent error-correction capability. The subsystem is designed for orbital replacement in order to meet a mission life of fifteen years. The spacecraft computer and the SI C&DH computer coordinate the activities of the spacecraft and the scientific instruments to achieve the mission objectives.
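A minimal sketch of the concatenated-coding idea: an outer Reed-Solomon code applied to a science packet, followed by a toy rate-1/2 inner convolutional encoder. The `reedsolo` package and the 32-byte parity choice are assumptions for illustration; the actual HST code parameters are not reproduced here.

```python
from reedsolo import RSCodec  # pip install reedsolo (assumed available)

rs = RSCodec(32)  # 32 parity bytes per packet (illustrative, not the HST code)

packet = bytes(range(223))   # a hypothetical science-data packet
outer = rs.encode(packet)    # outer Reed-Solomon code adds parity bytes

def conv_encode(bits, g1=0b111, g2=0b101):
    """Toy rate-1/2, constraint-length-3 inner convolutional code (7,5 octal)."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111
        # each output bit is the parity of the tapped register bits
        out += [bin(state & g1).count("1") & 1, bin(state & g2).count("1") & 1]
    return out

bits = [(byte >> i) & 1 for byte in outer for i in range(7, -1, -1)]
inner = conv_encode(bits)
print(len(packet), len(outer), len(inner) // 8)  # expansion at each stage
```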
[Alternative medicine: faith or science?].
Pletscher, A
1990-04-21
For the success of both alternative and scientific (conventional) medicine, factors such as the psychological influence of the doctor, loving care, human affection, the patient's belief in the treatment, the suggestive power of attractive (even unproven) theories, dogmas and chance events (e.g. spontaneous remissions) etc. play a major role. Some practices of alternative medicine have a particularly strong appeal to the non-rational side of the human being. Conventional medicine includes a component which is based on scientific and statistical methods. The possibility that in alternative medicine principles and effects exist which are not (yet) known to scientific medicine, but which match up to scientific criteria, cannot be excluded. However, up to now this has not been convincingly proven. The difficulties which arise in the elucidation of this problem are discussed in the light of examples from the literature and some experiments of our own.
Storey, Andrew P; Hieftje, Gary M
2016-12-01
Over the last several decades, science has benefited tremendously by the implementation of digital electronic components for analytical instrumentation. A pioneer in this area of scientific inquiry was Howard Malmstadt. Frequently, such revolutions in scientific history can be viewed as a series of discoveries without a great deal of attention as to how mentorship shapes the careers and methodologies of those who made great strides forward for science. This paper focuses on the verifiable relationships of those who are connected through the academic tree of Malmstadt and how their experiences and the context of world events influenced their scientific pursuits. Particular attention is dedicated to the development of American chemistry departments and the critical role played by many of the individuals in the tree in this process. © The Author(s) 2016.
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
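A conceptual Python sketch of the modular-format idea: a registry of format plugins discovered at run time, with a single unified entry point. SCIFIO itself is a Java library built on SciJava, so none of the names below are its actual API; they only mirror the design the abstract describes.

```python
# Conceptual sketch of SCIFIO-style modular format definitions with
# dynamic discovery; names are illustrative, not the SCIFIO Java API.
FORMATS = []

def format_plugin(cls):
    """Register a format class, mimicking SciJava-style plugin discovery."""
    FORMATS.append(cls())
    return cls

class Format:
    suffixes: tuple = ()
    def can_read(self, path: str) -> bool:
        return path.lower().endswith(self.suffixes)
    def read(self, path: str):
        raise NotImplementedError

@format_plugin
class TiffFormat(Format):
    suffixes = (".tif", ".tiff", ".ome.tiff")
    def read(self, path):
        return f"pixels decoded from {path}"

def open_image(path: str):
    """Unified reader: dispatch to the first format that claims the file."""
    for fmt in FORMATS:
        if fmt.can_read(path):
            return fmt.read(path)
    raise ValueError(f"no registered format for {path}")

print(open_image("experiment.ome.tiff"))
```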
Gottesman, Alan J; Hoskins, Sally G
2013-01-01
The Consider, Read, Elucidate hypotheses, Analyze and interpret data, Think of the next Experiment (CREATE) strategy for teaching and learning uses intensive analysis of primary literature to improve students' critical-thinking and content integration abilities, as well as their self-rated science attitudes, understanding, and confidence. CREATE also supports maturation of undergraduates' epistemological beliefs about science. This approach, originally tested with upper-level students, has been adapted in Introduction to Scientific Thinking, a new course for freshmen. Results from this course's initial semesters indicate that freshmen in a one-semester introductory course that uses a narrowly focused set of readings to promote development of analytical skills made significant gains in critical-thinking and experimental design abilities. Students also reported significant gains in their ability to think scientifically and understand primary literature. Their perceptions and understanding of science improved, and multiple aspects of their epistemological beliefs about science gained sophistication. The course has no laboratory component, is relatively inexpensive to run, and could be adapted to any area of scientific study.
NASA Technical Reports Server (NTRS)
Hansen, Patricia A.; Hughes, David W.; Hedgeland, Randy J.; Chivatero, Craig J.; Studer, Robert J.; Kostos, Peter J.
1994-01-01
The Scientific Instrument Protective Enclosures were designed for the Hubble Space Telescope Servicing Missions to provide a benign environment for a Scientific Instrument during ground and on-orbit activities. The Scientific Instruments required very stringent surface cleanliness and molecular outgassing levels to maintain ultraviolet performance. Data from the First Servicing Mission verified that both the Scientific Instruments and the Scientific Instrument Protective Enclosures met surface cleanliness level requirements during ground and on-orbit activities.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Creation and Assessment of an Active e-Learning Introductory Geology Course
NASA Astrophysics Data System (ADS)
Sit, Stefany M.; Brudzinski, Michael R.
2017-12-01
The recent emphasis in higher education on both student engagement and online learning encouraged the authors to develop an active e-learning environment for an introductory geohazards course, which enrolls 70+ undergraduate students per semester. Instructors focused on replicating the achievements and addressing the challenges within an already established face-to-face student-centered class (Brudzinski and Sikorski 2010; Sit 2013). Through the use of a learning management system (LMS) and other available technologies, a wide range of course components were developed, including online homework assignments with automatic grading and tailored feedback, video tutorials of software programs like Google Earth and Microsoft Excel, and more realistic scientific investigations using authentic and freely available data downloaded from the internet. The different course components designed to engage students and improve overall student learning and development were evaluated using student surveys and instructor reflection. Each component can be used independently or woven into a face-to-face course. Results suggest that significant opportunities are available in an online environment, including the potential for improved student performance and new datasets for educational research. Specifically, results from pre- and post-semester Geoscience Concept Inventory (GCI) testing in an active e-learning course show enhanced student learning gains compared to face-to-face lecture-based and student-centered courses.
76 FR 29725 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-23
.... Manufacturer: FEI Company, Brno, Czech Republic. Intended Use: The instrument will be used to study polymers... construction; lubricated components in automotives; and electrode materials in lithium ion batteries...
2013-03-07
... and toughness properties; organic and inorganic components from molecular to macro length scales enable mechanically robust materials. Overarching scientific challenges: nanostructured carbon, from 0D fullerenes to 3D.
NASA Astrophysics Data System (ADS)
Cuff, K.; Cannady, M.; Dorph, R.; Rodriguez, V. A.; Romero, V.
2016-12-01
The UC Berkeley East Bay Academy for Young Scientists (EBAYS) program provides youth from non-dominant communities in the East San Francisco Bay Area with unique opportunities to develop deeper understanding of environmental science content, as well as fundamental scientific practice skills. A key component of EBAYS programming is collaborative research projects that generate information useful in addressing critical environmental issues. This important component also provides opportunities for youth to present results of their investigations to other community members and to the scientific community at large. Inclusion of the environmental science research component is intended to help address the following program goals: A) increasing appreciation for the value of scientific practices as a tool for addressing important community-based issues; B) helping raise community awareness of important issues; C) sparking interest in other forms of community activism; D) increasing understanding of key science concepts; and E) generating valuable environmental quality data. In an effort to assess the degree to which EBAYS programming accomplishes these goals, as well as to evaluate its capacity to be effectively replicated on a broader scale, EBAYS staff has engaged in an investigation of associated learning and youth development outcomes. In this regard a research strategy has been developed that includes the use of assessment tools that will help foster a deeper understanding of the ways in which EBAYS programming increases the extent to which participants value the application of science, affects their overall occupational trajectory, and inspires them to consider careers in STEM.
GEE-WIS Anchored Problem Solving Using Real-Time Authentic Water Quality Data
NASA Astrophysics Data System (ADS)
Young, M.; Wlodarczyk, M. S.; Branco, B.; Torgersen, T.
2002-05-01
GEE-WIS scientific problem solving consists of observing, hypothesizing, synthesizing, argument building and reasoning, in the context of analysis, representation, modeling and sense-making of real-time authentic water quality data. Geoscience Environmental Education - Web-accessible Instrumented Systems, or GEE-WIS, an NSF Geoscience Education grant, has established a set of companion websites that stream real-time data from two campus retention ponds for research and use in secondary and undergraduate water quality lessons. We have targeted scientific problem solving skills because of the nature of the GEE-WIS environment, and further because they are central to state and federal efforts to establish science education curriculum standards and are at the core of performance-based testing. We have used a design experiment process to create and test two Anchored Instruction scenario problems. Customization, such as that done through a design process, is acknowledged to be a fundamental component of educational research from an ecological psychology perspective. Our efforts have shared core design elements with other NSF water quality projects. Our method involves the analysis of student written scenario responses for level of scientific problem solving using a qualitative scoring rubric designed from participation in a related NSF project, SCALE (Synergy Communities: Aggregating Learning about Education). Student solutions of GEE-WIS anchor problems from Fall 2001 and Spring 2002 will be summarized. Implications are drawn for those interested in making secondary and higher education geoscience more realistic and more motivating for students through the use of real-time authentic data via the Internet.
Reengineering observatory operations for the time domain
NASA Astrophysics Data System (ADS)
Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.
2014-07-01
Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change increasingly characterized by large time domain surveys, robotic telescopes and automated infrastructures, and - most significantly - by operating modes and scientific consortia that span our individual facilities, joining them into complex network entities. Observatories must adapt, and numerous initiatives are in progress that focus on redesigning individual components of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols, and the astroinformatics data-mining techniques layered on them, are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics face the national and international astronomical communities, it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.
Tobacco industry scientific advisors: serving society or selling cigarettes?
Warner, K E
1991-07-01
According to industry documents, the tobacco industry has executed a "brilliantly conceived" strategy to "creat[e] doubt" in the public's mind about whether cigarette smoking is in fact a serious cause of disease. A component of this strategy has been the funding of scientific research "into the gaps in knowledge in the smoking controversy." Grant review and selection are performed by a group of independent scientists. Knowledgeable observers believe that the existence of this research funding program in general, and the Scientific Advisory Board in particular, is intended by the industry to reinforce doubts in the public mind about the severity of the hazards posed by smoking. Because the Advisory Board has never taken a public stance against the industry's position that the causal relationship between smoking and disease remains unproven, I polled these scientists to determine whether they believed that smoking is a cause of lung cancer. Despite repeated opportunities, only four of 13 board members responded, all affirmatively; two others have expressed their judgment that smoking causes lung cancer in their professional publications. Thus, over half of the Board members, and the Board as a whole, have not gone on record as rejecting the industry's "party line." It might be hoped that the American scientists would follow the lead of the members of a similar body of scientists in Australia who have taken a strong and public stand against the industry position that smoking is not an established cause of disease.
NASA Astrophysics Data System (ADS)
Buccheri, Grazia; Abt Gürber, Nadja; Brühwiler, Christian
2011-01-01
Many countries belonging to the Organisation for Economic Co-operation and Development (OECD) note a shortage of highly qualified scientific-technical personnel, whereas demand for such employees is growing. Therefore, how to motivate (female) high performers in science or mathematics to pursue scientific careers is of special interest. The sample for this study is taken from the Programme for International Student Assessment (PISA) 2006. It comprises 7,819 high performers either in sciences or mathematics from representative countries of four different education systems which generally performed well or around the OECD average in PISA 2006: Switzerland, Finland, Australia, and Korea. The results give evidence that gender specificity and gender inequity in science education are a cross-national problem. Interests in specific science disciplines only partly support vocational choices in scientific-technical fields. Instead, gender and gender stereotypes play a significant role. Enhancing the utility of a scientific vocational choice is expected to soften the gender impact.
Optical Characteristics of the Marshall Space Flight Center Solar Ultraviolet Magnetograph
NASA Technical Reports Server (NTRS)
West, E. A.; Porter, J. G.; Davis, J. M.; Gary, G. A.; Adams, M.; Smith, S.; Hraba, J. F.
2001-01-01
This paper will describe the scientific objectives of the Marshall Space Flight Center (MSFC) Solar Ultraviolet Magnetograph Investigation (SUMI) and the optical components that have been developed to meet those objectives. To test the scientific feasibility of measuring magnetic fields in the UV, a sounding rocket payload is being developed. This paper will discuss: (1) the scientific measurements that will be made by the SUMI sounding rocket program, (2) how the optics have been optimized for simultaneous measurements in two magnetically sensitive lines, C IV (1550 Angstroms) and Mg II (2800 Angstroms), and (3) the optical reflectance, transmission and polarization measurements that have been made on the SUMI telescope mirror and polarimeter.
Science in the liberal arts curriculum - A personal view
NASA Astrophysics Data System (ADS)
Young, A.
1983-12-01
A discussion concerning the character and importance of the epistemological structure of science notes that contemporary textbooks and traditional courses used in the scientific component of the liberal arts curriculum do not communicate that structure. A course, designated 'The Structure of Scientific Thought', is suggested as a vehicle for communicating to nonscientists the fundamental aspects of scientific inquiry, and the shortcomings of traditional textbooks and courses are illustrated by contrast to its contents. Attention is given to such aspects of the structure of science as empiricism, conceptualization, the relationships among science, truth and reality, theoretical hierarchies, the distinction between explanation and understanding, and the centrality of abstraction and mathematical formalism in science.
Hohmann, Erik; Brand, Jefferson C; Rossi, Michael J; Lubowitz, James H
2018-02-01
Our current trend and focus on evidence-based medicine is biased in favor of randomized controlled trials, which are ranked highest in the hierarchy of evidence while devaluing expert opinion, which is ranked lowest in the hierarchy. However, randomized controlled trials have weaknesses as well as strengths, and no research method is flawless. Moreover, stringent application of scientific research techniques, such as the Delphi Panel methodology, allows survey of experts in a high quality and scientific manner. Level V evidence (expert opinion) remains a necessary component in the armamentarium used to determine the answer to a clinical question. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Cubesat Application for Planetary Entry (CAPE) Missions: Micro-Return Capsule (MIRCA)
NASA Technical Reports Server (NTRS)
Esper, Jaime
2016-01-01
The Cubesat Application for Planetary Entry Missions (CAPE) concept describes a high-performing Cubesat system which includes a propulsion module and miniaturized technologies capable of surviving atmospheric entry heating, while reliably transmitting scientific and engineering data. The Micro Return Capsule (MIRCA) is CAPE's first planetary entry probe flight prototype. Within this context, this paper briefly describes CAPE's configuration and typical operational scenario, and summarizes ongoing work on the design and basic aerodynamic characteristics of the prototype MIRCA vehicle. CAPE not only opens the door to new planetary mission capabilities, it also offers relatively low-cost opportunities especially suitable to university participation. In broad terms, CAPE consists of two main functional components: the "service module" (SM), and "CAPE's entry probe" (CEP). The SM contains the subsystems necessary to support vehicle targeting (propulsion, ACS, computer, power) and the communications capability to relay data from the CEP probe to an orbiting "mother-ship". The CEP itself carries the scientific instrumentation capable of measuring atmospheric properties (such as density, temperature, composition), and embedded engineering sensors for Entry, Descent, and Landing (EDL). The first flight of MIRCA was successfully completed on 10 October 2015 as a "piggy-back" payload onboard a NASA stratospheric balloon launched from Ft. Sumner, NM.
Payload topography camera of Chang'e-3
NASA Astrophysics Data System (ADS)
Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie
2015-11-01
Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's "eye" was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and captured images of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.
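Peak signal-to-noise ratio (PSNR) is a common figure of merit for assessing image-compression quality; whether the Chang'e-3 team used PSNR specifically is not stated above, so the sketch below is generic.

```python
import numpy as np

def psnr(original, decoded, max_val=255.0):
    """PSNR in dB between an original image and its compressed/decoded copy."""
    mse = np.mean((original.astype(float) - decoded.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(max_val**2 / mse)

rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)      # stand-in frame
noisy = np.clip(img + rng.normal(0, 2, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(img, noisy):.1f} dB")
```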
Confounding factors and genetic polymorphism in the evaluation of individual steroid profiling
Kuuranne, Tiia; Saugy, Martial; Baume, Norbert
2014-01-01
In the fight against doping, steroid profiling is a powerful tool to detect drug misuse with endogenous anabolic androgenic steroids. To establish sensitive and reliable models, the factors influencing profiling should be recognised. We performed an extensive literature review of the multiple factors that could influence the quantitative levels and ratios of endogenous steroids in urine matrix. For a comprehensive and scientific evaluation of the urinary steroid profile, it is necessary to define the target analytes as well as testosterone metabolism. The two main confounding factors, that is, endogenous and exogenous factors, are detailed to show the complex process of quantifying the steroid profile within WADA-accredited laboratories. Technical aspects are also discussed as they could have a significant impact on the steroid profile, and thus the steroid module of the athlete biological passport (ABP). The different factors impacting the major components of the steroid profile must be understood to ensure scientifically sound interpretation through the Bayesian model of the ABP. Not only should the statistical data be considered but also the experts in the field must be consulted for successful implementation of the steroidal module. PMID:24764553
2012 Global Summit on Regulatory Science (GSRS-2012)--modernizing toxicology.
Miller, Margaret A; Tong, Weida; Fan, Xiaohui; Slikker, William
2013-01-01
Regulatory science encompasses the tools, models, techniques, and studies needed to assess and evaluate product safety, efficacy, quality, and performance. Several recent publications have emphasized the role of regulatory science in improving global health, supporting economic development and fostering innovation. As for other scientific disciplines, research in regulatory science is the critical element underpinning the development and advancement of regulatory science as a modern scientific discipline. As a regulatory agency in the 21st century, the Food and Drug Administration (FDA) has an international component that underpins its domestic mission; foods, drugs, and devices are developed and imported to the United States from across the world. The Global Summit on Regulatory Science, an international conference for discussing innovative technologies, approaches, and partnerships that enhance the translation of basic science into regulatory applications, is providing leadership for the advancement of regulatory sciences within the global context. Held annually, this international conference provides a platform where regulators, policy makers, and bench scientists from various countries can exchange views on how to develop, apply, and implement innovative methodologies into regulatory assessments in their respective countries, as well as developing a harmonized strategy to improve global public health through global collaboration.
Open semantic annotation of scientific publications using DOMEO.
Ciccarese, Paolo; Ocana, Marco; Clark, Tim
2012-04-24
Our group has developed a useful shared software framework for performing, versioning, sharing and viewing Web annotations of a number of kinds, using an open representation model. The Domeo Annotation Tool was developed in tandem with this open model, the Annotation Ontology (AO). Development of both the Annotation Framework and the open model was driven by requirements of several different types of alpha users, including bench scientists and biomedical curators from university research labs, online scientific communities, publishing and pharmaceutical companies. Several use cases were incrementally implemented by the toolkit. These use cases in biomedical communications include personal note-taking, group document annotation, semantic tagging, claim-evidence-context extraction, reagent tagging, and curation of textmining results from entity extraction algorithms. We report on the Domeo user interface here. Domeo has been deployed in beta release as part of the NIH Neuroscience Information Framework (NIF, http://www.neuinfo.org) and is scheduled for production deployment in the NIF's next full release. Future papers will describe other aspects of this work in detail, including Annotation Framework Services and components for integrating with external textmining services, such as the NCBO Annotator web service, and with other textmining applications using the Apache UIMA framework.
Scientific Background for Processing of Aluminum Waste
NASA Astrophysics Data System (ADS)
Kononchuk, Olga; Alekseev, Alexey; Zubkova, Olga; Udovitsky, Vladimir
2017-11-01
The changing sources of raw materials for aluminum production and the emergence of huge quantities of secondary alumina-bearing waste (foundry slag, sludge, spent catalysts, the mineral fraction of coal and other residues formed at various industrial enterprises) require the creation of scientific and theoretical foundations for their processing. In this paper, aluminum alloys (GOST 4784-97) are used as the aluminum-bearing raw material, in the form of chips produced at machine-building enterprises. The aluminum waste comprises a whole range of metallic aluminum alloys containing the elements magnesium, copper, silicon, zinc and iron. Analysis of the aluminum waste Al-Zn-Cu-Si-Fe shows that, depending on the metal content, the dissolution of an aluminum alloy should be treated as the result of the chemical interaction of the metal with an alkaline solution. The behavior of the main alloy components in an alkaline solution must therefore be considered as applied to the system Na2O - Al2O3 - SiO2 - CO2 - H2O.
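As a sketch of the chemical interaction referred to above, the standard textbook net reaction for dissolution of metallic aluminum in caustic solution is given below; it is quoted for illustration, not as a result of the paper. Among the alloying elements, zinc and silicon also dissolve in alkali, while copper and iron largely do not.

```latex
% Net dissolution of metallic Al in caustic soda (standard textbook
% reaction, shown for illustration only):
\[
  2\,\mathrm{Al} + 2\,\mathrm{NaOH} + 6\,\mathrm{H_2O}
  \;\longrightarrow\; 2\,\mathrm{Na[Al(OH)_4]} + 3\,\mathrm{H_2}\uparrow
\]
```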
Spacelab program's scientific benefits to mankind
NASA Technical Reports Server (NTRS)
Craft, H. G. Jr; Marmann, R. A.
1994-01-01
This paper describes the Spacelab program's scientific accomplishments during the past 10 years, highlighting the major ones. An overview of Spacelab systems performance, significant issues, and utilization and operations activities applicable to the space station era is presented.
Scientific Ethics in Chemical Education
NASA Astrophysics Data System (ADS)
Kovac, Jeffrey
1996-10-01
Scientific ethics is a subset of professional ethics, the special rules of conduct adhered to by people engaged in those pursuits called professions. It is distinct from, but consistent with, both ordinary morality and moral theory. The codes of professional ethics derive from the two bargains that define a profession: the internal code of practice and the external bargain between the profession and society. While the informal code of professional conduct is well understood by working scientists, it is rarely explicitly included in the chemistry curriculum. Instead, we have relied on informal methods to teach students scientific ethics, a strategy that is haphazard at best. In this paper I argue that scientific ethics can and must be taught as part of the chemistry curriculum and that this is best done through the case-study method. Many decisions made by working scientists have both a technical and an ethical component. Students need to learn how to make good decisions in professional ethics. The alternative is, at best, sloppy science and, at worst, scientific misconduct.
Scientific knowledge and modern prospecting
Neuerburg, G.J.
1985-01-01
Modern prospecting is the systematic search for specified and generally ill-exposed components of the Earth's crust known as ore. This prospecting depends entirely on reliable, or scientific, knowledge for guidance and for recognition of the search objects. Improvement in prospecting results from additions and refinements to scientific knowledge. Scientific knowledge is an ordered distillation of observations too numerous and too complex in themselves for easy understanding and for effective management. The ordering of these observations is accomplished by an evolutionary hierarchy of abstractions. These abstractions employ simplified descriptions consisting of characterization by selected properties, sampling to represent much larger parts of a phenomenon, generalized mappings of patterns of geometrical and numerical relations among properties, and explanation (theory) of these patterns as functional relations among the selected properties. Each abstraction is predicated on the mode of abstraction anticipated for the next higher level, so that research is a deductive process in which the highest level, theory, is indispensable for the growth and refinement of scientific knowledge, and therefore of prospecting methodology. © 1985 Springer-Verlag.
NASA Astrophysics Data System (ADS)
Chitnork, Amporn; Yuenyong, Chokchai
2018-01-01
The research aimed to enhance Grade 10 Thai students' scientific argumentation in learning about the electric field through a science, technology, and society (STS) approach. The participants were 45 Grade 10 students studying at a school in Nongsonghong, Khon Kaen, Thailand. The methodology followed an interpretive paradigm. The intervention was the electric field unit, developed based on the Yuenyong (2006) STS approach. Students learned with the STS electric field unit for 4 weeks. The students' scientific argumentation was interpreted using Toulmin's argument pattern (TAP), which comprises six components of argumentation: data, claim, warrants, qualifiers, rebuttals and backing. Tools of interpretation included students' activity sheets, conversation, journal writing, classroom observation and interviews. The findings revealed that students initially held patterns of argumentation different from the TAP and then shifted toward it, indicating that the STS electric field unit helped students develop scientific argumentation. This finding may have implications for further enhancing scientific argumentation in Thailand.
Multi-component testing using HZ-PAN and AgZ-PAN Sorbents for OSPREY Model validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garn, Troy G.; Greenhalgh, Mitchell; Lyon, Kevin L.
2015-04-01
In efforts to further develop the capability of the Off-gas SeParation and RecoverY (OSPREY) model, multi-component tests were completed using both HZ-PAN and AgZ-PAN sorbents. The primary purpose of this effort was to obtain multi-component xenon and krypton capacities for comparison to future OSPREY-predicted multi-component capacities using previously acquired Langmuir equilibrium parameters determined from single-component isotherms. Experimental capacities were determined for each sorbent using two feed gas compositions of 1000 ppmv xenon and 150 ppmv krypton in either a helium or air balance. Test temperatures were consistently held at 220 K and the gas flow rate was 50 sccm. Capacities were calculated from breakthrough curves using TableCurve® 2D software by Jandel Scientific. The HZ-PAN sorbent was tested in the custom-designed cryostat while the AgZ-PAN was tested in a newly installed cooling apparatus. Previous modeling validation efforts indicated the OSPREY model can be used to effectively predict single-component xenon and krypton capacities for both engineered form sorbents. Results indicated good agreement between the experimental and predicted capacity values for both krypton and xenon on the sorbents. Overall, the model predicted slightly elevated capacities for both gases, which can be partially attributed to the estimation of the parameters and the uncertainty associated with the experimental measurements. Currently, OSPREY is configured such that one species adsorbs and one does not (i.e. krypton in helium). Modification of the OSPREY code is currently being performed to incorporate multiple adsorbing species and non-ideal interactions of gas phase species with the sorbent and adsorbed phases. Once these modifications are complete, the sorbent capacities determined in the present work will be used to validate OSPREY multi-component adsorption predictions.
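One common way to predict multi-component capacities from single-component Langmuir parameters is the extended Langmuir model; whether OSPREY uses exactly this formulation is not stated above, and the parameter values in the sketch are placeholders rather than the measured Xe/Kr constants.

```python
# Extended Langmuir sketch: predict multi-component loadings from
# single-component Langmuir parameters. One common approach, not
# necessarily OSPREY's exact formulation; all parameter values are
# hypothetical placeholders.
def extended_langmuir(q_max, b, p):
    """q_i = q_max_i * b_i * p_i / (1 + sum_j b_j * p_j) for each species i."""
    denom = 1.0 + sum(b_j * p_j for b_j, p_j in zip(b, p))
    return [qm * bi * pi / denom for qm, bi, pi in zip(q_max, b, p)]

# Species order: (xenon, krypton); partial pressures from the feed:
# 1000 ppmv Xe and 150 ppmv Kr at ~1 atm total pressure.
q_max = [2.5, 1.8]         # mmol/g, hypothetical saturation capacities
b     = [40.0, 6.0]        # 1/atm, hypothetical affinities at 220 K
p     = [1000e-6, 150e-6]  # atm

q_xe, q_kr = extended_langmuir(q_max, b, p)
print(f"Xe: {q_xe:.4f} mmol/g, Kr: {q_kr:.5f} mmol/g")
```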
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, X. H.; Akahori, Takuya; Anderson, C. S.
2015-02-01
Faraday rotation measures (RMs) and more general Faraday structures are key parameters for studying cosmic magnetism and are also sensitive probes of faint ionized thermal gas. A definition of which derived quantities are required for various scientific studies is needed, as well as addressing the challenges in determining Faraday structures. A wide variety of algorithms has been proposed to reconstruct these structures. In preparation for the Polarization Sky Survey of the Universe's Magnetism (POSSUM) to be conducted with the Australian Square Kilometre Array Pathfinder and the ongoing Galactic Arecibo L-band Feeds Array Continuum Transit Survey (GALFACTS), we run a Faraday structure determination data challenge to benchmark the currently available algorithms, including Faraday synthesis (previously called RM synthesis in the literature), wavelet, compressive sampling, and QU-fitting. The input models include sources with one Faraday thin component, two Faraday thin components, and one Faraday thick component. The frequency set is similar to POSSUM/GALFACTS with a 300 MHz bandwidth from 1.1 to 1.4 GHz. We define three figures of merit motivated by the underlying science: (1) an average RM weighted by polarized intensity, RM_wtd, (2) the separation Δϕ of two Faraday components, and (3) the reduced chi-squared χ_r². Based on the current test data with a signal-to-noise ratio of about 32, we find the following. (1) When only one Faraday thin component is present, most methods perform as expected, with occasional failures where two components are incorrectly found. (2) For two Faraday thin components, QU-fitting routines perform the best, with errors close to the theoretical ones for RM_wtd but with significantly higher errors for Δϕ. All other methods, including standard Faraday synthesis, frequently identify only one component when Δϕ is below or near the width of the Faraday point-spread function. (3) No methods as currently implemented work well for Faraday thick components due to the narrow bandwidth. (4) There exist combinations of two Faraday components that produce a large range of acceptable fits and hence large uncertainties in the derived single RMs; in these cases, different RMs lead to the same Q, U behavior, so no method can recover a unique input model. Further exploration of all these issues is required before upcoming surveys will be able to provide reliable results on Faraday structures.
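A minimal Faraday synthesis sketch over a POSSUM/GALFACTS-like band, recovering a single Faraday-thin component and computing an intensity-weighted RM. The weighting here is one simple realization of an RM_wtd-style figure of merit, not necessarily the challenge's exact definition, and the source parameters are hypothetical.

```python
import numpy as np

c = 299792458.0
freqs = np.linspace(1.1e9, 1.4e9, 300)      # 1.1-1.4 GHz, 300 MHz bandwidth
lam2 = (c / freqs) ** 2                     # wavelength squared

# Hypothetical source: one Faraday-thin component at phi0 rad/m^2.
phi0, p0 = 50.0, 1.0
P = p0 * np.exp(2j * phi0 * lam2)           # complex polarization Q + iU

# Faraday synthesis: F(phi) ~ mean over lambda^2 of P * exp(-2i*phi*lam2)
phi = np.linspace(-500, 500, 2001)
F = np.array([np.mean(P * np.exp(-2j * ph * lam2)) for ph in phi])

peak = phi[np.argmax(np.abs(F))]
rm_wtd = np.sum(np.abs(F) * phi) / np.sum(np.abs(F))  # |F|-weighted mean RM
print(f"peak = {peak:.1f}, weighted RM = {rm_wtd:.1f} rad/m^2 (input {phi0})")
```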
Design of a water electrolysis flight experiment
NASA Technical Reports Server (NTRS)
Lee, M. Gene; Grigger, David J.; Thompson, C. Dean; Cusick, Robert J.
1993-01-01
Supply of oxygen (O2) and hydrogen (H2) by electrolyzing water in space will play an important role in meeting the National Aeronautics and Space Administration's (NASA's) needs and goals for future space missions. Both O2 and H2 are envisioned for use in a variety of processes including crew life support, spacecraft propulsion, extravehicular activity, and electrical power generation/storage, as well as in scientific experiments and manufacturing processes. The Electrolysis Performance Improvement Concept Study (EPICS) flight experiment described herein is sponsored by NASA Headquarters as part of the In-Space Technology Experiment Program (IN-STEP). The objective of EPICS is to further contribute to the improvement of SFE technology, specifically by demonstrating and validating the SFE electrochemical process in microgravity as well as investigating performance improvements projected to be possible in a microgravity environment. This paper defines the experiment objective and presents the results of the preliminary design of EPICS. The experiment will include testing three subscale self-contained SFE units: one containing baseline components, and two units having variations in key component materials. Tests will be conducted at varying current and thermal conditions.
A Customizable Importer for the Clinical Data Warehouses PaDaWaN and I2B2.
Fette, Georg; Kaspar, Mathias; Dietrich, Georg; Ertl, Maximilian; Krebs, Jonathan; Stoerk, Stefan; Puppe, Frank
2017-01-01
In recent years, clinical data warehouses (CDW) storing routine patient data have become more and more popular for supporting scientific work in the medical domain. Although CDW systems provide interfaces to import new data, these interfaces have to be used by processing tools that are often not included in the systems themselves. In order to establish an extraction-transformation-load (ETL) workflow, already existing components have to be taken or new components have to be developed to perform the load part of the ETL. We present a customizable importer for the two CDW systems PaDaWaN and I2B2, which is able to import the most common import formats (plain text, CSV and XML files). In order to be run, the importer only needs a configuration file with the user credentials for the target CDW and a list of XML import configuration files, which determine how already exported data is intended to be imported. The importer is provided as a Java program, which has no further software requirements.
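The sketch below illustrates the general pattern of an XML-configured CSV import; the element and attribute names are hypothetical and do not reproduce the actual PaDaWaN/I2B2 importer schema.

```python
import csv, io
import xml.etree.ElementTree as ET

# Hypothetical import-configuration XML; the real importer's schema is
# not reproduced here, only the general column-mapping idea.
CONFIG = """<import format="csv" separator=";">
  <column name="patient_id" target="patient_num"/>
  <column name="nt_probnp"  target="concept:LOINC:33762-6"/>
</import>"""

DATA = "patient_id;nt_probnp\n42;1250\n43;310\n"

cfg = ET.fromstring(CONFIG)
mapping = {c.get("name"): c.get("target") for c in cfg.findall("column")}
sep = cfg.get("separator")

rows = []
for record in csv.DictReader(io.StringIO(DATA), delimiter=sep):
    rows.append({mapping[k]: v for k, v in record.items()})

print(rows)  # rows ready to be pushed through the CDW's import interface
```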
Astronomy through the Skylab scientific airlocks.
NASA Technical Reports Server (NTRS)
Henize, K. G.; Weinberg, J. L.
1973-01-01
Description of Skylab astronomy experiments (other than the Apollo Telescope Mount experiments) designed to study the earth's atmosphere, particles near the spacecraft, various components of the background skylight, the spectra of the sun, and the features of stars, nebulae, and galaxies. Emphasis is placed on the eight experiments that will operate through the scientific airlocks in the Orbital Workshop. The major features of equipment to be used in each experiment are outlined together with characteristics and relevance of information expected in each case.
NASA Technical Reports Server (NTRS)
Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
Software Framework for Peer Data-Management Services
NASA Technical Reports Server (NTRS)
Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy
2007-01-01
Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.
Assessing Scientific Performance.
ERIC Educational Resources Information Center
Weiner, John M.; And Others
1984-01-01
A method for assessing scientific performance based on relationships displayed numerically in published documents is proposed and illustrated using published documents in pediatric oncology for the period 1979-1982. Contributions of a major clinical investigations group, the Childrens Cancer Study Group, are analyzed. Twenty-nine references are…
PKDE4J: Entity and relation extraction for public knowledge discovery.
Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young
2015-10-01
Due to an enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with the Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy as well as the ability to configure text-processing components. We demonstrated its competitive performance by evaluating it on many corpora, finding that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction. Copyright © 2015 Elsevier Inc. All rights reserved.
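In the spirit of PKDE4J's two stages, a toy dictionary-based entity extractor plus one rule-based relation is sketched below; the real system builds on Stanford CoreNLP and configurable pipelines, which this sketch does not attempt to reproduce.

```python
import re

# Minimal dictionary-based entity extraction plus one rule-based relation.
ENTITY_DICT = {"aspirin": "Drug", "thromboxane": "Protein", "cox-1": "Protein"}

def extract_entities(sentence):
    found = []
    for term, etype in ENTITY_DICT.items():
        for m in re.finditer(re.escape(term), sentence, re.IGNORECASE):
            found.append((m.group(0), etype, m.start()))
    return sorted(found, key=lambda e: e[2])

def extract_relations(sentence, entities):
    # Toy rule: "X ... inhibit(s) ... Y" between two recognized entities.
    pairs = []
    for i, (e1, _, p1) in enumerate(entities):
        for e2, _, p2 in entities[i + 1:]:
            if "inhibit" in sentence[p1 + len(e1):p2].lower():
                pairs.append((e1, "inhibits", e2))
    return pairs

s = "Aspirin irreversibly inhibits COX-1 and thromboxane synthesis."
ents = extract_entities(s)
print(ents)
print(extract_relations(s, ents))
```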
Optimal technique for deep breathing exercises after cardiac surgery.
Westerdahl, E
2015-06-01
Cardiac surgery patients often develop a restrictive pulmonary impairment and gas exchange abnormalities in the early postoperative period. Chest physiotherapy is routinely prescribed in order to reduce or prevent these complications. Besides early mobilization, positioning and shoulder girdle exercises, various breathing exercises have been implemented as a major component of postoperative care. A variety of deep breathing maneuvers are recommended to the spontaneously breathing patient to reduce atelectasis and to improve lung function in the early postoperative period. Different breathing exercises are recommended in different parts of the world, and there is no consensus about the most effective breathing technique after cardiac surgery. Arbitrary instructions are given, and recommendations on performance and duration vary between hospitals. Deep breathing exercises are a major part of this therapy, but scientific evidence for their efficacy has been lacking until recently, and there is a lack of trials describing how postoperative breathing exercises actually should be performed. The purpose of this review is to provide a brief overview of postoperative breathing exercises for patients undergoing cardiac surgery via sternotomy, and to discuss and suggest an optimal technique for the performance of deep breathing exercises.
SRTR center-specific reporting tools: Posttransplant outcomes.
Dickinson, D M; Shearon, T H; O'Keefe, J; Wong, H-H; Berg, C L; Rosendale, J D; Delmonico, F L; Webb, R L; Wolfe, R A
2006-01-01
Measuring and monitoring performance--be it waiting list and posttransplant outcomes by a transplant center, or organ donation success by an organ procurement organization and its partnering hospitals--is an important component of ensuring good care for people with end-stage organ failure. Many parties have an interest in examining these outcomes, from patients and their families to payers such as insurance companies or the Centers for Medicare and Medicaid Services; from primary caregivers providing patient counseling to government agencies charged with protecting patients. The Scientific Registry of Transplant Recipients produces regular, public reports on the performance of transplant centers and organ procurement organizations. This article explains the statistical tools used to prepare these reports, with a focus on graft survival and patient survival rates of transplant centers--especially the methods used to fairly and usefully compare outcomes of centers that serve different populations. The article concludes with a practical application of these statistics--their use in screening transplant center performance to identify centers that may need remedial action by the OPTN/UNOS Membership and Professional Standards Committee.
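A generic observed-to-expected sketch of the kind of risk-adjusted comparison described above; SRTR's actual survival models are more elaborate, and the patient data and predicted probabilities below are entirely hypothetical.

```python
import math

# Each tuple: (died_within_1yr, predicted 1-yr death probability from a
# hypothetical case-mix risk model). All values are illustrative only.
patients = [
    (0, 0.05), (1, 0.20), (0, 0.10), (0, 0.08), (1, 0.30), (0, 0.12),
]

observed = sum(died for died, _ in patients)
expected = sum(p for _, p in patients)
ratio = observed / expected  # O/E > 1: worse than the case mix predicts

# Rough 95% CI for the ratio, assuming Poisson-distributed observed events.
se_log = 1 / math.sqrt(observed) if observed else float("inf")
lo, hi = ratio * math.exp(-1.96 * se_log), ratio * math.exp(1.96 * se_log)
print(f"O/E = {ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```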
Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera
NASA Astrophysics Data System (ADS)
Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.
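A generic dark/flat radiometric correction, shown only to make the calibration steps concrete; the coefficients below are assumptions, and the actual LROC WAC pipeline (including its scattered-light and PSF corrections) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
raw   = rng.integers(200, 4000, size=(128, 128)).astype(float)  # raw DN frame
dark  = np.full((128, 128), 60.0)           # dark/bias frame, DN (assumed)
flat  = rng.normal(1.0, 0.02, (128, 128))   # normalized flat field (assumed)
t_exp = 0.02                                # exposure time, s (assumed)
resp  = 1.0e-3                              # hypothetical radiance per DN/s

# Standard correction: subtract dark, divide by flat and exposure,
# then scale by the responsivity to get calibrated radiance.
radiance = (raw - dark) / (flat * t_exp) * resp
print(f"mean calibrated radiance: {radiance.mean():.3f}")
```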
Structurally Integrated, Damage-Tolerant, Thermal Spray Coatings
NASA Astrophysics Data System (ADS)
Vackel, Andrew; Dwivedi, Gopal; Sampath, Sanjay
2015-07-01
Thermal spray coatings are used extensively for the protection and life extension of engineering components exposed to harsh wear and/or corrosion during service in the aerospace, energy, and heavy machinery sectors. Cermet coatings applied via high-velocity thermal spray are used in aggressive wear situations, almost always coupled with corrosive environments. In several instances (e.g., landing gear), coatings are considered part of the structure, requiring system-level considerations. Despite their widespread use, the technology has lacked generalized scientific principles for robust coating design, manufacturing, and performance analysis. Advances in process and in situ diagnostics have provided significant insights into process-structure-property-performance correlations, providing a framework for enhanced design. In this overview, critical aspects of materials, process, parametrics, and performance are discussed through exemplary studies on relevant compositions. The underlying connective theme is understanding and controlling residual stress generation, which not only addresses process dynamics but also provides linkage for the process-property relationship for both the system (e.g., fatigue) and the surface (wear and corrosion). The anisotropic microstructure also invokes the need for damage-tolerant material design to meet future goals.
NASA Astrophysics Data System (ADS)
Myre, Joseph M.
Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special-purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with the means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
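At its simplest, the run-time autotuning component described above can be sketched as a timed search over candidate launch configurations; the brute-force search and parameter names below are illustrative assumptions, not the environment's actual (CUSH-based) implementation.

    import itertools, time

    def autotune(kernel, configs, *args):
        """Return the fastest of the candidate run-time configurations.
        A real autotuner may search far more cleverly than brute force."""
        best_cfg, best_dt = None, float("inf")
        for cfg in configs:
            t0 = time.perf_counter()
            kernel(*args, **cfg)
            dt = time.perf_counter() - t0
            if dt < best_dt:
                best_cfg, best_dt = cfg, dt
        return best_cfg, best_dt

    def kernel(n, block_size=128, work_per_thread=1):
        # Placeholder workload standing in for, e.g., a lattice-Boltzmann step
        sum(i * work_per_thread for i in range(n // block_size))

    configs = [{"block_size": b, "work_per_thread": w}
               for b, w in itertools.product([64, 128, 256], [1, 2, 4])]
    print(autotune(kernel, configs, 1_000_000))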
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an ongoing study that begins the process of determining the common components for a conceptual model of interactive computer-assisted learning specifically designed for low-achieving children. This group of children needs specific learning support that can be used as alternative learning material in their learning environment. To develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis was conducted to determine the most appropriate components, using a set of specific indication classifications to prioritize applicability. The extraction process reveals 17 common components for consideration. Based on scientific justifications, 16 of them were then selected as the proposed components for the model.
Scientific grade CCDs from EG & G Reticon
NASA Technical Reports Server (NTRS)
Cizdziel, Philip J.
1990-01-01
The design and performance of three scientific grade CCDs are summarized: a 1200 x 400 astronomical array of 27 x 27 sq micron pixels, a 512 x 512 scientific array of 27 x 27 sq micron pixels, and a 404 x 64 VNIR array of 52 x 52 sq micron pixels. Each of the arrays is fabricated using a four-phase, double-poly, buried n-channel, multi-pinned phase CCD process. Performance data for each sensor are presented.
Interactive, process-oriented climate modeling with CLIMLAB
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2016-12-01
Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The Jupyter Notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields.
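A minimal sketch of the mix-and-match workflow, assuming climlab's documented process-coupling interface (column_state, radiation.RRTMG, convection.ConvectiveAdjustment, couple); exact argument names may vary between package versions:

    import climlab

    # Couple a radiation scheme with convective adjustment into a
    # single-column radiative-convective model
    state = climlab.column_state(num_lev=30)
    rad = climlab.radiation.RRTMG(state=state)
    conv = climlab.convection.ConvectiveAdjustment(state=state,
                                                   adj_lapse_rate=6.5)
    rcm = climlab.couple([rad, conv], name="RadiativeConvectiveModel")
    rcm.integrate_years(1)          # step forward toward equilibrium
    print(rcm.state["Tatm"][-1])    # near-surface air temperature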
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
Increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure that has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data from multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets in a large-scale international testbed involving several ESGF sites (LLNL, ORNL, and CMCC), one orchestrator site (PSNC), and one more site hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic, and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements, and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
NASA Astrophysics Data System (ADS)
Haguenauer, P.; Fedrigo, E.; Pettazzi, L.; Reinero, C.; Gonte, F.; Pallanca, L.; Frahm, R.; Woillez, J.; Lilley, P.
2016-07-01
The MACAO curvature wavefront sensors were designed as a generic adaptive optics sensor for the Very Large Telescope. Six systems have been manufactured and implemented on sky: four installed in the UT Coudé trains as an AO facility for the VLTI, and two in UT instruments, SINFONI and CRIRES. The MACAO-VLTI systems have now been in scientific operation for more than a decade and are planned to be operated for at least ten more years. As second-generation instruments for the VLTI were planned to begin implementation at the end of 2015, accompanied by a major upgrade of the VLTI infrastructure, we saw it as a good time for a rejuvenation project of these systems, replacing the obsolete components. This obsolescence correction also gave us the opportunity to implement improved capabilities: the correction frequency was pushed from 420 Hz to 1050 Hz, and an automatic vibration compensation algorithm was added. The implementation on the first MACAO was done in October 2014, and the first phase of obsolescence correction was completed in all four MACAO-VLTI systems in October 2015 with the systems delivered back to operation. The resumption of scientific operation of the VLTI on the UTs in November 2015 allowed us to gather statistics to evaluate the performance improvement achieved through this upgrade. A second phase of obsolescence correction has now been started, together with a global reflection on possible further improvements to secure observations with the VLTI.
Handling Metadata in a Neurophysiology Laboratory.
Zehl, Lyuba; Jaillet, Florent; Stoewer, Adrian; Grewe, Jan; Sobolev, Andrey; Wachtler, Thomas; Brochier, Thomas G; Riehle, Alexa; Denker, Michael; Grün, Sonja
2016-01-01
To date, non-reproducibility of neurophysiological research is a matter of intense discussion in the scientific community. A crucial component to enhance reproducibility is to comprehensively collect and store metadata, that is, all information about the experiment, the data, and the applied preprocessing steps on the data, such that they can be accessed and shared in a consistent and simple manner. However, the complexity of experiments, the highly specialized analysis workflows and a lack of knowledge on how to make use of supporting software tools often overburden researchers to perform such a detailed documentation. For this reason, the collected metadata are often incomplete, incomprehensible for outsiders or ambiguous. Based on our research experience in dealing with diverse datasets, we here provide conceptual and technical guidance to overcome the challenges associated with the collection, organization, and storage of metadata in a neurophysiology laboratory. Through the concrete example of managing the metadata of a complex experiment that yields multi-channel recordings from monkeys performing a behavioral motor task, we practically demonstrate the implementation of these approaches and solutions with the intention that they may be generalized to other projects. Moreover, we detail five use cases that demonstrate the resulting benefits of constructing a well-organized metadata collection when processing or analyzing the recorded data, in particular when these are shared between laboratories in a modern scientific collaboration. Finally, we suggest an adaptable workflow to accumulate, structure and store metadata from different sources using, by way of example, the odML metadata framework.
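The suggested workflow can be made concrete with the odML Python library named above; the section and property names here are hypothetical examples, not a prescribed schema, and the exact constructor arguments may differ between odML versions.

    import odml

    # Minimal hierarchical metadata for one recording session
    doc = odml.Document(author="Example Lab", version="1.0")
    session = odml.Section(name="RecordingSession", type="experiment")
    doc.append(session)
    session.append(odml.Property(name="Subject", values=["monkey L"]))
    session.append(odml.Property(name="SamplingRate", values=[30000], unit="Hz"))
    session.append(odml.Property(name="Task", values=["reach-to-grasp"]))

    odml.save(doc, "session_metadata.odml.xml")  # odML's native XML format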
G-jitter Effects on Transport and Pattern Formation
NASA Technical Reports Server (NTRS)
Schatz, Michael F.
2003-01-01
The research performed under this grant has led to a number of new insights into two general categories of fluid flows in the presence of time-dependent acceleration, as outlined briefly below. These results have been widely communicated in the scientific community through seven presentations at international conferences (4 invited, 3 contributed), five published papers (4 journal articles and 1 conference proceeding), and images from the research featured on the cover of all 2003 editions of the research journal Nonlinearity. The work performed under this proposal also contained a substantial educational component, contributing significantly to the scientific training of one postdoctoral associate, one Ph.D. student, and five undergraduate researchers. One main area of focus in this research was convective flow with time-dependent acceleration. Convection is one class of behavior that can arise from g-jitter effects. Our research focused on the Rayleigh-Benard system, which is an important model for understanding thermal convection; studies of this problem in the presence of acceleration modulations provided insight into the nature of g-jitter induced flow and into the effects of modulation and noise on non-equilibrium pattern formation. Our experiments on vertically vibrated Rayleigh-Benard convection demonstrated the existence of two classes of pure flow patterns (synchronous and subharmonic) that had long been predicted by theory but never before observed experimentally. Detailed studies of parameter ranges where both classes of patterns exist simultaneously led to the discovery of a new type of pattern (called superlattices) in systems driven out of thermodynamic equilibrium.
Science on Stage: Engaging and teaching scientific content through performance art
NASA Astrophysics Data System (ADS)
Posner, Esther
2016-04-01
Engaging teaching material through performance art and music can improve the long-term retention of scientific content. Additionally, the development of effective performance skills is a powerful tool for communicating scientific concepts and information to a broader audience, with many positive benefits for career development and the delivery of professional presentations. While arts integration has been shown to increase student engagement and achievement, relevant artistic materials are still required for use as supplemental activities in STEM (science, technology, engineering, mathematics) courses. I will present an original performance poem, "Tectonic Petrameter: A Journey Through Earth History," with instructions for its implementation as a play in pre-university and undergraduate geoscience classrooms. "Tectonic Petrameter" uses a dynamic combination of rhythm and rhyme to teach the geological time scale, fundamental concepts in geology, and important events in Earth history. I propose that using performance arts, such as "Tectonic Petrameter" and other creative art forms, may be an avenue for breaking down barriers related to teaching students and the broader non-scientific community about Earth's long and complex history.
Evaluation of Cache-based Superscalar and Cacheless Vector Architectures for Scientific Computations
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Carter, Jonathan; Shalf, John; Skinner, David; Ethier, Stephane; Biswas, Rupak; Djomehri, Jahed; VanderWijngaart, Rob
2003-01-01
The growing gap between sustained and peak performance for scientific applications has become a well-known problem in high performance computing. The recent development of parallel vector systems offers the potential to bridge this gap for a significant number of computational science codes and deliver a substantial increase in computing capabilities. This paper examines the intranode performance of the NEC SX6 vector processor and the cache-based IBM Power3/4 superscalar architectures across a number of key scientific computing areas. First, we present the performance of a microbenchmark suite that examines a full spectrum of low-level machine characteristics. Next, we study the behavior of the NAS Parallel Benchmarks using some simple optimizations. Finally, we evaluate the performance of several numerical codes from key scientific computing domains. Overall results demonstrate that the SX6 achieves high performance on a large fraction of our application suite and in many cases significantly outperforms the RISC-based architectures. However, certain classes of applications are not easily amenable to vectorization and would likely require extensive reengineering of both algorithm and implementation to utilize the SX6 effectively.
Safe traffic : Vision Zero on the move
DOT National Transportation Integrated Search
2006-03-01
Vision Zero is composed of several basic elements, each of which affects safety in road traffic. These concern ethics, human capability and tolerance, responsibility, scientific facts and a realisation that the different components in the ...
ERIC Educational Resources Information Center
Park, Jongwon; Jang, Kyoung-Ae; Kim, Ikgyun
2009-01-01
Investigation of scientists' actual processes of conducting research can provide us with more realistic aspects of scientific inquiry. This study was performed to identify three aspects of scientists' actual research: their motivations for scientific inquiry, the scientific inquiry skills they used, and the main types of results obtained from…
A visiting scientist program for the burst and transient source experiment
NASA Technical Reports Server (NTRS)
Kerr, Frank J.
1995-01-01
During this project, Universities Space Research Association provided program management and administration for overseeing the performance of the total contractual effort. The program director and administrative staff provided the expertise and experience needed to efficiently manage the program. USRA provided a program coordinator and visiting scientists to perform scientific research with Burst and Transient Source Experiment (BATSE) data. This research was associated with the primary scientific objectives of BATSE and with the various BATSE collaborations which were formed in response to the Compton Gamma Ray Observatory Guest Investigator Program. USRA provided administration for workshops, colloquia, the preparation of scientific documentation, etc., and also provided flexible program support to meet the ongoing needs of MSFC's BATSE program. USRA performed tasks associated with the recovery, archiving, and processing of scientific data from BATSE. A bibliography of research in the astrophysics discipline is attached as Appendix 1. Visiting Scientists and Research Associates performed activities on this project, and their technical reports are attached as Appendix 2.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...
2015-07-14
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
Bölling, Christian; Weidlich, Michael; Holzhütter, Hermann-Georg
2014-01-01
Accounts of evidence are vital to evaluate and reproduce scientific findings and integrate data on an informed basis. Currently, such accounts are often inadequate, unstandardized and inaccessible for computational knowledge engineering even though computational technologies, among them those of the semantic web, are ever more employed to represent, disseminate and integrate biomedical data and knowledge. We present SEE (Semantic EvidencE), an RDF/OWL based approach for detailed representation of evidence in terms of the argumentative structure of the supporting background for claims, even in complex settings. We derive design principles and identify minimal components for the representation of evidence. We specify the Reasoning and Discourse Ontology (RDO), an OWL representation of the model of scientific claims, their subjects, their provenance and their argumentative relations underlying the SEE approach. We demonstrate the application of SEE and illustrate its design patterns in a case study by providing an expressive account of the evidence for certain claims regarding the isolation of the enzyme glutamine synthetase. SEE is suited to provide coherent and computationally accessible representations of evidence-related information such as the materials, methods, assumptions, reasoning and information sources used to establish a scientific finding by adopting a consistently claim-based perspective on scientific results and their evidence. SEE allows for extensible evidence representations, in which the level of detail can be adjusted and which can be extended as needed. It supports representation of arbitrarily many consecutive layers of interpretation and attribution and different evaluations of the same data. SEE and its underlying model could be a valuable component in a variety of use cases that require careful representation or examination of evidence for data presented on the semantic web or in other formats.
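A flavor of the claim-based representation can be given with rdflib; the namespace and property names below merely stand in for the published RDO terms and are assumptions for illustration only.

    from rdflib import Graph, Literal, Namespace, RDF

    RDO = Namespace("http://example.org/rdo#")  # placeholder for the RDO namespace
    EX = Namespace("http://example.org/evidence#")

    g = Graph()
    g.add((EX.claim1, RDF.type, RDO.Claim))
    g.add((EX.claim1, RDO.states,
           Literal("Glutamine synthetase was isolated from brain tissue")))
    g.add((EX.assay1, RDF.type, RDO.Observation))
    g.add((EX.claim1, RDO.supportedBy, EX.assay1))  # argumentative relation
    print(g.serialize(format="turtle"))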
Confronting uncertainty in wildlife management: performance of grizzly bear management.
Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T
2013-01-01
Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
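The target-setting method can be illustrated with a generic Monte Carlo sketch: sample the uncertain population size and allowable mortality rate, then choose the largest target that keeps the probability of overmortality below the stated threshold. The distributions and parameter values below are illustrative, not the authors' fitted estimates.

    import numpy as np

    rng = np.random.default_rng(1)

    def safe_target(pop_mean, pop_sd, rate_mean, rate_sd,
                    threshold=0.05, n=100_000):
        """Largest mortality target with P(target > true limit) < threshold."""
        pop = rng.normal(pop_mean, pop_sd, n).clip(min=1.0)
        limit = pop * rng.normal(rate_mean, rate_sd, n).clip(min=0.0)
        # The threshold-quantile of the limit distribution bounds the risk
        return int(np.quantile(limit, threshold))

    # Illustrative numbers only: population 250 +/- 50, allowable rate 6% +/- 2%
    print(safe_target(250, 50, 0.06, 0.02, threshold=0.05))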
Field: a new meta-authoring platform for data-intensive scientific visualization
NASA Astrophysics Data System (ADS)
Downie, M.; Ameres, E.; Fox, P. A.; Goebel, J.; Graves, A.; Hendler, J.
2012-12-01
This presentation will demonstrate a new platform for data-intensive scientific visualization, called Field, that rethinks the problem of visual data exploration. Several new opportunities for scientific visualization present themselves at this moment in time. We believe that, taken together, they may catalyze a transformation of the practice of science and begin to seed a technical culture within science that fuses data analysis, programming, and myriad visual strategies. The principal challenges exist at integrative levels, for many fundamental technical components of our field are now well understood and widely available. File formats from CSV through HDF all have broad library support; low-level high-performance graphics APIs (OpenGL) are in a period of stable growth; and a dizzying ecosystem of analysis and machine learning libraries abounds. The hardware of computer graphics offers unprecedented computing power within commodity components; programming languages and platforms are coalescing around a core set of umbrella runtimes. Each of these trends is set to continue: computer graphics hardware is developing at a super-Moore-law rate, and trends in publication and dissemination point only towards an increasing amount of access to code and data. The critical opportunity here for scientific visualization is, we maintain, not in developing a new statistical library, nor a new tool centered on a particular technique, but rather a new visual, "live" programming environment that is promiscuous in its scope. We can identify the necessary methodological practice and traditions here not in science or engineering but in the "live-coding" practices prevalent in the fields of digital art and design. We can define this practice as an approach to programming that is live, iterative, integrative, speculative, and exploratory. "Live" because it is exclusively practiced in real time (often during performance); "iterative" because intermediate programs and their visual results are constantly being made and remade en route; "speculative" because these programs and images result from a mode of inquiry into image-making not unlike that of hypothesis formation and testing; "integrative" because this style draws deeply upon the libraries of algorithms and materials available online today; and "exploratory" because the results of these speculations are inherently open to the data and unforeseen at the outset. To this end our development environment, Field, comprises a minimal core and a powerful plug-in system that can be extended from within the environment itself. By providing a hybrid text editor that can incorporate text-based programming together with graphical user-interface elements, its flexible and extensible interface provides space as necessary for notation, visualization, interface construction, and introspection. In addition, it provides an advanced GPU-accelerated graphics system ideal for large-scale data visualization. Since Field was created in the context of widely divergent interdisciplinary projects, its aim is to give its users not only the ability to work rapidly, but also to shape their Field environment extensively and flexibly for their own demands.
NASA Astrophysics Data System (ADS)
Michalsky, Tova
2013-07-01
This study investigated the effectiveness of cognitive-metacognitive versus motivational components of the IMPROVE self-regulatory model, used while reading scientific texts, for 10th graders' scientific literacy and self-regulated learning (SRL). Three treatment groups (N = 198) received one type of self-addressed questions while reading scientific texts: cognitive-metacognitive (CogMet), motivational (Mot), or combined (CogMetMot). The control group received no self-addressed questions (noSRL). One measure assessed scientific literacy, and two measures assessed SRL: (a) as an aptitude, via pre/post questionnaires assessing self-perceived SRL, and (b) as an event, via audiotaping of participants' thinking-aloud SRL behaviors in real-time learning experiences, with data coding illustrating SRL changes. Findings indicated that the treatment groups significantly outperformed the non-treatment group. No differences emerged between CogMet and Mot, whereas fully combined SRL support (CogMetMot) was most effective. Theoretical and practical implications of this preliminary study are discussed.
Recent Evolutions of the GEOSCOPE Broadband Seismic Observatory
NASA Astrophysics Data System (ADS)
Stutzmann, E.; Vallee, M.; Zigone, D.; Bonaime, S.; Thore, J. Y.; Pesqueira, F.; Pardo, C.; Bernard, A.; Maggi, A.; Vincent, D.; Sayadi, J.
2017-12-01
The GEOSCOPE observatory provides 36 years of continuous broadband data to the scientific community. The 32 operational GEOSCOPE stations are installed in 17 countries, across all continents and on islands throughout the oceans. They are equipped with three-component very broadband seismometers (STS1 or STS2) and 24- or 26-bit digitizers (Q330HR). Seismometers are installed with warpless base plates, which decrease long-period noise on horizontal components by up to 15 dB. All stations send data in real time to the IPGP data center, and the data are automatically transmitted to other data centers (IRIS-DMC and RESIF) and tsunami warning centers. Recent improvements include a new station in Wallis and Futuna (FUTU, South-Western Pacific Ocean) and the re-installation of WUS station in Western China. Station data are technically validated by IPGP (25 stations) or EOST (6 stations) in order to check their continuity and integrity. Scientific data validation is also performed by analyzing the seismic noise level of the continuous data and by comparing real and synthetic earthquake waveforms (body waves). After these validations, data are archived by the IPGP data center in Paris. They are made available to the international scientific community through different interfaces (see details on http://geoscope.ipgp.fr). All GEOSCOPE data are in miniSEED format but use various conventions; significant technical work is underway to homogenize the miniSEED formats of the whole GEOSCOPE database in order to ease data duplication at the IRIS-DMC and RESIF data centers. The GEOSCOPE observatory also provides near-real-time information on large global seismicity (above magnitude 5.5-6) through the automated use of the SCARDEC method. Earthquake parameters (depth, moment magnitude, focal mechanism, source time function) are determined about 45 minutes after the occurrence of the event. A specific webpage is then generated, which also includes information for a non-seismologist audience (past seismicity, foreshocks and aftershocks, 3D representations of the fault motion…). This information is also disseminated in real time through mailing lists and social networks. Examples for recent earthquakes can be seen at http://geoscope.ipgp.fr/index.php/en/data/earthquake-data/latest-earthquakes.
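Since the data are served through FDSN web services, a few lines of ObsPy suffice to retrieve GEOSCOPE (network code G) waveforms; the station, location, and channel codes below are examples that may need adjusting to currently operating stations.

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IPGP")  # FDSN node serving GEOSCOPE data
    t0 = UTCDateTime("2017-01-01T00:00:00")
    st = client.get_waveforms(network="G", station="SSB", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
    st.plot()  # one hour of vertical broadband data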
Crossing the chasm: how to develop weather and climate models for next generation computers?
NASA Astrophysics Data System (ADS)
Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon
2018-05-01
Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
Recent evolutions of the GEOSCOPE broadband seismic observatory
NASA Astrophysics Data System (ADS)
Vallee, M.; Leroy, N.; Bonaime, S.; Zigone, D.; Stutzmann, E.; Thore, J. Y.; Pardo, C.; Bernard, A.; Pesqueira, F.; Maggi, A.; Vincent, D.
2016-12-01
The GEOSCOPE observatory provides 34 years of continuous broadband data to the scientific community. The 31 operational GEOSCOPE stations are installed in 17 countries, across all continents and on islands throughout the oceans. They are equipped with three-component very broadband seismometers (STS1 or STS2) and 24- or 26-bit digitizers (Q330HR). Seismometers are installed with warpless base plates, which decrease long-period noise on horizontal components by up to 15 dB. All stations send data in real time to the GEOSCOPE data center, and the data are automatically transmitted to other data centers (IRIS-DMC and RESIF) and tsunami warning centers. In 2016, a new station was installed in Wallis and Futuna (FUTU, South-Western Pacific Ocean), and final work is underway to reinstall WUS station in Western China. Station data are technically validated by IPGP (25 stations) or EOST (6 stations) in order to check their continuity and integrity. Scientific data validation is also performed by analyzing the seismic noise level of the continuous data and by comparing real and synthetic earthquake waveforms (body waves). After these validations, data are archived by the GEOSCOPE data center in Paris. They are made available to the international scientific community through different interfaces (see details on http://geoscope.ipgp.fr). Important technical work is now underway to homogenize the data formats of the whole GEOSCOPE database in order to ease data duplication at the IRIS-DMC and RESIF data centers. The GEOSCOPE broadband seismic observatory also provides near-real-time information on large global seismicity (above magnitude 5.5-6) through the automated application of the SCARDEC method. By using global data from the FDSN, in particular from GEOSCOPE and IRIS/USGS stations, earthquake source parameters (depth, moment magnitude, focal mechanism, source time function) are determined about 45 minutes after the occurrence of the event. A specific webpage is then generated for each earthquake, which also includes information for a non-seismologist audience (past seismicity, foreshocks and aftershocks, 3D representations of the fault motion…). Examples for recent earthquakes can be seen at http://geoscope.ipgp.fr/index.php/en/data/earthquake-data/latest-earthquakes
When biological scientists become health-care workers: emotional labour in embryology.
Fitzgerald, R P; Legge, M; Frank, N
2013-05-01
Can biological scientists working in medically assisted reproduction (MAR) have a role as health-care workers and, if so, how do they engage in the emotional labour commonly associated with health-care work? The scientists at Fertility Associates (FA) in New Zealand perform the technical and emotional care associated with health-care work in an occupationally specific manner, which we refer to as a hybrid care style. Their emotional labour consists of managing difficult patients, 'talking up' bad news, finding strategies to sustain hope and meaning, and 'clicking' or 'not clicking' with individual patients. Effective emotional labour is a key component of patient-centred care and is as important to the experience of high-quality MAR as excellent clinical and scientific technique. This is a qualitative study based on open-ended interviews and ethnographic observations with 14 staff in 2 laboratories, conducted over 2 separate periods of 3 weeks' duration in 2007. Analysis of fieldnotes and interviews was conducted using thematic analysis and an NVivo qualitative database and compared for consistency across each interviewer. The participants were consenting biological scientists working in one of the two laboratories. Semi-structured interviews were conducted in 'quiet' work times, and supervised access was allowed to all parts of the laboratories and meeting places. Opportunities for participant review of results and cross comparison of independent analysis by authors increase the faithfulness of fit of this account to laboratory life. The study suggests that emotional labour is a part of routinized scientific labour in MAR laboratories for FA. This is a qualitative study and thus the findings are not generalizable to populations beyond the study participants. While little has been published on the emotional component of scientists' working lives, there may be a New Zealand style of doing scientific work in MAR laboratories which is patient centred and which incorporates much higher patient contact and involvement than is experienced in other laboratories. This study was funded by a research grant from the University of Otago and was also partly funded by a Marsden Grant administered by the Royal Society of New Zealand.
Architectural frameworks: defining the structures for implementing learning health systems.
Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes
2017-06-23
The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions (goals, scientific, social, technical, and ethical) commonly found in the LHS literature. The proposed architectural framework comprises six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making. In this paper, we outline a high-level architectural framework grounded in conceptual and empirical LHS literature. Applying this architectural framework can guide the development and implementation of new LHSs and the evolution of existing ones, as it allows for clear and critical understanding of the types of decisions that underlie LHS operations. Further research is required to assess and refine its generalizability and methods.
Warp-X: A new exascale computing platform for beam–plasma simulations
Vay, J. -L.; Almgren, A.; Bell, J.; ...
2018-01-31
Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code, in the new WarpX software. Lastly, the code structure, status, early examples of applications and plans are discussed.
Low-cost laser speckle contrast imaging of blood flow using a webcam.
Richards, Lisa M; Kazmi, S M Shams; Davis, Janel L; Olin, Katherine E; Dunn, Andrew K
2013-01-01
Laser speckle contrast imaging has become a widely used tool for dynamic imaging of blood flow, both in animal models and in the clinic. Typically, laser speckle contrast imaging is performed using scientific-grade instrumentation. However, due to recent advances in camera technology, these expensive components may not be necessary to produce accurate images. In this paper, we demonstrate that a consumer-grade webcam can be used to visualize changes in flow, both in a microfluidic flow phantom and in vivo in a mouse model. A two-camera setup was used to simultaneously image with a high performance monochrome CCD camera and the webcam for direct comparison. The webcam was also tested with inexpensive aspheric lenses and a laser pointer for a complete low-cost, compact setup ($90, 5.6 cm length, 25 g). The CCD and webcam showed excellent agreement with the two-camera setup, and the inexpensive setup was used to image dynamic blood flow changes before and after a targeted cerebral occlusion.
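The underlying computation is the same whichever camera supplies the frames: the spatial speckle contrast K = sigma/mean over a small sliding window. A minimal sketch follows; the window size and the synthetic test frame are arbitrary illustrative choices.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast(frame, window=7):
        """Spatial speckle contrast K = sigma/mean per sliding window;
        lower K corresponds to faster flow."""
        f = frame.astype(np.float64)
        mean = uniform_filter(f, window)
        mean_sq = uniform_filter(f**2, window)
        var = np.clip(mean_sq - mean**2, 0.0, None)
        return np.sqrt(var) / np.clip(mean, 1e-9, None)

    # Works identically on webcam frames (e.g., grabbed via OpenCV)
    frame = np.random.poisson(100, size=(480, 640))
    K = speckle_contrast(frame)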