Sample records for optimization for high-end scientific

  1. Cerebral Laterality and Handedness in Aviation: Performance and Selection Implications

    DTIC Science & Technology

    1989-01-01

    population; orangutans, rhesus monkeys, and mice demonstrated this seemingly random pattern (253). Chimpanzees have recently been tested for...higher right Sylvian point in the brains of chimpanzees and orangutans (as in humans) (144), a larger right frontal lobe in the baboon (34), and the

  2. Chemical Communications

    DTIC Science & Technology

    2012-10-26

    the need for alignment. We have also demonstrated the use of this technique with various materials as masks for silk biopolymer RIE processing and a...project. The automation of silk solution was developed. Examination of different processing conditions for the raw material showed promise for...higher durability and higher flexibility optical substrates. Progress on interfaces was solidified. The previous findings on silk-metal interfaces

  3. The Effect of Back Pressure on the Operation of a Diesel Engine

    DTIC Science & Technology

    2011-02-01

    increased back pressure on a turbocharged diesel engine. Steady state and varying back pressure are considered. The results show that high back...a turbocharged diesel engine using the Ricardo Wave engine modelling software, to gain understanding of the problem and provide a good base for...higher pressure. The pressure ratios across the turbocharger compressor and turbine decrease, reducing the mass flow of air through these components

  4. The Effect of Back Pressure on the Operation of a Diesel Engine

    DTIC Science & Technology

    2011-02-01

    increased back pressure on a turbocharged diesel engine. Steady state and varying back pressure are considered. The results show that high back...a turbocharged diesel engine using the Ricardo Wave engine modelling software, to gain understanding of the problem and provide a good base for...higher pressure. The pressure ratios across the turbocharger compressor and turbine decrease, reducing the mass flow of air through these components

  5. Computational Sensing and in vitro Classification of GMOs and Biomolecular Events

    DTIC Science & Technology

    2008-12-01

    COMPUTATIONAL SENSING AND IN VITRO CLASSIFICATION OF GMOs AND BIOMOLECULAR EVENTS Elebeoba May, Miler T. Lee, Patricia Dolan, Paul Crozier...modified organisms (GMOs) in the presence of non-lethal agents. Using an information and coding-theoretic framework we develop a de novo method for...high-throughput screening, distinguishing genetically modified organisms (GMOs), molecular computing, differentiating biological markers

  6. Interfacial Engineering for Low-Density Graphene Nanocomposites

    DTIC Science & Technology

    2014-07-23

    structure of polydimethylsiloxane (PDMS) to contain pyrene pendant groups such that it would non-covalently bind to graphene. This would allow for...high graphene loadings and conductive strain-sensitivity in PDMS. SEM images of these composites are shown here. The high level of dispersion...allowed for a pristine graphene composite conductivity of 220 S/m; this is after using a membrane to induce separation between graphene-bound PDMS

  7. Boko Haram: Developing New Strategies to Combat Terrorism in Nigeria

    DTIC Science & Technology

    2013-03-01

    by the Commission on Higher Education of the Middle States Association of Colleges and Schools, 3624 Market Street, Philadelphia, PA 19104, (215) 662...5606. The Commission on Higher Education is an institutional accrediting agency recognized by the U.S. Secretary of Education and the Council for...Higher Education Accreditation.

  8. Dayton Aircraft Cabin Fire Model, Version 3, Volume I. Physical Description.

    DTIC Science & Technology

    1982-06-01

    contact to any surface directly above a burning element, provided that the current flame length makes contact possible. For fires originating on the...no extension of the flames horizontally beneath the surface is considered. The equation for computing the flame length is presented in Section 5. For...high as 0.3. The values chosen for DACFIR3 are 0.15 for Ec and 0.10 for Ep. The Steward model is also used to compute flame length, hf, for the fire

  9. ROSE Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinlan, D.; Yi, Q.; Vuduc, R.

    2005-02-17

    ROSE is an object-oriented software infrastructure for source-to-source translation that provides an interface for programmers to write their own specialized translators for optimizing scientific applications. ROSE is a part of current research on telescoping languages, which provides optimizations of the use of libraries in scientific applications. ROSE defines approaches to extend the optimization techniques, common in well defined languages, to the optimization of scientific applications using well defined libraries. ROSE includes a rich set of tools for generating customized transformations to support optimization of applications codes. We currently support full C and C++ (including template instantiation etc.), with Fortran 90 support under development as part of a collaboration and contract with Rice to use their version of the open source Open64 F90 front-end. ROSE represents an attempt to define an open compiler infrastructure to handle the full complexity of full scale DOE applications codes using the languages common to scientific computing within DOE. We expect that such an infrastructure will also be useful for the development of numerous tools that may then realistically expect to work on DOE full scale applications.
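
    As a toy illustration of the source-to-source idea (ROSE itself is a C++ infrastructure; this sketch uses Python's standard ast module instead, and the rewrite rule is invented, not one of ROSE's transformations), the translator below replaces calls of the form math.pow(x, 2) with x * x, a library-level strength reduction:

    ```python
    import ast

    class PowToMul(ast.NodeTransformer):
        """Rewrite math.pow(x, 2) into x * x -- a toy library-aware optimization."""
        def visit_Call(self, node):
            self.generic_visit(node)
            if (isinstance(node.func, ast.Attribute)
                    and isinstance(node.func.value, ast.Name)
                    and node.func.value.id == "math"
                    and node.func.attr == "pow"
                    and len(node.args) == 2
                    and isinstance(node.args[1], ast.Constant)
                    and node.args[1].value == 2):
                return ast.BinOp(left=node.args[0], op=ast.Mult(), right=node.args[0])
            return node

    source = "import math\ny = math.pow(x, 2) + math.pow(z, 3)\n"
    tree = ast.fix_missing_locations(PowToMul().visit(ast.parse(source)))
    print(ast.unparse(tree))  # y = x * x + math.pow(z, 3)
    ```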

  10. Visualization for Hyper-Heuristics. Front-End Graphical User Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroenung, Lauren

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. While such automated design has great advantages, it can often be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues of usability by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics to support practitioners, as well as scientific visualization of the produced automated designs. My contributions to this project are exhibited in the user-facing portion of the developed system and the detailed scientific visualizations created from back-end data.

  11. Magnetic bead purification of labeled DNA fragments for high-throughput capillary electrophoresis sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elkin, Christopher; Kapur, Hitesh; Smith, Troy

    2001-09-15

    We have developed an automated purification method for terminator sequencing products based on a magnetic bead technology. This 384-well protocol generates labeled DNA fragments that are essentially free of contaminants for less than $0.005 per reaction. In comparison to laborious ethanol precipitation protocols, this method increases the phred20 read length by forty bases with various DNA templates such as PCR fragments, plasmids, cosmids and RCA products. Our method eliminates centrifugation and is compatible with both the MegaBACE 1000 and ABI Prism 3700 capillary instruments. As of September 2001, this method has produced over 1.6 million samples with 93 percent averaging 620 phred20 bases as part of the Joint Genome Institute's production process.

  12. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    NASA Technical Reports Server (NTRS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other mineral resources for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  13. The X-IFU end-to-end simulations performed for the TES array optimization exercise

    NASA Astrophysics Data System (ADS)

    Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.

    2015-09-01

    The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as the baseline an array of ~4000 single size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could however be considered, combining TES of different properties (e.g. size). In attempting to improve the X-IFU performance in terms of field of view, count rate performance, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated for the results to be scientifically assessed and compared. In this contribution, we will describe the simulation set-up for the various array configurations, and highlight some of the results of the test cases simulated.

  14. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    PubMed Central

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) is presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239
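
    The abstract quotes makespan and response-time improvements without reproducing the algorithm; as a hedged sketch of the metric being optimized, the following compares the makespan of a random task-to-VM assignment against a simple greedy (MinMin-like) baseline. All task lengths, VM speeds, and names are invented:

    ```python
    import random

    def makespan(assignment, task_len, vm_speed):
        """Makespan = completion time of the busiest VM."""
        load = [0.0] * len(vm_speed)
        for task, vm in enumerate(assignment):
            load[vm] += task_len[task] / vm_speed[vm]
        return max(load)

    def greedy(task_len, vm_speed):
        """Assign each task to the VM that would finish it earliest."""
        load = [0.0] * len(vm_speed)
        assignment = []
        for length in task_len:
            vm = min(range(len(vm_speed)), key=lambda v: load[v] + length / vm_speed[v])
            load[vm] += length / vm_speed[vm]
            assignment.append(vm)
        return assignment

    random.seed(1)
    tasks = [random.uniform(1, 100) for _ in range(50)]
    speeds = [1.0, 1.5, 2.0, 3.0]
    rand_assign = [random.randrange(len(speeds)) for _ in tasks]
    print(makespan(rand_assign, tasks, speeds))          # random schedule
    print(makespan(greedy(tasks, speeds), tasks, speeds))  # greedy is far shorter
    ```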

  15. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    PubMed

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) is presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement in makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.

  16. Visualization for Hyper-Heuristics: Back-End Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Luke

    Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet, general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario, finding the solution significantly faster than a general-purpose approach. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization for the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms that would allow for parsing of the data produced by the hyper-heuristics. This data would then be sent to the front-end, where it would be displayed to the end user.

  17. A Review of Autologous Stem Cell Transplantation in Lymphoma.

    PubMed

    Zahid, Umar; Akbar, Faisal; Amaraneni, Akshay; Husnain, Muhammad; Chan, Onyee; Riaz, Irbaz Bin; McBride, Ali; Iftikhar, Ahmad; Anwer, Faiz

    2017-06-01

    Chemotherapy remains the first-line therapy for aggressive lymphomas. However, 20-30% of patients with non-Hodgkin lymphoma (NHL) and 15% with Hodgkin lymphoma (HL) recur after initial therapy. We want to explore the role of high-dose chemotherapy (HDT) and autologous stem cell transplant (ASCT) for these patients. There is some utility of upfront consolidation for high-risk/high-grade B-cell lymphoma, mantle cell lymphoma, and T-cell lymphoma, but there is no role for similar intervention in HL. New conditioning regimens are being investigated which have demonstrated an improved safety profile without compromising the myeloablative efficiency for relapsed or refractory HL. Salvage chemotherapy followed by HDT and rescue autologous stem cell transplant remains the standard of care for relapsed/refractory lymphoma. The role of novel agents to improve disease-related parameters remains to be elucidated in frontline induction, disease salvage, and high-dose consolidation or in the maintenance setting.

  18. Data Transfer Advisor with Transport Profiling Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Liu, Qiang; Yun, Daqing

    The network infrastructures have been rapidly upgraded in many high-performance networks (HPNs). However, such infrastructure investment has not led to corresponding performance improvements in big data transfer, especially at the application layer, largely due to the complexity of optimizing transport control on end hosts. We design and implement ProbData, a PRofiling Optimization Based DAta Transfer Advisor, to help users determine the most effective data transfer method with the most appropriate control parameter values to achieve the best data transfer performance. ProbData employs a profiling-optimization-based approach to exploit the optimal operational zone of various data transfer methods in support of big data transfer in extreme-scale scientific applications. We present a theoretical framework of the optimized profiling approach employed in ProbData as well as its detailed design and implementation. The advising procedure and performance benefits of ProbData are illustrated and evaluated by proof-of-concept experiments in real-life networks.
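
    ProbData's actual advising procedure is defined in the report itself; the sketch below only shows the general shape of profiling-based advising, namely measuring throughput over a small grid of transfer methods and control parameters and recommending the best combination. The measure_throughput function is a stand-in for a real timed transfer:

    ```python
    import itertools

    def measure_throughput(method, streams, buffer_kb):
        """Placeholder for a real timed transfer run; returns Gb/s.
        A real profiler would execute the method and time the transfer."""
        peak = {"udt": 8.0, "gridftp": 9.5}[method]   # invented peak rates
        return peak * min(streams, 8) / 8 * min(buffer_kb, 1024) / 1024

    # profile a small grid of (method, parallel streams, buffer size) settings
    grid = itertools.product(["udt", "gridftp"], [1, 2, 4, 8, 16], [256, 512, 1024])
    best = max(grid, key=lambda cfg: measure_throughput(*cfg))
    print("advised configuration:", best)
    ```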

  19. Aggregate assessment and durability evaluation of optimized graded concrete in the state of Oklahoma

    NASA Astrophysics Data System (ADS)

    Ghaeezadeh, Ashkan

    This research is part of a larger project that emphasizes creating a more scientific approach to designing concrete mixtures for concrete pavements that use less cement and more aggregate, which is called optimized graded concrete. The most challenging obstacle in optimized mixtures is reaching enough workability that one does not have to add more cement or superplasticizer to achieve the desired level of flowability. Aggregate gradation and characteristics have been found to be very important to the workability of optimized graded concrete. In this research, a new automated method of aggregate assessment was used to compare the shape and surface of different aggregates as well as their influence on concrete flowability. Finally, the performance of optimized graded concrete under drying shrinkage and freezing-and-thawing conditions was investigated.

  20. End-User Applications of Real-Time Earthquake Information in Europe

    NASA Astrophysics Data System (ADS)

    Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team

    2011-12-01

    The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational, real-world problems.

  1. Prototyping a 10 Gigabit-Ethernet Event-Builder for the CTA Camera Server

    NASA Astrophysics Data System (ADS)

    Hoffmann, Dirk; Houles, Julien

    2012-12-01

    While the Cherenkov Telescope Array will end its Preparatory Phase in 2012 or 2013 with the publication of a Technical Design Report, our lab has undertaken, within the French CTA community, the design and prototyping of a Camera Server: a PC-architecture-based computer used as a switchboard, assigned to each of about a hundred telescopes to handle the maximum amount of scientific data recorded by that telescope. Our work aims at a data acquisition hardware and software system for the scientific raw data at optimal speed. We have evaluated the maximum performance that can be obtained by choosing standard (COTS) hardware and software (Linux) in conjunction with a 10 Gb/s switch.

  2. Priority design parameters of industrialized optical fiber sensors in civil engineering

    NASA Astrophysics Data System (ADS)

    Wang, Huaping; Jiang, Lizhong; Xiang, Ping

    2018-03-01

    Considering the mechanical effects and the different paths for transferring deformation, optical fiber sensors commonly used in civil engineering have been systematically classified. Based on strain transfer theory, the relationship between the strain transfer coefficient and the allowable testing error is established. The proposed relationship is regarded as the optimal control equation for obtaining the optimal values of sensor parameters that satisfy the required measurement precision. Furthermore, specific optimization design methods and priority design parameters of the classified sensors are presented. This research indicates that (1) the strain transfer theory-based optimization design method is well suited for sensors that depend on interfacial shear stress to transfer deformation; (2) the priority design parameters are bonded (sensing) length, interfacial bond strength, elastic modulus and radius of the protective layer, and thickness of the adhesive layer; (3) the optimization design of sensors with two anchor pieces at the two ends is independent of strain transfer theory, as the strain transfer coefficient can be conveniently calibrated by test, and this kind of sensor has no obvious priority design parameters. An improved calibration test is put forward to enhance the accuracy of the calibration coefficient of end-expanding sensors. By considering the practical state of sensors and the testing accuracy, comprehensive and systematic analyses of optical fiber sensors are provided from the perspective of mechanical actions, which can scientifically guide the application design and calibration testing of industrialized optical fiber sensors.
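
    The abstract does not reproduce the optimal control equation; as a hedged sketch, the code below uses a classical shear-lag expression for the average strain transfer rate over a bonded length 2L, namely a(L) = 1 - sinh(kL)/(kL*cosh(kL)), and searches for the smallest bonded length whose transfer loss stays within an allowable testing error. The lumped interface parameter k and the error budget are assumptions, not values from the paper:

    ```python
    import math

    def avg_strain_transfer(k, half_length):
        """Average strain-transfer rate over bonded length 2L
        (classical shear-lag form; k lumps interface/layer properties)."""
        kl = k * half_length
        return 1.0 - math.sinh(kl) / (kl * math.cosh(kl))

    def min_bonded_half_length(k, allowable_error, step=1e-3):
        """Smallest half-length L whose transfer loss is within the error."""
        L = step
        while 1.0 - avg_strain_transfer(k, L) > allowable_error:
            L += step
        return L

    k = 25.0  # 1/m, assumed lumped parameter
    print(min_bonded_half_length(k, 0.02))  # required half bonded length in metres
    ```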

  3. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. In particular, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, cataloging, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best-suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black-box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
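
    As a sketch of one of the metrics named above, the minimum end-to-end delay of a mapped workflow can be evaluated as the longest path through the task DAG; the workflow graph and per-task costs below are hypothetical, not SWAMP data structures:

    ```python
    from functools import lru_cache

    # Hypothetical workflow DAG: task -> (execution time on its mapped node, predecessors)
    dag = {
        "stage":    (2.0, []),
        "simulate": (8.0, ["stage"]),
        "filter":   (3.0, ["simulate"]),
        "render":   (4.0, ["simulate"]),
        "archive":  (1.0, ["filter", "render"]),
    }

    @lru_cache(maxsize=None)
    def finish_time(task):
        """Earliest finish time = own cost + latest predecessor finish time."""
        cost, preds = dag[task]
        return cost + max((finish_time(p) for p in preds), default=0.0)

    print("end-to-end delay:", finish_time("archive"))  # 2 + 8 + 4 + 1 = 15.0
    ```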

  4. Formal and Informal Learning and First-Year Psychology Students’ Development of Scientific Thinking: A Two-Wave Panel Study

    PubMed Central

    Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363

  5. Formal and Informal Learning and First-Year Psychology Students' Development of Scientific Thinking: A Two-Wave Panel Study.

    PubMed

    Soyyılmaz, Demet; Griffin, Laura M; Martín, Miguel H; Kucharský, Šimon; Peycheva, Ekaterina D; Vaupotič, Nina; Edelsbrunner, Peter A

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students' development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students' need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students' learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students' scientific thinking.

  6. Robust continuous clustering

    PubMed Central

    Shah, Sohil Atul

    2017-01-01

    Clustering is a fundamental procedure in the analysis of scientific data. It is used ubiquitously across the sciences. Despite decades of research, existing clustering algorithms have limited effectiveness in high dimensions and often require tuning parameters for different domains and datasets. We present a clustering algorithm that achieves high accuracy across multiple domains and scales efficiently to high dimensions and large datasets. The presented algorithm optimizes a smooth continuous objective, which is based on robust statistics and allows heavily mixed clusters to be untangled. The continuous nature of the objective also allows clustering to be integrated as a module in end-to-end feature learning pipelines. We demonstrate this by extending the algorithm to perform joint clustering and dimensionality reduction by efficiently optimizing a continuous global objective. The presented approach is evaluated on large datasets of faces, hand-written digits, objects, newswire articles, sensor readings from the Space Shuttle, and protein expression levels. Our method achieves high accuracy across all datasets, outperforming the best prior algorithm by a factor of 3 in average rank. PMID:28851838
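
    A heavily simplified sketch of the alternating update behind robust continuous clustering (Shah and Koltun) is given below, assuming a precomputed edge set and a fixed robustness parameter mu; the published algorithm additionally builds a mutual k-nearest-neighbor graph, uses per-edge balancing weights, and anneals mu (graduated nonconvexity):

    ```python
    import numpy as np

    def rcc_sketch(X, edges, lam=2.0, mu=0.5, iters=20):
        """Minimize ||X - U||^2 + lam * sum_e rho(||u_p - u_q||), where
        rho(y) = mu*y^2 / (mu + y^2) is the Geman-McClure robust penalty."""
        n, _ = X.shape
        U = X.copy()
        for _ in range(iters):
            # closed-form edge weights from the half-quadratic lifting of rho
            w = [(mu / (mu + np.sum((U[p] - U[q]) ** 2))) ** 2 for p, q in edges]
            A = np.eye(n)  # builds I + lam * weighted graph Laplacian
            for (p, q), wpq in zip(edges, w):
                A[p, p] += lam * wpq; A[q, q] += lam * wpq
                A[p, q] -= lam * wpq; A[q, p] -= lam * wpq
            U = np.linalg.solve(A, X)  # exact minimizer for fixed weights
        return U

    np.random.seed(0)
    X = np.vstack([np.random.randn(5, 2), np.random.randn(5, 2) + 6.0])
    edges = [(i, i + 1) for i in range(9)]  # chain; edge (4, 5) spans the two clusters
    print(np.round(rcc_sketch(X, edges), 2))  # representatives tighten within clusters
    ```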

  7. Unified Performance and Power Modeling of Scientific Workloads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shuaiwen; Barker, Kevin J.; Kerbyson, Darren J.

    2013-11-17

    It is expected that scientific applications executing on future large-scale HPC systems must be optimized not only in terms of performance, but also in terms of power consumption. As power and energy become increasingly constrained resources, researchers and developers must have access to tools that allow for accurate prediction of both performance and power consumption. Reasoning about performance and power consumption in concert will be critical for achieving maximum utilization of limited resources on future HPC systems. To this end, we present a unified performance and power model for the Nek-Bone mini-application developed as part of the DOE's CESAR Exascale Co-Design Center. Our models consider the impact of computation, point-to-point communication, and collective communication.
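
    The Nek-Bone models themselves are not given in the abstract; the sketch below only illustrates the generic shape of such a unified model, with invented coefficients: runtime decomposed into computation, point-to-point, and collective terms, and energy as node power times runtime:

    ```python
    import math

    def runtime(p, n):
        """Hypothetical analytic model for p ranks, problem size n per rank."""
        t_comp = 1.2e-8 * n                  # s, local computation
        t_p2p = 4e-6 + 1e-9 * n ** (2 / 3)   # s, nearest-neighbour exchange
        t_coll = 8e-6 * math.log2(p)         # s, allreduce-style collective
        return t_comp + t_p2p + t_coll

    def energy(p, n, node_power=220.0, ranks_per_node=16):
        """Energy = number of nodes * average node power (W) * runtime (s)."""
        nodes = math.ceil(p / ranks_per_node)
        return nodes * node_power * runtime(p, n)

    for p in (64, 512, 4096):
        print(p, runtime(p, 1_000_000), energy(p, 1_000_000))
    ```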

  8. Intricacies of modern supercomputing illustrated with recent advances in simulations of strongly correlated electron systems

    NASA Astrophysics Data System (ADS)

    Schulthess, Thomas C.

    2013-03-01

    The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.

  9. Scientific Rationale for the Canadian NGST Visible Imager

    NASA Astrophysics Data System (ADS)

    Drissen, L.; Hickson, P.; Hutchings, J.; Lilly, S.; Murowinski, R.; Stetson, P.

    1999-05-01

    While NGST will be optimized for observing in the infrared, it also offers tremendous scientific opportunities in the visible regime (0.5-1 μm). This poster presents some of the science drivers for a visible imager on board NGST. Potential targets include: young starbursts and AGNs at high redshift (z=3-8); gravitational lensing by clusters of galaxies; white dwarfs in the Galactic halo and globular clusters; RR Lyrae stars in the M81 group; the lower end of the IMF in Local Group starburst clusters; low surface brightness galaxies; the environment of nearby (z<0.2) supernovae; and trans-Neptunian objects. We also briefly describe the current status of the studies on the Canadian NGST Visible Imager, which is one of three instruments proposed as a Canadian contribution to NGST.

  10. The Ultimate Challenge: Prove B. F. Skinner Wrong

    PubMed Central

    Chance, Paul

    2007-01-01

    For much of his career, B. F. Skinner displayed the optimism that is often attributed to behaviorists. With time, however, he became less and less sanguine about the power of behavior science to solve the major problems facing humanity. Near the end of his life he concluded that a fair consideration of principles revealed by the scientific analysis of behavior leads to pessimism about our species. In this article I discuss the case for Skinner's pessimism and suggest that the ultimate challenge for behavior analysts today is to prove Skinner wrong. PMID:22478494

  11. [Purifying process of gynostemma pentaphyllum saponins based on "adjoint marker" online control technology and identification of their compositions by UPLC-QTOF-MS].

    PubMed

    Fan, Dong-Dong; Kuang, Yan-Hui; Dong, Li-Hua; Ye, Xiao; Chen, Liang-Mian; Zhang, Dong; Ma, Zhen-Shan; Wang, Jin-Yu; Zhu, Jing-Jing; Wang, Zhi-Min; Wang, De-Qin; Li, Chu-Yuan

    2017-04-01

    To optimize the purification process of Gynostemma pentaphyllum saponins (GPS) based on "adjoint marker" online control technology, with GPS as the testing index. UPLC-QTOF-MS technology was used for qualitative analysis. "Adjoint marker" online control results showed that the end point of sample loading was reached when the UV absorbance of the effluent was equal to half that of the loaded sample solution, and the absorbance was essentially stable once the end point was reached. In the UPLC-QTOF-MS qualitative analysis, 16 saponins were identified in GPS, including 13 known gynostemma saponins and 3 new saponins. The optimized method proved to be simple, scientific, reasonable, and easy for online determination and real-time recording, and it can be applied to mass production and the automation of production. The results of the qualitative analysis indicate that the "adjoint marker" online control technology retains the main efficacy components of the medicinal material and provides analysis tools for process control and quality traceability. Copyright© by the Chinese Pharmaceutical Association.
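
    The endpoint rule quoted above (stop loading once the effluent's UV absorbance reaches half that of the loading solution) maps directly onto a simple online check; the absorbance trace below is simulated, not instrument data:

    ```python
    def load_until_breakthrough(readings, a_load, ratio=0.5):
        """Stop sample loading once effluent absorbance >= ratio * load absorbance."""
        for volume, absorbance in readings:
            if absorbance >= ratio * a_load:
                return volume  # bed volume at which to stop loading
        return None            # endpoint not reached within the trace

    # simulated effluent breakthrough curve: (bed volumes, UV absorbance)
    trace = [(v, 0.02 + 0.90 / (1 + 2.718 ** (-(v - 12)))) for v in range(25)]
    print(load_until_breakthrough(trace, a_load=1.0))  # -> 13
    ```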

  12. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.

  13. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

    The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of the development approaches employed by teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for the development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  14. The survival of 19th-century scientific optimism: the public discourse on science in Belgium in the aftermath of the Great War (ca. 1919-1930).

    PubMed

    Onghena, Sofie

    2011-01-01

    In historiography there is a tendency to see the Great War as marking the end of scientific optimism and the period that followed the war as a time of discord. Connecting to current (inter)national historiographical debate on the question of whether the First World War meant a disruption from the pre-war period or not, this article strives to prove that faith in scientific progress still prevailed in the 1920s. This is shown through the use of Belgium as a case study, which suggests that the generally adopted cultural pessimism in the post-war years did not apply to the public rhetoric of science in this country. Diverse actors -- scientists, industrialists, politicians, the public opinion, and the military staff -- declared a confidence in science, enhanced by wartime results. Furthermore, belief in science in Belgium was not affected by public outcry over the use of mustard gas, unlike in the former belligerent countries where the gas became an unpleasant reminder of how science was used during the war. Even German science with its industrial applications remained the norm after 1918. In fact, the faith in science exhibited during the pre-war years continued to exist, at least until the 1920s, despite anti-German sentiments being voiced by many sections of Belgian society in the immediate aftermath of the war.

  15. Single event effect hardness for the front-end ASICs in the DAMPE satellite BGO calorimeter

    NASA Astrophysics Data System (ADS)

    Gao, Shan-Shan; Jiang, Di; Feng, Chang-Qing; Xi, Kai; Liu, Shu-Bin; An, Qi

    2016-01-01

    The Dark Matter Particle Explorer (DAMPE) is a Chinese scientific satellite designed for cosmic ray studies, with a primary scientific goal of indirect detection of dark matter particles. As a crucial sub-detector, the BGO calorimeter measures the energy spectrum of cosmic rays in the energy range from 5 GeV to 10 TeV. In order to implement high-density front-end electronics (FEE) able to measure 1848 signals from 616 photomultiplier tubes on the strictly constrained satellite platform, two kinds of 32-channel front-end ASICs, VA160 and VATA160, were customized. However, a space mission lasting more than 3 years makes single event effects (SEEs) a threat to reliability. In order to evaluate the SEE sensitivities of these chips and verify the effectiveness of mitigation methods, a series of laser-induced and heavy-ion-induced SEE tests was performed. Benefiting from the single event latch-up (SEL) protection circuit for the power supply, triple module redundancy (TMR) for the configuration registers, and an optimized sequential design for the data acquisition process, 52 VA160 chips and 32 VATA160 chips have been applied in the flight model of the BGO calorimeter with radiation hardness assurance. Supported by the Strategic Priority Research Program on Space Science of the Chinese Academy of Sciences (XDA04040202-4) and the Fundamental Research Funds for the Central Universities (WK2030040048).
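
    Triple module redundancy, used above for the configuration registers, is easy to express in executable form: keep three copies of a value and take a bitwise two-of-three majority so that a single upset is outvoted. The sketch below is a generic illustration, not the VA160/VATA160 hardware implementation:

    ```python
    def tmr_vote(a, b, c):
        """Bitwise 2-of-3 majority: a single corrupted copy is outvoted."""
        return (a & b) | (a & c) | (b & c)

    reg = 0b1011_0010
    copies = [reg, reg, reg]
    copies[1] ^= 0b0100_0000           # single-event upset flips a bit in one copy
    assert tmr_vote(*copies) == reg    # the majority vote restores the value
    print(bin(tmr_vote(*copies)))
    ```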

  16. A Method for Estimating Meteorite Fall Mass from Weather Radar Data

    NASA Technical Reports Server (NTRS)

    Laird, C.; Fries, M.; Matson, R.

    2017-01-01

    Techniques such as weather RADAR, seismometers, and all-sky cameras allow new insights concerning the physics of meteorite fall dynamics and fragmentation during "dark flight", the period of time between the end of the meteor's luminous flight and the concluding impact on the Earth's surface. Understanding dark flight dynamics enables us to rapidly analyze the characteristics of new meteorite falls. This analysis will provide essential information to meteorite hunters to optimize recovery, increasing the frequency and total mass of scientifically important freshly-fallen meteorites available to the scientific community. We have developed a mathematical method to estimate meteorite fall mass using reflectivity data as recorded by National Oceanic and Atmospheric Administration (NOAA) Next Generation RADAR (NEXRAD) stations. This study analyzed eleven official and one unofficial meteorite falls in the United States and Canada to achieve this purpose.

  17. Accelerating scientific discovery : 2007 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit that provides a programming paradigm that eases the development of many scientific applications on high-end computers.

  18. Integrated design and management of complex and fast track projects

    NASA Astrophysics Data System (ADS)

    Mancini, Dario

    2003-02-01

    Modern scientific and technological projects are increasingly in competition over scientific aims, technological innovation, performance, time, and cost. They require a dedicated and innovative organization able to simultaneously satisfy the various technical and logistic constraints imposed by the final user and to guarantee the satisfaction of technical specifications identified on the basis of scientific aims. To satisfy all the above, the management has to be strategically innovative and intuitive, removing, first of all, the bottlenecks that are usually pointed out only at the end of a project as the causes of general dissatisfaction. More than 30 years spent working on complex multidisciplinary systems and 20 years of formative experience in managing scientific, technological, and industrial projects in parallel have given the author the possibility to study, test, and validate strategies for parallel project management and integrated design, merged into a sort of unique optimized task, described by the newly coined word "Technomethodology". The paper highlights useful information to be taken into consideration during project organization to minimize deviations from the expected goals, and describes some of the basic elements of this new method, which is the key to the parallel, successful management of multiple interdisciplinary activities.

  19. The TMT instrumentation program

    NASA Astrophysics Data System (ADS)

    Simard, Luc; Crampton, David; Ellerbroek, Brent; Boyer, Corinne

    2010-07-01

    An overview of the current status of the Thirty Meter Telescope (TMT) instrumentation program is presented. Conceptual designs for the three first-light instruments (IRIS, WFOS and IRMS) are in progress, as well as feasibility studies of MIRES. Considerable effort is underway to understand the end-to-end performance of the complete telescope-adaptive optics-instrument system under realistic conditions on Mauna Kea. Highly efficient operation is being designed into the TMT system, based on a detailed investigation of the observation workflow to ensure very fast target acquisition and set-up of all subsystems. Future TMT instruments will almost certainly involve contributions from institutions in many different locations in North America and partner nations. Coordinating and optimizing the design and construction of the instruments to ensure delivery of the best possible scientific capabilities is an interesting challenge. TMT welcomes involvement from all interested instrument teams.

  20. Optimal allocation model of construction land based on two-level system optimization theory

    NASA Astrophysics Data System (ADS)

    Liu, Min; Liu, Yanfang; Xia, Yuping; Lei, Qihong

    2007-06-01

    The allocation of construction land is an important task in land-use planning. Whether the implementation of planning decisions succeeds usually depends on a reasonable and scientific distribution method. Given the structure of the land-use planning system and the planning process in China, the task is in essence a multi-level, multi-objective decision problem. In particular, decomposing planned quantities is a two-level system optimization problem: an optimal resource allocation decision between a decision-maker at the upper level and a number of parallel decision-makers at the lower level. According to the characteristics of the decision-making process in a two-level decision-making system, this paper develops an optimal allocation model of construction land based on two-level linear programming. In order to verify the rationality and validity of our model, the Baoan district of Shenzhen City has been taken as a test case. With the assistance of the allocation model, construction land is allocated to the ten townships of Baoan district. The result obtained from our model is compared with that of the traditional method, and the comparison shows that our model is reasonable and usable. In the end, the paper points out the shortcomings of the model and directions for further research.
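
    As a hedged illustration of the flavor of this allocation problem (the paper's actual model is two-level; this stand-in collapses it to a single upper-level problem), the sketch below distributes a fixed construction-land quota across townships to maximize a weighted benefit subject to per-township bounds, using scipy.optimize.linprog. All weights, quotas, and bounds are invented:

    ```python
    from scipy.optimize import linprog

    benefit = [0.9, 1.2, 0.8, 1.1]   # hypothetical benefit per hectare, by township
    total_quota = 1000.0             # hectares of construction land to distribute
    bounds = [(100, 400)] * 4        # per-township lower/upper allocation limits

    # linprog minimizes, so negate the benefit coefficients to maximize
    res = linprog(c=[-b for b in benefit],
                  A_eq=[[1.0] * 4], b_eq=[total_quota],
                  bounds=bounds)
    print(res.x)  # e.g. [100. 400. 100. 400.]: quota flows to high-benefit townships
    ```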

  1. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    NASA Astrophysics Data System (ADS)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the Carbon Dioxide Information Analysis Center (CDIAC), Biological and Chemical Oceanography Data management Office (BCO-DMO), and federal labs, NODC is exploring the challenges of coordinated data flow and quality control for diverse ocean acidification data sets. These data sets include data from coastal and ocean monitoring, laboratory and field experiments, model output, and remotely sensed data. NODC already has in place automated data extraction protocols for archiving oceanographic data from BCO-DMO and CDIAC. We present a vision for how these disparate data streams can be more fully utilized when brought together using data standards. Like the Multiple-Listing Service in the real estate market, the OADS project is dedicated to developing a repository of ocean acidification data from all sources, and to serving them to the ocean acidification community using a user-friendly interface in a timely manner. For further information please contact NODC.Ocean.Acidification@noaa.gov.
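
    One concrete piece of the standards stack named above is CF-compliant netCDF; the sketch below writes a minimal CF-style ocean-acidification time series with the netCDF4 Python library. The file name, variable choice, and attribute values are illustrative, not NODC's actual schema:

    ```python
    from netCDF4 import Dataset

    ds = Dataset("oa_timeseries.nc", "w", format="NETCDF4")
    ds.Conventions = "CF-1.6"

    ds.createDimension("time", None)           # unlimited time dimension
    t = ds.createVariable("time", "f8", ("time",))
    t.units = "days since 2012-01-01 00:00:00"
    t.standard_name = "time"

    ph = ds.createVariable("ph", "f4", ("time",))
    ph.standard_name = "sea_water_ph_reported_on_total_scale"
    ph.units = "1"                             # pH is dimensionless under CF

    t[:] = [0.0, 1.0, 2.0]
    ph[:] = [8.05, 8.03, 8.01]
    ds.close()
    ```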

  2. High-End Scientific Computing

    EPA Pesticide Factsheets

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  3. Quantification of submarine groundwater discharge and its short-term dynamics by linking time-variant end-member mixing analysis and isotope mass balancing (222-Rn)

    NASA Astrophysics Data System (ADS)

    Petermann, Eric; Knöller, Kay; Stollberg, Reiner; Scholten, Jan; Rocha, Carlos; Weiß, Holger; Schubert, Michael

    2017-04-01

    Submarine groundwater discharge (SGD) plays a crucial role in the water quality of coastal waters due to the associated fluxes of nutrients, organic compounds, and/or heavy metals. Thus, the quantification of SGD is essential for evaluating the vulnerability of coastal water bodies to groundwater pollution as well as for understanding the matter cycles of the connected water bodies. Here, we present a scientific approach for quantifying discharge of fresh groundwater (GWf) and recirculated seawater (SWrec), including its short-term temporal dynamics, into the tide-affected Knysna estuary, South Africa. For a time-variant end-member mixing analysis we conducted time-series observations of radon (222Rn) and salinity within the estuary over two tidal cycles, in combination with estimates of the related end-members for seawater, river water, GWf, and SWrec. The mixing analysis was treated as a constrained optimization problem: finding, for every time-step, the end-member mixing ratio that simultaneously best fits the observed radon and salinity data. The uncertainty of each mixing ratio was quantified by Monte Carlo simulation of the optimization procedure, accounting for uncertainty in the end-member characterization. Results reveal the highest GWf and SWrec fractions in the estuary during peak low tide, with averages of 0.8% and 1.4%, respectively. Further, we calculated a radon mass balance that revealed a daily radon flux of 4.8 × 10⁸ Bq into the estuary, equivalent to a GWf discharge of 29,000 m³/d (9,000-59,000 m³/d for the 25th-75th percentile range) and a SWrec discharge of 80,000 m³/d (45,000-130,000 m³/d for the 25th-75th percentile range). The uncertainty of SGD reflects the end-member uncertainty, i.e. the spatial heterogeneity of groundwater composition. The presented approach allows the calculation of mixing ratios of multiple uncertain end-members for time-series measurements of multiple parameters. Linking these results with a tracer mass balance allows conversion of end-member fractions to end-member fluxes.
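
    The core of such a time-variant mixing analysis can be sketched as a small constrained least-squares problem. The following is a minimal illustration, not the authors' code: the end-member values, their uncertainties, and the observation are made-up numbers, and only a single time-step is solved.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical end-member values (columns: seawater, river water, fresh
# groundwater GWf, recirculated seawater SWrec); rows: 222Rn activity
# (Bq/m^3) and salinity. All numbers are illustrative, not the paper's.
E_mean = np.array([[50.0, 200.0, 8000.0, 3000.0],   # 222Rn
                   [35.0,   0.2,    0.5,   34.0]])  # salinity
E_sd   = np.array([[10.0,  50.0, 2000.0,  800.0],
                   [ 0.5,   0.1,    0.2,    0.5]])
obs   = np.array([120.0, 33.5])  # observed [222Rn, salinity] at one time-step
scale = obs.copy()               # normalize both tracers to comparable size

def solve_mixing(E):
    """Fractions f >= 0 with sum(f) = 1 minimizing the normalized misfit."""
    misfit = lambda f: np.sum(((E @ f - obs) / scale) ** 2)
    res = minimize(misfit, x0=np.full(4, 0.25), method="SLSQP",
                   bounds=[(0.0, 1.0)] * 4,
                   constraints={"type": "eq", "fun": lambda f: f.sum() - 1.0})
    return res.x

# Monte Carlo propagation of end-member uncertainty into the mixing ratios
rng = np.random.default_rng(1)
draws = np.array([solve_mixing(rng.normal(E_mean, E_sd)) for _ in range(500)])
q25, q75 = np.percentile(draws, [25, 75], axis=0)
print("GWf fraction, 25th-75th percentile:", q25[2], q75[2])
```

    Repeating the solve over every time-step of the tidal record yields the fraction time series whose percentiles are reported above.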

  4. Utility of coupling nonlinear optimization methods with numerical modeling software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, M.J.

    1996-08-05

    Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and the nonlinear optimization software modules GLOBAL and LOCAL. GLO is designed for controlling, and easy coupling to, any scientific software application. GLO runs the optimization module and the scientific application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model (Taylor cylinder impact test) is presented.
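
    The GLO iteration described above reduces to a simple control loop: write parameters into the application's input deck, run the code, extract the objective. A schematic sketch follows; the template file, solver executable, result file, and material parameters are all hypothetical stand-ins, and a generic Nelder-Mead optimizer takes the place of the GLOBAL/LOCAL modules.

```python
import subprocess
from pathlib import Path
from string import Template
from scipy.optimize import minimize

# Hypothetical template deck: contains placeholders $yield_stress and
# $hardening for the two material-model parameters being optimized.
TEMPLATE = Template(Path("deck.template").read_text())
MEASURED_LENGTH = 47.3   # illustrative Taylor-test measurement (mm)

def objective(params):
    # "GLO-PUT" step: insert the new parameter values into the input file
    Path("deck.in").write_text(TEMPLATE.substitute(
        yield_stress=params[0], hardening=params[1]))
    # Run the scientific application (hypothetical solver executable)
    subprocess.run(["./solver", "deck.in"], check=True)
    # "GLO-GET" step: extract the result and compare to the desired value
    simulated = float(Path("result.out").read_text())
    return (simulated - MEASURED_LENGTH) ** 2

# The control loop: the optimizer proposes parameters, the application
# runs, and the misfit is minimized, which is the role played by GLO itself.
best = minimize(objective, x0=[300.0, 0.1], method="Nelder-Mead")
print("best-fit parameters:", best.x)
```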

  5. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.

  6. 76 FR 47596 - Notice of Scientific Summit; The Science of Compassion-Future Directions in End-of-Life and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ...; The Science of Compassion--Future Directions in End-of-Life and Palliative Care SUMMARY: Notice is... science at the end-of-life. On August 11-12, the summit will feature keynote presentations, three plenary...), Department of Health and Human Services, will convene a scientific summit titled ``The Science of Compassion...

  7. Ending AIDS as a public health threat by 2030: Scientific Developments from the 2016 INTEREST Conference in Yaoundé, Cameroon

    PubMed Central

    Hankins, Catherine A; Koulla-Shiro, Sinata; Kilmarx, Peter; Ferrari, Guido; Schechter, Mauro; Kane, Coumba Touré; Venter, François; Boucher, Charles AB; Ross, Anna-Laura; Zewdie, Debrework; Eholié, Serge Paul; Katabira, Elly

    2017-01-01

    The underpinning theme of the 2016 INTEREST Conference held in Yaoundé, Cameroon, 3–6 May 2016 was ending AIDS as a public health threat by 2030. Focused primarily on HIV treatment, pathogenesis and prevention research in resource-limited settings, the conference attracted 369 active delegates from 34 countries, of which 22 were in Africa. Presentations on treatment optimization, acquired drug resistance, care of children and adolescents, laboratory monitoring and diagnostics, implementation challenges, HIV prevention, key populations, vaccine and cure, hepatitis C, mHealth, financing the HIV response and emerging pathogens, were accompanied by oral, mini-oral and poster presentations. Spirited plenary debates on the UNAIDS 90-90-90 treatment cascade goal and on antiretroviral pre-exposure prophylaxis took place. Joep Lange career guidance sessions and grantspersonship sessions attracted early career researchers. At the closing ceremony, the Yaoundé Declaration called on African governments; UNAIDS; development, bilateral, and multilateral partners; and civil society to adopt urgent and sustained approaches to end HIV by 2030. PMID:28387654

  8. Optimization strategies for molecular dynamics programs on Cray computers and scalar work stations

    NASA Astrophysics Data System (ADS)

    Unekis, Michael J.; Rice, Betsy M.

    1994-12-01

    We present results of timing runs and different optimization strategies for a prototype molecular dynamics program that simulates shock waves in a two-dimensional (2-D) model of a reactive energetic solid. The performance of the program may be improved substantially by simple changes to the Fortran or by employing various vendor-supplied compiler optimizations. The optimum strategy varies among the machines used and will vary depending upon the details of the program. The effect of various compiler options and vendor-supplied subroutine calls is demonstrated. Comparison is made between two scalar workstations (IBM RS/6000 Model 370 and Model 530) and several Cray supercomputers (X-MP/48, Y-MP8/128, and C-90/16256). We find that for a scientific application program dominated by sequential, scalar statements, a relatively inexpensive high-end workstation such as the IBM RS/6000 RISC series will outperform single-processor performance of the Cray X-MP/48 and perform competitively with single-processor performance of the Y-MP8/128 and C-90/16256.

  9. Optimized technical and scientific design approach for high performance anticoincidence shields

    NASA Astrophysics Data System (ADS)

    Graue, Roland; Stuffler, Timo; Monzani, Franco; Bastia, Paolo; Gryksa, Werner; Pahl, Germit

    2018-04-01

    This paper, "Optimized technical and scientific design approach for high performance anticoincidence shields," was presented as part of International Conference on Space Optics—ICSO 1997, held in Toulouse, France.

  10. Undergraduate honors students' images of science: Nature of scientific work and scientific knowledge

    NASA Astrophysics Data System (ADS)

    Wallace, Michael L.

    This exploratory study assessed the influence of an implicit, inquiry-oriented nature of science (NOS) instructional approach, undertaken in an interdisciplinary college science course, on undergraduate honors students' (UHS) understanding of the NOS aspects of scientific work and scientific knowledge. In this study, the nature of scientific work concentrated upon the delineation of science from pseudoscience and the value scientists place on reproducibility. The nature of scientific knowledge concentrated upon how UHS view scientific theories and how they believe scientists utilize scientific theories in their research. The 39 UHS who participated in the study were non-science majors enrolled in an Honors College-sponsored interdisciplinary science course where the instructors took an implicit NOS instructional approach. An open-ended assessment instrument, the UFO Scenario, was designed for the course and used to assess UHS' images of science at the beginning and end of the semester. The mixed-design study employed both qualitative and quantitative techniques to analyze the open-ended responses. The qualitative techniques of open and axial coding were utilized to find recurring themes within UHS' responses. McNemar's chi-square test for two dependent samples was used to identify whether any statistically significant changes occurred within responses from the beginning to the end of the semester. At the start of the study, the majority of UHS held mixed NOS views but were able to accurately define what a scientific theory is and explicate how scientists utilize theories within scientific research. Postinstruction assessment indicated that UHS did not make significant gains in their understanding of the nature of scientific work or scientific knowledge, and their overall images of science remained static. The results of the present study indicate that implicit NOS instruction, even with an extensive inquiry-oriented component, was an ineffective approach for moving UHS' images of science towards a more informed view of NOS.

  11. Sitting biomechanics, part II: optimal car driver's seat and optimal driver's spinal model.

    PubMed

    Harrison, D D; Harrison, S O; Croft, A C; Harrison, D E; Troyanovich, S J

    2000-01-01

    Driving has been associated with signs and symptoms caused by vibrations. Sitting causes the pelvis to rotate backwards and the lumbar lordosis to reduce. Lumbar support and armrests reduce disc pressure and electromyographically recorded muscle activity. However, the ideal driver's seat and an optimal seated spinal model have not been described. The objective was to determine an optimal automobile seat and an ideal spinal model of a driver. Information was obtained from peer-reviewed scientific journals and texts, automotive engineering reports, and the National Library of Medicine. Driving predisposes vehicle operators to low-back pain and degeneration. The optimal seat would have an adjustable seat back incline of 100 degrees from horizontal, a changeable depth of seat back to front edge of seat bottom, adjustable height, an adjustable seat bottom incline, firm (dense) foam in the seat bottom cushion, horizontally and vertically adjustable lumbar support, adjustable bilateral armrests, an adjustable head restraint with lordosis pad, seat shock absorbers to dampen frequencies in the 1 to 20 Hz range, and linear front-back travel of the seat enabling drivers of all sizes to reach the pedals. The lumbar support should be pulsating in depth to reduce static load. The seat back should be damped to reduce rebounding of the torso in rear-end impacts. The optimal driver's spinal model would be the average Harrison model seated with a 10-degree posteriorly inclined seat back.

  12. Optimize scientific communication skills on work and energy concept with implementation of interactive conceptual instruction and multi representation approach

    NASA Astrophysics Data System (ADS)

    Patriot, E. A.; Suhandi, A.; Chandra, D. T.

    2018-05-01

    The ultimate goal of learning in Curriculum 2013 is that learning must improve and balance the soft skills and hard skills of learners. In addition to the knowledge aspect, one of the other skills to be trained in a learning process that uses a scientific approach is communication. This study aims to give an overview of the implementation of interactive conceptual instruction with a multi-representation approach to optimize the achievement of students' scientific communication skills (KKI) on the work and energy concept. The scientific communication skills comprise the sub-skills of information searching, scientific writing, group discussion, and knowledge presentation. This was a descriptive study using the observation method. The subjects were 35 class X students at a senior high school in Sumedang. The results indicate optimal achievement of scientific communication skills: the greatest achievement observed was for KKI-3, the resume-writing sub-skill, at 89% in the fourth meeting. Almost all students responded positively to the implementation of interactive conceptual instruction with the multi-representation approach. It can be concluded that the implementation of interactive conceptual instruction with a multi-representation approach can optimize the achievement of students' scientific communication skills on the work and energy concept.

  13. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars, and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on, and impacts, subsequent phases. The design and optimization tools and methodologies used to combine the different aspects of an end-to-end framework, and their impact on mission planning, are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  14. The New Human Condition and Climate Change: Humanities and Social Science Perceptions of Threat

    NASA Astrophysics Data System (ADS)

    Holm, Poul; Travis, Charles

    2017-09-01

    Thinking, no doubt, plays an enormous role in every scientific enterprise, but it is the role of a means to an end; the end is determined by a decision about what is worth-while knowing and this decision cannot be scientific.

  15. Cerenkov luminescence imaging: physics principles and potential applications in biomedical sciences.

    PubMed

    Ciarrocchi, Esther; Belcari, Nicola

    2017-12-01

    Cerenkov luminescence imaging (CLI) is a novel imaging modality to study charged particles with optical methods by detecting the Cerenkov luminescence produced in tissue. This paper first describes the physical processes that govern the production and transport in tissue of Cerenkov luminescence. The detectors used for CLI and their most relevant specifications to optimize the acquisition of the Cerenkov signal are then presented, and CLI is compared with the other optical imaging modalities sharing the same data acquisition and processing methods. Finally, the scientific work related to CLI and the applications for which CLI has been proposed are reviewed. The paper ends with some considerations about further perspectives for this novel imaging modality.

  16. Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users.

    PubMed

    Watts, Nicholas A; Feltus, Frank A

    2017-02-15

    The ability to centralize and store data for long periods on an end user's computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed about emerging network technology or the most efficient methods to move and share data; thus, the user defaults to inefficient methods for transfer across the commercial internet. To accelerate large data transfers, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and the optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as that provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS. Contact: ffeltus@clemson.edu. © The Author 2016. Published by Oxford University Press.

  17. Big Data Smart Socket (BDSS): a system that abstracts data transfer habits from end users

    PubMed Central

    Watts, Nicholas A.

    2017-01-01

    Motivation: The ability to centralize and store data for long periods on an end user's computational resources is increasingly difficult for many scientific disciplines. For example, genomics data is increasingly large and distributed, and the data needs to be moved into workflow execution sites ranging from lab workstations to the cloud. However, the typical user is not always informed about emerging network technology or the most efficient methods to move and share data; thus, the user defaults to inefficient methods for transfer across the commercial internet. Results: To accelerate large data transfers, we created a tool called the Big Data Smart Socket (BDSS) that abstracts data transfer methodology from the user. The user provides BDSS with a manifest of datasets stored in a remote storage repository. BDSS then queries a metadata repository for curated data transfer mechanisms and the optimal path to move each of the files in the manifest to the site of workflow execution. BDSS functions as a standalone tool or can be directly integrated into a computational workflow such as that provided by the Galaxy Project. To demonstrate applicability, we use BDSS within a biological context, although it is applicable to any scientific domain. Availability and Implementation: BDSS is available under version 2 of the GNU General Public License at https://github.com/feltus/BDSS. Contact: ffeltus@clemson.edu PMID:27797780

  18. Trajectory Optimization for Missions to Small Bodies with a Focus on Scientific Merit.

    PubMed

    Englander, Jacob A; Vavrina, Matthew A; Lim, Lucy F; McFadden, Lucy A; Rhoden, Alyssa R; Noll, Keith S

    2017-01-01

    Trajectory design for missions to small bodies is tightly coupled both with the selection of targets for a mission and with the choice of spacecraft power, propulsion, and other hardware. Traditional methods of trajectory optimization have focused on finding the optimal trajectory for an a priori selection of destinations and spacecraft parameters. Recent research has expanded the field of trajectory optimization to multidisciplinary systems optimization that includes spacecraft parameters. The logical next step is to extend the optimization process to include target selection based not only on engineering figures of merit but also on scientific value. This paper presents a new technique to solve the multidisciplinary mission optimization problem for small-body missions, including classical trajectory design, the choice of spacecraft power and propulsion systems, and the scientific value of the targets. This technique, when combined with modern parallel computers, enables a holistic view of the small-body mission design process that previously required iteration among several different design processes.

  19. VALUE: A trans-disciplinary research project - and some challenges in its implementation

    NASA Astrophysics Data System (ADS)

    Huebener, Heike

    2013-04-01

    The EU COST Action VALUE ("Validating and Integrating Downscaling Methods for Climate Change Research") is conceived as a trans-disciplinary network activity, meaning that stakeholders and end-users not only from different scientific disciplines (i.e. inter-disciplinary research) but also from outside science are included in the design, planning, and progress of the project. This offers the best chance of producing project results that are genuinely workable for the intended end-users. However, considerable challenges lie along this way. These challenges start with identifying and motivating the target stakeholders; they extend to communication in different user-specific languages, and they reach as far as the question of the freedom of research when conducted under the prompting of politics or the economy. We will cover only some of these challenges, focusing on the identification of the target stakeholders or end-users, their motivation to participate in the project, and some typical problems arising in this constellation. First experiences from the project will be presented. The aim of the presentation is to instigate discussion on developing workable project structures for trans-disciplinary research, as this will become more and more relevant in future research funding.

  20. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX)

    PubMed Central

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-01-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 – Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning. PMID:26217710
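
    The blocking idea itself is independent of the AVX details and can be shown in a few lines. Below is an illustrative Python/NumPy sketch, not the Tomo3D code: the update inside the loop is a toy stand-in for the real weighted backprojection, and in compiled code the payoff comes from each block staying resident in cache while it is reused across angles.

```python
import numpy as np

def blocked_update(sinogram, block):
    """Process the output slice in cache-sized column strips, reusing each
    strip across all projection angles before moving to the next strip."""
    n_angles, n_bins = sinogram.shape
    out = np.zeros((n_bins, n_bins))
    for x0 in range(0, n_bins, block):          # loop over blocks
        x1 = min(x0 + block, n_bins)
        for a in range(n_angles):               # reuse the block per angle
            # Toy update standing in for the real weighted backprojection
            out[:, x0:x1] += sinogram[a, x0:x1]
    return out

sino = np.random.rand(180, 512)                 # synthetic tilt-series
slice_est = blocked_update(sino, block=64)      # block sized to fit cache
```

    Sweeping the block size over a range of candidates and timing each run mirrors the empirical tuning study reported in this data article.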

  1. Tuning the cache memory usage in tomographic reconstruction on standard computers with Advanced Vector eXtensions (AVX).

    PubMed

    Agulleiro, Jose-Ignacio; Fernandez, Jose-Jesus

    2015-06-01

    Cache blocking is a technique widely used in scientific computing to minimize the exchange of information with main memory by reusing the data kept in cache memory. In tomographic reconstruction on standard computers using vector instructions, cache blocking turns out to be central to optimize performance. To this end, sinograms of the tilt-series and slices of the volumes to be reconstructed have to be divided into small blocks that fit into the different levels of cache memory. The code is then reorganized so as to operate with a block as much as possible before proceeding with another one. This data article is related to the research article titled Tomo3D 2.0 - Exploitation of Advanced Vector eXtensions (AVX) for 3D reconstruction (Agulleiro and Fernandez, 2015) [1]. Here we present data of a thorough study of the performance of tomographic reconstruction by varying cache block sizes, which allows derivation of expressions for their automatic quasi-optimal tuning.

  2. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.

  3. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…

  4. PARLO: PArallel Run-Time Layout Optimization for Scientific Data Explorations with Heterogeneous Access Pattern

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong, Zhenhuan; Boyuka, David; Zou, X

    The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induce heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques to improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework, that achieves multi-level data layout optimization for scientific applications at run-time, before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, lightweight layout optimization for scientific datasets. It offers simple XML-based configuration for users to achieve flexible layout optimization without the need to modify or recompile application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its underlying run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while limiting the performance impact on running applications to a reasonable level.
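
    PARLO itself is built on ADIOS, but the underlying idea, choosing the storage layout to match the expected access pattern, can be illustrated with any chunked format. The sketch below uses HDF5 chunking via h5py as a neutral stand-in; the file and dataset names are made up.

```python
import h5py
import numpy as np

data = np.random.rand(1024, 1024).astype("f4")  # stand-in simulation variable

with h5py.File("sim_layout_demo.h5", "w") as f:
    # Row-shaped chunks favor scans along the second axis (e.g. reading
    # whole rows or time series): one chunk read per row.
    f.create_dataset("row_layout", data=data, chunks=(1, 1024))
    # Square chunks favor spatial subsetting queries: a small 2-D region
    # touches only a few chunks. Choosing the chunk shape per expected
    # access pattern is the layout decision PARLO automates at run time.
    f.create_dataset("tile_layout", data=data, chunks=(64, 64))
```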

  5. Performance Assessment of Different Pulse Reconstruction Algorithms for the ATHENA X-Ray Integral Field Unit

    NASA Technical Reports Server (NTRS)

    Peille, Phillip; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle; et al.

    2016-01-01

    The X-ray Integral Field Unit (X-IFU) microcalorimeter on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on the on-board digital processing of current pulses induced by the heat deposited in the TES absorbers, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energy and count rate. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performance, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.
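
    As a reference point for the techniques benchmarked above, the standard optimal (matched) filter has a compact closed form: the best-fit pulse amplitude weights each frequency bin by template power over noise power. The sketch below is a textbook implementation on synthetic data, not the X-IFU pipeline; the pulse shape, sampling, and flat noise PSD are all illustrative assumptions.

```python
import numpy as np

def optimal_filter_amplitude(record, template, noise_psd):
    """Frequency-domain optimal filter: the least-squares amplitude of the
    template in the record, weighting each bin by 1/noise power."""
    D = np.fft.rfft(record)
    S = np.fft.rfft(template)
    return (np.sum((D * np.conj(S)).real / noise_psd)
            / np.sum(np.abs(S) ** 2 / noise_psd))

# Synthetic test: double-exponential pulse of true amplitude 3 in white noise
t = np.arange(1024)
template = np.exp(-t / 200.0) - np.exp(-t / 20.0)
rng = np.random.default_rng(0)
record = 3.0 * template + 0.01 * rng.normal(size=t.size)
noise_psd = np.ones(t.size // 2 + 1)   # flat PSD for white noise (illustrative)
print(optimal_filter_amplitude(record, template, noise_psd))  # close to 3.0
```

    The amplitude maps to deposited energy after gain calibration, which is where the ground calibration burden discussed above enters.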

  6. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.

  7. Scientific progress: Knowledge versus understanding.

    PubMed

    Dellsén, Finnur

    2016-04-01

    What is scientific progress? On Alexander Bird's epistemic account of scientific progress, an episode in science is progressive precisely when there is more scientific knowledge at the end of the episode than at the beginning. Using Bird's epistemic account as a foil, this paper develops an alternative understanding-based account on which an episode in science is progressive precisely when scientists grasp how to correctly explain or predict more aspects of the world at the end of the episode than at the beginning. This account is shown to be superior to the epistemic account by examining cases in which knowledge and understanding come apart. In these cases, it is argued that scientific progress matches increases in scientific understanding rather than accumulations of knowledge. In addition, considerations having to do with minimalist idealizations, pragmatic virtues, and epistemic value all favor this understanding-based account over its epistemic counterpart. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The Sardinia Radio Telescope . From a technological project to a radio observatory

    NASA Astrophysics Data System (ADS)

    Prandoni, I.; Murgia, M.; Tarchi, A.; Burgay, M.; Castangia, P.; Egron, E.; Govoni, F.; Pellizzoni, A.; Ricci, R.; Righini, S.; Bartolini, M.; Casu, S.; Corongiu, A.; Iacolina, M. N.; Melis, A.; Nasir, F. T.; Orlati, A.; Perrodin, D.; Poppi, S.; Trois, A.; Vacca, V.; Zanichelli, A.; Bachetti, M.; Buttu, M.; Comoretto, G.; Concu, R.; Fara, A.; Gaudiomonte, F.; Loi, F.; Migoni, C.; Orfei, A.; Pilia, M.; Bolli, P.; Carretti, E.; D'Amico, N.; Guidetti, D.; Loru, S.; Massi, F.; Pisanu, T.; Porceddu, I.; Ridolfi, A.; Serra, G.; Stanghellini, C.; Tiburzi, C.; Tingay, S.; Valente, G.

    2017-12-01

    Context. The Sardinia Radio Telescope (SRT) is the new 64 m dish operated by the Italian National Institute for Astrophysics (INAF). Its active surface, comprised of 1008 separate aluminium panels supported by electromechanical actuators, will allow us to observe at frequencies of up to 116 GHz. At the moment, three receivers, one per focal position, have been installed and tested: a 7-beam K-band receiver, a mono-feed C-band receiver, and a coaxial dual-feed L/P band receiver. The SRT was officially opened in September 2013, upon completion of its technical commissioning phase. In this paper, we provide an overview of the main science drivers for the SRT, describe the main outcomes from the scientific commissioning of the telescope, and discuss a set of observations demonstrating the scientific capabilities of the SRT. Aims: The scientific commissioning phase, carried out in the 2012-2015 period, proceeded in stages following the implementation and/or fine-tuning of advanced subsystems such as the active surface, the derotator, new releases of the acquisition software, etc. One of the main objectives of scientific commissioning was the identification of deficiencies in the instrumentation and/or in the telescope subsystems for further optimization. As a result, the overall telescope performance has been significantly improved. Methods: As part of the scientific commissioning activities, different observing modes were tested and validated, and the first astronomical observations were carried out to demonstrate the science capabilities of the SRT. In addition, we developed astronomer-oriented software tools to support future observers on site. In the following, we refer to the overall scientific commissioning and software development activities as astronomical validation. Results: The astronomical validation activities were prioritized based on technical readiness and scientific impact. The highest priority was to make the SRT available for joint observations as part of European networks. As a result, the SRT started to participate (in shared-risk mode) in European VLBI Network (EVN) and Large European Array for Pulsars (LEAP) observing sessions in early 2014. The validation of single-dish operations for the suite of SRT first light receivers and backends continued in the following year, and was concluded with the first call for shared-risk early-science observations issued at the end of 2015. As discussed in the paper, the SRT capabilities were tested (and optimized when possible) for several different observing modes: imaging, spectroscopy, pulsar timing, and transients.

  9. Collaboration pathway(s) using new tools for optimizing operational climate monitoring from space

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2014-10-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a solution requires transforming scientific missions into an optimized, robust 'operational' constellation that addresses the needs of decision makers, scientific investigators, and global users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent (2014) rule-based decision engine modeling runs that targeted optimizing the intended NPOESS architecture become a surrogate for global operational climate monitoring architecture(s). These rule-based systems tools provide valuable insight for global climate architectures, through the comparison and evaluation of the alternatives considered and the exhaustive range of trade space explored. A representative optimization of a global ECVs (essential climate variables) climate monitoring architecture is explored and described in some detail, with thoughts on appropriate rule-based valuations. The optimization tools suggest and support global collaboration pathways and will hopefully elicit responses from the audience and climate science stakeholders.

  10. Reduced duration of dual antiplatelet therapy using an improved drug-eluting stent for percutaneous coronary intervention of the left main artery in a real-world, all-comer population: Rationale and study design of the prospective randomized multicenter IDEAL-LM trial.

    PubMed

    Lemmert, Miguel E; Oldroyd, Keith; Barragan, Paul; Lesiak, Maciej; Byrne, Robert A; Merkulov, Evgeny; Daemen, Joost; Onuma, Yoshinobu; Witberg, Karen; van Geuns, Robert-Jan

    2017-05-01

    Continuous improvements in stent technology make percutaneous coronary intervention (PCI) a potential alternative to surgery in selected patients with unprotected left main coronary artery (uLMCA) disease. The optimal duration of dual antiplatelet therapy (DAPT) in these patients remains undetermined, and in addition, new stent designs using a bioabsorbable polymer might allow shorter duration of DAPT. IDEAL-LM is a prospective, randomized, multicenter study that will enroll 818 patients undergoing uLMCA PCI. Patients will be randomized in a 1:1 fashion to intravascular ultrasound-guided PCI with the novel everolimus-eluting platinum-chromium Synergy stent with a biodegradable polymer (Boston Scientific, Natick, MA) followed by 4 months of DAPT or the everolimus-eluting cobalt-chromium Xience stent (Abbott Vascular, Santa Clara, CA) followed by 12 months of DAPT. The total follow-up period will be 5 years. A subset of 100 patients will undergo optical coherence tomography at 3 months. The primary end point will be major adverse cardiovascular events (composite of all-cause mortality, myocardial infarction, and ischemia-driven target vessel revascularization) at 2 years. Secondary end points will consist of the individual components of the primary end point, procedural success, a device-oriented composite end point, stent thrombosis as per Academic Research Consortium criteria, and bleeding as per Bleeding Academic Research Consortium criteria. IDEAL-LM is designed to assess the safety and efficacy of the novel Synergy stent followed by 4 months of DAPT vs the Xience stent followed by 12 months of DAPT in patients undergoing uLMCA PCI. The study will provide novel insights regarding optimal treatment strategy for patients undergoing PCI of uLMCA disease (www.clinicaltrials.gov, NCT 02303717). Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  11. Neuropathological biomarker candidates in brain tumors: key issues for translational efficiency.

    PubMed

    Hainfellner, J A; Heinzl, H

    2010-01-01

    Brain tumors comprise a large spectrum of rare malignancies in children and adults that are often associated with severe neurological symptoms and fatal outcome. Neuropathological tumor typing provides both prognostic and predictive tissue information, which is the basis for optimal postoperative patient management and therapy. Molecular biomarkers may extend and refine the prognostic and predictive information in a brain tumor case, providing more individualized and optimized treatment options. In the recent past a few neuropathological brain tumor biomarkers have translated smoothly into clinical use, whereas many candidates show protracted translation. We investigated the causes of protracted translation of candidate brain tumor biomarkers. Considering the research environment from personal, social, and systemic perspectives, we identified eight determinants of translational success: methodology, funding, statistics, organization, phases of research, cooperation, self-reflection, and scientific progeny. Smoothly translating biomarkers are associated with low degrees of translational complexity, whereas biomarkers with protracted translation are associated with high degrees. Key issues for the translational efficiency of neuropathological brain tumor biomarker research seem to be related to (i) strict orientation to the mission of medical research, that is, the improvement of medical practice as the primordial purpose of research, (ii) definition of research priorities according to clinical needs, and (iii) absorption of translational complexities by means of operatively beneficial standards. To this end, concrete actions should comprise adequate scientific education of young investigators, and the shaping of integrative diagnostics and therapy research both at the local level and at the level of influential international brain tumor research platforms.

  12. Lateral Penumbra Modelling Based Leaf End Shape Optimization for Multileaf Collimator in Radiotherapy.

    PubMed

    Zhou, Dong; Zhang, Hui; Ye, Peiqing

    2016-01-01

    Lateral penumbra of a multileaf collimator plays an important role in radiotherapy treatment planning. Growing evidence has revealed that, for a single-focused multileaf collimator, lateral penumbra width is leaf position dependent and largely attributed to the leaf end shape. In our study, an analytical method for leaf end induced lateral penumbra modelling is formulated using Tangent Secant Theory. Compared with Monte Carlo simulation and ray tracing algorithms, our model serves well the purpose of cost-efficient penumbra evaluation. Leaf ends represented in the parametric forms of circular arc, elliptical arc, Bézier curve, and B-spline are implemented. With a biobjective function of penumbra mean and variance introduced, a genetic algorithm is carried out to approximate the Pareto frontier. Results show that for the circular arc leaf end the objective function is convex and convergence to the optimal solution is guaranteed using a gradient based iterative method. It is found that the optimal leaf end in the shape of a Bézier curve achieves minimal standard deviation, while using a B-spline a minimum of the penumbra mean is obtained. For treatment modalities in clinical application, optimized leaf ends are in close agreement with actual shapes. Taken together, the method that we propose can provide insight into the leaf end shape design of multileaf collimators.
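
    The biobjective step can be illustrated with a toy model. The sketch below is not the paper's Tangent Secant model: the penumbra-width function is an invented stand-in chosen so that mean and variance trade off against each other, and a simple random-search Pareto filter replaces the genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)
positions = np.linspace(-10.0, 10.0, 21)   # leaf positions (arbitrary units)

def penumbra_stats(radius):
    """Invented stand-in for the penumbra-width model: larger leaf-end radii
    flatten the position dependence (lower variance) but broaden the
    baseline width (higher mean), giving a genuine trade-off."""
    width = radius / 25.0 + positions ** 2 / (50.0 * radius)
    return width.mean(), width.std()

# Random search over candidate radii, then keep the non-dominated set:
# a candidate survives if no other is at least as good in both objectives
# and strictly better in one.
radii = rng.uniform(5.0, 50.0, size=200)
objs = np.array([penumbra_stats(r) for r in radii])
pareto = [i for i in range(len(objs))
          if not any(np.all(objs[j] <= objs[i]) and np.any(objs[j] < objs[i])
                     for j in range(len(objs)))]
for i in sorted(pareto, key=lambda k: objs[k][0]):
    print(f"radius={radii[i]:6.2f}  mean={objs[i][0]:.3f}  std={objs[i][1]:.4f}")
```

    The printed set traces the mean-versus-variance frontier from which a planner would pick an operating point.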

  13. Front End Software for Online Database Searching. Part 2: The Marketplace.

    ERIC Educational Resources Information Center

    Levy, Louise R.; Hawkins, Donald T.

    1986-01-01

    This article analyzes the front end software marketplace and discusses some of the complex forces influencing it. Discussion covers intermediary market; end users (library customers, scientific and technical professionals, corporate business specialists, consumers); marketing strategies; a British front end development firm; competitive pressures;…

  14. Understanding the Impact of an Apprenticeship-Based Scientific Research Program on High School Students' Understanding of Scientific Inquiry

    ERIC Educational Resources Information Center

    Aydeniz, Mehmet; Baksa, Kristen; Skinner, Jane

    2011-01-01

    The purpose of this study was to understand the impact of an apprenticeship program on high school students' understanding of the nature of scientific inquiry. Data related to seventeen students' understanding of science and scientific inquiry were collected through open-ended questionnaires. Findings suggest that although engagement in authentic…

  15. The Relationship of Fast ForWord Scientific Learning to North Carolina End of Grade Reading Scores at a Middle School in Anson County, North Carolina

    ERIC Educational Resources Information Center

    Benfield, Jamie Ledsinger

    2012-01-01

    Anson County School District wished to determine the relationship between Fast ForWord Scientific Learning data and North Carolina End of Grade reading scores at Anson Middle School in Anson County, North Carolina. The specific research questions that guided this study include: 1. How does the literacy intervention, Fast ForWord, affect EOG growth…

  16. Optimization of Error-Bounded Lossy Compression for Hard-to-Compress HPC Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di, Sheng; Cappello, Franck

    Since today's scientific applications are producing vast amounts of data, compressing them before storage/transmission is critical. Results of existing compressors show two types of HPC data sets: highly compressible and hard to compress. In this work, we carefully design and optimize error-bounded lossy compression for hard-to-compress scientific data. We propose an optimized algorithm that can adaptively partition the HPC data into best-fit consecutive segments, each having mutually close data values, such that the compression condition can be optimized. Another significant contribution is the optimization of the shifting offset, such that the XOR-leading-zero length between two consecutive unpredictable data points can be maximized. We finally devise an adaptive method to select the best-fit compressor at runtime to maximize the compression factor. We evaluate our solution using 13 benchmarks based on real-world scientific problems, and we compare it with 9 other state-of-the-art compressors. Experiments show that our compressor can always guarantee compression errors within the user-specified error bounds. Most importantly, our optimization improves the compression factor effectively, by up to 49% for hard-to-compress data sets, with similar compression/decompression time cost.
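
    The XOR-leading-zero idea is easy to make concrete: two doubles that agree in their leading bits XOR to a value with a long run of leading zeros, and adding a well-chosen offset can lengthen that run across a whole sequence. The sketch below is a simplified stand-in for the paper's optimization, with made-up data and a brute-force offset search.

```python
import struct
import numpy as np

def leading_zero_bits(a, b):
    """Leading zero bits in the XOR of two IEEE-754 doubles; more shared
    leading bits between neighbors means better compressibility."""
    ua = struct.unpack("<Q", struct.pack("<d", a))[0]
    ub = struct.unpack("<Q", struct.pack("<d", b))[0]
    x = ua ^ ub
    return 64 if x == 0 else 64 - x.bit_length()

def best_offset(values, candidates):
    """Brute-force stand-in for the paper's offset optimization: pick the
    additive shift maximizing the mean XOR-leading-zero length between
    consecutive values."""
    def score(off):
        s = values + off
        return np.mean([leading_zero_bits(s[i], s[i + 1])
                        for i in range(len(s) - 1)])
    return max(candidates, key=score)

data = np.array([1.001, 1.002, 0.999, 1.003, 1.000])    # made-up data points
print(best_offset(data, candidates=[0.0, 10.0, 100.0]))  # larger shifts win here
```

    Shifting the values to a larger magnitude shrinks their relative differences, so consecutive doubles share more leading mantissa bits, which is exactly the quantity the compressor exploits.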

  17. [Optimization of end-tool parameters based on robot hand-eye calibration].

    PubMed

    Zhang, Lilong; Cao, Tong; Liu, Da

    2017-04-01

    A new one-time registration method was developed in this research for the hand-eye calibration of a surgical robot, to simplify the operation process and reduce the preparation time, and a new, practical method is introduced to optimize the end-tool parameters of the surgical robot based on an analysis of the error sources in this registration method. In the one-time registration method, a marker on the end-tool of the robot is first recognized by a fixed binocular camera, and the orientation and position of the marker are calculated from the joint parameters of the robot. The relationship between the camera coordinate system and the robot base coordinate system can then be established to complete the hand-eye calibration. Because of manufacturing and assembly errors of the robot end-tool, an error equation was established with the transformation matrix between the robot end coordinate system and the robot end-tool coordinate system as the variable. Numerical optimization was employed to optimize the end-tool parameters of the robot. The experimental results showed that the one-time registration method can significantly improve the efficiency of robot hand-eye calibration compared with existing methods, and that the parameter optimization method can significantly improve the absolute positioning accuracy of the one-time registration method, to a level that meets the requirements of clinical surgery.
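
    The parameter-optimization step can be sketched as a small nonlinear least-squares problem. The code below is a simplified illustration, not the authors' method: the tool transform is reduced to a pure translation offset (the real calibration also involves rotation), and the robot poses and camera observations are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

def tool_transform(offset):
    """4x4 end-to-tool transform; reduced here to a pure translation."""
    T = np.eye(4)
    T[:3, 3] = offset
    return T

def residuals(offset, end_poses, observed_pts):
    """Camera-observed marker positions vs. positions predicted from the
    robot's forward kinematics plus the candidate tool offset."""
    T_tool = tool_transform(offset)
    return np.concatenate([(T_end @ T_tool)[:3, 3] - p_obs
                           for T_end, p_obs in zip(end_poses, observed_pts)])

# Synthetic calibration data: hypothetical end-effector poses and marker
# observations generated from a known true offset, plus camera noise.
rng = np.random.default_rng(0)
true_offset = np.array([0.01, -0.02, 0.15])
end_poses, observed = [], []
for _ in range(10):
    T = np.eye(4)
    T[:3, 3] = rng.uniform(-0.5, 0.5, 3)
    end_poses.append(T)
    observed.append((T @ tool_transform(true_offset))[:3, 3]
                    + rng.normal(0.0, 1e-4, 3))

fit = least_squares(residuals, x0=np.zeros(3), args=(end_poses, observed))
print(fit.x)   # recovers approximately [0.01, -0.02, 0.15]
```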

  18. Applied Mathematical Optimization Technique on Menu Scheduling for Boarding School Student Using Delete-Reshuffle-Reoptimize Algorithm

    NASA Astrophysics Data System (ADS)

    Sufahani, Suliadi; Mohamad, Mahathir; Roslan, Rozaini; Ghazali Kamardan, M.; Che-Him, Norziha; Ali, Maselan; Khalid, Kamal; Nazri, E. M.; Ahmad, Asmala

    2018-04-01

    Boarding school students need well-balanced, nutritious food that provides adequate calories, energy, and nutrients for proper growth, with the end goal of repairing and maintaining body tissues and averting undesired ailments and disease. Serving healthier menus is a notable step towards accomplishing that goal. However, planning a nutritious and balanced menu manually is complicated, inefficient, and time-consuming. This study aims to build a mathematical model for diet planning that meets the necessary nutrient intake for boarding school students aged 13-18 while staying within budget. It also gives the cook the flexibility to change any preferred menu item even after the optimal plan has been produced; a recalculation procedure is then performed based on the optimal plan. The data were gathered from the Ministry of Education and boarding school authorities. Menu planning is a well-established optimization problem. The model was solved using Binary Programming and the "Delete-Reshuffle-Reoptimize Algorithm (DDRA)".
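
    The binary-programming core of such a menu model is compact. The sketch below, using the PuLP modelling library, is a minimal illustration with invented dishes and nutrient numbers rather than the study's Ministry of Education data; the final two lines mimic a delete-and-reoptimize step in which the cook rejects a dish and the day's menu is re-solved.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

# Illustrative menu data: (cost, calories, protein) per dish; all values
# are made up for the sketch.
dishes = {
    "nasi lemak":   (2.0, 650, 14),
    "chicken rice": (2.5, 600, 30),
    "fish soup":    (3.0, 350, 25),
    "fried mee":    (1.8, 700, 12),
    "vege curry":   (1.5, 300,  8),
}
pick = {d: LpVariable(f"pick_{i}", cat=LpBinary)
        for i, d in enumerate(dishes)}

prob = LpProblem("menu_day", LpMinimize)
prob += lpSum(dishes[d][0] * pick[d] for d in dishes)            # minimize cost
prob += lpSum(dishes[d][1] * pick[d] for d in dishes) >= 1200    # min calories
prob += lpSum(dishes[d][1] * pick[d] for d in dishes) <= 1800    # max calories
prob += lpSum(dishes[d][2] * pick[d] for d in dishes) >= 40      # min protein
prob += lpSum(pick[d] for d in dishes) == 3                      # dishes per day
prob.solve()
print([d for d in dishes if pick[d].value() == 1])

# Delete-reshuffle-reoptimize step: the cook rejects a dish, so forbid
# it with an extra constraint and re-solve the remaining menu.
prob += pick["fried mee"] == 0
prob.solve()
print([d for d in dishes if pick[d].value() == 1])
```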

  19. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.

    PubMed

    Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T

    2017-01-01

    Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.

  20. Supporting Weather Data

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Since its founding in 1992, Global Science & Technology, Inc. (GST), of Greenbelt, Maryland, has been developing technologies and providing services in support of NASA scientific research. GST specialties include scientific analysis, science data and information systems, data visualization, communications, networking and Web technologies, computer science, and software system engineering. As a longtime contractor to Goddard Space Flight Center s Earth Science Directorate, GST scientific, engineering, and information technology staff have extensive qualifications with the synthesis of satellite, in situ, and Earth science data for weather- and climate-related projects. GST s experience in this arena is end-to-end, from building satellite ground receiving systems and science data systems, to product generation and research and analysis.

  1. The DYNES Instrument: A Description and Overview

    NASA Astrophysics Data System (ADS)

    Zurawski, Jason; Ball, Robert; Barczyk, Artur; Binkley, Mathew; Boote, Jeff; Boyd, Eric; Brown, Aaron; Brown, Robert; Lehman, Tom; McKee, Shawn; Meekhof, Benjeman; Mughal, Azher; Newman, Harvey; Rozsa, Sandor; Sheldon, Paul; Tackett, Alan; Voicu, Ramiro; Wolff, Stephen; Yang, Xi

    2012-12-01

    Scientific innovation continues to increase requirements for the computing and networking infrastructures of the world. Collaborative partners, instrumentation, storage, and processing facilities are often geographically and topologically separated, as is the case with LHC virtual organizations. These separations challenge the technology used to interconnect available resources, often delivered by Research and Education (R&E) networking providers, and leads to complications in the overall process of end-to-end data management. Capacity and traffic management are key concerns of R&E network operators; a delicate balance is required to serve both long-lived, high capacity network flows, as well as more traditional end-user activities. The advent of dynamic circuit services, a technology that enables the creation of variable duration, guaranteed bandwidth networking channels, allows for the efficient use of common network infrastructures. These gains are seen particularly in locations where overall capacity is scarce compared to the (sustained peak) needs of user communities. Related efforts, including those of the LHCOPN [3] operations group and the emerging LHCONE [4] project, may take advantage of available resources by designating specific network activities as a “high priority”, allowing reservation of dedicated bandwidth or optimizing for deadline scheduling and predicable delivery patterns. This paper presents the DYNES instrument, an NSF funded cyberinfrastructure project designed to facilitate end-to-end dynamic circuit services [2]. This combination of hardware and software innovation is being deployed across R&E networks in the United States at selected end-sites located on University Campuses. DYNES is peering with international efforts in other countries using similar solutions, and is increasing the reach of this emerging technology. This global data movement solution could be integrated into computing paradigms such as cloud and grid computing platforms, and through the use of APIs can be integrated into existing data movement software.

  2. Lateral Penumbra Modelling Based Leaf End Shape Optimization for Multileaf Collimator in Radiotherapy

    PubMed Central

    Zhou, Dong; Zhang, Hui; Ye, Peiqing

    2016-01-01

    The lateral penumbra of a multileaf collimator plays an important role in radiotherapy treatment planning. Growing evidence has revealed that, for a single-focused multileaf collimator, lateral penumbra width is leaf-position dependent and largely attributed to the leaf end shape. In our study, an analytical method for modelling the leaf-end-induced lateral penumbra is formulated using Tangent Secant Theory. Compared with Monte Carlo simulation and ray tracing algorithms, our model offers cost-efficient penumbra evaluation. Leaf ends represented in the parametric forms of circular arc, elliptical arc, Bézier curve, and B-spline are implemented. With a biobjective function of penumbra mean and variance introduced, a genetic algorithm is carried out to approximate the Pareto frontier. Results show that for the circular-arc leaf end the objective function is convex, and convergence to the optimal solution is guaranteed using a gradient-based iterative method. The optimal leaf end in the shape of a Bézier curve achieves the minimal standard deviation, while the B-spline yields the minimum penumbra mean. For treatment modalities in clinical application, optimized leaf ends are in close agreement with actual shapes. Taken together, the method we propose can provide insight into the leaf end shape design of multileaf collimators. PMID:27110274
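
    The biobjective selection step can be sketched independently of the penumbra model: candidate leaf-end parameters are scored on (penumbra mean, penumbra variance) and filtered down to the nondominated set. The surrogate penumbra_widths below is a hypothetical stand-in for the paper's Tangent Secant model, and all values are invented.

    import numpy as np

    rng = np.random.default_rng(0)

    def penumbra_widths(radius, positions):
        # Hypothetical surrogate: penumbra widens off-axis, modulated by the
        # leaf-end radius; stands in for the Tangent Secant penumbra model.
        return 5.0 + 0.1 * positions**2 / radius

    positions = np.linspace(-10.0, 10.0, 21)      # leaf positions (cm), illustrative
    radii = rng.uniform(5.0, 15.0, size=100)      # candidate circular-arc radii (cm)

    widths = [penumbra_widths(r, positions) for r in radii]
    objs = np.array([(w.mean(), w.var()) for w in widths])   # (mean, variance) pairs

    def nondominated(points):
        # Keep indices whose objective pair is not dominated by any other pair.
        return [i for i, p in enumerate(points)
                if not any((q <= p).all() and (q < p).any()
                           for j, q in enumerate(points) if j != i)]

    pareto_radii = radii[nondominated(objs)]   # approximate Pareto frontier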

  3. Gender-fair assessment of young gifted students' scientific thinking skills

    NASA Astrophysics Data System (ADS)

    Dori, Y. J.; Zohar, A.; Fischer-Shachor, D.; Kohan-Mass, J.; Carmi, M.

    2018-04-01

    This paper describes Israeli national-level research examining the extent to which admissions of elementary school students to gifted programmes based on standardised tests are gender-fair. In the research, the gifted students consisted of 275 boys, 128 girls, and an additional 80 girls who were admitted to the gifted programme through affirmative action (AA). To assess these young students' scientific thinking skills, also referred to as science practices, open-ended questions in case-based questionnaires were developed. The investigated scientific thinking skills were question posing, explanation, graphing, inquiry, and metacognition. Analysis of the students' responses revealed that gifted girls who entered the programmes through AA performed at the same level as the other gifted students. We found significant differences between the three research groups in question posing and graphing skills. We suggest increasing gender-fairness by revising the standard national testing system to include case-based narratives followed by open-ended questions that assess gifted students' scientific thinking skills. This may diminish the gender inequity expressed by the different numbers of girls and boys accepted to the gifted programmes. We show that open-ended tools for analysing students' scientific thinking might better serve both research and practice by identifying gifted girls and boys equally well.

  4. Polishing parameter optimization for end-surface of chalcogenide glass fiber connector

    NASA Astrophysics Data System (ADS)

    Guo, Fangxia; Dai, Shixun; Tang, Junzhou; Wang, Xunsi; Li, Xing; Xu, Yinsheng; Wu, Yuehao; Liu, Zijun

    2017-11-01

    In this paper we investigate optimized parameters for polishing the end surface of a chalcogenide glass fiber connector. Six SiC abrasive particle sizes were used to polish the fiber, in order from large to small. We analyzed the effects of polishing parameters such as particle size, grinding speed and polishing duration on the quality of the fiber end surface, and determined the optimal polishing parameters. We found that a high-quality fiber end surface can be achieved using only three different SiC abrasives. The surface roughness of the final ChG fiber end surface is about 48 nm, without any scratches, spots or cracks. Such a polishing process could reduce the average insertion loss of the connector to about 3.4 dB.

  5. The Scientific Field during Argentina's Latest Military Dictatorship (1976-1983): Contraction of Public Universities and Expansion of the National Council for Scientific and Technological Research (CONICET)

    ERIC Educational Resources Information Center

    Bekerman, Fabiana

    2013-01-01

    This study looks at some of the traits that characterized Argentina's scientific and university policies under the military regime that spanned from 1976 through 1983. To this end, it delves into a rarely explored empirical observation: financial resource transfers from national universities to the National Scientific and Technological Research…

  6. Optimel: Software for selecting the optimal method

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Popov, Boris; Romanov, Dmitry; Evseeva, Marina

    Optimel is software that automates the process of selecting a solution method from the domain of optimization methods. Optimel offers practical novelty: it saves time and money when conducting exploratory studies whose objective is to select the most appropriate method for solving an optimization problem. Optimel also offers theoretical novelty, because a new method of knowledge structuring was used to obtain the domain. The Optimel domain covers an extended set of methods and their properties, which makes it possible to identify the level of scientific study, enhance the user's expertise, expand the prospects open to the user, and open up new research objectives. Optimel can be used both in scientific research institutes and in educational institutions.

  7. The Construction of a Reasoned Explanation of a Health Phenomenon: An Analysis of Competencies Mobilized

    ERIC Educational Resources Information Center

    Faria, Cláudia; Freire, Sofia; Baptista, Mónica; Galvão, Cecília

    2014-01-01

    Mobilizing scientific knowledge to understand the natural world, critically appraise socio-scientific situations, and make decisions is a key competency for today's society. Therefore, it is essential to understand how students at the end of compulsory schooling use scientific knowledge to understand the surrounding world. The…

  8. The Scientification of Skin Whitening and the Entrepreneurial University-Linked Corporate Scientific Officer

    ERIC Educational Resources Information Center

    Mire, Amina

    2012-01-01

    This work examines the interlocking strategies of scientific entrepreneurialism and academic capitalism in cutting-edge innovations in molecular biology, biomedicine, and other life sciences deployed in research and the development of high-end skin whitening and anti-aging cosmeceuticals. Skin whitening products and anti-aging cosmeceuticals are…

  9. ID16B: a hard X-ray nanoprobe beamline at the ESRF for nano-analysis

    PubMed Central

    Martínez-Criado, Gema; Villanova, Julie; Tucoulou, Rémi; Salomon, Damien; Suuronen, Jussi-Petteri; Labouré, Sylvain; Guilloud, Cyril; Valls, Valentin; Barrett, Raymond; Gagliardini, Eric; Dabin, Yves; Baker, Robert; Bohic, Sylvain; Cohen, Cédric; Morse, John

    2016-01-01

    Within the framework of the ESRF Phase I Upgrade Programme, a new state-of-the-art synchrotron beamline ID16B has been recently developed for hard X-ray nano-analysis. The construction of ID16B was driven by research areas with major scientific and societal impact such as nanotechnology, earth and environmental sciences, and bio-medical research. Based on a canted undulator source, this long beamline provides hard X-ray nanobeams optimized mainly for spectroscopic applications, including the combination of X-ray fluorescence, X-ray diffraction, X-ray excited optical luminescence, X-ray absorption spectroscopy and 2D/3D X-ray imaging techniques. Its end-station re-uses part of the apparatus of the earlier ID22 beamline, while improving and enlarging the spectroscopic capabilities: for example, the experimental arrangement offers improved lateral spatial resolution (∼50 nm), a larger and more flexible capability for in situ experiments, and monochromatic nanobeams tunable over a wider energy range which now includes the hard X-ray regime (5–70 keV). This paper describes the characteristics of this new facility, short-term technical developments and the first scientific results. PMID:26698084

  10. NOAA draft scientific integrity policy: Comment period open through 20 August

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-08-01

    The National Oceanic and Atmospheric Administration (NOAA) is aiming to finalize its draft scientific integrity policy possibly by the end of the year, Larry Robinson, NOAA assistant secretary for conservation and management, indicated during a 28 July teleconference. The policy “is key to fostering an environment where science is encouraged, nurtured, respected, rewarded, and protected,” Robinson said, adding that the agency's comment period for the draft policy, which was released on 16 June, ends on 20 August. “Science underpins all that NOAA does. This policy is one piece of a broader effort to strengthen NOAA science,” Robinson said, noting that the draft “represents the first ever scientific integrity policy for NOAA. Previously, our policy only addressed research misconduct and focused on external grants. What's new about this policy is that it establishes NOAA's principles for scientific integrity, a scientific code of conduct, and a code of ethics for science supervision and management.”

  11. Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to Defence Transformation

    DTIC Science & Technology

    2005-04-01

    İpekkan, Z.; Özkil, A. (2005). Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to Defence Transformation. RTO-MP-SAS-055.

  12. Implementation of evidence-based treatment for schizophrenic disorders: two-year outcome of an international field trial of optimal treatment

    PubMed Central

    Falloon, Ian RH; Montero, Isabel; Sungur, Mehmet; Mastroeni, Antonino; Malm, Ulf; Economou, Marina; Grawe, Rolf; Harangozo, Judit; Mizuno, Masafumi; Murakami, Masaaki; Hager, Bert; Held, Tilo; Veltro, Franco; Gedye, Robyn

    2004-01-01

    According to the clinical trials literature, every person with a schizophrenic disorder should be provided with the combination of optimal dose antipsychotics, strategies to educate himself and his carers to cope more efficiently with environmental stresses, cognitive-behavioural strategies to enhance work and social goals and to reduce residual symptoms, and assertive home-based management to help prevent and resolve major social needs and crises, including recurrent episodes of symptoms. Despite strong scientific support for the routine implementation of these 'evidence-based' strategies, few services provide more than the pharmacotherapy component, and even this is seldom applied in the manner associated with the best results in the clinical trials. An international collaborative group, the Optimal Treatment Project (OTP), has been developed to promote the routine use of evidence-based strategies for schizophrenic disorders. A field trial was started to evaluate the benefits and costs of applying evidence-based strategies over a 5-year period. Centres have been set up in 18 countries. This paper summarises the outcome after 24 months of 'optimal' treatment in 603 cases who had reached this stage in their treatment by the end of 2002. On all measures the evidence-based OTP approach achieved more than double the benefits associated with current best practices. One half of recent cases had achieved full recovery from clinical and social morbidity. These advantages were even more striking in centres where a random-control design was used. PMID:16633471

  13. Implementation of evidence-based treatment for schizophrenic disorders: two-year outcome of an international field trial of optimal treatment.

    PubMed

    Falloon, Ian R H; Montero, Isabel; Sungur, Mehmet; Mastroeni, Antonino; Malm, Ulf; Economou, Marina; Grawe, Rolf; Harangozo, Judit; Mizuno, Masafumi; Murakami, Masaaki; Hager, Bert; Held, Tilo; Veltro, Franco; Gedye, Robyn

    2004-06-01

    According to the clinical trials literature, every person with a schizophrenic disorder should be provided with the combination of optimal dose antipsychotics, strategies to educate himself and his carers to cope more efficiently with environmental stresses, cognitive-behavioural strategies to enhance work and social goals and to reduce residual symptoms, and assertive home-based management to help prevent and resolve major social needs and crises, including recurrent episodes of symptoms. Despite strong scientific support for the routine implementation of these 'evidence-based' strategies, few services provide more than the pharmacotherapy component, and even this is seldom applied in the manner associated with the best results in the clinical trials. An international collaborative group, the Optimal Treatment Project (OTP), has been developed to promote the routine use of evidence-based strategies for schizophrenic disorders. A field trial was started to evaluate the benefits and costs of applying evidence-based strategies over a 5-year period. Centres have been set up in 18 countries. This paper summarises the outcome after 24 months of 'optimal' treatment in 603 cases who had reached this stage in their treatment by the end of 2002. On all measures the evidence-based OTP approach achieved more than double the benefits associated with current best practices. One half of recent cases had achieved full recovery from clinical and social morbidity. These advantages were even more striking in centres where a random-control design was used.

  14. Airborne Cloud Computing Environment (ACCE)

    NASA Technical Reports Server (NTRS)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

    Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in Earth observation.

  15. View from Silicon Valley: Maximizing the Scientific Impact of Global Brain Initiatives through Entrepreneurship.

    PubMed

    Joshi, Pushkar S; Ghosh, Kunal K

    2016-11-02

    In this era of technology-driven global neuroscience initiatives, the role of the neurotechnology industry remains woefully ambiguous. Here, we explain why industry is essential to the success of these global initiatives, and how it can maximize the scientific impact of these efforts by (1) scaling and ultimately democratizing access to breakthrough neurotechnologies, and (2) commercializing technologies as part of integrated, end-to-end solutions that accelerate neuroscientific discovery. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. The challenge of changing the inactivated poliomyelitis vaccine in Latin America: declaration of the Latin American Society of Pediatric Infectious Diseases (SLIPE).

    PubMed

    Falleiros-Arlant, Luiza Helena; Avila-Agüero, María Luisa; Brea del Castillo, José; Mariño, Cristina

    2014-10-01

    Even though we have already covered 99% of the path to eradicating poliomyelitis from the world, this disease is still causing paralysis in children. Its eradication means not only the end of wild poliovirus circulation, but of vaccine-derived poliovirus circulation as well. Taking into account factors such as current epidemiological data, the adverse events of the attenuated oral poliomyelitis vaccine (OPV), the availability of an injectable inactivated vaccine (IPV) without the potential for causing the severe adverse events of the oral vaccine, the efficacy and effectiveness of IPV in several countries of the world where it has been used for several years, and the rationale for changing the vaccination schedule in different Latin American countries, the Latin American Society of Pediatric Infectious Diseases (SLIPE) announces by this Declaration its recommendation of switching to IPV in Latin America, with an Action Plan for the 2014-2015 period regarding polio vaccination policies in Latin America: 1. The optimal proposed schedule consists of four IPV doses (three doses in the primary schedule plus a booster dose), whether or not IPV is combined with other indicated vaccines in the immunization program of the country; during the OPV-to-IPV transition phase, an alternative schedule is acceptable. 2. Countries should set optimal strategies in order to maintain and improve vaccination coverage, and implement a nominal immunization registry. 3. Improving the epidemiological surveillance of Acute Flaccid Paralysis (AFP) and setting up an environmental surveillance program. 4. Setting up strategies for introducing IPV in National Immunization Programs, such as communicating properly with the population, among others. 5. Bringing scientific societies closer to decision makers. 6. Ensuring optimal supply and prices for IPV introduction. 7. Training vaccination teams. 8. Enhancing the distribution and storage logistics of vaccines. In addition to the scientific evidence, the countries that have not yet decided to switch to IPV should consider the implications of equity and social justice.

  17. Evaluating Scientific Misconceptions and Scientific Literacy in a General Science Course

    NASA Astrophysics Data System (ADS)

    Courtier, A. M.; Scott, T. J.

    2009-12-01

    The data used in this study were collected as part of the course assignments for General Education Science (GSci) 101: “Physics, Chemistry, and the Human Experience” at James Madison University. The course covers the basic principles of physics, chemistry, and astronomy. The primary goals of this study were to analyze student responses to general scientific questions, to identify scientific misconceptions, and to evaluate scientific literacy by comparing responses collected from different groups of students and from questions given during the course versus at the end of the course. While this project is focused on general scientific concepts, the misconceptions and patterns identified are particularly relevant for improving pedagogy in the geosciences as this field relies on multidisciplinary knowledge of fundamental physics, chemistry, and astronomy. We discuss differences in the results between the disciplines of physics, chemistry, and astronomy and their implications for general geology education and literacy, emphasizing the following questions: (a) What do students typically get wrong? (b) Did the overall scientific literacy of the students increase throughout the semester? Are the concepts discussed in answers provided at the end of class more accurate than those provided during class? (c) How do the before- and after- class responses change with respect to language and terminology? Did the students use more scientific terminology? Did the students use scientific terminology correctly?

  18. CARMENES: an overview six months after first light

    NASA Astrophysics Data System (ADS)

    Quirrenbach, A.; Amado, P. J.; Caballero, J. A.; Mundt, R.; Reiners, A.; Ribas, I.; Seifert, W.; Abril, M.; Aceituno, J.; Alonso-Floriano, F. J.; Anwand-Heerwart, H.; Azzaro, M.; Bauer, F.; Barrado, D.; Becerril, S.; Bejar, V. J. S.; Benitez, D.; Berdinas, Z. M.; Brinkmöller, M.; Cardenas, M. C.; Casal, E.; Claret, A.; Colomé, J.; Cortes-Contreras, M.; Czesla, S.; Doellinger, M.; Dreizler, S.; Feiz, C.; Fernandez, M.; Ferro, I. M.; Fuhrmeister, B.; Galadi, D.; Gallardo, I.; Gálvez-Ortiz, M. C.; Garcia-Piquer, A.; Garrido, R.; Gesa, L.; Gómez Galera, V.; González Hernández, J. I.; Gonzalez Peinado, R.; Grözinger, U.; Guàrdia, J.; Guenther, E. W.; de Guindos, E.; Hagen, H.-J.; Hatzes, A. P.; Hauschildt, P. H.; Helmling, J.; Henning, T.; Hermann, D.; Hernández Arabi, R.; Hernández Castaño, L.; Hernández Hernando, F.; Herrero, E.; Huber, A.; Huber, K. F.; Huke, P.; Jeffers, S. V.; de Juan, E.; Kaminski, A.; Kehr, M.; Kim, M.; Klein, R.; Klüter, J.; Kürster, M.; Lafarga, M.; Lara, L. M.; Lamert, A.; Laun, W.; Launhardt, R.; Lemke, U.; Lenzen, R.; Llamas, M.; Lopez del Fresno, M.; López-Puertas, M.; López-Santiago, J.; Lopez Salas, J. F.; Magan Madinabeitia, H.; Mall, U.; Mandel, H.; Mancini, L.; Marin Molina, J. A.; Maroto Fernández, D.; Martín, E. L.; Martín-Ruiz, S.; Marvin, C.; Mathar, R. J.; Mirabet, E.; Montes, D.; Morales, J. C.; Morales Muñoz, R.; Nagel, E.; Naranjo, V.; Nowak, G.; Palle, E.; Panduro, J.; Passegger, V. M.; Pavlov, A.; Pedraz, S.; Perez, E.; Pérez-Medialdea, D.; Perger, M.; Pluto, M.; Ramón, A.; Rebolo, R.; Redondo, P.; Reffert, S.; Reinhart, S.; Rhode, P.; Rix, H.-W.; Rodler, F.; Rodríguez, E.; Rodríguez López, C.; Rohloff, R. R.; Rosich, A.; Sanchez Carrasco, M. A.; Sanz-Forcada, J.; Sarkis, P.; Sarmiento, L. F.; Schäfer, S.; Schiller, J.; Schmidt, C.; Schmitt, J. H. M. M.; Schöfer, P.; Schweitzer, A.; Shulyak, D.; Solano, E.; Stahl, O.; Storz, C.; Tabernero, H. M.; Tala, M.; Tal-Or, L.; Ulbrich, R.-G.; Veredas, G.; Vico Linares, J. I.; Vilardell, F.; Wagner, K.; Winkler, J.; Zapatero Osorio, M.-R.; Zechmeister, M.; Ammler-von Eiff, M.; Anglada-Escudé, G.; del Burgo, C.; Garcia-Vargas, M. L.; Klutsch, A.; Lizon, J.-L.; Lopez-Morales, M.; Ofir, A.; Pérez-Calpena, A.; Perryman, M. A. C.; Sánchez-Blanco, E.; Strachan, J. B. P.; Stürmer, J.; Suárez, J. C.; Trifonov, T.; Tulloch, S. M.; Xu, W.

    2016-08-01

    The CARMENES instrument is a pair of high-resolution (R> 80,000) spectrographs covering the wavelength range from 0.52 to 1.71 μm, optimized for precise radial velocity measurements. It was installed and commissioned at the 3.5m telescope of the Calar Alto observatory in Southern Spain in 2015. The first large science program of CARMENES is a survey of 300 M dwarfs, which started on Jan 1, 2016. We present an overview of all subsystems of CARMENES (front end, fiber system, visible-light spectrograph, near-infrared spectrograph, calibration units, etalons, facility control, interlock system, instrument control system, data reduction pipeline, data flow, and archive), and give an overview of the assembly, integration, verification, and commissioning phases of the project. We show initial results and discuss further plans for the scientific use of CARMENES.

  19. Distributed cooperative regulation for multiagent systems and its applications to power systems: a survey.

    PubMed

    Hu, Jianqiang; Li, Yaping; Yong, Taiyou; Cao, Jinde; Yu, Jie; Mao, Wenbo

    2014-01-01

    Cooperative regulation of multiagent systems has become an active research area in the past decade. This paper reviews recent progress in distributed coordination control for leader-following multiagent systems and its applications in power systems, focusing mainly on cooperative tracking control in terms of consensus tracking control and containment tracking control. Next, methods for ranking the network nodes are summarized for undirected/directed networks; based on these rankings, one can determine which followers should be connected to leaders so that a subset of the followers can perceive the leaders' information. Furthermore, we present a survey of the most relevant scientific studies investigating regulation and optimization problems in power systems based on distributed strategies. Finally, some potential applications to the frequency tracking regulation of smart grids are discussed at the end of the paper.
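
    As a concrete illustration of the consensus tracking problem surveyed here, the sketch below simulates first-order leader-following consensus on a fixed, undirected follower graph in which only one follower perceives the leader. The dynamics, gains and topology are illustrative choices, not taken from the paper, which covers far more general settings.

    import numpy as np

    A = np.array([[0., 1., 0., 0.],   # follower-to-follower adjacency (a path graph)
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]])
    b = np.array([1., 0., 0., 0.])    # only follower 0 perceives the leader

    x = np.array([2.0, -1.0, 0.5, 3.0])   # initial follower states
    leader, eps = 1.0, 0.2                # constant leader state, step size

    for _ in range(200):
        # u_i = sum_j a_ij (x_j - x_i) + b_i (leader - x_i)
        u = A @ x - A.sum(axis=1) * x + b * (leader - x)
        x = x + eps * u

    print(x)   # all followers converge to the leader's state, 1.0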

  20. Distributed Cooperative Regulation for Multiagent Systems and Its Applications to Power Systems: A Survey

    PubMed Central

    Li, Yaping; Yong, Taiyou; Yu, Jie; Mao, Wenbo

    2014-01-01

    Cooperative regulation of multiagent systems has become an active research area in the past decade. This paper reviews recent progress in distributed coordination control for leader-following multiagent systems and its applications in power systems, focusing mainly on cooperative tracking control in terms of consensus tracking control and containment tracking control. Next, methods for ranking the network nodes are summarized for undirected/directed networks; based on these rankings, one can determine which followers should be connected to leaders so that a subset of the followers can perceive the leaders' information. Furthermore, we present a survey of the most relevant scientific studies investigating regulation and optimization problems in power systems based on distributed strategies. Finally, some potential applications to the frequency tracking regulation of smart grids are discussed at the end of the paper. PMID:25243199

  1. A review on the mechanical design elements of ankle rehabilitation robot.

    PubMed

    Khalid, Yusuf M; Gouwanda, Darwin; Parasuraman, Subramanian

    2015-06-01

    Ankle rehabilitation robots are developed to enhance ankle strength, flexibility and proprioception after injury and to promote motor learning and ankle plasticity in patients with drop foot. This article reviews the design elements that have been incorporated into existing robots, for example, backdrivability, safety measures and type of actuation. It also discusses numerous challenges faced by engineers in designing these robots, including robot stability and dynamic characteristics, universal evaluation criteria to assess end-user comfort, safety and training performance, and the scientific basis for optimal rehabilitation strategies to improve ankle condition. This article can serve as a reference for designing robots with better stability and dynamic characteristics and good safety measures against internal and external events. It can also serve as a guideline for engineers to report their designs and findings. © IMechE 2015.

  2. Exploring the Assessment of and Relationship between Elementary Students' Scientific Creativity and Science Inquiry

    ERIC Educational Resources Information Center

    Yang, Kuay-Keng; Lin, Shu-Fen; Hong, Zuway-R; Lin, Huann-shyang

    2016-01-01

    The purposes of this study were to (a) develop and validate instruments to assess elementary students' scientific creativity and science inquiry, (b) investigate the relationship between the two competencies, and (c) compare the two competencies among different grade level students. The scientific creativity test was composed of 7 open-ended items…

  3. Balancing the Pros and Cons of GMOs: Socio-Scientific Argumentation in Pre-Service Teacher Education

    ERIC Educational Resources Information Center

    Cinici, Ayhan

    2016-01-01

    This study investigates the role of the discursive process in the act of scientific knowledge building. Specifically, it links scientific knowledge building to risk perception of Genetically Modified Organisms (GMOs). To this end, this study designed and implemented a three-stage argumentation programme giving pre-service teachers (PSTs) the…

  4. 75 FR 33627 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-14

    ... Committee: Center for Scientific Review Special Emphasis Panel; Oral Microbiology, Immunology, Cell Biology... Review Group; NeuroAIDS and other End-Organ Diseases Study Section. Date: July 16, 2010. Time: 8 a.m. to...

  5. 76 FR 8751 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-15

    ... Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Skeletal Muscle Biology... Review Group; NeuroAIDS and other End-Organ Diseases Study Section. Date: March 21, 2011. Time: 8 a.m. to...

  6. Optimization of magnet end-winding geometry

    NASA Astrophysics Data System (ADS)

    Reusch, Michael F.; Weissenburger, Donald W.; Nearing, James C.

    1994-03-01

    A simple, almost entirely analytic, method for the optimization of stress-reduced magnet-end winding paths for ribbon-like superconducting cable is presented. This technique is based on characterization of these paths as developable surfaces, i.e., surfaces whose intrinsic geometry is flat. The method is applicable to winding mandrels of arbitrary geometry. Computational searches for optimal winding paths are easily implemented via the technique. Its application to the end configuration of cylindrical Superconducting Super Collider (SSC)-type magnets is discussed. The method may be useful for other engineering problems involving the placement of thin sheets of material.
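
    For reference, the geometric fact the method rests on can be stated compactly. If the winding surface traced by the ribbon is written as a ruled surface, developability (flat intrinsic geometry, hence no in-plane stretching of the cable) is equivalent to a vanishing Gaussian curvature, i.e. a vanishing scalar triple product along the ruling:

    \[
      x(u,v) = c(u) + v\,r(u), \qquad
      K = 0
      \;\Longleftrightarrow\;
      \det\bigl(c'(u),\, r(u),\, r'(u)\bigr) = 0 .
    \]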

  7. Present State of Knowledge of the Upper Atmosphere 1996: An Assessment Report to Congress and the Environmental Protection Agency

    NASA Technical Reports Server (NTRS)

    Kurylo, M. J.; Kaye, J. A.; Decola, P. L.; Friedl, R. R.; Peterson, D. B.

    1997-01-01

    This document is issued in response to the Clean Air Act Amendment of 1990, Public Law 101-549, which mandates that the National Aeronautics and Space Administration (NASA) and other key agencies submit triennial reports to Congress and the Environmental Protection Agency. NASA is charged with the responsibility to report on the state of our knowledge of the Earth's upper atmosphere, particularly the stratosphere. Part 1 of this report summarizes the objectives, status, and accomplishments of the research tasks supported under NASA's Upper Atmosphere Research Program and Atmospheric Chemistry Modeling and Analysis Program for the period 1994-1996. Part 2 (this document) presents summaries of several scientific assessments, reviews, and summaries. These include the executive summaries of two scientific assessments: (Section B) 'Scientific Assessment of Ozone Depletion: 1994' and (Section C) '1995 Scientific Assessment of the Atmospheric Effects of Stratospheric Aircraft'; end-of-mission/series statements for three stratospherically focused measurement campaigns: (Section D) 'ATLAS End-of-Series Statement', (Section E) 'ASHOE/MAESA End-of-Mission Statement', and (Section F) 'TOTE/VOTE End-of-Mission Statement'; (Section G) a summary of NASA's latest biennial review of fundamental photochemical processes important to atmospheric chemistry, 'Chemical Kinetics and Photochemical Data for Use in Stratospheric Modeling'; and (Section H) the section 'Atmospheric Ozone Research' from the Mission to Planet Earth Science Research Plan, which describes NASA's current and future research activities related to both tropospheric and stratospheric chemistry.

  8. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    NASA Astrophysics Data System (ADS)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high-resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved mega-thrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.
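
    The clustered local-time-stepping idea can be illustrated schematically: elements are binned into clusters whose time steps are power-of-two multiples of the global minimum, so a few small elements no longer force the whole mesh onto the smallest step. The sketch below is a rate-2 binning under assumed per-element stable time steps, not SeisSol's actual implementation.

    import numpy as np

    np.random.seed(0)
    dt_elem = np.random.uniform(1e-4, 5e-2, size=10_000)  # stable per-element time steps
    dt_min = dt_elem.min()

    # Bin each element into the largest power-of-two multiple of dt_min
    # that does not exceed its own stable time step.
    cluster = np.floor(np.log2(dt_elem / dt_min)).astype(int)
    dt_cluster = dt_min * 2.0**cluster

    # Over one coarsest step, a cluster with dt = 2^k * dt_min updates
    # 2^(k_max - k) times; global stepping would update every element
    # 2^(k_max) times, so clustering sharply reduces total element updates.
    updates_lts = np.sum(2.0 ** (cluster.max() - cluster))
    updates_global = dt_elem.size * 2.0 ** cluster.max()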

  9. Do sufficient vitamin D levels at the end of summer in children and adolescents provide an assurance of vitamin D sufficiency at the end of winter? A cohort study.

    PubMed

    Shakeri, Habibesadat; Pournaghi, Seyed-Javad; Hashemi, Javad; Mohammad-Zadeh, Mohammad; Akaberi, Arash

    2017-10-26

    The changes in serum 25-hydroxyvitamin D (25(OH)D) in adolescents from summer to winter, and the optimal serum vitamin D levels in the summer needed to ensure adequate vitamin D levels at the end of winter, are currently unknown. This study was conducted to address this knowledge gap. The study was conducted as a cohort study. Sixty-eight participants aged 7-18 years who had sufficient vitamin D levels at the end of the summer in 2011 were selected using stratified random sampling. Subsequently, the participants' vitamin D levels were measured at the end of the winter in 2012. A receiver operating characteristic (ROC) curve was used to determine optimal cutoff points for vitamin D at the end of the summer to predict sufficient vitamin D levels at the end of the winter. The results indicated that 89.7% of all the participants had a decrease in vitamin D levels from summer to winter: 14.7% of them were vitamin D-deficient, 36.8% had insufficient vitamin D concentrations and only 48.5% were able to maintain sufficient vitamin D. The optimal cutoff point to provide assurance of sufficient serum vitamin D at the end of the winter was 40 ng/mL at the end of the summer. Sex, age and vitamin D levels at the end of the summer were significant predictors of non-sufficient vitamin D at the end of the winter. In this age group, a dramatic reduction in vitamin D was observed over the follow-up period. Sufficient vitamin D at the end of the summer did not guarantee vitamin D sufficiency at the end of the winter. We found 40 ng/mL to be the optimal cutoff point.
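
    A sketch of deriving an optimal cutoff from a ROC curve, using the Youden index as the criterion (the paper's exact criterion is not stated here, so this is one common choice); the data values are invented.

    import numpy as np
    from sklearn.metrics import roc_curve

    # Hypothetical data: summer 25(OH)D (ng/mL) and winter sufficiency (1 = sufficient).
    summer_25ohd = np.array([25, 32, 38, 41, 45, 50, 28, 36, 43, 52], dtype=float)
    sufficient_winter = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 1])

    fpr, tpr, thresholds = roc_curve(sufficient_winter, summer_25ohd)
    best = thresholds[np.argmax(tpr - fpr)]   # Youden index: max(sensitivity + specificity - 1)
    print(f"optimal summer cutoff: {best:.1f} ng/mL")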

  10. Free-time and fixed end-point optimal control theory in dissipative media: application to entanglement generation and maintenance.

    PubMed

    Mishima, K; Yamashita, K

    2009-07-07

    We develop monotonically convergent free-time and fixed end-point optimal control theory (OCT) in the density-matrix representation to deal with quantum systems showing dissipation. Our theory is more general and flexible for tailoring optimal laser pulses to control quantum dynamics with dissipation than the conventional fixed-time and fixed end-point OCT, in that the optimal temporal duration of the laser pulses can also be optimized exactly. To show the usefulness of our theory, it is applied to the generation and maintenance of the vibrational entanglement of carbon monoxide adsorbed on the copper (100) surface, CO/Cu(100). We demonstrate the numerical results and clarify how to combat vibrational decoherence as much as possible through the tailored shapes of the optimal laser pulses. We expect our theory to be general enough to apply to a variety of dissipative quantum dynamics systems, because decoherence is one of the quantum phenomena sensitive to the temporal duration of the quantum dynamics.

  11. Optimization of the arthroscopic indentation instrument for the measurement of thin cartilage stiffness

    NASA Astrophysics Data System (ADS)

    Lyyra-Laitinen, Tiina; Niinimäki, Mia; Töyräs, Juha; Lindgren, Reijo; Kiviranta, Ilkka; Jurvelin, Jukka S.

    1999-10-01

    Structural alterations associated with early, mostly reversible, degeneration of articular cartilage induce tissue softening, generally preceding fibrillation and, thus, visible changes of the cartilage surface. We have already developed an indentation instrument for measuring arthroscopic stiffness of cartilage with typical thickness >2 mm. The aim of this study was to extend the applicability of the instrument for the measurement of thin (<2 mm) cartilage stiffness. Variations in cartilage thickness, which will not be known during arthroscopy, can nonetheless affect the indentation measurement, and therefore optimization of the indenter dimensions is necessary. First, we used theoretical and finite element models to compare plane-ended and spherical-ended indenters and, then, altered the dimensions to determine the optimal indenter for thin cartilage measurements. Finally, we experimentally validated the optimized indenter using bovine humeral head cartilage. Reference unconfined compression measurements were carried out with a material testing device. The spherical-ended indenter was more insensitive to the alterations in cartilage thickness (20% versus 39% in the thickness range 1.5-5 mm) than the plane-ended indenter. For thin cartilage, the optimal dimensions for the spherical-ended indenter were 0.5 mm for diameter and 0.1 mm for height. The experimental stiffness measurements with this indenter correlated well with the reference measurements (r = 0.811, n = 31, p<0.0001) in the cartilage thickness range 0.7-1.8 mm. We conclude that the optimized indenter is reliable and well suited for the measurement of thin cartilage stiffness.

  12. A new U.S.-Canada Collaboration to build SWOT Calibration/Validation and Science Capacity for Northern Rivers and Wetlands

    NASA Astrophysics Data System (ADS)

    Smith, L. C.; Gleason, C. J.; Pietroniro, A.; Fiset, J. M.

    2016-12-01

    The NASA/CNES/CSA Surface Water and Ocean Topography (SWOT) satellite mission holds strong promise to be a transformational mission for land surface hydrology, in much the same way that conventional radar altimetry transformed physical oceanography following the launch of Seasat in 1978. However, to achieve this potential, key pre-launch tasks remain, including 1) establishing benchmark monitoring sites, standardized measurement protocols, and international partnerships for quality calibration/validation of SWOT hydrology products; 2) demonstrating that SWOT inundation area mapping for rivers, lakes, and wetlands is feasible; 3) demonstrating that quality SWOT discharge retrievals for large rivers are feasible; and 4) demonstrating exciting new science from SWOT-like measurements. To these ends, we present a new U.S.-Canada partnership to establish new SWOT calibration/validation sites, collect unique "SWOT-like" field and remote sensing datasets, conduct phenomenology studies of potentially important impacts (vegetation, sedimentary deposits, ice, and wind) on SWOT backscatter and water surface elevation (WSE) retrievals, and gain scientific knowledge of the impact of permafrost on the form, hydraulics, and water surface elevations of northern rivers and lakes. This U.S.-Canada partnership will establish scientifically interesting calibration/validation sites along three to four major Canadian rivers (current candidates: Saskatchewan, Athabasca, Arctic Red, Slave/Peace, or Ottawa Rivers). Field sites will be selected to optimize scientific impact, logistics, and location inside the nominal planned orbits of the SWOT Fast Sampling Phase.

  13. Arguments for amending smoke-free legislation in U.S. states to restrict use of electronic nicotine delivery systems.

    PubMed

    Phan, Tiffany M; Bianco, Cezanne A; Nikitin, Dmitriy; Timberlake, David S

    2018-03-01

    The uneven diffusion of local and state laws restricting the use of electronic nicotine delivery systems (ENDS) in the United States may be a function of inconclusive scientific evidence and lack of guidance from the federal government. The objective of this study was to assess whether the rationale for amending clean indoor air acts (CIAAs) is being conflated by issues that are not directly relevant to protecting the health of ENDS non-users. Online sources were used in identifying bills (n = 25) that were presented in U.S. state legislatures from January 2009 to December 2015. The bills were categorized into one of three groups: 1) bills amending comprehensive CIAAs (n = 11), 2) bills prohibiting use of ENDS in places frequented by youth (n = 5), and 3) remaining bills that varied between the two categories (n = 9). Arguments presented in committee hearings were coded as scientific, public health, economic, enforcement, freedom, or regulatory. Arguments pertaining to amendment of clean indoor air acts spanned several categories, many of which were not directly relevant to the aims of the legislation. This finding could assist lawmakers and expert witnesses in making arguments that yield greater success in amending legislation. Alternatively, inconclusive scientific data on the hazards of ENDS aerosols might encourage lawmakers to propose legislation that prohibits ENDS use in places frequented by youths.

  14. The European Registry for Patients with Mechanical Circulatory Support (EUROMACS): first annual report.

    PubMed

    de By, Theo M M H; Mohacsi, Paul; Gummert, Jan; Bushnaq, Hasan; Krabatsch, Thomas; Gustafsson, Finn; Leprince, Pascal; Martinelli, Luigi; Meyns, Bart; Morshuis, Michiel; Netuka, Ivan; Potapov, Evgenij; Zittermann, Armin; Delmo Walter, Eva Maria; Hetzer, Roland

    2015-05-01

    The European Registry for Patients with Mechanical Circulatory Support (EUROMACS) was founded on 10 December 2009 on the initiative of Roland Hetzer (Deutsches Herzzentrum Berlin, Berlin, Germany) and Jan Gummert (Herz- und Diabeteszentrum Nordrhein-Westfalen, Bad Oeynhausen, Germany), with 15 other founding international members. It aims to promote scientific research to improve the care of end-stage heart failure patients with a ventricular assist device or a total artificial heart as long-term mechanical circulatory support. Likewise, the organization aims to provide and maintain a registry of device implantation data and long-term follow-up of patients with mechanical circulatory support. Hence, EUROMACS affiliated itself with Dendrite Clinical Systems Ltd to offer its members a software tool that allows input and analysis of patient clinical data on a daily basis. EUROMACS facilitates further scientific studies by offering research groups access to any available data, wherein patients and centres are anonymized. Furthermore, EUROMACS aims to stimulate cooperation with clinical and research institutions and with the peer associations involved. EUROMACS is the only European-based Registry for Patients with Mechanical Circulatory Support, with a rapidly increasing institutional and individual membership. Because of the expeditious data input, the European Association for Cardiothoracic Surgeons saw the need to optimize the data availability and the significance of the registry to improve the care of patients with mechanical circulatory support and its potential contribution to scientific intents; hence the beginning of their alliance in 2012. This first annual report is designed to provide an overview of EUROMACS' structure, its activities, a first data collection and an insight into its scientific contributions. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  15. Implications of chronic kidney disease for dietary treatment in cardiovascular disease.

    PubMed

    Packard, Diane P; Milton, Joan E; Shuler, Lynn A; Short, Robert A; Tuttle, Katherine R

    2006-07-01

    Chronic kidney disease (CKD) often accompanies cardiovascular disease (CVD). Trends foretelling a greater burden of CKD and CVD are largely a result of increasing frequencies of obesity, hypertension, and diabetes. Nutritional therapy occupies a critical role in reducing risk factors and preventing progressive damage to the kidneys and heart. Nutritional assessment and treatment should take into account both health concerns. This review examines several diet components and eating styles for efficacy in the treatment of these conditions. A variety of dietary regimens claim to provide health benefits, but rigorous scientific validation of long-term efficacy is frequently lacking. An urgent need exists for eating styles that reduce risk of chronic diseases and that are acceptable and achievable in free-living populations. We describe our ongoing study, a randomized controlled trial comparing the American Heart Association Step II diet and a Mediterranean diet, in survivors of a first myocardial infarction. The primary end point is a composite of mortality and major CVD events. Because many in this population have CKD, indicators of kidney damage and function are prespecified secondary end points. Results of this trial should provide insight into optimal dietary interventions for persons with both CVD and CKD.

  16. Remote observations of reentering spacecraft including the space shuttle orbiter

    NASA Astrophysics Data System (ADS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David M.

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  17. Remote Observations of Reentering Spacecraft Including the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David

    2013-01-01

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  18. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. This study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of student open-ended responses, open-ended items become a valid and reliable tool to assess students' knowledge integration ability.
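
    For reference, the Rasch Partial Credit Model underlying this analysis assigns student j, with ability θ_j, the following probability of reaching score x on item i with step difficulties δ_ik (the empty sum for x = 0 is taken as zero):

    \[
      P(X_{ij} = x \mid \theta_j)
        = \frac{\exp \sum_{k=1}^{x} (\theta_j - \delta_{ik})}
               {\sum_{h=0}^{m_i} \exp \sum_{k=1}^{h} (\theta_j - \delta_{ik})},
      \qquad x = 0, 1, \dots, m_i .
    \]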

  19. The End of School Reform

    ERIC Educational Resources Information Center

    Berube, Maurice R.; Berube, Clair T.

    2006-01-01

    Education as a major social movement is coming to an end. This book derives its theoretical framework from the ideas of Hegel, who perceived an end to history, and Thomas Kuhn, who theorized that history does not follow a linear path but that the scientific landscape changes through large-scale movements called "paradigm shifts". This book…

  20. Method and apparatus for scientific analysis under low temperature vacuum conditions

    DOEpatents

    Winefordner, James D.; Jones, Bradley T.

    1990-01-01

    A method and apparatus for scientific analysis of a sample under low temperature vacuum conditions uses a vacuum chamber with a conveyor belt disposed therein. One end of the conveyor belt is a cool end in thermal contact with the cold stage of a refrigerator, whereas the other end of the conveyor belt is a warm end spaced from the refrigerator. A septum allows injection of a sample into the vacuum chamber on top of the conveyor belt for spectroscopic or other analysis. The sample freezes on the conveyor belt at the cool end. One or more windows in the vacuum chamber housing allow spectroscopic analysis of the sample. Following the spectroscopic analysis, the conveyor belt may be moved such that the sample moves toward the warm end of the conveyor belt, whereupon it evaporates, thereby cleaning the conveyor belt. Instead of injecting the sample by way of a septum with a syringe and needle, the present device may be used in series with capillary-column gas chromatography or micro-bore high-performance liquid chromatography.

  1. A Technical Survey on Optimization of Processing Geo Distributed Data

    NASA Astrophysics Data System (ADS)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With the growth of cloud services and technology, a growing number of geographically distributed data centers store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, and so on; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication, and the key issues to be dealt with are time efficiency, cost minimization and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, that use techniques such as Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; that cost minimization concentrates on achieving quality of service and reducing computation and communication cost; and that SAGE achieves performance improvement in processing geo-distributed data sets.

  2. A Step Beyond Simple Keyword Searches: Services Enabled by a Full Content Digital Journal Archive

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    2003-01-01

    The problems of managing and searching large archives of scientific journal articles can potentially be addressed through data mining and statistical techniques matured primarily for quantitative scientific data analysis. A journal paper could be represented by a multivariate descriptor, e.g., the occurrence counts of a number of key technical terms or phrases (keywords), perhaps derived from a controlled vocabulary (e.g., the American Meteorological Society's Glossary of Meteorology) or bootstrapped from the journal archive itself. With this technique, conventional statistical classification tools can be leveraged to address challenges faced by both scientists and professional societies in knowledge management. For example, cluster analyses can be used to find bundles of "most-related" papers, and address the issue of journal bifurcation (when is a new journal necessary, and what topics should it encompass). Similarly, neural networks can be trained to predict the optimal journal (within a society's collection) in which a newly submitted paper should be published. Comparable techniques could enable very powerful end-user tools for journal searches, all premised on the view of a paper as a data point in a multidimensional descriptor space, e.g.: "find papers most similar to the one I am reading", "build a personalized subscription service, based on the content of the papers I am interested in, rather than preselected keywords", "find suitable reviewers, based on the content of their own published works", etc. Such services may represent the next "quantum leap" beyond the rudimentary search interfaces currently provided to end-users, as well as a compelling value-added component needed to bridge the print-to-digital-medium gap and help stabilize professional societies' revenue streams during the print-to-digital transition.
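
    The "find papers most similar to this one" service reduces to a nearest-neighbour search in the descriptor space; a minimal sketch with an invented vocabulary and invented keyword counts, using cosine similarity:

    import numpy as np

    vocab = ["convection", "lightning", "radar", "aerosol"]   # toy controlled vocabulary
    papers = {                                                # keyword counts per paper
        "paper_a": np.array([4.0, 9.0, 2.0, 0.0]),
        "paper_b": np.array([3.0, 7.0, 1.0, 0.0]),
        "paper_c": np.array([0.0, 1.0, 0.0, 8.0]),
    }

    def cosine(u, v):
        return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

    query = papers["paper_a"]
    ranked = sorted(((cosine(query, v), name) for name, v in papers.items()
                     if name != "paper_a"), reverse=True)
    print(ranked[0][1])   # "paper_b": the nearest neighbour in descriptor space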

  3. An Integrated Framework for Parameter-based Optimization of Scientific Workflows.

    PubMed

    Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2009-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters, such as the grouping of workflow components and their mapping to machines, do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.

  4. End-point controller design for an experimental two-link flexible manipulator using convex optimization

    NASA Technical Reports Server (NTRS)

    Oakley, Celia M.; Barratt, Craig H.

    1990-01-01

    Recent results in linear controller design are used to design an end-point controller for an experimental two-link flexible manipulator. A nominal 14-state linear-quadratic-Gaussian (LQG) controller was augmented with a 528-tap finite-impulse-response (FIR) filter designed using convex optimization techniques. The resulting 278-state controller produced improved end-point trajectory tracking and disturbance rejection in simulation and experimentally in real time.
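
    The abstract does not give the authors' exact formulation, but FIR design by convex optimization can be illustrated with its simplest convex instance, a least-squares fit of the tap weights to a desired frequency response (the tap count and target response below are illustrative):

      # Convex (least-squares) FIR design: fit N taps to a desired response.
      import numpy as np

      N = 64                                    # taps (the paper used 528)
      w = np.linspace(0, np.pi, 256)            # frequency grid
      desired = np.where(w < 0.3 * np.pi, 1.0, 0.0)   # toy low-pass target

      # A[k, n] = exp(-j * w_k * n); the filter response is A @ h
      A = np.exp(-1j * np.outer(w, np.arange(N)))
      # Stack real and imaginary parts so lstsq works over the reals
      A_ri = np.vstack([A.real, A.imag])
      d_ri = np.concatenate([desired, np.zeros_like(desired)])
      h, *_ = np.linalg.lstsq(A_ri, d_ri, rcond=None)
      print("max passband error:", np.max(np.abs((A @ h)[w < 0.3 * np.pi] - 1.0)))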

  5. Implementing and Bounding a Cascade Heuristic for Large-Scale Optimization

    DTIC Science & Technology

    2017-06-01

    solving the monolith, we develop a method for producing lower bounds to the optimal objective function value. To do this, we solve a new integer...as developing and analyzing methods for producing lower bounds to the optimal objective function value of the seminal problem monolith, which this...length of the window decreases, the end effects of the model typically increase (Zerr, 2016). There are four primary methods for correcting end

  6. Modelling the interaction between flooding events and economic growth

    NASA Astrophysics Data System (ADS)

    Grames, Johanna; Fürnkranz-Prskawetz, Alexia; Grass, Dieter; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Recently, socio-hydrology models have been proposed to analyze the interplay of community risk-coping culture, flooding damage and economic growth. These models descriptively explain the feedbacks between socio-economic development and natural disasters such as floods. Complementary to these descriptive models, we develop a dynamic optimization model, where the inter-temporal decision of an economic agent interacts with the hydrological system. This interdisciplinary approach matches the goals of Panta Rhei, i.e., to understand the feedbacks between hydrology and society. It enables new perspectives but also shows the limitations of each discipline. Young scientists need mentors from various scientific backgrounds to learn their different research approaches and how to best combine them such that interdisciplinary scientific work is also accepted by different science communities. In our socio-hydrology model we apply a macro-economic decision framework to a long-term flood scenario. We assume a standard macro-economic growth model where agents derive utility from consumption and output depends on physical capital that can be accumulated through investment. To this framework we add the occurrence of flooding events which will destroy part of the capital. We identify two specific periodic long-term solutions and denote them rich and poor economies. Whereas rich economies can afford to invest in flood defense and therefore avoid flood damage and develop high living standards, poor economies prefer consumption instead of investing in flood defense capital and end up facing flood damages every time the water level rises. Nevertheless, they manage to sustain at least a low level of physical capital. We identify optimal investment strategies and compare simulations with more frequent and more intense high water level events.
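
    A toy simulation of the rich/poor dichotomy described above (all rates, the production function and the flood schedule are illustrative assumptions, not the paper's model): capital accumulates through investment, and periodic floods destroy a share of whatever capital is left undefended.

      # Capital accumulation with periodic floods, with and without defense.
      def simulate(defense_share, years=100, K=1.0):
          for t in range(years):
              output = K ** 0.3                      # Cobb-Douglas-style production
              invest = 0.2 * output                  # fixed saving rate
              defense = defense_share * invest       # part of investment buys protection
              K += invest - defense - 0.05 * K       # accumulation minus depreciation
              if t % 10 == 0:                        # periodic high-water event
                  damage = 0.4 * K * max(0.0, 1.0 - 5.0 * defense)
                  K -= damage
          return K

      print("no defense :", round(simulate(0.0), 3))
      print("with defense:", round(simulate(0.5), 3))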

  7. Assessing the economics of processing end-of-life vehicles through manual dismantling.

    PubMed

    Tian, Jin; Chen, Ming

    2016-10-01

    Most dismantling enterprises in a number of developing countries, such as China, usually adopt the "manual+mechanical" dismantling approach to process end-of-life vehicles. However, the automobile industry does not have a clear indicator to reasonably and effectively determine the manual dismantling degree for end-of-life vehicles. In this study, five different dismantling scenarios and an economic system for end-of-life vehicles were developed based on the actual situation of end-of-life vehicles. The fuzzy analytic hierarchy process was applied to set the weights of direct costs, indirect costs, and sales and to obtain an optimal manual dismantling scenario. Results showed that although the traditional method of "dismantling to the end" can guarantee the highest recycling rate, this method is not the best among all the scenarios. The profit gained in the optimal scenario is 100.6% higher than that in the traditional scenario. The optimal manual dismantling scenario showed that enterprises are required to select suitable parts to process through manual dismantling. Selecting suitable parts maximizes economic profit and improves dismantling speed. Copyright © 2016 Elsevier Ltd. All rights reserved.
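
    The study's comparison data are not reproduced here, but the analytic-hierarchy weighting step can be sketched as follows: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check. The judgments below are hypothetical.

      # AHP weights from the principal eigenvector of a pairwise matrix.
      import numpy as np

      # criteria: direct costs, indirect costs, sales revenue (hypothetical judgments, Saaty 1-9 scale)
      A = np.array([[1.0,  3.0, 0.5 ],
                    [1/3., 1.0, 0.25],
                    [2.0,  4.0, 1.0 ]])
      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                 # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                # normalized criterion weights
      print("weights:", np.round(w, 3))
      # Consistency ratio CI / RI (RI = 0.58 for a 3 x 3 matrix)
      CI = (eigvals.real[k] - 3) / 2
      print("consistency ratio:", round(CI / 0.58, 3))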

  8. Teachers' tendencies to promote student-led science projects: Associations with their views about science

    NASA Astrophysics Data System (ADS)

    Bencze, J. Lawrence; Bowen, G. Michael; Alsop, Steve

    2006-05-01

    School science students can benefit greatly from participation in student-directed, open-ended scientific inquiry projects. For various possible reasons, however, students tend not to be engaged in such inquiries. Among factors that may limit their opportunities to engage in open-ended inquiries of their design are teachers' conceptions about science. To explore possible relationships between teachers' conceptions about science and the types of inquiry activities in which they engage students, instrumental case studies of five secondary science teachers were developed, using field notes, repertory grids, samples of lesson plans and student activities, and semistructured interviews. Based on constructivist grounded theory analysis, participating teachers' tendencies to promote student-directed, open-ended scientific inquiry projects seemed to correspond with positions about the nature of science to which they indicated adherence. A tendency to encourage and enable students to carry out student-directed, open-ended scientific inquiry projects appeared to be associated with adherence to social constructivist views about science. Teachers who opposed social constructivist views tended to prefer tight control of student knowledge building procedures and conclusions. We suggest that these results can be explained with reference to human psychological factors, including those associated with teachers' self-esteem and their relationships with knowledge-building processes in the discipline of their teaching.

  9. Gpu Implementation of a Viscous Flow Solver on Unstructured Grids

    NASA Astrophysics Data System (ADS)

    Xu, Tianhao; Chen, Long

    2016-06-01

    Graphics processing units have gained popularity in scientific computing over the past several years due to their outstanding parallel computing capability. Computational fluid dynamics applications involve large amounts of calculation, so a recent GPU card, whose peak computing performance and memory bandwidth are much better than those of a contemporary high-end CPU, is preferable. We herein focus on the detailed implementation of our GPU-targeted Reynolds-averaged Navier-Stokes solver based on the finite-volume method. The solver employs a vertex-centered scheme on unstructured grids so that it can handle complex topologies. Multiple optimizations are carried out to improve memory-access performance and kernel utilization. Both steady and unsteady flow simulations are carried out using an explicit Runge-Kutta scheme. The GPU-accelerated solver in this paper is demonstrated to have competitive advantages over its CPU-targeted counterpart.
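
    A minimal Python sketch of the explicit Runge-Kutta time integration named above (the actual solver implements this in GPU kernels), applied to a semi-discrete system du/dt = R(u); the residual R is a toy stand-in for the finite-volume flux balance.

      # Classical four-stage explicit Runge-Kutta step for du/dt = R(u).
      import numpy as np

      def R(u):
          return -0.5 * u                       # stand-in for the FV residual

      def rk4_step(u, dt):
          k1 = R(u)
          k2 = R(u + 0.5 * dt * k1)
          k3 = R(u + 0.5 * dt * k2)
          k4 = R(u + dt * k3)
          return u + dt / 6.0 * (k1 + 2*k2 + 2*k3 + k4)

      u = np.ones(8)                            # state at each mesh vertex
      for _ in range(100):
          u = rk4_step(u, dt=0.05)
      print(u[0], "vs exact", np.exp(-0.5 * 0.05 * 100))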

  10. An IMS Station life cycle from a sustainment point of view

    NASA Astrophysics Data System (ADS)

    Brely, Natalie; Gautier, Jean-Pierre; Foster, Daniel

    2014-05-01

    The International Monitoring System (IMS) is to consist of 321 monitoring facilities, composed of four different technologies with a variety of designs and equipment types, deployed in a range of environments around the globe. The International Monitoring System is conceived to operate in perpetuity through maintenance, replacement and recapitalization of IMS facilities' infrastructure and equipment when the end of service life is reached [CTBT/PTS/INF.1163]. Life cycle techniques and modelling are being used by the PTS to plan and forecast life cycle sustainment requirements of IMS facilities. Through historical data analysis, engineering inputs and feedback from experienced station operators, the PTS currently works towards increasing the level of confidence in these forecasts and in sustainment requirements planning. Continued validation, feedback and improvement of source data from the scientific community and experienced users are sought, and are essential to limit the effect on data availability and to ensure optimal costs (human and financial).

  11. Optimizing technology development and adoption in medical imaging using the principles of innovation diffusion, part II: practical applications.

    PubMed

    Reiner, Bruce I

    2012-02-01

    Successful adoption of new technology can be accentuated by learning and applying the scientific principles of innovation diffusion. This is of particular importance to areas within the medical imaging practice which have lagged in innovation; perhaps the most notable of these is reporting, which has remained relatively stagnant for over a century. While the theoretical advantages of structured reporting have been well documented throughout the medical imaging community, adoption to date has been tepid and largely relegated to the academic and breast imaging communities. Widespread adoption will likely require an alternative approach to innovation, which addresses the heterogeneity and diversity of the practicing radiologist community along with the ever-changing expectations in service delivery. The challenges and strategies for reporting innovation and adoption are discussed, with the goal of adapting and customizing new technology to the preferences and needs of individual end-users.

  12. Solving optimization problems on computational grids.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, S. J.; Mathematics and Computer Science

    2001-05-01

    Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms have become popular in many settings. A somewhat different type of parallel computing platform known as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, was among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups. Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure needed to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to the development of the runtime support library MW for implementing algorithms with a master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.
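
    A minimal sketch of the master-worker control structure that a library like MW provides (multiprocessing stands in for the Condor grid layer, and the subproblem objective is a toy): the master farms independent subproblems out to workers and collects the best result.

      # Master-worker pattern: distribute subproblems, gather the best result.
      from multiprocessing import Pool

      def solve_subproblem(seed):
          """Worker: evaluate one branch/partition (toy objective here)."""
          x = (seed * 2654435761) % 1000 / 1000.0   # deterministic pseudo-random point
          return (x - 0.37) ** 2, seed

      if __name__ == "__main__":
          with Pool(processes=4) as pool:           # workers; a grid would scale this out
              results = pool.map(solve_subproblem, range(64))
          best_value, best_seed = min(results)
          print("best subproblem:", best_seed, "objective:", round(best_value, 6))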

  13. Optimal Design of a Planar Textile Antenna for Industrial Scientific Medical (ISM) 2.4 GHz Wireless Body Area Networks (WBAN) with the CRO-SL Algorithm.

    PubMed

    Sánchez-Montero, Rocío; Camacho-Gómez, Carlos; López-Espí, Pablo-Luís; Salcedo-Sanz, Sancho

    2018-06-21

    This paper proposes a low-profile textile-modified meander line Inverted-F Antenna (IFA) with variable width and spacing meanders, for Industrial Scientific Medical (ISM) 2.4-GHz Wireless Body Area Networks (WBAN), optimized with a novel metaheuristic algorithm. Specifically, a metaheuristic known as Coral Reefs Optimization with Substrate Layer (CRO-SL) is used to obtain an optimal antenna for sensor systems, which allows the 2.4–2.45-GHz industrial scientific medical band to be covered properly and resiliently. Flexible pad foam with a thickness of 1.1 mm has been used to make the designed prototype. We have used a version of the algorithm that is able to combine different searching operators within a single population of solutions. This approach is ideal for dealing with hard optimization problems, such as the design of the proposed meander line IFA. During the optimization phase with the CRO-SL, the proposed antenna has been simulated using CST Microwave Studio software, linked to the CRO-SL by means of a MATLAB implementation and Visual Basic Applications (VBA) code. We fully describe the antenna design process, the adaptation of the CRO-SL approach to this problem, several practical aspects of the optimization, and details of the algorithm's performance. To validate the simulation results, we have constructed and measured two prototypes of the antenna, designed with the proposed algorithm. Several practical aspects, such as sensitivity during antenna manufacturing and the agreement between the simulated and constructed antennas, are also detailed in the paper.
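
    CRO-SL itself is more elaborate, but its core idea, a single population whose parts ("substrates") evolve under different search operators, can be sketched as follows; the objective is a toy stand-in for the CST antenna simulation the authors actually call.

      # Schematic multi-operator population search in the CRO-SL spirit.
      import random

      def objective(x):                  # toy: minimize distance to a target design
          return sum((xi - 0.25) ** 2 for xi in x)

      def gaussian_mutation(x):
          return [xi + random.gauss(0, 0.05) for xi in x]

      def crossover(x, y):
          return [xi if random.random() < 0.5 else yi for xi, yi in zip(x, y)]

      pop = [[random.random() for _ in range(4)] for _ in range(20)]
      for gen in range(200):
          half = len(pop) // 2
          offspring = [gaussian_mutation(x) for x in pop[:half]]               # substrate 1
          offspring += [crossover(x, random.choice(pop)) for x in pop[half:]]  # substrate 2
          pop = sorted(pop + offspring, key=objective)[:20]                    # survival of the fittest
      print("best objective:", round(objective(pop[0]), 6))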

  14. 77 FR 55847 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... Scientific Review Special Emphasis Panel; PAR09-259: Optimization of Small Molecule Probes For the Nervous System. Date: October 12, 2012. Time: 3 p.m. to 4 p.m. Agenda: To review and evaluate grant applications...

  15. Balancing the pros and cons of GMOs: socio-scientific argumentation in pre-service teacher education

    NASA Astrophysics Data System (ADS)

    Cinici, Ayhan

    2016-07-01

    This study investigates the role of the discursive process in the act of scientific knowledge building. Specifically, it links scientific knowledge building to risk perception of Genetically Modified Organisms (GMOs). To this end, this study designed and implemented a three-stage argumentation programme giving pre-service teachers (PSTs) the opportunity to consider, discuss and construct shared decisions about GMOs. The study involved 101 third-year PSTs from two different classes, randomly divided into control and experimental groups. The study utilised both quantitative and qualitative methods. During the quantitative phase, researchers administered a pre- and post-intervention scale to measure both groups' risk perception of GMOs. During the qualitative phase, data were collected from the experimental group alone through individual and group reports and an open-ended questionnaire. T-test results showed a statistically significant difference between the experimental and control groups' risk perception of GMOs. Qualitative analysis also revealed differences, for example, in PSTs' weighing of the pros and cons of scientific research demonstrating positive results of GMOs. In addition, PSTs' acceptance of GMOs increased. Consequently, this study suggests that developing familiarity with scientific enterprise may play an effective role in adopting a scientific perspective as well as a more balanced risk perception of GMOs.

  16. 48 CFR 1852.219-82 - Limitation on subcontracting-STTR program.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... performed by the research institution. Since the selection of R&D contractors is substantially based on the best scientific and technological sources, it is important that the Contractor not subcontract technical or scientific work without the Contracting Officer's advance approval. (End of clause) [71 FR...

  17. 48 CFR 1852.219-82 - Limitation on subcontracting-STTR program.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... performed by the research institution. Since the selection of R&D contractors is substantially based on the best scientific and technological sources, it is important that the Contractor not subcontract technical or scientific work without the Contracting Officer's advance approval. (End of clause) [71 FR...

  18. 48 CFR 1852.219-82 - Limitation on subcontracting-STTR program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... performed by the research institution. Since the selection of R&D contractors is substantially based on the best scientific and technological sources, it is important that the Contractor not subcontract technical or scientific work without the Contracting Officer's advance approval. (End of clause) [71 FR...

  19. 48 CFR 1852.219-82 - Limitation on subcontracting-STTR program.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... performed by the research institution. Since the selection of R&D contractors is substantially based on the best scientific and technological sources, it is important that the Contractor not subcontract technical or scientific work without the Contracting Officer's advance approval. (End of clause) [71 FR...

  20. 48 CFR 1852.219-82 - Limitation on subcontracting-STTR program.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... performed by the research institution. Since the selection of R&D contractors is substantially based on the best scientific and technological sources, it is important that the Contractor not subcontract technical or scientific work without the Contracting Officer's advance approval. (End of clause) [71 FR...

  1. Personalizing Science

    ERIC Educational Resources Information Center

    Danielowich, Robert M.

    2014-01-01

    Science teachers are aware of many social issues that intersect with science. These socio-scientific issues (SSIs) are "open-ended problems without clear-cut solutions [that] can be informed by scientific principles, theories, and data, but…cannot be fully determined by [them]" (Sadler 2011, p. 4). This article describes the SSI lessons…

  2. Satellite image-based maps: Scientific inference or pretty pictures?

    Treesearch

    Ronald E. McRoberts

    2011-01-01

    The scientific method has been characterized as having two distinct components, Discovery and Justification. Discovery emphasizes ideas and creativity, focuses on conceiving hypotheses and constructing models, and is generally regarded as lacking a formal logic. Justification begins with the hypotheses and models and ends with a...

  3. Canadian Palliative Community Milrinone Infusions: A Case Series.

    PubMed

    Reimche, Ruthanne; Salcedo, Daniel

    2016-01-01

    Symptom management for end-of-life heart failure (HF) patients is a significant concern. Currently, Canadian practice does not support community milrinone therapy in end-of-life HF patients. Two patients had severe HF that was unresponsive to optimal medications. Further optimization and furosemide infusions were ineffective for symptom management. Both patients' symptoms were better controlled with optimal medication, furosemide, and milrinone infusions. A tailored discharge plan was developed to assist with community milrinone infusions. We discuss the challenges and successes of transitioning two patients to the community. With symptom management and a meaningful patient and family experience provided, both patients were able to die in a setting of their choosing. Milrinone infusions as a bridge to end of life may improve symptoms and quality of life. Select patients may benefit from milrinone infusions; with resources put in place, these end-of-life HF patients can be supported in the community.

  4. Flyby Geometry Optimization Tool

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2007-01-01

    The Flyby Geometry Optimization Tool is a computer program for computing trajectories and trajectory-altering impulsive maneuvers for spacecraft used in radio relay of scientific data to Earth from an exploratory airplane flying in the atmosphere of Mars.

  5. Optimal Coordination of Building Loads and Energy Storage for Power Grid and End User Services

    DOE PAGES

    Hao, He; Wu, Di; Lian, Jianming; ...

    2017-01-18

    Demand response and energy storage play a profound role in the smart grid. The focus of this study is to evaluate the benefits of coordinating flexible loads and energy storage to provide power grid and end user services. We present a Generalized Battery Model (GBM) to describe the flexibility of building loads and energy storage. An optimization-based approach is proposed to characterize the parameters (power and energy limits) of the GBM for flexible building loads. We then develop optimal coordination algorithms to provide power grid and end user services such as energy arbitrage, frequency regulation and spinning reserve, as well as energy cost and demand charge reduction. Several case studies have been performed to demonstrate the efficacy of the GBM and coordination algorithms, and evaluate the benefits of using their flexibility for power grid and end user services. We show that optimal coordination yields significant cost savings and revenue. Moreover, the best option for power grid services is to provide energy arbitrage and frequency regulation. Finally, when coordinating flexible loads with energy storage to provide end user services, it is recommended to consider demand charge in addition to time-of-use price in order to flatten the aggregate power profile.
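
    As an illustration of the energy-arbitrage service under GBM-style limits (the prices and limits below are invented numbers, not the study's), the scheduling problem reduces to a small linear program: choose a power profile that respects the power and energy bounds and minimizes energy cost.

      # Energy arbitrage as a linear program under power/energy limits.
      import numpy as np
      from scipy.optimize import linprog

      T, dt = 24, 1.0                         # hourly steps
      price = np.array([0.10]*8 + [0.30]*8 + [0.15]*8)   # $/kWh (hypothetical)
      P_max, E_max, E0 = 5.0, 20.0, 10.0      # GBM power/energy limits (kW, kWh)

      # decision: p[t] in kW, > 0 charging (buys energy), < 0 discharging (sells)
      L = np.tril(np.ones((T, T))) * dt       # cumulative energy operator
      A_ub = np.vstack([L, -L])               # E0 + L p <= E_max ; -(E0 + L p) <= 0
      b_ub = np.concatenate([(E_max - E0) * np.ones(T), E0 * np.ones(T)])
      res = linprog(price * dt, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(-P_max, P_max)] * T)
      print("arbitrage profit ($):", round(-res.fun, 2))
      print("charging during cheap hours:", np.round(res.x[:8], 1))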

  6. Letter regarding 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics' by Patrizi et al. and research reproducibility.

    PubMed

    2017-04-01

    The reporting of research in a manner that allows reproduction in subsequent investigations is important for scientific progress. Several details of the recent study by Patrizi et al., 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics', are absent from the published manuscript and make reproduction of findings impossible. As new and complex technologies with great promise for ergonomics develop, new but surmountable challenges for reporting investigations using these technologies in a reproducible manner arise. Practitioner Summary: As with traditional methods, scientific reporting of new and complex ergonomics technologies should be performed in a manner that allows reproduction in subsequent investigations and supports scientific advancement.

  7. 77 FR 33472 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... Scientific Review Special Emphasis Panel, PAR09-260: Optimization of Small Molecule Probes for the Nervous System. Date: June 29, 2012. Time: 3:00 p.m. to 5:00 p.m. Agenda: To review and evaluate grant...

  8. How to improve a critical performance for an ExoMars 2020 Scientific Instrument (RLS). Raman Laser Spectrometer Signal to Noise Ratio (SNR) Optimization

    NASA Astrophysics Data System (ADS)

    Canora, C. P.; Moral, A. G.; Rull, F.; Maurice, S.; Hutchinson, I.; Ramos, G.; López-Reyes, G.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Rodriguez, P.; Santamaria, P.; Berrocal, A.; Colombo, M.; Gallago, P.; Seoane, L.; Quintana, C.; Ibarmia, S.; Zafra, J.; Saiz, J.; Santiago, A.; Marin, A.; Gordillo, C.; Escribano, D.; Sanz-Palominoa, M.

    2017-09-01

    The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments on the ExoMars mission, within ESA's Aurora Exploration Programme. Raman spectroscopy is based on the analysis of spectral fingerprints due to the inelastic scattering of light when interacting with matter. RLS is composed of three units, the SPU (Spectrometer Unit), iOH (Internal Optical Head) and ICEU (Instrument Control and Excitation Unit), plus the harnesses (EH and OH). The iOH focuses the excitation laser on the samples and collects the Raman emission from the sample via the SPU (CCD); the analog video data are received, digitized and transmitted to the processor module (ICEU). The main sources of noise arise from the sample, the background, and the instrument (laser, CCD, focus, acquisition parameters, operation control). In this last case the sources are mainly perturbations from the optics, dark signal and readout noise; flicker noise arising from laser emission fluctuations can also be considered instrument noise. In order to evaluate the SNR of a Raman instrument in a practical manner, it is useful to perform end-to-end measurements on given standard samples. These measurements have to be compared with radiometric simulations using Raman efficiency values from the literature and taking into account the different instrumental contributions to the SNR. The RLS EQM instrument performance results and functionalities have been demonstrated to be in accordance with the science expectations. The SNR performance obtained with the RLS EQM will be compared, experimentally and via analysis, with the Instrument Radiometric Model tool. The characterization process for SNR optimization is still ongoing. The operational parameters and RLS algorithms (fluorescence removal and acquisition parameter estimation) will be improved in future models (EQM-2) until FM model delivery.
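
    A back-of-envelope version of such an SNR budget (all input values are illustrative, not RLS figures) combines shot noise from the Raman signal and background, dark current, and CCD read noise:

      # CCD-style SNR budget: signal over root-sum-square of noise sources.
      import math

      def snr(signal_e, background_e, dark_e_per_s, t_s, read_noise_e, n_pixels):
          noise = math.sqrt(signal_e + background_e
                            + dark_e_per_s * t_s * n_pixels
                            + n_pixels * read_noise_e ** 2)
          return signal_e / noise

      # e.g. 5000 signal electrons, 800 background, 10 e-/s dark over 5 s,
      # 4 e- rms read noise, summed over 20 CCD pixels
      print(round(snr(5000, 800, 10, 5, 4, 20), 1))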

  9. An optical fiber expendable seawater temperature/depth profile sensor

    NASA Astrophysics Data System (ADS)

    Zhao, Qiang; Chen, Shizhe; Zhang, Keke; Yan, Xingkui; Yang, Xianglong; Bai, Xuejiao; Liu, Shixuan

    2017-10-01

    The marine expendable temperature/depth profiler (XBT) is a disposable measuring instrument which can quickly obtain temperature/depth profile data over large areas of water and is mainly used for marine surveys, scientific research and military applications. In the conventional XBT probe (CXBT), the temperature-measuring device is a thermistor, and the depth is not measured directly but only calculated from a fall-rate (speed and time) formula, so it is not an accurate measurement. Firstly, an optical fiber expendable temperature/depth sensor based on an FBG-LPG cascaded structure is proposed to solve the problems of the CXBT: an LPG and an FBG are used to detect the water temperature and depth, respectively. Secondly, a reflective mirror at the fiber end is used to simplify the optical cascade structure and optimize the system performance. Finally, the optical path is designed and optimized using the reflective fiber-end mirror. The experimental results show that the sensitivities of the temperature and depth sensing based on the FBG-LPG cascade structure are about 0.003 °C and 0.1% F.S., respectively, which can meet the requirements of seawater temperature/depth observation. The reflectivity of the mirror is in the range from 48.8% to 72.5%, the resonant peaks of the FBG and LPG are reasonable, and the whole spectrum is suitable for demodulation. Through research on the optical fiber XBT (FXBT), deep-sea temperature/depth profile data can be obtained directly, simultaneously, quickly and accurately. The FXBT is a new all-optical seawater temperature/depth sensor, which has important academic value and broad application prospects and is expected to replace the CXBT in the future.
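
    The read-out step can be sketched as a linear calibration from measured resonance shifts to temperature and depth; the coefficient values below are hypothetical placeholders, not the paper's calibration.

      # Linear calibration: resonance-wavelength shifts to temperature/depth.
      K_T = 0.010    # LPG temperature sensitivity, nm per degC (assumed)
      K_D = 0.0012   # FBG depth sensitivity, nm per metre (assumed)

      def temperature_degC(lpg_shift_nm, T_ref=20.0):
          return T_ref + lpg_shift_nm / K_T

      def depth_m(fbg_shift_nm):
          return fbg_shift_nm / K_D

      print(temperature_degC(-0.05), depth_m(0.36))   # -> 15.0 degC, 300.0 m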

  10. Skylab

    NASA Image and Video Library

    1971-10-01

    The Apollo Telescope Mount (ATM) was designed and developed by the Marshall Space Flight Center (MSFC) and served as the primary scientific instrument unit aboard Skylab (1973-1979). The ATM consisted of eight scientific instruments as well as a number of smaller experiments. This image is of the ATM flight unit sun end canister in MSFC's building 4755.

  11. The Point of Scientificity, the Fall of the Epistemological Dominos, and the End of the Field of Education Administration.

    ERIC Educational Resources Information Center

    English, Fenwick W.

    2002-01-01

    Argues that there is not a field (as a totality) of educational administration; rather there are many fields. Suggests abandoning privilege based on scientific sanctuary, and instead examining leadership issues subjectively and within context. (Contains 100 references.) (NB)

  12. How to Read Scientific Research Articles: A Hands-On Classroom Exercise

    ERIC Educational Resources Information Center

    Bogucka, Roxanne; Wood, Emily

    2009-01-01

    Undergraduate students are generally unfamiliar with scientific literature. Further, students experience frustration when they read research articles the way they read textbooks, from beginning to end. Using a team-based active learning exercise, an instruction librarian and colleagues at University of Texas at Austin introduce nutritional…

  13. Sharpening the Craft of Scientific Writing.

    ERIC Educational Resources Information Center

    Koprowski, John L.

    1997-01-01

    Describes a writing-intensive ecology course designed to foster the development of writing and critiquing skills early in the semester and immerse students in the peer-review process toward the end of the course. By critiquing other scientific papers, students gain insight into the effectiveness of their own writing while also increasing their…

  14. University Student Conceptions of Learning Science through Writing

    ERIC Educational Resources Information Center

    Ellis, Robert A.; Taylor, Charlotte E.; Drury, Helen

    2006-01-01

    First-year undergraduate science students experienced a writing program as an important part of their assessment in a biology subject. The writing program was designed to help them develop both their scientific understanding as well as their written scientific expression. Open-ended questionnaires investigating the quality of the experience of…

  15. LDRD Final Report: Global Optimization for Engineering Science Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HART,WILLIAM E.

    1999-12-01

    For a wide variety of scientific and engineering problems the desired solution corresponds to an optimal set of objective function parameters, where the objective function measures a solution's quality. The main goal of the LDRD ''Global Optimization for Engineering Science Problems'' was the development of new robust and efficient optimization algorithms that can be used to find globally optimal solutions to complex optimization problems. This SAND report summarizes the technical accomplishments of this LDRD, discusses lessons learned and describes open research issues.

  16. Use of expert consensus to improve atherogenic dyslipidemia management.

    PubMed

    Millán Núñez-Cortés, Jesús; Pedro-Botet, Juan; Brea-Hernando, Ángel; Díaz-Rodríguez, Ángel; González-Santos, Pedro; Hernández-Mijares, Antonio; Mantilla-Morató, Teresa; Pintó-Sala, Xavier; Simó, Rafael

    2014-01-01

    Although atherogenic dyslipidemia is a recognized cardiovascular risk factor, it is often underassessed and thus undertreated and poorly controlled in clinical practice. The objective of this study was to reach a multidisciplinary consensus for the establishment of a set of clinical recommendations on atherogenic dyslipidemia to optimize its prevention, early detection, diagnostic evaluation, therapeutic approach, and follow-up. After a review of the scientific evidence, a scientific committee formulated 87 recommendations related to atherogenic dyslipidemia, which were grouped into 5 subject areas: general concepts (10 items), impact and epidemiology (4 items), cardiovascular risk (32 items), detection and diagnosis (19 items), and treatment (22 items). A 2-round modified Delphi method was conducted to compare the opinions of a panel of 65 specialists in cardiology (23%), endocrinology (24.6%), family medicine (27.7%), and internal medicine (24.6%) on these issues. After the first round, the panel reached consensus on 65 of the 87 items discussed, and agreed on 76 items by the end of the second round. Insufficient consensus was reached on 3 items related to the detection and diagnosis of atherogenic dyslipidemia and 3 items related to the therapeutic goals to be achieved in these patients. The external assessment conducted by experts on atherogenic dyslipidemia showed a high level of professional agreement with the proposed clinical recommendations. These recommendations represent a useful tool for improving the clinical management of patients with atherogenic dyslipidemia. A detailed analysis of the current scientific evidence is required for those statements that eluded consensus. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  17. A method for automatically optimizing medical devices for treating heart failure: designing polymeric injection patterns.

    PubMed

    Wenk, Jonathan F; Wall, Samuel T; Peterson, Robert C; Helgerson, Sam L; Sabbah, Hani N; Burger, Mike; Stander, Nielen; Ratcliffe, Mark B; Guccione, Julius M

    2009-12-01

    Heart failure continues to present a significant medical and economic burden throughout the developed world. Novel treatments involving the injection of polymeric materials into the myocardium of the failing left ventricle (LV) are currently being developed, which may reduce elevated myofiber stresses during the cardiac cycle and act to retard the progression of heart failure. A finite element (FE) simulation-based method was developed in this study that can automatically optimize the injection pattern of the polymeric "inclusions" according to a specific objective function, using commercially available software tools. The FE preprocessor TRUEGRID® was used to create a parametric axisymmetric LV mesh matched to experimentally measured end-diastole and end-systole metrics from dogs with coronary microembolization-induced heart failure. Passive and active myocardial material properties were defined by a pseudo-elastic-strain energy function and a time-varying elastance model of active contraction, respectively, that were implemented in the FE software LS-DYNA. The companion optimization software LS-OPT was used to communicate directly with TRUEGRID® to determine FE model parameters, such as defining the injection pattern and inclusion characteristics. The optimization resulted in an intuitive optimal injection pattern (i.e., the one with the greatest number of inclusions) when the objective function was weighted to minimize mean end-diastolic and end-systolic myofiber stress and ignore LV stroke volume. In contrast, the optimization resulted in a nonintuitive optimal pattern (i.e., 3 inclusions longitudinally × 6 inclusions circumferentially) when both myofiber stress and stroke volume were incorporated into the objective function with different weights.
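
    The weighted-objective idea can be sketched independently of the FE machinery: scalarize stress and stroke volume with weights and rank candidate injection patterns. The evaluate() surrogate below is a toy stand-in for the LS-DYNA run, so the winning pattern is illustrative only.

      # Weighted-sum scalarization over candidate injection patterns.
      def evaluate(n_long, n_circ):
          """Hypothetical surrogate: more inclusions -> lower stress, lower SV."""
          n = n_long * n_circ
          mean_stress = 10.0 / (1.0 + 0.1 * n)     # kPa, toy model
          stroke_volume = 60.0 - 0.5 * n           # mL, toy model
          return mean_stress, stroke_volume

      def cost(pattern, w_stress=1.0, w_sv=0.0):
          stress, sv = evaluate(*pattern)
          return w_stress * stress - w_sv * sv     # reward stroke volume when weighted

      patterns = [(a, c) for a in range(1, 5) for c in range(1, 9)]
      print("stress only :", min(patterns, key=lambda p: cost(p, 1.0, 0.0)))  # most inclusions
      print("stress + SV :", min(patterns, key=lambda p: cost(p, 1.0, 0.2)))  # interior optimum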

  18. A three-dimensional optimal sawing system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang; R. Edward. Thomas

    2011-01-01

    A three-dimensional (3D) log sawing optimization system was developed to perform 3D log generation, opening face determination, sawing simulation, and lumber grading. Superficial characteristics of logs such as length, large-end and small-end diameters, and external defects were collected from local sawmills. Internal log defect positions and shapes were predicted...

  19. A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    NASA Astrophysics Data System (ADS)

    Frailis, M.; Maris, M.; Zacchei, A.; Morisset, N.; Rohlfs, R.; Meharga, M.; Binko, P.; Türler, M.; Galeotta, S.; Gasparo, F.; Franceschi, E.; Butler, R. C.; D'Arcangelo, O.; Fogliani, S.; Gregorio, A.; Lowe, S. R.; Maggio, G.; Malaspina, M.; Mandolesi, N.; Manzato, P.; Pasian, F.; Perrotta, F.; Sandri, M.; Terenzi, L.; Tomasi, M.; Zonca, A.

    2009-12-01

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment which has to strictly commit to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce as much as possible nominal conditions. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
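
    The injection-and-compare pattern described above can be reduced to a minimal sketch (the acquisition and processing stages here are stand-ins, not the LFI code): inject a known signal at the front of the pipeline and check the output against the independently known expectation.

      # End-to-end validation: known input in, compare recovered output.
      import numpy as np

      def acquire(signal):
          """Stand-in for the acquisition electronics: quantize to 16-bit ADU."""
          return np.round(signal * 1000).astype(np.int16)

      def level1_pipeline(telemetry):
          """Stand-in for Level 1 processing: decode ADU back to volts."""
          return telemetry.astype(float) / 1000.0

      t = np.linspace(0, 1, 500)
      injected = 0.5 * np.sin(2 * np.pi * 5 * t)      # known test signal
      recovered = level1_pipeline(acquire(injected))
      assert np.allclose(recovered, injected, atol=5e-4), "pipeline distorts data"
      print("end-to-end check passed: max error", np.max(np.abs(recovered - injected)))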

  20. Optimization of a Thermodynamic Model Using a Dakota Toolbox Interface

    NASA Astrophysics Data System (ADS)

    Cyrus, J.; Jafarov, E. E.; Schaefer, K. M.; Wang, K.; Clow, G. D.; Piper, M.; Overeem, I.

    2016-12-01

    Scientific modeling of the Earth's physical processes is an important driver of modern science. The behavior of these scientific models is governed by a set of input parameters, and it is crucial to choose accurate input parameters that also preserve the corresponding physics being simulated in the model. In order to effectively simulate real-world processes, the model's outputs must be close to the observed measurements. To achieve this, input parameters are tuned until we have minimized the objective function, which is the error between the simulation model outputs and the observed measurements. We developed an auxiliary package which serves as a Python interface between the user and DAKOTA. The package makes it easy for the user to conduct parameter-space explorations, parameter optimizations and sensitivity analyses while tracking and storing results in a database. The ability to perform these analyses via a Python library also allows users to combine analysis techniques, for example finding an approximate equilibrium with optimization and then immediately exploring the space around it. We used the interface to calibrate input parameters for the heat-flow model commonly used in permafrost science. We performed optimization on the first three layers of the permafrost model, each with two thermal-conductivity input parameters. Results of the parameter-space explorations indicate that the objective function does not always have a unique minimum. We found that gradient-based optimization works best for objective functions with a single minimum; otherwise, we employ more advanced Dakota methods, such as genetic optimization and mesh-based convergence, in order to find the optimal input parameters. We were able to recover six initially unknown thermal-conductivity parameters to within 2% of their known values. Our initial tests indicate that the developed interface for the Dakota toolbox can be used to perform analysis and optimization on a 'black box' scientific model more efficiently than using Dakota alone.
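
    The calibration loop the interface automates can be sketched with a toy two-parameter model in place of the heat-flow code: minimize the sum-of-squares misfit between model output and (here, synthetic) observations.

      # Parameter calibration: minimize model-vs-observation misfit.
      import numpy as np
      from scipy.optimize import minimize

      depths = np.linspace(0.0, 3.0, 30)
      true_params = np.array([1.2, 0.4])           # "unknown" conductivities

      def model(params):
          k1, k2 = params                          # toy temperature profile
          return -5.0 + k1 * depths + k2 * depths ** 2

      observed = model(true_params)                # synthetic observations

      def objective(params):
          return np.sum((model(params) - observed) ** 2)

      res = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
      print("recovered:", np.round(res.x, 3))      # close to [1.2, 0.4]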

  1. Linus Pauling and the scientific debate over fallout hazards.

    PubMed

    Jolly, J Christopher

    2002-12-01

    From 1954 to 1963, numerous scientists engaged in a public debate over the possible hazards from radioactive fallout from nuclear weapons testing. Nobel laureate Linus Pauling, a California Institute of Technology chemist, was one of the most prominent. His scientific papers relating to the fallout debate reveal many of the scientific, social and political issues involved in the controversy. Although the public controversy ended after the signing of the 1963 Limited Test Ban Treaty, many of the scientific questions about the possible hazards of low-level radiation remain under debate within the scientific community. Moreover, the fallout debate was a prototype of current controversies over environmental and public-health hazards.

  2. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-05-09

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.
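
    A sketch of the multicore-aware idea (the node-to-core mapping is assumed, and os.sched_setaffinity is Linux-only): pin I/O workers to cores on the NUMA node closest to the device they serve, so reads and writes neither contend nor cross nodes.

      # Pin I/O worker processes to cores on their device's NUMA node.
      import os
      from multiprocessing import Process

      NIC_NODE_CORES = {0, 1, 2, 3}        # cores on the NIC's NUMA node (assumed)
      DISK_NODE_CORES = {4, 5, 6, 7}       # cores on the storage node (assumed)

      def io_worker(role, cores):
          os.sched_setaffinity(0, cores)   # restrict this worker to its node's cores
          print(role, "running on cores", os.sched_getaffinity(0))
          # ... read from the NIC / write to storage here ...

      if __name__ == "__main__":
          workers = [Process(target=io_worker, args=("reader", NIC_NODE_CORES)),
                     Process(target=io_worker, args=("writer", DISK_NODE_CORES))]
          for w in workers: w.start()
          for w in workers: w.join()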

  3. MDTM: Optimizing Data Transfer using Multicore-Aware I/O Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Liang; Demar, Phil; Wu, Wenji

    2017-01-01

    Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to destination storage system. The limitations of current generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.

  4. Research on the International Space Station: Understanding Future Potential from Current Accomplishments

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.

    2007-01-01

    In November 2007, the International Space Station (ISS) will have supported seven years of continuous presence in space, with 15 Expeditions completed. These years have been characterized by the numerous technical challenges of assembly as well as operational and logistical challenges related to the availability of transportation by the Space Shuttle. During this period, an active set of early research objectives has also been accomplished alongside the assembly. This paper will review the research accomplishments on ISS to date, with the objective of drawing insights into the potential of future research following completion of ISS assembly. By the end of Expedition 15, an expected 121 U.S.-managed investigations will have been conducted on ISS, with 91 of these completed. Many of these investigations include multiple scientific objectives, with an estimated total of 334 scientists served. Through February 2007, 101 scientific publications have been identified. Another 184 investigations have been sponsored by ISS international partners, which independently track their scientists served and results publication. Through this survey of U.S. research completed on ISS, three different themes will be addressed: (1) How have constraints on transportation of mass to orbit affected the types of research successfully completed on the ISS to date? What lessons can be learned for increasing the success of ISS as a research platform during the period following the retirement of the Space Shuttle? (2) How have constraints on crew time for research during assembly and the active participation of crewmembers as scientists affected the types of research successfully completed on the ISS to date? What lessons can be learned for optimizing research return following the increase in capacity from 3 to 6 crewmembers (planned for 2009)? What lessons can be learned for optimizing research return after assembly is complete? (3) What do early research results indicate about the various scientific disciplines represented in investigations on ISS? Are there lessons specific to human research, technology development, life sciences, and physical sciences that can be used to increase future research accomplishments? Research has been conducted and completed on ISS under a set of challenging constraints during the past 7 years. The history of research accomplished on ISS during this time serves as an indicator of the value and potential of ISS when full utilization begins. By learning from our early experience in completing research on ISS, NASA and our partners can be positioned to optimize research returns as a full crew complement comes onboard, assembly is completed, and research begins in full.

  5. Adding Data Management Services to Parallel File Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Scott

    2015-03-04

    The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades, the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file-based ecosystem; (3) common optimizations, e.g., indexing and caching, are readily supported across several file formats, avoiding effort duplication; and (4) performance improves significantly, as data processing is integrated more tightly with data storage. Our key contributions are: SciHadoop, which explores changes to MapReduce assumptions by taking advantage of the semantics of structured data while preserving MapReduce’s failure and resource management; DataMods, which extends common abstractions of parallel file systems so they become programmable such that they can be extended to natively support a variety of data models and can be hooked into emerging distributed runtimes such as Stanford’s Legion; and Miso, which combines Hadoop and relational data warehousing to minimize time to insight, taking into account the overhead of ingesting data into data warehousing.

  6. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  7. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  8. Study and Implementation of the End-to-End Data Pipeline for the Virtis Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    NASA Astrophysics Data System (ADS)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the Research Program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Center in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document presents a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations, to the generation, calibration and archiving of the science data, including the production of valuable high level products. We present and discuss here the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results lead to new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given to describing the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost 4 years of mission. In the final part of the Thesis, we also present some high level data processing methods developed for the Mapping channel of the VIRTIS instrument. These methods have been implemented for the generation of high level global maps of measured radiance over the whole planet, which can then be used for the understanding of the global dynamics and morphology of the Venusian atmosphere. This method is currently being used to compare different emissions probing at different altitudes, from the low cloud layers up to the upper mesosphere, by using the averaged projected values of radiance observed by the instrument, such as the near-infrared windows at 1.7 μm and 2.3 μm and the thermal region at 3.8 μm and 5 μm, plus the analysis of particular emissions on the night and day sides of the planet. This research has been undertaken under the guidance and supervision of Giuseppe Piccioni, VIRTIS co-Principal Investigator, with the support of the entire VIRTIS technical and scientific team, in particular of the Archiving team in Paris (LESIA-Meudon). The work has also been done in close collaboration with the Science and Mission Operations Centres in Madrid and Darmstadt (European Space Agency), the EGSE software developer (Techno Systems), the manufacturer of the VIRTIS instrument (Galileo Avionica) and the developer of the VIRTIS onboard software (DLR Berlin). The outcome of the technical and scientific work presented in this thesis is currently being used by the VIRTIS team to continue the investigations of the Venusian atmosphere and to plan new scientific observations to improve the overall knowledge of the solar system.
    At the end of this document we show some of the many technical and scientific contributions, which have already been published in several international journals and conference proceedings, and some European Space Agency articles used for public outreach.

  9. Study of node and mass sensitivity of resonant mode based cantilevers with concentrated mass loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Kewei, E-mail: drzkw@126.com; Chai, Yuesheng; Fu, Jiahui

    2015-12-15

    Resonant-mode based cantilevers are an important type of acoustic-wave based mass-sensing device. In this work, the governing vibration equation of a bi-layer resonant-mode based cantilever loaded with a concentrated mass is established using a modal analysis method. The effects of resonance modes and mass loading conditions on the nodes and mass sensitivity of the cantilever were theoretically studied. The results suggested that the node did not shift when the concentrated mass was loaded at a specific position. Mass sensitivity of the cantilever was linearly proportional to the square of the point displacement at the mass loading position for all the resonance modes. For the first resonance mode, when the mass loading position x_c satisfied 0 < x_c < ~0.3l (l is the cantilever beam length and 0 represents the rigid end), mass sensitivity decreased as the mass increased, while the opposite trend was obtained when the mass loading satisfied ~0.3l ≤ x_c ≤ l. Mass sensitivity did not change when the concentrated mass was loaded at the rigid end. This work can provide scientific guidance for optimizing the mass sensitivity of a resonant-mode based cantilever.
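
    The quoted proportionality between mass sensitivity and the square of the point displacement is consistent with the standard small-mass perturbation result for resonators; the following is a hedged sketch in our own notation, not the paper's derivation.

```latex
% \phi_n(x): n-th mode shape normalized to unit maximum displacement;
% m_eff: effective modal mass; \Delta m: small concentrated mass at x_c.
\frac{\Delta f_n}{f_n} \;\approx\; -\,\frac{1}{2}\,
\frac{\Delta m \, [\phi_n(x_c)]^2}{m_{\mathrm{eff}}}
\qquad \Longrightarrow \qquad
S_n \;\equiv\; \left| \frac{\partial f_n}{\partial m} \right|
\;\propto\; [\phi_n(x_c)]^2 .
```

    Under this sketch, sensitivity vanishes when the mass sits at a node and peaks at an antinode, matching the node and loading-position trends described in the abstract.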

  10. Science Operations Management

    NASA Astrophysics Data System (ADS)

    Squibb, Gael F.

    1984-10-01

    The operation teams for the Infrared Astronomical Satellite (IRAS) included scientists from the IRAS International Science Team. The scientific decisions on an hour-to-hour basis, as well as the long-term strategic decisions, were made by science team members. The IRAS scientists were involved in the analysis of the instrument performance, the analysis of the quality of the data, the decision to reacquire data contaminated by radiation effects, the strategy for acquiring the survey data, and the process for using the telescope for additional observations, as well as the processing decisions required to ensure the publication of the final scientific products by the end of flight operations plus one year. Early in the project, two science team members were selected to be responsible for the scientific operational decisions. One, located at the operations control center in England, was responsible for the scientific aspects of the satellite operations; the other, located at the scientific processing center in Pasadena, was responsible for the scientific aspects of the processing. These science team members were then responsible for approving the design and test of the tools to support their responsibilities and then, after launch, for using these tools in making their decisions. The ability of the project to generate the final science data products one year after the end of flight operations is due in large measure to the active participation of the science team members in the operations. This paper presents a summary of the operational experiences gained from this scientific involvement.

  11. Implicaciones antropológicas y teológicas de la escatología científica

    NASA Astrophysics Data System (ADS)

    Funes, J.; Lares, M.; De los Rios, M.

    2017-10-01

    We present an interdisciplinary group devoted to the discussion of topics common to science, philosophy, and theology. In particular, we approach the study of the end of the cosmos and analyze the anthropological and theological implications of scientific eschatology. From a scientific point of view, the end of the universe raises the challenge of making predictions from models that use observational evidence from a large span of past times but cannot be confirmed. Against this limitation, the complementary approach of philosophy allows us to build conceptual bridges between the scientific vision and the physical manifestation of the world, raising questions about the place of the human being in cosmic eschatology. We also seek links between natural and religious realism, trying to establish relations between the revelation of the image of God and its manifestation in the observable universe.

  12. A Survey on Multimedia-Based Cross-Layer Optimization in Visual Sensor Networks

    PubMed Central

    Costa, Daniel G.; Guedes, Luiz Affonso

    2011-01-01

    Visual sensor networks (VSNs) composed of battery-operated electronic devices endowed with low-resolution cameras have expanded the applicability of a series of monitoring applications. These sensors are interconnected by ad hoc error-prone wireless links, imposing stringent restrictions on available bandwidth, end-to-end delay, and packet error rates. In such a context, multimedia coding is required for data compression and error resilience, also ensuring energy preservation over the path(s) toward the sink and improving the end-to-end perceptual quality of the received media. Cross-layer optimization may enhance the expected efficiency of VSN applications, disrupting the conventional information flow of the protocol layers. When the inner characteristics of the multimedia coding techniques are exploited by cross-layer protocols and architectures, higher efficiency may be obtained in visual sensor networks. This paper surveys recent research on multimedia-based cross-layer optimization, presenting the proposed strategies and mechanisms for transmission rate adjustment, congestion control, multipath selection, energy preservation, and error recovery. We note that many multimedia-based cross-layer optimization solutions have been proposed in recent years, each one bringing a wealth of contributions to visual sensor networks. PMID:22163908

  13. Conceptual Change in Psychology Students' Acceptance of the Scientific Foundation of the Discipline

    ERIC Educational Resources Information Center

    Amsel, Eric; Ashley, Aaron; Baird, Todd; Johnston, Adam

    2014-01-01

    Two studies explored conceptual change in undergraduate psychology students' acceptance of the scientific foundations of the discipline. In Study 1, Introductory Psychology students completed the Psychology as Science questionnaire (PAS) at the beginning and end of the semester and did so from their own (Self Condition) and their instructors'…

  14. Pre-Service Elementary Mathematics Teachers' Metaphors on Scientific Research and Foundations of Their Perceptions

    ERIC Educational Resources Information Center

    Bas, Fatih

    2016-01-01

    In this study, it is aimed to investigate pre-service elementary mathematics teachers' perceptions about scientific research with metaphor analysis and determine the foundations of these perceptions. This phenomenological study was conducted with 182 participants. The data were collected with two open-ended survey forms formed for investigating…

  15. 77 FR 64118 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-18

    ... . Name of Committee: Center for Scientific Review Special Emphasis Panel; Fellowships: Cell Biology, Developmental Biology, and Bioengineering. Date: November 15, 2012. Time: 8:00 a.m. to 6:00 p.m. Agenda: To... . Name of Committee: AIDS and Related Research Integrated Review Group; NeuroAIDS and other End-Organ...

  16. Optimization of Sparse Matrix-Vector Multiplication on Emerging Multicore Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Samuel; Oliker, Leonid; Vuduc, Richard

    2008-10-16

    We are witnessing a dramatic change in computer architecture due to the multicore paradigm shift, as every electronic device from cell phones to supercomputers confronts parallelism of unprecedented scale. To fully unleash the potential of these systems, the HPC community must develop multicore-specific optimization methodologies for important scientific computations. In this work, we examine sparse matrix-vector multiply (SpMV) - one of the most heavily used kernels in scientific computing - across a broad spectrum of multicore designs. Our experimental platform includes the homogeneous AMD quad-core, AMD dual-core, and Intel quad-core designs, the heterogeneous STI Cell, as well as one of the first scientific studies of the highly multithreaded Sun Victoria Falls (a Niagara2 SMP). We present several optimization strategies especially effective for the multicore environment, and demonstrate significant performance improvements compared to existing state-of-the-art serial and parallel SpMV implementations. Additionally, we present key insights into the architectural trade-offs of leading multicore design strategies, in the context of demanding memory-bound numerical algorithms.
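
    For readers unfamiliar with the kernel, the following is a minimal reference implementation of SpMV over the common compressed sparse row (CSR) format. This is an illustrative sketch with array names of our own choosing, not the paper's optimized code; the optimizations studied there (blocking, prefetching, thread-level partitioning of rows) all start from this loop.

```python
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    """y = A @ x for a CSR matrix: a minimal reference kernel."""
    n_rows = len(row_ptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):                        # one output element per row
        acc = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            acc += values[k] * x[col_idx[k]]       # touch only stored nonzeros
        y[i] = acc
    return y

# Tiny example: the 2x2 matrix [[10, 0], [3, 4]] in CSR form.
values = np.array([10.0, 3.0, 4.0])
col_idx = np.array([0, 0, 1])
row_ptr = np.array([0, 1, 3])
print(spmv_csr(values, col_idx, row_ptr, np.array([1.0, 2.0])))  # [10. 11.]
```

    The indirect access x[col_idx[k]] is what makes SpMV memory-bound and hard to optimize, which is why the paper's architecture-specific strategies matter.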

  17. Thermodynamics and historical relevance of a jetting thermometer made of Chinese zisha ceramic

    NASA Astrophysics Data System (ADS)

    Lee, Vincent; Attinger, Daniel

    2016-07-01

    Following a recent trend of scientific studies on artwork, we study here the thermodynamics of a thermometer made of zisha ceramic, related to the Chinese tea culture. The thermometer represents a boy who “urinates” shortly after hot water is poured onto his head. Long jetting distance is said to indicate that the water temperature is hot enough to brew tea. Here, a thermodynamic model describes the jetting phenomenon of that pee-pee boy. The study demonstrates how thermal expansion of an interior air pocket causes jetting. A thermodynamic potential is shown to define maximum jetting velocity. Seven optimization criteria to maximize jetting distance are provided, including two dimensionless numbers. Predicted jetting distances, jet durations, and temperatures agree very well with infrared and optical measurements. Specifically, the study confirms that jetting distances are sensitive enough to measure water temperature in the context of tea brewing. Optimization results show that longer jets are produced by large individuals, with low body mass index, with a boyhood of medium size inclined at an angle π/4. The study ends by considering the possibility that ceramic jetting artifacts like the pee-pee boy might have been the first thermometers known to mankind, before Galileo Galilei’s thermoscope.
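
    The optimal inclination of π/4 reported here matches the classical drag-free ballistic optimum; a minimal sketch of that step follows (our own reconstruction, neglecting air drag and launch height, with the exit speed v set by the thermally driven overpressure).

```latex
% Range of a jet launched at speed v and angle \theta (no drag):
R(\theta) = \frac{v^{2}\sin(2\theta)}{g},
\qquad
\frac{dR}{d\theta} = \frac{2v^{2}\cos(2\theta)}{g} = 0
\;\Longrightarrow\; \theta^{*} = \frac{\pi}{4}.
% A Bernoulli-type estimate (an assumption on our part) ties v to the
% interior overpressure \Delta p:  v \approx \sqrt{2\,\Delta p / \rho}.
```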

  18. Boys who pee the farthest have a large hollow head, a thin skin, and medium-size manhood

    NASA Astrophysics Data System (ADS)

    Attinger, Daniel; Lee, Vincent

    2016-11-01

    Following a recent trend of scientific studies on artwork, we study here the thermodynamics of a jetting thermometer made of ceramic, related to the Chinese tea culture. The thermometer represents a boy who "urinates" shortly after hot water is poured onto his head. Long jetting distance indicates if the water temperature is hot enough to brew tea. Here, a thermofluid model describes the jetting phenomenon of that pee-pee boy. The study demonstrates how thermal expansion of an interior air pocket causes jetting. The validity of assumptions underlying the Hagen-Poiseuille flow is discussed for urethra of finite length. A thermodynamic potential is shown to define maximum jetting velocity. Seven optimization criteria to maximize jetting distance are provided, including two dimensionless numbers. The dimensionless numbers are obtained by comparing the time scales of the internal pressure buildup due to heating, with that of pressure relief due to jetting. Optimization results show that longer jets are produced by large individuals, with low body mass index, with a boyhood of medium size inclined at an angle π/4. Analogies are drawn with pissing contests among humans and lobsters. The study ends by noting similitudes of working principle between that politically incorrect thermometer and Galileo Galilei's thermoscope.

  19. Thermodynamics and historical relevance of a jetting thermometer made of Chinese zisha ceramic

    PubMed Central

    Lee, Vincent; Attinger, Daniel

    2016-01-01

    Following a recent trend of scientific studies on artwork, we study here the thermodynamics of a thermometer made of zisha ceramic, related to the Chinese tea culture. The thermometer represents a boy who “urinates” shortly after hot water is poured onto his head. Long jetting distance is said to indicate that the water temperature is hot enough to brew tea. Here, a thermodynamic model describes the jetting phenomenon of that pee-pee boy. The study demonstrates how thermal expansion of an interior air pocket causes jetting. A thermodynamic potential is shown to define maximum jetting velocity. Seven optimization criteria to maximize jetting distance are provided, including two dimensionless numbers. Predicted jetting distances, jet durations, and temperatures agree very well with infrared and optical measurements. Specifically, the study confirms that jetting distances are sensitive enough to measure water temperature in the context of tea brewing. Optimization results show that longer jets are produced by large individuals, with low body mass index, with a boyhood of medium size inclined at an angle π/4. The study ends by considering the possibility that ceramic jetting artifacts like the pee-pee boy might have been the first thermometers known to mankind, before Galileo Galilei’s thermoscope. PMID:27431925

  20. Avi Purkayastha | NREL

    Science.gov Websites

    ...Austin, from 2001 to 2007. There he was principal in HPC applications and user support, as well as in research and development in large-scale scientific applications and different HPC systems and technologies. Interests: HPC applications performance and optimizations; HPC systems and accelerator technologies; scientific...

  1. Key issues surrounding the health impacts of electronic nicotine delivery systems (ENDS) and other sources of nicotine.

    PubMed

    Drope, Jeffrey; Cahn, Zachary; Kennedy, Rosemary; Liber, Alex C; Stoklosa, Michal; Henson, Rosemarie; Douglas, Clifford E; Drope, Jacqui

    2017-11-01

    Over the last decade, the use of electronic nicotine delivery systems (ENDS), including the electronic cigarette or e-cigarette, has grown rapidly. More youth now use ENDS than any tobacco product. This extensive research review shows that there are scientifically sound, sometimes competing arguments about ENDS that are not immediately and/or completely resolvable. However, the preponderance of the scientific evidence to date suggests that current-generation ENDS products are demonstrably less harmful than combustible tobacco products such as conventional cigarettes in several key ways, including by generating far lower levels of carcinogens and other toxic compounds than combustible products or those that contain tobacco. To place ENDS in context, the authors begin by reviewing trends in the use of major nicotine-containing products. Because nicotine, which is highly addictive, is the common core constituent across all tobacco products, its toxicology is examined. With its long history as the only nicotine product widely accepted as being relatively safe, nicotine-replacement therapy (NRT) is also examined. A section is also included that examines snus, the most debated potential harm-reduction product before ENDS. Between discussions of NRT and snus, ENDS are extensively examined: what they are, knowledge about their level of "harm," their relationship to smoking cessation, the so-called gateway effect, and dual use/poly-use. CA Cancer J Clin 2017;67:449-471. © 2017 American Cancer Society.

  2. Astronomy Village Reaches for New Heights

    NASA Astrophysics Data System (ADS)

    Croft, S. K.; Pompea, S. M.

    2007-12-01

    We are developing a set of complex, multimedia-based instructional modules emphasizing technical and scientific issues related to the Giant Segmented Mirror Telescope (GSMT) project. The modules' pedagogy will be open-ended and problem-based to promote the development of problem-solving skills. Problem-based-learning modules that emphasize work on open-ended, complex, real-world problems are particularly valuable in illustrating and promoting a perspective on the process of science and engineering. Research in this area shows that these kinds of learning experiences are superior to more conventional student training in terms of gains in student learning. The format for the modules will be based on the award-winning multimedia educational Astronomy Village products, which present students with a simulated environment: a mountaintop community surrounded by a cluster of telescopes, satellite receivers, and telecommunication towers. A number of "buildings" are found in the Village, such as a library, a laboratory, and an auditorium. Each building contains an array of information sources and computer simulations. Students navigate through their research with a mentor via embedded video. The first module will be "Observatory Site Selection." Students will use astronomical data, basic weather information, and sky brightness data to select the best site for an observatory. Students will investigate the six GSMT sites considered by the professional site selection teams. Students will explore weather and basic site issues (e.g., roads and topography) using remote sensing images, computational fluid dynamics results, turbulence profiles, and scintillation data for the different sites. Comparison of student problem solving with expert problem solving will also be done as part of the module. As part of a site selection team, students will construct and present a case for why they chose a particular site. The second module will address aspects of systems engineering and optimization for a GSMT-like telescope. Basic system issues will be addressed and studied. These might include various controls issues and optimization issues such as mirror figure, mirror support stability, and wind-loading trade-offs. Using system modeling and system optimization results from existing and early GSMT trade studies, we will create a simulation where students are part of an engineering design and optimization team. They will explore the cost/performance/schedule issues associated with the GSMT design.

  3. Computational alternatives to obtain time optimal jet engine control. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Basso, R. J.; Leake, R. J.

    1976-01-01

    Two computational methods are described for determining an open-loop time-optimal control sequence for a simple single-spool turbojet engine modeled by a set of nonlinear differential equations. Both methods are modifications of widely accepted algorithms that solve fixed-time unconstrained optimal control problems with a free right end; the constrained problems considered here have fixed right ends and free final time. Dynamic programming is defined on a standard problem and yields a successive-approximation solution to the time-optimal problem of interest. A feedback control law is obtained and then used to determine the corresponding open-loop control sequence. The Fletcher-Reeves conjugate gradient method has been selected for adaptation to solve a nonlinear optimal control problem with state variable and control constraints.
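
    The Fletcher-Reeves update mentioned here is a standard conjugate-gradient rule. The sketch below shows the generic unconstrained form of the algorithm (our own illustration, not the thesis's adaptation with state-variable and control constraints).

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f with the Fletcher-Reeves nonlinear conjugate gradient."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = np.dot(g, d)
        if gd >= 0:                          # lost descent direction: restart
            d = -g
            gd = -np.dot(g, g)
        alpha = 1.0                          # simple Armijo backtracking
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * gd and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = np.dot(g_new, g_new) / np.dot(g, g)   # Fletcher-Reeves beta
        d = -g_new + beta * d
        g = g_new
    return x

# Example: a quadratic bowl with minimum at (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)])
print(fletcher_reeves(f, grad, [0.0, 0.0]))   # approximately [1, -2]
```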

  4. Program manual for ASTOP, an Arbitrary space trajectory optimization program

    NASA Technical Reports Server (NTRS)

    Horsewood, J. L.

    1974-01-01

    The ASTOP program (an Arbitrary Space Trajectory Optimization Program), designed to generate optimum low-thrust trajectories in an N-body field while satisfying selected hardware and operational constraints, is presented. The trajectory is divided into a number of segments or arcs over which the control is held constant. This constant control over each arc is optimized using a parameter optimization scheme based on gradient techniques. A modified Encke formulation of the equations of motion is employed. The program provides a wide range of constraint, end-condition, and performance-index options. The basic approach is conducive to future expansion, such as the incorporation of new constraints and the addition of new end conditions.

  5. The CASH Project

    NASA Astrophysics Data System (ADS)

    Seyler, F.; Bonnet, M.-P.; Calmant, S.; Cauhopé, M.; Cazenave, A.; Cochonneau, G.; Divol, J.; Do-Minh, K.; Frappart, F.; Gennero, M.-C.; Guyenne-Blin, K.; Huynh, F.; Leon, J. G.; Mangeas, M.; Mercier, F.; Rocquelain, GH.; Tocqueville, L.; Zanifé, O.-Z.

    2006-07-01

    CASH (« Contribution of spatial altimetry to hydrology ») aims to define global, standardized, fast, and long-term access to a set of hydrological data covering the largest river basins in the world. The key questions to be answered are: under what conditions can river water stages be monitored from altimetric radar data, and how can altimetric data be combined with other spatial sources and/or in-situ data to deliver useful parameters to the hydrology community, both scientific and end users? The CASH project ends in mid-May 2006, and many tasks remain before altimetric heights of continental water bodies become part of the day-to-day practice of scientific and end-user hydrologists. The project has nevertheless delineated how this use could be improved in the near future, and has opened very interesting perspectives for ungauged or poorly gauged large basins around the world.

  6. Optimizing students’ scientific communication skills through higher order thinking virtual laboratory (HOTVL)

    NASA Astrophysics Data System (ADS)

    Sapriadil, S.; Setiawan, A.; Suhandi, A.; Malik, A.; Safitri, D.; Lisdiani, S. A. S.; Hermita, N.

    2018-05-01

    Communication is a skill that is much needed in the 21st century, and preparing and teaching this skill in physics education is correspondingly important. The focus of this research is on optimizing students' scientific communication skills after applying a higher order thinking virtual laboratory (HOTVL) on the topic of electric circuits. The research employed an experimental study, specifically a posttest-only control group design. The subjects, senior high school students, were selected using purposive sampling; a sample of seventy (70) students participated, with an equal number of thirty-five (35) students assigned to each of the control and experimental groups. The results found that students using the higher order thinking virtual laboratory (HOTVL) in laboratory activities had higher scientific communication skills than students who used the verification virtual lab.

  7. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    DOE PAGES

    Malawski, Maciej; Figiela, Kamil; Bubak, Marian; ...

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
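
    A heavily simplified version of such a deadline-constrained cost model can be written as follows; this is our own sketch for orientation, not the paper's AMPL/CMPL model, and it omits hourly-billing rounding and per-cloud instance limits.

```latex
% x_{l,i} = 1 if all tasks of workflow level l run on instance type i;
% c_i = hourly price of type i; t_{l,i} = execution time of level l
% on type i (in hours); D = the deadline.
\min_{x} \; \sum_{l} \sum_{i} c_i \, t_{l,i} \, x_{l,i}
\quad \text{s.t.} \quad
\sum_{l} \sum_{i} t_{l,i} \, x_{l,i} \le D ,
\qquad
\sum_{i} x_{l,i} = 1 \;\; \forall l ,
\qquad
x_{l,i} \in \{0,1\} .
```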

  8. Optimization of thrie beam terminal end shoe connection.

    DOT National Transportation Integrated Search

    2017-04-01

    Terminal thrie beam end shoes connect nested thrie beams to parapets or other bridge rail structures to provide a robust connection between a transition section and a rigid railing section. When connecting a terminal end shoe to thrie beam transitions, the...

  9. Resource allocation for error resilient video coding over AWGN using optimization approach.

    PubMed

    An, Cheolhong; Nguyen, Truong Q

    2008-12-01

    The number of slices for error-resilient video coding is jointly optimized with an 802.11a-like media access control layer and a physical layer using automatic repeat request and rate-compatible punctured convolutional codes over an additive white Gaussian noise channel, together with channel-time allocation for time-division multiple access. For error-resilient video coding, the relation between the number of slices and coding efficiency is analyzed and formulated as a mathematical model. This model is applied to the joint optimization problem, which is solved by a convex optimization method such as the primal-dual decomposition method. We compare the performance of a video communication system that uses the optimal number of slices with one that codes a picture as a single slice. Numerical examples show that the end-to-end distortion of the utility functions can be significantly reduced with the optimal number of slices per picture, especially at low signal-to-noise ratio.

  10. Optimization of end-pumped, actively Q-switched quasi-III-level lasers.

    PubMed

    Jabczynski, Jan K; Gorajek, Lukasz; Kwiatkowski, Jacek; Kaskow, Mateusz; Zendzian, Waldemar

    2011-08-15

    A new model of an end-pumped quasi-III-level laser is developed, accounting for transient pumping processes, ground-state depletion, and up-conversion effects. The model consists of two parts, a pumping stage and a Q-switched stage, which can be separated in the case of an active Q-switching regime. For the pumping stage, a semi-analytical model was developed, enabling calculation of the final population of the upper laser level for a given pump power and duration, spatial profile of the pump beam, and length and dopant level of the gain medium. For quasi-stationary inversion, an optimization procedure for the Q-switching regime based on the Lagrange multiplier technique was developed. A new approach to optimizing the CW regime of quasi-three-level lasers was also developed, in order to optimize Q-switched lasers operating at high repetition rates. Both optimization methods enable calculation of the optimal absorbance of the gain medium and output losses for a given pump rate. © 2011 Optical Society of America

  11. Palliative and end-of-life care in stroke: a statement for healthcare professionals from the American Heart Association/American Stroke Association.

    PubMed

    Holloway, Robert G; Arnold, Robert M; Creutzfeldt, Claire J; Lewis, Eldrin F; Lutz, Barbara J; McCann, Robert M; Rabinstein, Alejandro A; Saposnik, Gustavo; Sheth, Kevin N; Zahuranec, Darin B; Zipfel, Gregory J; Zorowitz, Richard D

    2014-06-01

    The purpose of this statement is to delineate basic expectations regarding primary palliative care competencies and skills to be considered, learned, and practiced by providers and healthcare services across hospitals and community settings when caring for patients and families with stroke. Members of the writing group were appointed by the American Heart Association Stroke Council's Scientific Statement Oversight Committee and the American Heart Association's Manuscript Oversight Committee. Members were chosen to reflect the diversity and expertise of professional roles in delivering optimal palliative care. Writing group members were assigned topics relevant to their areas of expertise, reviewed the appropriate literature, and drafted manuscript content and recommendations in accordance with the American Heart Association's framework for defining classes and level of evidence and recommendations. The palliative care needs of patients with serious or life-threatening stroke and their families are enormous: complex decision making, aligning treatment with goals, and symptom control. Primary palliative care should be available to all patients with serious or life-threatening stroke and their families throughout the entire course of illness. To optimally deliver primary palliative care, stroke systems of care and provider teams should (1) promote and practice patient- and family-centered care; (2) effectively estimate prognosis; (3) develop appropriate goals of care; (4) be familiar with the evidence for common stroke decisions with end-of-life implications; (5) assess and effectively manage emerging stroke symptoms; (6) possess experience with palliative treatments at the end of life; (7) assist with care coordination, including referral to a palliative care specialist or hospice if necessary; (8) provide the patient and family the opportunity for personal growth and make bereavement resources available if death is anticipated; and (9) actively participate in continuous quality improvement and research. Addressing the palliative care needs of patients and families throughout the course of illness can complement existing practices and improve the quality of life of stroke patients, their families, and their care providers. There is an urgent need for further research in this area. © 2014 American Heart Association, Inc.

  12. Religious Beliefs: Their Dynamics in Two Groups of Life Scientists

    ERIC Educational Resources Information Center

    Falcao, Eliane Brigida Morais

    2008-01-01

    The assumption that scientific knowledge would bring an end to religious belief has challenged many scholars, particularly since such a belief persists even among those devoted to scientific activities. In this paper the occurrence and nature of religious belief in groups of life scientists working in the UK and Brazil is discussed in the context…

  13. Teachers' Views of the Nature of Science: A Study on Pre-Service Science Teachers in Sabah, Malaysia

    ERIC Educational Resources Information Center

    Fah, Lay Yoon; Hoon, Khoo Chwee

    2011-01-01

    Science education in Malaysia nurtures a science and technology culture by focusing on the development of individuals who are competitive, dynamic, robust, resilient and able to master scientific knowledge and technological competency. To this end, the science curriculum in Malaysia gives conscious emphasis to the acquisition of scientific skills…

  14. Physical Sciences Preservice Teachers' Religious and Scientific Views Regarding the Origin of the Universe and Life

    ERIC Educational Resources Information Center

    Govender, Nadaraj

    2017-01-01

    This paper explores final-year physical sciences preservice teachers' religious and scientific views regarding the origin of the universe and life. Data was obtained from 10 preservice teachers from individual in-depth interviews conducted at the end of the Science Method module. Their viewpoints were analyzed using coding, sorting, and…

  15. Searching for Scientific Literacy and Critical Pedagogy in Socioscientific Curricula: A Critical Discourse Analysis

    ERIC Educational Resources Information Center

    Cummings, Kristina M.

    2017-01-01

    The omnipresence of science and technology in our society require the development of a critical and scientifically literate citizenry. However, the inclusion of socioscientific issues, which are open-ended controversial issues informed by both science and societal factors such as politics, economics, and ethics, do not guarantee the development of…

  16. Representations of the Nature of Scientific Knowledge in Turkish Biology Textbooks

    ERIC Educational Resources Information Center

    Irez, Serhat

    2016-01-01

    Considering the impact of textbooks on learning, this study set out to assess representations of the nature of scientific knowledge in Turkish 9th grade biology textbooks. To this end, the ten most commonly used 9th grade biology textbooks were analyzed. A qualitative research approach was utilized and the textbooks were analyzed using…

  17. 78 FR 29754 - Board of Scientific Counselors, National Center for Injury Prevention and Control, (BSC, NCIPC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-21

    ... mild-Traumatic Injury Workgroup. There will be 15 minutes allotted for public comments at the end of... Scientific Counselors, National Center for Injury Prevention and Control, (BSC, NCIPC) In accordance with...-being; and (3) conduct and assist in research and control activities related to injury. The Board of...

  18. Educational Optimism among Parents: A Pilot Study

    ERIC Educational Resources Information Center

    Räty, Hannu; Kasanen, Kati

    2016-01-01

    This study explored parents' (N = 351) educational optimism in terms of their trust in the possibilities of school to develop children's intelligence. It was found that educational optimism could be depicted as a bipolar factor with optimism and pessimism on the opposing ends of the same dimension. Optimistic parents indicated more satisfaction…

  19. Use of Open-Ended Problems in Mathematics Classroom. Research Report 176.

    ERIC Educational Resources Information Center

    Pehkonen, Erkki, Ed.

    During the years 1993-96, there has existed an active discussion group entitled "Using Open-Ended Problems in Mathematics" as a part of the scientific program of the Psychology of Mathematics Education (PME) conference. This report contains revised versions of presentations given in the discussion group. Since the PME is an international…

  20. Modeling the Water Balloon Slingshot

    ERIC Educational Resources Information Center

    Bousquet, Benjamin D.; Figura, Charles C.

    2013-01-01

    In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments,…

  1. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    NASA Astrophysics Data System (ADS)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.
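
    As an illustration of the statistical-process-control side of such a scheme, a laboratory can track check-standard readings against k-sigma control limits. The sketch below is a generic illustration under assumptions of our own; the numbers and limits are hypothetical, not Hart's actual criteria.

```python
import numpy as np

def control_chart_check(history, new_value, k=3.0):
    """Flag a check-standard measurement outside k-sigma control limits.

    `history` holds prior in-control measurements of the same check
    standard (e.g., an SPRT reading at a fixed-point temperature).
    """
    mean = np.mean(history)
    sigma = np.std(history, ddof=1)          # sample standard deviation
    lower, upper = mean - k * sigma, mean + k * sigma
    in_control = lower <= new_value <= upper
    return in_control, (lower, upper)

# Hypothetical check-standard history; the new reading has drifted.
history = [1.0000112, 1.0000108, 1.0000110, 1.0000111, 1.0000109]
ok, limits = control_chart_check(history, 1.0000131)
print(ok, limits)   # False: the new reading falls outside the limits
```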

  2. Managing the care of patients receiving antiresorptive therapy for prevention and treatment of osteoporosis: executive summary of recommendations from the American Dental Association Council on Scientific Affairs.

    PubMed

    Hellstein, John W; Adler, Robert A; Edwards, Beatrice; Jacobsen, Peter L; Kalmar, John R; Koka, Sreenivas; Migliorati, Cesar A; Ristic, Helen

    2011-11-01

    This narrative review of osteonecrosis of the jaw in patients with low bone mass receiving treatment with antiresorptive agents is based on an appraisal of the literature by an advisory committee of the American Dental Association Council on Scientific Affairs. It updates the committee's 2008 advisory statement. The authors searched MEDLINE for literature published between May 2008 (the end date of the last search) and February 2011. This report contains recommendations based on the findings of the literature search and on expert opinion that relate to general dentistry; periodontal disease management; implant placement and maintenance; oral and maxillofacial surgery; endodontics; restorative dentistry and prosthodontics; orthodontics; and C-terminal telopeptide testing and drug holidays. The highest reliable estimate of antiresorptive agent-induced osteonecrosis of the jaw (ARONJ) prevalence is approximately 0.10 percent. Osteoporosis is responsible for considerable morbidity and mortality. Therefore, the benefit provided by antiresorptive therapy outweighs the low risk of developing osteonecrosis of the jaw. An oral health program consisting of sound hygiene practices and regular dental care may be the optimal approach for lowering ARONJ risk. No validated diagnostic technique exists to determine which patients are at increased risk of developing ARONJ. Discontinuing bisphosphonate therapy may not lower the risk but may have a negative effect on low-bone-mass-treatment outcomes.

  3. A decade of proteomics accomplished! Central and Eastern European Proteomic Conference (CEEPC) celebrates its 10th Anniversary in Budapest, Hungary.

    PubMed

    Gadher, Suresh Jivan; Drahos, László; Vékey, Károly; Kovarova, Hana

    2017-07-01

    The Central and Eastern European Proteomic Conference (CEEPC) proudly celebrated its 10th Anniversary with an exciting scientific program covering the proteome, proteomics, and systems biology in Budapest, Hungary. Since 2007, CEEPC has represented 'state-of-the-art' proteomics in and around Central and Eastern Europe, and this series of conferences has become a well-recognized event in the proteomic calendar. Fresh challenges and global healthcare issues such as ageing and chronic diseases are driving clinical and scientific research towards regenerative, reparative, and personalized medicine. To this end, proteomics may enable diverse intertwining research fields to reach their end goals, and CEEPC will endeavor to facilitate them.

  4. Web-Accessible Scientific Workflow System for Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roelof Versteeg; Roelof Versteeg; Trevor Rowe

    2006-03-01

    We describe the design and implementation of a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server that uses PHP for data processing, user management, and result delivery, and third-party applications that are invoked by the back-end using web services. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.

  5. A scientific assessment of a new technology orbital telescope

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As part of a program designed to test the Alpha chemical laser weapons system in space, the Ballistic Missile Defense Organization (BMDO) developed components of an agile, lightweight, 4-meter telescope equipped with an advanced active-optics system. BMDO had proposed to make space available in the telescope's focal plane for instrumentation optimized for scientific applications in astrophysics and planetary astronomy for a potential flight mission. Such a flight mission could be undertaken if new or additional sponsorship can be found. Despite this uncertainty, BMDO requested assistance in defining the instrumentation and other design aspects necessary to enhance the scientific value of a pointing and tracking mission. In response to this request, the Space Studies Board established the Task Group on BMDO New Technology Orbital Observatory (TGBNTOO) and charged it to: (1) provide instrumentation, data management, and science-operations advice to BMDO to optimize the scientific value of a 4-meter mission; and (2) support a Space Studies Board assessment of the relative scientific merit of the program. This report deals with the first of these tasks: advising on how to enhance the scientific potential of the Advanced Technology Demonstrator (ATD) program. Given the potential scientific applications of the 4-meter telescope, this project is referred to as the New Technology Orbital Telescope (NTOT), or as the ATD/NTOT, to emphasize its dual-use character. The task group's basic conclusion is that the ATD/NTOT mission does have the potential to contribute in a major way to astronomical goals.

  6. A concept analysis of optimality in perinatal health.

    PubMed

    Kennedy, Holly Powell

    2006-01-01

    This analysis was conducted to describe the concept of optimality and its appropriateness for perinatal health care. The concept was identified in 24 scientific disciplines. Across all disciplines, the universal definition of optimality is the robust, efficient, and cost-effective achievement of best possible outcomes within a rule-governed framework. Optimality, specifically defined for perinatal health care, is the maximal perinatal outcome with minimal intervention placed against the context of the woman's social, medical, and obstetric history.

  7. LBTO's long march to full operation: step 2

    NASA Astrophysics Data System (ADS)

    Veillet, Christian; Ashby, David S.; Christou, Julian C.; Hill, John M.; Little, John K.; Summers, Douglas M.; Wagner, R. Mark; Masciadri, Elena; Turchi, Alessio

    2016-08-01

    Step 1 (Veillet et al. [1]), after a review of the development of the Large Binocular Telescope Observatory (LBTO) from the early concepts of the early 1980s to mid-2014, outlined a six-year plan (LBT2020) aimed at optimizing LBTO's scientific production while mitigating the consequences of the inevitable setbacks brought on by the considerable complexity of the telescope and the very diverse nature of the LBTO partnership. Step 2 focuses on the first two years of implementation of this plan, presenting the obstacles encountered, whether technical, cultural, or political, and how they were overcome. Weather and another incident with one of the Adaptive Secondaries slowed down commissioning activities: all the facility instruments should have been commissioned and offered in binocular mode by early or mid-2016, but this will now happen by the end of 2016. On a brighter side, the first scientific publications using the LBT as a 23-m telescope through interferometry appeared in 2015, and the overall number of publications has been rising at a good pace. Three second-generation instruments were selected, scheduled to arrive at the telescope in the next three to five years. They will all use the excellent performance of the LBT Adaptive Optics (AO), which will improve further thanks to an AO upgrade to be completed in 2018. Less progress than hoped was made in moving the current observing mode of the telescope to an LBT-wide queue. Two years from now, we should have a fully operational telescope, including a laser-based Ground Layer AO (GLAO) system, hopefully fully running in queue, with new instruments in development, new services offered to the users, and a stronger scientific production.

  8. Theoretical Foundations of Wireless Networks

    DTIC Science & Technology

    2015-07-22

    The goal of this project is to develop a formal theory of wireless networks, providing a scientific basis for understanding randomness and optimality. Randomness, in the form of fading, is a defining characteristic of wireless networks. Optimality is a suitable design...

  9. A Near-Optimal Distributed QoS Constrained Routing Algorithm for Multichannel Wireless Sensor Networks

    PubMed Central

    Lin, Frank Yeong-Sung; Hsiao, Chiu-Han; Yen, Hong-Hsu; Hsieh, Yu-Jen

    2013-01-01

    One of the important applications of Wireless Sensor Networks (WSNs) is video surveillance, which includes the tasks of video data processing and transmission. Processing and transmission of image and video data in WSNs has attracted a lot of attention in recent years, giving rise to Wireless Visual Sensor Networks (WVSNs). WVSNs are distributed intelligent systems for collecting image or video data, with unique performance, complexity, and quality-of-service challenges. WVSNs consist of a large number of battery-powered and resource-constrained camera nodes. End-to-end delay is a very important Quality of Service (QoS) metric for video surveillance applications in WVSNs. How to meet stringent delay QoS in resource-constrained WVSNs is a challenging issue that requires novel distributed and collaborative routing strategies. This paper proposes a Near-Optimal Distributed QoS Constrained (NODQC) routing algorithm to achieve end-to-end routes with lower delay and higher throughput. A Lagrangian Relaxation (LR)-based routing metric that considers both the "system perspective" and the "user perspective" is proposed to determine near-optimal routing paths that satisfy end-to-end delay constraints with high system throughput. The empirical results show that the NODQC routing algorithm outperforms others in terms of higher system throughput with lower average end-to-end delay and delay jitter. This paper shows for the first time how to meet the delay QoS while achieving higher system throughput in stringently resource-constrained WVSNs.
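
    The Lagrangian Relaxation idea referred to here can be sketched generically as follows; this is our own reconstruction of the technique, not the paper's exact routing model.

```latex
% Choose routing variables x to maximize a throughput utility U(x),
% subject to an end-to-end delay bound D on each admitted path p with
% per-link delays d_e(x):
\max_x \; U(x)
\quad \text{s.t.} \quad \sum_{e \in p} d_e(x) \le D \quad \forall p .
% Relaxing the delay constraints with multipliers \lambda_p \ge 0 gives
L(x,\lambda) = U(x) - \sum_p \lambda_p \Big( \sum_{e \in p} d_e(x) - D \Big),
% and the dual problem \min_{\lambda \ge 0} \max_x L(x,\lambda) is solved
% by subgradient updates on \lambda.  The inner maximization induces the
% LR-based routing metric, and the duality gap bounds how far the
% resulting routes are from optimal, hence "near-optimal".
```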

  10. Implementation of a multi-threaded framework for large-scale scientific applications

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, Patrick; Jones, C. D.; ...

    2015-05-22

    The CMS experiment has recently completed the development of a multi-threaded capable application framework. In this paper, we discuss the design, implementation, and application of this framework to production applications in CMS. For the 2015 LHC run, this functionality is particularly critical for both our online and offline production applications, which depend on faster turn-around times and a reduced memory footprint relative to before. These applications are complex codes, each including a large number of physics-driven algorithms. While the framework is capable of running a mix of thread-safe and 'legacy' modules, algorithms running in our production applications need to be thread-safe for optimal use of this multi-threaded framework at a large scale. Towards this end, we discuss the types of changes that were necessary for our algorithms to achieve good performance in a full-scale multithreaded application. Lastly, performance numbers for what has been achieved for the 2015 run are presented.

  11. Intraoperative mechanical ventilation for the pediatric patient.

    PubMed

    Kneyber, Martin C J

    2015-09-01

    Invasive mechanical ventilation is required when children undergo general anesthesia for any procedure. It is remarkable that an intervention as widely practiced as pediatric mechanical ventilation is hardly supported by any scientific evidence, resting instead on personal experience and data from adults, especially as ventilation itself is increasingly recognized as a harmful intervention that causes ventilator-induced lung injury. The use of low tidal volumes and higher levels of positive end-expiratory pressure became an integral part of lung-protective ventilation following the outcomes of clinical trials in critically ill adults, and this approach has been readily adopted in pediatric ventilation. However, a clear association between tidal volume and mortality has not been ascertained in pediatrics; in fact, experimental studies have suggested that young children might be less susceptible to ventilator-induced lung injury. As such, no recommendations on an optimal lung-protective ventilation strategy in children with or without lung injury can be made. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Management of cataract in uveitis patients.

    PubMed

    Conway, Mandi D; Stern, Ethan; Enfield, David B; Peyman, Gholam A

    2018-01-01

    This review is timely because the outcomes of surgical intervention in uveitic eyes with cataract can be optimized with adherence to strict anti-inflammatory principles. All eyes should be free of any cell/flare for a minimum of 3 months preoperatively. Another helpful maneuver is to place dexamethasone in the infusion fluid or triamcinolone intracamerally at the end of surgery. Recent reports about the choice of intraocular lens material and lens design are germane to the best surgical outcome. Integrating these findings will promote better visual outcomes and allow research to further refine surgical interventions in high-risk uveitic eyes. Control of inflammation has been shown to greatly improve postoperative outcomes in patients with uveitis. Despite better outcomes, more scientific research needs to be done regarding lens placement and materials, and further research needs to adhere to the standardized reporting of uveitis nomenclature. Future studies should improve postoperative outcomes in eyes with uveitis so that they approach those of eyes undergoing routine cataract procedures.

  13. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  14. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  15. Proceedings of the Workshop on Software Engineering Foundations for End-User Programming (SEEUP 2009)

    DTIC Science & Technology

    2009-11-01

    ...an interesting continuum in how many different requirements a program must satisfy: the more complex and diverse the requirements, the more... Gender differences in approaches to end-user software development have also been reported in debugging feature usage [1] and in end-user web programming...

  16. Exploiting on-node heterogeneity for in-situ analytics of climate simulations via a functional partitioning framework

    NASA Astrophysics Data System (ADS)

    Sapra, Karan; Gupta, Saurabh; Atchley, Scott; Anantharaj, Valentine; Miller, Ross; Vazhkudai, Sudharshan

    2016-04-01

    Efficient resource utilization is critical for improving the end-to-end computing workflows of scientific applications. Heterogeneous node architectures, such as the GPU-enabled Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), present further challenges. In many HPC applications on Titan, the accelerators are the primary compute engines while the CPUs orchestrate the offloading of work onto the accelerators and move the output back to main memory; in applications that do not exploit GPUs, on the other hand, CPU usage is dominant while the GPUs sit idle. We utilized the Heterogeneous Functional Partitioning (HFP) runtime framework, which can optimize the usage of resources on a compute node to expedite an application's end-to-end workflow. This approach differs from existing techniques for in-situ analysis in that it provides a framework for on-the-fly, on-node analysis by dynamically exploiting under-utilized resources. We have implemented in the Community Earth System Model (CESM) a new concurrent diagnostic processing capability enabled by the HFP framework. Various single-variate statistics, such as means and distributions, are computed in situ by launching HFP tasks on the GPU via the node-local HFP daemon. Since our current configuration of CESM does not use GPU resources heavily, we can move these tasks to the GPU using the HFP framework. Each rank running the atmospheric model in CESM pushes the variables of interest via HFP function calls to the HFP daemon. This node-local daemon is responsible for receiving the data from the main program and launching the designated analytics tasks on the GPU. We have implemented these analytics tasks in C and use OpenACC directives to enable GPU acceleration. This methodology is also advantageous when executing GPU-enabled configurations of CESM, where the CPUs are idle during portions of the runtime. Our results demonstrate that it is more efficient to use the HFP framework to offload these tasks to GPUs than to perform them in the main application: we observe increased resource utilization and overall productivity when using the HFP framework for the end-to-end workflow.
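
    The push-to-daemon pattern described above can be sketched as follows. The class and the push stand-in here are hypothetical illustrations of the pattern, not the actual HFP or CESM API, and the statistics run on the CPU rather than on a GPU via OpenACC as in the real system.

```python
import numpy as np

class RunningStats:
    """Streaming mean/variance accumulator a node-local daemon might
    apply to each pushed field (Welford's algorithm)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def push_chunk(self, values):
        for x in np.ravel(values):
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Hypothetical stand-in for an HFP push: each simulation rank hands a
# field (e.g., surface temperature in kelvin) to the daemon, which
# updates the statistics off the application's critical path.
daemon_stats = RunningStats()
for timestep in range(3):
    field = np.random.default_rng(timestep).normal(288.0, 5.0, size=1024)
    daemon_stats.push_chunk(field)      # stands in for an hfp_push(field)
print(daemon_stats.mean, daemon_stats.variance())
```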

  17. Build infrastructure in publishing scientific journals to benefit medical scientists

    PubMed Central

    Dai, Ni; Xu, Dingyao; Zhong, Xiyao; Li, Li; Ling, Qibo

    2014-01-01

    There is urgent need for medical journals to optimize their publishing processes and strategies to satisfy the huge need for medical scientists to publish their articles, and then obtain better prestige and impact in scientific and research community. These strategies include optimizing the process of peer-review, utilizing open-access publishing models actively, finding ways of saving costs and getting revenue, smartly dealing with research fraud or misconduct, maintaining sound relationship with pharmaceutical companies, and managing to provide relevant and useful information for clinical practitioners and researchers. Scientists, publishers, societies and organizations need to work together to publish internationally renowned medical journals. PMID:24653634

  18. Build infrastructure in publishing scientific journals to benefit medical scientists.

    PubMed

    Dai, Ni; Xu, Dingyao; Zhong, Xiyao; Li, Li; Ling, Qibo; Bu, Zhaode

    2014-02-01

    There is an urgent need for medical journals to optimize their publishing processes and strategies to satisfy the huge need of medical scientists to publish their articles and thereby obtain better prestige and impact in the scientific and research community. These strategies include optimizing the process of peer review, actively utilizing open-access publishing models, finding ways of saving costs and generating revenue, dealing smartly with research fraud or misconduct, maintaining sound relationships with pharmaceutical companies, and managing to provide relevant and useful information for clinical practitioners and researchers. Scientists, publishers, societies and organizations need to work together to publish internationally renowned medical journals.

  19. The development of scientific identification theory to conduct operation research in education management

    NASA Astrophysics Data System (ADS)

    Hardhienata, S.

    2017-01-01

    Operations research is a general method used in the study and optimization of a system through modeling of the system. In the field of education, especially in education management, operations research has not been widely used. This paper gives an exposition of how operations research can be used to conduct research and optimization in the field of education management by developing SITOREM (Scientific Identification Theory for Operation Research in Education Management). To clarify the idea, an example of applying SITOREM to enhance the professional commitment of lecturers in support of achieving a university's vision is described.

  20. Skylab

    NASA Image and Video Library

    1972-02-01

    The final version of the Marshall Space Flight Center-managed Skylab consisted of four primary parts. One component was the Apollo Telescope Mount (ATM), which housed the first manned scientific telescopes in space. This picture is a view of the ATM spar, which contained the scientific instruments, as the multiple docking adapter (MDA) canister end is lowered over it. The MDA served to link the major parts of Skylab together.

  1. Which Sweetener Is Best for Yeast? An Inquiry-Based Learning for Conceptual Change

    ERIC Educational Resources Information Center

    Cherif, Abour H.; Siuda, JoElla E.; Kassem, Sana; Gialamas, Stefanos; Movahedzadeh, Farahnaz

    2017-01-01

    One way to help students understand the scientific inquiry process, and how it applies in investigative research, is to involve them in scientific investigation. An example of this would be letting them come to their own understanding of how different variables (e.g., starting products) can affect outcomes (e.g., variable quality end products)…

  2. "Probably True" Says the Expert: How Two Types of Lexical Hedges Influence Students' Evaluation of Scientificness

    ERIC Educational Resources Information Center

    Thiebach, Monja; Mayweg-Paus, Elisabeth; Jucks, Regina

    2015-01-01

    Contemporary school learning typically includes the processing of popular scientific information as found in journals, magazines, and/or the WWW. The German high school curriculum emphasizes that students should have achieved science literacy and have learned to evaluate the substance of text-based learning content by the end of high school.…

  3. Using a Historical Controversy to Teach Critical Thinking, the Meaning of "Theory", and the Status of Scientific Knowledge

    ERIC Educational Resources Information Center

    Montgomery, Keith

    2009-01-01

    It is important that students understand the "open-ended" nature of scientific knowledge and the correct relationship between facts and theory. One way this can be taught is to examine a past controversy in which the interpretation of facts was contested. The controversy discussed here, with suggestions for teaching, is "Expanding…

  4. The "History" of Victorian Scientific Naturalism: Huxley, Spencer and the "End" of natural history.

    PubMed

    Lightman, Bernard

    2016-08-01

    As part of their defence of evolutionary theory, T. H. Huxley and Herbert Spencer argued that natural history was no longer a legitimate scientific discipline. They outlined a secularized concept of life from biology to argue for the validity of naturalism. Despite their support for naturalism, they offered two different responses to the decline of natural history. Whereas Huxley emphasized the creation of a biological discipline, and all that that entailed, Spencer was more concerned with constructing an entire intellectual system based on the idea of evolution. In effect, Spencer wanted to create a new scientific worldview based on evolutionary theory. This had consequences for their understanding of human history, especially of how science had evolved through the ages. It affected their conceptions of human agency, contingency, and directionality in history. Examining Huxley's and Spencer's responses to the "end" of natural history reveals some of the deep divisions within scientific naturalism and the inherent problems of naturalism in general. Whereas Huxley chose to separate the natural and the historical, Spencer opted to fuse them into a single system. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  5. Getting real with the upcoming challenge of electronic nicotine delivery systems: The way forward for the South-East Asia region.

    PubMed

    Kaur, Jagdish; Rinkoo, Arvind Vashishta

    2017-09-01

    Electronic nicotine delivery systems (ENDS) are being marketed to tobacco smokers for use in places where smoking is not allowed, or as aids, similar to pharmaceutical nicotine products, to help cigarette smokers quit tobacco use. They are often flavored to make them more attractive to youth, and ENDS use may lead young nonsmokers to take up tobacco products. Neither the safety of ENDS nor their efficacy as a cessation aid has been scientifically demonstrated. The adverse health effects of secondhand aerosol cannot be ruled out. Weak regulation of these products might contribute to the expansion of the ENDS market, in which tobacco companies have a substantial stake, potentially renormalizing smoking habits and negating years of intense tobacco control campaigning. The current situation calls for galvanizing policy makers to gear up to this challenge in the South-East Asia Region (SEAR), where the high burden of tobacco use is compounded by a large young and vulnerable population and limited established tobacco cessation facilities. Banning ENDS in the SEAR seems to be the most plausible approach at present. In the SEAR, Timor-Leste, the Democratic People's Republic of Korea, and Thailand have taken the lead in banning these products; the other countries of the SEAR should follow suit. The SEAR countries may, however, choose to revise their strategy if unbiased scientific evidence emerges about the efficacy of ENDS as a tobacco cessation aid. The ENDS industry must show true motivation and willingness to develop and test ENDS as effective pharmaceutical tools in the regional context before asking for market authorization.

  6. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  7. Comparison and Evaluation of End-User Interfaces for Online Public Access Catalogs.

    ERIC Educational Resources Information Center

    Zumer, Maja

    End-user interfaces for the online public access catalogs (OPACs) of OhioLINK, a system linking major university and research libraries in Ohio, and its 16 member libraries, accessible through the Internet, are compared and evaluated from the user-oriented perspective. A common, systematic framework was used for the scientific observation of the…

  8. Scientific Instruments for Education in Early Twentieth-Century Spain

    ERIC Educational Resources Information Center

    Ruiz-Castell, Pedro

    2008-01-01

    1898 marked a crucial point in the end of the nineteenth-century Spanish crisis. The military defeat ending the Spanish-American War was seen as proof that the country was in terminal decline. With the ideals of regeneration spreading throughout Spanish society, the State became more interested in supporting and sponsoring science and technology,…

  9. Teachers' Tendencies to Promote Student-Led Science Projects: Associations with Their Views about Science

    ERIC Educational Resources Information Center

    Bencze, J. Lawrence; Bowen, G. Michael; Alsop, Steve

    2006-01-01

    School science students can benefit greatly from participation in student-directed, open-ended scientific inquiry projects. For various possible reasons, however, students tend not to be engaged in such inquiries. Among factors that may limit their opportunities to engage in open-ended inquiries of their design are teachers' conceptions about…

  10. Open-Ended Science Inquiry in Lower Secondary School: Are Students' Learning Needs Being Met?

    ERIC Educational Resources Information Center

    Whannell, Robert; Quinn, Fran; Taylor, Subhashni; Harris, Katherine; Cornish, Scott; Sharma, Manjula

    2018-01-01

    Australian science curricula have promoted the use of investigations that allow secondary students to engage deeply with the methods of scientific inquiry, through student-directed, open-ended investigations over an extended duration. This study presents the analysis of data relating to the frequency of completion and attitudes towards long…

  11. Assessing Teaching and Assessment Competences of Biology Teacher Trainees: Lessons from Item Development

    ERIC Educational Resources Information Center

    Hasse, Sascha; Joachim, Cora; Bögeholz, Susanne; Hammann, Marcus

    2014-01-01

    In Germany, science education standards for students at the end of grade nine have been in existence since 2005. Some of these standards are dedicated to scientific inquiry (e.g. experimentation). They describe which abilities learners are expected to possess at the end of grade nine. In the USA, several documents describe standards for…

  12. Assessing Teaching and Assessment Competences of Biology Teacher Trainees: Lessons from Item Development

    ERIC Educational Resources Information Center

    Hasse, Sascha; Joachim, Cora; Bögeholz, Susanne; Hammann, Marcus

    2014-01-01

    In Germany, science education standards for students at the end of grade nine have been in existence since 2005. Some of these standards are dedicated to scientific inquiry (e.g. experimentation). They describe which abilities learners are expected to possess at the end of grade nine. In the USA, several documents describe standards for…

  13. Building Scalable Knowledge Graphs for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick; Zhang, Jia; Duan, Xiaoyi; Miller, J. J.; Bugbee, Kaylin; Christopher, Sundar; Freitag, Brian

    2017-01-01

    Knowledge Graphs link key entities in a specific domain with other entities via relationships. From these relationships, researchers can query knowledge graphs for probabilistic recommendations to infer new knowledge. Scientific papers are an untapped resource which knowledge graphs could leverage to accelerate research discovery. Goal: Develop an end-to-end (semi) automated methodology for constructing Knowledge Graphs for Earth Science.

  14. MEPAG Recommendations for a 2018 Mars Sample Return Caching Lander - Sample Types, Number, and Sizes

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2011-01-01

    The return to Earth of geological and atmospheric samples from the surface of Mars is among the highest priority objectives of planetary science. The MEPAG Mars Sample Return (MSR) End-to-End International Science Analysis Group (MEPAG E2E-iSAG) was chartered to propose scientific objectives and priorities for returned sample science, and to map out the implications of these priorities, including for the proposed joint ESA-NASA 2018 mission that would be tasked with the crucial job of collecting and caching the samples. The E2E-iSAG identified four overarching scientific aims that relate to understanding: (A) the potential for life and its pre-biotic context, (B) the geologic processes that have affected the martian surface, (C) the planetary evolution of Mars and its atmosphere, and (D) the potential for future human exploration. The types of samples deemed most likely to achieve the science objectives are, in priority order: (1A) subaqueous or hydrothermal sediments; (1B) hydrothermally altered rocks or low-temperature fluid-altered rocks (equal priority); (2) unaltered igneous rocks; (3) regolith, including airfall dust; and (4) present-day atmosphere and samples of sedimentary-igneous rocks containing ancient trapped atmosphere. Collection of geologically well-characterized sample suites would add considerable value to interpretations of all collected rocks. To achieve this, the total number of rock samples should be about 30-40. In order to evaluate the size of individual samples required to meet the science objectives, the E2E-iSAG reviewed the analytical methods that would likely be applied to the returned samples by preliminary examination teams, for planetary protection (i.e., life detection, biohazard assessment) and, after distribution, by individual investigators. It was concluded that sample size should be sufficient to perform all high-priority analyses in triplicate. In keeping with the long-established curatorial practice for extraterrestrial material, at least 40% by mass of each sample should be preserved to support future scientific investigations. Samples of 15-16 grams are considered optimal. The total mass of returned rocks, soils, blanks and standards should be approximately 500 grams. Atmospheric gas samples should be the equivalent of 50 cubic cm at 20 times Mars ambient atmospheric pressure.

  15. I/O Performance Characterization of Lustre and NASA Applications on Pleiades

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Rappleye, Jason; Chang, Johnny; Barker, David Peter; Biswas, Rupak; Mehrotra, Piyush

    2012-01-01

    In this paper we study the performance of the Lustre file system using five scientific and engineering applications representative of the NASA workload on large-scale supercomputing systems such as NASA's Pleiades. In order to facilitate the collection of Lustre performance metrics, we have developed a software tool that exports a wide variety of client- and server-side metrics using SGI's Performance Co-Pilot (PCP), and generates a human-readable report on key metrics at the end of a batch job. These performance metrics are (a) the amount of data read and written, (b) the number of files opened and closed, and (c) the remote procedure call (RPC) size distribution (4 KB to 1024 KB, in powers of 2) for I/O operations. The RPC size distribution measures the efficiency of the Lustre client and can pinpoint problems such as small write sizes, disk fragmentation, etc. These extracted statistics are useful in determining the I/O pattern of an application and can assist in identifying possible improvements. Information on the number of file operations enables scientists to optimize the I/O performance of their applications, and the amount of I/O data helps users choose the optimal stripe size and stripe count to enhance I/O performance. In this paper, we demonstrate the usefulness of this tool on Pleiades for five production-quality NASA scientific and engineering applications. We compare the latency of read and write operations under Lustre to that under NFS by tracing system calls and signals. We also investigate the read and write policies and study the effect of page cache size on I/O operations. We examine the performance impact of Lustre stripe size and stripe count, along with a performance evaluation of file-per-process and single-shared-file access by all processes, for the NASA workload using the parameterized IOR benchmark.
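
    To make the RPC size distribution metric concrete, the sketch below bins one RPC into the power-of-two buckets (4 KB to 1024 KB) the abstract describes. The function name and histogram layout are hypothetical, not the tool's actual implementation.

      #include <stddef.h>

      #define RPC_NBINS 9   /* 4, 8, 16, ..., 1024 KB: nine power-of-two bins */

      /* Accumulate one RPC of the given size into the histogram. */
      void rpc_bin(size_t bytes, unsigned long hist[RPC_NBINS])
      {
          size_t kb = bytes / 1024;
          int bin = 0;
          while ((4UL << bin) < kb && bin < RPC_NBINS - 1)
              bin++;                /* advance until the bin covers kb */
          hist[bin]++;
      }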

  16. Optimization, an Important Stage of Engineering Design

    ERIC Educational Resources Information Center

    Kelley, Todd R.

    2010-01-01

    A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…

  17. Optimizing Open-Ended Crowdsourcing: The Next Frontier in Crowdsourced Data Management.

    PubMed

    Parameswaran, Aditya; Sarma, Akash Das; Venkataraman, Vipul

    2016-12-01

    Crowdsourcing is the primary means to generate training data at scale, and when combined with sophisticated machine learning algorithms, crowdsourcing is an enabler for a variety of emergent automated applications impacting all spheres of our lives. This paper surveys the emerging field of formally reasoning about and optimizing open-ended crowdsourcing, a popular and crucially important, but severely understudied, class of crowdsourcing: the next frontier in crowdsourced data management. The underlying challenges include distilling the right answer when none of the workers agree with each other, teasing apart the various perspectives adopted by workers when answering tasks, and effectively selecting between the many open-ended operators appropriate for a problem. We describe the approaches that we have found to be effective for open-ended crowdsourcing, drawing from our experiences in this space.

  18. Tool Support for Software Lookup Table Optimization

    DOE PAGES

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
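
    For readers unfamiliar with the technique, a hand-written LUT of the kind Mesa generates automatically might look like the sketch below. The domain bounds, table size, and linear interpolation scheme are illustrative assumptions; choosing them well is precisely the profiling and error-analysis burden the tool removes.

      #include <math.h>

      #define LUT_SIZE 4096   /* illustrative table size                  */
      #define LUT_MIN  0.0    /* illustrative profiled input domain over  */
      #define LUT_MAX  10.0   /* which exp() results are reused           */

      static double lut[LUT_SIZE + 1];
      static double lut_step;

      /* Precompute the table once; thereafter each call is a load plus a
         linear interpolation instead of a call to exp(). */
      void lut_init(void)
      {
          lut_step = (LUT_MAX - LUT_MIN) / LUT_SIZE;
          for (int i = 0; i <= LUT_SIZE; i++)
              lut[i] = exp(LUT_MIN + i * lut_step);
      }

      /* Approximate exp(x) for x in [LUT_MIN, LUT_MAX]; the table size
         sets the accuracy/performance tradeoff the paper discusses. */
      double lut_exp(double x)
      {
          double t = (x - LUT_MIN) / lut_step;
          int    i = (int)t;
          if (i >= LUT_SIZE)      /* clamp the upper domain edge */
              i = LUT_SIZE - 1;
          return lut[i] + (t - i) * (lut[i + 1] - lut[i]);
      }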

  19. Is it all in the game? Flow experience and scientific practices during an INPLACE mobile game

    NASA Astrophysics Data System (ADS)

    Bressler, Denise M.

    Mobile science learning games show promise for promoting scientific practices and high engagement. Researchers have quantified this engagement according to flow theory. Using an embedded mixed-methods design, this study investigated whether an INPLACE mobile game promotes flow experience, scientific practices, and effective team collaboration. Students playing the game (n=59) were compared with students in a business-as-usual control activity (n=120). Using an open-ended instrument designed to measure scientific practices and a self-report flow survey, this study empirically assessed flow and learners' scientific practices. The game players had significantly higher levels of flow and scientific practices. Using a multiple case study approach, collaboration among game teams (n=3 teams) was qualitatively compared with that among control teams (n=3 teams). Game teams revealed not only higher levels of scientific practices but also higher levels of engaged responses and communal language. Control teams revealed lower levels of scientific practice along with higher levels of rejecting responses and command language. Implications of these findings are discussed.

  20. Linear triangular optimization technique and pricing scheme in residential energy management systems

    NASA Astrophysics Data System (ADS)

    Anees, Amir; Hussain, Iqtadar; AlKhaldi, Ali Hussain; Aslam, Muhammad

    2018-06-01

    This paper presents a new linear optimization algorithm for the power scheduling of electric appliances. The proposed system is applied in a smart home community, in which a community controller acts as a virtual distribution company for the end consumers. We also present a pricing scheme between the community controller and its residential users based on real-time pricing and inclining block rates. The results of the proposed optimization algorithm demonstrate that, by applying the anticipated technique, not only can end users minimise their consumption cost, but the peak-to-average power ratio can also be reduced, which will be beneficial for the utilities as well.
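
    The paper's exact pricing model is not reproduced here, but a generic combination of a real-time price with inclining block rates can be sketched as follows; all thresholds and multipliers are hypothetical.

      /* Hypothetical hourly cost: a real-time base price scaled by
         inclining block-rate multipliers as consumption crosses
         (illustrative) tier thresholds. */
      double hourly_cost(double load_kwh, double rt_price)
      {
          const double t1 = 5.0, t2 = 10.0;            /* kWh thresholds   */
          const double m1 = 1.0, m2 = 1.25, m3 = 1.6;  /* tier multipliers */
          double cost = 0.0, remaining = load_kwh;

          double b1 = remaining < t1 ? remaining : t1;
          cost += b1 * rt_price * m1;                  /* first tier  */
          remaining -= b1;

          double b2 = remaining < (t2 - t1) ? remaining : (t2 - t1);
          cost += b2 * rt_price * m2;                  /* second tier */
          remaining -= b2;

          cost += remaining * rt_price * m3;           /* above t2    */
          return cost;
      }

    Under such a tariff, shifting load away from high-tier, high-price hours reduces both the consumer's bill and the peak-to-average ratio, which is the coupling the abstract exploits.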

  1. Global Snow from Space: Development of a Satellite-based, Terrestrial Snow Mission Planning Tool

    NASA Astrophysics Data System (ADS)

    Forman, B. A.; Kumar, S.; LeMoigne, J.; Nag, S.

    2017-12-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary - or perhaps contradictory - information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  2. Sharing clinical trial data on patient level: Opportunities and challenges

    PubMed Central

    Koenig, Franz; Slattery, Jim; Groves, Trish; Lang, Thomas; Benjamini, Yoav; Day, Simon; Bauer, Peter; Posch, Martin

    2015-01-01

    In recent months one of the most controversially discussed topics among regulatory agencies, the pharmaceutical industry, journal editors, and academia has been the sharing of patient-level clinical trial data. Several projects have been started, such as the European Medicines Agency's (EMA) “proactive publication of clinical trial data”, the BMJ open data campaign, or the AllTrials initiative. The executive director of the EMA, Dr. Guido Rasi, has recently announced that clinical trial data on patient level will be published from 2014 onwards (although it has since been delayed). The EMA draft policy on proactive access to clinical trial data was published at the end of June 2013 and was open for public consultation until the end of September 2013. These initiatives will change the landscape of drug development and publication of medical research. They provide unprecedented opportunities for research and research synthesis, but pose new challenges for regulatory authorities, sponsors, scientific journals, and the public. Besides these general aspects, data sharing also entails intricate biostatistical questions such as problems of multiplicity. An important issue in this respect is the interpretation of multiple statistical analyses, both prospective and retrospective. Expertise in biostatistics is needed to assess the interpretation of such multiple analyses, for example, in the context of regulatory decision-making by optimizing procedural guidance and sophisticated analysis methods. PMID:24942505

  3. Towards the Development of a Global, Satellite-based, Terrestrial Snow Mission Planning Tool

    NASA Technical Reports Server (NTRS)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical orbital configuration. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary, or perhaps contradictory, information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: 1. What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? 2. How might observations be coordinated (in space and time) to maximize utility? 3. What is the additional utility associated with an additional observation? 4. How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  4. Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool

    NASA Technical Reports Server (NTRS)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  5. Feasibility study of the solar scientific instruments for Spacelab/Orbiter

    NASA Technical Reports Server (NTRS)

    Leritz, J.; Rasser, T.; Stone, E.; Lockhart, B.; Nobles, W.; Parham, J.; Eimers, D.; Peterson, D.; Barnhart, W.; Schrock, S.

    1981-01-01

    The feasibility and economics of mounting and operating a set of solar scientific instruments in the backup Skylab Apollo Telescope Mount (ATM) hardware were evaluated. The instruments used as the study test payload and integrated into the ATM were: the Solar EUV Telescope/Spectrometer, the Solar Active Region Observing Telescope, and the Lyman Alpha White Light Coronagraph. The backup ATM hardware consists of a central cruciform structure, called the "Spar", a "Sun End Canister", and a "Multiple Docking Adapter End Canister". Basically, the ATM hardware and software provide a structural interface for the instruments, a closely controlled thermal environment, and a very accurate attitude and pointing control capability. The hardware is a set identical to the hardware that flew on Skylab.

  6. Proceedings of the Scientific Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K. (Editor)

    1989-01-01

    Continuing advances in space and Earth science require increasing amounts of data to be gathered from spaceborne sensors. NASA expects to launch sensors during the next two decades which will be capable of producing an aggregate of 1500 Megabits per second if operated simultaneously. Such high data rates cause stresses in all aspects of end-to-end data systems. Technologies and techniques are needed to relieve such stresses. Potential solutions to the massive data rate problems are: data editing, greater transmission bandwidths, higher-density and faster media, and data compression. Through four subpanels on Science Payload Operations, Multispectral Imaging, Microwave Remote Sensing, and Science Data Management, recommendations were made for research in data compression and scientific data applications to space platforms.

  7. Optimal design of a smart post-buckled beam actuator using bat algorithm: simulations and experiments

    NASA Astrophysics Data System (ADS)

    Mallick, Rajnish; Ganguli, Ranjan; Kumar, Ravi

    2017-05-01

    The optimized design of a smart post-buckled beam actuator (PBA) is performed in this study. A smart-material-based piezoceramic stack actuator is used as a prime mover to drive the buckled beam actuator. Piezoceramic actuators are high-force, small-displacement devices; they possess high energy density and have high bandwidth. In this study, benchtop experiments are conducted to investigate the angular tip deflections due to the PBA. A new design of a linear-to-linear motion amplification device (LX-4) is developed to circumvent the small-displacement handicap of piezoceramic stack actuators. LX-4 enhances the piezoceramic actuator's mechanical leverage by a factor of four. The PBA model is based on dynamic elastic stability and is analyzed using the Mathieu-Hill equation. A formal optimization is carried out using a newly developed meta-heuristic, nature-inspired algorithm, named the bat algorithm (BA). The BA utilizes the echolocation capability of bats. An optimized PBA in conjunction with LX-4 generates end rotations of the order of 15° at the output end. The optimized PBA design incurs less weight and induces large end rotations, which will be useful in the development of various mechanical and aerospace devices, such as helicopter trailing-edge flaps, micro and nano aerial vehicles, and other robotic systems.
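
    For context, the dynamic elastic stability analysis mentioned rests on Hill's equation with a harmonic coefficient, commonly written in the Mathieu form below; how the coefficients map onto the beam's axial preload and piezoelectric excitation is specific to the paper's model and is not reproduced here.

      \ddot{x}(t) + \left[\delta + 2\epsilon\cos(2t)\right] x(t) = 0

    Stability of the buckled equilibrium then depends on whether the operating point (δ, ε) falls inside or outside the parametric resonance regions of the Mathieu stability chart.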

  8. Objective lens simultaneously optimized for pupil ghosting, wavefront delivery and pupil imaging

    NASA Technical Reports Server (NTRS)

    Olczak, Eugene G (Inventor)

    2011-01-01

    An objective lens includes multiple optical elements disposed between a first end and a second end, each optical element oriented along an optical axis. Each optical surface of the multiple optical elements provides an angle of incidence to a marginal ray that is above a minimum threshold angle. This threshold angle minimizes pupil ghosts that may enter an interferometer. The objective lens also optimizes wavefront delivery and pupil imaging onto an optical surface under test.

  9. Development of optimized, graded-permeability axial groove heat pipes

    NASA Technical Reports Server (NTRS)

    Kapolnek, Michael R.; Holmes, H. Rolland

    1988-01-01

    Heat pipe performance can usually be improved by uniformly varying or grading wick permeability from end to end. A unique and cost effective method for grading the permeability of an axial groove heat pipe is described - selective chemical etching of the pipe casing. This method was developed and demonstrated on a proof-of-concept test article. The process improved the test article's performance by 50 percent. Further improvement is possible through the use of optimally etched grooves.

  10. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    DOE PAGES

    Nord, B.; Amara, A.; Refregier, A.; ...

    2016-03-03

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). As a result, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  11. Throughput Optimization of Continuous Biopharmaceutical Manufacturing Facilities.

    PubMed

    Garcia, Fernando A; Vandiver, Michael W

    2017-01-01

    In order to operate profitably under different product demand scenarios, biopharmaceutical companies must design their facilities with mass output flexibility in mind. Traditional biologics manufacturing technologies pose operational challenges in this regard due to their high costs and slow equipment turnaround times, restricting the types of products and mass quantities that can be processed. Modern plant design, however, has facilitated the development of lean and efficient bioprocessing facilities through footprint reduction and adoption of disposable and continuous manufacturing technologies. These development efforts have proven to be crucial in seeking to drastically reduce the high costs typically associated with the manufacturing of recombinant proteins. In this work, mathematical modeling is used to optimize annual production schedules for a single-product commercial facility operating with a continuous upstream and discrete batch downstream platform. Utilizing cell culture duration and volumetric productivity as process variables in the model, and annual plant throughput as the optimization objective, 3-D surface plots are created to understand the effect of process and facility design on expected mass output. The model shows that once a plant has been fully debottlenecked it is capable of processing well over a metric ton of product per year. Moreover, the analysis helped to uncover a major limiting constraint on plant performance, the stability of the neutralized viral inactivated pool, which may indicate that this should be a focus of attention during future process development efforts. LAY ABSTRACT: Biopharmaceutical process modeling can be used to design and optimize manufacturing facilities and help companies achieve a predetermined set of goals. One way to perform optimization is by making the most efficient use of process equipment in order to minimize the expenditure of capital, labor and plant resources. To that end, this paper introduces a novel mathematical algorithm used to determine the most optimal equipment scheduling configuration that maximizes the mass output for a facility producing a single product. The paper also illustrates how different scheduling arrangements can have a profound impact on the availability of plant resources, and identifies limiting constraints on the plant design. In addition, simulation data is presented using visualization techniques that aid in the interpretation of the scientific concepts discussed. © PDA, Inc. 2017.
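
    A deliberately simplified version of the mass-output arithmetic behind such a model is sketched below, assuming a continuous (perfusion) upstream characterized by a volumetric productivity and batches delimited by culture duration plus turnaround. The function and all parameters are hypothetical; the paper's algorithm additionally optimizes equipment scheduling, which this sketch ignores.

      /* Back-of-the-envelope annual mass output for a facility with a
         continuous upstream and batch downstream; all inputs are
         illustrative, not values from the paper. */
      double annual_output_kg(double productivity_g_per_l_day,
                              double bioreactor_volume_l,
                              double culture_days,
                              double turnaround_days,
                              double downstream_yield)   /* 0..1 */
      {
          double batches_per_year = 365.0 / (culture_days + turnaround_days);
          double harvest_g = productivity_g_per_l_day * bioreactor_volume_l
                             * culture_days;
          return batches_per_year * harvest_g * downstream_yield / 1000.0;
      }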

  12. Impact of Including Authentic Inquiry Experiences in Methods Courses for Pre-Service Secondary Teachers

    NASA Astrophysics Data System (ADS)

    Slater, T. F.; Elfring, L.; Novodvorsky, I.; Talanquer, V.; Quintenz, J.

    2007-12-01

    Science education reform documents universally call for students to have authentic and meaningful experiences using real data in the context of their science education. The underlying philosophical position is that students analyzing data can have experiences that mimic actual research. In short, research experiences that reflect the scientific spirit of inquiry potentially can: prepare students to address real-world complex problems; develop students' ability to use scientific methods; prepare students to critically evaluate the validity of data or evidence and of the consequent interpretations or conclusions; teach quantitative skills, technical methods, and scientific concepts; increase verbal, written, and graphical communication skills; and train students in the values and ethics of working with scientific data. However, it is unclear what the broader pre-service teacher preparation community is doing to prepare future teachers to promote, manage, and successfully facilitate their own students in conducting authentic scientific inquiry. Surveys of undergraduates in secondary science education programs suggest that students have had almost no experience themselves in conducting open scientific inquiry where they develop researchable questions, design strategies to pursue evidence, and communicate data-based conclusions. In response, the College of Science Teacher Preparation Program at the University of Arizona requires all students enrolled in its various science teaching methods courses to complete an open inquiry research project and defend their findings at a specially designed inquiry science mini-conference at the end of the term. End-of-term surveys show that students enjoy their research experience and believe that this experience enhances their ability to facilitate their own future students in conducting open inquiry.

  13. End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Kelly, Patrick; Rickman, Douglas; Smith, Eric

    1991-01-01

    The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) had been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL include all major areas of the field. STL includes the Sensor Development Laboratory (SDL), an Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.

  14. Strength training for the warfighter.

    PubMed

    Kraemer, William J; Szivak, Tunde K

    2012-07-01

    Optimizing strength training for the warfighter is challenged by past training philosophies that no longer serve the modern warfighter facing the "anaerobic battlefield." Training approaches for integration of strength with other needed physical capabilities have been shown to require a periodization model that has the flexibility for changes and is able to adapt to ever-changing circumstances affecting the quality of workouts. Additionally, sequencing of workouts to limit over-reaching and development of overtraining syndromes that end in loss of duty time and injury are paramount to long-term success. Allowing adequate time for rest and recovery and recognizing the negative influences of extreme exercise programs and excessive endurance training will be vital in moving physical training programs into a more modern perspective as used by elite strength-power anaerobic athletes in sports today. Because the warfighter is an elite athlete, it is time that training approaches that are scientifically based are updated within the military to match the functional demands of modern warfare and are given greater credence and value at the command levels. A needs analysis, development of periodized training modules, and individualization of programs are needed to optimize the strength of the modern warfighter. We now have the knowledge, professional coaches and nonprofit organization certifications with continuing education units, and modern training technology to allow this to happen. Ultimately, it only takes command decisions and implementation to make this possible.

  15. Detective quantum efficiency: a standard test to ensure optimal detector performance and low patient exposures

    NASA Astrophysics Data System (ADS)

    Escartin, Terenz R.; Nano, Tomi F.; Cunningham, Ian A.

    2016-03-01

    The detective quantum efficiency (DQE), expressed as a function of spatial frequency, describes the ability of an x-ray detector to produce high signal-to-noise ratio (SNR) images. While regulatory and scientific communities have used the DQE as a primary metric for optimizing detector design, the DQE is rarely used by end users to ensure that high system performance is maintained. Of concern is that image quality varies across different systems for the same exposures, with no current measures available to describe system performance. We therefore conducted an initial DQE measurement survey of clinical x-ray systems using a DQE-testing instrument to identify their range of performance. Following laboratory validation, experiments revealed that the DQE of five different systems under the same exposure level (8.0 μGy) ranged from 0.36 to 0.75 at low spatial frequencies, and 0.02 to 0.4 at high spatial frequencies (3.5 cycles/mm). Furthermore, the DQE dropped substantially with decreasing detector exposure, by a factor of up to 1.5x at the lowest spatial frequency and a factor of 10x at 3.5 cycles/mm, due to the effect of detector readout noise. It is concluded that DQE specifications in purchasing decisions, combined with periodic DQE testing, are important factors to ensure patients receive the health benefits of high-quality images at low x-ray exposures.
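
    For reference, the conventional frequency-dependent definition underlying these measurements is the ratio of output to input squared signal-to-noise ratio, commonly expressed in terms of the modulation transfer function (MTF), the noise power spectrum (NPS), the mean signal level d̄, and the incident photon fluence q̄:

      \mathrm{DQE}(f)
        = \frac{\mathrm{SNR}_{\mathrm{out}}^{2}(f)}{\mathrm{SNR}_{\mathrm{in}}^{2}(f)}
        = \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NPS}(f)}

    Because readout noise adds an exposure-independent term to the NPS, its relative contribution grows as detector exposure falls, which is consistent with the sharp low-exposure DQE drop reported above.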

  16. Design and optimization of anode flow field of a large proton exchange membrane fuel cell for high hydrogen utilization

    NASA Astrophysics Data System (ADS)

    Yesilyurt, Serhat; Rizwandi, Omid

    2016-11-01

    We developed a CFD model of the anode flow field of a large proton exchange membrane fuel cell that operates under ultra-low stoichiometric (ULS) flow conditions, which aim to mitigate the disadvantages of dead-ended operation such as severe voltage transients and carbon corrosion. The very small exit velocity must be high enough to remove accumulated nitrogen, yet low enough to retain hydrogen in the active area. Stokes equations are used to model the flow distribution in the flow field, Maxwell-Stefan equations are used to model the transport of the species, and a voltage model is developed to model the reaction kinetics. Uniformity of the hydrogen concentration distribution is quantified as the normalized area of the region in which the hydrogen mole fraction remains above a certain level, such as 0.9. The geometry of the anode flow field is modified to obtain the optimal configuration; the number of baffles at the inlet, the width of the gaps between baffles, the width of the side gaps, and the length of the central baffle are used as design variables. In the final design, the hydrogen-depleted region is less than 0.2% and the hydrogen utilization is above 99%. This work was supported by The Scientific and Technological Research Council of Turkey, TUBITAK-213M023.

  17. Opposing ends of the spectrum: Exploring trust in scientific and religious authorities.

    PubMed

    Cacciatore, Michael A; Browning, Nick; Scheufele, Dietram A; Brossard, Dominique; Xenos, Michael A; Corley, Elizabeth A

    2018-01-01

    Given the ethical questions that surround emerging science, this study is interested in studying public trust in scientific and religious authorities for information about the risks and benefits of science. Using data from a nationally representative survey of American adults, we employ regression analysis to better understand the relationships between several variables (including values, knowledge, and media attention) and trust in religious organizations and scientific institutions. We found that Evangelical Christians are generally more trusting of religious authority figures to tell the truth about the risks and benefits of science and technology, and only slightly less likely than non-Evangelicals to trust scientific authorities for the same information. We also found that many Evangelicals use mediated information and science knowledge differently than non-Evangelicals, with both increased knowledge and attention to scientific media having positive impacts on trust in scientific authorities among the latter, but not the former group.

  18. Framing new research in science literacy and language use: Authenticity, multiple discourses, and the Third Space

    NASA Astrophysics Data System (ADS)

    Wallace, Carolyn S.

    2004-11-01

    This article presents a theoretical framework in the form of a model on which to base research in scientific literacy and language use. The assumption guiding the framework is that scientific literacy is comprised of the abilities to think metacognitively, to read and write scientific texts, and to apply the elements of a scientific argument. The framework is composed of three theoretical constructs: authenticity, multiple discourses, and Bhabha's Third Space. Some of the implications of the framework are that students need opportunities to (a) use scientific language in everyday situations; (b) negotiate readily among the many discourse genres of science; and (c) collaborate with teachers and peers on the meaning of scientific language. These ideas are illustrated with data excerpts from contemporary research studies. A set of potential research issues for the future is posed at the end of the article.

  19. Execution time supports for adaptive scientific algorithms on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives an appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.
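
    A toy rendering of the gather primitive over a block-distributed array is sketched below using MPI one-sided access. The real PARTI primitives instead derive a reusable send/receive schedule from the global index set at runtime, so this shows only the semantics, and all names are illustrative.

      #include <mpi.h>

      /* Fetch the values at the requested global indices into 'out',
         assuming rank r owns elements [r*n_local, (r+1)*n_local) of the
         global array, exposed through the window 'win'. The caller opens
         and closes the access epoch with MPI_Win_fence around this call. */
      void gather_global(const long *need, int n_need, long n_local,
                         double *out, MPI_Win win)
      {
          for (int i = 0; i < n_need; i++) {
              int owner = (int)(need[i] / n_local);          /* owning rank */
              MPI_Aint disp = (MPI_Aint)(need[i] % n_local); /* local index */
              MPI_Get(&out[i], 1, MPI_DOUBLE, owner, disp, 1, MPI_DOUBLE, win);
          }
      }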

  20. Execution time support for scientific programs on distributed memory machines

    NASA Technical Reports Server (NTRS)

    Berryman, Harry; Saltz, Joel; Scroggs, Jeffrey

    1990-01-01

    Optimizations are considered that are required for efficient execution of code segments that consist of loops over distributed data structures. The PARTI (Parallel Automated Runtime Toolkit at ICASE) execution time primitives are designed to carry out these optimizations and can be used to implement a wide range of scientific algorithms on distributed memory machines. These primitives allow the user to control array mappings in a way that gives an appearance of shared memory. Computations can be based on a global index set. Primitives are used to carry out gather and scatter operations on distributed arrays. Communication patterns are derived at runtime, and the appropriate send and receive messages are automatically generated.

  1. Determination of the Conservation Time of Periodicals for Optimal Shelf Maintenance of a Library.

    ERIC Educational Resources Information Center

    Miyamoto, Sadaaki; Nakayama, Kazuhiko

    1981-01-01

    Presents a method based on a constrained optimization technique that determines the time of removal of scientific periodicals from the shelf of a library. A geometrical interpretation of the theoretical result is given, and a numerical example illustrates how the technique is applicable to real bibliographic data. (FM)

  2. Optimizing Word Learning via Links to Perceptual and Motoric Experience

    ERIC Educational Resources Information Center

    Hald, Lea A.; de Nooijer, Jacqueline; van Gog, Tamara; Bekkering, Harold

    2016-01-01

    The aim of this review is to consider how current vocabulary training methods could be optimized by considering recent scientific insights in how the brain represents conceptual knowledge. We outline the findings from several methods of vocabulary training. In each case, we consider how taking an embodied cognition perspective could impact word…

  3. Video Games for Neuro-Cognitive Optimization.

    PubMed

    Mishra, Jyoti; Anguera, Joaquin A; Gazzaley, Adam

    2016-04-20

    Sophisticated video games that integrate engaging cognitive training with real-time biosensing and neurostimulation have the potential to optimize cognitive performance in health and disease. We argue that technology development must be paired with rigorous scientific validation and discuss academic and industry opportunities in this field. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bales, Benjamin B; Barrett, Richard F

    In almost all modern scientific applications, developers achieve the greatest performance gains by tuning algorithms, communication systems, and memory access patterns, while leaving low-level instruction optimizations to the compiler. Given the increasingly varied and complicated x86 architectures, the value of these optimizations is unclear, and, due to time and complexity constraints, it is difficult for many programmers to experiment with them. In this report we explore the potential gains of these 'last mile' optimization efforts on an AMD Barcelona processor, providing readers with relevant information so that they can decide whether investment in the presented optimizations is worthwhile.
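
    A generic example of the kind of 'last mile' source-level tuning in question is shown below: restrict-qualified pointers plus manual four-way unrolling of an AXPY loop. Whether hints like these beat what the compiler already emits on a given x86 microarchitecture is exactly the question the report investigates; the example is not taken from the report.

      /* restrict asserts x and y do not alias, and the 4-way unroll
         exposes independent operations for the vectorizer/scheduler. */
      void axpy(int n, double a, const double * restrict x,
                double * restrict y)
      {
          int i;
          for (i = 0; i + 4 <= n; i += 4) {
              y[i]     += a * x[i];
              y[i + 1] += a * x[i + 1];
              y[i + 2] += a * x[i + 2];
              y[i + 3] += a * x[i + 3];
          }
          for (; i < n; i++)        /* remainder iterations */
              y[i] += a * x[i];
      }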

  5. Reflections on Peter Slezak and the 'Sociology of Scientific Knowledge'

    NASA Astrophysics Data System (ADS)

    Suchting, W. A.

    The paper examines central parts of the first of two papers in this journal by Peter Slezak criticising the sociology of scientific knowledge, and also considers, independently, some of the main philosophical issues raised by the sociologists of science, in particular David Bloor. The general conclusion is that each account alludes to different and crucial aspects of the nature of knowledge without, severally or jointly, being able to theorise them adequately. The appendix contains epistemological theses central to a more adequate theory of scientific knowledge. ... our Histories of six Thousand Moons make no Mention of any other, than the two great Empires of Lilliput and Blefuscu. Which mighty Powers have ... been engaged in a most obstinate War for six and thirty Moons past. It began upon the following Occasion. It is allowed on all Hands, that the primitive Way of breaking Eggs before we eat them, was upon the larger End: But ... the Emperor [of Lilliput] ... published an Edict, commanding all his Subjects, upon great Penalties, to break the smaller End of their Eggs. The People so resented this Law, that ... there have been six Rebellions raised on that Account ... These civil Commotions were constantly fomented by the Monarchs of Blefuscu ... It is computed, that eleven Thousand have, at several Times, suffered Death, rather than break Eggs at the smaller End. Many hundred large Volumes have been published upon this Controversy ...

  6. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    NASA Astrophysics Data System (ADS)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of critiquing scientists' discovery encouraged students' articulation of scientific uncertainty sources in different ways.

  7. Scientific and Technical Serials Holdings Optimization in an Inefficient Market: A LSU Serials Redesign Project Exercise.

    ERIC Educational Resources Information Center

    Bensman, Stephen J.; Wilder, Stanley J.

    1998-01-01

    Analyzes the structure of the library market for scientific and technical (ST) serials. Describes an exercise aimed at a theoretical reconstruction of the ST-serials holdings of Louisiana State University (LSU) Libraries. Discusses the set definitions, measures, and algorithms necessary in the design of a computer program to appraise ST serials.…

  8. The Scientific Enlightenment System in Russia in the Early Twentieth Century as a Model for Popularizing Science

    ERIC Educational Resources Information Center

    Balashova, Yuliya B.

    2016-01-01

    This research reconstructs the traditions of scientific enlightenment in Russia. The turn of the nineteenth and twentieth centuries was chosen as the most representative period. The modern age saw the establishment of the optimal model for advancing science in the global context and its crucial segment--Russian science. This period was…

  9. A neural network strategy for end-point optimization of batch processes.

    PubMed

    Krothapally, M; Palanki, S

    1999-01-01

    The traditional way of operating batch processes has been to utilize an open-loop "golden recipe". However, there can be substantial batch to batch variation in process conditions and this open-loop strategy can lead to non-optimal operation. In this paper, a new approach is presented for end-point optimization of batch processes by utilizing neural networks. This strategy involves the training of two neural networks; one to predict switching times and the other to predict the input profile in the singular region. This approach alleviates the computational problems associated with the classical Pontryagin's approach and the nonlinear programming approach. The efficacy of this scheme is illustrated via simulation of a fed-batch fermentation.
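
    To make the two-network strategy concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn). The training targets are synthetic stand-ins: in practice both networks would be trained on switching times and singular-arc inputs obtained from offline optimal control solutions, which the abstract does not specify in detail.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Hypothetical offline "optimization" results: initial conditions
        # (substrate, biomass) mapped to a switching time. A real data set
        # would come from solving the optimal control problem many times.
        X0 = rng.uniform([5.0, 0.5], [15.0, 2.0], size=(200, 2))
        t_switch = 0.3 * X0[:, 0] - 0.8 * X0[:, 1] + 4.0

        switch_net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
        switch_net.fit(X0, t_switch)

        # Second network: (state, time) -> feed rate on the singular arc.
        Xt = rng.uniform([0.0, 0.0], [10.0, 8.0], size=(400, 2))
        u_singular = 0.05 * Xt[:, 0] + 0.02 * Xt[:, 1] + 0.1

        input_net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
        input_net.fit(Xt, u_singular)

        # At run time: predict when to switch, then query the input profile.
        x_new = np.array([[10.0, 1.0]])
        print("predicted switching time:", switch_net.predict(x_new)[0])
        print("singular-arc input at t=5:", input_net.predict([[4.0, 5.0]])[0])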

  10. Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel

    2008-01-01

    This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.

  11. On-the-fly data assessment for high-throughput x-ray diffraction measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Fang; Pandolfi, Ronald; Van Campen, Douglas

    Investment in brighter sources and larger and faster detectors has accelerated the speed of data acquisition at national user facilities. The accelerated data acquisition offers many opportunities for the discovery of new materials, but it also presents a daunting challenge. The rate of data acquisition far exceeds the current speed of data quality assessment, resulting in less than optimal data and data coverage, which in extreme cases forces recollection of data. Herein, we show how this challenge can be addressed through the development of an approach that makes routine data assessment automatic and instantaneous. By extracting and visualizing customized attributes in real time, data quality and coverage, as well as other scientifically relevant information contained in large data sets, is highlighted. Deployment of such an approach not only improves the quality of data but also helps optimize the usage of expensive characterization resources by prioritizing measurements of the highest scientific impact. We anticipate our approach will become a starting point for a sophisticated decision-tree that optimizes data quality and maximizes scientific content in real time through automation. Finally, with these efforts to integrate more automation in data collection and analysis, we can truly take advantage of the accelerating speed of data acquisition.
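
    As an illustration of the kind of lightweight, real-time attribute extraction the abstract describes, the following sketch computes a few scalar attributes from a one-dimensional diffraction pattern. The specific attributes and thresholds are our own assumptions, not those of the authors.

        import numpy as np
        from scipy.signal import find_peaks

        def assess(two_theta, intensity):
            """Cheap scalar attributes for on-the-fly quality screening."""
            peaks, _ = find_peaks(intensity, prominence=10.0)
            return {
                "n_peaks": int(len(peaks)),
                "max_intensity": float(intensity.max()),
                "snr_proxy": float(intensity.max() / (np.median(np.abs(intensity)) + 1e-9)),
            }

        # Synthetic pattern: two Gaussian peaks on a noisy background.
        tt = np.linspace(10.0, 80.0, 2000)
        pattern = (100.0 * np.exp(-((tt - 30.0) / 0.3) ** 2)
                   + 60.0 * np.exp(-((tt - 55.0) / 0.4) ** 2)
                   + np.random.default_rng(0).normal(0.0, 1.0, tt.size))
        print(assess(tt, pattern))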

  12. Snowflake: A Lightweight Portable Stencil DSL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Nathan; Driscoll, Michael; Markley, Charles

    Stencil computations are not well optimized by general-purpose production compilers and the increased use of multicore, manycore, and accelerator-based systems makes the optimization problem even more challenging. In this paper we present Snowflake, a Domain Specific Language (DSL) for stencils that uses a 'micro-compiler' approach, i.e., small, focused, domain-specific code generators. The approach is similar to that used in image processing stencils, but Snowflake handles the much more complex stencils that arise in scientific computing, including complex boundary conditions, higher-order operators (larger stencils), higher dimensions, variable coefficients, non-unit-stride iteration spaces, and multiple input or output meshes. Snowflake is embedded in the Python language, allowing it to interoperate with popular scientific tools like SciPy and iPython; it also takes advantage of built-in Python libraries for powerful dependence analysis as part of a just-in-time compiler. We demonstrate the power of the Snowflake language and the micro-compiler approach with a complex scientific benchmark, HPGMG, that exercises the generality of stencil support in Snowflake. By generating OpenMP comparable to, and OpenCL within a factor of 2x of hand-optimized HPGMG, Snowflake demonstrates that a micro-compiler can support diverse processor architectures and is performance-competitive whilst preserving a high-level Python implementation.
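
    The 'micro-compiler' idea, small code generators that emit specialized kernels, can be illustrated with a toy generator. This is not Snowflake's API, just a sketch of the approach in plain Python.

        import numpy as np

        def make_stencil_kernel(weights):
            """Generate and compile a 1-D, 3-point stencil kernel."""
            w_left, w_center, w_right = weights
            src = ("def kernel(u, out):\n"
                   f"    out[1:-1] = {w_left}*u[:-2] + {w_center}*u[1:-1] + {w_right}*u[2:]\n")
            namespace = {}
            exec(src, namespace)  # 'compile' the generated source
            return namespace["kernel"]

        # A discrete 1-D Laplacian, specialized at generation time.
        laplacian = make_stencil_kernel((1.0, -2.0, 1.0))
        u = np.linspace(0.0, 1.0, 11) ** 2
        out = np.zeros_like(u)
        laplacian(u, out)
        print(out[1:-1])  # constant second differences of x**2

    A real stencil compiler would additionally reason about boundary conditions, higher dimensions, and parallelization, which is exactly the complexity the abstract says Snowflake takes on.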

  13. Snowflake: A Lightweight Portable Stencil DSL

    DOE PAGES

    Zhang, Nathan; Driscoll, Michael; Markley, Charles; ...

    2017-05-01

    Stencil computations are not well optimized by general-purpose production compilers and the increased use of multicore, manycore, and accelerator-based systems makes the optimization problem even more challenging. In this paper we present Snowflake, a Domain Specific Language (DSL) for stencils that uses a 'micro-compiler' approach, i.e., small, focused, domain-specific code generators. The approach is similar to that used in image processing stencils, but Snowflake handles the much more complex stencils that arise in scientific computing, including complex boundary conditions, higher-order operators (larger stencils), higher dimensions, variable coefficients, non-unit-stride iteration spaces, and multiple input or output meshes. Snowflake is embedded in the Python language, allowing it to interoperate with popular scientific tools like SciPy and iPython; it also takes advantage of built-in Python libraries for powerful dependence analysis as part of a just-in-time compiler. We demonstrate the power of the Snowflake language and the micro-compiler approach with a complex scientific benchmark, HPGMG, that exercises the generality of stencil support in Snowflake. By generating OpenMP comparable to, and OpenCL within a factor of 2x of hand-optimized HPGMG, Snowflake demonstrates that a micro-compiler can support diverse processor architectures and is performance-competitive whilst preserving a high-level Python implementation.

  14. Optimizing the Scientific Yield from a Randomized Controlled Trial (RCT): Evaluating Two Behavioral Interventions and Assessment Reactivity with a Single Trial

    PubMed Central

    Carey, Michael P.; Senn, Theresa E.; Coury-Doniger, Patricia; Urban, Marguerite A.; Vanable, Peter A.; Carey, Kate B.

    2013-01-01

    Randomized controlled trials (RCTs) remain the gold standard for evaluating intervention efficacy but are often costly. To optimize their scientific yield, RCTs can be designed to investigate multiple research questions. This paper describes an RCT that used a modified Solomon four-group design to simultaneously evaluate two, theoretically-guided, health promotion interventions as well as assessment reactivity. Recruited participants (N = 1010; 56% male; 69% African American) were randomly assigned to one of four conditions formed by crossing two intervention conditions (i.e., general health promotion vs. sexual risk reduction intervention) with two assessment conditions (i.e., general health vs. sexual health survey). After completing their assigned baseline assessment, participants received the assigned intervention, and returned for follow-ups at 3, 6, 9, and 12 months. In this report, we summarize baseline data, which show high levels of sexual risk behavior; alcohol, marijuana, and tobacco use; and fast food consumption. Sexual risk behaviors and substance use were correlated. Participants reported high satisfaction with both interventions but ratings for the sexual risk reduction intervention were higher. Planned follow-up sessions, and subsequent analyses, will assess changes in health behaviors including sexual risk behaviors. This study design demonstrates one way to optimize the scientific yield of an RCT. PMID:23816489
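
    A minimal sketch of the 2 x 2 assignment implied by the modified Solomon four-group design, crossing the two interventions with the two baseline assessments. Condition labels and the seeding are illustrative, and simple (unblocked) randomization is assumed.

        import itertools
        import random
        from collections import Counter

        interventions = ["general_health_promotion", "sexual_risk_reduction"]
        assessments = ["general_health_survey", "sexual_health_survey"]
        conditions = list(itertools.product(interventions, assessments))  # four cells

        random.seed(42)
        participants = [f"P{i:04d}" for i in range(1010)]
        assignment = {p: random.choice(conditions) for p in participants}

        print(Counter(assignment.values()))  # roughly balanced cells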

  15. Body-Cooling Paradigm in Sport: Maximizing Safety and Performance During Competition.

    PubMed

    Adams, William M; Hosokawa, Yuri; Casa, Douglas J

    2016-12-01

    Although body cooling has both performance and safety benefits, knowledge on optimizing cooling during specific sport competition is limited. To identify when, during sport competition, it is optimal for body cooling and to identify optimal body-cooling modalities to enhance safety and maximize sport performance. A comprehensive literature search was conducted to identify articles with specific context regarding body cooling, sport performance, and cooling modalities used during sport competition. A search of scientific peer-reviewed literature examining the effects of body cooling on exercise performance was done to examine the influence of body cooling on exercise performance. Subsequently, a literature search was done to identify effective cooling modalities that have been shown to improve exercise performance. The cooling modalities that are most effective in cooling the body during sport competition depend on the sport, timing of cooling, and feasibility based on the constraints of the sports rules and regulations. Factoring in the length of breaks (halftime substitutions, etc), the equipment worn during competition, and the cooling modalities that offer the greatest potential to cool must be considered in each individual sport. Scientific evidence supports using body cooling as a method of improving performance during sport competition. Developing a strategy to use cooling modalities that are scientifically evidence-based to improve performance while maximizing athlete's safety warrants further investigation.

  16. On-the-fly data assessment for high-throughput x-ray diffraction measurements

    DOE PAGES

    Ren, Fang; Pandolfi, Ronald; Van Campen, Douglas; ...

    2017-05-02

    Investment in brighter sources and larger and faster detectors has accelerated the speed of data acquisition at national user facilities. The accelerated data acquisition offers many opportunities for the discovery of new materials, but it also presents a daunting challenge. The rate of data acquisition far exceeds the current speed of data quality assessment, resulting in less than optimal data and data coverage, which in extreme cases forces recollection of data. Herein, we show how this challenge can be addressed through the development of an approach that makes routine data assessment automatic and instantaneous. By extracting and visualizing customized attributes in real time, data quality and coverage, as well as other scientifically relevant information contained in large data sets, is highlighted. Deployment of such an approach not only improves the quality of data but also helps optimize the usage of expensive characterization resources by prioritizing measurements of the highest scientific impact. We anticipate our approach will become a starting point for a sophisticated decision-tree that optimizes data quality and maximizes scientific content in real time through automation. Finally, with these efforts to integrate more automation in data collection and analysis, we can truly take advantage of the accelerating speed of data acquisition.

  17. Corner flow control in high through-flow axial commercial fan/booster using blade 3-D optimization

    NASA Astrophysics Data System (ADS)

    Zhu, Fang; Jin, Donghai; Gui, Xingmin

    2012-02-01

    This study is aimed at using blade 3-D optimization to control corner flows in the high through-flow fan/booster of a high bypass ratio commercial turbofan engine. Two kinds of blade 3-D optimization, end-bending and bow, are focused on. On account of the respective operation mode and environment, the approach to 3-D aerodynamic modeling of rotor blades is different from stator vanes. Based on the understanding of the mechanism of the corner flow and the consideration of intensity problem for rotors, this paper uses a variety of blade 3-D optimization approaches, such as loading distribution optimization, perturbation of departure angles and stacking-axis manipulation, which are suitable for rotors and stators respectively. The obtained 3-D blades and vanes can improve the corner flow features by end-bending and bow effects. The results of this study show that flows in corners of the fan/booster, such as the fan hub region, the tip and hub of the vanes of the booster, are very complex and dominated by 3-D effects. The secondary flows there are found to have a strong detrimental effect on the compressor performance. The effects of both end-bending and bow can improve the flow separation in corners, but the specific ways they work and application scope are somewhat different. Redesigning the blades via blade 3-D optimization to control the corner flow has effectively reduced the loss generation and improved the stall margin by a large amount.

  18. Electric Field Distortion in Electro-Optical Devices Subjected to Ionizing Radiation.

    DTIC Science & Technology

    1983-12-26

    applications of scientific advances to new military space systems. Versatility and flexibility have been developed to a high degree by the laboratory...personnel in dealing with the many problems encountered in the nation's rapidly developing space systems. Expertise in the latest scientific developments is...design, distributed architectures for spaceborne computers, fault-tolerant computer systems, artificial intelligence, and microelectronics applications

  19. ANNUAL REPORT, JULY 1, 1957

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1958-10-31

    The progress and trends of research are presented along with a description of operational, service, and administrative activities. Some scientific and technical details are given on research programs in the physical sciences, life sciences, and engineering; however, more complete technical information is available in quarterly progress reports, BNL technical reports, and scientific and technical periodicals. A bibliography of these publications is appended. (For preceding period see BNL-426.) (D.E.B.)

  20. High-End Climate Science: Development of Modeling and Related Computing Capabilities

    DTIC Science & Technology

    2000-12-01

    toward strengthening research on key scientific issues. The Program has supported research that has led to substantial increases in knowledge, improved...provides overall direction and executive oversight of the USGCRP. Within this framework, agencies manage and coordinate Federally supported scientific...critical for the U.S. Global Change Research Program. Such models can be used to look backward to test the consistency of our knowledge of Earth system

  1. The scientific production in trauma of an emerging country

    PubMed Central

    2012-01-01

    Background The study aims to examine whether the end of the specialty in trauma surgery in 2003 influenced the scientific productivity of the area in Brazil. Methods We identified and classified the manuscripts and their authors, from databases such as PubMed, Scielo and Plataforma Lattes and sites like Google, in addition to the list of members of SBAIT, the sole society in Brazil to congregate surgeons involved in trauma care in the country. We applied statistical tests to compare the periods of 1997-2003 and 2004-2010. We also analyzed the following variables: impact factor of journals in which manuscripts were published, journals, regional origin of authors, time since graduation, and completion of a post-doctorate abroad. Results We observed a significant increase in publication rates of the analyzed groups over the years. There was a predominance of quantitative studies from the Southeast (especially the state of São Paulo). Longer time since graduation and the completion of postdoctoral studies abroad were associated with higher individual scientific productivity. Conclusion The number of articles published by authors from the area of trauma has been growing over the past 14 years in Brazil. The end of the specialty in trauma surgery in the country did not influence the scientific productivity in the area. PMID:23531364

  2. Four stages of a scientific discipline; four types of scientist.

    PubMed

    Shneider, Alexander M

    2009-05-01

    In this article I propose the classification of the evolutionary stages that a scientific discipline evolves through and the type of scientists that are the most productive at each stage. I believe that each scientific discipline evolves sequentially through four stages. Scientists at stage one introduce new objects and phenomena as subject matter for a new scientific discipline. To do this they have to introduce a new language adequately describing the subject matter. At stage two, scientists develop a toolbox of methods and techniques for the new discipline. Owing to this advancement in methodology, the spectrum of objects and phenomena that fall into the realm of the new science are further understood at this stage. Most of the specific knowledge is generated at the third stage, at which the highest number of original research publications is generated. The majority of third-stage investigation is based on the initial application of new research methods to objects and/or phenomena. The purpose of the fourth stage is to maintain and pass on scientific knowledge generated during the first three stages. Groundbreaking new discoveries are not made at this stage. However, new ways to present scientific information are generated, and crucial revisions are often made of the role of the discipline within the constantly evolving scientific environment. The very nature of each stage determines the optimal psychological type and modus operandi of the scientist operating within it. Thus, it is not only the talent and devotion of scientists that determines whether they are capable of contributing substantially but, rather, whether they have the 'right type' of talent for the chosen scientific discipline at that time. Understanding the four different evolutionary stages of a scientific discipline might be instrumental for many scientists in optimizing their career path, in addition to being useful in assembling scientific teams, precluding conflicts and maximizing productivity. The proposed model of scientific evolution might also be instrumental for society in organizing and managing the scientific process. No public policy aimed at stimulating the scientific process can be equally beneficial for all four stages. Attempts to apply the same criteria to scientists working on scientific disciplines at different stages of their scientific evolution would be stimulating for one and detrimental for another. In addition, researchers operating at a certain stage of scientific evolution might not possess the mindset adequate to evaluate and stimulate a discipline that is at a different evolutionary stage. This could be the reason for suboptimal implementation of otherwise well-conceived scientific policies.

  3. [Medicinal broths in the books by Nicolas Lemery, a reflection of scientific developments?].

    PubMed

    Motte-Florac, Élisabeth

    2016-03-01

    From ancient times, medicinal broths have been an integral part of the diet fed to patients and convalescents. At the end of the 17th century, medical and pharmaceutical knowledge and practices were to enter a period of major upheavals. Chemical drugs, although hitherto discredited, became all the rage; work in chemistry boomed, and broths benefited. Do the first editions of the works of Nicolas Lemery reflect the knowledge of his time? Do the last editions – revised, corrected, annotated, and completed – really reflect transformations in scientific disciplines, technological developments, and scientific advances, particularly in chemistry?

  4. [Reference citation].

    PubMed

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.

  5. Microchannel DNA Sequencing by End-Labelled Free Solution Electrophoresis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barron, A.

    2005-09-29

    The further development of End-Labeled Free-Solution Electrophoresis will greatly simplify DNA separation and sequencing on microfluidic devices. The development and optimization of drag-tags is critical to the success of this research.

  6. 21 CFR 812.1 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., the discovery and development of useful devices intended for human use, and to that end to maintain optimum freedom for scientific investigators in their pursuit of this purpose. This part provides...

  7. Diastolic chamber properties of the left ventricle assessed by global fitting of pressure-volume data: improving the gold standard of diastolic function.

    PubMed

    Bermejo, Javier; Yotti, Raquel; Pérez del Villar, Candelas; del Álamo, Juan C; Rodríguez-Pérez, Daniel; Martínez-Legazpi, Pablo; Benito, Yolanda; Antoranz, J Carlos; Desco, M Mar; González-Mansilla, Ana; Barrio, Alicia; Elízaga, Jaime; Fernández-Avilés, Francisco

    2013-08-15

    In cardiovascular research, relaxation and stiffness are calculated from pressure-volume (PV) curves by separately fitting the data during the isovolumic and end-diastolic phases (end-diastolic PV relationship), respectively. This method is limited because it assumes uncoupled active and passive properties during these phases, it penalizes statistical power, and it cannot account for elastic restoring forces. We aimed to improve this analysis by implementing a method based on global optimization of all PV diastolic data. In 1,000 Monte Carlo experiments, the optimization algorithm recovered the entered parameters of diastolic properties below and above the equilibrium volume (intraclass correlation coefficients = 0.99). Inotropic modulation experiments in 26 pigs modified passive pressure generated by restoring forces due to changes in the operative and/or equilibrium volumes. Volume overload and coronary microembolization caused incomplete relaxation at end diastole (active pressure > 0.5 mmHg), rendering the end-diastolic PV relationship method ill-posed. In 28 patients undergoing PV cardiac catheterization, the new algorithm reduced the confidence intervals of stiffness parameters by one-fifth. The Jacobian matrix allowed visualizing the contribution of each property to instantaneous diastolic pressure on a per-patient basis. The algorithm allowed estimating stiffness from single-beat PV data (derivative of left ventricular pressure with respect to volume at end-diastolic volume: intraclass correlation coefficient = 0.65, error = 0.07 ± 0.24 mmHg/ml). Thus, in clinical and preclinical research, global optimization algorithms provide the most complete, accurate, and reproducible assessment of global left ventricular diastolic chamber properties from PV data. Using global optimization, we were able to fully uncouple relaxation and passive PV curves for the first time in the intact heart.
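
    The global-fitting idea, fitting active relaxation and passive stiffness to all diastolic pressure-volume samples simultaneously rather than phase by phase, can be sketched as follows. The particular model forms (mono-exponential relaxation, exponential passive curve) and the synthetic data are assumptions for illustration, not the authors' exact formulation.

        import numpy as np
        from scipy.optimize import least_squares

        def model(theta, t, v):
            p0, tau, s, k, v0 = theta
            active = p0 * np.exp(-t / tau)                 # decaying active pressure
            passive = s * (np.exp(k * (v - v0)) - 1.0)     # negative below v0: restoring forces
            return active + passive

        # Synthetic diastolic samples: time since relaxation onset and volume.
        t = np.linspace(0.0, 0.4, 40)
        v = np.linspace(60.0, 120.0, 40)
        theta_true = (30.0, 0.05, 2.0, 0.03, 70.0)
        p = model(theta_true, t, v) + np.random.default_rng(1).normal(0.0, 0.2, t.size)

        # One global fit over all samples instead of two phase-wise fits.
        fit = least_squares(lambda th: model(th, t, v) - p,
                            x0=(20.0, 0.08, 1.0, 0.02, 65.0))
        print("fitted parameters:", np.round(fit.x, 3))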

  8. The Pharmacology of Regenerative Medicine

    PubMed Central

    Saul, Justin M.; Furth, Mark E.; Andersson, Karl-Erik

    2013-01-01

    Regenerative medicine is a rapidly evolving multidisciplinary, translational research enterprise whose explicit purpose is to advance technologies for the repair and replacement of damaged cells, tissues, and organs. Scientific progress in the field has been steady and expectations for its robust clinical application continue to rise. The major thesis of this review is that the pharmacological sciences will contribute critically to the accelerated translational progress and clinical utility of regenerative medicine technologies. In 2007, we coined the phrase “regenerative pharmacology” to describe the enormous possibilities that could occur at the interface between pharmacology, regenerative medicine, and tissue engineering. The operational definition of regenerative pharmacology is “the application of pharmacological sciences to accelerate, optimize, and characterize (either in vitro or in vivo) the development, maturation, and function of bioengineered and regenerating tissues.” As such, regenerative pharmacology seeks to cure disease through restoration of tissue/organ function. This strategy is distinct from standard pharmacotherapy, which is often limited to the amelioration of symptoms. Our goal here is to get pharmacologists more involved in this field of research by exposing them to the tools, opportunities, challenges, and interdisciplinary expertise that will be required to ensure awareness and galvanize involvement. To this end, we illustrate ways in which the pharmacological sciences can drive future innovations in regenerative medicine and tissue engineering and thus help to revolutionize the discovery of curative therapeutics. Hopefully, the broad foundational knowledge provided herein will spark sustained conversations among experts in diverse fields of scientific research to the benefit of all. PMID:23818131

  9. Mechanisms and Effects of Transcranial Direct Current Stimulation

    PubMed Central

    Giordano, James; Bikson, Marom; Kappenman, Emily S.; Clark, Vincent P.; Coslett, H. Branch; Hamblin, Michael R.; Hamilton, Roy; Jankord, Ryan; Kozumbo, Walter J.; McKinley, R. Andrew; Nitsche, Michael A.; Reilly, J. Patrick; Richardson, Jessica; Wurzman, Rachel

    2017-01-01

    The US Air Force Office of Scientific Research convened a meeting of researchers in the fields of neuroscience, psychology, engineering, and medicine to discuss the most pressing issues facing ongoing research in the field of transcranial direct current stimulation (tDCS) and related techniques. In this study, we present opinions prepared by participants of the meeting, focusing on the most promising areas of research, immediate and future goals for the field, and the potential for hormesis theory to inform tDCS research. Scientific, medical, and ethical considerations support the ongoing testing of tDCS in healthy and clinical populations, provided best protocols are used to maximize safety. Notwithstanding the need for ongoing research, promising applications include enhancing vigilance/attention in healthy volunteers, which can accelerate training and support learning. Commonly, tDCS is used as an adjunct to training/rehabilitation tasks with the goal of leftward shift in the learning/treatment effect curves. Although trials are encouraging, elucidating the basic mechanisms of tDCS will accelerate validation and adoption. To this end, biomarkers (e.g., clinical neuroimaging and findings from animal models) can support hypotheses linking neurobiological mechanisms and behavioral effects. Dosage can be optimized using computational models of current flow and understanding dose–response. Both biomarkers and dosimetry should guide individualized interventions with the goal of reducing variability. Insights from other applied energy domains, including ionizing radiation, transcranial magnetic stimulation, and low-level laser (light) therapy, can be prudently leveraged. PMID:28210202

  10. Tracking the Footprints Puzzle: The Problematic Persistence of Science-as-Process in Teaching the Nature and Culture of Science

    ERIC Educational Resources Information Center

    Ault, Charles R., Jr.; Dodick, Jeff

    2010-01-01

    For many decades, science educators have asked, "In what ways should learning the content of traditional subjects serve as the means to more general ends, such as understanding the nature of science or the processes of scientific inquiry?" Acceptance of these ends reduces the role of disciplinary context; the "Footprints Puzzle" and Oregon's…

  11. Modeling the Water Balloon Slingshot

    NASA Astrophysics Data System (ADS)

    Bousquet, Benjamin D.; Figura, Charles C.

    2013-01-01

    In the introductory physics courses at Wartburg College, we have been working to create a lab experience focused on the scientific process itself rather than verification of physical laws presented in the classroom or textbook. To this end, we have developed a number of open-ended modeling exercises suitable for a variety of learning environments, from non-science major classes to algebra-based and calculus-based introductory physics classes.

  12. User-Friendly End Station at the ALS for Nanostructure Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    F. J. Himpsel; P. Alivisatos; T. Callcott

    2006-07-05

    This is a construction project for an end station at the ALS, which is optimized for measuring NEXAFS of nanostructures with fluorescence detection. Compared to the usual electron yield detection, fluorescence is able to probe buried structures and is sensitive to dilute species, such as nanostructures supported on a substrate. Since the quantum yield for fluorescence is 10^-4-10^-5 times smaller than for electrons in the soft x-ray regime, such an end station requires bright undulator beamlines at the ALS. In order to optimize the setup for a wide range of applications, two end stations were built: (1) A simple, mobile chamber with efficient photon detection (>10^4 times the solid angle collection of fluorescence spectrographs) and a built-in magnet for MCD measurements at EPU beamlines (Fig. 1 left). It allows rapid mapping of the electronic states of nanostructures (nanocrystals, nanowires, tailored magnetic materials, buried interfaces, biologically-functionalized surfaces). It was used with BL 8.0 (linear polarized undulator) and BL 4.0 (variable polarization). (2) A sophisticated, stationary end station operating at Beamline 8.0 (Fig. 1 right). It contains an array of surface characterization instruments and a micro-focus capability for scanning across graded samples (wedges for thickness variation, stoichiometry gradients, and general variations of the sample preparation conditions for optimizing nanostructures).

  13. Dispositional Optimism

    PubMed Central

    Carver, Charles S.; Scheier, Michael F.

    2014-01-01

    Optimism is a cognitive construct (expectancies regarding future outcomes) that also relates to motivation: optimistic people exert effort, whereas pessimistic people disengage from effort. Study of optimism began largely in health contexts, finding positive associations between optimism and markers of better psychological and physical health. Physical health effects likely occur through differences in both health-promoting behaviors and physiological concomitants of coping. Recently, the scientific study of optimism has extended to the realm of social relations: new evidence indicates that optimists have better social connections, partly because they work harder at them. In this review, we examine the myriad ways this trait can benefit an individual, and our current understanding of the biological basis of optimism. PMID:24630971

  14. Kinematically redundant robot manipulators

    NASA Technical Reports Server (NTRS)

    Baillieul, J.; Hollerbach, J.; Brockett, R.; Martin, D.; Percy, R.; Thomas, R.

    1987-01-01

    Research on control, design and programming of kinematically redundant robot manipulators (KRRM) is discussed. These are devices in which there are more joint space degrees of freedom than are required to achieve every position and orientation of the end-effector necessary for a given task in a given workspace. The technological developments described here deal with: kinematic programming techniques for automatically generating joint-space trajectories to execute prescribed tasks; control of redundant manipulators to optimize dynamic criteria (e.g., applications of forces and moments at the end-effector that optimally distribute the loading of actuators); and design of KRRMs to optimize functionality in congested work environments or to achieve other goals unattainable with non-redundant manipulators. Kinematic programming techniques are discussed, which show that some pseudo-inverse techniques that have been proposed for redundant manipulator control fail to achieve the goals of avoiding kinematic singularities and also generating closed joint-space paths corresponding to close paths of the end effector in the workspace. The extended Jacobian is proposed as an alternative to pseudo-inverse techniques.
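
    For contrast with the extended-Jacobian proposal, here is a minimal damped pseudo-inverse (resolved-rate) step for a planar three-link arm, the kind of scheme the abstract critiques. Link lengths, gains, and the damping constant are illustrative.

        import numpy as np

        L = np.array([1.0, 0.8, 0.6])  # link lengths (illustrative)

        def fk(q):
            """End-effector position of a planar 3-link arm."""
            a = np.cumsum(q)
            return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

        def jacobian(q):
            a = np.cumsum(q)
            J = np.zeros((2, 3))
            for i in range(3):
                J[0, i] = -np.sum(L[i:] * np.sin(a[i:]))
                J[1, i] = np.sum(L[i:] * np.cos(a[i:]))
            return J

        q = np.array([0.3, 0.4, 0.2])
        target = np.array([1.5, 1.0])
        for _ in range(100):  # resolved-rate iteration
            e = target - fk(q)
            J = jacobian(q)
            # Damped least-squares pseudo-inverse for stability near singularities.
            dq = J.T @ np.linalg.solve(J @ J.T + 1e-3 * np.eye(2), e)
            q += 0.5 * dq
        print("final position error:", np.linalg.norm(target - fk(q)))

    As the abstract notes, such pseudo-inverse schemes need not return the joints to their starting values when the end-effector traces a closed workspace path, which is one motivation for the extended Jacobian.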

  15. Comparison of particle swarm optimization and differential evolution for aggregators' profit maximization in the demand response system

    NASA Astrophysics Data System (ADS)

    Wisittipanit, Nuttachat; Wisittipanich, Warisa

    2018-07-01

    Demand response (DR) refers to changes in the electricity use patterns of end-users in response to incentive payments designed to prompt lower electricity use during peak periods. Typically, there are three players in the DR system: an electric utility operator, a set of aggregators and a set of end-users. The DR model used in this study aims to minimize the operator's operational cost and offer rewards to aggregators, while profit-maximizing aggregators compete to sell DR services to the operator and provide compensation to end-users for altering their consumption profiles. This article presents the first application of two metaheuristics in the DR system: particle swarm optimization (PSO) and differential evolution (DE). The objective is to optimize the incentive payments during various periods to satisfy all stakeholders. The results show that DE significantly outperforms PSO, since it can attain better compensation rates, lower operational costs and higher aggregator profits.
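
    The DE mechanics referenced above (DE/rand/1 mutation, binomial crossover, greedy selection) can be sketched with a toy objective standing in for the aggregator-profit model; all numbers below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        n_periods, pop_size, F, CR = 6, 20, 0.8, 0.9

        def cost(rates):
            # Invented stand-in: payments minus the demand reduction they induce.
            return (rates ** 2).sum() - 2.0 * np.tanh(rates).sum()

        pop = rng.uniform(0.0, 2.0, size=(pop_size, n_periods))
        fit = np.array([cost(x) for x in pop])

        for _ in range(200):
            for i in range(pop_size):
                idx = [j for j in range(pop_size) if j != i]
                a, b, c = pop[rng.choice(idx, 3, replace=False)]
                mutant = a + F * (b - c)                      # DE/rand/1 mutation
                cross = rng.random(n_periods) < CR            # binomial crossover
                cross[rng.integers(n_periods)] = True         # keep >= 1 mutant gene
                trial = np.where(cross, mutant, pop[i]).clip(0.0, 2.0)
                c_trial = cost(trial)
                if c_trial < fit[i]:                          # greedy selection
                    pop[i], fit[i] = trial, c_trial

        print("best incentive rates:", np.round(pop[fit.argmin()], 3))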

  16. Spontaneous Effort During Mechanical Ventilation: Maximal Injury With Less Positive End-Expiratory Pressure.

    PubMed

    Yoshida, Takeshi; Roldan, Rollin; Beraldo, Marcelo A; Torsani, Vinicius; Gomes, Susimeire; De Santis, Roberta R; Costa, Eduardo L V; Tucci, Mauro R; Lima, Raul G; Kavanagh, Brian P; Amato, Marcelo B P

    2016-08-01

    We recently described how spontaneous effort during mechanical ventilation can cause "pendelluft," that is, displacement of gas from nondependent (more recruited) lung to dependent (less recruited) lung during early inspiration. Such transfer depends on the coexistence of more recruited (source) liquid-like lung regions together with less recruited (target) solid-like lung regions. Pendelluft may improve gas exchange, but because of tidal recruitment, it may also contribute to injury. We hypothesize that higher positive end-expiratory pressure levels decrease the propensity to pendelluft and that with lower positive end-expiratory pressure levels, pendelluft is associated with improved gas exchange but increased tidal recruitment. Crossover design. University animal research laboratory. Anesthetized landrace pigs. Surfactant depletion was achieved by saline lavage in anesthetized pigs, and ventilator-induced lung injury was produced by ventilation with high tidal volume and low positive end-expiratory pressure. Ventilation was continued in each of four conditions: positive end-expiratory pressure (low or optimized positive end-expiratory pressure after recruitment) and spontaneous breathing (present or absent). Tidal recruitment was assessed using dynamic CT and regional ventilation/perfusion using electric impedance tomography. Esophageal pressure was measured using an esophageal balloon manometer. Among the four conditions, spontaneous breathing at low positive end-expiratory pressure not only caused the largest degree of pendelluft, which was associated with improved ventilation/perfusion matching and oxygenation, but also generated the greatest tidal recruitment. At low positive end-expiratory pressure, paralysis worsened oxygenation but reduced tidal recruitment. Optimized positive end-expiratory pressure decreased the magnitude of spontaneous efforts (measured by esophageal pressure) despite using less sedation, from -5.6 ± 1.3 to -2.0 ± 0.7 cm H2O, while concomitantly reducing pendelluft and tidal recruitment. No pendelluft was observed in the absence of spontaneous effort. Spontaneous effort at low positive end-expiratory pressure improved oxygenation but promoted tidal recruitment associated with pendelluft. Optimized positive end-expiratory pressure (set after lung recruitment) may reverse the harmful effects of spontaneous breathing by reducing inspiratory effort, pendelluft, and tidal recruitment.

  17. Aurora painting pays tribute to Civil War's end

    USGS Publications Warehouse

    Love, Jeffrey J.

    2015-01-01

    In 1865, the same year the war ended, the American landscape artist Frederic Edwin Church unveiled Aurora Borealis (pictured above), a dramatic and mysterious painting that can be interpreted in terms of 19th century romanticism, scientific philosophy, and Arctic missions of exploration. Aurora Borealis can also be viewed as a restrained tribute to the end of the Civil War—a moving example of how science and current events served as the muses of late romantic artists [e.g., Carr, 1994, p. 277; Avery, 2011; Harvey, 2012].

  18. Analysis of Scientific Production in Food Science from 2003 to 2013.

    PubMed

    Guerrero-Bote, Vicente P; Moya-Anegón, Félix

    2015-12-01

    Food Science is an active discipline in scientific research. The improvements in Food Technology constitute a challenge for society to eradicate hunger, while achieving food safety. This work analyses the scientific production in Food Science of the 25 countries with the greatest output in this subject area in the period 2003 to 2013. The growth of China's production was striking, with the country becoming top-ranked by the end of the period. Some developing countries (such as Nigeria) achieved a major increase in production but reduced their proportion of scientific collaboration and their works' impact. There appear to be two international collaboration networks that achieve good results: one European and the other Pacific. © 2015 Institute of Food Technologists®

  19. Scientific Data Management Center for Enabling Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vouk, Mladen A.

    Managing scientific data has been identified by the scientific community as one of the most important emerging needs because of the sheer volume and increasing complexity of data being collected. Effectively generating, managing, and analyzing this information requires a comprehensive, end-to-end approach to data management that encompasses all of the stages from the initial data acquisition to the final analysis of the data. Fortunately, the data management problems encountered by most scientific domains are common enough to be addressed through shared technology solutions. Based on community input, we have identified three significant requirements. First, more efficient access to storage systems is needed. In particular, parallel file system and I/O system improvements are needed to write and read large volumes of data without slowing a simulation, analysis, or visualization engine. These processes are complicated by the fact that scientific data are structured differently for specific application domains, and are stored in specialized file formats. Second, scientists require technologies to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis and searches over extremely large data sets. Specialized feature discovery and statistical analysis techniques are needed before the data can be understood or visualized. Furthermore, interactive analysis requires techniques for efficiently selecting subsets of the data. Finally, generating the data, collecting and storing the results, keeping track of data provenance, data post-processing, and analysis of results is a tedious, fragmented process. Tools for automation of this process in a robust, tractable, and recoverable fashion are required to enhance scientific exploration. The SDM center was established under the SciDAC program to address these issues. The SciDAC-1 Scientific Data Management (SDM) Center succeeded in bringing an initial set of advanced data management technologies to DOE application scientists in astrophysics, climate, fusion, and biology. Equally important, it established collaborations with these scientists to better understand their science as well as their forthcoming data management and data analytics challenges. Building on our early successes, we have greatly enhanced, robustified, and deployed our technology to these communities. In some cases, we identified new needs that have been addressed in order to simplify the use of our technology by scientists. This report summarizes our work so far in SciDAC-2. Our approach is to employ an evolutionary development and deployment process: from research through prototypes to deployment and infrastructure. Accordingly, we have organized our activities in three layers that abstract the end-to-end data flow described above. We labeled the layers (from bottom to top): a) Storage Efficient Access (SEA), b) Data Mining and Analysis (DMA), c) Scientific Process Automation (SPA). The SEA layer is immediately on top of hardware, operating systems, file systems, and mass storage systems, and provides parallel data access technology, and transparent access to archival storage. The DMA layer, which builds on the functionality of the SEA layer, consists of indexing, feature identification, and parallel statistical analysis technology. The SPA layer, which is on top of the DMA layer, provides the ability to compose scientific workflows from the components in the DMA layer as well as application specific modules.
NCSU work performed under this contract was primarily at the SPA layer.
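
    Purely as an illustration of the three-layer decomposition, the following sketch composes stand-ins for the SEA, DMA, and SPA layers. The function names are invented and no SDM Center API is implied.

        import io
        import numpy as np

        def storage_efficient_read(source):
            # Stand-in for the SEA layer (parallel I/O, archival access).
            return np.loadtxt(source)

        def analyze(data):
            # Stand-in for the DMA layer (indexing, statistics, feature search).
            return {"mean": float(data.mean()), "max": float(data.max()), "n": int(data.size)}

        def workflow(sources):
            # Stand-in for the SPA layer: automate the end-to-end sequence.
            return [analyze(storage_efficient_read(s)) for s in sources]

        # A file-like object standing in for simulation output on disk.
        print(workflow([io.StringIO("1.0\n2.0\n3.5\n")]))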

  20. Consistency of nature of science views across scientific and socio-scientific contexts

    NASA Astrophysics Data System (ADS)

    Khishfe, Rola

    2017-03-01

    The purpose of the investigation was to examine the consistency of NOS views among high school students across different scientific and socio-scientific contexts. A total of 261 high school students from eight different schools in Lebanon participated in the investigation. The schools were selected based on different geographical areas in Lebanon and the principals' consent to participate in the study. The investigation used a qualitative design to compare the responses of students across different contexts/topics. All the participants completed a five-item open-ended questionnaire, which includes five topics addressing scientific and socio-scientific contexts. The items of the questionnaire addressed the empirical, tentative, and subjective aspects of NOS. Quantitative and qualitative analyses were conducted to answer the research questions. Results showed that participants' views of the emphasised NOS aspects were mostly inconsistent. In addition, there was variance in participants' views of NOS between scientific and socio-scientific issues. Discussion of the results related to differential developmental progression, contextual factors, social constructivist perspective, different domains of knowledge, and students' individual differences.

  1. America Calls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Born 75 years ago in the Manhattan Project that helped to end World War II, the national lab established at Oak Ridge, Tennessee, still serves national missions in energy, scientific discovery and national security today.

  2. Optimization Research of Generation Investment Based on Linear Programming Model

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Ge, Xueqian

    Linear programming is an important branch of operational research and a mathematical method for supporting scientific management. GAMS is an advanced simulation and optimization modeling language that combines complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, optimized generation investment decision-making is simulated and analyzed based on a linear programming model. Finally, the optimal installed capacity of the power plants and the final total cost are obtained, providing a rational basis for optimized investment decisions.
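
    A toy generation-investment LP in the spirit of the abstract, with scipy standing in for GAMS. The plant types, costs, derating factor, and demand figure are all invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Cost per MW of installed capacity for three plant types
        # (coal, gas, wind); all figures are invented.
        c = np.array([1.2, 0.9, 1.5])

        # Firm capacity must cover a 1000 MW peak; wind derated to 30%.
        A_ub = -np.array([[1.0, 1.0, 0.3]])   # -capacity <= -demand
        b_ub = -np.array([1000.0])

        # Illustrative policy bound: at least 200 MW of wind.
        bounds = [(0, None), (0, None), (200, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print("optimal capacities (MW):", np.round(res.x, 1))
        print("total cost:", round(res.fun, 1))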

  3. Strategies to Improve Efficiency and Specificity of Degenerate Primers in PCR.

    PubMed

    Campos, Maria Jorge; Quesada, Alberto

    2017-01-01

    PCR with degenerate primers can be used to identify the coding sequence of an unknown protein or to detect a genetic variant within a gene family. These primers, which are complex mixtures of slightly different oligonucleotide sequences, can be optimized to increase the efficiency and/or specificity of PCR in the amplification of a sequence of interest by the introduction of mismatches with the target sequence and balancing their position toward the primers' 5'- or 3'-ends. In this work, we explain in detail examples of rational design of primers in two different applications, including the use of specific determinants at the 3'-end, to: (1) improve PCR efficiency with coding sequences for members of a protein family by full degeneration at a core box of conserved genetic information, with reduced degeneration at the 5'-end, and (2) optimize the specificity of allelic discrimination of closely related orthologues by 5'-end degenerate primers.
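
    The degeneracy arithmetic behind such primer design can be made concrete with a small helper that expands IUPAC ambiguity codes; the example primer is hypothetical.

        from itertools import product

        IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T",
                 "R": "AG", "Y": "CT", "S": "GC", "W": "AT",
                 "K": "GT", "M": "AC", "B": "CGT", "D": "AGT",
                 "H": "ACT", "V": "ACG", "N": "ACGT"}

        def degeneracy(primer):
            """Number of distinct oligos a degenerate primer represents."""
            n = 1
            for base in primer:
                n *= len(IUPAC[base])
            return n

        def expand(primer):
            """Enumerate every concrete sequence in the mixture."""
            return ["".join(p) for p in product(*(IUPAC[b] for b in primer))]

        primer = "GAYGTNGCNAC"  # hypothetical: degenerate core, defined 3'-end
        print(degeneracy(primer))   # 2 * 4 * 4 = 32 variants
        print(expand(primer)[:4])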

  4. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
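
    Phase (1), the backward Dynamic Programming pass over discretized states and controls, followed by the runtime lookup of phase (2), can be sketched as follows. The transition and cost functions are toy stand-ins for a real process model.

        import numpy as np

        states = np.linspace(0.0, 1.0, 21)     # discretized product property
        controls = np.linspace(-0.2, 0.2, 9)   # admissible control corrections
        n_stages, target = 4, 0.8

        def step(x, u):
            """Toy single-process transformation, clipped to the state grid."""
            return float(np.clip(x + u + 0.05 * x * (1.0 - x), 0.0, 1.0))

        # Phase (1): backward pass tabulating cost-to-go and optimal controls.
        J = (states - target) ** 2             # terminal quality requirement
        policy = []
        for _ in range(n_stages):
            J_new = np.empty_like(J)
            best_u = np.empty_like(J)
            for i, x in enumerate(states):
                costs = [u ** 2 + np.interp(step(x, u), states, J) for u in controls]
                k = int(np.argmin(costs))
                J_new[i], best_u[i] = costs[k], controls[k]
            J = J_new
            policy.insert(0, best_u)

        # Phase (2): at run time, look up the control for the observed state.
        x = 0.35
        for stage in range(n_stages):
            x = step(x, float(np.interp(x, states, policy[stage])))
        print("final state:", round(x, 3))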

  5. Designing Courses that Encourage Post-College Scientific Literacy in General Education Students

    NASA Astrophysics Data System (ADS)

    Horodyskyj, L.

    2010-12-01

    In a time when domestic and foreign policy is becoming increasingly dependent on a robust understanding of scientific concepts (especially in regards to climate science), it is of vital importance that non-specialist students taking geoscience courses gain an understanding not only of Earth system processes, but also of how to discern scientific information from "spin". An experimental introductory level environmental geology course was developed at Glendale Community College in Glendale, Arizona, in the fall of 2010 that sought to integrate collaborative learning, online resources, and science in the media. The goal of this course was for students to end the semester with not just an understanding of basic Earth systems concepts, but also with a set of tools for evaluating information presented by the media. This was accomplished by integrating several online sites that interface scientific data with popular web tools (i.e., Google Maps) and collaborative exercises that required students to generate ideas based on their observations followed by evaluation and refinement of these ideas through interactions with peers and the instructor. The capstone activity included a series of homework assignments that required students to make note of science-related news stories in the media early in the semester, and then gradually begin critically evaluating these news sources, which will become their primary source of post-college geoscience information. This combination of activities will benefit students long after the semester has ended by giving them access to primary sources of scientific information, encouraging them to discuss and evaluate their ideas with their peers, and, most importantly, to critically evaluate the information they receive from the media and their peers so that they can become more scientifically literate citizens.

  6. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote-space-based sensor, an end-to-end approach to the design of information systems has been adopted at the JPL. This paper reviews End-to-End Information System (EEIS) activity at the JPL, with attention given to the scope of the EEIS transfer function, and functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  7. Use of a Scaffolded Case Study Assignment to Enhance Students' Scientific Literacy Skills in Undergraduate Nutritional Science Education: Comparison between Traditional Lecture and Distance Education Course Formats

    ERIC Educational Resources Information Center

    Monk, Jennifer M.; Newton, Genevieve

    2018-01-01

    We investigated whether the implementation of a scaffolded case study assignment could increase student perceptions of their scientific literacy (SL) skills in a third year Nutritional Science course. The change in students' SL perceptions were assessed by the completion of two surveys (administered at the start and end of the semester) consisting…

  8. Transforming Training: A Perspective on the Need and Payoffs from Common Standards

    DTIC Science & Technology

    2006-12-01

    position the training research community on the eve of a scientific breakthrough. In the near future, the scientific community will likely benefit from...is likely to benefit from this ability to routinely cross-compare training technologies and techniques from laboratory training study results...concluding with a logical end such as "Bingo" (nearly out of fuel), all threats killed, or multiple friendly losses. While these training sessions

  9. IMP series report/bibliography

    NASA Technical Reports Server (NTRS)

    King, J. H.

    1971-01-01

    The main characteristics of the IMP spacecraft and experiments are considered and the scientific knowledge gained is presented in the form of abstracts of scientific papers using IMP data. Spacecraft characteristics, including temporal and spatial coverages, are presented followed by an annotated bibliography. Experiments conducted on all IMP's (including prelaunch IMP's H and J) are described. Figures are presented showing the time histories, through the end of 1970, of magnetic field, plasma, and energetic particle experiments.

  10. MO-FG-BRA-08: Swarm Intelligence-Based Personalized Respiratory Gating in Lung SAbR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Modiri, A; Sabouri, P; Sawant, A

    Purpose: Respiratory gating is widely deployed as a clinical motion-management strategy in lung radiotherapy. In conventional gating, the beam is turned on during a pre-determined phase window; typically, around end-exhalation. In this work, we challenge the notion that end-exhalation is always the optimal gating phase. Specifically, we use a swarm-intelligence-based, inverse planning approach to determine the optimal respiratory phase and MU for each beam with respect to (i) the state of the anatomy at each phase and (ii) the time spent in that state, estimated from long-term monitoring of the patient’s breathing motion. Methods: In a retrospective study of five lung cancer patients, we compared the dosimetric performance of our proposed personalized gating (PG) with that of conventional end-of-exhale gating (CEG) and a previously-developed, fully 4D-optimized plan (combined with MLC tracking delivery). For each patient, respiratory phase probabilities (indicative of the time duration of the phase) were estimated over 2 minutes from lung tumor motion traces recorded previously using the Synchrony system (Accuray Inc.). Based on this information, inverse planning optimization was performed to calculate the optimal respiratory gating phase and MU for each beam. To ensure practical deliverability, each PG beam was constrained to deliver the assigned MU over a time duration comparable to that of CEG delivery. Results: Maximum OAR sparing for the five patients achieved by the PG and the 4D plans compared to CEG plans was: Esophagus Dmax [PG:57%, 4D:37%], Heart Dmax [PG:71%, 4D:87%], Spinal cord Dmax [PG:18%, 4D:68%] and Lung V13 [PG:16%, 4D:31%]. While patients spent the most time in exhalation, the PG-optimization chose end-exhale only for 28% of beams. Conclusion: Our novel gating strategy achieved significant dosimetric improvements over conventional gating, and approached the upper limit represented by fully 4D optimized planning while being significantly simpler and more clinically translatable. This work was partially supported through research funding from National Institutes of Health (R01CA169102) and Varian Medical Systems, Palo Alto, CA, USA.
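
    Stripped of the swarm machinery, the core trade-off, anatomy-dependent dose versus time spent in each phase, can be illustrated with a per-beam search over candidate phases. All dose values and phase probabilities below are invented, and the scoring is a drastic simplification of the inverse planning described above.

        import numpy as np

        rng = np.random.default_rng(3)
        n_beams, n_phases = 7, 10

        phase_prob = rng.dirichlet(np.ones(n_phases))          # time in each phase
        oar_dose = rng.uniform(0.5, 2.0, (n_beams, n_phases))  # proxy OAR dose per MU

        # Penalize rarely-occupied phases (long beam-on time to deliver MU).
        delivery_penalty = 0.2 / phase_prob

        score = oar_dose + delivery_penalty     # one score per (beam, phase)
        best_phase = score.argmin(axis=1)       # per-beam gating phase
        print("gating phase chosen per beam:", best_phase)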

  11. Design Optimization of Systems Governed by Partial Differential Equations. Phase 1

    DTIC Science & Technology

    1989-03-01

    DIFFERENTIAL EQUATIONS" SUBMITTED TO: AIR FORCE OFFICE OF SCIENTIFIC RESEARCH AFOSR/NM ATTN: Major James Crowley BUILDING 410, ROOM 209 BOLLING AFB, DC 20332...of his algorithms called DELIGHT. We consider this work to be of signal importance for the future of all engineering design optimization. Prof...to be set up in a subroutine, which would be called by the optimization code. We then intended to pursue a slow and orderly progression of the problem

  12. Optimization of sparse matrix-vector multiplication on emerging multicore platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Samuel; Oliker, Leonid; Vuduc, Richard

    2007-01-01

    We are witnessing a dramatic change in computer architecture due to the multicore paradigm shift, as every electronic device from cell phones to supercomputers confronts parallelism of unprecedented scale. To fully unleash the potential of these systems, the HPC community must develop multicore specific optimization methodologies for important scientific computations. In this work, we examine sparse matrix-vector multiply (SpMV) - one of the most heavily used kernels in scientific computing - across a broad spectrum of multicore designs. Our experimental platform includes the homogeneous AMD dual-core and Intel quad-core designs, the heterogeneous STI Cell, as well as the first scientific study of the highly multithreaded Sun Niagara2. We present several optimization strategies especially effective for the multicore environment, and demonstrate significant performance improvements compared to existing state-of-the-art serial and parallel SpMV implementations. Additionally, we present key insights into the architectural tradeoffs of leading multicore design strategies, in the context of demanding memory-bound numerical algorithms.
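
    For readers unfamiliar with the kernel, here is a minimal compressed-sparse-row (CSR) SpMV in plain Python. This is the reference computation whose memory-bound inner loop the paper's platform-specific optimizations accelerate, not the authors' tuned code.

    ```python
    # Minimal CSR sparse matrix-vector multiply (y = A @ x).
    import numpy as np

    def spmv_csr(values, col_idx, row_ptr, x):
        n_rows = len(row_ptr) - 1
        y = np.zeros(n_rows)
        for i in range(n_rows):
            # Nonzeros of row i live in values[row_ptr[i]:row_ptr[i+1]].
            for k in range(row_ptr[i], row_ptr[i + 1]):
                y[i] += values[k] * x[col_idx[k]]
        return y

    # 3x3 example matrix: [[4,0,1],[0,3,0],[2,0,5]]
    values  = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
    col_idx = np.array([0, 2, 1, 0, 2])
    row_ptr = np.array([0, 2, 3, 5])
    print(spmv_csr(values, col_idx, row_ptr, np.array([1.0, 2.0, 3.0])))
    # -> [ 7.  6. 17.]
    ```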

  13. Spatial Coverage Planning and Optimization for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Gaines, Daniel M.; Estlin, Tara; Chouinard, Caroline

    2008-01-01

    We are developing onboard planning and scheduling technology to enable in situ robotic explorers, such as rovers and aerobots, to more effectively assist scientists in planetary exploration. In our current work, we are focusing on situations in which the robot is exploring large geographical features such as craters, channels or regional boundaries. In order to develop valid and high quality plans, the robot must take into account a range of scientific and engineering constraints and preferences. We have developed a system that incorporates multiobjective optimization and planning, allowing the robot to generate high quality mission operations plans that respect resource limitations and mission constraints while attempting to maximize science and engineering objectives. An important scientific objective for the exploration of geological features is selecting observations that spatially cover an area of interest. We have developed a metric to enable an in situ explorer to reason about and track the spatial coverage quality of a plan. We describe this technique and show how it is combined in the overall multiobjective optimization and planning algorithm.
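
    The abstract refers to a spatial coverage metric without giving its form. A hypothetical grid-based version is sketched below: coverage is the fraction of cells in an area of interest that fall inside at least one planned observation footprint. The circular-footprint assumption and all values are illustrative, not the authors' metric.

    ```python
    # Hypothetical grid-based spatial coverage metric for a plan.
    import numpy as np

    def coverage(area_mask, obs_points, footprint_radius, cell_size=1.0):
        ny, nx = area_mask.shape
        yy, xx = np.mgrid[0:ny, 0:nx] * cell_size
        covered = np.zeros_like(area_mask, dtype=bool)
        for (ox, oy) in obs_points:
            # Mark cells within this observation's circular footprint.
            covered |= (xx - ox) ** 2 + (yy - oy) ** 2 <= footprint_radius ** 2
        return (covered & area_mask).sum() / area_mask.sum()

    area = np.ones((50, 50), dtype=bool)      # a 50x50 region of interest
    plan = [(10, 10), (30, 25), (45, 40)]     # planned observation centers
    print(f"coverage quality: {coverage(area, plan, footprint_radius=12):.2f}")
    ```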

  14. Pythran: enabling static optimization of scientific Python programs

    NASA Astrophysics Data System (ADS)

    Guelton, Serge; Brunet, Pierrick; Amini, Mehdi; Merlini, Adrien; Corbillon, Xavier; Raynaud, Alan

    2015-01-01

    Pythran is an open source static compiler that turns modules written in a subset of Python language into native ones. Assuming that scientific modules do not rely much on the dynamic features of the language, it trades them for powerful, possibly inter-procedural, optimizations. These optimizations include detection of pure functions, temporary allocation removal, constant folding, Numpy ufunc fusion and parallelization, explicit thread-level parallelism through OpenMP annotations, false variable polymorphism pruning, and automatic vector instruction generation such as AVX or SSE. In addition to these compilation steps, Pythran provides a C++ runtime library that leverages the C++ STL to provide generic containers, and the Numeric Template Toolbox for Numpy support. It takes advantage of modern C++11 features such as variadic templates, type inference, move semantics and perfect forwarding, as well as classical idioms such as expression templates. Unlike the Cython approach, Pythran input code remains compatible with the Python interpreter. Output code is generally as efficient as the annotated Cython equivalent, if not more, but without the backward compatibility loss.
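
    As a concrete illustration, a module in the restricted Python subset that Pythran accepts might look like the sketch below; the `#pythran export` line is Pythran's interface annotation syntax, while the function itself is an invented example. Compiling with `pythran kernels.py` should produce a native extension module, and, as the abstract notes, the same file keeps working unchanged under the plain CPython interpreter.

    ```python
    # kernels.py -- a toy module in the Python subset Pythran accepts.
    #pythran export l2norm(float64[:,:])
    import numpy as np

    def l2norm(x):
        # A Numpy expression Pythran can fuse and vectorize (ufunc fusion,
        # SIMD generation), returning the row-wise Euclidean norms.
        return np.sqrt((x ** 2).sum(axis=1))
    ```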

  15. Simulation in production of open rotor propellers: from optimal surface geometry to automated control of mechanical treatment

    NASA Astrophysics Data System (ADS)

    Grinyok, A.; Boychuk, I.; Perelygin, D.; Dantsevich, I.

    2018-03-01

    A complex method for the simulation and production design of open rotor propellers was studied. An end-to-end scheme was proposed for evaluating, designing, and experimentally testing the optimal geometry of the propeller surface, for generating the machine control path, and for simulating the force conditions in the cutting zone and their relationship with treatment accuracy, which is defined by the elastic deformation of the propeller. The simulation data enabled the realization of combined automated path control of the cutting tool.

  16. A Simultaneous Approach to Optimizing Treatment Assignments with Mastery Scores. Research Report 89-5.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    An approach to simultaneous optimization of assignments of subjects to treatments followed by an end-of-mastery test is presented using the framework of Bayesian decision theory. Focus is on demonstrating how rules for the simultaneous optimization of sequences of decisions can be found. The main advantages of the simultaneous approach, compared…

  17. Investigating Flow Experience and Scientific Practices During a Mobile Serious Educational Game

    NASA Astrophysics Data System (ADS)

    Bressler, Denise M.; Bodzin, Alec M.

    2016-10-01

    Mobile serious educational games (SEGs) show promise for promoting scientific practices and high engagement. Researchers have quantified this engagement according to flow theory. This study investigated whether a mobile SEG promotes flow experience and scientific practices with eighth-grade urban students. Students playing the game (n = 59) were compared with students in a business-as-usual control activity (n = 120). In both scenarios, students worked in small teams. Data measures included an open-ended instrument designed to measure scientific practices, a self-report flow survey, and classroom observations. The game players had significantly higher levels of flow and scientific practices compared to the control group. Observations revealed that game teams received less whole-class instruction and review compared to the control teams. Game teachers had primarily a guide-on-the-side role when facilitating the game, while control teachers predominantly used didactic instruction when facilitating the control activity. Implications of these findings are discussed.

  18. An integrated strategy of knowledge application for optimal e-health implementation: A multi-method study protocol

    PubMed Central

    Gagnon, Marie-Pierre; Légaré, France; Fortin, Jean-Paul; Lamothe, Lise; Labrecque, Michel; Duplantie, Julie

    2008-01-01

    Background E-health is increasingly valued for supporting: 1) access to quality health care services for all citizens; 2) information flow and exchange; 3) integrated health care services; and 4) interprofessional collaboration. Nevertheless, several questions remain on the factors allowing an optimal integration of e-health in health care policies, organisations and practices. An evidence-based integrated strategy would maximise the efficacy and efficiency of e-health implementation. However, decisions regarding e-health applications are usually not evidence-based, which can lead to a sub-optimal use of these technologies. This study aims at understanding factors influencing the application of scientific knowledge for an optimal implementation of e-health in the health care system. Methods A three-year multi-method study is being conducted in the Province of Quebec (Canada). Decision-making at each decisional level (political, organisational and clinical) is analysed based on specific approaches. At the political level, critical incidents analysis is being used. This method will identify how decisions regarding the implementation of e-health could be influenced or not by scientific knowledge. Then, interviews with key decision-makers will look at how knowledge was actually used to support their decisions, and what factors influenced its use. At the organisational level, e-health projects are being analysed as case studies in order to explore the use of scientific knowledge to support decision-making during the implementation of the technology. Interviews with promoters, managers and clinicians will be carried out in order to identify factors influencing the production and application of scientific knowledge. At the clinical level, questionnaires are being distributed to clinicians involved in e-health projects in order to analyse factors influencing knowledge application in their decision-making. Finally, a triangulation of the results will be done using mixed methodologies to allow a transversal analysis of the results at each of the decisional levels. Results This study will identify factors influencing the use of scientific evidence and other types of knowledge by decision-makers involved in planning, financing, implementing and evaluating e-health projects. Conclusion These results will be highly relevant to inform decision-makers who wish to optimise the implementation of e-health in the Quebec health care system. This study is extremely relevant given the context of major transformations in the health care system, where e-health is becoming a must. PMID:18435853

  19. Market Intelligence Guide

    DTIC Science & Technology

    2012-01-05

    learn about the latest designs, trends in fashion, and scientific breakthroughs in chair ergonomics. Using this tradeshow, the Furnishings Commodity...these tools is essential to designing the optimal contract that reaps the most value from the exchange. Therefore, this market intelligence guide is...portfolio matrix) that are transferrable to the not-for-profit sector are absent. Each of these tools is essential to designing the optimal contract that

  20. Development of Probiotic Formulation for the Treatment of Iron Deficiency Anemia.

    PubMed

    Korčok, Davor Jovan; Tršić-Milanović, Nada Aleksandar; Ivanović, Nevena Djuro; Đorđević, Brižita Ivan

    2018-04-01

    Probiotics are increasingly present both as functional foods and in pharmaceutical preparations, with multiple levels of action that contribute to human health. Probiotics realize their positive effects at a proper dose and by maintaining the declared number of probiotic cells through the expiration date. An important precondition for developing a probiotic product is the right choice of a clinically proven probiotic strain, the choice of other active components, and the optimization of the quantity of the active probiotic component per product dose. This paper describes the optimization of the number of probiotic cells in the formulation of a dietary supplement that contains the probiotic culture Lactobacillus plantarum 299v, iron and vitamin C. Variations in the quantity of the active component were analyzed in development batches of the encapsulated probiotic product, categorized as a dietary supplement, with the following ingredients: probiotic culture, a sucrosomal form of iron, and vitamin C. An optimal quantity of the active component, 50 mg of L. plantarum, was selected. The purpose of this paper is to select the optimal formulation of the probiotic culture in a dietary supplement that contains iron and vitamin C, and also to determine its expiration date by analysis of the number of viable probiotic cells.
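
    As a back-of-the-envelope illustration of sizing the cell-count overage so the declared dose still holds at expiry, assuming a hypothetical first-order monthly viability loss (the rate and counts below are invented, not the paper's stability data):

    ```python
    declared_cfu = 1e9            # CFU per dose claimed on the label
    shelf_life_months = 24
    monthly_loss = 0.04           # assumed 4% viable-cell loss/month (hypothetical)

    # Initial count needed so the declared CFU still holds at expiry:
    initial_cfu = declared_cfu / (1 - monthly_loss) ** shelf_life_months
    print(f"required initial count: {initial_cfu:.2e} CFU/dose")
    # ~2.7e9, i.e. roughly a 2.7x manufacturing overage
    ```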

  1. Differences between Lab Completion and Non-Completion on Student Performance in an Online Undergraduate Environmental Science Program

    NASA Astrophysics Data System (ADS)

    Corsi, Gianluca

    2011-12-01

    Web-based technology has revolutionized the way education is delivered. Although the advantages of online learning appeal to large numbers of students, some concerns arise. One major concern in online science education is the value that participation in labs has for student performance. The purpose of this study was to assess the relationships between lab completion and student academic success as measured by test grades, scientific self-confidence, scientific skills, and concept mastery. A random sample of 114 volunteer undergraduate students, from an online Environmental Science program at the American Public University System, was tested. The study followed a quantitative, non-experimental research design. Paired sample t-tests were used for statistical comparison between pre-lab and post-lab test grades, two scientific skills quizzes, and two scientific self-confidence surveys administered at the beginning and at the end of the course. The results of the paired sample t-tests revealed statistically significant improvements in all post-lab test scores: Air Pollution lab, t(112) = 6.759, p < .001; Home Chemicals lab, t(114) = 8.585, p < .001; Water Use lab, t(116) = 6.657, p < .001; Trees and Carbon lab, t(113) = 9.921, p < .001; Stratospheric Ozone lab, t(112) = 12.974, p < .001; Renewable Energy lab, t(115) = 7.369, p < .001. The end of the course Scientific Skills quiz revealed statistically significant improvements, t(112) = 8.221, p < .001. The results of the two surveys showed a statistically significant improvement in student Scientific Self-Confidence as a result of lab completion, t(114) = 3.015, p < .05. Because age and gender were available, regression models were developed. The results indicated weak multiple correlation coefficients and were not statistically significant at alpha = .05. Evidence suggests that labs play a positive role in a student's academic success. It is recommended that lab experiences be included in all online Environmental Science programs, with emphasis on open-ended inquiries, and adoption of online tools to enhance hands-on experiences, such as virtual reality platforms and digital animations. Future research is encouraged to investigate possible correlations between socio-demographic attributes and academic success of students enrolled in online science programs in reference to lab completion.
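
    For readers who want to reproduce the analysis pattern (not the data), a paired-sample t-test on pre-/post-lab scores takes only a few lines with scipy; the scores below are synthetic:

    ```python
    # Paired-sample t-test on hypothetical pre-/post-lab scores.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    pre = rng.normal(70, 10, 114)          # hypothetical pre-lab scores
    post = pre + rng.normal(5, 8, 114)     # hypothetical post-lab gain

    t, p = stats.ttest_rel(post, pre)
    print(f"t({len(pre) - 1}) = {t:.3f}, p = {p:.4g}")
    ```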

  2. [A nosology for supernatural phenomena and the construction of the 'possessed' brain in the nineteenth century].

    PubMed

    Goncalves, Valeria Portugal; Ortega, Francisco

    2013-06-01

    At the end of the twentieth century, supernatural phenomena such as so-called trances and possession by spirits received a scientific classification, which includes the numerous diagnoses of the dominant psychiatry. At the end of the nineteenth century we can observe a process of scientific categorization of phenomena considered to have originated in superstition or popular imagination. In this work we show how trances and spiritual possession were studied by Franz Anton Mesmer and his followers when developing the concept of magnetism; by James Braid during the creation of his theory of hypnosis; and by Jean-Martin Charcot, whose work marked the entry of hysteria into nosological classification. Despite the differences between these schools, we identify the use of the brain and cerebral metaphors as the foundation of theories of the mind.

  3. Dawning of A New Day on Dec. 22, 2012

    NASA Image and Video Library

    2017-12-08

    Dec. 21, 2012 was not the end of the world, contrary to some of the common beliefs out there. NASA's SDO satellite captured this image of the Sun on 12-22-12 at 00:14 UTC as the time rolled over into the new day. To learn more about why the world did not end yesterday, watch this Science @ NASA video: youtu.be/2wimiRUHMI4 or visit www.nasa.gov/2012 Credit: NASA/NOAA GOES Project. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  4. Cardiomed System for Medical Monitoring Onboard ISS

    NASA Astrophysics Data System (ADS)

    Lloret, J. C.; Aubry, P.; Nguyen, L.; Kozharinov, V.; Grachev, V.; Temnova, E.

    2008-06-01

    The Cardiomed system was developed with two main objectives: (1) cardiovascular medical monitoring of cosmonauts onboard the ISS together with the LBNP countermeasure; (2) scientific study of the cardiovascular system in micro-gravity. Cardiomed is an integrated end-to-end system, from the onboard segment operating different medical instruments to the ground segment, which provides real-time telemetry of on-board experiments and off-line analysis of physiological measurements. In the first part of the paper, Cardiomed is described from an architectural point of view together with some typical uses. In the second part, the most constraining requirements with respect to system design are introduced. Some requirements are generic; some are specific to medical follow-up, others to scientific objectives. In the last part, the main technical challenges addressed during the development and qualification of Cardiomed, and the lessons learnt, are presented.

  5. [Reform of collective forest property in Liaoning Province: a discussion].

    PubMed

    Tai, Shan-shan; Hu, Yuan-man; Zhang, Hong-sheng; Han, Yu-ku; Xiao, Ze-chen

    2010-05-01

    The reform of collective forest property has increased farmers' income and brought new development to forestry. On the basis of explaining the concept of collective forest property and the related management system, this paper introduces in detail the course, main approaches, and effects of the reform in Liaoning Province, analyzes the research progress on the reform and its existing problems, and offers an appraisal of, and expectations for, the reform of collective forest property in Liaoning Province, aiming to inform the direction of further reform. In this province, the reform of collective forest property had the characteristics of classified reform, different reform types in different areas, and the main and affiliated reforms being carried out at the same time. By the end of 2009, the main task had turned to the affiliated reform. In the future, the reform should focus on the optimal forestry management model to improve forest economic, ecological, and social benefits, and use multi-disciplinary methods to strengthen research on the relationships between forestry management and forest ecological functions, providing a scientific basis for the reform of collective forest property in Liaoning Province.

  6. Terascale Computing in Accelerator Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Kwok

    2002-08-21

    We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedented comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.

  7. Understanding the socioeconomic heterogeneity in healthcare in US counties: the effect of population density, education and poverty on H1N1 pandemic mortality.

    PubMed

    Ponnambalam, L; Samavedham, L; Lee, H R; Ho, C S

    2012-05-01

    The recent outbreak of H1N1 has provided the scientific community with a sad but timely opportunity to understand the influence of socioeconomic determinants on H1N1 pandemic mortality. To this end, we have used data collected from 341 US counties to model H1N1 deaths/1000 using 12 socioeconomic predictors to discover why certain counties reported fewer H1N1 deaths compared to other counties. These predictors were then used to build a decision tree. The decision tree developed was then used to predict H1N1 mortality for the whole of the USA. Our estimate of 7667 H1N1 deaths is in accord with the lower bound of the CDC estimate of 8870 deaths. In addition to the H1N1 death estimates, we have listed possible counties to be targeted for health-related interventions. The respective state/county authorities can use these results as the basis to target and optimize the distribution of public health resources.
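
    A minimal sketch of the modeling step, fitting a decision tree on county-level socioeconomic predictors to estimate deaths per 1000. The three predictors and all values below are synthetic stand-ins, not the 341-county dataset or the authors' 12 predictors:

    ```python
    # Decision-tree regression on synthetic county-level predictors.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(2)
    n = 341
    X = np.column_stack([
        rng.uniform(10, 5000, n),    # population density (per sq mi)
        rng.uniform(60, 95, n),      # % with high-school education
        rng.uniform(5, 30, n),       # % below poverty line
    ])
    # Synthetic response: mortality loosely rising with poverty.
    y = 0.001 * X[:, 2] - 0.0002 * X[:, 1] + rng.normal(0, 0.005, n)

    tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
    print("training R^2:", round(tree.score(X, y), 3))
    ```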

  8. Large-Scale NASA Science Applications on the Columbia Supercluster

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2005-01-01

    Columbia, NASA's newest 61 teraflops supercomputer that became operational late last year, is a highly integrated Altix cluster of 10,240 processors, and was named to honor the crew of the Space Shuttle lost in early 2003. Constructed in just four months, Columbia increased NASA's computing capability ten-fold, and revitalized the Agency's high-end computing efforts. Significant cutting-edge science and engineering simulations in the areas of space and Earth sciences, as well as aeronautics and space operations, are already occurring on this largest operational Linux supercomputer, demonstrating its capacity and capability to accelerate NASA's space exploration vision. The presentation will describe how an integrated environment consisting not only of next-generation systems, but also modeling and simulation, high-speed networking, parallel performance optimization, and advanced data analysis and visualization, is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions. The talk will conclude by discussing how NAS partnered with various NASA centers, other government agencies, computer industry, and academia, to create a national resource in large-scale modeling and simulation.

  9. Future Directions in Research on Racism-Related Stress and Racial-Ethnic Protective Factors for Black Youth.

    PubMed

    Jones, Shawn C T; Neblett, Enrique W

    2017-01-01

    Research on racism-related stress and racial-ethnic protective factors represents an important enterprise for optimizing the mental health of African American and other racial and ethnic minority youth. However, there has been a relative dearth of work on these factors in the clinical psychology research literature, and more work is needed in outlets such as these. To this end, the current article adopts a developmental psychopathology framework and uses recent empirical findings to outline our current understanding of racism-related stress and racial-ethnic protective factors (i.e., racial identity, racial socialization, Africentric worldview) for African American youth. We then provide nine recommendations, spanning basic, applied, and broader cross-cutting research lines, that we prioritize as essential to advancing the future scientific investigation of this crucial research agenda. Within and across these recommendations, we issue a charge to researchers and clinicians alike, with the ultimate goal of alleviating the negative mental health impact that racism-related stress can have on the well-being and mental health of African American and other racial and ethnic minority youth.

  10. Efficient data management tools for the heterogeneous big data warehouse

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Osipova, V. V.; Ivanov, M. A.; Klimentov, A.; Grigorieva, N. V.; Nalamwar, H. S.

    2016-09-01

    The traditional RDBMS is well suited to normalized data structures. RDBMSs have served well for decades, but the technology is not optimal for data processing and analysis in data-intensive fields like social networks, the oil-gas industry, experiments at the Large Hadron Collider, etc. Several challenges have recently been raised regarding the scalability of data-warehouse-like workloads run against transactional schemas, in particular for the analysis of archived data or the aggregation of data for summary and accounting purposes. The paper evaluates new database technologies like HBase, Cassandra, and MongoDB, commonly referred to as NoSQL databases, for handling messy, varied and large amounts of data. The evaluation considers the performance, throughput and scalability of the above technologies for several scientific and industrial use-cases. This paper outlines the technologies and architectures needed for processing Big Data, as well as describing the back-end application that implements data migration from an RDBMS to a NoSQL data warehouse, the NoSQL database organization, and how it could be useful for further data analytics.
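
    A minimal sketch of the kind of back-end migration the abstract describes, streaming rows out of an RDBMS and bulk-loading them as documents into MongoDB. The table, field, and database names are invented, and a running MongoDB instance is assumed:

    ```python
    # Stream relational rows into a NoSQL store (illustrative schema).
    import sqlite3
    from pymongo import MongoClient

    src = sqlite3.connect("warehouse.db")
    src.row_factory = sqlite3.Row
    dst = MongoClient("mongodb://localhost:27017")["warehouse"]["events"]

    batch = []
    for row in src.execute("SELECT * FROM events"):
        batch.append(dict(row))          # one relational row -> one document
        if len(batch) == 1000:           # insert in bulk for throughput
            dst.insert_many(batch)
            batch.clear()
    if batch:
        dst.insert_many(batch)
    ```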

  11. Autonomous terrain characterization and modelling for dynamic control of unmanned vehicles

    NASA Technical Reports Server (NTRS)

    Talukder, A.; Manduchi, R.; Castano, R.; Owens, K.; Matthies, L.; Castano, A.; Hogg, R.

    2002-01-01

    An end-to-end obstacle negotiation system is envisioned to be useful for optimized path planning and vehicle navigation in terrain conditions cluttered with vegetation, bushes, rocks, etc. Results on natural terrain with various natural materials are presented.

  12. Multi-mode horn

    NASA Technical Reports Server (NTRS)

    Neilson, Jeffrey M. (Inventor)

    2002-01-01

    A horn has an input aperture and an output aperture, and comprises a conductive inner surface formed by rotating a curve about a central axis. The curve comprises a first arc having an input aperture end and a transition end, and a second arc having a transition end and an output aperture end. When rotated about the central axis, the first arc input aperture end forms an input aperture, and the second arc output aperture end forms an output aperture. The curve is then optimized to provide a mode conversion which maximizes the power transfer of input energy to the Gaussian mode at the output aperture.

  13. The NASA-ISRO SAR Mission Science Data Products and Processing Workflows

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Agram, P. S.; Lavalle, M.; Cohen, J.; Buckley, S.; Kumar, R.; Misra-Ray, A.; Ramanujam, V.; Agarwal, K. M.

    2017-12-01

    The NASA-ISRO SAR (NISAR) Mission is currently in the development phase and in the process of specifying its suite of data products and algorithmic workflows, responding to inputs from the NISAR Science and Applications Team. NISAR will provide raw data (Level 0), full-resolution complex imagery (Level 1), and interferometric and polarimetric image products (Level 2) for the entire data set, in both natural radar and geocoded coordinates. NASA and ISRO are coordinating the formats, meta-data layers, and algorithms for these products, for both the NASA-provided L-band radar and the ISRO-provided S-band radar. Higher-level products will also be generated for the purpose of calibration and validation, over large areas of Earth, including tectonic plate boundaries, ice sheets and sea-ice, and areas of ecosystem disturbance and change. This level of comprehensive product generation is unprecedented for SAR missions, and leads to storage and processing challenges for the production system and the archive center. Further, recognizing the potential to support applications that require low-latency product generation and delivery, the NISAR team is optimizing the entire end-to-end ground data system for such response, including exploring the advantages of cloud-based processing, algorithmic acceleration using GPUs, and on-demand processing schemes that minimize computational and transport costs but allow rapid delivery to science and applications users. This paper will review the current products and workflows, and discuss the scientific and operational trade-space of mission capabilities.

  14. Nectar for the taking: the popularization of scientific bee culture in England, 1609-1809.

    PubMed

    Ebert, Adam

    2011-01-01

    This essay expands and refines academic knowledge of English beekeeping during the seventeenth and eighteenth centuries. Scientific beekeeping focused on improvement, which, in turn, depended on the dissemination of ideas and practices. This analysis, therefore, encompasses the mentalities and tactics of popularizers. The article also identifies two neglected concepts in the popularization campaign. First, popularizers saw scientific beekeeping as a way to end the tradition of killing the bees in order to safely harvest. Second, they sought to promote a rural industry for the economic welfare of the nation. The case study of Exeter's Western Apiarian Society reveals precisely how popularization functioned in reality. The result is a more thorough history of scientific beekeeping and how the rhetoric of improvement related to the culture of practice.

  15. Optimizing the scientific yield from a randomized controlled trial (RCT): evaluating two behavioral interventions and assessment reactivity with a single trial.

    PubMed

    Carey, Michael P; Senn, Theresa E; Coury-Doniger, Patricia; Urban, Marguerite A; Vanable, Peter A; Carey, Kate B

    2013-09-01

    Randomized controlled trials (RCTs) remain the gold standard for evaluating intervention efficacy but are often costly. To optimize their scientific yield, RCTs can be designed to investigate multiple research questions. This paper describes an RCT that used a modified Solomon four-group design to simultaneously evaluate two, theoretically-guided, health promotion interventions as well as assessment reactivity. Recruited participants (N = 1010; 56% male; 69% African American) were randomly assigned to one of four conditions formed by crossing two intervention conditions (i.e., general health promotion vs. sexual risk reduction intervention) with two assessment conditions (i.e., general health vs. sexual health survey). After completing their assigned baseline assessment, participants received the assigned intervention, and returned for follow-ups at 3, 6, 9, and 12 months. In this report, we summarize baseline data, which show high levels of sexual risk behavior; alcohol, marijuana, and tobacco use; and fast food consumption. Sexual risk behaviors and substance use were correlated. Participants reported high satisfaction with both interventions but ratings for the sexual risk reduction intervention were higher. Planned follow-up sessions, and subsequent analyses, will assess changes in health behaviors including sexual risk behaviors. This study design demonstrates one way to optimize the scientific yield of an RCT.
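
    The four-cell randomization is easy to express in code. A sketch of the paper's 2 x 2 crossing (condition labels taken from the abstract; the assignment code itself is illustrative, not the trial's actual procedure):

    ```python
    # Modified Solomon four-group assignment: 2 interventions x 2 assessments.
    import random

    interventions = ["general_health_promotion", "sexual_risk_reduction"]
    assessments = ["general_health_survey", "sexual_health_survey"]
    conditions = [(i, a) for i in interventions for a in assessments]

    random.seed(42)
    participants = [f"P{n:04d}" for n in range(1010)]
    assignment = {p: random.choice(conditions) for p in participants}
    # For exactly balanced cells, shuffle the participant list instead and
    # deal it into the four conditions round-robin.
    ```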

  16. Curriculum Alignment with Vision and Change Improves Student Scientific Literacy

    PubMed Central

    Auerbach, Anna Jo; Schussler, Elisabeth E.

    2017-01-01

    The Vision and Change in Undergraduate Biology Education final report challenged institutions to reform their biology courses to focus on process skills and student active learning, among other recommendations. A large southeastern university implemented curricular changes to its majors’ introductory biology sequence in alignment with these recommendations. Discussion sections focused on developing student process skills were added to both lectures and a lab, and one semester of lab was removed. This curriculum was implemented using active-learning techniques paired with student collaboration. This study determined whether these changes resulted in a higher gain of student scientific literacy by conducting pre/posttesting of scientific literacy for two cohorts: students experiencing the unreformed curriculum and students experiencing the reformed curriculum. Retention of student scientific literacy for each cohort was also assessed 4 months later. At the end of the academic year, scientific literacy gains were significantly higher for students in the reformed curriculum (p = 0.005), with those students having double the scientific literacy gains of the cohort in the unreformed curriculum. Retention of scientific literacy did not differ between the cohorts. PMID:28495933

  17. Scientific Integrity Policy Creation and Implementation.

    NASA Astrophysics Data System (ADS)

    Koizumi, K.

    2017-12-01

    Ensuring the integrity of science was a priority for the Obama Administration. In March 2009, President Obama issued a Presidential Memorandum that recognized the need for the public to be able to trust the science and scientific process informing public policy decisions. In 2010, the White House Office of Science and Technology Policy (OSTP) issued a Memorandum providing guidelines for Federal departments and agencies to follow in developing scientific integrity policies. This Memorandum describes minimum standards for: (1) strengthening the foundations of scientific integrity in government, including by shielding scientific data and analysis from inappropriate political influence; (2) improving public communication about science and technology by promoting openness and transparency; (3) enhancing the ability of Federal Advisory Committees to provide independent scientific advice; and (4) supporting the professional development of government scientists and engineers. The Memorandum called upon the heads of departments and agencies to develop scientific integrity policies that meet these requirements. At the end of the Obama Administration, 24 Federal departments and agencies had developed and implemented scientific integrity policies consistent with the OSTP guidelines. This year, there are significant questions as to the Trump Administration's commitment to these scientific integrity policies and interest in the Congress in codifying these policies in law. The session will provide an update on the status of agency scientific integrity policies and legislation.

  18. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    NASA Technical Reports Server (NTRS)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data is presented in Volume 2. Technical support was provided to all Division and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Mission and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  19. Critical appraisal of scientific articles: part 1 of a series on evaluation of scientific publications.

    PubMed

    du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria

    2009-02-01

    In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction into critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined. Basic knowledge of study design, structuring of an article, the role of different sections, of statistical presentations as well as sources of error and limitation are presented. The reader does not require extensive methodological knowledge. As far as necessary for critical appraisal of scientific articles, differences in research areas like epidemiology, clinical, and basic research are outlined. Further useful references are presented. Basic methodological knowledge is required to select and interpret scientific articles correctly.

  20. Experimental study on the optimal purge duration of a proton exchange membrane fuel cell with a dead-ended anode

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Fen; Chen, Yong-Song

    2017-02-01

    When a proton exchange membrane fuel cell (PEMFC) is operated with a dead-ended anode, impurities gradually accumulate within the anode, resulting in a performance drop. An anode purge is thereby ultimately required to remove impurities within the anode. A purge strategy comprises a purge interval (valve closed) and a purge duration (valve open). A short purge interval causes frequent and unnecessary activation of the valve, whereas a long purge interval leads to excessive impurity accumulation. A short purge duration causes an incomplete performance recovery, whereas a long purge duration results in low hydrogen utilization. In this study, a series of experimental trials was conducted to simultaneously measure the hydrogen supply rate and power generation of a PEMFC at a frequency of 50 Hz for various operating current density levels and purge durations. The effect of purge duration on the cell's energy efficiency was subsequently analyzed and discussed. The results showed that the optimal purge duration for the PEMFC was approximately 0.2 s. Based on the results of this study, a methodical process for determining optimal purge durations was proposed for widespread application. Purging approximately one-fourth of the anode gas can obtain optimal energy efficiency for a PEMFC with a dead-ended anode.
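
    A sketch of the efficiency analysis implied by the abstract: for each candidate purge duration, energy efficiency is electrical energy out divided by the chemical energy of the hydrogen consumed. The logged traces below are synthetic placeholders; real traces would differ across purge durations:

    ```python
    # Energy efficiency from sampled power and hydrogen-rate traces.
    import numpy as np

    LHV_H2 = 120e6                     # J/kg, lower heating value of hydrogen
    dt = 1.0 / 50.0                    # sampling period of the 50 Hz logging

    def energy_efficiency(power_w, h2_rate_kg_s):
        electrical = (power_w * dt).sum()              # J produced over window
        fuel = (h2_rate_kg_s * dt).sum() * LHV_H2      # J of hydrogen consumed
        return electrical / fuel

    rng = np.random.default_rng(3)
    for duration in (0.1, 0.2, 0.4):                   # candidate purges, s
        power = rng.normal(20.0, 0.5, 500)             # W, synthetic stand-in
        h2 = rng.normal(4.2e-7, 1e-8, 500)             # kg/s, synthetic stand-in
        print(f"purge {duration:.1f} s -> "
              f"efficiency {energy_efficiency(power, h2):.3f}")
    ```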

  1. Assessing and Adapting Scientific Results for Space Weather Research to Operations (R2O)

    NASA Astrophysics Data System (ADS)

    Thompson, B. J.; Friedl, L.; Halford, A. J.; Mays, M. L.; Pulkkinen, A. A.; Singer, H. J.; Stehr, J. W.

    2017-12-01

    Why doesn't a solid scientific paper necessarily result in a tangible improvement in space weather capability? A well-known challenge in space weather forecasting is investing effort to turn the results of basic scientific research into operational knowledge. This process is commonly known as "Research to Operations," abbreviated R2O. There are several aspects of this process: 1) How relevant is the scientific result to a particular space weather process? 2) If fully utilized, how much will that result improve the reliability of the forecast for the associated process? 3) How much effort will this transition require? Is it already in a relatively usable form, or will it require a great deal of adaptation? 4) How much burden will be placed on forecasters? Is it "plug-and-play" or will it require effort to operate? 5) How can robust space weather forecasting identify challenges for new research? This presentation will cover several approaches that have potential utility in assessing scientific results for use in space weather research. The demonstration of utility is the first step, relating to the establishment of metrics to ensure that there will be a clear benefit to the end user. The presentation will then move to means of determining cost vs. benefit, (where cost involves the full effort required to transition the science to forecasting, and benefit concerns the improvement of forecast reliability), and conclude with a discussion of the role of end users and forecasters in driving further innovation via "O2R."

  2. End-to-End Information System design at the NASA Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    Recognizing a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at the Jet Propulsion Laboratory. The objectives of this effort are to ensure that all flight projects adequately cope with information flow problems at an early stage of system design, and that cost-effective, multi-mission capabilities are developed when capital investments are made in supporting elements. The paper reviews the End-to-End Information System (EEIS) activity at the Laboratory, and notes the ties to the NASA End-to-End Data System program.

  3. Constrained optimal multi-phase lunar landing trajectory with minimum fuel consumption

    NASA Astrophysics Data System (ADS)

    Mathavaraj, S.; Pandiyan, R.; Padhi, R.

    2017-12-01

    A multi-phase, constrained, fuel-optimal trajectory design approach based on the Legendre pseudospectral method is presented in this paper. The objective here is to find an optimal approach to successfully guide a lunar lander from the perilune (18 km altitude) of a transfer orbit to a height of 100 m over a specific landing site. After attaining 100 m altitude, there is a mission-critical re-targeting phase, which has a very different objective (but is not critical for fuel optimization) and hence is not considered in this paper. The proposed approach takes into account various mission constraints in different phases from perilune to the landing site. These constraints include phase-1 ('braking with rough navigation') from 18 km altitude to 7 km altitude, where navigation accuracy is poor; phase-2 ('attitude hold'), holding the lander attitude for 35 s of vision-camera processing used to obtain the navigation error; and phase-3 ('braking with precise navigation') from the end of phase-2 to 100 m altitude over the landing site, where navigation accuracy is good (due to vision-camera navigation inputs). At the end of phase-1, there are constraints on position and attitude. In phase-2, the attitude must be held throughout. At the end of phase-3, the constraints include accuracy in position, velocity, and attitude orientation. The proposed optimal trajectory technique satisfies the mission constraints in each phase and provides an overall fuel-minimizing guidance command history.
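
    For context, a Legendre pseudospectral method enforces the dynamics and path constraints at Legendre-Gauss-Lobatto (LGL) collocation points. A short sketch computing those nodes and quadrature weights from the standard formulas (not the authors' code):

    ```python
    # Legendre-Gauss-Lobatto nodes and quadrature weights on [-1, 1].
    import numpy as np
    from numpy.polynomial import legendre

    def lgl_nodes_weights(N):
        # Interior nodes are the roots of P_N'(x); endpoints are -1 and +1.
        PN = legendre.Legendre.basis(N)
        interior = PN.deriv().roots()
        x = np.concatenate(([-1.0], np.sort(interior), [1.0]))
        # Standard LGL weights: w_i = 2 / (N (N+1) [P_N(x_i)]^2).
        w = 2.0 / (N * (N + 1) * PN(x) ** 2)
        return x, w

    x, w = lgl_nodes_weights(8)
    print(np.round(x, 4))
    print("weights sum to", w.sum())   # exact quadrature of constants: 2
    ```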

  4. Discrete Optimization of Electronic Hyperpolarizabilities in a Chemical Subspace

    DTIC Science & Technology

    2009-05-01

    molecular design. Methods for optimization in discrete spaces have been studied extensively and recently reviewed (5). Optimization methods include...integer programming, as in branch-and-bound techniques (including dead-end elimination [6]), simulated annealing (7), and genetic algorithms (8...These algorithms have found renewed interest and application in molecular and materials design (9-12). Recently, new approaches have been

  5. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  6. The Ethics of Educational Research

    ERIC Educational Resources Information Center

    McKague, Ormond

    1975-01-01

    Author delivered an address that compared the two directions that scientific research may take; that is, to the millennium of Skinner and Schaff or to the end of man as seen by Huxley and Ellul. (Author/RK)

  7. Physics Division annual progress report for period ending December 31, 1975. [ORNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-05-01

    Separate abstracts were prepared for each of the data-containing sections of this report. Additional sections deal with publications, titles of papers presented at scientific and technical meetings, personnel, etc. (RWR)

  8. A stochastic algorithm for global optimization and for best populations: A test case of side chains in proteins

    PubMed Central

    Glick, Meir; Rayan, Anwar; Goldblum, Amiram

    2002-01-01

    The problem of global optimization is pivotal in a variety of scientific fields. Here, we present a robust stochastic search method that is able to find the global minimum for a given cost function, as well as, in most cases, any number of best solutions for very large combinatorial “explosive” systems. The algorithm iteratively eliminates variable values that contribute consistently to the highest end of a cost function's spectrum of values for the full system. Values that have not been eliminated are retained for a full, exhaustive search, allowing the creation of an ordered population of best solutions, which includes the global minimum. We demonstrate the ability of the algorithm to explore the conformational space of side chains in eight proteins, with 54 to 263 residues, to reproduce a population of their low energy conformations. The 1,000 lowest energy solutions are identical in the stochastic (with two different seed numbers) and full, exhaustive searches for six of eight proteins. The others retain the lowest 141 and 213 (of 1,000) conformations, depending on the seed number, and the maximal difference between stochastic and exhaustive is only about 0.15 Kcal/mol. The energy gap between the lowest and highest of the 1,000 low-energy conformers in eight proteins is between 0.55 and 3.64 Kcal/mol. This algorithm offers real opportunities for solving problems of high complexity in structural biology and in other fields of science and technology. PMID:11792838
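
    The elimination idea described above can be sketched on a toy combinatorial problem: repeatedly sample full assignments, discard variable values that are over-represented among the worst-scoring samples, then search the surviving combinations exhaustively. Everything below (the pairwise cost, thresholds, and sample sizes) is an invented stand-in for the side-chain energy model:

    ```python
    # Toy value-elimination global search on a synthetic pairwise cost.
    import itertools
    import numpy as np

    rng = np.random.default_rng(4)
    n_vars, n_vals = 6, 5
    pair_cost = rng.normal(0, 1, (n_vars, n_vals, n_vars, n_vals))

    def cost(assign):
        return sum(pair_cost[i, assign[i], j, assign[j]]
                   for i in range(n_vars) for j in range(i + 1, n_vars))

    allowed = [set(range(n_vals)) for _ in range(n_vars)]
    for _ in range(50):                          # bounded elimination rounds
        if max(len(a) for a in allowed) <= 2:
            break
        samples = [[int(rng.choice(sorted(a))) for a in allowed]
                   for _ in range(400)]
        order = np.argsort([cost(s) for s in samples])
        worst = [samples[k] for k in order[-100:]]   # top of the cost spectrum
        for i in range(n_vars):
            if len(allowed[i]) <= 2:
                continue
            vals, counts = np.unique([s[i] for s in worst], return_counts=True)
            # Drop a value only if clearly over-represented among bad samples.
            if counts.max() > 2 * 100 / len(allowed[i]):
                allowed[i].discard(int(vals[counts.argmax()]))

    # Exhaustive search over the surviving (much smaller) space.
    best = min(itertools.product(*(sorted(a) for a in allowed)), key=cost)
    print("best assignment:", best, "cost:", round(cost(best), 3))
    ```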

  9. Workflows for Full Waveform Inversions

    NASA Astrophysics Data System (ADS)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  10. Constructing Scientific Explanations: a System of Analysis for Students' Explanations

    NASA Astrophysics Data System (ADS)

    de Andrade, Vanessa; Freire, Sofia; Baptista, Mónica

    2017-08-01

    This article describes a system of analysis aimed at characterizing students' scientific explanations. Science education literature and reform documents have been highlighting the importance of scientific explanations for students' conceptual understanding and for their understanding of the nature of scientific knowledge. Nevertheless, and despite general agreement regarding the potential of having students construct their own explanations, a consensual notion of scientific explanation has still not been reached. As a result, within science education literature, there are several frameworks defining scientific explanations, with different foci as well as different notions of what accounts as a good explanation. Considering this, and based on a more ample project, we developed a system of analysis to characterize students' explanations. It was conceptualized and developed based on theories and models of scientific explanations, science education literature, and from examples of students' explanations collected by an open-ended questionnaire. With this paper, it is our goal to present the system of analysis, illustrating it with specific examples of students' collected explanations. In addition, we expect to point out its adequacy and utility for analyzing and characterizing students' scientific explanations as well as for tracing their progression.

  11. Questioning the evidence for a claim in a socio-scientific issue: an aspect of scientific literacy

    NASA Astrophysics Data System (ADS)

    Roberts, Ros; Gott, Richard

    2010-11-01

    Understanding the science in a 'socio-scientific issue' is at the heart of the varied definitions of 'scientific literacy'. Many consider that understanding evidence is necessary to participate in decision making and to challenge the science that affects people's lives. A model is described that links practical work, argumentation and scientific literacy which is used as the basis of this research. If students are explicitly taught about evidence does this transfer to students asking questions in the context of a local socio-scientific issue? What do they ask questions about? Sixty-five primary teacher training students were given the pre-test, before being taught the 'concepts of evidence' and applying them in an open-ended investigation, and were tested again 15 weeks later. Data were coded using Toulmin's argument pattern (TAP) and the 'concepts of evidence'. After the intervention it was found that, in relation to a socio-scientific issue, they raised significantly more questions specifically about the evidence that led to the scientists' claims, although questions explicitly targeting the quality of the data were still rare. This has implications for curricula that aim for scientific literacy.

  12. Optimization of Multiple Related Negotiation through Multi-Negotiation Network

    NASA Astrophysics Data System (ADS)

    Ren, Fenghui; Zhang, Minjie; Miao, Chunyan; Shen, Zhiqi

    In this paper, a Multi-Negotiation Network (MNN) and a Multi-Negotiation Influence Diagram (MNID) are proposed to optimally handle Multiple Related Negotiations (MRN) in a multi-agent system. Most popular, state-of-the-art approaches perform MRN sequentially. However, a sequential procedure may not optimally execute MRN in terms of maximizing the global outcome, and may even lead to unnecessary losses in some situations. The motivation of this research is to use a MNN to handle MRN concurrently so as to maximize the expected utility of MRN. Firstly, both the joint success rate and the joint utility by considering all related negotiations are dynamically calculated based on a MNN. Secondly, by employing a MNID, an agent's possible decision on each related negotiation is reflected by the value of expected utility. Lastly, through comparing expected utilities between all possible policies to conduct MRN, an optimal policy is generated to optimize the global outcome of MRN. The experimental results indicate that the proposed approach can improve the global outcome of MRN in a successful end scenario, and avoid unnecessary losses in an unsuccessful end scenario.
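
    The policy comparison reduces to computing an expected utility for every joint choice across the related negotiations. A toy sketch, with invented success rates and utilities standing in for the values a MNID would supply:

    ```python
    # Enumerate joint policies across related negotiations and pick the one
    # with the highest expected utility (all numbers invented).
    import itertools

    # Per negotiation: {policy_option: (success_rate, utility_if_success)}
    negotiations = [
        {"concede": (0.90, 2.0), "hold": (0.5, 5.0)},
        {"concede": (0.80, 1.5), "hold": (0.6, 4.0)},
        {"concede": (0.95, 1.0), "hold": (0.4, 6.0)},
    ]

    def expected_utility(policy):
        # Assume all related negotiations must succeed for the deal to hold.
        joint_rate, joint_util = 1.0, 0.0
        for neg, choice in zip(negotiations, policy):
            rate, util = neg[choice]
            joint_rate *= rate
            joint_util += util
        return joint_rate * joint_util

    best = max(itertools.product(["concede", "hold"], repeat=3),
               key=expected_utility)
    print(best, round(expected_utility(best), 3))
    ```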

  13. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    NASA Astrophysics Data System (ADS)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and for data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from various runs of geoKepler workflows. The communication between iPython and Kepler workflow executions is established through an iPython magic function for Kepler that we have implemented. In summary, geoKepler is an ecosystem that makes geospatial processing and analysis of any kind programmable, reusable, scalable and sharable.
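
    As an illustration of what a geoKepler-style GIS actor does under the hood, the snippet below opens a vector dataset with GDAL/OGR and walks its features; the file name is hypothetical:

    ```python
    # Read a vector dataset with GDAL/OGR and hand features downstream.
    from osgeo import ogr

    ds = ogr.Open("fire_perimeters.shp")      # hypothetical input file
    layer = ds.GetLayer()
    defn = layer.GetLayerDefn()
    print(layer.GetFeatureCount(), "features; fields:",
          [defn.GetFieldDefn(i).GetName() for i in range(defn.GetFieldCount())])
    for feature in layer:
        geom = feature.GetGeometryRef()
        # e.g. serialize for a downstream model actor
        _ = geom.ExportToJson()
    ```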

  14. Aeolus End-To-End Simulator and Wind Retrieval Algorithms up to Level 1B

    NASA Astrophysics Data System (ADS)

    Reitebuch, Oliver; Marksteiner, Uwe; Rompel, Marc; Meringer, Markus; Schmidt, Karsten; Huber, Dorit; Nikolaus, Ines; Dabas, Alain; Marshall, Jonathan; de Bruin, Frank; Kanitz, Thomas; Straume, Anne-Grete

    2018-04-01

    The first wind lidar in space ALADIN will be deployed on ESÁs Aeolus mission. In order to assess the performance of ALADIN and to optimize the wind retrieval and calibration algorithms an end-to-end simulator was developed. This allows realistic simulations of data downlinked by Aeolus. Together with operational processors this setup is used to assess random and systematic error sources and perform sensitivity studies about the influence of atmospheric and instrument parameters.

  15. An Exploration Of Fuel Optimal Two-impulse Transfers To Cyclers in the Earth-Moon System

    NASA Astrophysics Data System (ADS)

    Hosseinisianaki, Saghar

    2011-12-01

    This research explores optimal two-impulse transfers between a low Earth orbit and cycler orbits in the Earth-Moon circular restricted three-body framework, with emphasis on the optimization strategy. Cyclers are periodic orbits that encounter both the Earth and the Moon periodically. A spacecraft on such a trajectory is under the influence of both the Earth's and the Moon's gravitational fields. Cyclers have gained recent interest as baseline orbits for several Earth-Moon mission concepts, notably in relation to human exploration. This thesis shows that a direct optimization starting from the classic Lambert initial guess may not be adequate for these problems, and proposes a three-step optimization solver to improve the domain of convergence toward an optimal solution. The first step consists of finding feasible trajectories with a given transfer time: Lambert's problem provides the initial guess for minimizing the error in arrival position, and the suitability of Lambert's solution as an initial guess is analyzed. Once a feasible trajectory is found, the velocity impulse is a function only of the transfer time and the phases of the departure and arrival points. The second step optimizes the impulse over the transfer time, yielding the minimum-impulse transfer for fixed end points. Finally, the third step maps the optimal solutions as the end points are varied.
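
    A minimal sketch of the shooting structure behind the first two steps follows, under loud simplifications: two-body Earth dynamics stand in for the circular restricted three-body problem, a scaled parking velocity stands in for the Lambert-derived initial guess, and every number is an illustrative placeholder rather than a value from the thesis.

```python
# Step 1: shoot for the arrival point at fixed transfer time by
# minimizing the miss distance. Step 2: minimize the departure impulse
# over a grid of transfer times. Two-body dynamics are a stand-in for
# the Earth-Moon CR3BP; all numbers are placeholders.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

MU = 398600.4418  # Earth GM, km^3/s^2

def two_body(t, s):
    r, v = s[:3], s[3:]
    return np.hstack([v, -MU * r / np.linalg.norm(r) ** 3])

def arrival_error(v0, r0, r_target, tof):
    # Step 1 objective: miss distance at the arrival time.
    sol = solve_ivp(two_body, (0.0, tof), np.hstack([r0, v0]), rtol=1e-8)
    return np.linalg.norm(sol.y[:3, -1] - r_target)

def transfer_dv(tof, r0, v_park, r_target):
    # Step 2 objective: departure impulse of the feasible transfer.
    guess = 1.3 * v_park  # crude stand-in for a Lambert initial guess
    res = minimize(arrival_error, guess, args=(r0, r_target, tof),
                   method="Nelder-Mead")
    return np.linalg.norm(res.x - v_park)

r0 = np.array([6678.0, 0.0, 0.0])        # LEO departure point, km
v_park = np.array([0.0, 7.73, 0.0])      # circular parking velocity, km/s
r_target = np.array([-192000.0, 192000.0, 0.0])
tofs = [3.0 * 86400.0, 4.0 * 86400.0, 5.0 * 86400.0]
best_tof = min(tofs, key=lambda tof: transfer_dv(tof, r0, v_park, r_target))
print("best transfer time (days):", best_tof / 86400.0)
```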

  17. Communication System Architecture for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Braham, Stephen P.; Alena, Richard; Gilbaugh, Bruce; Glass, Brian; Norvig, Peter (Technical Monitor)

    2001-01-01

    Future human missions to Mars will require effective communications supporting exploration activities and scientific field data collection. Constraints on cost, size, weight, and power consumption for all communications equipment make optimization of these systems very important. These information and communication systems connect people and systems into coherent teams performing the difficult and hazardous tasks inherent in planetary exploration. The communication network supporting vehicle telemetry data, mission operations, and scientific collaboration must have excellent reliability and flexibility.

  18. Adaptive and Optimal Control of Stochastic Dynamical Systems

    DTIC Science & Technology

    2015-09-14

    Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. [4] T. E. Duncan and B. Pasik-Duncan, A... S. N. Cohen, T. K. Siu and H. Yang), Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. 4. T. E... games with general noise processes, Models and Methods in Economics and Management Science: Essays in Honor of Charles S. Tapiero (eds. F. El

  19. Ending versus controlling versus employing addiction in the tobacco-caused disease endgame: moral psychological perspectives.

    PubMed

    Kozlowski, Lynn T

    2013-05-01

    Even though reducing or eliminating tobacco-caused diseases is a common goal in tobacco control, many experts hold different views on addiction as a target of intervention. Some consider tobacco-caused addiction a tobacco-caused disease to be eliminated alongside the other diseases. Some consider tobacco-caused addiction a much lower-priority disease to be eliminated, and a subset of this group is prepared to employ addiction to tobacco (nicotine) as a tool to reduce other tobacco-caused disease. These varying attitudes towards ending, controlling, or employing tobacco addiction to reduce damage from tobacco use constitute quite different approaches to tobacco control and cause conflict within the field. Moral psychological analyses argue that more than scientific evidence is involved in supporting this continuum of approaches: divergent values also influence positions in tobacco control. Attention to these values, as well as to the scientific evidence, should be included in policy and practice in tobacco control. It is not that one constellation of values is necessarily superior, but debates need to be informed by and engage discussions of these values as well as the scientific evidence.

  20. Highlights of the 16th annual scientific sessions of the Society for Cardiovascular Magnetic Resonance.

    PubMed

    Carpenter, John-Paul; Patel, Amit R; Fernandes, Juliano Lara

    2013-07-19

    The 16th Annual Scientific Sessions of the Society for Cardiovascular Magnetic Resonance (SCMR) took place in San Francisco, USA at the end of January 2013. With a faculty of experts from across the world, this congress provided a wealth of insight into cutting-edge research and technological development. This review article highlights the most significant advances in the field of cardiovascular magnetic resonance (CMR) presented during this year's meeting.

  1. [Critical mass, explosive participation at the Max-Planck Institute about research of the living conditions of the scientific-technical world in Starnberg].

    PubMed

    Sonntag, Philipp

    2014-01-01

    Reviewers of the Max-Planck-Institut zur Erforschung der Lebensbedingungen der wissenschaftlich-technischen Welt (MPIL) focused on an abundance of vague reports from evaluative commissions, on benchmarking, and on scientific fashions. What the staff had actually researched thus remained rather neglected. One example: the progression and end of the project AKR (Work-Consumption-Assessment) displays the full range of related emotions at the MPIL, as well as the sensitive guidance of Carl Friedrich von Weizsäcker.

  2. Celebrating Ciência & Saúde Coletiva, reminiscing about the trajectory of Physis.

    PubMed

    de Camargo Júnior, Kenneth Rochel

    2015-07-01

    We describe the history and characteristics of the journal Physis by presenting an overview of what was published in the 24 years since its first issue. We present some citation data together with a critical discussion of them. We end with a discussion of recently observed movements in government agencies' policies regarding scientific publication, expressing our concern and advocating a policy that fosters diversity and broadens the scope of vehicles of scientific dissemination.

  3. Opportunities and benefits as determinants of the direction of scientific research.

    PubMed

    Bhattacharya, Jay; Packalen, Mikko

    2011-07-01

    Scientific research and private-sector technological innovation differ in objectives, constraints, and organizational forms. Scientific research may thus not be driven by the direct practical benefit to others in the way that private-sector innovation is. Alternatively, some as yet largely unexplored mechanisms drive the direction of scientific research to respond to the expected public benefit. We test these two competing hypotheses of scientific research. This is important because any coherent specification of what constitutes the socially optimal allocation of research requires that scientists take the public practical benefit of their work into account in setting their agenda. We examine whether the composition of medical research responds to changes in disease prevalence, while accounting for the quality of available research opportunities. We match biomedical publications data with disease prevalence data and develop new methods for estimating the quality of research opportunities from textual information and structural productivity parameters. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Opportunities and Benefits as Determinants of the Direction of Scientific Research*

    PubMed Central

    Bhattacharya, Jay; Packalen, Mikko

    2017-01-01

    Scientific research and private-sector technological innovation differ in objectives, constraints, and organizational forms. Scientific research may thus not be driven by the direct practical benefit to others in the way that private-sector innovation is. Alternatively, some as yet largely unexplored mechanisms drive the direction of scientific research to respond to the expected public benefit. We test these two competing hypotheses of scientific research. This is important because any coherent specification of what constitutes the socially optimal allocation of research requires that scientists take the public practical benefit of their work into account in setting their agenda. We examine whether the composition of medical research responds to changes in disease prevalence, while accounting for the quality of available research opportunities. We match biomedical publications data with disease prevalence data and develop new methods for estimating the quality of research opportunities from textual information and structural productivity parameters. PMID:21683461

  5. Mars Smart Lander Simulations for Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Striepe, S. A.; Way, D. W.; Balaram, J.

    2002-01-01

    Two primary simulations have been developed and are being updated for the Mars Smart Lander Entry, Descent, and Landing (EDL): a high-fidelity engineering end-to-end EDL simulation based on NASA Langley's Program to Optimize Simulated Trajectories (POST), and an end-to-end real-time, hardware-in-the-loop simulation testbed based on NASA JPL's (Jet Propulsion Laboratory) Dynamics Simulator for Entry, Descent and Surface landing (DSENDS). This paper presents the current status of these Mars Smart Lander EDL end-to-end simulations. Various models and capabilities, as well as validation and verification for these simulations, are discussed.

  6. A research study review of effectiveness of treatments for psychiatric conditions common to end-stage cancer patients: needs assessment for future research and an impassioned plea.

    PubMed

    Johnson, Ralph J

    2018-04-03

    Rates of psychiatric conditions common to end-stage cancer patients (delirium, depression, anxiety disorders) remain unchanged. However, patient numbers have increased as the population has aged; indeed, cancer is a chief cause of mortality and morbidity in older populations. The effectiveness of psychiatric interventions, and research to evaluate, inform, and improve those interventions, is critical to these patients' care. This article reports results from a recent review study on the effectiveness of interventions for psychiatric conditions common to end-stage cancer patients; the review assessed the state of research regarding treatment effectiveness. Unlike previous review studies, this one included non-traditional/alternative therapies and spirituality interventions that have undergone scientific inquiry. A five-phase systematic strategy and a theoretically grounded iterative methodology were used to identify studies for inclusion and to craft an integrated, synthesized, comprehensive, and reasonably current end-product. Psychiatric medication therapies are undoubtedly the most powerful treatments. Among them, the most effective (i.e., "best practices benchmarks") are: (1) for delirium, typical antipsychotics, though there is no difference between typical and atypical or other antipsychotics except for their side-effect profiles; (2) for depression, a selective serotonin reuptake inhibitor (SSRI) if patient life expectancy is ≥4-6 weeks, and psychostimulants or ketamine if it is <3 weeks, the latter being generally useful at any point in the cancer disease course; and (3) for anxiety disorders, benzodiazepines (BZDs) are the most used and most effective. There is universal consensus that psychosocial (i.e., talk) therapy and spirituality interventions fortify the therapeutic alliance and psychiatric medication protocols. However, trial studies have had mixed results regarding effectiveness in reducing psychiatric symptoms, even for touted psychotherapies. This study's findings prompted a testable linear conceptual model of co-factors and their importance for providing effective psychiatric care for end-stage cancer patients. The complicated and tricky part is negotiating patients' diagnoses while articulating the internal intricacies within and between each of the model's co-factors. There is a relative absence of scientifically derived information and a need for more large-scale, diverse scientific inquiry. Thus, this article is an impassioned plea for accelerated study and better care for end-stage cancer patients' psychiatric conditions.

  7. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.

    2005-12-01

    A revolution is underway in the role played by cyberinfrastructure and data services in the conduct of research and education. We live in an era of unprecedented data volume from diverse sources, multidisciplinary analysis and synthesis, and an emphasis on active, learner-centered education. For example, modern remote-sensing systems like hyperspectral satellite instruments generate terabytes of data each day. Environmental problems such as global change and the water cycle transcend disciplinary as well as geographic boundaries, and their solution requires integrated earth system science approaches. Contemporary education strategies recommend adopting an Earth system science approach for teaching the geosciences, employing new pedagogical techniques such as enquiry-based learning and hands-on activities. Needless to say, today's education and research enterprise depends heavily on robust, flexible and scalable cyberinfrastructure, especially on the ready availability of quality data and appropriate tools to manipulate and integrate those data. Fortuitously, rapid advances in computing and communication technologies have also revolutionized how data, tools and services are being incorporated into the teaching and scientific enterprise. The exponential growth in the use of the Internet in education and research, largely due to the advent of the World Wide Web, is by now well documented. On the other hand, how some of the other technological and community trends have shaped the use of cyberinfrastructure, especially data services, is less well understood. For example, the computing industry is converging on an approach called Web services that enables a standard and yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML (a widely accepted format for exchanging data and corresponding semantics over the Internet), enables applications, computer systems, and information processes to work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  8. 78 FR 64967 - Center for Scientific Review; Amended Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ....m., Crystal City Marriott, 1999 Jefferson Davis Highway, Arlington, VA 22202 which was published in... and end on November 15, 2013. The time and location remain the same. The meeting is closed to the...

  9. Scientific and Technical Development of the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Burg, Richard

    2003-01-01

    The Next Generation Space Telescope (NGST) is part of the Origins program and is the key mission for discovering the origins of galaxies in the Universe. It is essential that scientific requirements be translated into technical specifications at the beginning of the program and that astronomers participate technically in the design and modeling of the observatory. During the active period of this grant, the PI contributed to the NGST program at GSFC by participating in the development of the Design Reference Mission, the development of the full end-to-end model of the observatory, the design trade-offs based on the modeling, the Science Instrument Module definition and modeling, and the study of proto-mission and test-bed development, and by taking part in meetings, including quarterly reviews, in support of the NGST SWG. This work was documented in a series of NGST Monographs that are available on the NGST web site.

  10. Commentary: Eight Ways to Prevent Cancer: a framework for effective prevention messages for the public

    PubMed Central

    Dart, Hank; Wolin, Kathleen Y.; Colditz, Graham A.

    2013-01-01

    Research over the past 40 years has convincingly shown that lifestyle factors play a huge role in cancer incidence and mortality. The public, though, can often discount the preventability of cancer. That health information on the Internet is a vast and often scientifically suspect commodity makes promoting important and sound cancer prevention messages to the public even more difficult. To help address these issues and improve the public's knowledge of, and attitudes toward, cancer prevention, there need to be concerted efforts to create evidence-based, user-friendly information about behaviors that could greatly reduce overall cancer risk. Toward this end, we condensed the current scientific evidence on the topic into eight key behaviors. While not an end in themselves, "8 Ways to Stay Healthy and Prevent Cancer" forms an evidence-based and targeted framework that supports broader cancer prevention efforts. PMID:22367724

  11. Turning the tide against tuberculosis.

    PubMed

    Padayatchi, Nesri; Naidu, Naressa; Friedland, Gerald; Naidoo, Kasavan; Conradie, Francesca; Naidoo, Kogieleum; O'Donnell, Max Roe

    2017-03-01

    Despite affecting men, women, and children for millennia, tuberculosis (TB) is the most neglected disease. In contrast, the global response to HIV has reached a defining moment. By uniting efforts, promptly integrating major scientific findings for both treatment and prevention, and scaling up services, the once inconceivable end to the HIV epidemic may no longer be an illusion. "The world has made defeating AIDS a top priority. This is a blessing. But TB remains ignored" - Nelson Mandela. While there is no doubt that revolutionary diagnostics and new and repurposed drugs have provided some hope in the fight against TB, it is evident that scientific advances on their own are inadequate to achieve the World Health Organization's ambitious goal to end TB by 2035. In this article, the consequences of a myopic and conventional biomedical approach to TB, which has ultimately permeated to the level of individual patient care, are highlighted. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. A two-step parameter optimization algorithm for improving estimation of optical properties using spatial frequency domain imaging

    NASA Astrophysics Data System (ADS)

    Hu, Dong; Lu, Renfu; Ying, Yibin

    2018-03-01

    This research was aimed at optimizing the inverse algorithm for estimating the optical absorption (μa) and reduced scattering (μs′) coefficients from spatial frequency domain diffuse reflectance. Studies were first conducted to determine the optimal frequency resolution and start and end frequencies in terms of the reciprocal of the mean free path (1/mfp′). The results showed that the optimal frequency resolution increased with μs′ and remained stable when μs′ was larger than 2 mm⁻¹. The optimal end frequency decreased from 0.3/mfp′ to 0.16/mfp′ as μs′ ranged from 0.4 mm⁻¹ to 3 mm⁻¹, while the optimal start frequency remained at 0 mm⁻¹. A two-step parameter estimation method was proposed based on the optimized frequency parameters, which improved estimation accuracies by 37.5% and 9.8% for μa and μs′, respectively, compared with the conventional one-step method. Experimental validations with seven liquid optical phantoms showed that the optimized algorithm resulted in mean absolute errors of 15.4%, 7.6%, 5.0% for μa and 16.4%, 18.0%, 18.3% for μs′ at the wavelengths of 675 nm, 700 nm, and 715 nm, respectively. Hence, implementation of the optimized parameter estimation method should be considered in order to improve the measurement of optical properties of biological materials when using the spatial frequency domain imaging technique.
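
    To make the estimation step concrete, the sketch below fits μa and μs′ to synthetic spatial-frequency reflectance using a standard diffusion-approximation forward model from the SFDI literature, restricting the fit to a start/end frequency window in the spirit of the optimization above. The boundary term A, the noise level, and the 0-0.16 mm⁻¹ window are illustrative assumptions, not values or code from the paper.

```python
# Fit (mu_a, mu_s') to spatial-frequency-domain reflectance over a
# restricted frequency window, using a common diffusion-approximation
# forward model. Synthetic data; A and the window are assumptions.
import numpy as np
from scipy.optimize import curve_fit

A = 0.25  # boundary term; depends on the refractive-index mismatch

def diffuse_reflectance(fx, mua, musp):
    mutr = mua + musp                                # 1/mm
    mueff = np.sqrt(3.0 * mua * mutr)
    mueff_f = np.sqrt(mueff**2 + (2.0 * np.pi * fx) ** 2)
    albedo = musp / mutr
    return 3.0 * A * albedo / ((mueff_f / mutr + 1.0) * (mueff_f / mutr + 3.0 * A))

fx = np.linspace(0.0, 0.3, 31)                       # spatial frequency, 1/mm
truth = diffuse_reflectance(fx, 0.01, 1.0)           # mu_a=0.01, mu_s'=1.0
noise = 1.0 + 0.01 * np.random.default_rng(0).standard_normal(fx.size)
meas = truth * noise

# Keep only [f_start, f_end]; the paper ties the end frequency to 1/mfp'.
window = (fx >= 0.0) & (fx <= 0.16)
(mua_hat, musp_hat), _ = curve_fit(diffuse_reflectance, fx[window],
                                   meas[window], p0=(0.005, 0.5))
print(mua_hat, musp_hat)
```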

  13. WINGS: WFIRST Infrared Nearby Galaxy Survey

    NASA Astrophysics Data System (ADS)

    Williams, Benjamin

    WFIRST's combination of wide field and high resolution will revolutionize the study of nearby galaxies. We propose to produce and analyze simulated WFIRST data of nearby galaxies and their halos to maximize the scientific yield in the limited observing time available, ensuring the legacy value of WFIRST's eventual archive. We will model both halo structure and resolved stellar populations to optimize WFIRST's constraints on both dark matter and galaxy formation models in the local universe. WFIRST can map galaxy structure down to ~35 mag/square arcsecond using individual stars. The resulting maps of stellar halos and accreting dwarf companions will provide stringent tests of galaxy formation and dark matter models on galactic (and even sub-galactic) scales, which is where the most theoretical tension exists with the Lambda-CDM model. With a careful, coordinated plan, WFIRST can be expected to improve current sample sizes by 2 orders of magnitude, down to surface brightness limits comparable to those currently reached only in the Local Group, and that are >4 magnitudes fainter than achievable from the ground due to limitations in star-galaxy separation. WFIRST's maps of galaxy halos will simultaneously produce photometry for billions of stars in the main bodies of galaxies within 10 Mpc. These data will transform studies of star formation histories that track stellar mass growth as a function of time and position within a galaxy. They also will constrain critical stellar evolution models of the near-infrared bright, rapidly evolving stars that can contribute significantly to the integrated light of galaxies in the near-infrared. Thus, with WFIRST we can derive the detailed evolution of individual galaxies, reconstruct the complete history of star formation in the nearby universe, and put crucial constraints on the theoretical models used to interpret near-infrared extragalactic observations. We propose a three-component work plan that will ensure these gains by testing and optimizing WFIRST observing strategies and providing science guidance to trade studies of observatory requirements such as field of view, pixel scale and filter selection. First, we will perform extensive simulations of galaxies' halo substructures and stellar populations that will be used as input for optimizing observing strategies and sample selection. Second, we will develop a pipeline that optimizes stellar photometry, proper motion, and variability measurements with WFIRST. This software will: maximize data quality & scientific yield; provide essential, independent calibrations to the larger WFIRST efforts; and rapidly provide accurate photometry and astrometry to the community. Third, we will derive quantitative performance metrics to fairly evaluate trade-offs between different survey strategies and WFIRST performance capabilities. The end result of this effort will be: (1) an efficient survey strategy that maximizes the scientific yield of what would otherwise be a chaotic archive of observations from small, un-coordinated programs; (2) a suite of analysis tools and a state-of-the-art pipeline that can be deployed after launch to rapidly deliver stellar photometry to the public; (3) a platform to independently verify the calibration and point spread function modeling that are essential to the primary WFIRST goals, but that are best tested from images of stellar populations. 
These activities will be carried out by a Science Investigation Team that has decades of experience in using nearby galaxies to inform fundamental topics in astrophysics. This team is composed of researchers who have led the charge in observational and theoretical studies of resolved stellar populations and stellar halos. With our combined background, we are poised to take full advantage of the large field of view and high spatial resolution WFIRST will offer.

  14. Research on logistics scheduling based on PSO

    NASA Astrophysics Data System (ADS)

    Bao, Huifang; Zhou, Linli; Liu, Lei

    2017-08-01

    With the rapid development of network-based e-commerce, the importance of logistics distribution support for e-commerce is becoming more and more obvious. Optimizing vehicle distribution routing can improve economic benefits and put logistics on a scientific footing [1]. The study of the logistics distribution vehicle routing optimization problem is therefore not only of great theoretical significance, but also of considerable practical value. Particle swarm optimization is an evolutionary algorithm that starts from random solutions and approaches the optimal solution through iteration, evaluating the quality of solutions by a fitness function. In order to obtain a more effective logistics scheduling scheme, this paper proposes a logistics model based on the particle swarm optimization algorithm.
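
    For concreteness, here is a minimal, canonical particle swarm loop of the kind the abstract describes: random initial solutions, velocity and position updates drawn toward personal and global bests, and fitness-based evaluation at every iteration. The sphere benchmark is a placeholder objective; the paper's routing application would instead encode route costs and constraints in the fitness function.

```python
# Canonical (continuous) particle swarm optimization: each particle is
# pulled toward its personal best and the global best, with fitness
# deciding which bests to keep.
import numpy as np

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    rng = np.random.default_rng(42)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # random initial solutions
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

sphere = lambda p: float(np.sum(p * p))  # placeholder for a routing cost
best, best_val = pso(sphere, dim=5)
print(best_val)
```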

  15. Collaboration pathway(s) using new tools for optimizing `operational' climate monitoring from space

    NASA Astrophysics Data System (ADS)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2015-09-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a long-term solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the collective needs of policy makers, scientific communities, and global academic users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent rule-based expert system (RBES) optimization modeling of the intended NPOESS architecture becomes a surrogate for global operational climate monitoring architecture(s). These rule-based system tools provide valuable insight for global climate architectures through comparison and evaluation of alternatives and the sheer range of trade space explored. Optimization of climate monitoring architecture(s) for a partial list of ECVs (essential climate variables) is explored and described in detail, with dialogue on appropriate rule-based valuations. These optimization tool(s) suggest global collaboration advantages and elicit responses from the audience and climate science community. This paper focuses on recent research exploring joint requirement implications of the high-profile NPOESS architecture and extends the research and tools to optimization for a climate-centric case study. It reflects work from the SPIE RS Conferences of 2013 and 2014, abridged for simplification [30, 32]. First, the heavily scrutinized NPOESS architecture inspired the recent research question: was complexity (as a cost/risk factor) overlooked when considering the benefits of aggregating different missions onto a single platform? Now, years later, there has been a complete reversal: should agencies consider disaggregation as the answer? We discuss what some academic research suggests. Second, we take the GCOS requirements for earth climate observations via ECVs, many collected from space-based sensors, and accept their definitions of global coverage, intended to ensure that the needs of major global and international organizations (UNFCCC and IPCC) are met, as a core objective. How do new optimization tools like rule-based engines (RBES) offer alternative methods of evaluating collaborative architectures and constellations? What would the trade space of optimized operational climate monitoring architectures for the ECVs look like? Third, using the RBES toolkit (2014), we demonstrate a climate-centric rule-based decision engine that optimizes architectural trades of earth observation satellite systems, allowing comparison(s) to existing architectures and yielding insights for global collaborative architectures. How difficult is it to pull together an optimized climate case study, utilizing, for example, 12 climate-based instruments on multiple existing platforms and a nominal handful of orbits, for the best cost and performance against the collection requirements of a representative set of ECVs? How much effort and resources should an organization expect to invest to realize these analysis and utility benefits?

  16. Sustainable Development: A Strategy for Regaining Control of Northern Mali

    DTIC Science & Technology

    2014-06-01

    informal attempts to conduct evasive maneuvers to achieve desired end results. The Project for National Security Reform argued that at times "... end runs... recognizing the internal borders that France established in the early twentieth century. Still, Model II optimally assigns projects based on... Project Design 4. In the end, Model I allocated the projects while addressing the following supplemental research questions posed in chapters I and

  17. Technology for increased human productivity and safety on orbit

    NASA Technical Reports Server (NTRS)

    Ambrus, Judith; Gartrell, Charles F.

    1991-01-01

    Technologies are addressed that can facilitate the efficient performance of station operations on the Space Station Freedom (SSF) and thereby optimize the utilization of SSF for scientific research. The dedication of SSF capabilities to scientific study and to the payload-user community is a key goal of the program. Robotics applications are discussed in terms of automating the processing of experiment materials on-orbit by transferring ampules to a furnace system or by handling plant-tissue cultures. Noncontact temperature measurement and medical support technology are considered important technologies for maximizing time for scientific purposes. Detailed examinations are conducted of other technologies including advanced data systems and furnace designs. The addition of the listed technologies can provide an environment in which scientific research is more efficient and accurate.

  18. Interactive Implementation of the Optimal Systems Control Design Program (OPTSYSX) on the IBM 3033.

    DTIC Science & Technology

    1984-03-01

    Interactive Implementation of the Optimal Systems Control Design Program (OPTSYSX) on the IBM 3033 (U), Naval Postgraduate School, Monterey... Keywords: Optimal Systems Control; Control Systems. This thesis discusses the modification of an existing Optimal Systems Control FORTRAN program (OPTSYS) originally obtained from

  19. Structure-based design of novel chemical modification of the 3'-overhang for optimization of short interfering RNA performance.

    PubMed

    Xu, Lexing; Wang, Xin; He, Hongwei; Zhou, Jinming; Li, Xiaoyu; Ma, Hongtao; Li, Zelin; Zeng, Yi; Shao, Rongguang; Cen, Shan; Wang, Yucheng

    2015-02-10

    Short interfering RNAs (siRNAs) are broadly used to manipulate gene expression in mammalian cells. Although chemical modification is useful for increasing the potency of siRNAs in vivo, rational optimization of siRNA performance through chemical modification is still a challenge. In this work, we designed and synthesized a set of siRNAs containing modified two-nucleotide 3'-overhangs with the aim of strengthening the interaction between the 3'-end of the siRNA strand and the PAZ domain of Ago2. Their efficiency of binding to the PAZ domain was calculated using a computer modeling program, followed by measurement of RNA-Ago2 interaction in a surface plasmon resonance biochemical assay. The results suggest that increasing the level of binding of the 3'-end of the guiding strand with the PAZ domain, and/or reducing the level of binding of the sense strand through modifying the two-nucleotide 3'-overhangs, affects preferential strand selection and improves siRNA activity, while we cannot exclude the possibility that the modifications at the 3'-end of the sense strand may also affect the recognition of the 5'-end of the guiding strand by the MID domain. Taken together, our work presents a strategy for optimizing siRNA performance through asymmetric chemical modification of 3'-overhangs and also helps to develop the computer modeling method for rational siRNA design.

  20. Reticence, Accuracy and Efficacy

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2015-12-01

    James Hansen has cautioned the scientific community against "reticence," by which he means a reluctance to speak in public about the threat of climate change. This may contribute to social inaction, with the result that society fails to respond appropriately to threats that are well understood scientifically. Against this, others have warned against the dangers of "crying wolf," suggesting that reticence protects scientific credibility. We argue that both these positions miss an important point: that reticence is not only a matter of style but also of substance. In previous work, Brysse et al. (2013) showed that scientific projections of key indicators of climate change have been skewed towards the low end of actual events, suggesting a bias in scientific work. More recently, we have shown that scientific efforts to be responsive to contrarian challenges have led scientists to adopt the terminology of a "pause" or "hiatus" in climate warming, despite the lack of evidence to support such a conclusion (Lewandowsky et al., 2015a, 2015b). In the former case, scientific conservatism has led to under-estimation of climate-related changes. In the latter case, the use of misleading terminology has perpetuated scientific misunderstanding and hindered effective communication. Scientific communication should embody two equally important goals: 1) accuracy in communicating scientific information and 2) efficacy in expressing what that information means. Scientists should strive to be neither conservative nor adventurous but to be accurate, and to communicate that accurate information effectively.

  1. Scientific, legal, and ethical challenges of end-of-life organ procurement in emergency medicine.

    PubMed

    Rady, Mohamed Y; Verheijde, Joseph L; McGregor, Joan L

    2010-09-01

    We review (1) scientific evidence questioning the validity of declaring death and procuring organs in heart-beating (i.e., neurological standard of death) and non-heart-beating (i.e., circulatory-respiratory standard of death) donation; (2) consequences of collaborative programs realigning hospital policies to maximize access of procurement coordinators to critically and terminally ill patients as potential donors on arrival in emergency departments; and (3) ethical and legal ramifications of current practices of organ procurement on patients and their families. Relevant publications in peer-reviewed journals and government websites. Scientific evidence undermines the biological criteria of death that underpin the definition of death in heart-beating (i.e., neurological standard) and non-heart-beating (i.e., circulatory-respiratory standard) donation. Philosophical reinterpretation of the neurological and circulatory-respiratory standards in the death statute, to avoid the appearance of organ procurement as an active life-ending intervention, lacks public and medical consensus. Collaborative programs bundle procurement coordinators together with hospital staff for a team-huddle and implement a quality improvement tool for a Rapid Assessment of Hospital Procurement Barriers in Donation. Procurement coordinators have access to critically ill patients during the course of medical treatment with no donation consent and with family or surrogates unaware of their roles. How these programs affect the medical care of these patients has not been studied. Policies enforcing end-of-life organ procurement can have unintended consequences: (1) erosion of care in the patient's best interests, (2) lack of transparency, and (3) ethical and legal ramifications of flawed standards of declaring death. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Subduction indices in Calabro-Sicilian arc : Training for Experimental Skills Testing and collaborative work for students in scientific terminal class in high school.

    NASA Astrophysics Data System (ADS)

    Gendron, Faustine; Bollori, Lucas; Villeneuve, Felix

    2017-04-01

    In France, at the end of their final year of high school, students in the scientific terminal class sit written exams in all of the subjects they study; in "Life and Earth Sciences" they also take an Experimental Skills Test that rates their scientific approach. This one-hour evaluation consists of four steps: in the first, students must show that they can propose a scientific strategy connected to a scientific problem; in the second, they must experiment; in the third, they must present their results; and in the last, they must deduce and conclude. The final test takes place at the end of May, but teachers must train their students throughout the school year, and it is impossible to have them practice on the real subjects; it is therefore necessary to produce new subjects every year. Following a fall school in Sicily last October, my colleagues and I decided to create a new Experimental Skills Test that uses new examples and illustrates subduction in the Mediterranean Sea through the Aeolian Islands. We want our pupils to understand what causes the Aeolian volcanism by using the information, equipment, software, etc. available in the classrooms of our high school. Since we found several ways for our students to prove that the Aeolian Islands are linked to a subduction zone, we decided, following our research, to divide the new Experimental Skills Test into three different tests, so that students train on most of the equipment and then share their results to produce a collaborative final work.

  3. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2013-11-14

    integral under the natural filtration generated by the Brownian motions. This compact expression potentially enables us to design sub-optimal penalties... bounds on Bermudan option prices under jump diffusion processes. Quantitative Finance, 2013. Under review, available at http://arxiv.org/abs/1305.4321... Finance, 19:53-71, 2009. [3] D.P. Bertsekas. Dynamic Programming and Optimal Control. Athena Scientific, 4th edition, 2012. [4] D.B. Brown and J.E

  4. Analysis of Optimal Jitter Buffer Size for VoIP QoS under WiMAX Power-Saving Mode

    NASA Astrophysics Data System (ADS)

    Kim, Hyungsuk; Kim, Taehyoun

    VoIP service is expected to be one of the key applications of Mobile WiMAX, but the speech quality of VoIP often deteriorates because of fluctuating transmission delay, known as jitter. This is commonly ameliorated by a de-jitter buffer, and we aim to find the optimal size of the de-jitter buffer that achieves speech quality comparable to that of the PSTN. We developed a new model of packet drops at the de-jitter buffer and of the end-to-end packet delay that takes into account the additional delay introduced by the WiMAX power-saving mode. Using our model, we analyzed the optimal size of the de-jitter buffer for various network parameters, and showed that the results obtained by analysis accord with simulation results.
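
    The buffer-sizing trade-off admits a toy Monte Carlo sketch (distinct from the paper's analytical model): draw per-packet network delays, count a packet as dropped when it misses the playout deadline implied by the buffer, and pick the smallest buffer meeting a loss target. The gamma delay distribution, 40 ms base delay, and 1% loss target are invented placeholders; the paper's model additionally accounts for the delay added by the WiMAX power-saving mode.

```python
# Smallest de-jitter buffer meeting a packet-loss target, by simulation.
# Delay distribution and targets are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
base_ms, loss_target = 40.0, 0.01
delays = base_ms + rng.gamma(shape=2.0, scale=10.0, size=100_000)  # ms

for buffer_ms in range(10, 201, 10):
    deadline = base_ms + buffer_ms          # playout deadline per packet
    loss = float(np.mean(delays > deadline))
    if loss <= loss_target:
        print(f"smallest buffer meeting {loss_target:.0%} loss: {buffer_ms} ms")
        break
```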

  5. Integrated approach for automatic target recognition using a network of collaborative sensors.

    PubMed

    Mahalanobis, Abhijit; Van Nevel, Alan

    2006-10-01

    We introduce what is believed to be a novel concept by which several sensors with automatic target recognition (ATR) capability collaborate to recognize objects. Such an approach would be suitable for netted systems in which the sensors and platforms can coordinate to optimize end-to-end performance. We use correlation filtering techniques to facilitate the development of the concept, although other ATR algorithms may be easily substituted. Essentially, a self-configuring geometry of netted platforms is proposed that positions the sensors optimally with respect to each other, and takes into account the interactions among the sensor, the recognition algorithms, and the classes of the objects to be recognized. We show how such a paradigm optimizes overall performance, and illustrate the collaborative ATR scheme for recognizing targets in synthetic aperture radar imagery by using viewing position as a sensor parameter.
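
    The correlation-filtering building block is simple to sketch: cross-correlate a template against a scene via the FFT and take the correlation peak as the detection score; the collaborative scheme layers its sensor-placement and fusion logic on top of scores of this kind. The random clutter below is a stand-in for SAR imagery, not data from the paper.

```python
# FFT-based cross-correlation of a template with a cluttered scene;
# the peak location is the detection. Scene and template are synthetic.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(7)
scene = rng.standard_normal((128, 128))
template = rng.standard_normal((16, 16))
scene[40:56, 60:76] += 3.0 * template          # embed the target

# Correlation equals convolution with the doubly flipped template.
corr = fftconvolve(scene, template[::-1, ::-1], mode="same")
peak = np.unravel_index(corr.argmax(), corr.shape)
print("detection peak near:", peak)
```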

  6. Integrated starting and running amalgam assembly for an electrodeless fluorescent lamp

    DOEpatents

    Borowiec, Joseph Christopher; Cocoma, John Paul; Roberts, Victor David

    1998-01-01

    An integrated starting and running amalgam assembly for an electrodeless SEF fluorescent lamp includes a wire mesh amalgam support constructed to jointly optimize positions of a starting amalgam and a running amalgam in the lamp, thereby optimizing mercury vapor pressure in the lamp during both starting and steady-state operation in order to rapidly achieve and maintain high light output. The wire mesh amalgam support is constructed to support the starting amalgam toward one end thereof and the running amalgam toward the other end thereof, and the wire mesh is rolled for friction-fitting within the exhaust tube of the lamp. The positions of the starting and running amalgams on the wire mesh are jointly optimized such that high light output is achieved quickly and maintained, while avoiding any significant reduction in light output between starting and running operation.

  7. Optimal cube-connected cube multiprocessors

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Wu, Jie

    1993-01-01

    Many CFD (computational fluid dynamics) and other scientific applications can be partitioned into subproblems. In general, however, the partitioned subproblems are very large; they themselves demand high performance computing power, and the solutions of the subproblems have to be combined at each time step. The cube-connected cube (CCCube) architecture is studied. The CCCube architecture is an extended hypercube structure with each node represented as a cube. It requires fewer physical links between nodes than the hypercube, and provides the same communication support as the hypercube does on many applications. The reduced physical links can be used to enhance the bandwidth of the remaining links and, therefore, the overall performance. The concept of, and the method to obtain, optimal CCCubes, which are the CCCubes with a minimum number of links under a given total number of nodes, are proposed. The superiority of optimal CCCubes over standard hypercubes was also shown in terms of link usage in the embedding of a binomial tree. A useful computation structure based on a semi-binomial tree for divide-and-conquer parallel algorithms was identified. It was shown that this structure can be implemented in optimal CCCubes without performance degradation compared with regular hypercubes. The results presented should provide a useful approach to the design of scientific parallel computers.

  8. NCAR Earth Observing Laboratory - An End-to-End Observational Science Enterprise

    NASA Astrophysics Data System (ADS)

    Rockwell, A.; Baeuerle, B.; Grubišić, V.; Hock, T. F.; Lee, W. C.; Ranson, J.; Stith, J. L.; Stossmeister, G.

    2017-12-01

    Researchers who want to understand and describe the Earth System require high-quality observations of the atmosphere, ocean, and biosphere. Making these observations not only requires capable research platforms and state-of-the-art instrumentation but also benefits from comprehensive in-field project management and data services. NCAR's Earth Observing Laboratory (EOL) is an end-to-end observational science enterprise that provides leadership in observational research to scientists from universities, U.S. government agencies, and NCAR. Deployment: EOL manages the majority of the NSF Lower Atmosphere Observing Facilities, which includes research aircraft, radars, lidars, profilers, and surface and sounding systems. This suite is designed to address a wide range of Earth system science - from microscale to climate process studies and from the planet's surface into the Upper Troposphere/Lower Stratosphere. EOL offers scientific, technical, operational, and logistics support to small and large field campaigns across the globe. Development: By working closely with the scientific community, EOL's engineering and scientific staff actively develop the next generation of observing facilities, staying abreast of emerging trends, technologies, and applications in order to improve our measurement capabilities. Through our Design and Fabrication Services, we also offer high-level engineering and technical expertise, mechanical design, and fabrication to the atmospheric research community. Data Services: EOL's platforms and instruments collect unique datasets that must be validated, archived, and made available to the research community. EOL's Data Management and Services deliver high-quality datasets and metadata in ways that are transparent, secure, and easily accessible. We are committed to the highest standard of data stewardship from collection to validation to archival. Discovery: EOL promotes curiosity about Earth science, and fosters advanced understanding of the processes involved in observational research. Through EOL's Education and Outreach Program, we strive to inspire and develop the next generation of observational scientists and engineers by offering a range of educational, experiential, and outreach opportunities, including engineering internships.

  9. Crash in Australian outback ends NASA ballooning season

    NASA Astrophysics Data System (ADS)

    Harris, Margaret

    2010-06-01

    NASA has temporarily suspended all its scientific balloon launches after the balloon-borne Nuclear Compton Telescope (NCT) crashed during take-off, scattering a trail of debris across the remote launch site and overturning a nearby parked car.

  10. 75 FR 63492 - Center for Scientific Review; Notice of Closed Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-15

    ....gov . Name of Committee: AIDS and Related Research Integrated Review Group; NeuroAIDS and Other End...: AIDS International Training and Research Program. Date: December 3-4, 2010. Time: 8 a.m. to 5 p.m...

  11. Introductory Biology Textbooks Under-Represent Scientific Process

    PubMed Central

    Duncan, Dara B.; Lubman, Alexandra; Hoskins, Sally G.

    2011-01-01

    Attrition of undergraduates from Biology majors is a long-standing problem. Introductory courses that fail to engage students or spark their curiosity by emphasizing the open-ended and creative nature of biological investigation and discovery could contribute to student detachment from the field. Our hypothesis was that introductory biology books devote relatively few figures to illustration of the design and interpretation of experiments or field studies, thereby de-emphasizing the scientific process. To investigate this possibility, we examined figures in six Introductory Biology textbooks published in 2008. On average, multistep scientific investigations were presented in fewer than 5% of the hundreds of figures in each book. Devoting such a small percentage of figures to the processes by which discoveries are made discourages an emphasis on scientific thinking. We suggest that by increasing significantly the illustration of scientific investigations, textbooks could support undergraduates’ early interest in biology, stimulate the development of design and analytical skills, and inspire some students to participate in investigations of their own. PMID:23653758

  12. SCOSTEP: Understanding the Climate and Weather of the Sun-Earth System

    NASA Technical Reports Server (NTRS)

    Gopalswamy, Natchimuthuk

    2011-01-01

    The international solar-terrestrial physics community recognized the importance of space weather more than a decade ago, which resulted in a number of international collaborative activities such as the Climate and Weather of the Sun-Earth System (CAWSES) by the Scientific Committee on Solar-Terrestrial Physics (SCOSTEP). The CAWSES program is the current major scientific program of SCOSTEP and will continue until the end of the year 2013. The CAWSES program has brought scientists from all over the world together to tackle the scientific issues behind the connected Sun-Earth system and to explore ways of helping human society. In addition to the vast array of space instruments, ground-based instruments have been deployed, which not only filled voids in data coverage but also inducted young scientists from developing countries into the scientific community. This paper presents a summary of CAWSES and other SCOSTEP activities that promote space weather science via complementary approaches in international scientific collaborations, capacity building, and public outreach.

  13. Scientific Habits of Mind in Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Steinkuehler, Constance; Duncan, Sean

    2008-12-01

    In today's increasingly "flat" world of globalization (Friedman 2005), the need for a scientifically literate citizenry has grown more urgent. Yet, by some measures, we have done a poor job at fostering scientific habits of mind in schools. Recent research on informal games-based learning indicates that such technologies and the communities they evoke may be one viable alternative—not as a substitute for teachers and classrooms, but as an alternative to textbooks and science labs. This paper presents empirical evidence about the potential of games for fostering scientific habits of mind. In particular, we examine the scientific habits of mind and dispositions that characterize online discussion forums of the massively multiplayer online game World of Warcraft. Eighty-six percent of the forum discussions were posts engaged in "social knowledge construction" rather than social banter. Over half of the posts evidenced systems based reasoning, one in ten evidenced model-based reasoning, and 65% displayed an evaluative epistemology in which knowledge is treated as an open-ended process of evaluation and argument.

  14. The scientific, the literary and the popular: Commerce and the reimagining of the scientific journal in Britain, 1813–1825

    PubMed Central

    Topham, Jonathan R.

    2016-01-01

    As scientists question the recent dominance of the scientific journal, the varied richness of its past offers useful materials for reflection. This paper examines four innovative journals founded and run by leading publishers and men of science in the 1810s and 1820s, which contributed to a significant reimagining of the form. Relying on a new distinction between the ‘literary’ and the ‘scientific’ to define their market, those who produced the journals intended to maximize their readership and profits by making them to some extent ‘popular’. While these attempts ended in commercial failure, not least because of the rapidly diversifying periodical market in which they operated, their history makes clear the important role that commerce has played both in defining the purposes and audiences of scientific journals and in the conceptualization of the scientific project. It also informs the ongoing debate concerning how the multiple audiences for science can be addressed in ways that are commercially and practically viable.

  15. Optimizing read-out of the NECTAr front-end electronics

    NASA Astrophysics Data System (ADS)

    Vorobiov, S.; Feinstein, F.; Bolmont, J.; Corona, P.; Delagnes, E.; Falvard, A.; Gascón, D.; Glicenstein, J.-F.; Naumann, C. L.; Nayman, P.; Ribo, M.; Sanuy, A.; Tavernet, J.-P.; Toussenel, F.; Vincent, P.

    2012-12-01

    We describe the optimization of the read-out specifications of the NECTAr front-end electronics for the Cherenkov Telescope Array (CTA). The NECTAr project aims at building and testing a demonstrator module of a new front-end electronics design, which takes advantage of the know-how acquired while building the cameras of the CAT, H.E.S.S.-I and H.E.S.S.-II experiments. The goal of the optimization work is to define the specifications of the digitizing electronics of a CTA camera, in particular the integration time window, sampling rate, and analog bandwidth, using physics simulations. For this work we used real photomultiplier pulses, sampled at 100 ps with a 600 MHz bandwidth oscilloscope. The individual pulses are drawn randomly at the times at which the photo-electrons, originating from atmospheric showers, arrive at the focal planes of imaging atmospheric Cherenkov telescopes. The timing information is extracted from the existing CTA simulations on the GRID and organized in a local database, together with all the relevant physical parameters (energy, primary particle type, zenith angle, distance from the shower axis, pixel offset from the optical axis, night-sky background level, etc.) and detector configurations (telescope types, camera/mirror configurations, etc.). By investigating this parameter space, an optimal pixel charge integration time window, which minimizes the relative error in the measured charge, has been determined. This will improve sensitivity and lower the energy threshold of the CTA telescopes. We present the results of our optimizations and first measurements obtained using the NECTAr demonstrator module.
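
    A toy version of the window-length study looks like this: simulated photoelectron pulses plus night-sky-background pulses are summed on a nanosecond grid, and the spread of the integrated charge is compared across integration windows. The pulse shape, rates, and amplitudes are invented placeholders rather than NECTAr measurements; the real optimization drew pulses from oscilloscope recordings and CTA shower simulations as described above.

```python
# Relative spread of the integrated charge versus integration window,
# for toy signal-plus-NSB traces on a 1 ns grid. All parameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 100.0, 1.0)  # sampling grid, ns

def pulse(t0, amp=1.0, tau=3.0):
    # One-sided pulse peaking at t0 + tau.
    dt = t - t0
    return np.where(dt > 0.0, amp * (dt / tau) * np.exp(1.0 - dt / tau), 0.0)

def trace(n_signal_pe=20, nsb_rate_ghz=0.1):
    # Signal photoelectrons cluster near t = 50 ns; NSB arrives uniformly.
    y = np.zeros_like(t)
    for _ in range(n_signal_pe):
        y += pulse(50.0 + rng.normal(0.0, 1.0))
    for _ in range(rng.poisson(nsb_rate_ghz * t[-1])):
        y += pulse(rng.uniform(0.0, t[-1]))
    return y

for width in (6, 10, 16, 24):  # integration window length, ns
    charges = [trace()[50:50 + width].sum() for _ in range(200)]
    print(f"window {width:2d} ns -> relative spread "
          f"{np.std(charges) / np.mean(charges):.3f}")
```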

  16. Wide-Field Infrared Survey Telescope (WFIRST) Interim Report

    NASA Technical Reports Server (NTRS)

    Green, J.; Schechter, P.; Baltay, C.; Bean, R.; Bennett, D.; Brown, R.; Conselice, C.; Donahue, M.; Gaudi, S.; Lauer, T.

    2011-01-01

    The New Worlds, New Horizons (NWNH) in Astronomy and Astrophysics 2010 Decadal Survey prioritized the community consensus for ground-based and space-based observatories. Recognizing that many of the community's key questions could be answered with a wide-field infrared survey telescope in space, and that the decade would be one of budget austerity, WFIRST was top ranked in the large space mission category. In addition to the powerful new science that could be accomplished with a wide-field infrared telescope, the WFIRST mission was determined to be both technologically ready and only a small fraction of the cost of previous flagship missions, such as HST or JWST. In response to the top ranking by the community, NASA formed the WFIRST Science Definition Team (SDT) and Project Office. The SDT was charged with fleshing out the NWNH scientific requirements to a greater level of detail. NWNH evaluated the risk and cost of the JDEM-Omega mission design, as submitted by NASA, and stated that it should serve as the basis for the WFIRST mission. The SDT and Project Office were charged with developing a mission optimized for achieving the science goals laid out by the NWNH report. The SDT and Project Office opted to use the JDEM-Omega hardware configuration as an initial starting point for the hardware implementation. JDEM-Omega and WFIRST both have an infrared imager with a filter wheel, as well as counter-dispersed moderate resolution spectrometers. The primary advantage of space observations is being above the Earth's atmosphere, which absorbs, scatters, warps and emits light. Observing from above the atmosphere enables WFIRST to obtain precision infrared measurements of the shapes of galaxies for weak lensing, infrared light-curves of supernovae and exoplanet microlensing events with low systematic errors, and infrared measurements of the H hydrogen line to be cleanly detected in the 1

  17. Scientific production of medical sciences universities in north of Iran.

    PubMed

    Siamian, Hasan; Firooz, Mousa Yamin; Vahedi, Mohammad; Aligolbandi, Kobra

    2013-01-01

    The study of scientific production as indexed by the world's major citation databases is one of the important indicators used to evaluate and rank universities. This study investigated the scientific production of the medical sciences universities of northern Iran in Scopus from 2005 through 2010. The survey used scientometric techniques; the samples under study were the scientific products of four northern Iran medical universities. In terms of the quantity of scientific products, Mazandaran University of Medical Sciences ranks first and Babol University of Medical Sciences last; in terms of the quality of scientific products, considering the H-index and the number of cited papers, Mazandaran University of Medical Sciences is also ahead of the other universities under study. By subject, the highest scientific production belonged to the faculty of Pharmacy affiliated with Mazandaran University of Medical Sciences, while the three other universities produced most in genetics and biochemistry. Results showed that Mazandaran University of Medical Sciences ranked higher than the other universities under study for number of articles, cited articles, number of hardworking authors and H-index in the Scopus database from 2005 through 2010.

  18. The Swedish Research Council's definition of 'scientific misconduct': a critique.

    PubMed

    Salwén, Håkan

    2015-02-01

    There is no consensus over the proper definition of 'scientific misconduct.' There are differences in opinion not only between countries but also between research institutions in the same country. This is unfortunate. Without a widely accepted definition it is difficult for scientists to adjust to new research milieux. This might hamper scientific innovation and make cooperation difficult. Furthermore, due to the potentially damaging consequences it is important to combat misconduct. But how frequent is it and what measures are efficient? Without an appropriate definition there are no interesting answers to these questions. In order to achieve a high degree of consensus and to foster research integrity, the international dialogue over the proper definition of 'scientific misconduct' must be ongoing. Yet, the scientific community should not end up with the definition suggested by the Swedish Research Council. The definition the council advocates does not satisfy the ordinary language condition. That is, the definition is not consistent with how 'scientific misconduct' is used by scientists. I will show that this is due to the fact that it refers to false results. I generalise this and argue that no adequate definition of 'scientific misconduct' makes such a reference.

  19. Design for life-cycle profit with simultaneous consideration of initial manufacturing and end-of-life remanufacturing

    NASA Astrophysics Data System (ADS)

    Kwak, Minjung; Kim, Harrison

    2015-01-01

    Remanufacturing is emerging as a promising solution for achieving green, profitable businesses. This article considers a manufacturer that produces new products and also remanufactured versions of the new products that become available at the end of their life cycle. For such a manufacturer, design decisions at the initial design stage determine both the current profit from manufacturing and the future profit from remanufacturing. To maximize the total profit, design decisions must carefully consider both ends of the product life cycle, i.e. the manufacturing and end-of-life stages. This article proposes a decision-support model for life-cycle design using mixed-integer nonlinear programming. With an aim to maximize the total life-cycle profit, the proposed model searches for an (at least locally) optimal product design (i.e. design specifications and the selling price) for the new and remanufactured products. It optimizes both the initial design and design upgrades at the end-of-life stage and also provides corresponding production strategies, including production quantities and take-back rate. The model is extended to a multi-objective model that maximizes both economic profit and environmental-impact saving. To illustrate, the developed model is demonstrated with an example of a desktop computer.
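
    The decision structure (discrete design choices plus continuous prices evaluated through a profit function) can be conveyed with a toy enumeration. The sketch below is a deliberately simplified stand-in for the paper's mixed-integer nonlinear program, with invented demand curves, costs and take-back rate.

        import numpy as np

        # Two hypothetical design options; the second costs more to make new
        # but is cheaper to remanufacture at end of life.
        designs = [
            {"cost_new": 400.0, "cost_reman": 150.0, "quality": 1.00},
            {"cost_new": 450.0, "cost_reman": 120.0, "quality": 1.10},
        ]

        def life_cycle_profit(design, p_new, p_reman, take_back=0.4):
            # Illustrative linear demand for new units; remanufactured demand
            # scales with the take-back rate and the price discount.
            q_new = max(0.0, 1000.0 * design["quality"] - 1.2 * p_new)
            q_reman = take_back * q_new * max(0.0, 1.0 - p_reman / p_new)
            return ((p_new - design["cost_new"]) * q_new
                    + (p_reman - design["cost_reman"]) * q_reman)

        # Enumerate the discrete design and grid-search the two prices.
        best = max((life_cycle_profit(d, pn, pr), i, pn, pr)
                   for i, d in enumerate(designs)
                   for pn in np.arange(500, 901, 10)
                   for pr in np.arange(200, 501, 10))
        print("profit %.0f with design %d at prices (%.0f, %.0f)" % best)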

  20. Saturated fat -a never ending story?

    PubMed

    Svendsen, Karianne; Arnesen, Erik; Retterstøl, Kjetil

    2017-01-01

    Science has no clear message regarding the health effects of saturated fats, it seems. Different RCTs, prospective cohort studies and meta-analyses have led to contrasting conclusions. The aim of the present commentary is to discuss some possible reasons for an apparently never-ending fat controversy. These reasons are of a purely scientific nature, which is important to recognize, but unfortunately hard to overcome. First is the placebo problem. In pharmaceutical science, evidence-based medicine is often synonymous with data on verified medical events from long-lasting double-blind randomized placebo-controlled trials. In nutritional science, the lack of double-blind design and the lack of placebo food generate less conclusive data than those achieved in pharmaceutical science. Some scientists may apply to foods the same type of scientific criteria used to evaluate the effects of drugs. This leaves an impression of insufficient data, since in this respect the fundamental criteria for evidence-based medicine are not present. The next scientific problem is the energy balance equation. In contrast to pharmaceuticals, nutrients contain energy. An increased intake of one nutrient will lead to a decreased intake of another. The effect of a change in only one nutrient is then difficult to isolate. Lastly, in nutritional science, generalizability is difficult compared to pharmaceutical science. Food culture interferes with lifestyle, and food habits change over time. In conclusion, all available knowledge, from molecular experiments to population studies, must be taken into account to convert scientific data into dietary recommendations.

  1. Ethics of Mandatory Research Biopsy for Correlative End Points Within Clinical Trials in Oncology

    PubMed Central

    Peppercorn, Jeffrey; Shapira, Iuliana; Collyar, Deborah; Deshields, Teresa; Lin, Nancy; Krop, Ian; Grunwald, Hans; Friedman, Paula; Partridge, Ann H.; Schilsky, Richard L.; Bertagnolli, Monica M.

    2010-01-01

    Clinical investigators in oncology are increasingly interested in using molecular analysis of cancer tissue to understand the biologic bases of response or resistance to novel interventions and to develop prognostic and predictive biomarkers that will guide clinical decision making. Some scientific questions of this nature can only be addressed, or may best be addressed, through the conduct of a clinical trial in which research biopsies are obtained from all participants. However, trial designs with mandatory research biopsies have raised ethical concerns related to the risk of harm to participants, the adequacy of voluntary informed consent, and the potential for misunderstanding among research participants when access to an experimental intervention is linked to the requirement to undergo a research biopsy. In consideration of the ethical and scientific issues at stake in this debate, the Cancer and Leukemia Group B Ethics Committee proposes guidelines for clinical trials involving mandatory research biopsies. Any cancer clinical trial that requires research biopsies of participants must be well designed to address the scientific question, obtain the biopsy in a way that minimizes risk, and ensure that research participants are fully informed of the risks, rationale, and requirements of the study, as well as of treatment alternatives. Further guidelines and discussions of this issue are specified in this position paper. We feel that if these principles are respected, an informed adult with cancer can both understand and voluntarily consent to participation in a clinical trial involving mandatory research biopsy for scientific end points. PMID:20406927

  2. Getting the Public Addicted to Scientific Data Through Social Media

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.; Flasher, J. C.; Lodoysamba, S.

    2013-12-01

    Effectively communicating about a scientific topic to the public can be challenging for scientists for a variety of reasons that often boil down to an inadequate bridge between general knowledge and the specialized scientific knowledge needed to understand the context of what a scientist from a particular field wishes to convey. This issue makes it difficult for the public to interpret scientific information and leaves it vulnerable to misinterpretation and misrepresentation. Rather than 'dumbing down' scientific information for the public, we believe the most effective way to bridge this gap is to provide a means for the public to have easy access to - and get addicted to! - the actual scientific data itself, presented in a straightforward form. To this end, we will discuss an air quality public awareness campaign that we launched in one of the most polluted cities in the world, Ulaanbaatar, Mongolia, over the past year. We have installed an air quality instrument at a university in Mongolia, and we automatically post data from the instrument on Facebook (UB Air Quality Info) and Twitter (@UB_Air). We provide infographics on how to understand the data, share relevant articles and local activities, and monitor the sites for questions from the public about the data. We also maintain a website that posts aggregate air quality information (http://ubdata.herokuapp.com) and publicly shares the code that automatically connects our air quality instrument to the social media sites. This social media project, the first of its kind in Mongolia, has been an effective way to provide: (1) a quantifiable context to the public about air pollution issues in Ulaanbaatar, (2) a forum for the public and decision makers - from ambassadors to politicians - to engage with experts in the field and each other, and (3) a device that helps prevent misrepresentation (or fabrication) of data for political ends. We will also discuss the broader utility of our project and possible application to other fields.

  3. TH-CD-207A-04: Optimized Respiratory Gating for Abnormal Breathers in Pancreatic SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, W; Miften, M; Schefter, T

    Purpose: Pancreatic SBRT is uniquely challenging due to both the erratic/unstable motion of the pancreas and the close proximity of the radiosensitive small bowel. Respiratory gating can mitigate this effect, but the irregularity of motion severely affects traditional phase-based gating. The purpose of this study was to analyze real-time motion data of pancreatic tumors to optimize the efficacy and accuracy of respiratory gating, with the overall goal of enabling dose-escalated pancreatic SBRT. Methods: Fifteen pancreatic SBRT patients received 30–33 Gy in 5 fractions on a Varian TrueBeam STx unit. Abdominal compression was used to reduce the amplitude of tumor motion, and daily cone-beam computed tomography (CBCT) scans were acquired prior to each treatment for target localization purposes. For this study, breathing data (phase and amplitude) were collected during each CBCT scan using Varian's Real-Time Position Management system. An in-house template matching technique was used to track the superior-inferior motion of implanted fiducial markers in CBCT projection images. Using tumor motion and breathing data, phase-based or amplitude-based respiratory gating was simulated for all 75 fractions, targeting either end-exhalation or end-inhalation phases of breathing. Results: For the average patient, gating at end-exhalation offered the best reductions in effective motion for equal duty cycles. However, the optimal central phase angle varied widely (range: 0–92%, mean±SD: 49±12%), and phase-based gating windows typically associated with end-exhalation (i.e., "30–70%") were rarely ideal. Amplitude-based gating significantly outperformed phase-based gating, with average effective ranges 25% lower than phase-based gating ranges (and as much as 73% lower). Amplitude-based gating was consistently better suited to accommodate abnormal breathing patterns. For both phase-based and amplitude-based gating, end-exhalation provided significantly better results than end-inhalation. Conclusion: Amplitude-based gating reliably outperformed phase-based gating, and end-exhalation was more suitable than end-inhalation. These results will be used to guide future dose-escalation trials. Research funding provided by Varian Medical Systems to Miften and Jones.
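
    The core comparison lends itself to a small simulation: generate an irregular breathing trace, gate it once by fixed-period phase and once by amplitude at the same duty cycle, and compare the residual motion inside each gate. Everything below (trace shape, period, thresholds) is synthetic, standing in for the fiducial tracks extracted from the CBCT projections.

        import numpy as np

        def residual_motion(pos, gate):
            """Peak-to-peak tumor motion within the gating window (mm)."""
            return pos[gate].max() - pos[gate].min()

        t = np.linspace(0, 60, 6000)                        # 60 s sampled at ~100 Hz
        drift = 0.5 * np.sin(2 * np.pi * t / 40)            # slow baseline drift
        pos = 5 * np.cos(np.pi * t / 4.2) ** 2 + drift      # mm; exhale at mid-phase

        phase = (t % 4.2) / 4.2                             # naive fixed-period phase
        phase_gate = (phase > 0.3) & (phase < 0.7)          # "30-70%" window, 40% duty
        amp_gate = pos < np.percentile(pos, 40)             # amplitude gate, 40% duty

        print("phase-gated residual: %.2f mm" % residual_motion(pos, phase_gate))
        print("amplitude-gated residual: %.2f mm" % residual_motion(pos, amp_gate))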

  4. Scientific reasoning skills development in the introductory biology courses for undergraduates

    NASA Astrophysics Data System (ADS)

    Schen, Melissa S.

    Scientific reasoning is a skill of critical importance to those students who seek to become professional scientists. Yet, there is little research on the development of such reasoning in science majors. In addition, scientific reasoning is often investigated as two separate entities: hypothetico-deductive reasoning and argumentation, even though these skills may be linked. With regard to argumentation, most investigations look at its use in discussing socioscientific issues, not in analyzing scientific data. As scientists often use the same argumentation skills to develop and support conclusions, this avenue needs to be investigated. This study seeks to address these issues and establish a baseline of both hypothetico-deductive reasoning and argumentation of scientific data of biology majors through their engagement in introductory biology coursework. This descriptive study investigated the development of undergraduates' scientific reasoning skills by assessing them multiple times throughout a two-quarter introductory biology course sequence for majors. Participants were assessed at the beginning of the first quarter, end of the first quarter, and end of the second quarter. A split-half version of the revised Lawson Classroom Test of Scientific Reasoning (LCTSR) and a paper and pencil argumentation instrument developed for this study were utilized to assess student hypothetico-deductive reasoning and argumentation skills, respectively. To identify factors that may influence scientific reasoning development, demographic information regarding age, gender, science coursework completed, and future plans was collected. Evidence for course emphasis on scientific reasoning was found in lecture notes, assignments, and laboratory exercises. This study did not find any trends of improvement in the students' hypothetico-deductive reasoning or argumentation skills either during the first quarter or over both quarters. Specific difficulties in the control of variables and direct hypothetico-deductive reasoning were found through analysis of the LCTSR data. Students were also found to have trouble identifying and rebutting counterarguments, compared to generating initial arguments from scientific data sets. Although no overall improvement was found, a moderate, positive relationship was detected between LCTSR and argumentation scores at each administration, affirming the predicted association. Lastly, no difference was determined between biology majors and other students also enrolled in the courses. Overall, the results found here are similar to those classified in the literature for both hypothetico-deductive reasoning and argumentation, indicating that biology majors may be similar to other populations studied. Also, as no explicit attention was paid to scientific reasoning skills in the two courses, these findings complement those that illustrate a need for direct attention to foster the development of these skills. These results suggest the need to develop direct and explicit methods in order to improve the scientific reasoning skills of future biological scientists early in their undergraduate years.

  5. Dynamic Non-Hierarchical File Systems for Exascale Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Darrell E.; Miller, Ethan L

    This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances. These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort included evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.
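
    A minimal sketch of the harvest-and-index idea, assuming nothing about the project's actual code: extract a few attributes per file as it is ingested, update an inverted index immediately, and answer dynamic-directory-style queries by set intersection.

        import os, time
        from collections import defaultdict

        index = defaultdict(set)              # (attribute, value) -> set of paths

        def harvest(path):
            """Extract simple metadata and update the index on ingest."""
            st = os.stat(path)
            attrs = {
                "ext": os.path.splitext(path)[1].lstrip("."),
                "owner": str(st.st_uid),
                "mtime_day": time.strftime("%Y-%m-%d", time.localtime(st.st_mtime)),
            }
            for item in attrs.items():
                index[item].add(path)

        def search(**criteria):
            """Dynamic-directory-style query, e.g. search(ext='h5', owner='1001')."""
            sets = [index[(k, v)] for k, v in criteria.items()]
            return set.intersection(*sets) if sets else set()

        for root, _dirs, files in os.walk("."):
            for name in files:
                harvest(os.path.join(root, name))
        print(sorted(search(ext="py"))[:5])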

  6. Optimal fabrication processes for unidirectional metal-matrix composites: A computational simulation

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.; Murthy, P. L. N.; Morel, M.

    1990-01-01

    A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with non-linear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.
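
    To make the formulation concrete, the sketch below parametrizes a cool-down temperature history by a few knot values and minimizes a surrogate residual-stress measure under a monotone-cooling constraint, in the spirit of the nonlinear-programming approach; the stress model and all numbers are invented stand-ins, not the paper's micromechanics theory.

        import numpy as np
        from scipy.optimize import minimize

        T_FAB, T_ROOM, KNOTS = 900.0, 20.0, 6    # degC; illustrative values

        def full_history(T):
            return np.concatenate(([T_FAB], T, [T_ROOM]))

        def residual_stress(T):
            steps = np.diff(full_history(T))
            # Fast cooling at high temperature is penalized more heavily.
            weights = np.linspace(1.0, 0.2, steps.size)
            return np.sum(weights * steps ** 2)

        # Monotone cooling: each knot must not exceed the previous one.
        cons = [{"type": "ineq",
                 "fun": lambda T, i=i: full_history(T)[i] - full_history(T)[i + 1]}
                for i in range(KNOTS + 1)]
        x0 = np.linspace(T_FAB, T_ROOM, KNOTS + 2)[1:-1]
        res = minimize(residual_stress, x0, bounds=[(T_ROOM, T_FAB)] * KNOTS,
                       constraints=cons)
        print(np.round(res.x, 1))                # optimized cool-down schedule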

  7. Optimal fabrication processes for unidirectional metal-matrix composites - A computational simulation

    NASA Technical Reports Server (NTRS)

    Saravanos, D. A.; Murthy, P. L. N.; Morel, M.

    1990-01-01

    A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with nonlinear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.

  8. Multi-objective Optimization on Helium Liquefier Using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Wang, H. R.; Xiong, L. Y.; Peng, N.; Meng, Y. R.; Liu, L. Q.

    2017-02-01

    Research on the optimization of helium liquefiers is limited both domestically and abroad, and most existing optimization is single-objective, based on the Collins cycle. In this paper, a multi-objective optimization is conducted using a genetic algorithm (GA) on the 40 L/h helium liquefier developed by the Technical Institute of Physics and Chemistry of the Chinese Academy of Sciences (TIPC, CAS), and stable solutions are ultimately obtained. In addition, the exergy loss of the optimized system is studied with and without liquid nitrogen pre-cooling. The results have guiding significance for the future design of large helium liquefiers.
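
    For readers unfamiliar with the method, a bare-bones genetic algorithm over a toy two-objective surrogate is sketched below; it scalarizes the objectives with a fixed weight rather than computing a true Pareto front (as a multi-objective method such as NSGA-II would), and the objective functions are invented, not the liquefier model.

        import numpy as np

        rng = np.random.default_rng(1)

        def objectives(x):
            """Toy stand-ins: a 'yield' to maximize and an 'exergy loss' to minimize."""
            y = 1.0 - np.sum((x - 0.3) ** 2, axis=1)
            e = np.sum((x - 0.7) ** 2, axis=1)
            return y, e

        def fitness(x, w=0.5):
            y, e = objectives(x)
            return w * y - (1 - w) * e           # weighted-sum scalarization

        pop = rng.uniform(0, 1, size=(40, 3))    # 40 candidates, 3 design variables
        for _ in range(200):
            keep = pop[np.argsort(fitness(pop))[-20:]]        # truncation selection
            kids = (keep[rng.integers(0, 20, 20)]
                    + keep[rng.integers(0, 20, 20)]) / 2      # arithmetic crossover
            kids += rng.normal(0, 0.05, kids.shape)           # Gaussian mutation
            pop = np.clip(np.vstack([keep, kids]), 0, 1)
        print(np.round(pop[np.argmax(fitness(pop))], 3))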

  9. Department of Defense In-House RDT and E Activities: Management Analysis Report for Fiscal Year 1992

    DTIC Science & Technology

    1994-01-25

    follow which cover the Army, Navy, Air Force and the Defense Nuclear Agency. Organizational changes for FY92 appear in Appendix A, including the new ... PERSONNEL DATA (END OF FISCAL YEAR 1992): SCIENTISTS & ENGINEERS, TECHNICAL SUPPORT, TYPE END STRENGTH, OTHER PERSONNEL ... MILITARY 67 17 ... PROPERTY 54.922 ADMIN 25.520 NEW CAPITAL EQUIPMENT 43.634 OTHER 39.652 EQUIPMENT 40.581 TOTAL 172.458 NEW SCIENTIFIC & ENG. EQUIP. 1.239 ACRES 53

  10. Changes in Somatosensory Responsiveness in Behaving Monkeys and Human Subjects

    DTIC Science & Technology

    1991-08-30

    University of Tennessee, Memphis / Air Force Office of Scientific Research/NL ... wrist and at the elbow with Velcro straps. Each animal palm manipulated a smooth aluminum plate attached at one end to the axle of a brushless DC ... display are described above. The subject's hand rested on a flat aluminum handle coupled at one end to the axle of a brushless DC torque motor while the

  11. [ETHICAL CONFLICTS AT THE END OF LIFE FROM NURSE PERCEPTION].

    PubMed

    Calvo Rodríguez, Begoña; Berdial Cabal, Ignacio

    2015-10-01

    Current medicine tends to dehumanize the end-of-life process, which contributes to generating certain ethical conflicts for nursing staff. The major scientific databases were searched for this research. The four main ethical conflicts detected are: decision making, communicating information, futile treatments, and artificial hydration and feeding. Nurses suffer moral distress from these ethical conflicts because of their obligation to safeguard the dignity and rights of their patients. A lack of training and experience in dealing with ethical problems contributes to increasing nurses' discomfort.

  12. Trellis coding techniques for mobile communications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Simon, M. K.; Jedrey, T.

    1988-01-01

    A criterion for designing optimum trellis codes to be used over fading channels is given. A technique is shown for reducing certain multiple trellis codes, optimally designed for the fading channel, to conventional (i.e., multiplicity one) trellis codes. The computational cutoff rate R0 is evaluated for MPSK transmitted over fading channels. Examples of trellis codes optimally designed for the Rayleigh fading channel are given and compared with respect to R0. Two types of modulation/demodulation techniques are considered, namely coherent (using pilot tone-aided carrier recovery) and differentially coherent with Doppler frequency correction. Simulation results are given for end-to-end performance of two trellis-coded systems.
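
    One common closed form for the quantity evaluated here is the cutoff rate of a symmetric M-PSK set over an ideally interleaved Rayleigh channel with coherent detection: assuming unit-energy symbols and unit mean-square fading, each pairwise term E[exp(-a^2 d^2 Es/4N0)] reduces to 1/(1 + d^2 Es/4N0). The sketch below evaluates that expression; treat it as a plausible reconstruction under those stated assumptions, not the authors' exact formulation, and the function name is hypothetical.

        import numpy as np

        def r0_mpsk_rayleigh(M, ebn0_db):
            """Cutoff rate R0 (bits/symbol) for M-PSK with ideal interleaving,
            Rayleigh fading with E[a^2] = 1, and coherent detection."""
            esn0 = np.log2(M) * 10 ** (ebn0_db / 10)           # Es/N0 from Eb/N0
            k = np.arange(M)
            d2 = np.abs(np.exp(2j * np.pi * k / M) - 1) ** 2   # squared distances to s_0
            return np.log2(M) - np.log2(np.sum(1.0 / (1.0 + esn0 * d2 / 4.0)))

        for ebn0 in (5, 10, 15):
            print("%2d dB -> R0 = %.3f bits/symbol" % (ebn0, r0_mpsk_rayleigh(8, ebn0)))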

  13. Titration of Ideal Positive End-expiratory Pressure in Acute Respiratory Distress Syndrome: Comparison between Lower Inflection Point and Esophageal Pressure Method Using Volumetric Capnography.

    PubMed

    Baikunje, Nandakishore; Sehgal, Inderpaul Singh; Dhooria, Sahajal; Prasad, Kuruswamy Thurai; Agarwal, Ritesh

    2017-05-01

    The tenets of mechanical ventilation in acute respiratory distress syndrome (ARDS) include the utilization of low tidal volume and the optimal application of positive end-expiratory pressure (PEEP). Optimal PEEP in ARDS is characterized by a reduction in alveolar dead space along with improvement in lung compliance and a resultant improvement in oxygenation. There are various methods of setting PEEP in ARDS. Herein, we report a patient with ARDS in whom we employed measurement of dead space using volumetric capnography to compare two different PEEP strategies, namely the lower inflection point and transpulmonary pressure monitoring.
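
    The dead-space comparison rests on the Bohr-Enghoff relation, which volumetric capnography makes routinely computable at the bedside: VD/VT = (PaCO2 - PeCO2)/PaCO2, where PeCO2 is the mixed-expired CO2 tension. A one-line helper (the example values are illustrative, not from this case report):

        def dead_space_fraction(paco2, peco2):
            """Bohr-Enghoff physiologic dead-space fraction VD/VT.
            Both tensions in the same units (e.g., mmHg)."""
            return (paco2 - peco2) / paco2

        # Example: PaCO2 = 45 mmHg, mixed-expired PCO2 = 24 mmHg -> VD/VT ~ 0.47
        print(round(dead_space_fraction(45.0, 24.0), 2))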

  14. The optimal SAM surface functional group for producing a biomimetic HA coating on Ti.

    PubMed

    Liu, D P; Majewski, P; O'Neill, B K; Ngothai, Y; Colby, C B

    2006-06-15

    Commercial interest is growing in biomimetic methods that employ self-assembled monolayers (SAMs) to produce biocompatible HA coatings on Ti-based orthopedic implants. Recently, separate studies have considered HA formation for various SAM surface functional groups. However, these have often neglected to verify the crystallinity of the HA coating, which is essential for optimal bioactivity. Furthermore, differing experimental and analytical methods make performance comparisons difficult. This article investigates and evaluates HA formation for four of the most promising surface functional groups: --OH, --SO(3)H, --PO(4)H(2) and --COOH. All of them successfully formed a HA coating at Ca/P ratios between 1.49 and 1.62. However, only the --SO(3)H and --COOH end groups produced a predominantly crystalline HA. Furthermore, the --COOH end group yielded the thickest layer and possessed crystalline characteristics very similar to those of human bone. The --COOH end group appears to provide the optimal SAM surface interface for nucleation and growth of biomimetic crystalline HA. Intriguingly, this finding may lend support to explanations elsewhere of why human bone sialoprotein is such a potent nucleator of HA, attributed to the protein's glutamic acid-rich sequences.

  15. Design of freeze-drying processes for pharmaceuticals: practical advice.

    PubMed

    Tang, Xiaolin; Pikal, Michael J

    2004-02-01

    Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.

  16. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.
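
    The locality rule of thumb quoted above can be made tangible with a toy direct-mapped cache model: the same n-by-n traversal, done with unit stride versus stride n, produces very different miss counts. The cache geometry below is arbitrary, chosen only for illustration.

        def misses(n, line_words, cache_lines, row_major=True):
            """Count misses in a direct-mapped cache for an n x n traversal,
            one word per array element."""
            cache = [None] * cache_lines
            count = 0
            for i in range(n):
                for j in range(n):
                    addr = i * n + j if row_major else j * n + i
                    line = addr // line_words
                    slot = line % cache_lines
                    if cache[slot] != line:   # miss: fetch the whole line
                        cache[slot] = line
                        count += 1
            return count

        # 512 x 512 array, 8-word lines, 1024-line cache
        print("unit-stride misses:", misses(512, 8, 1024, row_major=True))
        print("stride-n misses:   ", misses(512, 8, 1024, row_major=False))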

  17. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  18. Long Term Care

    MedlinePlus


  19. Guiding Principles for Data Requirements

    EPA Pesticide Factsheets

    The principles in this document are intended to help guide the identification of data needs, promote and optimize full use of existing knowledge, and provide consistency in the data request process across all scientific disciplines for pesticide review.

  20. Scientific travel in the Atlantic world: the French expedition to Gorée and the Antilles, 1681-1683.

    PubMed

    Dew, Nicholas

    2010-03-01

    Although historians have long recognized the importance of long-range scientific expeditions in both the practice and culture of eighteenth- and nineteenth-century science, it is less well understood how this form of scientific organization emerged and became established in the seventeenth and early eighteenth centuries. In the late seventeenth century new European scientific institutions tried to make use of globalized trade networks for their own ends, but to do so proved difficult. This paper offers a case history of one such expedition, the voyage sponsored by the French Académie royale des sciences to Gorée (in modern Senegal) and the Caribbean islands of Guadeloupe and Martinique in 1681-3. The voyage of Varin, Deshayes and de Glos reveals how the process of travel itself caused problems for instruments and observers alike.

  1. The Developing of the Scientific Knowledge and the Change of the Human Condition

    NASA Astrophysics Data System (ADS)

    Palazzi, Giordano Diambrini

    2005-04-01

    In this short review we show how the new scientific developments, born mainly in Western countries, have produced since the end of the 1700s an enormous increase in the standard of living and in the population of those countries, unlike anything since the beginning of the human species. With the export of scientific and technological culture to other countries, such as eastern Europe, North and South America, and later China and India (to cite the main examples), their welfare conditions have likewise increased or are now developing. In the second part of this short review, we try to explain why the most important future need will be to bring, step by step, the developing countries into the community of "interacting minds", in order to propagate scientific culture (but not only that) and to let it evolve through the contribution of all humanity.

  2. Social Welfare Policy and Inequalities in Health. Preconceived Truths in Scientific Research.

    PubMed

    Regidor, Enrique

    2016-08-02

    At the end of the first decade of the present century, debates arose in social epidemiology. These debates set those who defend the existence of a relation between the political and/or welfare state regime and the magnitude of socioeconomic inequalities in health against those who maintain that the facts do not support such a relation. They are similar to other debates in epidemiology in the 1990s related to theories of how diseases are produced and the factors that determine their distribution in the population. Whereas some authors find it impossible to separate ethical and political aspects and professional values from scientific arguments, others consider that epidemiologists and other scientists should make an effort to distinguish between scientific and unscientific considerations. In this paper the author reflects on the harmony among science, politics and ethics in scientific practice on health inequalities, even though the empirical evidence runs contrary to that harmonious picture.

  3. International Ultraviolet Explorer Observatory operations

    NASA Technical Reports Server (NTRS)

    1985-01-01

    This volume contains the final report for the International Ultraviolet Explorer IUE Observatory Operations contract. The fundamental operational objective of the International Ultraviolet Explorer (IUE) program is to translate competitively selected observing programs into IUE observations, to reduce these observations into meaningful scientific data, and then to present these data to the Guest Observer in a form amenable to the pursuit of scientific research. The IUE Observatory is the key to this objective since it is the central control and support facility for all science operations functions within the IUE Project. In carrying out the operation of this facility, a number of complex functions were provided beginning with telescope scheduling and operation, proceeding to data processing, and ending with data distribution and scientific data analysis. In support of these critical-path functions, a number of other significant activities were also provided, including scientific instrument calibration, systems analysis, and software support. Routine activities have been summarized briefly whenever possible.

  4. Optimal morphologic response to preoperative chemotherapy: an alternate outcome end point before resection of hepatic colorectal metastases.

    PubMed

    Shindoh, Junichi; Loyer, Evelyne M; Kopetz, Scott; Boonsirikamchai, Piyaporn; Maru, Dipen M; Chun, Yun Shin; Zimmitti, Giuseppe; Curley, Steven A; Charnsangavej, Chusilp; Aloia, Thomas A; Vauthey, Jean-Nicolas

    2012-12-20

    The purposes of this study were to confirm the prognostic value of an optimal morphologic response to preoperative chemotherapy in patients undergoing chemotherapy with or without bevacizumab before resection of colorectal liver metastases (CLM) and to identify predictors of the optimal morphologic response. The study included 209 patients who underwent resection of CLM after preoperative chemotherapy with oxaliplatin- or irinotecan-based regimens with or without bevacizumab. Radiologic responses were classified as optimal or suboptimal according to the morphologic response criteria. Overall survival (OS) was determined, and prognostic factors associated with an optimal response were identified in multivariate analysis. An optimal morphologic response was observed in 47% of patients treated with bevacizumab and 12% of patients treated without bevacizumab (P < .001). The 3- and 5-year OS rates were higher in the optimal response group (82% and 74%, respectively) compared with the suboptimal response group (60% and 45%, respectively; P < .001). On multivariate analysis, suboptimal morphologic response was an independent predictor of worse OS (hazard ratio, 2.09; P = .007). Receipt of bevacizumab (odds ratio, 6.71; P < .001) and largest metastasis before chemotherapy of ≤ 3 cm (odds ratio, 2.12; P = .025) were significantly associated with optimal morphologic response. The morphologic response showed no specific correlation with conventional size-based RECIST criteria, and it was superior to RECIST in predicting major pathologic response. Independent of preoperative chemotherapy regimen, optimal morphologic response is sufficiently correlated with OS to be considered a surrogate therapeutic end point for patients with CLM.

  5. Optimal Morphologic Response to Preoperative Chemotherapy: An Alternate Outcome End Point Before Resection of Hepatic Colorectal Metastases

    PubMed Central

    Shindoh, Junichi; Loyer, Evelyne M.; Kopetz, Scott; Boonsirikamchai, Piyaporn; Maru, Dipen M.; Chun, Yun Shin; Zimmitti, Giuseppe; Curley, Steven A.; Charnsangavej, Chusilp; Aloia, Thomas A.; Vauthey, Jean-Nicolas

    2012-01-01

    Purpose The purposes of this study were to confirm the prognostic value of an optimal morphologic response to preoperative chemotherapy in patients undergoing chemotherapy with or without bevacizumab before resection of colorectal liver metastases (CLM) and to identify predictors of the optimal morphologic response. Patients and Methods The study included 209 patients who underwent resection of CLM after preoperative chemotherapy with oxaliplatin- or irinotecan-based regimens with or without bevacizumab. Radiologic responses were classified as optimal or suboptimal according to the morphologic response criteria. Overall survival (OS) was determined, and prognostic factors associated with an optimal response were identified in multivariate analysis. Results An optimal morphologic response was observed in 47% of patients treated with bevacizumab and 12% of patients treated without bevacizumab (P < .001). The 3- and 5-year OS rates were higher in the optimal response group (82% and 74%, respectively) compared with the suboptimal response group (60% and 45%, respectively; P < .001). On multivariate analysis, suboptimal morphologic response was an independent predictor of worse OS (hazard ratio, 2.09; P = .007). Receipt of bevacizumab (odds ratio, 6.71; P < .001) and largest metastasis before chemotherapy of ≤ 3 cm (odds ratio, 2.12; P = .025) were significantly associated with optimal morphologic response. The morphologic response showed no specific correlation with conventional size-based RECIST criteria, and it was superior to RECIST in predicting major pathologic response. Conclusion Independent of preoperative chemotherapy regimen, optimal morphologic response is sufficiently correlated with OS to be considered a surrogate therapeutic end point for patients with CLM. PMID:23150701

  6. END STAGE CARDIAC AMYLOIDOSIS: PREDICTORS OF SURVIVAL TO CARDIAC TRANSPLANTATION AND LONG TERM OUTCOMES

    PubMed Central

    Gilstrap, Lauren Gray; Niehaus, Emily; Malhotra, Rajeev; Ton, Van-Khue; Watts, James; Seldin, David C.; Madsen, Joren C.; Semigran, Marc J.

    2013-01-01

    Background Orthotopic heart transplant (OHT) followed by myeloablative chemotherapy and autologous stem cell transplant (ASCT) has been successful in the treatment of light chain (AL) cardiac amyloidosis. The purpose of this study is to identify predictors of survival to OHT in patients with end stage heart failure due to AL amyloidosis, and compare post-OHT survival of cardiac amyloid patients to that of other cardiomyopathy patients undergoing OHT. Methods From January 2000 to June 2011, 31 patients with end stage heart failure secondary to AL amyloidosis were listed for OHT at Massachusetts General Hospital (MGH). Univariate and multivariate regression analyses identified predictors of survival to OHT. Kaplan-Meier analysis compared survival between MGH amyloidosis patients and the Scientific Registry of Transplant Recipients (SRTR) non-amyloid cardiomyopathy patients. Results Low body mass index (BMI) was the only predictor of survival to OHT in patients with end stage heart failure due to cardiac amyloidosis. Survival of cardiac amyloid patients who died prior to receiving a donor heart was only 63 ± 45 days after listing. Patients who survived to OHT received a donor organ at 53 ± 48 days after listing. Survival of AL amyloidosis patients on the waitlist was less than patients waitlisted for all other non-amyloid diagnoses. The long-term survival of transplanted amyloid patients was no different than the survival of non-amyloid, restrictive (p=0.34), non-amyloid dilated (p=0.34) or all non-amyloid cardiomyopathy patients (p=0.22) in the SRTR database. Conclusions Those that survive to OHT followed by ASCT have a survival rate similar to other cardiomyopathy patients undergoing OHT. However, more than one third of the patients died awaiting OHT. The only predictor of survival to OHT in AL amyloidosis patients was low BMI, which correlated with shorter waitlist time. To optimize the survival of these patients, access to donor organs must be improved. In light chain (AL) amyloidosis, amyloid fibrils derived from clonal lambda or kappa immunoglobulin light chains deposit abnormally in organs. Cardiac involvement is apparent echocardiographically in 60% of AL amyloidosis patients at the time of diagnosis, with clinical evidence of heart failure in 69% of patients [1]. The median survival of AL amyloidosis patients presenting with any heart failure symptom is 8.5 months [2], and even less for end-stage heart failure patients. PMID:24200511

  7. Study on the optimization allocation of wind-solar in power system based on multi-region production simulation

    NASA Astrophysics Data System (ADS)

    Xu, Zhicheng; Yuan, Bo; Zhang, Fuqiang

    2018-06-01

    In this paper, a power supply optimization model is proposed. The model takes minimum fossil energy consumption as its objective, considering the output characteristics of both conventional and renewable power supplies. The optimal capacity ratio of wind to solar power under various constraints is calculated, and the interrelation between conventional power supply and renewable energy is analyzed for a system with a high proportion of renewable energy integration. Using the model, we can provide scientific guidance for the coordinated and orderly development of renewable energy and conventional power sources.
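
    The flavor of such a model can be captured with a small linear program: choose wind and solar capacities, subject to a shared capacity budget, so that the fossil generation needed to cover the residual load over a day is minimized. The profiles, limits and budget below are invented for illustration and are not the paper's data.

        import numpy as np
        from scipy.optimize import linprog

        hours = 24
        load = 100 + 20 * np.sin(np.linspace(0, 2 * np.pi, hours))       # MW
        wind_cf = 0.25 + 0.15 * np.random.default_rng(2).random(hours)   # capacity factors
        solar_cf = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, hours)), 0, None)

        # Variables: [wind_cap, solar_cap, fossil_1 .. fossil_24]
        c = np.concatenate(([0.0, 0.0], np.ones(hours)))   # minimize total fossil energy
        balance = np.hstack([-wind_cf[:, None], -solar_cf[:, None], -np.eye(hours)])
        budget = np.concatenate(([1.0, 1.0], np.zeros(hours)))
        A_ub = np.vstack([balance, budget])
        b_ub = np.concatenate([-load, [250.0]])            # meet load; cap total capacity
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (hours + 2))
        print("wind %.0f MW, solar %.0f MW, fossil %.0f MWh"
              % (res.x[0], res.x[1], c @ res.x))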

  8. ePave: A Self-Powered Wireless Sensor for Smart and Autonomous Pavement.

    PubMed

    Xiao, Jian; Zou, Xiang; Xu, Wenyao

    2017-09-26

    "Smart Pavement" is an emerging infrastructure for various on-road applications in transportation and road engineering. However, existing road monitoring solutions demand a certain periodic maintenance effort due to battery life limits in the sensor systems. To this end, we present an end-to-end self-powered wireless sensor-ePave-to facilitate smart and autonomous pavements. The ePave system includes a self-power module, an ultra-low-power sensor system, a wireless transmission module and a built-in power management module. First, we performed an empirical study to characterize the piezoelectric module in order to optimize energy-harvesting efficiency. Second, we developed an integrated sensor system with the optimized energy harvester. An adaptive power knob is designated to adjust the power consumption according to energy budgeting. Finally, we intensively evaluated the ePave system in real-world applications to examine the system's performance and explore the trade-off.

  9. JacksonBot - Design, Simulation and Optimal Control of an Action Painting Robot

    NASA Astrophysics Data System (ADS)

    Raschke, Michael; Mombaur, Katja; Schubert, Alexander

    We present the robotics platform JacksonBot, which is capable of producing paintings inspired by the Action Painting style of Jackson Pollock. A dynamically moving robot arm splashes color from a container at the end effector onto the canvas. The paintings produced by this platform rely on a combination of the algorithmic generation of robot arm motions with the random effects of the splashing color. The robot can be considered a complex and powerful tool for generating art works programmed by a user. Desired end effector motions can be prescribed either by mathematical functions, by point sequences or by data glove motions. We have evaluated the effect of different shapes of input motions on the resulting painting. In order to compute the robot joint trajectories necessary to move along a desired end effector path, we use an optimal control based approach to solve the inverse kinematics problem.
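
    A cut-down version of that final step can be written as numerical optimization on a planar two-link arm: track a desired end-effector path while penalizing jumps between successive joint solutions. The link lengths, weights and path below are made up; the real platform solves a full optimal control problem on its actual arm.

        import numpy as np
        from scipy.optimize import minimize

        L1, L2 = 0.6, 0.4                        # hypothetical link lengths (m)

        def fk(q):
            """Forward kinematics of a planar 2-link arm."""
            return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                             L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

        def ik(target, q_prev):
            cost = lambda q: (np.sum((fk(q) - target) ** 2)
                              + 0.01 * np.sum((q - q_prev) ** 2))  # smoothness term
            return minimize(cost, q_prev).x

        q = np.array([0.5, 0.5])
        for t in np.linspace(0, 2 * np.pi, 8):   # small circular splash path
            q = ik(np.array([0.6 + 0.15 * np.cos(t), 0.3 + 0.15 * np.sin(t)]), q)
            print(np.round(q, 3), "->", np.round(fk(q), 3))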

  10. ePave: A Self-Powered Wireless Sensor for Smart and Autonomous Pavement

    PubMed Central

    Xiao, Jian; Zou, Xiang

    2017-01-01

    “Smart Pavement” is an emerging infrastructure for various on-road applications in transportation and road engineering. However, existing road monitoring solutions demand a certain periodic maintenance effort due to battery life limits in the sensor systems. To this end, we present an end-to-end self-powered wireless sensor—ePave—to facilitate smart and autonomous pavements. The ePave system includes a self-power module, an ultra-low-power sensor system, a wireless transmission module and a built-in power management module. First, we performed an empirical study to characterize the piezoelectric module in order to optimize energy-harvesting efficiency. Second, we developed an integrated sensor system with the optimized energy harvester. An adaptive power knob is designated to adjust the power consumption according to energy budgeting. Finally, we intensively evaluated the ePave system in real-world applications to examine the system’s performance and explore the trade-off. PMID:28954430

  11. STAR Data Reconstruction at NERSC/Cori, an adaptable Docker container approach for HPC

    NASA Astrophysics Data System (ADS)

    Mustafa, Mustafa; Balewski, Jan; Lauret, Jérôme; Porter, Jefferson; Canon, Shane; Gerhardt, Lisa; Hajdu, Levente; Lukascsyk, Mark

    2017-10-01

    As HPC facilities grow their resources, adaptation of classic HEP/NP workflows becomes a necessity. Linux containers may very well offer a way to lower the bar to exploiting such resources and, at the same time, help collaborations reach vast elastic resources on such facilities and address their massive current and future data processing challenges. In this proceeding, we showcase the STAR data reconstruction workflow at the Cori HPC system at NERSC. The STAR software is packaged in a Docker image and runs at Cori in Shifter containers. We highlight two of the typical end-to-end optimization challenges for such pipelines: 1) the data transfer rate, carried over ESnet after optimizing end points, and 2) scalable deployment of the conditions database in an HPC environment. Our tests demonstrate equally efficient data processing workflows on Cori/HPC, comparable to standard Linux clusters.

  12. Case study: technology initiative led to advanced lead optimization screening processes at Bristol-Myers Squibb, 2004-2009.

    PubMed

    Zhang, Litao; Cvijic, Mary Ellen; Lippy, Jonathan; Myslik, James; Brenner, Stephen L; Binnie, Alastair; Houston, John G

    2012-07-01

    In this paper, we review the key solutions that enabled evolution of the lead optimization screening support process at Bristol-Myers Squibb (BMS) between 2004 and 2009. During this time, technology infrastructure investment and scientific expertise integration laid the foundations to build and tailor lead optimization screening support models across all therapeutic groups at BMS. Together, harnessing advanced screening technology platforms and expanding panel screening strategy led to a paradigm shift at BMS in supporting lead optimization screening capability. Parallel SAR and structure liability relationship (SLR) screening approaches were first and broadly introduced to empower more-rapid and -informed decisions about chemical synthesis strategy and to broaden options for identifying high-quality drug candidates during lead optimization.

  13. Experiments: Why and How?

    PubMed

    Hansson, Sven Ove

    2016-06-01

    An experiment, in the standard scientific sense of the term, is a procedure in which some object of study is subjected to interventions (manipulations) that aim at obtaining a predictable outcome or at least predictable aspects of the outcome. The distinction between an experiment and a non-experimental observation is important since they are tailored to different epistemic needs. Experimentation has its origin in pre-scientific technological experiments that were undertaken in order to find the best technological means to achieve chosen ends. Important parts of the methodological arsenal of modern experimental science can be traced back to this pre-scientific, technological tradition. It is claimed that experimentation involves a unique combination of acting and observing, a combination whose unique epistemological properties have not yet been fully clarified.

  14. Real-Time Seismic Displays in Museums Appeal to the Public

    NASA Astrophysics Data System (ADS)

    Smith, Meagan; Taber, John; Hubenthal, Michael

    2006-02-01

    Technology provides people with constant access to the latest news, weather, and entertainment. Not surprisingly, the public increasingly demands that the most current information be available for immediate consumption. For museums striving to educate the public and to maintain and expand visitor interest, gone are the days of passively conveying scientific concepts through static displays. Instead, science museums must find creative ways to capture the public's interest - successful advocacy for research funding, solutions to environmental problems, even future generations' scientific innovation depend on this. To this end, the continuous collection and dissemination of real-time science information by the scientific community offers museums an opportunity to capitalize on visitors' data addiction and increase the public's interest in, and understanding of, the Earth system.

  15. National Seismic Network of Georgia

    NASA Astrophysics Data System (ADS)

    Tumanova, N.; Kakhoberashvili, S.; Omarashvili, V.; Tserodze, M.; Akubardia, D.

    2016-12-01

    Georgia, as part of the Southern Caucasus, is a tectonically active and structurally complex region. It is one of the most active segments of the Alpine-Himalayan collision belt. The deformation and the associated seismicity are due to the continent-continent collision between the Arabian and Eurasian plates. Seismic monitoring of the country and the quality of seismic data are the major tools for rapid response policy, population safety, basic scientific research and, in the end, for the sustainable development of the country. The National Seismic Network of Georgia has been developing since the end of the 19th century. The digital era of the network started in 2003. Recently, continuous data streams from 25 stations have been acquired and analyzed in real time. The data are combined to calculate a rapid location and magnitude for each earthquake. Information for larger events (Ml>=3.5) is simultaneously transferred to the website of the monitoring center and to the relevant governmental agencies. To improve rapid earthquake location and magnitude estimation, the seismic network was enhanced by installing 7 additional new stations. Each new station is equipped with coupled broadband and strong-motion seismometers as well as a permanent GPS system. To select the sites for the 7 new base stations, we used standard network optimization techniques, taking into account the geometry of the existing seismic network and the topographic conditions of each site. For each site we studied the local geology (Vs30 was mandatory for each site), the local noise level and the seismic vault construction parameters. Because of the country's relief, stations were installed in the high mountains, inaccessible in winter due to heavy snow conditions. To secure online data transmission we used satellite data transmission as well as cellular data network coverage from different local companies. As a result, we already have improved earthquake locations and event magnitudes. We analyzed data from each station to calculate the signal-to-noise ratio; comparing these calculations with those for the existing stations showed that the signal-to-noise ratio for the new stations is much better. The National Seismic Network of Georgia is planning to install more stations to improve seismic network coverage.

  16. The Solar Probe mission - Mission design concepts and requirements

    NASA Technical Reports Server (NTRS)

    Ayon, Juan A.

    1992-01-01

    The Solar Probe concept as studied by the Jet Propulsion Laboratory represents the first mission to combine out-of-the-ecliptic scientific coverage with multiple, close solar encounters (at 4 solar radii). The scientific objectives of the mission have driven the investigation and analysis of several mission design concepts, all optimized to meet the science/mission requirements. This paper reviews the mission design concepts developed, the science objectives that drive the mission design, and the principal mission requirements associated with these various concepts.

  17. The credibility crisis in research: Can economics tools help?

    PubMed Central

    Gall, Thomas; Ioannidis, John P. A.; Maniadis, Zacharias

    2017-01-01

    The issue of nonreplicable evidence has attracted considerable attention across biomedical and other sciences. This concern is accompanied by an increasing interest in reforming research incentives and practices. How to optimally perform these reforms is a scientific problem in itself, and economics has several scientific methods that can help evaluate research reforms. Here, we review these methods and show their potential. Prominent among them are mathematical modeling and laboratory experiments that constitute affordable ways to approximate the effects of policies with wide-ranging implications. PMID:28445470

  18. Cosmic rays and cosmological speculations in the 1920s

    NASA Astrophysics Data System (ADS)

    Demaria, M.; Russo, A.

    1988-07-01

    A controversy that pitted the American physicist Millikan against the English scientist Jeans is discussed. It is an interesting aspect of a debate about the cultural and social value of science that shook the scientific community as well as the public. Ideological and religious beliefs concurred with Millikan's scientific work in shaping his unified perspective. This American optimism and progressive ideology of science clashed with Jeans' and Eddington's claim of the cosmological validity of the second law of thermodynamics.

  19. The credibility crisis in research: Can economics tools help?

    PubMed

    Gall, Thomas; Ioannidis, John P A; Maniadis, Zacharias

    2017-04-01

    The issue of nonreplicable evidence has attracted considerable attention across biomedical and other sciences. This concern is accompanied by an increasing interest in reforming research incentives and practices. How to optimally perform these reforms is a scientific problem in itself, and economics has several scientific methods that can help evaluate research reforms. Here, we review these methods and show their potential. Prominent among them are mathematical modeling and laboratory experiments that constitute affordable ways to approximate the effects of policies with wide-ranging implications.

  20. Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact

    NASA Astrophysics Data System (ADS)

    Abadjiev, Valentin; Kawasaki, Haruhisa

    2014-09-01

    Computer-aided design has advanced through the creation of various types of software for scientific research in the field of gearing theory, as well as through adequate scientific support of gear drive manufacture. Computer programs based on mathematical models resulting from this research are presented here. Modern gear transmissions require the construction of new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures that find an optimal solution by varying definite parameters. The study is dedicated to the accepted methodology for creating software for the synthesis of a class of high-reduction hyperboloid gears, the Spiroid and Helicon types (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill.). The basic computer products developed are software based on original mathematical models, namely the two mathematical models for synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs have been worked out on the basis of the described mathematical models, and the relations between them are shown. The application of these approaches to the synthesis of the gear drives discussed is illustrated.

  1. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratories. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50^3 domain.
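
    The loop nest transformation mentioned above is, at its core, a restructuring for memory locality. The sketch below is not the CScADS tool or S3D's actual loop; it only illustrates the kind of tiling such a tool applies (in an interpreted language the speedup is muted by interpreter overhead, but in a compiled loop nest this reduces cache misses):

      import numpy as np

      def transpose_naive(a: np.ndarray) -> np.ndarray:
          """Column-major reads of `a` stride through memory: poor locality."""
          n = a.shape[0]
          out = np.empty_like(a)
          for i in range(n):
              for j in range(n):
                  out[i, j] = a[j, i]
          return out

      def transpose_tiled(a: np.ndarray, b: int = 64) -> np.ndarray:
          """Same result, but b-by-b tiles keep both working sets in cache."""
          n = a.shape[0]
          out = np.empty_like(a)
          for ii in range(0, n, b):
              for jj in range(0, n, b):
                  for i in range(ii, min(ii + b, n)):
                      for j in range(jj, min(jj + b, n)):
                          out[i, j] = a[j, i]
          return out

      a = np.arange(256 * 256, dtype=np.float64).reshape(256, 256)
      assert np.array_equal(transpose_naive(a), transpose_tiled(a))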

  2. A communication efficient and scalable distributed data mining for the astronomical data

    NASA Astrophysics Data System (ADS)

    Govada, A.; Sahay, S. K.

    2016-07-01

    In 2020, ∼60PB of archived data will be accessible to astronomers, but analyzing such a vast volume of data will be a challenging task. This is basically due to the computational model in which data are downloaded from complex, geographically distributed archives to a central site and then analyzed on local systems. Because the data have to be downloaded to the central site, network bandwidth limitations become a hindrance to scientific discovery, and analyzing PB-scale data on local machines in a centralized manner is likewise challenging. The virtual observatory (VO) is a step towards this problem; however, it does not provide a data mining model (Zhang et al., 2004). Adding a distributed data mining layer to the VO can be the solution: the knowledge is downloaded by the astronomers instead of the raw data, and astronomers can then either reconstruct the data from the downloaded knowledge or use the knowledge directly for further analysis. Therefore, in this paper, we present Distributed Load Balancing Principal Component Analysis for optimally distributing the computation among the available nodes to minimize the transmission cost and the downloading cost for the end user. The experimental analysis is done with Fundamental Plane (FP) data, Gadotti data and the complex Mfeat data. In terms of transmission cost, our approach performs better than those of Qi et al. and Yue et al. The analysis shows that with the complex Mfeat data, ∼90% of the downloading cost can be saved for the end user, with negligible loss in accuracy.
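
    The core idea of shipping a compact model instead of raw records can be sketched as follows. This is a simplified stand-in for the paper's load-balancing scheme, assuming each node holds a share of the rows and transmits only its local second-moment statistics:

      import numpy as np

      def local_moments(x: np.ndarray):
          """Each node summarizes its rows: count, feature sums, scatter matrix."""
          return x.shape[0], x.sum(axis=0), x.T @ x

      def global_pca(moments, k: int):
          """Central site merges node statistics and extracts top-k components."""
          n = sum(m[0] for m in moments)
          s = sum(m[1] for m in moments)
          scatter = sum(m[2] for m in moments)
          mean = s / n
          cov = scatter / n - np.outer(mean, mean)   # E[xx^T] - mu mu^T
          vals, vecs = np.linalg.eigh(cov)           # eigenvalues ascending
          return mean, vecs[:, ::-1][:, :k]          # top-k principal directions

      rng = np.random.default_rng(1)
      nodes = [rng.normal(size=(500, 8)) for _ in range(4)]  # 4 archive sites
      mean, components = global_pca([local_moments(x) for x in nodes], k=2)
      compressed = (nodes[0] - mean) @ components  # "knowledge" shipped to the user
      print(compressed.shape)  # (500, 2)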

  3. On-going clinical trials for elderly patients with a hematological malignancy: are we addressing the right end points?

    PubMed

    Hamaker, M E; Stauder, R; van Munster, B C

    2014-03-01

    Cancer societies and research cooperative groups worldwide have urged the development of cancer trials that address those outcome measures that are most relevant to older patients. We set out to determine the characteristics and study objectives of current clinical trials in hematological patients. The United States National Institutes of Health clinical trial registry was searched on 1 July 2013 for currently recruiting phase I, II or III clinical trials in hematological malignancies. Trial characteristics and study objectives were extracted from the registry website. In the 1207 clinical trials included in this overview, patient-centered outcome measures such as quality of life, health care utilization and functional capacity were incorporated in only a small number of trials (8%, 4% and 0.7% of trials, respectively). Even in trials developed exclusively for older patients, the primary focus lies on standard end points such as toxicity, efficacy and survival, while patient-centered outcome measures are included in less than one-fifth of studies. Currently on-going clinical trials in hematological malignancies are unlikely to significantly improve our knowledge of the optimal treatment of older patients, as those outcome measures that are of primary importance to this patient population are still included in only a minority of studies. As a scientific community, we cannot continue to simply acknowledge this issue, but must all participate in taking the necessary steps to enable the delivery of evidence-based, tailor-made and patient-focused cancer care to our rapidly growing elderly patient population.

  4. On-going clinical trials for elderly patients with a hematological malignancy: are we addressing the right end points?†

    PubMed Central

    Hamaker, M. E.; Stauder, R.; van Munster, B. C.

    2014-01-01

    Background Cancer societies and research cooperative groups worldwide have urged the development of cancer trials that address those outcome measures that are most relevant to older patients. We set out to determine the characteristics and study objectives of current clinical trials in hematological patients. Method The United States National Institutes of Health clinical trial registry was searched on 1 July 2013 for currently recruiting phase I, II or III clinical trials in hematological malignancies. Trial characteristics and study objectives were extracted from the registry website. Results In the 1207 clinical trials included in this overview, patient-centered outcome measures such as quality of life, health care utilization and functional capacity were incorporated in only a small number of trials (8%, 4% and 0.7% of trials, respectively). Even in trials developed exclusively for older patients, the primary focus lies on standard end points such as toxicity, efficacy and survival, while patient-centered outcome measures are included in less than one-fifth of studies. Conclusion Currently on-going clinical trials in hematological malignancies are unlikely to significantly improve our knowledge of the optimal treatment of older patients, as those outcome measures that are of primary importance to this patient population are still included in only a minority of studies. As a scientific community, we cannot continue to simply acknowledge this issue, but must all participate in taking the necessary steps to enable the delivery of evidence-based, tailor-made and patient-focused cancer care to our rapidly growing elderly patient population. PMID:24458474

  5. Astroinformatics, data mining and the future of astronomical research

    NASA Astrophysics Data System (ADS)

    Brescia, Massimo; Longo, Giuseppe

    2013-08-01

    Astronomy, like many other scientific disciplines, is facing a true data deluge that is bound to change both the praxis and the methodology of everyday research work. The emerging field of astroinformatics, while on the one hand crucial for facing these technological challenges, is on the other opening exciting new perspectives for astronomical discovery through the implementation of advanced data mining procedures. The complexity of astronomical data and the variety of scientific problems, however, call for innovative algorithms and methods as well as for an extreme usage of ICT technologies.

  6. Cardiopulmonary Resuscitation in Adults and Children With Mechanical Circulatory Support: A Scientific Statement From the American Heart Association.

    PubMed

    Peberdy, Mary Ann; Gluck, Jason A; Ornato, Joseph P; Bermudez, Christian A; Griffin, Russell E; Kasirajan, Vigneshwar; Kerber, Richard E; Lewis, Eldrin F; Link, Mark S; Miller, Corinne; Teuteberg, Jeffrey J; Thiagarajan, Ravi; Weiss, Robert M; O'Neil, Brian

    2017-06-13

    Cardiac arrest in patients on mechanical support is a new phenomenon brought about by the increased use of this therapy in patients with end-stage heart failure. This American Heart Association scientific statement highlights the recognition and treatment of cardiovascular collapse or cardiopulmonary arrest in an adult or pediatric patient who has a ventricular assist device or total artificial heart. Specific, expert consensus recommendations are provided for the role of external chest compressions in such patients. © 2017 American Heart Association, Inc.

  7. Electronic wastes

    NASA Astrophysics Data System (ADS)

    Regel-Rosocka, Magdalena

    2018-03-01

    The amount of e-waste is growing at about 4% annually and has become the fastest growing waste stream in the industrialized world. Over 50 million tons of e-waste are produced globally each year, and some of it ends up in landfills, posing a danger of toxic chemical leakage over time. E-waste is also sent to developing countries, where informal processing of waste electrical and electronic equipment (WEEE) causes serious health and pollution problems. A huge interest in the recovery of valuable metals from WEEE is clearly visible in a great number of scientific and popular-science publications as well as government and industrial reports.

  8. European Scientific Notes. Volume 37, Number 5,

    DTIC Science & Technology

    1983-05-31

    [OCR fragment: listings of mid-1983 meetings and associated costs, including a Maidenhead, UK meeting (4-6 May 1983), the Sixth Workshop on IMS Observations, a San Diego, CA meeting (27-29 June 1983), and events at the Aviation Psychology Lab, Ohio State Univ., and Wright-Patterson AFB (4-6 July 1983), interleaved with report documentation boilerplate.]

  9. High current/high power beam experiments from the space station

    NASA Technical Reports Server (NTRS)

    Cohen, Herbert A.

    1986-01-01

    In this overview of the possible uses of high power beams aboard the space station, the advantages of the space station compared to previous space vehicles are considered, along with the kinds of intense beams that could be generated, the possible scientific uses of these beams, and the associated problems. This order was deliberately chosen to emphasize that the means, that is, the high power particle ejection devices, will lead towards the possible ends: scientific measurements in the Earth's upper atmosphere using large fluxes of energetic particles.

  10. Digitizing the KSO white light images

    NASA Astrophysics Data System (ADS)

    Pötzi, W.

    From 1989 to 2007 the Sun was observed in white light at the Kanzelhöhe Observatory on photographic film. The images are on transparent sheet films and are currently not available to the scientific community. The films are now being scanned with a photo scanner for transparent film material and then prepared for scientific use. The programs for post-processing are already finished, and FITS and JPEG files are produced as output. The scanning should be finished by the end of 2011, and the data should then be available via our homepage.

  11. Optical Characteristics of the Marshall Space Flight Center Solar Ultraviolet Magnetograph

    NASA Technical Reports Server (NTRS)

    West, E. A.; Porter, J. G.; Davis, J. M.; Gary, G. A.; Adams, M.; Smith, S.; Hraba, J. F.

    2001-01-01

    This paper will describe the scientific objectives of the Marshall Space Flight Center (MSFC) Solar Ultraviolet Magnetograph Investigation (SUMI) and the optical components that have been developed to meet those objectives. In order to test the scientific feasibility of measuring magnetic fields in the UV, a sounding rocket payload is being developed. This paper will discuss: (1) the scientific measurements that will be made by the SUMI sounding rocket program, (2) how the optics have been optimized for simultaneous measurements of two magnetically sensitive spectral lines, C IV (1550 Angstroms) and Mg II (2800 Angstroms), and (3) the optical reflectance, transmission and polarization measurements that have been made on the SUMI telescope mirror and polarimeter.

  12. Evolution of the Two Cultures controversy

    NASA Astrophysics Data System (ADS)

    Bieniek, Ronald J.

    1981-05-01

    The Two Cultures schism is a persistent problem in our society. For over a century, scientific spokesmen and literary critics, from T. H. Huxley and M. Arnold to C. P. Snow and F. R. Leavis, have been involved in the Two Cultures issue. This article examines the evolution of the controversy between the "scientific" and "humanistic" elements of Western culture and its relation to educational policies. The division and antagonism between these two cultures appears to have arisen from differences in the human attitudes that they are perceived to engender. "Scientific" professionalism has been associated with a progressive optimism and self-assurance that nurture a broad humanitarianism, while the "literary" tradition is characterized by restraint, acceptance, and a more selective humanism.

  13. Tachycardia detection in ICDs by Boston Scientific: Algorithms, pearls, and pitfalls.

    PubMed

    Zanker, Norbert; Schuster, Diane; Gilkerson, James; Stein, Kenneth

    2016-09-01

    The aim of this study was to summarize how implantable cardioverter defibrillators (ICDs) by Boston Scientific sense, detect, discriminate rhythms, and classify episodes. Modern devices include multiple programming selections, diagnostic features, therapy options, memory functions, and device-related history features. Device operation includes logical steps from sensing, detection, discrimination, therapy delivery to history recording. The program is designed to facilitate the application of the device algorithms to the individual patient's clinical needs. Features and functions described in this article represent a selective excerpt by the authors from Boston Scientific publicly available product resources. Programming of ICDs may affect patient outcomes. Patient-adapted and optimized programming requires understanding of device operation and concepts.

  14. The GEO Geohazard Supersites and Natural Laboratories - GSNL 2.0: improving societal benefits of Geohazard science

    NASA Astrophysics Data System (ADS)

    Salvi, Stefano

    2016-04-01

    The Geohazard Supersites and Natural Laboratories initiative began with the "Frascati declaration" at the conclusion of the 3rd International Geohazards workshop of GEO held in November 2007 in Frascati, Italy. The recommendation of the workshop was "to stimulate an international and intergovernmental effort to monitor and study selected reference sites by establishing open access to relevant datasets according to GEO principles, to foster the collaboration between all various partners and end-users". This recommendation was later formalized in the GEO Work Plan as Component 2 of the GEO task DI-01, part of the GEO Disasters Societal Benefit Area. Today GSNL has grown into a voluntary collaboration among monitoring agencies, the scientific community and the CEOS space agencies, working to improve the scientific understanding of earthquake and volcanic phenomena and enable better risk assessment and emergency management. According to its principles, actions in GSNL are focused on specific areas of the world, the Supersites, for which large amounts of in situ and satellite data are made openly available to all scientists. These areas are selected based on the importance of the scientific problems as well as on the amount of population at risk, and should be evenly distributed among developed and less developed countries. Seven Supersites have been established to date, six of which are on volcanic areas (Hawaii, US; Icelandic volcanoes; Mt. Etna, IT; Campi Flegrei, IT; Ecuadorian volcanoes; Taupo, NZ) and one on a seismic area (Western North Anatolian fault, TR). One more proposal is being evaluated: the Corinth Gulf in Greece. The Supersites have succeeded in promoting new scientific developments by providing a framework for easier access to EO and in situ data. Coordination among researchers at the global scale has been achieved only where the Supersite activities were sustained through well-established projects. For some Supersites a close coordination between scientists and end-users has been established or consolidated, and the clear advantages arising from such collaboration have stimulated a new vision for the GSNL initiative (GSNL 2.0). The status of the initiative and the future developments of GSNL 2.0, aiming to increase the uptake of Supersite geohazard science by local end-users, will be presented at the meeting and discussed with the scientific community.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    The NAS Parallel Benchmarks (NPB) are a suite of parallel computer performance benchmarks. They were originally developed at the NASA Ames Research Center in 1991 to assess high-end parallel supercomputers. Although they are no longer used as widely as they once were for comparing high-end system performance, they continue to be studied and analyzed a great deal in the high-performance computing community. The acronym 'NAS' originally stood for the Numerical Aeronautical Simulation Program at NASA Ames. The name of this organization was subsequently changed to the Numerical Aerospace Simulation Program, and more recently to the NASA Advanced Supercomputing Center, although the acronym remains 'NAS.' The developers of the original NPB suite were David H. Bailey, Eric Barszcz, John Barton, David Browning, Russell Carter, Leo Dagum, Rod Fatoohi, Samuel Fineberg, Paul Frederickson, Thomas Lasinski, Rob Schreiber, Horst Simon, V. Venkatakrishnan and Sisira Weeratunga. The original NAS Parallel Benchmarks consisted of eight individual benchmark problems, each of which focused on some aspect of scientific computing. The principal focus was in computational aerophysics, although most of these benchmarks have much broader relevance, since in a much larger sense they are typical of many real-world scientific computing applications. The NPB suite grew out of the need for a more rational procedure to select new supercomputers for acquisition by NASA. The emergence of commercially available highly parallel computer systems in the late 1980s offered an attractive alternative to parallel vector supercomputers that had been the mainstay of high-end scientific computing. However, the introduction of highly parallel systems was accompanied by a regrettable level of hype, not only on the part of the commercial vendors but even, in some cases, by scientists using the systems. As a result, it was difficult to discern whether the new systems offered any fundamental performance advantage over vector supercomputers, and, if so, which of the parallel offerings would be most useful in real-world scientific computation. In part to draw attention to some of the performance reporting abuses prevalent at the time, the present author wrote a humorous essay 'Twelve Ways to Fool the Masses,' which described in a light-hearted way a number of the questionable ways in which both vendor marketing people and scientists were inflating and distorting their performance results. All of this underscored the need for an objective and scientifically defensible measure to compare performance on these systems.

  16. The next generation of data capturing - digital ink for the data stewards of the future

    NASA Astrophysics Data System (ADS)

    Czerniak, A.; Fleischer, D.; Schirnick, C.

    2012-12-01

    Data stewardship of the future requires moving from an expert-driven discipline to a general scientific routine. One way to achieve this expansion is to use data management infrastructures already in student education. Unsurprisingly, drawbacks well known from scientific data stewardship complicate this expansion into educational programs. The advantage of educational programs, which are usually based on the application of standard methods, is undermined by the common practice of capturing data only at the point of publication or at the end of a project's lifetime. If student courses are considered short projects, there are no publications, and end-of-course exams keep students, just like scientists, away from data stewardship tasks. The Kiel Data Management Infrastructure brings data capturing right into the data creation process. With this approach, student education courses can be just another use case of data capturing. Smoothing the data capturing process and making use of available technologies led the Kiel Data Management Infrastructure to prototype the use of 'digital ink' and, later on, possible handwriting recognition. Making data digitization as easy as possible without abandoning the standards of paper-based protocols is the 'Smart Pens' use case. This technology fills the gap between long-lived paper protocols and the labor-intensive digitization of field and sampling data, and it is robust enough to work with battery-powered devices. The combination of the Kiel Data Management Infrastructure with the 'digital ink' technology enables data capturing from student education to high-end scientific lab work. Valuing educational data equally to scientific lab data is a strong signal to the researchers of the future, whose work is thus recognized all the way from their undergraduate stage to their post-doc position. Students learn that their data work is never neglected, and they realize that there is no excuse for keeping any data away from the data management infrastructure. The technology of 'digital ink' is a milestone for the data stewardship discipline; it fits into many of the gaps between data creation and the data infrastructure, and as long as we do not establish lifelong data capturing support for the scientific career, we cannot complain about reluctant data submissions.

  17. Mechanical Design and Optimization of Swarm-Capable UAV Launch Systems

    DTIC Science & Technology

    2015-06-01

    The end result was the successful development and demonstration of a launching system prototype specifically developed to rapidly launch swarm-capable UAVs, meeting the requirements for the stakeholders.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Brunhart-Lupo, Nicholas J; Gruchalla, Kenny M

    This brochure describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Brunhart-Lupo, Nicholas J; Gruchalla, Kenny M

    This presentation describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  20. Electronic and geometric properties of ETS-10: QM/MM studies of cluster models.

    PubMed

    Zimmerman, Anne Marie; Doren, Douglas J; Lobo, Raul F

    2006-05-11

    Hybrid DFT/MM methods have been used to investigate the electronic and geometric properties of the microporous titanosilicate ETS-10. A comparison of finite length and periodic models demonstrates that band gap energies for ETS-10 can be well represented with relatively small cluster models. Optimization of finite clusters leads to different local geometries for bulk and end sites, where the local bulk TiO6 geometry is in good agreement with recent experimental results. Geometry optimizations reveal that any asymmetry within the axial O-Ti-O chain is negligible. The band gap in the optimized model corresponds to an O(2p) → Ti_bulk(3d) transition. The results suggest that the three-Ti-atom, single-chain, symmetric, finite cluster is an effective model for the geometric and electronic properties of bulk and end TiO6 groups in ETS-10.

  1. Optimized Two-Party Video Chat with Restored Eye Contact Using Graphics Hardware

    NASA Astrophysics Data System (ADS)

    Dumont, Maarten; Rogmans, Sammy; Maesen, Steven; Bekaert, Philippe

    We present a practical system prototype to convincingly restore eye contact between two video chat participants, with a minimal amount of constraints. The proposed six-fold camera setup is easily integrated into the monitor frame and is used to interpolate an image as if its virtual camera captured the image through a transparent screen. The peer user has a large freedom of movement, resulting in system specifications that enable genuine practical usage. Our software framework harnesses the powerful computational resources inside graphics hardware and maximizes arithmetic intensity to achieve better-than-real-time performance of up to 42 frames per second for 800×600 images. Furthermore, an optimal set of fine-tuned parameters is presented that optimizes the end-to-end performance of the application to achieve high subjective visual quality, while still allowing for further algorithmic advancement without losing real-time capability.

  2. Rapid optimization of multiple-burn rocket flights.

    NASA Technical Reports Server (NTRS)

    Brown, K. R.; Harrold, E. F.; Johnson, G. W.

    1972-01-01

    Different formulations of the fuel optimization problem for multiple burn trajectories are considered. It is shown that certain customary idealizing assumptions lead to an ill-posed optimization problem for which no solution exists. Several ways are discussed for avoiding such difficulties by more realistic problem statements. An iterative solution of the boundary value problem is presented together with efficient coast arc computations, the right end conditions for various orbital missions, and some test results.

  3. Standardized EMCS Energy Savings Calculations.

    DTIC Science & Technology

    1982-09-01

    [OCR fragment spanning the report's table of contents and the chiller optimization section: 4.12 Boiler Optimization; 4.13 Chiller Optimization; 4.14 Chilled Water Temperature Reset; 4.15 Condenser Water Temperature Reset. The chiller optimization savings apply only to chilled water plants with multiple chillers; the calculation uses the hot water temperature at the end of the shutdown period (°F), the hot water temperature setpoint To (°F), the chiller capacity TON (tons), and the average temperature of the surroundings Ts (°F).]

  4. Stock optimizing: maximizing reinforcers per session on a variable-interval schedule.

    PubMed Central

    Silberberg, A; Bauman, R; Hursh, S

    1993-01-01

    In Experiment 1, 2 monkeys earned their daily food ration by pressing a key that delivered food according to a variable-interval 3-min schedule. In Phases 1 and 4, sessions ended after 3 hr. In Phases 2 and 3, sessions ended after a fixed number of responses that reduced food intake and body weights from levels during Phases 1 and 4. Monkeys responded at higher rates and emitted more responses per food delivery when the food earned in a session was reduced. In Experiment 2, monkeys earned their daily food ration by depositing tokens into the response panel. Deposits delivered food according to a variable-interval 3-min schedule. When the token supply was unlimited (Phases 1, 3, and 5), sessions ended after 3 hr. In Phases 2 and 4, sessions ended after 150 tokens were deposited, resulting in a decrease in food intake and body weight. Both monkeys responded at lower rates and emitted fewer responses per food delivery when the food earned in a session was reduced. Experiment 1's results are consistent with a strength account, according to which the phases that reduced body weights increased food's value and therefore increased subjects' response rates. The results of Experiment 2 are consistent with an optimizing strategy, because lowering response rates when food is restricted defends body weight on variable-interval schedules. These contrasting results may be attributed to the discriminability of the contingency between response number and the end of a session being greater in Experiment 2 than in Experiment 1. In consequence, subjects lowered their response rates in order to increase the number of reinforcers per session (stock optimizing). PMID:8454960

  5. A Computational Framework for Quantifying and Optimizing the Performance of Observational Networks in 4D-Var Data Assimilation

    NASA Astrophysics Data System (ADS)

    Cioaca, Alexandru

    A deep scientific understanding of complex physical systems, such as the atmosphere, can be achieved neither by direct measurements nor by numerical simulations alone. Data assimilation is a rigorous procedure to fuse information from a priori knowledge of the system state, the physical laws governing the evolution of the system, and real measurements, all with associated error statistics. Data assimilation produces best (a posteriori) estimates of model states and parameter values, and results in considerably improved computer simulations. The acquisition and use of observations in data assimilation raises several important scientific questions related to optimal sensor network design, quantification of data impact, pruning redundant data, and identifying the most beneficial additional observations. These questions originate in operational data assimilation practice, and have started to attract considerable interest in the recent past. This dissertation advances the state of knowledge in four-dimensional variational (4D-Var) data assimilation by developing, implementing, and validating a novel computational framework for estimating observation impact and for optimizing sensor networks. The framework builds on the powerful methodologies of second-order adjoint modeling and the 4D-Var sensitivity equations. Efficient computational approaches for quantifying the observation impact include matrix-free linear algebra algorithms and low-rank approximations of the sensitivities to observations. The sensor network configuration problem is formulated as a meta-optimization problem. Best values for parameters such as sensor location are obtained by optimizing a performance criterion, subject to the constraint posed by the 4D-Var optimization. Tractable computational solutions to this "optimization-constrained" optimization problem are provided. The results of this work can be directly applied to the deployment of intelligent sensors and adaptive observations, as well as to reducing the operating costs of measuring networks, while preserving their ability to capture the essential features of the system under consideration.
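
    For orientation, the strong-constraint 4D-Var analysis referenced above minimizes a cost function of the following standard form (conventional notation, not necessarily the dissertation's):

      J(x_0) = \frac{1}{2}\,(x_0 - x_b)^{\mathsf T}\,\mathbf{B}^{-1}(x_0 - x_b)
             + \frac{1}{2}\sum_{k=0}^{N} \bigl(\mathcal{H}_k(x_k) - y_k\bigr)^{\mathsf T}\,\mathbf{R}_k^{-1}\,\bigl(\mathcal{H}_k(x_k) - y_k\bigr),
      \qquad x_k = \mathcal{M}_{0 \to k}(x_0)

    Here x_b is the background state with error covariance B, y_k are the observations at time k with error covariance R_k, H_k is the observation operator, and M_{0→k} is the model propagation. Observation impact quantifies how the minimizer, and a chosen forecast metric, respond to perturbations of the y_k, which is where the second-order adjoint enters.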

  6. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
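
    As a flavor of the second approach, a Python workflow can be as simple as composing task functions into an explicit pipeline. The names below are illustrative stand-ins, not DESC's actual interfaces:

      from typing import Callable

      def fetch_catalog(_, survey="toy"):      # stand-in for a data-access stage
          return [{"z": 0.3}, {"z": 0.7}, {"z": 1.1}]

      def select_high_z(catalog, zmin=0.5):    # stand-in for a selection stage
          return [g for g in catalog if g["z"] > zmin]

      def summarize(sample):                   # stand-in for an analysis stage
          return {"n": len(sample), "mean_z": sum(g["z"] for g in sample) / len(sample)}

      def run_pipeline(stages: list[Callable], seed=None):
          """Execute stages in order, passing each output to the next stage."""
          data = seed
          for stage in stages:
              data = stage(data)               # a real framework would also log/persist
          return data

      print(run_pipeline([fetch_catalog, select_high_z, summarize]))
      # {'n': 2, 'mean_z': 0.9}

    A real framework adds scheduling to remote resources, provenance capture, and sharing on top of this basic composition pattern.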

  7. Informing Drought Preparedness and Response with the South Asia Land Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Ghatak, D.; Matin, M. A.; Qamer, F. M.; Adhikary, B.; Bajracharya, B.; Nelson, J.; Pulla, S. T.; Ellenburg, W. L.

    2017-12-01

    Decision-relevant drought monitoring in South Asia is a challenge from both a scientific and an institutional perspective. Scientifically, climatic diversity, inconsistent in situ monitoring, complex hydrology, and incomplete knowledge of atmospheric processes mean that monitoring and prediction are fraught with uncertainty. Institutionally, drought monitoring efforts need to align with the information needs and decision-making processes of relevant agencies at national and subnational levels. Here we present first results from an emerging operational drought monitoring and forecast system developed and supported by the NASA SERVIR Hindu-Kush Himalaya hub. The system has been designed in consultation with end users from multiple sectors in South Asian countries to maximize decision-relevant information content in the monitoring and forecast products. Monitoring of meteorological, agricultural, and hydrological drought is accomplished using the South Asia Land Data Assimilation System, a platform that supports multiple land surface models and meteorological forcing datasets to characterize uncertainty, and subseasonal to seasonal hydrological forecasts are produced by driving South Asia LDAS with downscaled meteorological fields drawn from an ensemble of global dynamically-based forecast systems. Results are disseminated to end users through a Tethys online visualization platform and custom communications that provide user oriented, easily accessible, timely, and decision-relevant scientific information.

  8. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: a multiple-choice claim, an open-ended explanation, a five-point Likert scale uncertainty rating, and an open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than a critical examination of the scientific evidence resulting from the models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
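
    As an illustration of the kind of test reported here (with made-up counts, not the study's data), a chi-square test of independence between claim correctness and explanation type can be run as follows:

      from scipy.stats import chi2_contingency

      # Hypothetical 2x2 contingency table:
      # rows = claim consistent / inconsistent with scientific consensus
      # cols = model-based / knowledge-based explanation
      table = [[310, 140],
               [95, 120]]

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")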

  9. Organizing human functioning and rehabilitation research into distinct scientific fields. Part I: Developing a comprehensive structure from the cell to society.

    PubMed

    Stucki, Gerold; Grimby, Gunnar

    2007-05-01

    There is a need to organize rehabilitation and related research into distinct scientific fields in order to overcome the current limitations of rehabilitation research. Based on the general distinction in basic, applied and professional sciences applicable to research in general, and the rehabilitation relevant distinction between the comprehensive perspective based on WHO's integrative model of human functioning (ICF) and the partial perspective focusing on the biomedical aspects of functioning, it is possible to identify 5 distinct scientific fields of human functioning and rehabilitation research. These are the emerging human functioning sciences and integrative rehabilitation sciences from the comprehensive perspective, the established biosciences and biomedical rehabilitation sciences and engineering from the partial perspective, and the professional rehabilitation sciences at the cutting edge of research and practice. The human functioning sciences aim to understand human functioning and to identify targets for comprehensive interventions, with the goal of contributing to the minimization of the experience of disability in the population. The biosciences in rehabilitation aim to explain body injury and repair and to identify targets for biomedical interventions. The integrative rehabilitation sciences design and study comprehensive assessments and interventions that integrate biomedical, personal factor and environmental approaches suited to optimize people's performance. The biomedical rehabilitation sciences and engineering study diagnostic measures and interventions suitable to minimize impairment, including symptom control, and to optimize people's capacity. The professional rehabilitation sciences study how to provide best care with the goal of enabling people with health conditions experiencing or likely to experience disability to achieve and maintain optimal functioning in interaction with the environment. The organization of human functioning and rehabilitation research into the 5 distinct scientific fields facilitates the development of academic training programs and career building as well as the development of research structures dedicated to human functioning and rehabilitation research.

  10. Modular entanglement.

    PubMed

    Gualdi, Giulia; Giampaolo, Salvatore M; Illuminati, Fabrizio

    2011-02-04

    We introduce and discuss the concept of modular entanglement. This is the entanglement that is established between the end points of modular systems composed by sets of interacting moduli of arbitrarily fixed size. We show that end-to-end modular entanglement scales in the thermodynamic limit and rapidly saturates with the number of constituent moduli. We clarify the mechanisms underlying the onset of entanglement between distant and noninteracting quantum systems and its optimization for applications to quantum repeaters and entanglement distribution and sharing.

  11. 78 FR 64961 - Center for Scientific Review; Amended Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... Charles Street, Baltimore, MD 21201 which was published in the Federal Register on September 9, 2013, 78 FR 174 pgs. 55086-55087. The meeting will start on November 20, 2013 at 8:00 a.m. and will end on...

  12. [Susmann Galant (1896-1978). A Russian-Swiss supporter and opponent of Sigmund Freud].

    PubMed

    Müller, Christian

    2012-01-01

    The scientific activity of this Russian psychiatrist is depicted in a short biography. His ambivalent attitude to Freud's dream theory is emphasized. At the end of his medical career he became full professor of psychiatry at Khabarovsk.

  13. Early Rockets

    NASA Image and Video Library

    2004-04-15

    By the end of the 19th century, soldiers, sailors, and practical and not-so-practical inventors had developed a stake in rocketry. Skillful theorists, like Konstantin Tsiolkovsky in Russia, were examining the fundamental scientific theories behind rocketry. They were beginning to consider the possibility of space travel.

  14. 78 FR 2659 - Application(s) for Duty-Free Entry of Scientific Instruments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-14

    ..., 2201 West End Ave., Nashville, TN 37235. Instrument: Electron Microscope. Manufacturer: FEI Company... St., West Lafayette, IN 47907-2024. Instrument: Electron Microscope. Manufacturer: FEI Company, the..., microorganisms, nanomaterials, and chemical compounds. Justification for Duty-Free Entry: There are no...

  15. Adverse Outcome Pathways: From Research to Regulation - Scientific Workshop Report

    EPA Science Inventory

    An adverse outcome pathway (AOP) organizes existing knowledge on chemical mode of action, starting with a molecular initiating event such as receptor binding, continuing through key events, and ending with an adverse outcome such as reproductive impairment. AOPs can help identify...

  16. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail GATool, the evolutionary algorithm and software package used in this work. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. Then we study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods with EAs. Based on the obtained results, particularly on the outstanding performance of REPA on a test problem that presents significant difficulty for other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging, find all possible solutions), a complex real-life accelerator design problem (optimization of the front-end section for the future neutrino factory), and a problem of normal form defect function optimization, which is used to rigorously estimate the stability of the beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has large potential. The developed optimization scenarios and tools can be used to approach similar problems.
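
    For readers unfamiliar with the family of methods being tuned here, a minimal real-coded EA has the shape below. This is a generic sketch, not GATool or REPA; every name and parameter is invented for illustration:

      import random

      def evolve(fitness, dim, bounds, pop_size=40, gens=100, sigma=0.1, elite=2):
          """Minimal evolutionary loop: selection, crossover, Gaussian mutation."""
          lo, hi = bounds
          pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
          for _ in range(gens):
              pop.sort(key=fitness)                       # minimization
              parents = pop[:pop_size // 2]               # truncation selection
              children = []
              while len(children) < pop_size - elite:
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, dim) if dim > 1 else 0
                  child = a[:cut] + b[cut:]               # one-point crossover
                  child = [min(hi, max(lo, g + random.gauss(0, sigma))) for g in child]
                  children.append(child)
              pop = pop[:elite] + children                # elitism keeps best solutions
          return min(pop, key=fitness)

      # Toy objective: sphere function, optimum at the origin.
      best = evolve(lambda x: sum(g * g for g in x), dim=5, bounds=(-5, 5))
      print([round(g, 3) for g in best])

    The modest requirements noted above are visible here: the objective is only ever evaluated, never differentiated, which is what makes EAs tolerant of noisy or black-box fitness functions.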

  17. Science, the public, and social elites: how the general public, scientists, top politicians and managers perceive science.

    PubMed

    Prpić, Katarina

    2011-11-01

    This paper finds that the Croatian public's and the social elites' perceptions of science are a mixture of scientific and technological optimism, of the tendency to absolve science of social responsibility, of skepticism about the social effects of science, and of cognitive optimism and skepticism. However, perceptions differ significantly according to the different social roles and the wider value system of the observed groups. The survey data show some key similarities, as well as certain specificities, in the configuration of the types of views of the four groups: the public, scientists, politicians and managers. The results suggest that the well-known typology of the four cultures reveals some of the ideologies of the key actors of scientific and technological policy. The greatest social, primarily educational and socio-spatial, differentiation of the perceptions of science was found in the general public.

  18. On the optimal use of fictitious time in variation of parameters methods with application to BG14

    NASA Technical Reports Server (NTRS)

    Gottlieb, Robert G.

    1991-01-01

    The optimal way to use fictitious time in variation of parameter methods is presented. Setting fictitious time to zero at the end of each step is shown to cure the instability associated with some types of problems. Only some parameters are reinitialized, thereby retaining redundant information.

  19. ILP-based co-optimization of cut mask layout, dummy fill, and timing for sub-14nm BEOL technology

    NASA Astrophysics Data System (ADS)

    Han, Kwangsoo; Kahng, Andrew B.; Lee, Hyein; Wang, Lutong

    2015-10-01

    Self-aligned multiple patterning (SAMP), due to its low overlay error, has emerged as the leading option for 1D gridded back-end-of-line (BEOL) in sub-14nm nodes. To form actual routing patterns from a uniform "sea of wires", a cut mask is needed for line-end cutting or realization of space between routing segments. Constraints on cut shapes and minimum cut spacing result in end-of-line (EOL) extensions and non-functional (i.e. dummy fill) patterns; the resulting capacitance and timing changes must be consistent with signoff performance analyses and their impacts should be minimized. In this work, we address the co-optimization of cut mask layout, dummy fill, and design timing for sub-14nm BEOL design. Our central contribution is an optimizer based on integer linear programming (ILP) to minimize the timing impact due to EOL extensions, considering (i) minimum cut spacing arising in sub-14nm nodes; (ii) cut assignment to different cut masks (color assignment); and (iii) the eligibility to merge two unit-size cuts into a bigger cut. We also propose a heuristic approach to remove dummy fills after the ILP-based optimization by extending the usage of cut masks. Our heuristic can improve critical path performance under minimum metal density and mask density constraints. In our experiments, we study the impact of number of cut masks, minimum cut spacing and metal density under various constraints. Our studies of optimized cut mask solutions in these varying contexts give new insight into the tradeoff of performance and cost that is afforded by cut mask patterning technology options.
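
    To make the flavor of the formulation concrete, here is a toy ILP in the same spirit: binary mask assignment plus extension penalties. The instance data and variable names are invented, and the paper's actual model is far richer (timing, cut merging, density constraints); this sketch uses the open-source pulp modeler:

      import pulp

      # Invented instance: cuts, pairs violating minimum cut spacing, and the
      # timing cost of extending a line-end to move its cut out of conflict.
      cuts = ["c1", "c2", "c3", "c4"]
      conflicts = [("c1", "c2"), ("c2", "c3")]
      ext_cost = {"c1": 3.0, "c2": 1.5, "c3": 2.0, "c4": 0.5}

      prob = pulp.LpProblem("cut_mask_coloring", pulp.LpMinimize)
      mask = {c: pulp.LpVariable(f"mask_{c}", cat="Binary") for c in cuts}  # which of 2 masks
      ext = {c: pulp.LpVariable(f"ext_{c}", cat="Binary") for c in cuts}    # EOL extension?

      prob += pulp.lpSum(ext_cost[c] * ext[c] for c in cuts)  # minimize timing impact

      for a, b in conflicts:
          # diff = 1 only if the two cuts sit on different masks (linearized XOR)
          diff = pulp.LpVariable(f"diff_{a}_{b}", cat="Binary")
          prob += diff <= mask[a] + mask[b]
          prob += diff <= 2 - mask[a] - mask[b]
          # each conflicting pair is resolved by different masks or an extension
          prob += diff + ext[a] + ext[b] >= 1

      prob.solve(pulp.PULP_CBC_CMD(msg=0))
      print({c: (int(mask[c].value()), int(ext[c].value())) for c in cuts})

    On this toy instance the conflict graph is bipartite, so the solver finds a two-mask coloring with zero extensions; extensions are paid only when coloring alone cannot separate the cuts.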

  20. STAR: FPGA-based software defined satellite transponder

    NASA Astrophysics Data System (ADS)

    Davalle, Daniele; Cassettari, Riccardo; Saponara, Sergio; Fanucci, Luca; Cucchi, Luca; Bigongiari, Franco; Errico, Walter

    2013-05-01

    This paper presents STAR, a flexible Telemetry, Tracking & Command (TT&C) transponder for Earth Observation (EO) small satellites, developed in collaboration with the INTECS and SITAEL companies. With respect to state-of-the-art EO transponders, STAR includes the possibility of scientific data transfer thanks to its 40 Mbps downlink data-rate. This feature represents an important optimization in terms of hardware mass, which matters for EO small satellites. Furthermore, in-flight re-configurability of communication parameters via telecommand is important for in-orbit link optimization, which is especially useful for low-orbit satellites where visibility can be as short as a few hundred seconds. STAR exploits the principles of digital radio to minimize the analog section of the transceiver. A 70 MHz intermediate frequency (IF) is the interface with an external S/X band radio-frequency front-end. The system is composed of a dedicated configurable high-speed digital signal processing part, the Signal Processor (SP), described in technology-independent VHDL and working with a clock frequency of 184.32 MHz, and a low-speed control part, the Control Processor (CP), based on the 32-bit Gaisler LEON3 processor clocked at 32 MHz, with SpaceWire and CAN interfaces. The quantization parameters were fine-tailored to reach a trade-off between hardware complexity and implementation loss, which is less than 0.5 dB at BER = 10^-5 for the RX chain. The IF ports require 8-bit precision. The system prototype is fitted on the Xilinx Virtex 6 VLX75T-FF484 FPGA, of which a space-qualified version has been announced. The total device occupation is 82%.

  1. Defining Quality in Cardiovascular Imaging: A Scientific Statement From the American Heart Association.

    PubMed

    Shaw, Leslee J; Blankstein, Ron; Jacobs, Jill E; Leipsic, Jonathon A; Kwong, Raymond Y; Taqueti, Viviany R; Beanlands, Rob S B; Mieres, Jennifer H; Flamm, Scott D; Gerber, Thomas C; Spertus, John; Di Carli, Marcelo F

    2017-12-01

    The aims of the current statement are to refine the definition of quality in cardiovascular imaging and to propose novel methodological approaches to inform the demonstration of quality in imaging in future clinical trials and registries. We propose defining quality in cardiovascular imaging using an analytical framework put forth by the Institute of Medicine whereby quality was defined as testing being safe, effective, patient-centered, timely, equitable, and efficient. The implications of each of these components of quality health care are as essential for cardiovascular imaging as they are for other areas within health care. Our proposed statement may serve as the foundation for integrating these quality indicators into establishing designations of quality laboratory practices and developing standards for value-based payment reform for imaging services. We also include recommendations for future clinical research to fulfill quality aims within cardiovascular imaging, including clinical hypotheses of improving patient outcomes, the importance of health status as an end point, and deferred testing options. Future research should evolve to define novel methods optimized for the role of cardiovascular imaging for detecting disease and guiding treatment and to demonstrate the role of cardiovascular imaging in facilitating healthcare quality. © 2017 American Heart Association, Inc.

  2. Nutrient intake and food habits of soccer players: analyzing the correlates of eating practice.

    PubMed

    García-Rovés, Pablo M; García-Zapico, Pedro; Patterson, Angeles M; Iglesias-Gutiérrez, Eduardo

    2014-07-18

    Despite the impact and popularity of soccer, and the growing field of soccer-related scientific research, little attention has been devoted to the nutritional intake and eating habits of soccer players. Moreover, the few studies that have addressed this issue suggest that the nutritional intake of soccer players is inadequate, underscoring the need for better adherence to nutritional recommendations and the development and implementation of nutrition education programs. The objective of these programs would be to promote healthy eating habits for male and female soccer players of all ages to optimize performance and provide health benefits that last beyond the end of a player's career. To date, no well-designed nutrition education program has been implemented for soccer players. The design and implementation of such an intervention requires a priori knowledge of nutritional intake and other correlates of food selection, such as food preferences and the influence of field position on nutrient intake, as well as detailed analysis of nutritional intake on match days, on which little data is available. Our aim is to provide an up-to-date overview of the nutritional intake, eating habits, and correlates of eating practice of soccer players.

  3. The Vasimr Engine: Project Status and Recent Accomplishments

    NASA Technical Reports Server (NTRS)

    ChangDiaz, Franklin R.; Squire, Jared P.; Bering, Edgar A., III; Baitty, F. Wally; Goulding, Richard H.; Bengtson, Roger D.

    2004-01-01

    The development of the Variable Specific Impulse Magnetoplasma Rocket (VASIMR) was initiated in the late 1970s to address a critical requirement for fast, high-power interplanetary space transportation. While not a fusion rocket, it nevertheless borrows heavily from that technology and takes advantage of the natural topology of open-ended magnetic systems. In addition to its high power density and high exhaust velocity, VASIMR is capable of "constant power throttling," a feature which allows in-flight mission optimization of thrust and specific impulse to enhance performance and reduce trip time. A NASA-led research team, involving industry, academia and government facilities, is pursuing the development of this concept in the United States. The technology can be validated, in the near term, in venues such as the International Space Station, where it can also serve as both a drag-compensation device and a plasma contactor for the orbital facility. Other near-Earth applications in the commercial and scientific satellite sectors are also envisioned. This presentation covers the evolution of the VASIMR concept to its present status, as well as recent accomplishments in our understanding of the physics. Approaches and collaborative programs addressing the major technical challenges will also be presented.
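
    As a back-of-the-envelope illustration of constant-power throttling, the sketch below computes the thrust/specific-impulse trade at fixed jet power from the ideal relation P = F·ve/2 with ve = g0·Isp. The 200 kW power level and the Isp settings are illustrative assumptions, not VASIMR specifications.

        # Minimal sketch (assumes ideal conversion of jet power to exhaust
        # kinetic power, no losses): at fixed power P, thrust F = 2P/(g0*Isp),
        # so raising Isp lowers thrust and propellant flow proportionally.
        G0 = 9.80665                          # standard gravity, m/s^2

        def thrust_and_mdot(power_w, isp_s):
            """Thrust (N) and propellant mass flow (kg/s) at fixed jet power."""
            ve = G0 * isp_s                   # effective exhaust velocity, m/s
            thrust = 2.0 * power_w / ve       # from P = 0.5 * F * ve
            mdot = thrust / ve                # from F = mdot * ve
            return thrust, mdot

        for isp in (3_000, 10_000, 30_000):   # high-thrust to high-Isp settings
            f, md = thrust_and_mdot(200e3, isp)
            print(f"Isp = {isp:>6} s -> thrust = {f:6.2f} N, mdot = {md*1e6:7.2f} mg/s")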

  4. Cyclodextrins improving the physicochemical and pharmacological properties of antidepressant drugs: a patent review.

    PubMed

    Diniz, Tâmara Coimbra; Pinto, Tiago Coimbra Costa; Menezes, Paula Dos Passos; Silva, Juliane Cabral; Teles, Roxana Braga de Andrade; Ximenes, Rosana Christine Cavalcanti; Guimarães, Adriana Gibara; Serafini, Mairim Russo; Araújo, Adriano Antunes de Souza; Quintans Júnior, Lucindo José; Almeida, Jackson Roberto Guedes da Silva

    2018-01-01

    Depression is a serious mood disorder and one of the most common mental illnesses. Despite the availability of several classes of antidepressants, a substantial percentage of patients are unresponsive to these drugs, which have a slow onset of action and produce undesirable side effects. Some scientific evidence suggests that cyclodextrins (CDs) can improve the physicochemical and pharmacological profile of antidepressant drugs (ADDs). The purpose of this paper is to disclose current technology prospects involving antidepressant drugs and cyclodextrins. Areas covered: We conducted a patent review to evaluate the antidepressive activity of compounds complexed with CDs and analyzed whether these complexes improved their physicochemical properties and pharmacological action. The review searched 8 specialized patent databases using the term 'cyclodextrin' combined with 'antidepressive agents' and related terms, retrieving 608 patents. After applying the inclusion criteria, 27 patents reporting the benefits of complexation of ADDs with CDs were included. Expert opinion: The use of CDs can be considered an important tool for optimizing the physicochemical and pharmacological properties of ADDs, such as stability, solubility and bioavailability.

  5. [State of the art and future trends in technology for computed tomography dose reduction].

    PubMed

    Calzado Cantera, A; Hernández-Girón, I; Salvadó Artells, M; Rodríguez González, R

    2013-12-01

    The introduction of helical and multislice acquisitions in CT scanners, together with decreased image reconstruction times, has had a tremendous impact on radiological practice. Technological developments in the last 10 to 12 years have enabled very high quality images to be obtained in a very short time. Improved image quality has led to an increase in the number of indications for CT. In parallel, radiation exposure of patients has increased considerably. Concern about the potential health risks posed by CT imaging, reflected in diverse initiatives and actions by official bodies and scientific societies, has prompted the search for ways to reduce radiation exposure in patients without compromising diagnostic efficacy. To this end, good practice guidelines have been established, special applications have been developed for scanners, and research has been undertaken to optimize the clinical use of CT. Noteworthy technical developments incorporated in scanners include the different modes of X-ray tube current modulation, automatic selection of voltage settings, selective organ protection, adaptive collimation, and iterative reconstruction. The appropriate use of these tools to reduce radiation doses requires thorough knowledge of how they work. Copyright © 2013 SERAM. Published by Elsevier España. All rights reserved.
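
    Of the tools listed, tube current modulation is the easiest to sketch: the scanner raises the mA where the projection is strongly attenuated and lowers it where it is not, so detected quantum noise stays roughly constant over the rotation. The toy model below (elliptical patient cross-section, single effective attenuation coefficient) is an illustrative assumption, not any vendor's algorithm; real systems also bound the modulation range.

        # Toy angular tube-current modulation: hold the detected signal
        # mA * exp(-mu * path) roughly constant around the gantry rotation.
        import numpy as np

        MU_EFF = 0.02     # effective attenuation coefficient, 1/mm (assumed)

        def modulated_ma(base_ma, ap_mm, lat_mm, n_views=360):
            """Per-view mA for an elliptical cross-section (AP x lateral)."""
            angles = np.linspace(0.0, 2.0 * np.pi, n_views, endpoint=False)
            # chord length through the center of the ellipse for each view
            path = ap_mm * lat_mm / np.sqrt((lat_mm * np.cos(angles)) ** 2 +
                                            (ap_mm * np.sin(angles)) ** 2)
            # compensate the extra attenuation relative to the thinnest view
            return base_ma * np.exp(MU_EFF * (path - path.min()))

        ma = modulated_ma(base_ma=100.0, ap_mm=200.0, lat_mm=350.0)
        print(f"mA over one rotation: {ma.min():.0f} .. {ma.max():.0f}")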

  6. Nutrient Intake and Food Habits of Soccer Players: Analyzing the Correlates of Eating Practice

    PubMed Central

    García-Rovés, Pablo M.; García-Zapico, Pedro; Patterson, Ángeles M.; Iglesias-Gutiérrez, Eduardo

    2014-01-01

    Despite the impact and popularity of soccer, and the growing field of soccer-related scientific research, little attention has been devoted to the nutritional intake and eating habits of soccer players. Moreover, the few studies that have addressed this issue suggest that the nutritional intake of soccer players is inadequate, underscoring the need for better adherence to nutritional recommendations and the development and implementation of nutrition education programs. The objective of these programs would be to promote healthy eating habits for male and female soccer players of all ages to optimize performance and provide health benefits that last beyond the end of a player’s career. To date, no well-designed nutrition education program has been implemented for soccer players. The design and implementation of such an intervention requires a priori knowledge of nutritional intake and other correlates of food selection, such as food preferences and the influence of field position on nutrient intake, as well as detailed analysis of nutritional intake on match days, on which little data is available. Our aim is to provide an up-to-date overview of the nutritional intake, eating habits, and correlates of eating practice of soccer players. PMID:25045939

  7. A writing-intensive course improves biology undergraduates' perception and confidence of their abilities to read scientific literature and communicate science.

    PubMed

    Brownell, Sara E; Price, Jordan V; Steinman, Lawrence

    2013-03-01

    Most scientists agree that comprehension of primary scientific papers and communication of scientific concepts are two of the most important skills that we can teach, but few undergraduate biology courses make these explicit course goals. We designed an undergraduate neuroimmunology course that uses a writing-intensive format. Using a mixture of primary literature, writing assignments directed toward a layperson and scientist audience, and in-class discussions, we aimed to improve the ability of students to 1) comprehend primary scientific papers, 2) communicate science to a scientific audience, and 3) communicate science to a layperson audience. We offered the course for three consecutive years and evaluated its impact on student perception and confidence using a combination of pre- and postcourse survey questions and coded open-ended responses. Students showed gains in both the perception of their understanding of primary scientific papers and of their abilities to communicate science to scientific and layperson audiences. These results indicate that this unique format can teach both communication skills and basic science to undergraduate biology students. We urge others to adopt a similar format for undergraduate biology courses to teach process skills in addition to content, thus broadening and strengthening the impact of undergraduate courses.

  8. Engaging pre-service teachers to teach science contextually with scientific approach instructional video

    NASA Astrophysics Data System (ADS)

    Susantini, E.; Kurniasari, I.; Fauziah, A. N. M.; Prastowo, T.; Kholiq, A.; Rosdiana, L.

    2018-01-01

    Contextual teaching and learning (CTL) presents new concepts through real experiences and situations, in which students can find meaningful relationships between abstract ideas and practical applications. Implementing CTL with a scientific approach fosters teachers' ability to find constructive ways of delivering and organizing science content in science classroom settings. An instructional video modelling a scientific approach within CTL was therefore developed. Questionnaires with open-ended questions asked the pre-service teachers whether modelling through the instructional video could help them teach science contextually with a scientific approach. Data on the pre-service teachers' views were analyzed descriptively. The aims of this research are to engage pre-service teachers in learning how to teach CTL and to examine their responses to learning and teaching CTL using the video. Ten pre-service teachers from the science department took part; all watched videos demonstrating combined CTL and scientific-approach material and completed worksheets analyzing the video contents. The results show that the pre-service teachers could learn to teach contextually and to make use of a scientific approach in science classroom settings with the help of the model in the video.

  9. Improving Scientific Research and Writing Skills through Peer Review and Empirical Group Learning †

    PubMed Central

    Senkevitch, Emilee; Smith, Ann C.; Marbach-Ad, Gili; Song, Wenxia

    2011-01-01

    Here we describe a semester-long, multipart activity called “Read and wRite to reveal the Research process” (R3) that was designed to teach students the elements of a scientific research paper. We implemented R3 in an advanced immunology course. In R3, we paralleled the activities of reading, discussion, and presentation of relevant immunology work from primary research papers with student writing, discussion, and presentation of their own lab findings. We used reading, discussing, and writing activities to introduce students to the rationale for basic components of a scientific research paper, the method of composing a scientific paper, and the applications of course content to scientific research. As a final part of R3, students worked collaboratively to construct a Group Research Paper that reported on a hypothesis-driven research project, followed by a peer review activity that mimicked the last stage of the scientific publishing process. Assessment of student learning revealed a statistically significant gain in student performance on writing in the style of a research paper from the start of the semester to the end of the semester. PMID:23653760

  10. Scientific literacy for democratic decision-making

    NASA Astrophysics Data System (ADS)

    Yacoubian, Hagop A.

    2018-02-01

    Scientifically literate citizens must be able to engage in making decisions on science-based social issues. In this paper, I start by showing examples of science curricula and policy documents that emphasise the importance of engaging future citizens in decision-making processes, whether at the personal or the societal level. I elucidate the ideological underpinnings behind a number of the statements within those documents that have defined the trajectory of scientific literacy and have shaped what ought to be considered personal and societal benefits. I argue that science curricula and policy documents can truly endorse scientific literacy when they embed principles of democratic education at their core. The latter entails fostering learning experiences in which some of the underlying assumptions and political ideologies are brought to the conscious level and future citizens are encouraged to reflect upon them critically and explicitly. Such a proposal empowers future citizens to engage in critical deliberation on science-based social issues without taking the underlying status quo for granted. I end the paper by situating the preparation of scientifically literate citizens within a framework of democratic education, discussing conditions through which a curriculum for scientific literacy can serve democratic decision-making processes, and providing modest recommendations.

  11. Pain in children--are we accomplishing the optimal pain treatment?

    PubMed

    Lundeberg, Stefan

    2015-01-01

    Morphine, paracetamol and local anesthetics have long been, by tradition, the foremost analgesics used in pediatric patients, but they are not always sufficiently effective and are associated with side effects. The purpose of this article is to propose alternative approaches to pain management, not always backed by substantial scientific work but drawn from a combination of science and clinical experience in the field. The scientific literature has been reviewed in part, covering different aspects of pain assessment and the analgesics used to treat diverse pain conditions, with a focus on procedural and acute pain. Clinical experience has been added to form the suggested improvements for accomplishing better pain management in pediatric patients. The aim of pain management in children should be tailored analgesic medication, with an individually acceptable pain level and an optimal degree of mobilization, and with as few side effects as possible. Simple techniques of pain control are as effective as complex techniques in pediatrics, and the technique used is not of the highest importance in achieving good pain management. Increased interest in, and improved education of, the doctors prescribing analgesics are important for accomplishing better pain management. The optimal treatment with analgesics depends on the analysis of the pain's origin, and the analgesics used should be adjusted accordingly. A multimodal treatment regimen is advocated for optimal analgesic effect. © 2014 John Wiley & Sons Ltd.

  12. Does the Discussion of Socio-Scientific Issues require a Paradigm Shift in Science Teachers' Thinking?

    NASA Astrophysics Data System (ADS)

    Day, Stephen P.; Bryce, Tom G. K.

    2011-08-01

    The purpose of this study was to characterise secondary school science teachers' conceptual models of discussion, against the background that a number of researchers have found that discussion of socio-scientific issues in science classrooms is rare, somewhat discomforting for teachers, and unclear in purpose. Recent research indicates that when science teachers do engage in socio-scientific discussion, the quality is poor and teacher-centred, with pupils' views figuring only marginally (far less being clarified and integrated with their scientific learning). This has led to calls for such dialogue to be conducted by humanities teachers. The question arising from such thinking is: do science teachers hold different conceptual models of discussion from their humanities colleagues? Using semi-structured interviews, three groups of six teachers each (experienced science teachers, experienced humanities teachers, and newly qualified science teachers) were interviewed in depth in order to characterise their conceptual understanding of discussion as a teaching strategy. Analysis of the interview transcripts utilised the constant comparison approach of grounded theory. Five conceptual models of discussion emerged from the analysis of the data: discussion (1) as a teacher-mediated discourse; (2) as open-ended inquiry; (3) for the development of reasoning skills; (4) as mediated transfer of knowledge to real-life contexts; and (5) as practice for democratic citizenship. The results confirmed that the science teachers' emphasis tended towards practice for democratic citizenship, whereas the humanities teachers' emphasis was more towards open-ended inquiry and the development of reasoning skills.

  13. The PRISM project

    NASA Astrophysics Data System (ADS)

    Guilyardi, E.

    2003-04-01

    The European Union's PRISM infrastructure project (PRogram for Integrated earth System Modelling) aims at designing a flexible environment to easily assemble and run Earth System Models (http://prism.enes.org). Europe's widely distributed modelling expertise is both a strength and a challenge. Recognizing this, the PRISM project aims at developing an efficient shared modelling software infrastructure for climate scientists, providing them with an opportunity for greater focus on scientific issues, including the necessary scientific diversity (models and approaches). The proposed PRISM system includes 1) the use, or definition, and promotion of scientific and technical standards to increase component modularity, 2) an end-to-end software environment (coupler, user interface, diagnostics) to launch, monitor and analyze complex Earth System Models built around existing and future community models, 3) testing and quality standards to ensure HPC performance on a variety of platforms, and 4) community-wide input and requirements capture at all stages of system specification and design, through user/developer meetings, workshops and thematic schools. This science-driven project, led by 22 institutes* and started on December 1st, 2001, benefits from a unique gathering of scientific and technical expertise. More than 30 models (both global and regional) have expressed interest in being part of the PRISM system, and 6 types of components have been identified: atmosphere, atmospheric chemistry, land surface, ocean, sea ice and ocean biochemistry. Progress and the overall architecture design will be presented. * MPI-Met (Coordinator), KNMI (co-coordinator), MPI-M&D, Met Office, University of Reading, IPSL, Meteo-France, CERFACS, DMI, SMHI, NERSC, ETH Zurich, INGV, MPI-BGC, PIK, ECMWF, UCL-ASTR, NEC, FECIT, SGI, SUN, CCRLE
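
    PRISM's actual coupler and component interfaces are not described in this record; purely as a conceptual illustration of the coupling pattern such an environment supports, the toy below advances two independent model components and exchanges their boundary fields on a fixed coupling period. All names and numbers are invented for illustration.

        # Conceptual toy only -- not the PRISM API. Two components evolve
        # independently and see each other's state only at coupling times.
        from dataclasses import dataclass

        @dataclass
        class Component:
            name: str
            state: float                     # e.g., a mean surface temperature

            def step(self, forcing: float, dt: float) -> None:
                # relax toward the field last received from the other component
                self.state += dt * 0.1 * (forcing - self.state)

        def couple(a, b, n_steps, dt=1.0, couple_every=4):
            fa, fb = b.state, a.state        # fields exchanged at coupling times
            for i in range(n_steps):
                a.step(fa, dt)
                b.step(fb, dt)
                if (i + 1) % couple_every == 0:
                    fa, fb = b.state, a.state    # coupling period: swap fields
            return a, b

        atm, ocn = couple(Component("atmosphere", 10.0), Component("ocean", 20.0), 40)
        print(f"{atm.name}: {atm.state:.2f}  {ocn.name}: {ocn.state:.2f}")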

  14. A comparative study on stress and compliance based structural topology optimization

    NASA Astrophysics Data System (ADS)

    Hailu Shimels, G.; Dereje Engida, W.; Fakhruldin Mohd, H.

    2017-10-01

    Most structural topology optimization problems have been formulated and solved either to minimize the compliance of a structure under a volume constraint or to minimize its weight under stress constraints. Even though much research has been conducted on these two formulation techniques separately, there is no clear comparative study between the two approaches. This paper compares these formulation techniques so that an end user or designer can choose the better one for the problem at hand. Benchmark problems under the same boundary and loading conditions are defined and solved, and the results are compared across the two formulations. The simulation results show that the two formulation techniques depend on the type of loading and boundary conditions defined. The maximum stress induced in the design domain is higher when the design domain is formulated using compliance-based formulations. Optimal layouts from the compliance minimization formulation are more complex than stress-based ones, which may make manufacturing the optimal layouts challenging. Optimal layouts from compliance-based formulations depend on the amount of material to be distributed; optimal layouts from the stress-based formulation, on the other hand, depend on the type of material used to define the design domain. The high computational time of stress-based topology optimization remains a challenge because the stress constraints are defined at the element level. The results also show that adjusting the convergence criteria can be an alternative way to reduce the maximum stress developed in optimal layouts. Therefore, a designer or end user should choose a formulation based on the design domain defined and the boundary conditions considered.
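
    For reference, the two problem statements being compared are conventionally written as follows (standard density-based notation from the topology optimization literature; the symbols here are ours, not the paper's). Compliance minimization under a volume constraint:

        \begin{aligned}
        \min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{f}^{\mathsf{T}} \mathbf{u}(\boldsymbol{\rho}) \\
        \text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \qquad
        \sum_{e} \rho_e v_e \le V^{*}, \qquad 0 < \rho_{\min} \le \rho_e \le 1,
        \end{aligned}

    and weight (volume) minimization under element stress constraints:

        \begin{aligned}
        \min_{\boldsymbol{\rho}} \quad & \sum_{e} \rho_e v_e \\
        \text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \qquad
        \sigma_{e}^{\mathrm{vM}}(\boldsymbol{\rho}) \le \sigma_{\mathrm{allow}} \;\; \forall e, \qquad 0 < \rho_{\min} \le \rho_e \le 1.
        \end{aligned}

    The per-element stress constraints in the second form are what drive the computational cost noted above: their number grows with the mesh, whereas the first form carries a single global volume constraint.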

  15. Postmodernism: A Dead End in Social Work Epistemology

    ERIC Educational Resources Information Center

    Caputo, Richard; Epstein, William; Stoesz, David; Thyer, Bruce

    2015-01-01

    Postmodernism continues to have a detrimental influence on social work, questioning the Enlightenment, criticizing established research methods, and challenging scientific authority. The promotion of postmodernism by editors of "Social Work" and the "Journal of Social Work Education" has elevated postmodernism, placing it on a…

  16. 78 FR 66372 - Center for Scientific Review; Amended Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... September 05, 2013, 78 FR 54664-54665. The meeting will be held at the Hotel Monaco Baltimore, 2 North Charles Street, Baltimore, MD 21201 on November 22, 2013, starting at 08:00 a.m. and ending at 07:00 p.m...

  17. FROM ASSESSMENT TO POLICY--LESSONS LEARNED FROM THE U.S. NATIONAL ASSESSMENT (Journal Article)

    EPA Science Inventory

    The process of translating scientific information into timely and useful insights that inform policy and resource management decisions, despite the existence of uncertainties, is a difficult and challenging task. Policy-focused assessment is one approach to achieving this end. ...

  18. FROM ASSESSMENT TO POLICY: LESSONS LEARNED FROM THE U.S. NATIONAL ASSESSMENT

    EPA Science Inventory

    The process of translating scientific information into timely and useful insights that inform policy and resource management decisions, despite the existence of uncertainties, is a difficult and challenging task. Policy-focused assessment is one approach to achieving this end. I...

  19. Cheep, Chirp, Twitter, and Whistle

    ERIC Educational Resources Information Center

    Silverman, Emily; Coffman, Margaret; Younker, Betty

    2007-01-01

    This article describes an interdisciplinary, activity-based lesson plan implemented in a third/fourth-grade classroom. During these activities, students use musical concepts to think about, illustrate, and discuss animal behavior, and they use scientific concepts to motivate musical composition and performance. The lesson ends with small group…

  20. The Scientific Art of Contract Negotiations.

    ERIC Educational Resources Information Center

    Kuo, Kent; Wilson, Nancy

    2001-01-01

    Discusses negotiating with vendors when purchasing campus information technology. Describes the mechanics of a negotiation (deep discounts, maintenance and support costs, future discounts, warranties and remedies, training and consulting), vendor approaches to negotiating, composition of the negotiating team, and what to do after negotiations end.…
