Experiences in Teaching a Graduate Course on Model-Driven Software Development
ERIC Educational Resources Information Center
Tekinerdogan, Bedir
2011-01-01
Model-driven software development (MDSD) aims to support the development and evolution of software intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practices. After being accepted both by a broad community of…
Consistent Evolution of Software Artifacts and Non-Functional Models
2014-11-14
…induce bad software performance? Subject terms: Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Contact: Vittorio Cortellessa, Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy; vittorio.cortellessa@univaq.it; http://www.di.univaq.it/cortelle/
Aspect-Oriented Model-Driven Software Product Line Engineering
NASA Astrophysics Data System (ADS)
Groher, Iris; Voelter, Markus
Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed with aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
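As a concrete illustration of that "model transformed into the real thing" idea, here is a minimal model-to-text transformation sketch in Python; the dict-based metamodel and the generated class are illustrative assumptions, not OMG-defined artifacts.

```python
# Minimal sketch of an MDA-style model-to-text transformation. The metamodel
# here (a dict of classes and typed attributes) is a hypothetical stand-in.
from textwrap import indent

def generate_class(name: str, attributes: dict[str, str]) -> str:
    """Transform one model element into executable Python source."""
    params = ", ".join(f"{attr}: {typ}" for attr, typ in attributes.items())
    body = "\n".join(f"self.{attr} = {attr}" for attr in attributes) or "pass"
    return f"class {name}:\n    def __init__(self, {params}):\n" + indent(body, " " * 8)

# Platform-independent model (PIM): pure structure, no platform detail.
pim = {"Order": {"order_id": "int", "total": "float"}}

for cls, attrs in pim.items():
    print(generate_class(cls, attrs))  # the "real thing": generated code
```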
A UML-based metamodel for software evolution process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing
2014-04-01
A software evolution process is a set of interrelated software processes under which the corresponding software evolves. An object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints are presented in this paper. OO-EPMM can represent not only the software development process but also software evolution.
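The abstract does not reproduce OO-EPMM's actual OCL constraints, so the sketch below checks a hypothetical invariant of the same flavor (every evolution task must consume at least one artifact) over a toy process model.

```python
# Illustrative sketch only: the invariant checked here is a hypothetical
# example of enforcing an OCL-style rule in code, not an OO-EPMM constraint.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    inputs: list[str] = field(default_factory=list)   # consumed artifacts
    outputs: list[str] = field(default_factory=list)  # produced artifacts

def check_invariant(tasks: list[Task]) -> list[str]:
    # OCL analogue: context Task inv: self.inputs->notEmpty()
    return [t.name for t in tasks if not t.inputs]

process = [Task("refactor", ["v1.0-source"], ["v1.1-source"]),
           Task("bootstrap", [], ["v1.0-source"])]
print("violations:", check_invariant(process))  # ['bootstrap']
```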
CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010
2010-11-01
…Model of architectural design. It guides developers to apply effort to their software architecture commensurate with the risks faced by… Driven Model is the promotion of risk to prominence. It is possible to apply the Risk-Driven Model to essentially any software development process… succeed without any planned architecture work, while many high-risk projects would fail without it. The Risk-Driven Model walks a middle path
Agile IT: Thinking in User-Centric Models
NASA Astrophysics Data System (ADS)
Margaria, Tiziana; Steffen, Bernhard
We advocate a new teaching direction for modern CS curricula: extreme model-driven development (XMDD), a new development paradigm designed to continuously involve the customer/application expert throughout the whole system life cycle. Based on the `One-Thing Approach', which works by successively enriching and refining one single artifact, system development becomes in essence a user-centric orchestration of intuitive service functionality. XMDD differs radically from classical software development, which, in our opinion, is no longer adequate for the bulk of application programming - in particular when it comes to heterogeneous, cross-organizational systems which must adapt to rapidly changing market requirements. Thus there is a need for new curricula addressing this model-driven, lightweight, and cooperative development paradigm, which puts the user process at the center of development and the application expert in control of the process evolution.
NASA Astrophysics Data System (ADS)
Bock, Michael; Conrad, Olaf; Günther, Andreas; Gehrt, Ernst; Baritz, Rainer; Böhner, Jürgen
2018-04-01
We propose the implementation of the Soil and Landscape Evolution Model (SaLEM) for the spatiotemporal investigation of soil parent material evolution following a lithologically differentiated approach. Relevant parts of the established Geomorphic/Orogenic Landscape Evolution Model (GOLEM) have been adapted for an operational Geographical Information System (GIS) tool within the open-source software framework System for Automated Geoscientific Analyses (SAGA), thus taking advantage of SAGA's capabilities for geomorphometric analyses. The model is driven by palaeoclimatic data (temperature, precipitation) representative of periglacial areas in northern Germany over the last 50 000 years. The initial conditions have been determined for a test site by a digital terrain model and a geological model. Weathering, erosion and transport functions are calibrated using extrinsic (climatic) and intrinsic (lithologic) parameter data. First results indicate that our differentiated SaLEM approach shows some evidence for the spatiotemporal prediction of important soil parent material properties (particularly depth). Future research will focus on the validation of the results against field data, and the influence of discrete events (mass movements, floods) on soil parent material formation has to be evaluated.
A generic open-source software framework supporting scenario simulations in bioterrorist crises.
Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie
2013-09-01
Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led internationally to increased research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (eg, rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.
NASA Technical Reports Server (NTRS)
Butler, Roy
2013-01-01
The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.
Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M
2014-06-01
The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.
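A minimal sketch of what an XML/RDF serialization of a design looks like, built with rdflib; the namespace and property names are illustrative stand-ins, not the normative SBOL vocabulary, for which the SBOL specification is the authority.

```python
# Sketch of an XML/RDF serialization in the spirit of SBOL, using rdflib.
# The namespace and properties below are hypothetical, not normative SBOL.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

SBOL = Namespace("http://example.org/sbol-sketch#")  # hypothetical namespace

g = Graph()
g.bind("sbol", SBOL)
part = URIRef("http://example.org/parts/pTac")
g.add((part, RDF.type, SBOL.DnaComponent))
g.add((part, SBOL.displayId, Literal("pTac")))
g.add((part, SBOL.description, Literal("inducible promoter")))

print(g.serialize(format="xml"))  # exchangeable RDF/XML document
```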
The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering
ERIC Educational Resources Information Center
Cabot, Jordi; Tisi, Massimo
2011-01-01
Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…
A Comparison and Evaluation of Real-Time Software Systems Modeling Languages
NASA Technical Reports Server (NTRS)
Evensen, Kenneth D.; Weiss, Kathryn Anne
2010-01-01
A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architecture Analysis and Design Language (AADL), the Unified Modeling Language (UML), the Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.
The application of domain-driven design in NMS
NASA Astrophysics Data System (ADS)
Zhang, Jinsong; Chen, Yan; Qin, Shengjun
2011-12-01
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, which prevents requirements information from being expressed explicitly. The method also tends to lead developers toward process-oriented programming, leaving the code between modules and between layers disordered, so it is hard to meet system scalability requirements. This paper proposes a software hierarchy based on a rich domain model, following domain-driven design, named FHRDM; the Webwork + Spring + Hibernate (WSH) framework is then adopted. Domain-driven design aims to construct a domain model that meets both the demands of the field in which the software operates and the needs of software development. In this way, problems in Navigational Maritime System (NMS) development, such as large business volumes, difficult requirements elicitation, high development costs, and long development cycles, can be resolved successfully.
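A minimal sketch of the "rich domain model" idea at the heart of domain-driven design; the Voyage entity and its rule are hypothetical, not taken from the paper's NMS design.

```python
# Hedged sketch: the paper's FHRDM layering is not detailed in the abstract.
# This shows a "rich" domain entity that carries its own business behavior,
# instead of an anemic record manipulated by a separate service layer.
from dataclasses import dataclass

@dataclass
class Voyage:  # hypothetical NMS domain entity
    vessel: str
    waypoints: list[tuple[float, float]]

    def add_waypoint(self, lat: float, lon: float) -> None:
        # Domain rule lives inside the entity, not in a service layer.
        if not (-90 <= lat <= 90 and -180 <= lon <= 180):
            raise ValueError("coordinates out of range")
        self.waypoints.append((lat, lon))

v = Voyage("MV Example", [])
v.add_waypoint(31.23, 121.47)
print(v.waypoints)
```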
NASA Astrophysics Data System (ADS)
Stippich, Christian; Glasmacher, Ulrich Anton; Hackspacher, Peter
2015-04-01
The aim of the research is to quantify the long-term landscape evolution of the South Atlantic passive continental margin (SAPCM) in SE Brazil and NW Namibia. Excellent onshore outcrop conditions and complete rift to post-rift archives between Sao Paulo and Porto Alegre and in the transition from Namibia to Angola (onshore Walvis Ridge) allow a high-precision quantification of exhumation and uplift rates, influencing physical parameters, long-term acting forces, and process-response systems. Research will integrate the published and partly published thermochronological data from Brazil and Namibia, and test recently published new concepts on causes of long-term landscape evolution at rifted margins. The climate-continental margin-mantle coupled process-response system is caused by the interaction between endogenous and exogenous forces, which are related to the mantle-process-driven rift - drift - passive continental margin evolution of the South Atlantic, and to the climate change since the Early/Late Cretaceous climate maximum. Special emphasis will be given to the influence of long-living transform faults such as the Florianopolis Fracture Zone (FFZ) on the long-term topography evolution of the SAPCMs. A long-term landscape evolution model with process rates will be achieved by thermo-kinematic 3-D modeling (software codes PECUBE [1,2] and FastScape [3]). By testing model solutions obtained for a multidimensional parameter space against the real thermochronological and geomorphological data set, the most likely combinations of parameter rates and values can be constrained. The data and models will allow separating the exogenous and endogenous forces and their process rates. References: 1. Braun, J., 2003. Pecube: A new finite element code to solve the 3D heat transport equation including the effects of a time-varying, finite amplitude surface topography. Computers and Geosciences, v.29, pp.787-794. 2. Braun, J., van der Beek, P., Valla, P., Robert, X., Herman, F., Glotzbach, C., Pedersen, V., Perry, C., Simon-Labric, T., Prigent, C., 2012. Quantifying rates of landscape evolution and tectonic processes by thermochronology and numerical modeling of crustal heat transport using PECUBE. Tectonophysics, v.524-525, pp.1-28. 3. Braun, J. and Willett, S.D., 2013. A very efficient, O(n), implicit and parallel method to solve the basic stream power law equation governing fluvial incision and landscape evolution. Geomorphology, v.180-181, pp.170-179.
ISEES: an institute for sustainable software to accelerate environmental science
NASA Astrophysics Data System (ADS)
Jones, M. B.; Schildhauer, M.; Fox, P. A.
2013-12-01
Software is essential to the full science lifecycle, spanning data acquisition, processing, quality assessment, data integration, analysis, modeling, and visualization. Software runs our meteorological sensor systems, our data loggers, and our ocean gliders. Every aspect of science is impacted by, and improved by, software. Scientific advances ranging from modeling climate change to the sequencing of the human genome have been rendered possible in the last few decades due to the massive improvements in the capabilities of computers to process data through software. This pivotal role of software in science is broadly acknowledged, while simultaneously being systematically undervalued through minimal investments in maintenance and innovation. As a community, we need to embrace the creation, use, and maintenance of software within science, and address problems such as code complexity, openness, reproducibility, and accessibility. We also need to fully develop new skills and practices in software engineering as a core competency in our earth science disciplines, starting with undergraduate and graduate education and extending into university and agency professional positions. The Institute for Sustainable Earth and Environmental Software (ISEES) is being envisioned as a community-driven activity that can facilitate and galvanize activities around scientific software in an analogous way to synthesis centers such as NCEAS and NESCent that have stimulated massive advances in ecology and evolution. We will describe the results of six workshops (Science Drivers, Software Lifecycles, Software Components, Workforce Development and Training, Sustainability and Governance, and Community Engagement) that were held in 2013 to envision such an institute. We will present community recommendations from these workshops and our strategic vision for how ISEES will address the technical issues in the software lifecycle, the sustainability of the whole software ecosystem, and the critical issue of computational training for the scientific community.
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Mostly, such decisions are based on an intuitive understanding of the underlying software engineering and management process, and are therefore liable to be misjudged. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and supports project managers in assessing the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.
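A toy system-dynamics loop in the spirit of the staffing-policy decision structure; the coefficients and the training-drag rule below are illustrative assumptions, not the paper's calibrated model.

```python
# Toy system-dynamics sketch (not the paper's actual model): rising backlog
# triggers hiring, while new hires temporarily drain experienced developers'
# productivity until they assimilate.
def simulate(weeks=100, demand=12.0):
    rookies, experts, backlog = 0.0, 10.0, 0.0
    for _ in range(weeks):
        # rookies contribute 0.4 but consume 0.2 of expert time (training drag)
        productivity = experts * 1.0 + rookies * 0.4 - rookies * 0.2
        backlog = max(0.0, backlog + demand - productivity)
        hiring = 0.05 * backlog            # policy: hire in proportion to backlog
        assimilation = rookies / 12.0      # ~12 weeks to become productive
        rookies += hiring - assimilation
        experts += assimilation
    return backlog, rookies, experts

print(simulate())
```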
A software development and evolution model based on decision-making
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
1991-01-01
Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial, partly implemented instantiation of this model, called 3DPM_p, is described. The use of this model in software reuse and process management is discussed.
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
NASA Astrophysics Data System (ADS)
Istanbulluoglu, E.; Vivoni, E. R.; Ivanov, V. Y.; Bras, R. L.
2005-12-01
Landscape morphology has an important control on the spatial and temporal organization of basin hydrologic response to climate forcing, affecting soil moisture redistribution as well as vegetation function. On the other hand, erosion, driven by hydrology and modulated by vegetation, produces landforms over geologic time scales that reflect characteristic signatures of the dominant land forming process. Responding to extreme climate events or anthropogenic disturbances of the land surface, infrequent but rapid forms of erosion (e.g., arroyo development, landsliding) can modify topography such that basin hydrology is significantly influenced. Despite significant advances in both hydrologic and geomorphic modeling over the past two decades, the dynamic interactions between basin hydrology, geomorphology and terrestrial ecology are not adequately captured in current model frameworks. In order to investigate hydrologic-geomorphic-ecologic interactions at the basin scale we present initial efforts in integrating the CHILD landscape evolution model (Tucker et al. 2001) with the tRIBS hydrology model (Ivanov et al. 2004), both developed in a common software environment. In this talk, we present preliminary results of the numerical modeling of the coupled evolution of basin hydro-geomorphic response and resulting landscape morphology in two sets of examples. First, we discuss the long-term evolution of both the hydrologic response and the resulting basin morphology from an initially uplifted plateau. In the second set of modeling experiments, we implement changes in climate and land-use to an existing topography and compare basin hydrologic response to the model results when landscape form is fixed (e.g. no coupling between hydrology and geomorphology). Model results stress the importance of internal basin dynamics, including runoff generation mechanisms and hydrologic states, in shaping hydrologic response as well as the importance of employing comprehensive conceptualizations of hydrology in modeling landscape evolution.
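A cartoon of the coupling loop described above, with toy stand-ins for the hydrology and erosion components; neither function represents tRIBS or CHILD.

```python
# Cartoon of a coupled hydrology-geomorphology loop: a hydrology step feeds
# erosion, which updates the topography seen by the next hydrology step.
def hydrology_step(elevation):
    # runoff proportional to local downslope drop (toy stand-in for tRIBS)
    return [max(elevation[i] - elevation[i + 1], 0.0)
            for i in range(len(elevation) - 1)]

def erosion_step(elevation, runoff, k=0.05):
    # detachment-limited toy rule (stand-in for CHILD): erode by k * runoff
    return [z - k * q for z, q in zip(elevation[:-1], runoff)] + [elevation[-1]]

z = [5.0, 4.0, 3.0, 2.0, 1.0, 0.0]       # initially uplifted profile
for _ in range(100):
    q = hydrology_step(z)
    z = erosion_step(z, q)
print([round(v, 2) for v in z])
```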
Toward a user-driven approach to radiology software solutions: putting the wag back in the dog.
Morgan, Matthew; Mates, Jonathan; Chang, Paul
2006-09-01
The relationship between healthcare providers and the software industry is evolving. In many cases, industry's traditional, market-driven model is failing to meet the increasingly sophisticated and appropriately individualized needs of providers. Advances in both technology infrastructure and development methodologies have set the stage for the transition from a vendor-driven to a more user-driven process of solution engineering. To make this transition, providers must take an active role in the development process and vendors must provide flexible frameworks on which to build. Only then can the provider/vendor relationship mature from a purchaser/supplier to a codesigner/partner model, where true insight and innovation can occur.
Test Driven Development: Lessons from a Simple Scientific Model
NASA Astrophysics Data System (ADS)
Clune, T. L.; Kuo, K.
2010-12-01
In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
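TDD in miniature, under the assumption of a deliberately simple growth rule: the tests are written first against a hypothetical grow() step, which is then implemented to make them pass.

```python
# TDD sketch: the tests below were written first, against a hypothetical
# snowflake-growth step (mass grows in proportion to supersaturation).
import unittest

def grow(mass: float, supersaturation: float, dt: float, k: float = 1e-3) -> float:
    """One growth step: dm = k * supersaturation * dt."""
    return mass + k * supersaturation * dt

class TestGrowth(unittest.TestCase):
    def test_mass_increases_when_supersaturated(self):
        self.assertGreater(grow(1.0, supersaturation=0.1, dt=1.0), 1.0)

    def test_no_growth_at_equilibrium(self):
        self.assertEqual(grow(1.0, supersaturation=0.0, dt=1.0), 1.0)

if __name__ == "__main__":
    unittest.main()
```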
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
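CASRE itself is menu-driven, but the kind of model it executes can be sketched directly: a fit of the Goel-Okumoto mean value function mu(t) = a(1 - e^(-bt)) to cumulative failure counts. The data below are invented for illustration.

```python
# Fitting a software reliability growth model (Goel-Okumoto NHPP) to
# invented cumulative failure-count data, as reliability tools like CASRE do.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

t = np.arange(1, 11, dtype=float)                     # test weeks
failures = np.array([5, 9, 13, 15, 18, 20, 21, 22, 23, 23], dtype=float)

(a, b), _ = curve_fit(goel_okumoto, t, failures, p0=(25.0, 0.3))
print(f"estimated total defects a={a:.1f}, detection rate b={b:.2f}")
print(f"residual defects after week 10: {a - goel_okumoto(10.0, a, b):.1f}")
```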
Components for Atomistic-to-Continuum Multiscale Modeling of Flow in Micro- and Nanofluidic Systems
Adalsteinsson, Helgi; Debusschere, Bert J.; Long, Kevin R.; ...
2008-01-01
Micro- and nanofluidics pose a series of significant challenges for science-based modeling. Key among those are the wide separation of length- and timescales between interface phenomena and bulk flow and the spatially heterogeneous solution properties near solid-liquid interfaces. It is not uncommon for characteristic scales in these systems to span nine orders of magnitude from the atomic motions in particle dynamics up to evolution of mass transport at the macroscale level, making explicit particle models intractable for all but the simplest systems. Recently, atomistic-to-continuum (A2C) multiscale simulations have gained a lot of interest as an approach to rigorously handle particle-level dynamics while also tracking evolution of large-scale macroscale behavior. While these methods are clearly not applicable to all classes of simulations, they are finding traction in systems in which tight-binding, and physically important, dynamics at system interfaces have complex effects on the slower-evolving large-scale evolution of the surrounding medium. These conditions allow decomposition of the simulation into discrete domains, either spatially or temporally. In this paper, we describe how features of domain decomposed simulation systems can be harnessed to yield flexible and efficient software for multiscale simulations of electric field-driven micro- and nanofluidics.
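A deliberately tiny, heavily simplified cartoon of spatial domain decomposition in an atomistic-to-continuum coupling; real A2C schemes treat the handshake region far more carefully.

```python
# Cartoon only: a particle region resolves one "interface" cell of a 1D
# diffusion problem, and the two descriptions exchange state each step.
import random

def step(conc, particles, D=0.1, cell=0):
    # Continuum update (explicit diffusion) everywhere except the boundaries.
    new = conc[:]
    for i in range(1, len(conc) - 1):
        new[i] = conc[i] + D * (conc[i - 1] - 2 * conc[i] + conc[i + 1])
    # Particle update in the interface cell: random walkers hop in and out.
    leaving = sum(1 for _ in range(particles) if random.random() < D)
    new[cell + 1] += leaving / 100.0                       # particles -> continuum
    particles += int(100 * D * conc[cell + 1]) - leaving   # continuum -> particles
    new[cell] = particles / 100.0   # report particle cell as a concentration
    return new, max(particles, 0)

conc, particles = [1.0] + [0.0] * 9, 100   # 100 particles represent conc = 1.0
for _ in range(50):
    conc, particles = step(conc, particles)
print([round(c, 2) for c in conc])
```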
Model Driven Engineering with Ontology Technologies
NASA Astrophysics Data System (ADS)
Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva
Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.
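A minimal sketch of the kind of inference an ontology-based metamodel enables (transitive subsumption), written in plain Python rather than a description-logic reasoner; the concept names are hypothetical.

```python
# Transitive subsumption over modeling concepts: the flavor of reasoning an
# ontology-based metamodel makes available to software modeling tools.
SUBCLASS_OF = {
    "StatefulWidget": "Widget",
    "Widget": "ModelElement",
    "Service": "ModelElement",
}

def is_a(concept: str, ancestor: str) -> bool:
    while concept in SUBCLASS_OF:
        concept = SUBCLASS_OF[concept]
        if concept == ancestor:
            return True
    return concept == ancestor

print(is_a("StatefulWidget", "ModelElement"))  # True, inferred transitively
```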
The Evolution of Big Data and Learning Analytics in American Higher Education
ERIC Educational Resources Information Center
Picciano, Anthony G.
2012-01-01
Data-driven decision making, popularized in the 1980s and 1990s, is evolving into a vastly more sophisticated concept known as big data that relies on software approaches generally referred to as analytics. Big data and analytics for instructional applications are in their infancy and will take a few years to mature, although their presence is…
Documentation Driven Development for Complex Real-Time Systems
2004-12-01
This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This... time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main... stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.
Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects
ERIC Educational Resources Information Center
Buffardi, Kevin John
2014-01-01
Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with the introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…
AST: Activity-Security-Trust driven modeling of time varying networks.
Wang, Jian; Xu, Jiake; Liu, Yanheng; Deng, Weiwen
2016-02-18
Network modeling is a flexible mathematical structure that enables identification of statistical regularities and structural principles hidden in complex systems. Most recent driving forces in modeling complex networks originate from activity, in which an activity potential of a time-invariant function is introduced to identify agents' interactions and to construct an activity-driven model. However, newly emerging network evolutions are deeply coupled not only with explicit factors (e.g. activity) but also with implicit considerations (e.g. security and trust), so more intrinsic driving forces should be integrated into the modeling of time-varying networks. Agents undoubtedly seek to build a time-dependent trade-off among activity, security, and trust in generating a new connection to another. Thus, we propose the Activity-Security-Trust (AST) driven model, which synthetically considers the explicit and implicit driving forces (activity, security, and trust) underlying the decision process. The AST-driven model facilitates more accurate capture of highly dynamical network behaviors and helps figure out the complex evolution process, allowing a profound understanding of the effects of security and trust in driving network evolution, and reducing the biases induced by involving only activity representations in analyzing the dynamical processes.
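The abstract does not give the AST model's functional form, so the generative step below uses an assumed weighted trade-off of the three driving forces purely for illustration.

```python
# Sketch of an AST-style generative step. The weighted trade-off here is an
# illustrative assumption, not the authors' equation.
import random

def ast_step(agents, w_activity=0.5, w_security=0.3, w_trust=0.2):
    """One time slice of a time-varying network: return the new edge list."""
    edges = []
    for i, a in enumerate(agents):
        score = (w_activity * a["activity"] + w_security * a["security"]
                 + w_trust * a["trust"])
        if random.random() < score:                  # agent becomes active
            j = random.choice([k for k in range(len(agents)) if k != i])
            edges.append((i, j))
    return edges

agents = [{"activity": random.random(), "security": random.random(),
           "trust": random.random()} for _ in range(20)]
print(ast_step(agents))
```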
Hite, Jessica L; Cressler, Clayton E
2018-05-05
What drives the evolution of parasite life-history traits? Recent studies suggest that linking within- and between-host processes can provide key insight into both disease dynamics and parasite evolution. Still, it remains difficult to understand how to pinpoint the critical factors connecting these cross-scale feedbacks, particularly under non-equilibrium conditions; many natural host populations inherently fluctuate and parasites themselves can strongly alter the stability of host populations. Here, we develop a general model framework that mechanistically links resources to parasite evolution across a gradient of stable and unstable conditions. First, we dynamically link resources and between-host processes (host density, stability, transmission) to virulence evolution, using a 'non-nested' model. Then, we consider a 'nested' model where population-level processes (transmission and virulence) depend on resource-driven changes to individual-level (within-host) processes (energetics, immune function, parasite production). Contrary to 'non-nested' model predictions, the 'nested' model reveals complex effects of host population dynamics on parasite evolution, including regions of evolutionary bistability; evolution can push parasites towards strongly or weakly stabilizing strategies. This bistability results from dynamic feedbacks between resource-driven changes to host density, host immune function and parasite production. Together, these results highlight how cross-scale feedbacks can provide key insights into the structuring role of parasites and parasite evolution. This article is part of the theme issue 'Anthropogenic resource subsidies and host-parasite dynamics in wildlife'.
Model-Driven Development for scientific computing. An upgrade of the RHEEDGr program
NASA Astrophysics Data System (ADS)
Daniluk, Andrzej
2009-11-01
Model-Driven Engineering (MDE) is the software engineering discipline which considers models the most important element for software development, and for the maintenance and evolution of software, through model transformation. Model-Driven Architecture (MDA) is the approach for software development under the Model-Driven Engineering framework. This paper surveys the core MDA technology that was used to upgrade the RHEEDGR program to the C++0x language standard. New version program summary. Program title: RHEEDGR-09. Catalogue identifier: ADUY_v3_0. Program summary URL:
Research to Operations of Ionospheric Scintillation Detection and Forecasting
NASA Astrophysics Data System (ADS)
Jones, J.; Scro, K.; Payne, D.; Ruhge, R.; Erickson, B.; Andorka, S.; Ludwig, C.; Karmann, J.; Ebelhar, D.
Ionospheric Scintillation refers to random fluctuations in phase and amplitude of electromagnetic waves caused by a rapidly varying refractive index due to turbulent features in the ionosphere. Scintillation of transionospheric UHF and L-Band radio frequency signals is particularly troublesome since this phenomenon can lead to degradation of signal strength and integrity that can negatively impact satellite communications and navigation, radar, or radio signals from other systems that traverse or interact with the ionosphere. Although ionospheric scintillation occurs in both the equatorial and polar regions of the Earth, the focus of this modeling effort is on equatorial scintillation. The ionospheric scintillation model is data-driven in the sense that scintillation observations are used to perform detection and characterization of scintillation structures. These structures are then propagated to future times using drift and decay models to represent the natural evolution of ionospheric scintillation. The impact on radio signals is also determined by the model and presented in graphical format to the user. A frequency scaling algorithm allows for impact analysis on frequencies other than the observation frequencies. The project began with lab-grade software and, through a tailored Agile development process, deployed operational-grade code to a DoD operational center. The Agile development process promotes adaptive planning, evolutionary development, early delivery, continuous improvement, and regular collaboration with the customer, and encourages rapid and flexible response to customer-driven changes. The Agile philosophy values individuals and interactions over processes and tools, working software over comprehensive documentation, customer collaboration over contract negotiation, and responding to change over following a rigid plan. The end result was an operational capability that met customer expectations. Details of the model and the process of operational integration are discussed, as well as lessons learned to improve performance on future projects.
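A hedged sketch of the drift-and-decay forecast idea; the drift velocity and decay time below are placeholders, not the operational model's calibrated values.

```python
# Sketch: an observed scintillation structure is advected at a drift
# velocity while its intensity decays exponentially. Rates are placeholders.
from dataclasses import dataclass
import math

@dataclass
class Structure:
    lon_deg: float   # geographic longitude of the structure
    s4: float        # scintillation intensity index

def propagate(s: Structure, minutes: float, drift_deg_per_min=0.03,
              decay_tau_min=90.0) -> Structure:
    return Structure(lon_deg=s.lon_deg + drift_deg_per_min * minutes,
                     s4=s.s4 * math.exp(-minutes / decay_tau_min))

obs = Structure(lon_deg=-60.0, s4=0.8)
print(propagate(obs, minutes=60))  # forecast one hour ahead
```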
Designing an optimal software intensive system acquisition: A game theoretic approach
NASA Astrophysics Data System (ADS)
Buettner, Douglas John
The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case-study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning, based on the case study evidence and Austin's agency model, describes why schedule-driven engineers cut corners on inspections and unit testing. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also lead to the suggestion that a multi-player dynamic Nash bargaining game provides a solution to the observed lack-of-quality game between the government (the acquirer) and "large-corporation" software developers. A closing note argues that this multi-player dynamic Nash bargaining game also solves Freeman Dyson's problem of finding a way to place a label of good or bad on systems.
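Latin Hypercube sampling of model inputs can be sketched with scipy; the parameter names and bounds are illustrative, not the study's actual calibration.

```python
# Latin Hypercube sampling of three input distributions, as used to drive a
# system dynamics model; names and bounds below are illustrative only.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=200)      # 200 stratified samples in the unit cube
# columns: inspection effort share, unit-test effort share, schedule pressure
lo, hi = [0.0, 0.0, 0.5], [0.3, 0.3, 1.5]
samples = qmc.scale(unit, lo, hi)
print(samples[:3])
```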
Conceptual Modeling in the Time of the Revolution: Part II
NASA Astrophysics Data System (ADS)
Mylopoulos, John
Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.
Model driven development of clinical information systems using openEHR.
Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim
2011-01-01
openEHR and the recent international standard (ISO 13606) define a model-driven software development methodology for health information systems. However, there is little evidence in the literature describing implementation, especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives has been defined and presented, which guides the automatic graphical user interface generator to render widgets properly. We also describe the development steps and important design decisions, from modelling to the final software product. This may provide guidance for other developers and form the evidence required for the adoption of these standards by vendors and national programs alike.
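A sketch of archetype-driven GUI generation (in Python rather than the paper's C#): a simplified node description is mapped to a widget by a table of GUI directives. The directive and widget names are hypothetical, although the DV_* types are real openEHR data types.

```python
# Sketch only: real openEHR archetypes (ADL) are far richer than this
# simplified node list; widget names and the directive key are hypothetical.
WIDGET_FOR_TYPE = {
    "DV_TEXT": "TextBox",
    "DV_CODED_TEXT": "ComboBox",
    "DV_QUANTITY": "NumericSpinner",
    "DV_BOOLEAN": "CheckBox",
}

def render(node: dict) -> str:
    # an explicit GUI directive overrides the default widget for the type
    widget = node.get("gui_directive") or WIDGET_FOR_TYPE[node["rm_type"]]
    return f'{widget}(label="{node["label"]}")'

archetype = [
    {"label": "Findings", "rm_type": "DV_TEXT"},
    {"label": "Polyp count", "rm_type": "DV_QUANTITY"},
    {"label": "Biopsy taken", "rm_type": "DV_BOOLEAN", "gui_directive": "ToggleSwitch"},
]
print("\n".join(render(n) for n in archetype))
```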
Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir
2011-01-01
Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353
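Not MEGA itself, but the flavor of computation such software automates: the Jukes-Cantor distance d = -(3/4) ln(1 - 4p/3) between two aligned sequences, where p is the proportion of differing sites.

```python
# Jukes-Cantor corrected evolutionary distance between two aligned sequences.
import math

def jukes_cantor(seq1: str, seq2: str) -> float:
    assert len(seq1) == len(seq2)
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

print(round(jukes_cantor("ACGTACGTAA", "ACGTACGAAA"), 4))  # 1 difference in 10 sites
```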
Diffusive smoothing of surfzone bathymetry by gravity-driven sediment transport
NASA Astrophysics Data System (ADS)
Moulton, M. R.; Elgar, S.; Raubenheimer, B.
2012-12-01
Gravity-driven sediment transport often is assumed to have a small effect on the evolution of nearshore morphology. Here, it is shown that down-slope gravity-driven sediment transport is an important process acting to smooth steep bathymetric features in the surfzone. Gravity-driven transport can be modeled as a diffusive term in the sediment continuity equation governing temporal (t) changes in bed level (h): ∂h/∂t ≈ κ∇²h, where κ is a sediment diffusion coefficient that is a function of the bed shear stress (τb) and sediment properties, such as the grain size and the angle of repose. Field observations of waves, currents, and the evolution of large excavated holes (initially 10-m wide and 2-m deep, with sides as steep as 35°) in an energetic surfzone are consistent with diffusive smoothing by gravity. Specifically, comparisons of κ estimated from the measured bed evolution with those estimated with numerical model results for several transport theories suggest that gravity-driven sediment transport dominates the bed evolution, with κ proportional to a power of τb. The models are initialized with observed bathymetry and forced with observed waves and currents. The diffusion coefficients from the measurements and from the model simulations were on average of order 10⁻⁵ m²/s, implying evolution time scales of days for features with length scales of 10 m. The dependence of κ on τb varies for different transport theories and for high and low shear stress regimes. The US Army Corps of Engineers Field Research Facility, Duck, NC, provided excellent logistical support. Funded by a National Security Science and Engineering Faculty Fellowship, a National Defense Science and Engineering Graduate Fellowship, and the Office of Naval Research.
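The reported equation and κ magnitude translate directly into a minimal explicit finite-difference sketch; the grid, time step, and initial hole geometry are illustrative.

```python
# Explicit finite-difference sketch of dh/dt = kappa * laplacian(h) in 1D;
# kappa matches the reported order of magnitude, the rest is illustrative.
import numpy as np

kappa, dx, dt = 1e-5, 0.5, 1000.0   # m^2/s, m, s (stable: kappa*dt/dx^2 < 0.5)
h = np.zeros(41)
h[10:31] = -2.0                     # idealized ~10 m wide, 2 m deep hole

for _ in range(5000):               # ~58 days of smoothing
    h[1:-1] += kappa * dt / dx**2 * (h[2:] - 2 * h[1:-1] + h[:-2])

print(f"max depth after smoothing: {h.min():.2f} m")
```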
Mantini, Dante; Hasson, Uri; Betti, Viviana; Perrucci, Mauro G.; Romani, Gian Luca; Corbetta, Maurizio; Orban, Guy A.; Vanduffel, Wim
2012-01-01
Evolution-driven functional changes in the primate brain are typically assessed by aligning monkey and human activation maps using cortical surface expansion models. These models use putative homologous areas as registration landmarks, assuming they are functionally correspondent. In cases where functional changes have occurred in an area, this assumption makes it impossible to reveal whether other areas may have assumed the lost functions. Here we describe a method to examine functional correspondences across species. Without making spatial assumptions, we assess similarities in sensory-driven functional magnetic resonance imaging responses between monkey (Macaca mulatta) and human brain areas by means of temporal correlation. Using natural vision data, we reveal regions for which functional processing has shifted to topologically divergent locations during evolution. We conclude that substantial evolution-driven functional reorganizations have occurred, not always consistent with cortical expansion processes. This novel framework for evaluating changes in functional architecture is crucial to building more accurate evolutionary models. PMID:22306809
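The temporal-correlation idea reduces to a few lines on synthetic data: time courses evoked by a shared stimulus are compared across species without any spatial registration.

```python
# Sketch with synthetic data: a monkey area's time course is compared by
# temporal correlation against candidate human areas viewing the same movie.
import numpy as np

rng = np.random.default_rng(0)
stimulus = rng.standard_normal(300)                   # shared natural-vision drive
monkey_area = stimulus + 0.5 * rng.standard_normal(300)
human_areas = {"V1": stimulus + 0.5 * rng.standard_normal(300),
               "A1": rng.standard_normal(300)}        # unrelated control area

for name, ts in human_areas.items():
    r = np.corrcoef(monkey_area, ts)[0, 1]
    print(f"monkey area vs human {name}: r = {r:.2f}")
```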
The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.
ERIC Educational Resources Information Center
Bontis, Nick; Chung, Honsan
2000-01-01
Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…
Testing the Accuracy of Data-driven MHD Simulations of Active Region Evolution and Eruption
NASA Astrophysics Data System (ADS)
Leake, J. E.; Linton, M.; Schuck, P. W.
2017-12-01
Models for the evolution of the solar coronal magnetic field are vital for understanding solar activity, yet the best measurements of the magnetic field lie at the photosphere, necessitating the recent development of coronal models which are "data-driven" at the photosphere. Using magnetohydrodynamic simulations of active region formation and our recently created validation framework, we investigate the source of errors in data-driven models that use surface measurements of the magnetic field, and derived MHD quantities, to model the coronal magnetic field. The primary sources of errors in these studies are the temporal and spatial resolution of the surface measurements. We will discuss the implications of these studies for accurately modeling the build-up and release of coronal magnetic energy based on photospheric magnetic field observations.
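The resolution question can be illustrated by degrading a synthetic boundary field and measuring what a driver at that resolution would miss; this is purely illustrative, not the study's validation framework.

```python
# Degrade a synthetic "photospheric" boundary field in space, then measure
# the error in what the data-driven boundary layer actually sees.
import numpy as np

x = np.linspace(0, 2 * np.pi, 256)
bz = np.sin(4 * x) + 0.3 * np.sin(16 * x)      # synthetic magnetogram slice

for stride in (1, 4, 16):
    coarse = bz[::stride]                       # coarser instrument sampling
    recon = np.interp(x, x[::stride], coarse)   # what the driver reconstructs
    err = np.sqrt(np.mean((recon - bz) ** 2))
    print(f"sampling stride {stride:2d}: RMS driving error = {err:.3f}")
```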
Model-Driven Development for PDS4 Software and Services
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Algermissen, S. S.; Cayanan, M. D.; Joyner, R. S.; Hardman, S. H.; Padams, J. H.
2018-04-01
PDS4 data product labels provide the information necessary for processing the referenced digital object. However, significantly more information is available in the PDS4 Information Model. This additional information is made available for use by both software and services to configure, promote resiliency, and improve interoperability.
Software evolution. What kind of evolution?
NASA Astrophysics Data System (ADS)
Torres-Carbonell, J. J.; Parets-Llorca, J.
2001-06-01
Most Software Systems capable of adapting to the environment or of performing some kind of adaptive activity (such as pattern learning, behavior simulations and the like) use concepts and models from Biology. Nevertheless, such approaches are based on the Modern Synthesis, i.e., Darwinism plus Mendelism, and this implies preadaptive mutations in, and subsequent selection of, the better adapted individuals. These pre-adaptive changes usually do not produce the desired effect, are virtually useless and require some kind of backtracking for the system to obtain profit from adaptation. It is our contention that an evolutionary approach in Software Systems development cannot be based on pre-adaptive mutations, but rather on post-adaptive ones, that is, anticipatory mutations and modifications (Lamarckism). A novel way of understanding evolution in Software Systems based on applied Lamarckism is presented, and a framework that allows the incorporation of modifications according to the necessities of the system and the will of the modeller is proposed.
Towards Model-Driven End-User Development in CALL
ERIC Educational Resources Information Center
Farmer, Rod; Gruba, Paul
2006-01-01
The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…
NASA Astrophysics Data System (ADS)
Stippich, Christian; Krob, Florian; Glasmacher, Ulrich A.; Hackspacher, Peter C.
2016-04-01
The aim of the research is to quantify the long-term evolution of the western South Atlantic passive continental margin (SAPCM) in SE Brazil. Excellent onshore outcrop conditions and extensive pre-rift to post-rift archives between São Paulo and Laguna allow a high-precision quantification of exhumation and rock uplift rates, influencing physical parameters, long-term acting forces, and process-response systems. Research will integrate published [1] and partly published thermochronological data from Brazil, and test recently published new concepts on causes of long-term landscape and lithospheric evolution in southern Brazil. Six distinct lithospheric blocks (Laguna, Florianópolis, Curitiba, Ilha Comprida, Peruibe and Santos), which are separated by fracture zones [1], are characterized by individual thermochronological age spectra. Furthermore, the thermal evolution derived by numerical modeling indicates variable post-rift exhumation histories of these blocks. In this context, we will provide information on the causes of the complex exhumation history of the Florianópolis and adjacent blocks. The climate-continental margin-mantle coupled process-response system is caused by the interaction between endogenous and exogenous forces, which are related to the mantle-process-driven rift - drift - passive continental margin evolution of the South Atlantic, and to the climate change since the Early/Late Cretaceous climate maximum. Special emphasis will be given to the influence of long-living transform faults such as the Florianopolis Fracture Zone (FFZ) on the long-term topography evolution of the SAPCMs. A long-term landscape evolution model with process rates will be achieved by thermo-kinematic 3-D modeling (software codes PECUBE [2,3] and FastScape [4]). By testing model solutions obtained for a multidimensional parameter space against the real thermochronological and geomorphological data set, the most likely combinations of parameter rates and values can be constrained. The data and models will allow separating the exogenous and endogenous forces and their process rates. References: 1. Karl, M., Glasmacher, U.A., Kollenz, S., Franco-Magalhaes, A.O.B., Stockli, D.F., Hackspacher, P., 2013. Evolution of the South Atlantic passive continental margin in southern Brazil derived from zircon and apatite (U-Th-Sm)/He and fission-track data. Tectonophysics, v.604, pp.224-244. 2. Braun, J., 2003. Pecube: A new finite element code to solve the 3D heat transport equation including the effects of a time-varying, finite amplitude surface topography. Computers and Geosciences, v.29, pp.787-794. 3. Braun, J., van der Beek, P., Valla, P., Robert, X., Herman, F., Glotzbach, C., Pedersen, V., Perry, C., Simon-Labric, T., Prigent, C., 2012. Quantifying rates of landscape evolution and tectonic processes by thermochronology and numerical modeling of crustal heat transport using PECUBE. Tectonophysics, v.524-525, pp.1-28. 4. Braun, J. and Willett, S.D., 2013. A very efficient, O(n), implicit and parallel method to solve the basic stream power law equation governing fluvial incision and landscape evolution. Geomorphology, v.180-181, pp.170-179.
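The "test model solutions over a multidimensional parameter space" step amounts to a misfit scan; the sketch below uses a placeholder forward model and synthetic data, not PECUBE.

```python
# Brute-force misfit scan over a 2-D parameter space against "observed"
# (synthetic) cooling ages. The forward model is a placeholder, not PECUBE.
import numpy as np

def forward_model(erosion_rate, geothermal_gradient, t):
    # hypothetical predicted cooling-age curve
    return 100.0 - erosion_rate * t - 0.1 * geothermal_gradient

t = np.array([10.0, 20.0, 30.0])
observed = np.array([85.0, 73.0, 61.0])          # synthetic "measured" ages

rates = np.linspace(0.5, 2.0, 16)
gradients = np.linspace(20.0, 40.0, 21)
misfit = np.array([[np.sum((forward_model(r, g, t) - observed) ** 2)
                    for g in gradients] for r in rates])
i, j = np.unravel_index(misfit.argmin(), misfit.shape)
print(f"best fit: erosion rate {rates[i]:.2f} mm/yr, gradient {gradients[j]:.1f} C/km")
```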
NASA Astrophysics Data System (ADS)
Duffy, Christopher; Leonard, Lorne; Shi, Yuning; Bhatt, Gopal; Hanson, Paul; Gil, Yolanda; Yu, Xuan
2015-04-01
Using a series of recent examples and papers, we explore some progress and potential for virtual (cyber-) collaboration inspired by access to high resolution, harmonized public-sector data at continental scales [1]. The first example describes 7 meso-scale catchments in Pennsylvania, USA where the watershed is forced by climate reanalysis and IPCC future climate scenarios (Intergovernmental Panel on Climate Change). We show how existing public-sector data and community models are currently able to resolve fine-scale eco-hydrologic processes regarding wetland response to climate change [2]. The results reveal that regional climate change is only part of the story, with large variations in flood and drought response associated with differences in terrain, physiography, landuse and/or hydrogeology. The importance of community-driven virtual testbeds is demonstrated in the context of Critical Zone Observatories, where earth scientists from around the world are organizing hydro-geophysical data and model results to explore new processes that couple hydrologic models with land-atmosphere interaction, biogeochemical weathering, carbon-nitrogen cycle, landscape evolution and ecosystem services [3][4]. Critical Zone cyber-research demonstrates how data-driven model development requires a flexible computational structure where process modules are relatively easy to incorporate and where new data structures can be implemented [5]. From the perspective of "Big-Data" the paper points out that extrapolating results from virtual observatories to catchments at continental scales will require centralized or cloud-based cyberinfrastructure as a necessary condition for effectively sharing petabytes of data and model results [6]. Finally we outline how innovative cyber-science is supporting earth-science learning, sharing and exploration through the use of on-line tools where hydrologists and limnologists are sharing data and models for simulating the coupled impacts of catchment hydrology on lake eco-hydrology (NSF-INSPIRE, IIS1344272). The research attempts to use a virtual environment (www.organicdatascience.org) to break down disciplinary barriers and support emergent communities of science. [1] Source: Leonard and Duffy, 2013, Environmental Modelling & Software; [2] Source: Yu et al, 2014, Computers in Geoscience; [3] Source: Duffy et al, 2014, Procedia Earth and Planetary Science; [4] Source: Shi et al, Journal of Hydrometeorology, 2014; [5] Source: Bhatt et al, 2014, Environmental Modelling & Software; [6] Leonard and Duffy, 2014, Environmental Modelling and Software.
Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E
2015-06-16
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented in the MATLAB programming language. Robustness of the software in providing reliable fits with multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was thus implemented and tested using simulations, and its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.
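For orientation, the standard Tofts model is one of the kinetic models such suites commonly fit: C_t(t) = Ktrans * integral of C_p(tau) exp(-(Ktrans/v_e)(t - tau)) dtau. ROCKETSHIP itself is MATLAB; the sketch below shows the same fit in Python with an invented arterial input function and synthetic noise, so all values are illustrative.

import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 150)                       # time [min]
cp = 3.0 * t * np.exp(-t / 0.6)                  # toy arterial input function [mM]
dt = t[1] - t[0]

def tofts(t, ktrans, ve):
    # discrete convolution of the AIF with the exponential leakage kernel
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(cp, kernel)[: t.size] * dt

rng = np.random.default_rng(0)
ct = tofts(t, 0.25, 0.3) + rng.normal(0, 0.005, t.size)   # synthetic tissue curve
(ktrans, ve), _ = curve_fit(tofts, t, ct, p0=[0.1, 0.2], bounds=(1e-6, 5))
print(f"Ktrans = {ktrans:.3f} /min, ve = {ve:.3f}")

A nested-model analysis of the kind the paper describes would fit progressively richer models (e.g., adding a plasma volume term) and keep the extra parameter only when a statistical test justifies it.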
Adopting best practices: "Agility" moves from software development to healthcare project management.
Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge
2006-01-01
It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved from the unwieldy, mysterious machines of the past to a ubiquitous presence in every aspect of daily life and work. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change as an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.
Evolution of Secondary Software Businesses: Understanding Industry Dynamics
NASA Astrophysics Data System (ADS)
Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko
The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. Outsourcing an enterprise's internal software development activity in this way is a common means of starting a new software business serving a vertical software market, as it combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business that makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.
Instrumentation: Software-Driven Instrumentation: The New Wave.
ERIC Educational Resources Information Center
Salit, M. L.; Parsons, M. L.
1985-01-01
Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…
Simulation of a Canard in Fluid Flow Driven by a Piezoelectric Beam with a Software Control Loop
2014-04-01
The canard is actuated by a piezoelectric beam that bends as voltage is applied. The voltage is controlled by a software subroutine that measures... [Keywords: dynamic system, modeling, co-simulation, simulation, Abaqus, finite element analysis (FEA), finite element method (FEM), computational...] [Report contents: Introduction; Model Description; Fluid Model; Structural Model; Control Subroutine; Results.]
Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development
NASA Astrophysics Data System (ADS)
Jasiak, M. E.; Truslove, I.; Savoie, M.
2013-12-01
In scientific research, software development exists fundamentally for the results it creates. The core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This value system is frequently at odds with creating software in a sustainable fashion: short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies, including Scrum, Kanban, Continuous Delivery and Test-Driven Development, have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as on promoting interaction between developers and researchers. This presentation will describe NSIDC's current scientific software development model, how it addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile- and lean-influenced model, and the current challenges faced by the organization.
Cognitive algorithms: dynamic logic, working of the mind, evolution of consciousness and cultures
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.
2007-04-01
The paper discusses the evolution of consciousness driven by the knowledge instinct, a fundamental mechanism of the mind which determines its higher cognitive functions. Dynamic logic mathematically describes the knowledge instinct. It overcomes past mathematical difficulties encountered in modeling intelligence and relates intelligence to mechanisms of concepts, emotions, instincts, consciousness and the unconscious. The two main aspects of the knowledge instinct are differentiation and synthesis. Differentiation is driven by dynamic logic and proceeds from vague and unconscious states to more crisp and conscious states, from less knowledge to more knowledge at each hierarchical level of the mind. Synthesis is driven by dynamic logic operating in a hierarchical organization of the mind; it strives to achieve unity and meaning of knowledge: every concept finds its deeper and more general meaning at a higher level. These mechanisms stand in a complex relationship of symbiosis and opposition, which leads to complex dynamics in the evolution of consciousness and cultures. Modeling these dynamics in a population leads to predictions for the evolution of consciousness and cultures. Cultural predictive models can be compared with experimental data and used to improve human conditions. We discuss existing evidence and future research directions.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.
Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel
2018-02-20
Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he may neglect or give less weight to other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification by monitoring the execution of the models on the target. The framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model interpretation, which enforces timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
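The paper's schedulability analysis is novel and CPAL-specific, but classic fixed-priority response-time analysis conveys the flavor of such a verification step: iterate R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j to a fixed point and compare against the deadline. The task set below is invented.

import math

tasks = [(1.0, 5.0), (2.0, 10.0), (3.0, 20.0)]   # (WCET, period), highest priority first

def response_time(i):
    C, T = tasks[i]
    R = C
    while True:   # fixed-point iteration over interference from higher-priority tasks
        R_next = C + sum(math.ceil(R / Tj) * Cj for Cj, Tj in tasks[:i])
        if R_next > T:
            return None                           # deadline (= period) missed
        if R_next == R:
            return R
        R = R_next

for i, (C, T) in enumerate(tasks):
    R = response_time(i)
    print(f"task {i}: response time {R}, deadline {T}:", "OK" if R is not None else "MISS")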
Dependability modeling and assessment in UML-based software development.
Bernardi, Simona; Merseguer, José; Petriu, Dorina C
2012-01-01
Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.
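As a tiny illustration of step (c), a dependability model can be as small as a two-state repairable-component chain whose steady state gives availability; the paper's generated DSPN models are far richer, and the rates below are invented.

import numpy as np

lam, mu = 1 / 1000.0, 1 / 8.0         # failure and repair rates [1/h], illustrative
Q = np.array([[-lam, lam],            # CTMC generator: state 0 = up, 1 = down
              [mu, -mu]])
A = np.vstack([Q.T, np.ones(2)])      # solve pi Q = 0 subject to sum(pi) = 1
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"steady-state availability = {pi[0]:.6f} (analytic {mu / (lam + mu):.6f})")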
Large-scale and Long-duration Simulation of a Multi-stage Eruptive Solar Event
NASA Astrophysics Data System (ADS)
Jiang, Chaowei; Hu, Qiang; Wu, S. T.
2015-04-01
We employ a data-driven 3D MHD active region evolution model using the Conservation Element and Solution Element (CESE) numerical method. This newly developed model retains the full MHD effects, allowing time-dependent boundary conditions and time-evolution studies. The time-dependent simulation is driven by measured vector magnetograms and the method of MHD characteristics on the bottom boundary. We have applied the model to investigate the coronal magnetic field evolution of AR11283, which was characterized by a pre-existing sigmoid structure in the core region and multiple eruptions on both relatively small and large scales. We have succeeded in reproducing the core magnetic field structure and the subsequent eruptions of flux-rope structures (see https://dl.dropboxusercontent.com/u/96898685/large.mp4 for an animation) as the measured vector magnetograms on the bottom boundary evolve in time with constant flux emergence. The whole process, lasting about an hour in real time, compares well with the corresponding SDO/AIA and coronagraph imaging observations. These results demonstrate the capability of this largely data-driven model to simulate topologically complex and highly dynamic active region evolution. (We acknowledge partial support of NSF grants AGS 1153323 and AGS 1062050, and data support from the SDO/HMI and AIA teams.)
Consistent model driven architecture
NASA Astrophysics Data System (ADS)
Niepostyn, Stanisław J.
2015-09-01
The goal of MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify requirement errors at an early stage of the development process. Verifying consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g., a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
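A consistency rule of the kind such an approach automates can be illustrated with a toy check that every message in a sequence diagram names an operation declared on the receiving class; the model elements below are invented, and real rules operate on full UML abstract syntax rather than dictionaries.

class_diagram = {"Order": {"create", "cancel"}, "Invoice": {"issue"}}
sequence_diagram = [("Customer", "Order", "create"),
                    ("Order", "Invoice", "issue"),
                    ("Customer", "Order", "refund")]    # deliberately inconsistent

def check_consistency(classes, messages):
    # flag messages whose operation is missing from the receiver's class
    return [f"message '{op}' has no matching operation on class '{rcv}'"
            for _, rcv, op in messages if op not in classes.get(rcv, set())]

for error in check_consistency(class_diagram, sequence_diagram):
    print("inconsistency:", error)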
NASA Technical Reports Server (NTRS)
Barnes, Jeffrey M.
2011-01-01
All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.
The Need for Software Architecture Evaluation in the Acquisition of Software-Intensive Sysetms
2014-01-01
[Glossary fragment] Function and Performance Specification; GIG: Global Information Grid; ISO: International Standard Organisation; MDA: Model Driven Architecture. [Text fragment] ...architecture and design, which is a key part of a knowledge-based economy... Allow Australian SMEs to...
Composable Framework Support for Software-FMEA Through Model Execution
NASA Astrophysics Data System (ADS)
Kocsis, Imre; Patricia, Andras; Brancati, Francesco; Rossi, Francesco
2016-08-01
Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human-effort-intensive, even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.
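The essence of an executable error propagation model can be sketched as reachability over a component data-flow graph: inject a failure mode and execute the propagation to see which components are affected. The architecture below is invented; the paper's framework derives such models from executable design-language semantics rather than a hand-written graph.

edges = {"sensor": ["filter"], "filter": ["controller"],
         "controller": ["actuator"], "actuator": []}     # data-flow graph

def propagate(injected):
    # walk the graph outward from the injected failure mode
    affected, frontier = set(), [injected]
    while frontier:
        component = frontier.pop()
        if component not in affected:
            affected.add(component)
            frontier.extend(edges[component])
    return affected

for failure_mode in edges:
    print(f"failure in {failure_mode} -> affects {sorted(propagate(failure_mode))}")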
USDA-ARS?s Scientific Manuscript database
A model for the evolution of pyrolysis products in a fluidized bed has been developed. In this study the unsteady constitutive transport equations for inert gas flow and decomposition kinetics were modeled using the commercial computational fluid dynamics (CFD) software FLUENT-12. The model system d...
Real-time computing platform for spiking neurons (RT-spike).
Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael
2006-07-01
A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
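The cost the authors describe comes from the gradual conductance decay, which forces fine time-stepping. A minimal time-driven Euler version of such a neuron, with illustrative constants (membrane and synaptic parameters are invented, not taken from the paper), looks like this:

import numpy as np

dt, T = 1e-4, 0.2                          # step [s], duration [s]
C, tau_m, tau_s = 200e-12, 20e-3, 5e-3     # capacitance, membrane and synaptic tau
v_rest, v_th, E_syn = -70e-3, -54e-3, 0.0  # rest, threshold, synaptic reversal [V]
g_leak, w = C / tau_m, 8e-9                # leak conductance, synaptic weight [S]
g, v, out = 0.0, v_rest, []
spikes_in = np.arange(0.02, T, 0.02)       # presynaptic spike times [s]

for k in range(int(T / dt)):
    t = k * dt
    if np.any(np.abs(spikes_in - t) < dt / 2):
        g += w                             # spike arrival opens the conductance
    g -= dt * g / tau_s                    # gradual exponential decay (charge injection)
    v += dt * (g_leak * (v_rest - v) + g * (E_syn - v)) / C
    if v >= v_th:
        out.append(t); v = v_rest          # fire and reset
print("output spikes [ms]:", [round(1e3 * s, 1) for s in out])

Every synapse contributes to every time step here, which is exactly why the authors move this stage into parallel hardware.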
The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.
ERIC Educational Resources Information Center
Dede, Chris
1995-01-01
Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)
Modeling the Evolution of a Science Project in Software-Reliant System Acquisition Programs
2013-07-24
might: • Limit worker burnout • Perform better regarding schedule. Key preliminary finding: the tipping point contributes to the "90% Done" Syndrome. [Causal-loop diagram labels: percentage complete, worker burnout, user satisfaction, moderating user satisfaction, overage switch, demand switch.]
A posteriori operation detection in evolving software models
Langer, Philip; Wimmer, Manuel; Brosch, Petra; Herrmannsdörfer, Markus; Seidl, Martina; Wieland, Konrad; Kappel, Gerti
2013-01-01
Like every software artifact, software models are subject to continuous evolution. The operations applied between two successive versions of a model are crucial for understanding its evolution. Generic approaches for detecting operations a posteriori identify atomic operations, but neglect composite operations, such as refactorings, which leads to cluttered difference reports. To tackle this limitation, we present an orthogonal extension of existing atomic operation detection approaches that also detects composite operations. Our approach searches for occurrences of composite operations within a set of detected atomic operations in a post-processing manner. One major benefit is that the specifications available for executing composite operations are reused for detecting applications of them. We evaluate the accuracy of the approach in a real-world case study and investigate the scalability of our implementation in an experiment. PMID:23471366
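The post-processing idea can be illustrated with a toy detector for one composite operation, "pull up attribute" (the same attribute deleted from several subclasses and added to their common superclass); the operation encoding and tiny metamodel below are invented.

atomic_ops = [("del_attr", "Car", "owner"), ("del_attr", "Truck", "owner"),
              ("add_attr", "Vehicle", "owner"), ("ren_class", "Bus", "Coach")]
superclass = {"Car": "Vehicle", "Truck": "Vehicle", "Bus": "Vehicle"}

def detect_pull_up(ops):
    # composite occurrence = matching add on the superclass + >= 2 subclass deletes
    adds = {(cls, attr) for kind, cls, attr in ops if kind == "add_attr"}
    hits = {}
    for kind, cls, attr in ops:
        if kind == "del_attr" and (superclass[cls], attr) in adds:
            hits.setdefault((superclass[cls], attr), []).append(cls)
    return {k: v for k, v in hits.items() if len(v) >= 2}

print("composite operations found:", detect_pull_up(atomic_ops))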
Test Driven Development of Scientific Models
NASA Technical Reports Server (NTRS)
Clune, Thomas L.
2012-01-01
Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addictive after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
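A minimal test-first example conveys the rhythm. pFUnit plays this role for parallel Fortran; the sketch below uses Python with pytest, and the module and function names are invented.

# test_trapezoid.py -- written first, before the implementation exists
import math
from trapezoid import integrate

def test_integrates_sine_over_half_period():
    # the integral of sin(x) on [0, pi] is exactly 2
    assert math.isclose(integrate(math.sin, 0.0, math.pi, n=1000), 2.0, rel_tol=1e-5)

def test_zero_width_interval_returns_zero():
    assert integrate(math.sin, 1.0, 1.0, n=10) == 0.0

# trapezoid.py -- the minimal implementation written to make the tests pass
def integrate(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

Running pytest executes both tests; the next failing test then drives the next increment of functionality.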
A Python Calculator for Supernova Remnant Evolution
NASA Astrophysics Data System (ADS)
Leahy, D. A.; Williams, J. E.
2017-05-01
A freely available Python code for modeling supernova remnant (SNR) evolution has been created. This software is intended for two purposes: to understand SNR evolution and to help model observations of SNRs in order to obtain good estimates of SNR properties. It includes all phases of the standard evolutionary path for spherically symmetric SNRs. In addition, alternate evolutionary models are available, including evolution in a cloudy ISM, the fractional energy-loss model, and evolution in a hot low-density ISM. The graphical interface takes in various parameters and produces outputs such as shock radius and velocity versus time, as well as the SNR surface brightness profile and spectrum. Some interesting properties of SNR evolution are demonstrated using the program.
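The adiabatic (Sedov-Taylor) phase of that standard path is easy to reproduce by hand, which is useful for sanity-checking such a calculator; the explosion energy and ambient density below are typical illustrative inputs, not defaults of the published code.

import numpy as np

XI, MU, M_H = 1.15, 1.4, 1.67e-24       # Sedov constant, mean molecular weight, H mass [g]
E, n0 = 1e51, 1.0                        # explosion energy [erg], ambient density [cm^-3]
rho = MU * M_H * n0
for t_kyr in (1, 5, 10, 20):
    t = t_kyr * 3.156e10                 # kyr -> s
    R = XI * (E * t**2 / rho) ** 0.2     # shock radius, R ~ t^(2/5) [cm]
    v = 0.4 * R / t                      # shock velocity dR/dt [cm/s]
    print(f"t = {t_kyr:2d} kyr: R = {R / 3.086e18:5.1f} pc, v = {v / 1e5:5.0f} km/s")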
Formation and Assembly of Massive Star Clusters
NASA Astrophysics Data System (ADS)
McMillan, Stephen
The formation of stars and star clusters is a major unresolved problem in astrophysics. It is central to modeling stellar populations and understanding galaxy luminosity distributions in cosmological models. Young massive clusters are major components of starburst galaxies, while globular clusters are cornerstones of the cosmic distance scale and represent vital laboratories for studies of stellar dynamics and stellar evolution. Yet how these clusters form and how rapidly and efficiently they expel their natal gas remain unclear, as do the consequences of this gas expulsion for cluster structure and survival. Also unclear is how the properties of low-mass clusters, which form from small-scale instabilities in galactic disks and inform much of our understanding of cluster formation and star-formation efficiency, differ from those of more massive clusters, which probably formed in starburst events driven by fast accretion at high redshift, or colliding gas flows in merging galaxies. Modeling cluster formation requires simulating many simultaneous physical processes, placing stringent demands on both software and hardware. Simulations of galaxies evolving in cosmological contexts usually lack the numerical resolution to simulate star formation in detail. They do not include detailed treatments of important physical effects such as magnetic fields, radiation pressure, ionization, and supernova feedback. Simulations of smaller clusters include these effects, but fall far short of the mass of even single young globular clusters. With major advances in computing power and software, we can now directly address this problem. We propose to model the formation of massive star clusters by integrating the FLASH adaptive mesh refinement magnetohydrodynamics (MHD) code into the Astrophysical Multi-purpose Software Environment (AMUSE) framework, to work with existing stellar-dynamical and stellar evolution modules in AMUSE. All software will be freely distributed on-line, allowing open access to state-of- the-art simulation techniques within a modern, modular software environment. We will follow the gravitational collapse of 0.1-10 million-solar mass gas clouds through star formation and coalescence into a star cluster, modeling in detail the coupling of the gas and the newborn stars. We will study the effects of star formation by detecting accreting regions of gas in self-gravitating, turbulent, MHD, FLASH models that we will translate into collisional dynamical systems of stars modeled with an N-body code, coupled together in the AMUSE framework. Our FLASH models will include treatments of radiative transfer from the newly formed stars, including heating and radiative acceleration of the surrounding gas. Specific questions to be addressed are: (1) How efficiently does the gas in a star forming region form stars, how does this depend on mass, metallicity, and other parameters, and what terminates star formation? What observational predictions can be made to constrain our models? (2) How important are different mechanisms for driving turbulence and removing gas from a cluster: accretion, radiative feedback, and mechanical feedback? (3) How does the infant mortality rate of young clusters depend on the initial properties of the parent cloud? (4) What are the characteristic formation timescales of massive star clusters, and what observable imprints does the assembly process leave on their structure at an age of 10-20 Myr, when formation is essentially complete and many clusters can be observed? 
These studies are directly relevant to NASA missions at many electromagnetic wavelengths, including Chandra, GALEX, Hubble, and Spitzer. Each traces different aspects of cluster formation and evolution: X-rays trace supernovae, ultraviolet traces young stars, visible colors can distinguish between young blue stars and older red stars, and the infrared directly shows young embedded star clusters.
Geodynamic modeling of the capture and release of a plume conduit by a migrating mid-ocean ridge
NASA Astrophysics Data System (ADS)
Hall, P. S.
2011-12-01
plates over the relatively stationary, long-lived conduits of mantle plumes. However, paleomagnetic data from the Hawaii-Emperor Seamount Chain suggest that the Hawaiian hotspot moved rapidly (~40 mm/yr) between 81 and 47 Ma [Tarduno et al., 2003]. Recently, Tarduno et al. [2009] suggested that this period of rapid motion might be the surface expression of a plume conduit returning to a largely vertical orientation after having been captured and tilted as a result of being "run over" by a migrating mid-ocean ridge. I report on a series of analog geodynamic experiments designed to characterize the evolution of a plume conduit as a mid-ocean ridge migrates over it. Experiments were conducted in a clear acrylic tank (100 cm x 70 cm x 50 cm) filled with commercial-grade high-fructose corn syrup. Plate-driven flow is modeled by dragging two sheets of Mylar film (driven by independent DC motors) in opposite directions over the surface of the fluid. Ridge migration is achieved by moving the point at which the Mylar sheets diverge using a separate motor drive. Buoyant plume flow is generated using a small electrical heater placed at the bottom of the tank. Plate velocities and the ridge migration rate are controlled, and plume temperature monitored, using LabView software. Experiments are recorded on digital video, which is then analyzed using digital image analysis software to track the position and shape of the plume conduit throughout the course of the experiment. The intersection of the plume conduit with the surface of the fluid is taken as an analog for the locus of hotspot volcanism and tracked as a function of time to obtain a hotspot migration rate. Results show that the plume conduit experiences significant tilting immediately following the passage of the migrating ridge.
Evolution of Flexible Multibody Dynamics for Simulation Applications Supporting Human Spaceflight
NASA Technical Reports Server (NTRS)
Huynh, An; Brain, Thomas A.; MacLean, John R.; Quiocho, Leslie J.
2016-01-01
During the course of transition from the Space Shuttle and International Space Station programs to the Orion and Journey to Mars exploration programs, a generic flexible multibody dynamics formulation and associated software implementation has evolved to meet an ever changing set of requirements at the NASA Johnson Space Center (JSC). Challenging problems related to large transitional topologies and robotic free-flyer vehicle capture/ release, contact dynamics, and exploration missions concept evaluation through simulation (e.g., asteroid surface operations) have driven this continued development. Coupled with this need is the requirement to oftentimes support human spaceflight operations in real-time. Moreover, it has been desirable to allow even more rapid prototyping of on-orbit manipulator and spacecraft systems, to support less complex infrastructure software for massively integrated simulations, to yield further computational efficiencies, and to take advantage of recent advances and availability of multi-core computing platforms. Since engineering analysis, procedures development, and crew familiarity/training for human spaceflight is fundamental to JSC's charter, there is also a strong desire to share and reuse models in both the non-realtime and real-time domains, with the goal of retaining as much multibody dynamics fidelity as possible. Three specific enhancements are reviewed here: (1) linked list organization to address large transitional topologies, (2) body level model order reduction, and (3) parallel formulation/implementation. This paper provides a detailed overview of these primary updates to JSC's flexible multibody dynamics algorithms as well as a comparison of numerical results to previous formulations and associated software.
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for the verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.
Evaluation of software maintainability with openEHR - a comparison of architectures.
Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, James R
2014-11-01
To assess whether it is easier to maintain a clinical information system developed using openEHR model-driven development than one developed using mainstream methods. A new open source application (GastrOS) has been developed following openEHR's multi-level modelling approach using .Net/C#, based on the same requirements as an existing clinically used application developed using Microsoft Visual Basic and an Access database. In the latter, almost all the domain knowledge was embedded in the software code and data model. The same domain knowledge has been expressed as a set of openEHR Archetypes in GastrOS. We then introduced eight real-world change requests that had accumulated during live clinical usage, and implemented these in both systems while measuring the time for various development tasks and the change in software size for each change request. Overall it took half the time to implement the changes in GastrOS. However, it was the more difficult application to modify for one change request, suggesting that the nature of the change is also important. It was not possible to implement changes by modelling only. Comparison of relative measures of time and software size change within each application highlights how architectural differences affected maintainability across change requests. The use of openEHR model-driven development can result in better software maintainability. The degree to which openEHR affects software maintainability depends on the extent and nature of the domain knowledge involved in changes. Although we used relative measures for time and software size, confounding factors could not be totally excluded, as a controlled study design was not feasible. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Modeling Dynamic Evolution of Online Friendship Network
NASA Astrophysics Data System (ADS)
Wu, Lian-Ren; Yan, Qiang
2012-10-01
In this paper, we study the dynamic evolution of the friendship network in an SNS (Social Networking Site). Our analysis suggests that an individual joining a community depends not only on the number of friends he or she has within the community, but also on the friendship network generated by those friends. In addition, we propose a model based on two processes: first, connecting nearest neighbors; second, a strength-driven attachment mechanism. The model reflects two facts: first, in social networks it is a universal phenomenon that two nodes are connected when they have at least one common neighbor; second, new nodes connect more likely to nodes which have larger weights and interactions, a phenomenon called strength-driven attachment (also called weight-driven attachment). From the simulation results, we find that the degree distribution P(k), the strength distribution P(s), and the degree-strength correlation are all consistent with empirical data.
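A minimal simulation of the two processes, with invented network sizes and weight increments, already produces the heavy-tailed degree distribution the paper reports:

import random
random.seed(1)

strength = {0: 1.0, 1: 1.0}
adj = {0: {1}, 1: {0}}

def pick_by_strength():
    nodes, weights = zip(*strength.items())
    return random.choices(nodes, weights=weights)[0]   # strength-driven attachment

for new in range(2, 500):
    target = pick_by_strength()
    links = {target, random.choice(list(adj[target]))}  # also close a triangle
    adj[new] = set()
    for v in links:
        adj[new].add(v); adj[v].add(new)
        strength[v] = strength.get(v, 0.0) + 1.0        # interactions raise strength
    strength[new] = float(len(links))

print(f"{len(adj)} nodes, max degree {max(len(v) for v in adj.values())}")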
Software Product Lines: Report of the 2009 U.S. Army Software Product Line Workshop
2009-04-01
record system was fielded in 2008. One early challenge for Overwatch was coming up with a funding model that would support core asset development (a... match the organizational model to the funding model. Product line architecture is essential. Address product line requirements up front. Put processes... when trying to move from a customer-driven, product-specific funding model to one in which at least some of the funds are allocated to the creation and
NASA Astrophysics Data System (ADS)
Zhang, C.; Scholz, C. A.
2016-12-01
The sedimentary basins in the East African Rift are considered excellent modern examples for investigating the sedimentary infilling and evolution of extensional systems. Some lakes in the western branch of the rift have formed within single-segment systems, including Lake Albert and Lake Edward. The largest and oldest lakes developed within multi-segment systems, and these include Lake Tanganyika and Lake Malawi. This research aims to explore the processes of erosion and sedimentary infilling of the catchment area in single-segment rift (SSR) and multi-segment rift (MSR) systems. We consider different conditions of regional precipitation and evaporation, and assess the resulting facies architecture through forward modeling, using state-of-the-art commercial basin modeling software. Dionisos is a three-dimensional numerical stratigraphic forward modeling software program, which simulates basin-scale sediment transport based on empirical water- and gravity-driven diffusion equations. It has classically been used to quantify the sedimentary architecture and basin infilling of both marine siliciclastic and carbonate environments; here, we apply the approach to continental rift basin environments. In this research, two scenarios are developed, one for an MSR and the other for an SSR. The modeled systems simulate the ratio of drainage area to lake surface area observed in modern Lake Tanganyika and Lake Albert, which are examples of MSRs and SSRs, respectively. The main parameters, such as maximum subsidence rate, water- and gravity-driven diffusion coefficients, rainfall, and evaporation, are approximated using these real-world examples. The results of 5-million-year model runs with 50,000-year time steps show that MSRs are characterized by a deep-water lake with relatively modest sediment accumulation, while SSRs are characterized by a nearly overfilled lake with shallow water depths and thick sediment accumulation. The preliminary modeling results conform to the features of the sedimentary infills revealed by seismic reflection data acquired in Lake Tanganyika and Lake Albert. Future models will refine the rainfall and evaporation parameters in these two scenarios to better evaluate the detailed basin facies architecture.
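The water- and gravity-driven diffusion at the core of such models reduces, in one dimension, to dh/dt = K d²h/dx² plus source terms for subsidence and sediment supply. The sketch below is a deliberately crude illustration of that principle with invented rates, not a Dionisos configuration.

import numpy as np

nx, dx = 200, 500.0                       # cells, spacing [m]
dt, nsteps = 50.0, 40000                  # step [yr], 2 Myr total
K = 0.2                                   # diffusivity [m^2/yr]
h = np.zeros(nx)                          # initial flat surface [m]
subsidence = np.linspace(0.0, 1e-4, nx)   # subsidence increases basinward [m/yr]

for _ in range(nsteps):
    h -= subsidence * dt                                        # create accommodation
    h[0] += 2e-4 * dt                                           # sediment supply at margin
    h[1:-1] += K * dt * (h[2:] - 2 * h[1:-1] + h[:-2]) / dx**2  # diffusive transport
print(f"margin: {h[0]:.1f} m, basin floor: {h[-1]:.1f} m")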
HOW SIGNIFICANT IS RADIATION PRESSURE IN THE DYNAMICS OF THE GAS AROUND YOUNG STELLAR CLUSTERS?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silich, Sergiy; Tenorio-Tagle, Guillermo, E-mail: silich@inaoep.mx
2013-03-01
The impact of radiation pressure on the dynamics of the gas in the vicinity of young stellar clusters is thoroughly discussed. The time evolution of the ratio of radiation pressure to thermal/ram pressure is calculated explicitly, and the crucial roles of the cluster mechanical power, the strong time evolution of the ionizing photon flux, and the bolometric luminosity of the exciting cluster are stressed. It is shown that radiation has only a narrow window of opportunity to dominate the wind-driven shell dynamics. This may occur only at early stages of the bubble evolution, and only if the shell expands into a dusty and/or a very dense proto-cluster medium. The impact of radiation pressure on the wind-driven shell always becomes negligible after about 3 Myr. Finally, the wind-driven model results allow one to compare the model predictions with the distribution of thermal pressure derived from X-ray observations. The shape of the thermal pressure profile then allows us to distinguish between the energy- and momentum-dominated regimes of expansion and thus conclude whether radiative losses of energy or the leakage of hot gas from the bubble interior have been significant during bubble evolution.
Parallel Software Model Checking
2015-01-08
checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its... focus on formal verification. Generalized PDR: Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
NASA Astrophysics Data System (ADS)
Rojas, Maisa; Seth, Anji
2003-08-01
of this study, the RegCM's ability to simulate the circulation and rainfall observed in the two extreme seasons was demonstrated when driven at the lateral boundaries by reanalyzed forcing. Seasonal integrations with the RegCM driven by GCM ensemble-derived lateral boundary forcing demonstrate that the nested model responds well to the SST forcing by capturing the major features of the circulation and rainfall differences between the two years. The GCM-driven model also improves upon the monthly evolution of rainfall compared with that from the GCM. However, the nested model rainfall simulations for the two seasons are degraded compared with those from the reanalysis-driven RegCM integrations. The poor location of the Atlantic intertropical convergence zone (ITCZ) in the GCM leads to excess rainfall in Nordeste in the nested model. An expanded domain was tested, wherein the RegCM was permitted more internal freedom to respond to SST and regional orographic forcing. Results show that the RegCM is able to improve the location of the ITCZ and the seasonal evolution of rainfall in Nordeste, the Amazon region, and the southeastern region of Brazil. However, it remains that the limiting factor in the skill of the nested modeling system is the quality of the lateral boundary forcing provided by the global model.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
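A toy version of such a network shows the mechanics: discrete parent nodes (skill, maturity, complexity) with a conditional probability table for suitability, marginalized by enumeration. Every probability below is invented, not taken from the paper's trained model.

import itertools

p_skill = {"high": 0.6, "low": 0.4}
p_mature = {"high": 0.5, "low": 0.5}
p_complex = {"high": 0.3, "low": 0.7}
p_suit = {("high", "high", "high"): 0.80, ("high", "high", "low"): 0.95,
          ("high", "low", "high"): 0.55, ("high", "low", "low"): 0.75,
          ("low", "high", "high"): 0.45, ("low", "high", "low"): 0.70,
          ("low", "low", "high"): 0.20, ("low", "low", "low"): 0.50}

# P(suitable) = sum over parent configurations of the joint probability
p = sum(p_skill[s] * p_mature[m] * p_complex[c] * p_suit[(s, m, c)]
        for s, m, c in itertools.product(("high", "low"), repeat=3))
print(f"P(product suitable) = {p:.3f}")

Real use conditions on evidence (for example, known low process maturity) and reads off the updated forecast.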
Rozhok, Andrii I; Salstrom, Jennifer L; DeGregori, James
2014-12-01
Age-dependent tissue decline and increased cancer incidence are widely accepted to be rate-limited by the accumulation of somatic mutations over time. Current models of carcinogenesis are dominated by the assumption that oncogenic mutations have defined advantageous fitness effects on recipient stem and progenitor cells, promoting and rate-limiting somatic evolution. However, this assumption is markedly discrepant with evolutionary theory, whereby fitness is a dynamic property of a phenotype imposed upon and widely modulated by environment. We computationally modeled dynamic microenvironment-dependent fitness alterations in hematopoietic stem cells (HSC) within the Sprengel-Liebig system known to govern evolution at the population level. Our model for the first time integrates real data on age-dependent dynamics of HSC division rates, pool size, and accumulation of genetic changes and demonstrates that somatic evolution is not rate-limited by the occurrence of mutations, but instead results from aged microenvironment-driven alterations in the selective/fitness value of previously accumulated genetic changes. Our results are also consistent with evolutionary models of aging and thus oppose both somatic mutation-centric paradigms of carcinogenesis and tissue functional decline. In total, we demonstrate that aging directly promotes HSC fitness decline and somatic evolution via non-cell-autonomous mechanisms.
NASA Astrophysics Data System (ADS)
Huang, Ailing; Zang, Guangzhi; He, Zhengbing; Guan, Wei
2017-05-01
An urban public transit system is a typical mixed complex network with dynamic flow, and its evolution should be a process coupling topological structure with flow dynamics, an aspect which has received little attention. This paper uses the R-space representation to make a comparative empirical analysis of Beijing's flow-weighted transit route network (TRN), and we find that Beijing's TRNs in both 2011 and 2015 exhibit scale-free properties. We then propose an evolution model driven by flow to simulate the development of TRNs, taking into account the passengers' dynamical behaviors triggered by topological change. The model treats the evolution of a TRN as an iterative process. At each time step, a certain number of new routes are generated, driven by travel demands, which leads to dynamical evolution of the new routes' flow and triggers perturbations in nearby routes that in turn impact the next round of route openings. We present a theoretical analysis based on mean-field theory, as well as numerical simulations of this model. The results obtained agree well with our empirical analysis, indicating that our model can simulate TRN evolution with scale-free properties in the distributions of node strength and degree. The purpose of this paper is to illustrate the global evolutionary mechanism of transit networks, which can be exploited in planning and design strategies for real TRNs.
Wind-Driven Global Evolution of Protoplanetary Disks
NASA Astrophysics Data System (ADS)
Bai, Xue-Ning
It has been realized in recent years that magnetized disk winds
Campos, Marcelino; Llorens, Carlos; Sempere, José M; Futami, Ricardo; Rodriguez, Irene; Carrasco, Purificación; Capilla, Rafael; Latorre, Amparo; Coque, Teresa M; Moya, Andres; Baquero, Fernando
2015-08-05
Antibiotic resistance is a major biomedical problem upon which public health systems demand solutions to construe the dynamics and epidemiological risk of resistant bacteria in anthropogenically altered environments. The implementation of computable models with reciprocity within and between levels of biological organization (i.e. essential nesting) is central for studying antibiotic resistance. Antibiotic resistance is not just the result of antibiotic-driven selection but, more properly, the consequence of a complex hierarchy of processes shaping the ecology and evolution of the distinct subcellular, cellular and supra-cellular vehicles involved in the dissemination of resistance genes. Such a complex background motivated us to explore the P-system standards of membrane computing, an innovative natural computing formalism that abstracts the notion of movement across membranes, to simulate antibiotic resistance evolution processes across nested levels of micro- and macro-environmental organization in a given ecosystem. In this article, we introduce ARES (Antibiotic Resistance Evolution Simulator), a software device that simulates P-system model scenarios with five types of nested computing membranes oriented to emulate a hierarchy of eco-biological compartments: a) peripheral ecosystem; b) local environment; c) reservoir of supplies; d) animal host; and e) the host's associated bacterial organisms (microbiome). Computational objects emulating molecular entities such as plasmids, antibiotic resistance genes, antimicrobials, and/or other substances can be introduced into this framework and may interact and evolve together with the membranes, according to a set of pre-established rules and specifications. ARES has been implemented as an online server and offers additional tools for storage, model editing and downstream analysis. The stochastic nature of the P-system model implemented in ARES explicitly links within- and between-host dynamics in a simulation, with feedback reciprocity among the different units of selection influenced by antibiotic exposure at various ecological levels. ARES offers the possibility of modeling predictive multilevel scenarios of antibiotic resistance evolution that can be interrogated, edited and re-simulated if necessary, with different parameters, until a correct model description of the process in the real world is convincingly approached. ARES can be accessed at http://gydb.org/ares.
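The nesting idea can be caricatured in a few lines: compartments are dictionaries of object counts, and rules rewrite them each step. The rules and rates below are invented, and far simpler than ARES's five-level hierarchy.

import random
random.seed(0)

host = {"antibiotic": 5}                            # outer membrane
microbiome = {"sensitive": 90, "resistant": 10}     # nested membrane

for step in range(20):
    if host["antibiotic"] > 0:                      # selection: drug kills sensitive cells
        microbiome["sensitive"] -= min(microbiome["sensitive"], 15)
        host["antibiotic"] -= 1                     # drug is consumed/decays
    microbiome["resistant"] += random.randint(0, 5) # resistant cells replicate
    if microbiome["sensitive"] > 0 and random.random() < 0.1:
        microbiome["sensitive"] -= 1                # horizontal transfer of a
        microbiome["resistant"] += 1                # resistance gene
print("final microbiome:", microbiome)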
Test Driven Development of Scientific Models
NASA Technical Reports Server (NTRS)
Clune, Thomas L.
2014-01-01
Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addictive after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology to development activities at Goddard, I will delve more deeply into some of the challenges posed by numerical and scientific software, as well as tools and implementation approaches that should address those challenges.
NASA Astrophysics Data System (ADS)
Nakamura, Ko; Takiwaki, Tomoya; Kuroda, Takami; Kotake, Kei
2015-12-01
We present an overview of two-dimensional (2D) core-collapse supernova simulations employing a neutrino transport scheme by the isotropic diffusion source approximation. We study 101 solar-metallicity, 247 ultra metal-poor, and 30 zero-metal progenitors covering zero-age main sequence mass from 10.8 M⊙ to 75.0 M⊙. Using the 378 progenitors in total, we systematically investigate how the differences in the structures of these multiple progenitors impact the hydrodynamics evolution. By following a long-term evolution over 1.0 s after bounce, most of the computed models exhibit neutrino-driven revival of the stalled bounce shock at ˜200-800 ms postbounce, leading to the possibility of explosion. Pushing the boundaries of expectations in previous one-dimensional studies, our results confirm that the compactness parameter ξ that characterizes the structure of the progenitors is also a key in 2D to diagnosing the properties of neutrino-driven explosions. Models with high ξ undergo high ram pressure from the accreting matter onto the stalled shock, which affects the subsequent evolution of the shock expansion and the mass of the protoneutron star under the influence of neutrino-driven convection and the standing accretion-shock instability. We show that the accretion luminosity becomes higher for models with high ξ, which makes the growth rate of the diagnostic explosion energy higher and the synthesized nickel mass bigger. We find that these explosion characteristics tend to show a monotonic increase as a function of the compactness parameter ξ.
2011-01-01
Background: Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results: We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions: The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
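At the core of such an engine is a stochastic compartmental step applied per subpopulation. A chain-binomial SIR step with invented parameters illustrates the idea (GLEaM additionally couples thousands of subpopulations through mobility data):

import numpy as np

rng = np.random.default_rng(42)
S, I, R = 99_000, 1_000, 0
beta, mu, dt = 0.6, 0.2, 1.0            # transmission, recovery [1/day], step [day]

history = []
for day in range(60):
    N = S + I + R
    p_inf = 1 - np.exp(-beta * I / N * dt)   # per-susceptible infection probability
    p_rec = 1 - np.exp(-mu * dt)             # per-infectious recovery probability
    new_inf = rng.binomial(S, p_inf)         # stochastic transitions
    new_rec = rng.binomial(I, p_rec)
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    history.append(I)
print(f"peak prevalence {max(history)} on day {history.index(max(history))}")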
End-to-end observatory software modeling using domain specific languages
NASA Astrophysics Data System (ADS)
Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José
2014-07-01
The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.
Water Network Tool for Resilience v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
WNTR is a Python package designed to simulate and analyze resilience of water distribution networks. The software includes: - Pressure-driven and demand-driven hydraulic simulation - Water quality simulation to track concentration, trace, and water age - Conditional controls to simulate power outages - Models to simulate pipe breaks - A wide range of resilience metrics - Analysis and visualization tools
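A short usage sketch based on WNTR's documented Python API; the EPANET input file path is a placeholder, and break or outage scenarios would be layered on top of this basic hydraulic run.

    import wntr

    # Build a network model from an EPANET input file (placeholder path)
    wn = wntr.network.WaterNetworkModel('networks/Net3.inp')

    # Run a hydraulic simulation and inspect node pressures over time
    sim = wntr.sim.EpanetSimulator(wn)
    results = sim.run_sim()
    pressure = results.node['pressure']   # time-indexed DataFrame of node pressures
    print(pressure.head())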
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 9
2006-09-01
it does. Several freely downloadable methodologies have emerged to support the developer in modeling threats to applications and other soft... SECURIS. Model-Driven Development and Analysis of Secure Information Systems <www.sintef.no/content/page1_1824.aspx>. 10. The SECURIS Project... By applying these methods to the SDLC, we can actively reduce the number of known vulnerabilities in software as it is developed. For
Development of a methodology for assessing the safety of embedded software systems
NASA Technical Reports Server (NTRS)
Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.
1993-01-01
A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety critical software functions.
NASA Astrophysics Data System (ADS)
van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.
2016-03-01
Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner shelf settings. This vision is illustrated through an idealised composition of models for a ~ 70 km stretch of the Suffolk coast, eastern England. A key advantage of model linking is that it allows a wide range of real-world situations to be simulated from a small set of model components. However, this process involves more than just the development of software that allows for flexible model coupling. The compatibility of radically different modelling assumptions remains to be carefully assessed, and the testing and evaluation of uncertainties in model compositions are areas that require further attention.
Strong Stellar-driven Outflows Shape the Evolution of Galaxies at Cosmic Dawn
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontanot, Fabio; De Lucia, Gabriella; Hirschmann, Michaela
We study galaxy mass assembly and cosmic star formation rate (SFR) at high redshift (z ≳ 4), by comparing data from multiwavelength surveys with predictions from the GAlaxy Evolution and Assembly (gaea) model. gaea implements a stellar feedback scheme partially based on cosmological hydrodynamical simulations, which features strong stellar-driven outflows and mass-dependent timescales for the re-accretion of ejected gas. In previous work, we have shown that this scheme is able to correctly reproduce the evolution of the galaxy stellar mass function (GSMF) up to z ∼ 3. We contrast model predictions with both rest-frame ultraviolet (UV) and optical luminosity functions (LFs), which are mostly sensitive to the SFR and stellar mass, respectively. We show that gaea is able to reproduce the shape and redshift evolution of both sets of LFs. We study the impact of dust on the predicted LFs, and we find that the required level of dust attenuation is in qualitative agreement with recent estimates based on the UV continuum slope. The consistency between data and model predictions holds for the redshift evolution of the physical quantities well beyond the redshift range considered for the calibration of the original model. In particular, we show that gaea is able to recover the evolution of the GSMF up to z ∼ 7 and the cosmic SFR density up to z ∼ 10.
Toward a Formal Model of the Design and Evolution of Software
1988-12-20
should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the...
Karabatsos, George
2017-02-01
Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.
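The package is distributed as compiled MATLAB, but the MCMC workflow it automates can be sketched in a few lines. Here is a minimal random-walk Metropolis sampler for a normal linear model in Python; the synthetic data, fixed noise scale, and vague normal priors are assumptions of the sketch, not features of the package.

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 50)
    y = 2 * x + 1 + rng.normal(0, 0.2, size=x.size)   # synthetic data: y = 2x + 1

    def log_post(theta):
        """Log posterior: Gaussian likelihood (sigma = 0.2) + N(0, 10^2) priors."""
        a, b = theta
        resid = y - (a * x + b)
        return -0.5 * np.sum(resid**2) / 0.2**2 - (a**2 + b**2) / (2 * 10**2)

    theta, samples = np.zeros(2), []
    for _ in range(20_000):
        prop = theta + rng.normal(0, 0.05, size=2)            # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                                      # Metropolis accept
        samples.append(theta)
    print(np.mean(samples[5_000:], axis=0))                   # posterior means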
Working with the HL7 metamodel in a Model Driven Engineering context.
Martínez-García, A; García-García, J A; Escalona, M J; Parra-Calderón, C L
2015-10-01
HL7 (Health Level 7) International is an organization that defines health information standards. Most HL7 domain information models have been designed according to a proprietary graphic language whose domain models are based on the HL7 metamodel. Many researchers have considered using HL7 in the MDE (Model-Driven Engineering) context. A limitation has been identified: all MDE tools support UML (Unified Modeling Language), which is a standard model language, but most do not support the HL7 proprietary model language. We want to support software engineers without HL7 experience, enabling them to model real-world problems by defining system requirements in UML that are transparently compliant with HL7 domain models. The objective of the present research is to connect HL7 with software analysis using a generic model-based approach. This paper introduces a first approach to an HL7 MDE solution that considers the MIF (Model Interchange Format) metamodel proposed by HL7 by making use of a plug-in developed in the EA (Enterprise Architect) tool.
Linking Goal-Oriented Requirements and Model-Driven Development
NASA Astrophysics Data System (ADS)
Pastor, Oscar; Giachetti, Giovanni
In the context of Goal-Oriented Requirement Engineering (GORE) there are interesting modeling approaches for the analysis of complex scenarios that are oriented to obtain and represent the relevant requirements for the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still manually performed. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i* framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure the correct model transformations.
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
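To make the oracle-and-tolerance point concrete, here is a minimal pytest-style sketch in Python; the trapezoid routine and its closed-form oracle are illustrative choices, and the tolerance comes from the method's O(h^2) error bound rather than an arbitrary epsilon.

    import math

    def trapezoid(f, a, b, n):
        """Composite trapezoid rule on [a, b] with n panels."""
        h = (b - a) / n
        s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return h * s

    def test_trapezoid_against_closed_form():
        # Oracle: the integral of sin on [0, pi] is exactly 2.
        approx = trapezoid(math.sin, 0.0, math.pi, 1000)
        h = math.pi / 1000
        # |error| <= (b - a) * h^2 * max|f''| / 12 = pi * h^2 / 12 < h^2
        assert abs(approx - 2.0) < h**2

    test_trapezoid_against_closed_form()
    print("ok")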
A Model-Driven Approach to Teaching Concurrency
ERIC Educational Resources Information Center
Carro, Manuel; Herranz, Angel; Marino, Julio
2013-01-01
We present an undergraduate course on concurrent programming where formal models are used in different stages of the learning process. The main practical difference with other approaches lies in the fact that the ability to develop correct concurrent software relies on a systematic transformation of formal models of inter-process interaction (so…
NASA Astrophysics Data System (ADS)
Tchoufag, Joël; Fabre, David; Magnaudet, Jacques
2015-09-01
Gravity- or buoyancy-driven bodies moving in a slightly viscous fluid frequently follow fluttering or helical paths. Current models of such systems are largely empirical and fail to predict several of the key features of their evolution, especially close to the onset of path instability. Here, using a weakly nonlinear expansion of the full set of governing equations, we present a new generic reduced-order model based on a pair of amplitude equations with exact coefficients that drive the evolution of the first pair of unstable modes. We show that the predictions of this model for the style (e.g., fluttering or spiraling) and characteristics (e.g., frequency and maximum inclination angle) of path oscillations compare well with various recent data for both solid disks and air bubbles.
NASA Astrophysics Data System (ADS)
Magnaudet, Jacques; Tchoufag, Joel; Fabre, David
2015-11-01
Gravity/buoyancy-driven bodies moving in a slightly viscous fluid frequently follow fluttering or helical paths. Current models of such systems are largely empirical and fail to predict several of the key features of their evolution, especially close to the onset of path instability. Using a weakly nonlinear expansion of the full set of governing equations, we derive a new generic reduced-order model of this class of phenomena based on a pair of amplitude equations with exact coefficients that drive the evolution of the first pair of unstable modes. We show that the predictions of this model for the style (e.g., fluttering or spiraling) and characteristics (e.g., frequency and maximum inclination angle) of path oscillations compare well with various recent data for both solid disks and air bubbles.
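The reduced-order model summarized in the two records above rests on coupled Landau-type amplitude equations; a generic normal form for the first pair of unstable modes reads as follows (in LaTeX, with placeholder coefficient names, while the papers derive the exact values from the governing equations):

    \dot{A} = \sigma A + \chi\,|A|^2 A + \eta\,|B|^2 A ,
    \dot{B} = \sigma B + \chi\,|B|^2 B + \eta\,|A|^2 B ,

where σ is the complex linear growth rate and χ, η are complex saturation coefficients. Pure-mode solutions (one amplitude zero) and mixed-mode solutions (equal amplitudes) correspond, respectively, to helical and planar fluttering paths.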
Evolution of safety-critical requirements post-launch
NASA Technical Reports Server (NTRS)
Lutz, R. R.; Mikulski, I. C.
2001-01-01
This paper reports the results of a small study of requirements changes to the onboard software of three spacecraft subsequent to launch. Only those requirement changes that resulted from post-launch anomalies (i.e., during operations) were of interest here, since the goal was to better understand the relationship between critical anomalies during operations and how safety-critical requirements evolve. The results of the study were surprising in that anomaly-driven, post-launch requirements changes were rarely due to previous requirements having been incorrect. Instead, changes involved new requirements (1) for the software to handle rare events or (2) for the software to compensate for hardware failures or limitations. The prevalence of new requirements as a result of post-launch anomalies suggests a need for increased requirements-engineering support of maintenance activities in these systems. The results also confirm both the difficulty and the benefits of pursuing requirements completeness, especially in terms of fault tolerance, during development of critical systems.
NASA Astrophysics Data System (ADS)
Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.
2017-12-01
The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
Noise-driven bias in the non-local voter model
NASA Astrophysics Data System (ADS)
Minors, Kevin; Rogers, Tim; Yates, Christian A.
2018-04-01
Is it more effective to have a strong influence over a small domain, or a weaker influence over a larger one? Here, we introduce and analyse an off-lattice generalisation of the voter model, in which the range and strength of agents' influence are control parameters. We consider both low- and high-density regimes and, using distinct mathematical approaches, derive analytical predictions for the evolution of agent densities. We find that, even when the agents are equally persuasive on average, those whose influence is wider but weaker have an overall noise-driven advantage allowing them to reliably dominate the entire population. We discuss the implications of our results and the potential of our model (or adaptations thereof) to improve the understanding of political campaign strategies and the evolution of disease.
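A toy Monte Carlo version of this range-versus-strength trade-off is sketched below, with simplified update rules assumed for illustration: an agent adopts an opinion with probability proportional to the summed strength of agents whose influence range reaches it, and the two phenotypes are tuned so that strength times influence area is matched.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 200
    pos = rng.uniform(size=(N, 2))      # agents fixed in the unit square
    opinion = rng.integers(0, 2, size=N)
    R = {0: 0.05, 1: 0.15}              # influence range per opinion
    S = {0: 1.0, 1: 1 / 9}              # influence strength (s * r^2 matched)

    for step in range(50_000):
        i = rng.integers(N)                             # focal agent to update
        d = np.linalg.norm(pos - pos[i], axis=1)
        radius = np.where(opinion == 0, R[0], R[1])     # influencers' ranges
        strength = np.where(opinion == 0, S[0], S[1])
        w = strength * ((d < radius) & (d > 0))         # those reaching agent i
        if w.sum() > 0:
            p1 = w[opinion == 1].sum() / w.sum()
            opinion[i] = 1 if rng.uniform() < p1 else 0

    print("fraction holding opinion 1:", opinion.mean())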
NASA Astrophysics Data System (ADS)
S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (the EASY software), a fast solver (block-coupled CFD) and a flexible geometry generation tool. The EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single-objective (SOO) and multi-objective (MOO) optimization problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
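The EASY tool itself is proprietary, but the metamodel-assisted screening loop it implements can be sketched; the k-NN surrogate, mutation scale, and the quadratic stand-in for a CFD evaluation below are all assumptions of this illustration.

    import numpy as np

    rng = np.random.default_rng(3)

    def expensive(x):                    # stand-in for a CFD evaluation
        return np.sum((x - 0.3) ** 2)

    def surrogate(x, X, y, k=5):
        """Inverse-distance-weighted k-NN prediction from the evaluation archive."""
        d = np.linalg.norm(X - x, axis=1) + 1e-12
        idx = np.argsort(d)[:k]
        return np.sum(y[idx] / d[idx]) / np.sum(1.0 / d[idx])

    dim, pop = 4, 20
    X = rng.uniform(size=(pop, dim))
    y = np.array([expensive(x) for x in X])            # initial exact evaluations

    for gen in range(30):
        kids = X[rng.integers(pop, size=60)] + rng.normal(0, 0.1, (60, dim))
        scores = np.array([surrogate(c, X, y) for c in kids])   # cheap pre-screen
        best = kids[np.argsort(scores)[:pop]]                   # promising subset
        y_new = np.array([expensive(c) for c in best])          # exact evaluation
        keep = np.argsort(np.concatenate([y, y_new]))[:pop]     # elitist selection
        X = np.vstack([X, best])[keep]
        y = np.concatenate([y, y_new])[keep]

    print("best objective:", y.min())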
Challenges in Managing Trustworthy Large-scale Digital Science
NASA Astrophysics Data System (ADS)
Evans, B. J. K.
2017-12-01
The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far in excess of the raw instrument outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" so that they are productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamental reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products, and on confidence that previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway to address these issues.
NASA Technical Reports Server (NTRS)
Tischer, A. E.
1987-01-01
The failure information propagation model (FIPM) data base was developed to store and manipulate the large amount of information anticipated for the various Space Shuttle Main Engine (SSME) FIPMs. The organization and structure of the FIPM data base is described, including a summary of the data fields and key attributes associated with each FIPM data file. The menu-driven software developed to facilitate and control the entry, modification, and listing of data base records is also discussed. The transfer of the FIPM data base and software to the NASA Marshall Space Flight Center is described. Complete listings of all of the data base definition commands and software procedures are included in the appendixes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faculjak, D.A.
1988-03-01
Graphics Manager (GFXMGR) is menu-driven, user-friendly software designed to interactively create, edit, and delete graphics displays on the Advanced Electronics Design (AED) graphics controller, Model 767. The software runs on the VAX family of computers and has been used successfully in security applications to create and change site layouts (maps) of specific facilities. GFXMGR greatly benefits graphics development by minimizing display-development time, reducing tedium on the part of the user, and improving system performance. It is anticipated that GFXMGR can be used to create graphics displays for many types of applications. 8 figs., 2 tabs.
ERIC Educational Resources Information Center
Eastment, David
Despite the evolution of software for computer-assisted language learning (CALL), teacher resistance remains high. Early software for language instruction was almost exclusively designed for drill and practice. That approach was later replaced by a model in which the computer provided a stimulus for students, most often as a partner in games.…
Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses
NASA Astrophysics Data System (ADS)
Boehm, Barry; Port, Dan; Winsor Brown, A.
2002-09-01
For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model-Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”
A Software Hub for High Assurance Model-Driven Development and Analysis
2007-01-23
verification of UML models in TLPVS. In Thomas Baar, Alfred Strohmeier, Ana Moreira, and Stephen J. Mellor, editors, UML 2004 - The Unified Modeling...volume 3785 of Lecture Notes in Computer Science, pages 52–65, Manchester, UK, Nov 2005. Springer. [GH04] Günter Graw and Peter Herrmann. Transformation
Collaborated Architecture Framework for Composition UML 2.0 in Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model-Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling is applied to the development of Enterprise Resource Planning (ERP) software. Because ERP covers a large number of applications with complex relations, the Agile Model Driven Design (AMDD) approach is used to transform the MDA into application modules efficiently and accurately. Finally, use of the CAF fulfilled the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP), and achieved high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik.
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
Software LS-MIDA for efficient mass isotopomer distribution analysis in metabolic modelling.
Ahmed, Zeeshan; Zeeshan, Saman; Huber, Claudia; Hensel, Michael; Schomburg, Dietmar; Münch, Richard; Eisenreich, Wolfgang; Dandekar, Thomas
2013-07-09
The knowledge of metabolic pathways and fluxes is important to understand the adaptation of organisms to their biotic and abiotic environment. The specific distribution of stable isotope labelled precursors into metabolic products can be taken as fingerprints of the metabolic events and dynamics through the metabolic networks. Open-source software is required that easily and rapidly calculates from mass spectra of labelled metabolites, derivatives and their fragments global isotope excess and isotopomer distribution. The open-source software "Least Square Mass Isotopomer Analyzer" (LS-MIDA) is presented that processes experimental mass spectrometry (MS) data on the basis of metabolite information such as the number of atoms in the compound, mass to charge ratio (m/e or m/z) values of the compounds and fragments under study, and the experimental relative MS intensities reflecting the enrichments of isotopomers in 13C- or 15N-labelled compounds, in comparison to the natural abundances in the unlabelled molecules. The software uses Brauman's least square method of linear regression. As a result, global isotope enrichments of the metabolite or fragment under study and the molar abundances of each isotopomer are obtained and displayed. The new software provides an open-source platform that easily and rapidly converts experimental MS patterns of labelled metabolites into isotopomer enrichments that are the basis for subsequent observation-driven analysis of pathways and fluxes, as well as for model-driven metabolic flux calculations.
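The least-squares core of such an analysis is compact; in the Python sketch below the 4x4 natural-abundance correction matrix and the measured intensity vector are hypothetical, and LS-MIDA itself performs a more careful Brauman-style regression.

    import numpy as np

    # Column j: mass pattern a pure isotopomer with j labelled carbons would
    # produce after natural-abundance broadening (toy values, 3-carbon compound).
    A = np.array([[0.90, 0.00, 0.00, 0.00],
                  [0.09, 0.90, 0.00, 0.00],
                  [0.01, 0.09, 0.90, 0.00],
                  [0.00, 0.01, 0.09, 0.90]])

    m = np.array([0.45, 0.32, 0.14, 0.09])     # measured relative MS intensities

    f, *_ = np.linalg.lstsq(A, m, rcond=None)  # least-squares isotopomer fractions
    f = np.clip(f, 0, None)
    f /= f.sum()                               # project onto the probability simplex
    print("isotopomer fractions:", f)
    print("global 13C excess:", np.dot(np.arange(4) / 3, f))  # mean labelled fraction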
Cellular automaton model for molecular traffic jams
NASA Astrophysics Data System (ADS)
Belitsky, V.; Schütz, G. M.
2011-07-01
We consider the time evolution of an exactly solvable cellular automaton with random initial conditions both in the large-scale hydrodynamic limit and on the microscopic level. This model is a version of the totally asymmetric simple exclusion process with sublattice parallel update and thus may serve as a model for studying traffic jams in systems of self-driven particles. We study the emergence of shocks from the microscopic dynamics of the model. In particular, we introduce shock measures whose time evolution we can compute explicitly, both in the thermodynamic limit and for open boundaries where a boundary-induced phase transition driven by the motion of a shock occurs. The motion of the shock, which results from the collective dynamics of the exclusion particles, is a random walk with an internal degree of freedom that determines the jump direction. This type of hopping dynamics is reminiscent of some transport phenomena in biological systems.
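A minimal sketch of a TASEP-like automaton with sublattice-parallel update on a ring, in the deterministic-hopping variant (the exactly solvable model above uses related rules); even bonds are updated first, then odd bonds.

    import numpy as np

    rng = np.random.default_rng(4)
    L = 100
    site = rng.integers(0, 2, size=L)        # 1 = particle, 0 = empty, on a ring

    def update_bonds(site, start):
        """Move a particle across each bond (i, i+1) whose target site is empty."""
        for i in range(start, L, 2):
            j = (i + 1) % L
            if site[i] == 1 and site[j] == 0:
                site[i], site[j] = 0, 1

    for t in range(1_000):
        update_bonds(site, 0)                # even sublattice of bonds
        update_bonds(site, 1)                # odd sublattice of bonds

    print("density:", site.mean())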
Driven-dissipative quantum Monte Carlo method for open quantum systems
NASA Astrophysics Data System (ADS)
Nagy, Alexandra; Savona, Vincenzo
2018-05-01
We develop a real-time full configuration-interaction quantum Monte Carlo approach to model driven-dissipative open quantum systems with Markovian system-bath coupling. The method enables stochastic sampling of the Liouville-von Neumann time evolution of the density matrix thanks to a massively parallel algorithm, thus providing estimates of observables on the nonequilibrium steady state. We present the underlying theory and introduce an initiator technique and importance sampling to reduce the statistical error. Finally, we demonstrate the efficiency of our approach by applying it to the driven-dissipative two-dimensional XYZ spin-1/2 model on a lattice.
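The Markovian evolution that such a method samples is the Lindblad master equation for the density matrix ρ; its standard form (in LaTeX) is

    \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
        + \sum_k \gamma_k \left( L_k \rho L_k^\dagger
        - \tfrac{1}{2}\,\{ L_k^\dagger L_k,\, \rho \} \right)

where H is the system Hamiltonian, the L_k are jump operators encoding the system-bath coupling, and the γ_k are the corresponding rates; the nonequilibrium steady state is the ρ for which the right-hand side vanishes.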
Towards Test Driven Development for Computational Science with pFUnit
NASA Technical Reports Server (NTRS)
Rilee, Michael L.; Clune, Thomas L.
2014-01-01
Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
Towards a comprehensive framework for reuse: A reuse-enabling software evolution environment
NASA Technical Reports Server (NTRS)
Basili, V. R.; Rombach, H. D.
1988-01-01
Reuse of products, processes and knowledge will be the key to enable the software industry to achieve the dramatic improvement in productivity and quality required to satisfy the anticipated growing demand. Although experience shows that certain kinds of reuse can be successful, general success has been elusive. A software life-cycle technology which allows broad and extensive reuse could provide the means to achieving the desired order-of-magnitude improvements. The scope of a comprehensive framework for understanding, planning, evaluating and motivating reuse practices and the necessary research activities is outlined. As a first step towards such a framework, a reuse-enabling software evolution environment model is introduced which provides a basis for the effective recording of experience, the generalization and tailoring of experience, the formalization of experience, and the (re-)use of experience.
Goal-Based Domain Modeling as a Basis for Cross-Disciplinary Systems Engineering
NASA Astrophysics Data System (ADS)
Jarke, Matthias; Nissen, Hans W.; Rose, Thomas; Schmitz, Dominik
Small and medium-sized enterprises (SMEs) are important drivers for innovation. In particular, project-driven SMEs that closely cooperate with their customers have specific needs in regard to information engineering of their development process. They need a fast requirements capture since this is most often included in the (unpaid) offer development phase. At the same time, they need to maintain and reuse the knowledge and experiences they have gathered in previous projects extensively as it is their core asset. The situation is complicated further if the application field crosses disciplinary boundaries. To bridge the gaps and perspectives, we focus on shared goals and dependencies captured in models at a conceptual level. Such a model-based approach also offers a smarter connection to subsequent development stages, including a high share of automated code generation. In the approach presented here, the agent- and goal-oriented formalism i* is therefore extended by domain models to facilitate information organization. This extension permits a domain model-based similarity search, and a model-based transformation towards subsequent development stages. Our approach also addresses the evolution of domain models reflecting the experiences from completed projects. The approach is illustrated with a case study on software-intensive control systems in an SME of the automotive domain.
Pardo, Lorena; García, Alvaro; de Espinosa, Francisco Montero; Brebøl, Klaus
2011-03-01
The determination of the characteristic frequencies of an electromechanical resonance does not provide enough data to obtain the material properties of piezoceramics, including all losses, from complex impedance measurements. Values of impedance around resonance and antiresonance frequencies are also required to calculate the material losses. Uncoupled resonances are needed for this purpose. The shear plates used for the material characterization present unavoidable mode coupling of the shear mode and other modes of the plate. A study of the evolution of the complex material coefficients as the coupling of modes evolves with the change in the aspect ratio (lateral dimension/thickness) of the plate is presented here. These are obtained using software. A soft commercial PZT ceramic was used in this study and several shear plates amenable to material characterization were obtained in the range of aspect ratios below 15. The validity of the material properties for 3-D modeling of piezoceramics is assessed by means of finite element analysis, which shows that uncoupled resonances are virtually pure thickness-driven shear modes.
2015-04-29
in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address... collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model... invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software
Evolution of protoplanetary discs with magnetically driven disc winds
NASA Astrophysics Data System (ADS)
Suzuki, Takeru K.; Ogihara, Masahiro; Morbidelli, Alessandro; Crida, Aurélien; Guillot, Tristan
2016-12-01
Aims: We investigate the evolution of protoplanetary discs (PPDs) with magnetically driven disc winds and viscous heating. Methods: We considered an initially massive disc with 0.1 M⊙ to track the evolution from the early stage of PPDs. We solved the time evolution of surface density and temperature by taking into account viscous heating and the loss of mass and angular momentum by the disc winds within the framework of a standard α model for accretion discs. Our model parameters, turbulent viscosity, disc wind mass-loss, and disc wind torque, which were adopted from local magnetohydrodynamical simulations and constrained by the global energetics of the gravitational accretion, largely depend on the physical conditions of PPDs, particularly on the evolution of the vertical magnetic flux in weakly ionized PPDs. Results: Although uncertainties remain concerning the evolution of the vertical magnetic flux, the surface densities show a large variety, depending on the combination of these three parameters, some of which are very different from the surface density expected from standard accretion. When a PPD is in a wind-driven accretion state with a preserved vertical magnetic field, the radial dependence of the surface density can be positive in the inner region <1-10 au. The mass accretion rates are consistent with observations, even at a very low level of magnetohydrodynamical turbulence. Such a positive radial slope of the surface density strongly affects planet formation because it inhibits the inward drift, or even causes the outward drift, of pebble- to boulder-sized solid bodies, and it also slows down or even reverses the inward type-I migration of protoplanets. Conclusions: The variety of our calculated PPDs should yield a wide variety of exoplanet systems.
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a Model-Based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve the problems addressed above, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short, but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases which are then converted to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
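To make the generation step concrete: abstract test cases can be derived from a state machine by traversing its transition graph until every transition is covered. The mode machine in the Python sketch below is hypothetical, not the satellite software under study.

    from collections import deque

    # Hypothetical on-board mode machine: state -> {event: next_state}
    fsm = {
        'STANDBY': {'power_on': 'INIT'},
        'INIT':    {'init_done': 'NOMINAL', 'init_fail': 'SAFE'},
        'NOMINAL': {'fault': 'SAFE', 'power_off': 'STANDBY'},
        'SAFE':    {'recover': 'INIT'},
    }

    def abstract_tests(fsm, start):
        """One event sequence per transition: (input sequence, expected final state)."""
        tests, queue, seen = [], deque([(start, [])]), set()
        while queue:
            state, path = queue.popleft()
            for event, nxt in fsm[state].items():
                if (state, event) in seen:
                    continue
                seen.add((state, event))          # cover each transition once
                tests.append((path + [event], nxt))
                queue.append((nxt, path + [event]))
        return tests

    for inputs, expected in abstract_tests(fsm, 'STANDBY'):
        print(inputs, '->', expected)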
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
Road embankment and slope stabilization.
DOT National Transportation Integrated Search
2010-07-31
This report and the accompanying software are part of efforts to improve the characterization and analysis of pile-stabilized slopes using one or two rows of driven piles. A combination of the limit equilibrium analysis and strain wedge (SW) model...
Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha
2016-05-01
A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
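Both goodness-of-fit measures quoted above are straightforward to reproduce; the observed and predicted values in this Python sketch are hypothetical.

    import numpy as np

    def willmott_d(pred, obs):
        """Willmott's index of agreement d in [0, 1]; 1 means perfect agreement."""
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        obar = obs.mean()
        num = np.sum((pred - obs) ** 2)
        den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
        return 1.0 - num / den

    def mean_relative_error(pred, obs):
        """Mean absolute relative error between predictions and observations."""
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        return np.mean(np.abs(pred - obs) / np.abs(obs))

    obs  = [12.1, 14.0, 15.8, 17.4, 19.0]    # hypothetical measurements
    pred = [11.8, 14.3, 15.5, 17.9, 18.6]    # model predictions
    print(willmott_d(pred, obs), mean_relative_error(pred, obs))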
Towards a Global Evolutionary Model of Protoplanetary Disks
NASA Astrophysics Data System (ADS)
Bai, Xue-Ning
2016-04-01
A global picture of the evolution of protoplanetary disks (PPDs) is key to understanding almost every aspect of planet formation, where standard α-disk models have been continually employed for their simplicity. In the meantime, disk mass loss has been conventionally attributed to photoevaporation, which controls disk dispersal. However, a paradigm shift toward accretion driven by magnetized disk winds has taken place in recent years, thanks to studies of non-ideal magnetohydrodynamic effects in PPDs. I present a framework of global PPD evolution aiming to incorporate these advances, highlighting the role of wind-driven accretion and wind mass loss. Disk evolution is found to be largely dominated by wind-driven processes, and viscous spreading is suppressed. The timescale of disk evolution is controlled primarily by the amount of external magnetic flux threading the disks, and how rapidly the disk loses the flux. Rapid disk dispersal can be achieved if the disk is able to hold most of its magnetic flux during the evolution. In addition, because wind launching requires a sufficient level of ionization at the disk surface (mainly via external far-UV (FUV) radiation), wind kinematics is also affected by the FUV penetration depth and disk geometry. For a typical disk lifetime of a few million years, the disk loses approximately the same amount of mass through the wind as through accretion onto the protostar, and most of the wind mass loss proceeds from the outer disk via a slow wind. Fractional wind mass loss increases with increasing disk lifetime. Significant wind mass loss likely substantially enhances the dust-to-gas mass ratio and promotes planet formation.
Agent-based models of cellular systems.
Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca
2013-01-01
Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are therein delegated to reproduce "in silico" the behavior of individual components of living systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce the readers to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, possibly also taking advantage of toolkits that do not require advanced programming skills.
Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Dennis L.
2016-05-01
This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon the measurements. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.
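For background, the quantity such tools ultimately fit is the rms emittance, computable directly from second moments of the particle distribution; the Gaussian beam in this Python sketch is hypothetical.

    import numpy as np

    def rms_emittance(x, xp):
        """Geometric rms emittance from coordinates x [m] and divergences x' [rad]."""
        x, xp = x - x.mean(), xp - xp.mean()
        return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp) ** 2)

    rng = np.random.default_rng(5)
    x  = rng.normal(0, 1e-3, 10_000)             # hypothetical beam size
    xp = 0.5 * x + rng.normal(0, 2e-4, 10_000)   # correlated divergence
    print(rms_emittance(x, xp))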
Woźniak, Natalia Joanna; Sicard, Adrien
2018-07-01
Flowers represent a key innovation during plant evolution. Driven by reproductive optimization, evolution of flower morphology has been central in boosting species diversification. In most cases, this has happened through specialized interactions with animal pollinators and subsequent reduction of gene flow between specialized morphs. While radiation has led to an enormous variability in flower forms and sizes, recurrent evolutionary patterns can be observed. Here, we discuss the targets of selection involved in major trends of pollinator-driven flower evolution. We review recent findings on their adaptive values, developmental grounds and genetic bases, in an attempt to better understand the repeated nature of pollinator-driven flower evolution. This analysis highlights how structural innovation can provide flexibility in phenotypic evolution, adaptation and speciation.
Desiderata for computable representations of electronic health records-driven phenotype algorithms
Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A
2015-01-01
Background: Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). Methods: A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. Results: We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. Conclusion: A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. PMID:26342218
A high-speed, large-capacity, 'jukebox' optical disk system
NASA Technical Reports Server (NTRS)
Ammon, G. J.; Calabria, J. A.; Thomas, D. T.
1985-01-01
Two optical disk 'jukebox' mass storage systems which provide access to any data in a store of 10^13 bits (1250 gigabytes) within six seconds have been developed. The optical disk jukebox system is divided into two units, including a hardware/software controller and a disk drive. The controller provides flexibility and adaptability, through a ROM-based microcode-driven data processor and a ROM-based software-driven control processor. The cartridge storage module contains 125 optical disks housed in protective cartridges. Attention is given to a conceptual view of the disk drive unit, the NASA optical disk system, the NASA database management system configuration, the NASA optical disk system interface, and an open systems interconnect reference model.
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
Zhao, Lei; Lim Choi Keung, Sarah N; Taweel, Adel; Tyler, Edward; Ogunsina, Ire; Rossiter, James; Delaney, Brendan C; Peterson, Kevin A; Hobbs, F D Richard; Arvanitis, Theodoros N
2012-01-01
Heterogeneous data models and coding schemes for electronic health records present challenges for automated search across distributed data sources. This paper describes a loosely coupled software framework based on the terminology controlled approach to enable the interoperation between the search interface and heterogeneous data sources. Software components interoperate via common terminology service and abstract criteria model so as to promote component reuse and incremental system evolution.
Simulation and animation of sensor-driven robots.
Chen, C; Trivedi, M M; Bidlack, C R
1994-10-01
Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty in handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system will help the users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of the software development is increased, the reliability of the software and the operation safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation. This paper describes a system designed to overcome this deficiency.
Tumor morphology and phenotypic evolution driven by selective pressure from the microenvironment.
Anderson, Alexander R A; Weaver, Alissa M; Cummings, Peter T; Quaranta, Vito
2006-12-01
Emergence of invasive behavior in cancer is life-threatening, yet ill-defined due to its multifactorial nature. We present a multiscale mathematical model of cancer invasion, which considers cellular and microenvironmental factors simultaneously and interactively. Unexpectedly, the model simulations predict that harsh tumor microenvironment conditions (e.g., hypoxia, heterogeneous extracellular matrix) exert a dramatic selective force on the tumor, which grows as an invasive mass with fingering margins, dominated by a few clones with aggressive traits. In contrast, mild microenvironment conditions (e.g., normoxia, homogeneous matrix) allow clones with similar aggressive traits to coexist with less aggressive phenotypes in a heterogeneous tumor mass with smooth, noninvasive margins. Thus, the genetic make-up of a cancer cell may realize its invasive potential through a clonal evolution process driven by definable microenvironmental selective forces. Our mathematical model provides a theoretical/experimental framework to quantitatively characterize this selective pressure for invasion and test ways to eliminate it.
Random Matrix Approach to Quantum Adiabatic Evolution Algorithms
NASA Technical Reports Server (NTRS)
Boulatov, Alexei; Smelyanskiy, Vadim N.
2004-01-01
We analyze the power of quantum adiabatic evolution algorithms (QAEA) for solving random NP-hard optimization problems within a theoretical framework based on random matrix theory (RMT). We present two types of driven RMT models. In the first model, the driving Hamiltonian is represented by Brownian motion in the matrix space. We use the Brownian motion model to obtain a description of multiple avoided crossing phenomena. We show that the failure mechanism of the QAEA is due to the interaction of the ground state with the "cloud" formed by all the excited states, confirming that in the driven RMT models the Landau-Zener mechanism of dissipation is not important. We show that the QAEA has a finite probability of success in a certain range of parameters, implying the polynomial complexity of the algorithm. The second model corresponds to the standard QAEA with the problem Hamiltonian taken from the Gaussian Unitary RMT ensemble (GUE). We show that the level dynamics in this model can be mapped onto the dynamics in the Brownian motion model. However, this driven RMT model always leads to the exponential complexity of the algorithm due to the presence of long-range intertemporal correlations of the eigenvalues. Our results indicate that the weakness of effective transitions is the leading effect that can make the Markovian-type QAEA successful.
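For orientation, the interpolating Hamiltonian underlying quantum adiabatic evolution algorithms has the standard textbook form below; this is the generic construction, not the paper's specific parameterization.

```latex
% Generic QAEA interpolation between a driver Hamiltonian H_D
% and a problem Hamiltonian H_P over an annealing time T:
H(t) = \left(1 - \frac{t}{T}\right) H_D + \frac{t}{T}\, H_P ,
\qquad 0 \le t \le T
```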
SpreaD3: Interactive Visualization of Spatiotemporal History and Trait Evolutionary Processes.
Bielejec, Filip; Baele, Guy; Vrancken, Bram; Suchard, Marc A; Rambaut, Andrew; Lemey, Philippe
2016-08-01
Model-based phylogenetic reconstructions increasingly consider spatial or phenotypic traits in conjunction with sequence data to study evolutionary processes. Alongside parameter estimation, visualization of ancestral reconstructions represents an integral part of these analyses. Here, we present a complete overhaul of the spatial phylogenetic reconstruction of evolutionary dynamics software, now called SpreaD3 to emphasize the use of data-driven documents, as an analysis and visualization package that primarily complements Bayesian inference in BEAST (http://beast.bio.ed.ac.uk, last accessed 9 May 2016). The integration of JavaScript D3 libraries (www.d3.org, last accessed 9 May 2016) offers novel interactive web-based visualization capacities that are not restricted to spatial traits and extend to any discrete or continuously valued trait for any organism of interest. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Sources and Sinks: A Stochastic Model of Evolution in Heterogeneous Environments
NASA Astrophysics Data System (ADS)
Hermsen, Rutger; Hwa, Terence
2010-12-01
We study evolution driven by spatial heterogeneity in a stochastic model of source-sink ecologies. A sink is a habitat where mortality exceeds reproduction so that a local population persists only due to immigration from a source. Immigrants can, however, adapt to conditions in the sink by mutation. To characterize the adaptation rate, we derive expressions for the first arrival time of adapted mutants. The joint effects of migration, mutation, birth, and death result in two distinct parameter regimes. These results may pertain to the rapid evolution of drug-resistant pathogens and insects.
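To make the source-sink setup concrete, here is a minimal stochastic sketch (toy rates and a simplified event set, not the authors' model) that estimates the first arrival time of an adapted mutant by direct simulation; averaging over runs gives a Monte Carlo counterpart of the first-arrival-time expressions the abstract refers to.

```python
import random

def first_arrival_time(m=1.0, b=0.5, d=1.0, u=1e-3, seed=None):
    # Sink: immigration at rate m, per-capita birth b < death d,
    # and a per-birth probability u of producing an adapted mutant.
    rng = random.Random(seed)
    t, wildtype = 0.0, 0
    while True:
        total = m + wildtype * (b + d)
        t += rng.expovariate(total)            # Gillespie waiting time
        r = rng.uniform(0.0, total)
        if r < m:
            wildtype += 1                      # immigration from the source
        elif r < m + wildtype * b:
            if rng.random() < u:
                return t                       # first adapted mutant born
            wildtype += 1                      # ordinary birth
        else:
            wildtype -= 1                      # death

times = [first_arrival_time(seed=s) for s in range(200)]
print("mean first arrival time:", sum(times) / len(times))
```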
Regional-Scale Salt Tectonics Modelling: Bench-Scale Validation and Extension to Field-Scale
NASA Astrophysics Data System (ADS)
Crook, A. J. L.; Yu, J. G.; Thornton, D. A.
2010-05-01
The role of salt in the evolution of the West African continental margin, and in particular its impact on hydrocarbon migration and trap formation, is an important research topic. It has attracted many researchers who have based their research on bench-scale experiments, numerical models and seismic observations. This research has shown that the evolution is very complex. For example, regional analogue bench-scale models of the Angolan margin (Fort et al., 2004) indicate a complex system with an upslope extensional domain with sealed tilted blocks, growth fault and rollover systems and extensional diapirs, and a downslope contractional domain with squeezed diapirs, polyharmonic folds and thrust faults, and late-stage folding and thrusting. Numerical models have the potential to provide additional insight into the evolution of these salt-driven passive margins. The longer-term aim is to calibrate regional-scale evolution models, and then to evaluate the effect of the depositional history on the current-day geomechanical and hydrogeologic state in potential target hydrocarbon reservoir formations adjacent to individual salt bodies. To achieve this goal the burial and deformational history of the sediment must be modelled from initial deposition to the current-day state, while also accounting for the reaction and transport processes occurring in the margin. Accurate forward modelling is, however, complex, and necessitates advanced procedures for the prediction of fault formation and evolution, representation of the extreme deformations in the salt, and for coupling the geomechanical, fluid flow and temperature fields. The evolution of the sediment due to a combination of mechanical compaction, chemical compaction and creep relaxation must also be represented. In this paper, ongoing research on a computational approach for forward modelling complex structural evolution is presented, with particular reference to passive margins driven by salt tectonics. The approach is an extension of a previously published approach (Crook et al., 2006a, 2006b) that focused on predictive modelling of structure evolution in 2-D sandbox experiments, and in particular two extensional sandbox experiments that exhibit complex fault development including a series of superimposed crestal collapse graben systems (McClay, 1990). The formulation adopts a finite strain Lagrangian method, complemented by advanced localization prediction algorithms and robust and efficient automated adaptive meshing techniques. The sediment is represented by an elasto-viscoplastic constitutive model based on extended critical state concepts, which enables representation of the combined effect of mechanical and chemical compaction. This is achieved by directly coupling the evolution of the material state boundary surface with both the mechanically and chemically driven porosity change. Using these procedures the evolution of the geological structures arises naturally from the imposed boundary conditions without the requirement of seeding using initial imperfections. Simulations are presented for regional bench-scale models based on the analogue experiments presented by Fort et al. (2004), together with additional insights provided by the numerical models. It is shown that the behaviour observed in both the extensional and compressional zones of these analogue models arises naturally in the finite element simulations.
Extension of these models to the field-scale is then discussed and several simulations are presented to highlight important issues related to practical field-scale numerical modelling.
A Two Species Bump-On-Tail Model With Relaxation for Energetic Particle Driven Modes
NASA Astrophysics Data System (ADS)
Aslanyan, V.; Porkolab, M.; Sharapov, S. E.; Spong, D. A.
2017-10-01
Energetic particle driven Alfvén Eigenmodes (AEs) observed in present day experiments exhibit various nonlinear behaviours varying from steady state amplitude at a fixed frequency to bursting amplitudes and sweeping frequency. Using the appropriate action-angle variables, the problem of resonant wave-particle interaction becomes effectively one-dimensional. Previously, a simple one-dimensional Bump-On-Tail (BOT) model has proven to be one of the most effective in describing characteristic nonlinear near-threshold wave evolution scenarios. In particular, dynamical friction causes bursting mode evolution, while diffusive relaxation may give steady-state, periodic or chaotic mode evolution. BOT has now been extended to include two populations of fast particles, with one dominated by dynamical friction at the resonance and the other by diffusion; the relative size of the populations determines the temporal evolution of the resulting wave. This suggests an explanation for recent observations on the TJ-II stellarator, where a transition between steady state and bursting occurred as the magnetic configuration varied. The two species model is then applied to burning plasma with drag-dominated alpha particles and diffusion-dominated ICRH accelerated minority ions. This work was supported by the US DoE and the RCUK Energy Programme [Grant Number EP/P012450/1].
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
Cleanroom Software Engineering Reference Model. Version 1.0.
1996-11-01
teams. It also serves as a baseline for continued evolution of Cleanroom practice. The scope of the CRM is software management, specification... In addition to project staff, participants include management, peer organization representatives, and customer representatives as appropriate... Review the status of the process with management, the project team, peer groups, and the customer. These verification activities include...
Reflection of a Year Long Model-Driven Business and UI Modeling Development Project
NASA Astrophysics Data System (ADS)
Sukaviriya, Noi; Mani, Senthil; Sinha, Vibha
Model-driven software development enables users to specify an application at a high level - a level that better matches the problem domain. It also promises the users better analysis and automation. Our work embarks on two collaborating domains - business process and human interactions - to build an application. Business modeling expresses business operations and flows, then creates the business flow implementation. Human interaction modeling expresses a UI design and its relationship with business data, logic, and flow, and can generate a working UI. This double modeling approach automates the production of a working system with the UI and business logic connected. This paper discusses the human aspects of this modeling approach after a year spent building a procurement outsourcing contract application with it - the result of which was deployed in December 2008. The paper discusses the successes and some pain points in multiple areas. We end with insights on how a model-driven approach could serve the humans in the process better.
Studying the laws of software evolution in a long-lived FLOSS project.
Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe
2014-07-01
Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one of such projects, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd.
Test Particle Stability in Exoplanet Systems
NASA Astrophysics Data System (ADS)
Frewen, Shane; Hansen, B. M.
2011-01-01
Astronomy is currently going through a golden age of exoplanet discovery. Yet despite that, there is limited research on the evolution of exoplanet systems driven by stellar evolution. In this work we look at the stability of test particles in known exoplanet systems during the host star's main sequence and white dwarf stages. In particular, we compare the instability regions that develop before and after the star loses mass to form a white dwarf, a process which causes the semi-major axes of the outer planets to expand adiabatically. We investigate the possibility of secular and resonant perturbations resulting in these regions as well as the method of removal of test particles for the instability regions, such as ejection and collision with the central star. To run our simulations we used the MERCURY software package (Chambers, 1999) and evolved our systems for over 10^8 years using a hybrid symplectic/Bulirsch-Stoer integrator.
Multi-Mission Power Analysis Tool (MMPAT) Version 3
NASA Technical Reports Server (NTRS)
Wood, Eric G.; Chang, George W.; Chen, Fannie C.
2012-01-01
The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, it can reduce or even eliminate the need for software modifications when it is configured for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
NASA Astrophysics Data System (ADS)
Georgiev, Bozhidar; Georgieva, Adriana
2013-12-01
In this paper, some possibilities concerning the implementation of test-driven development as a programming method are presented. A different point of view on creating advanced programming techniques is offered (building tests before the program source, together with all necessary software tools and modules). This nontraditional approach, in which tests are built first, simplifies the programmer's work and is a preferable way of software development. It allows comparatively simple programming (applied with different object-oriented programming languages such as JAVA, XML, PYTHON, etc.). It is a predictable way to develop software tools and helps produce better software that is also easier to maintain. Test-driven programming is able to replace more complicated conventional paradigms used by many programmers.
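A minimal sketch of the test-first workflow described above: the test is written before the implementation exists, fails first, and then drives the implementation. The function and test names here are illustrative only, not taken from the paper.

```python
import unittest

def slugify(title):
    # Implementation written *after* the test below was first seen to fail.
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Test Driven Development"),
                         "test-driven-development")

if __name__ == "__main__":
    unittest.main()
```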
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Optimizing romanian maritime coastline using mathematical model Litpack
NASA Astrophysics Data System (ADS)
Anton, I. A.; Panaitescu, M.; Panaitescu, F. V.
2017-08-01
There are many methods and tools for studying shoreline change in coastal engineering. LITPACK is a numerical model included in the MIKE software developed by DHI (Danish Hydraulic Institute). With this mathematical model we can simulate coastline evolution and the beach profile. Research and methodology: the paper presents the location of the study area, the current status of the Midia-Mangalia shoreline, the protection objectives, and the changes in the shoreline after protective structures were built. Numerical and graphical results obtained with this model for the Romanian maritime coastline in the Midia-Mangalia area are presented: non-cohesive sediment transport, long-shore current and littoral drift, coastline evolution, cross-shore profile evolution, and the development of the coastline position in time.
Software Architecture Evolution
ERIC Educational Resources Information Center
Barnes, Jeffrey M.
2013-01-01
Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…
Levy, Ofir; Dayan, Tamar; Kronfeld-Schor, Noga; Porter, Warren P
2012-06-01
Most mammals can be characterized as nocturnal or diurnal. However, species may infrequently overcome evolutionary constraints and alter their activity patterns. We modeled the fundamental temporal niche of a diurnal desert rodent, the golden spiny mouse, Acomys russatus. This species can shift into nocturnal activity in the absence of its congener, the common spiny mouse, Acomys cahirinus, suggesting that it was competitively driven into diurnality and that this shift in a small desert rodent may involve physiological costs. Therefore, we compared metabolic costs of diurnal versus nocturnal activity using a biophysical model to evaluate the preferred temporal niche of this species. The model predicted that energy expenditure during foraging is almost always lower during the day except during midday in summer at the less sheltered microhabitat. We also found that a shift in summer to foraging in less sheltered microhabitats in response to predation pressure and food availability involves a significant physiological cost moderated by midday reduction in activity. Thus, adaptation to diurnality may reflect the "ghost of competition past"; climate-driven diurnality is an alternative but less likely hypothesis. While climate is considered to play a major role in the physiology and evolution of mammals, this is the first study to model its potential to affect the evolution of activity patterns of mammals.
A coevolving model based on preferential triadic closure for social media networks
Li, Menghui; Zou, Hailin; Guan, Shuguang; Gong, Xiaofeng; Li, Kun; Di, Zengru; Lai, Choy-Heng
2013-01-01
The dynamical origin of complex networks, i.e., the underlying principles governing network evolution, is a crucial issue in network study. In this paper, by analyzing the temporal data of Flickr and Epinions, two typical social media networks, we found that the dynamical pattern within a node's neighborhood, especially the formation of triadic links, plays a dominant role in the evolution of networks. We thus proposed a coevolving dynamical model for such networks, in which the evolution is driven only by local dynamics, namely the preferential triadic closure. Numerical experiments verified that the model can reproduce global properties which are qualitatively consistent with the empirical observations. PMID:23979061
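A toy sketch of triadic-closure-driven network growth, in the spirit of the well-known Holme-Kim construction rather than the paper's exact rules (the probability value and graph size are hypothetical): each new node attaches to a random contact and then, with probability p, to one of that contact's neighbours, closing a triangle.

```python
import random

def grow(n_nodes=500, p_triad=0.8, seed=1):
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}                     # start from a single edge
    for new in range(2, n_nodes):
        u = rng.randrange(new)                 # first contact
        adj[new] = {u}
        adj[u].add(new)
        neighbours = list(adj[u] - {new})
        if neighbours and rng.random() < p_triad:
            v = rng.choice(neighbours)         # triadic closure: new-u-v
        else:
            v = rng.choice([w for w in range(new) if w not in adj[new]])
        adj[new].add(v)
        adj[v].add(new)
    return adj

adj = grow()
print("mean degree:", sum(len(v) for v in adj.values()) / len(adj))
```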
Yardang evolution from maturity to demise
NASA Astrophysics Data System (ADS)
Barchyn, Thomas E.; Hugenholtz, Chris H.
2015-07-01
Yardangs are enigmatic wind-parallel ridges sculpted by aeolian processes that are found extensively in arid environments on Earth and Mars. No general theory exists to explain the long-term evolution of yardangs, curtailing modeling of landscape evolution and dynamics of suspended sediment release. We present a hypothesis of yardang evolution using relative rates of sediment flux, interyardang corridor downcutting, yardang denudation, substrate erodibility, and substrate clast content. To develop and sustain yardangs, corridor downcutting must exceed yardang vertical denudation and deflation. However, erosion of substrate yields considerable quantities of sediment that shelters corridors, slowing downcutting. We model the evolution of yardangs through various combinations of rates and substrate compositions, demonstrating the life span, suspended sediment release, and resulting landscape evolution. We find that yardangs have a distinct and predictable evolution, with inevitable demise and unexpectedly dynamic and autogenic erosion rates driven by subtle differences in substrate clast composition.
How Evolution May Work Through Curiosity-Driven Developmental Process.
Oudeyer, Pierre-Yves; Smith, Linda B
2016-04-01
Infants' own activities create and actively select their learning experiences. Here we review recent models of embodied information seeking and curiosity-driven learning and show that these mechanisms have deep implications for development and evolution. We discuss how these mechanisms yield self-organized epigenesis with emergent ordered behavioral and cognitive developmental stages. We describe a robotic experiment that explored the hypothesis that progress in learning, in and for itself, generates intrinsic rewards: The robot learners probabilistically selected experiences according to their potential for reducing uncertainty. In these experiments, curiosity-driven learning led the robot learner to successively discover object affordances and vocal interaction with its peers. We explain how a learning curriculum adapted to the current constraints of the learning system automatically formed, constraining learning and shaping the developmental trajectory. The observed trajectories in the robot experiment share many properties with those in infant development, including a mixture of regularities and diversities in the developmental patterns. Finally, we argue that such emergent developmental structures can guide and constrain evolution, in particular with regard to the origins of language. Copyright © 2016 Cognitive Science Society, Inc.
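A toy sketch of the learning-progress idea (an assumption-laden stand-in, not the authors' robot architecture): the agent chooses among activities in proportion to how fast its prediction error on each is currently dropping, so it practices learnable activities and drifts away from both mastered and unlearnable ones.

```python
import math, random

rng = random.Random(0)
errors = {a: [1.0] for a in ("easy", "hard", "unlearnable")}
decay  = {"easy": 0.90, "hard": 0.99, "unlearnable": 1.0}  # hidden learnability

def progress(hist):
    # Intrinsic reward: recent drop in prediction error (novelty bonus
    # of 1.0 for activities never tried).
    return max(0.0, hist[-2] - hist[-1]) if len(hist) > 1 else 1.0

for step in range(300):
    weights = {a: math.exp(20 * progress(h)) for a, h in errors.items()}
    r, acc, choice = rng.random() * sum(weights.values()), 0.0, None
    for a, w in weights.items():               # softmax-style selection
        acc += w
        if r <= acc:
            choice = a
            break
    errors[choice].append(errors[choice][-1] * decay[choice])

print({a: len(h) - 1 for a, h in errors.items()})  # practice counts
```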
Open cyberGIS software for geospatial research and education in the big data era
NASA Astrophysics Data System (ADS)
Wang, Shaowen; Liu, Yan; Padmanabhan, Anand
CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new generation of GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.
An exchange format for use-cases of hospital information systems.
Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R
2001-01-01
Object-oriented software development is a powerful methodology for the development of large hospital information systems. We think the use-case-driven approach is particularly useful for such development. In this approach, use-cases are documented at the first stage of the software development process and are then used throughout all subsequent steps in a variety of ways. It is therefore important to exchange and share use-cases and to make effective use of them over the whole lifecycle of a development process. In this paper, we propose a method for sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases. We then discuss an application of the exchange format to support several software development activities. We preliminarily implemented a support system for object-oriented analysis based on the exchange format. The result shows that using the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
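A hedged illustration of the general idea using Python's standard library (the element names and the example use-case are invented; the paper defines its own schema): a use-case is serialized to XML so it can be exchanged between tools, developers, and projects.

```python
import xml.etree.ElementTree as ET

# Hypothetical use-case document for a hospital information system.
uc = ET.Element("useCase", id="UC-01", name="Register patient admission")
ET.SubElement(uc, "actor").text = "Ward clerk"
ET.SubElement(uc, "precondition").text = "Patient record exists in HIS"
flow = ET.SubElement(uc, "mainFlow")
for i, step in enumerate(["Select patient", "Enter admission data",
                          "Confirm and store record"], start=1):
    ET.SubElement(flow, "step", number=str(i)).text = step

print(ET.tostring(uc, encoding="unicode"))
```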
Floquet prethermalization in the resonantly driven Hubbard model
NASA Astrophysics Data System (ADS)
Herrmann, Andreas; Murakami, Yuta; Eckstein, Martin; Werner, Philipp
2017-12-01
We demonstrate the existence of long-lived prethermalized states in the Mott insulating Hubbard model driven by periodic electric fields. These states, which also exist in the resonantly driven case with a large density of photo-induced doublons and holons, are characterized by a nonzero current and an effective temperature of the doublons and holons which depends sensitively on the driving condition. Focusing on the specific case of resonantly driven models whose effective time-independent Hamiltonian in the high-frequency driving limit corresponds to noninteracting fermions, we show that the time evolution of the double occupation can be reproduced by the effective Hamiltonian, and that the prethermalization plateaus at finite driving frequency are controlled by the next-to-leading-order correction in the high-frequency expansion of the effective Hamiltonian. We propose a numerical procedure to determine an effective Hubbard interaction that mimics the correlation effects induced by these higher-order terms.
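The "next-to-leading-order correction in the high-frequency expansion" mentioned above has, in the generic textbook (van Vleck/Floquet) form with H(t) = Σ_m H_m e^{imΩt}, the structure below; this is the standard general result, not the paper's model-specific expression.

```latex
H_{\mathrm{eff}} \simeq H_0
  + \frac{1}{\hbar\Omega} \sum_{m=1}^{\infty} \frac{[H_m,\, H_{-m}]}{m}
  + \mathcal{O}\!\left(\Omega^{-2}\right)
```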
Just-in-time Database-Driven Web Applications
2003-01-01
"Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109
Can Landscape Evolution Models (LEMs) be used to reconstruct palaeo-climate and sea-level histories?
NASA Astrophysics Data System (ADS)
Leyland, J.; Darby, S. E.
2011-12-01
Reconstruction of palaeo-environmental conditions over long time periods is notoriously difficult, especially where there are limited or no proxy records from which to extract data. Application of landscape evolution models (LEMs) for palaeo-environmental reconstruction involves hindcast modeling, in which simulation scenarios are configured with specific model variables and parameters chosen to reflect a specific hypothesis of environmental change. In this form of modeling, the environmental time series utilized are considered credible when modeled and observed landscape metrics converge. Herein we account for the uncertainties involved in evaluating the degree to which the model simulations and observations converge using Monte Carlo analysis of reduced complexity 'metamodels'. The technique is applied to a case study focused on a specific set of gullies found on the southwest coast of the Isle of Wight, UK. A key factor controlling the Holocene evolution of these coastal gullies is the balance between rates of sea-cliff retreat (driven by sea-level rise) and headwards incision caused by knickpoint migration (driven by the rate of runoff). We simulate these processes using a version of the GOLEM model that has been modified to represent sea-cliff retreat. A Central Composite Design (CCD) sampling technique was employed, enabling the trajectories of gully response to different combinations of driving conditions to be modeled explicitly. In some of these simulations, where the range of bedrock erodibility (0.03 to 0.04 m^0.2 a^-1) and rate of sea-level change (0.005 to 0.0059 m a^-1) is tightly constrained, modeled gully forms conform closely to those observed in reality, enabling a suite of climate and sea-level change scenarios which plausibly explain the Holocene evolution of the Isle of Wight gullies to be identified.
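A schematic sketch of the convergence test (the surrogate function and the acceptance band are made up for illustration; GOLEM itself is far more complex): sample erodibility and sea-level-rise rate from the ranges quoted in the abstract, run a cheap metamodel, and keep parameter sets whose predicted gully metric falls close to the observed value.

```python
import random

rng = random.Random(42)
OBSERVED_LENGTH, TOL = 120.0, 10.0           # metres; illustrative only

def metamodel(erodibility, slr_rate):
    # Toy surrogate: headward incision grows with erodibility and is
    # cut back by cliff retreat driven by sea-level rise.
    return 4000.0 * erodibility - 8000.0 * slr_rate

accepted = []
for _ in range(10000):
    k   = rng.uniform(0.03, 0.04)            # bedrock erodibility, m^0.2 a^-1
    slr = rng.uniform(0.005, 0.0059)         # sea-level change, m a^-1
    if abs(metamodel(k, slr) - OBSERVED_LENGTH) < TOL:
        accepted.append((k, slr))

print(len(accepted), "of 10000 parameter sets converge with observation")
```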
TTLEM: Open access tool for building numerically accurate landscape evolution models in MATLAB
NASA Astrophysics Data System (ADS)
Campforts, Benjamin; Schwanghart, Wolfgang; Govers, Gerard
2017-04-01
Despite a growing interest in LEMs, accuracy assessment of the numerical methods they are based on has received little attention. Here, we present TTLEM, an open-access landscape evolution package designed for developing and testing your own scenarios and hypotheses. TTLEM uses a higher-order flux-limiting finite-volume method to simulate river incision and tectonic displacement. We show that this scheme significantly influences the evolution of simulated landscapes and the spatial and temporal variability of erosion rates. Moreover, it allows the simulation of lateral tectonic displacement on a fixed grid. Through a simple GUI, the software produces visual output of the evolving landscape over model run time. In this contribution, we illustrate numerical landscape evolution through a set of movies spanning different spatial and temporal scales. We focus on the erosional domain and use both spatially constant and variable input values for uplift, lateral tectonic shortening, erodibility, and precipitation. Moreover, we illustrate the relevance of a stochastic approach for realistic hillslope response modelling. TTLEM is a fully open-source software package, written in MATLAB and based on the TopoToolbox platform (topotoolbox.wordpress.com). Installation instructions can be found on this website and in the GitHub repository created for the package.
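For readers unfamiliar with the governing law, a deliberately simple 1-D stream-power incision toy is sketched below (TTLEM itself is MATLAB and uses a higher-order flux-limited finite-volume scheme; this explicit upwind scheme with hypothetical parameter values only illustrates dz/dt = U - K A^m S^n).

```python
n_nodes, dx, dt, steps = 100, 100.0, 50.0, 2000
U, K, m, n = 1e-3, 5e-5, 0.5, 1.0            # uplift, erodibility, exponents
z = [0.0] * n_nodes                          # initial flat profile
# Toy drainage area: largest at the outlet (node 0), shrinking upstream.
area = [(n_nodes - i) * 1e4 for i in range(n_nodes)]

for _ in range(steps):
    new_z = z[:]
    for i in range(1, n_nodes):              # node 0 is fixed baselevel
        slope = max((z[i] - z[i - 1]) / dx, 0.0)
        erosion = K * area[i] ** m * slope ** n
        new_z[i] = z[i] + dt * (U - erosion)
    z = new_z

print("relief after %d steps: %.1f m" % (steps, max(z) - min(z)))
```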
Testing the Accuracy of Data-driven MHD Simulations of Active Region Evolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leake, James E.; Linton, Mark G.; Schuck, Peter W., E-mail: james.e.leake@nasa.gov
Models for the evolution of the solar coronal magnetic field are vital for understanding solar activity, yet the best measurements of the magnetic field lie at the photosphere, necessitating the development of coronal models which are “data-driven” at the photosphere. We present an investigation to determine the feasibility and accuracy of such methods. Our validation framework uses a simulation of active region (AR) formation, modeling the emergence of magnetic flux from the convection zone to the corona, as a ground-truth data set, to supply both the photospheric information and to perform the validation of the data-driven method. We focus our investigation on how the accuracy of the data-driven model depends on the temporal frequency of the driving data. The Helioseismic and Magnetic Imager on NASA’s Solar Dynamics Observatory produces full-disk vector magnetic field measurements at a 12-minute cadence. Using our framework we show that ARs that emerge over 25 hr can be modeled by the data-driving method with only ∼1% error in the free magnetic energy, assuming the photospheric information is specified every 12 minutes. However, for rapidly evolving features, under-sampling of the dynamics at this cadence leads to a strobe effect, generating large electric currents and incorrect coronal morphology and energies. We derive a sampling condition for the driving cadence based on the evolution of these small-scale features, and show that higher-cadence driving can lead to acceptable errors. Future work will investigate the source of errors associated with deriving plasma variables from the photospheric magnetograms as well as other sources of errors, such as reduced resolution, instrument bias, and noise.
Eissing, Thomas; Kuepfer, Lars; Becker, Corina; Block, Michael; Coboeken, Katrin; Gaub, Thomas; Goerlitz, Linus; Jaeger, Juergen; Loosen, Roland; Ludewig, Bernd; Meyer, Michaela; Niederalt, Christoph; Sevestre, Michael; Siegmund, Hans-Ulrich; Solodenko, Juri; Thelen, Kirstin; Telle, Ulrich; Weiss, Wolfgang; Wendl, Thomas; Willmann, Stefan; Lippert, Joerg
2011-01-01
Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multiscale by nature, project work, and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug–drug, or drug–metabolite interactions can be addressed using this mechanistic, insight driven multiscale modeling approach. PMID:21483730
The Heavy Links between Geological Events and Vascular Plants Evolution: A Brief Outline
Piombino, Aldo
2016-01-01
Since the rise of photosynthesis, life has influenced terrestrial atmosphere, particularly the O2 and the CO2 content (the latter being originally more than 95%), changing the chemistry of waters, atmosphere, and soils. Billions of years after, a far offspring of these first unicellular forms conquered emerging lands, not only completely changing landscape, but also modifying geological cycles of deposition and erosion, many chemical and physical characteristics of soils and fresh waters, and, more, the cycle of various elements. So, there are no doubts that vascular plants modified geology; but it is true that also geology has affected (and, more, has driven) plant evolution. New software, PyRate, has determined vascular plant origin and diversification through a Bayesian analysis of fossil record from Silurian to today, particularly observing their origination and extinction rate. A comparison between PyRate data and geological history suggests that geological events massively influenced plant evolution and that also the rise of nonflowering seed plants and the fast diffusion of flowering plants can be explained, almost partly, with the environmental condition changes induced by geological phenomena. PMID:26966609
The evolutionary and behavioral modification of consumer responses to environmental change.
Abrams, Peter A
2014-02-21
How will evolution or other forms of adaptive change alter the response of a consumer species' population density to environmentally driven changes in population growth parameters? This question is addressed by analyzing some simple consumer-resource models to separate the ecological and evolutionary components of the population's response. Ecological responses are always decreased population size, but evolution of traits that have effects on both resource uptake rate and another fitness-related parameter may magnify, offset, or reverse this population decrease. Evolution can change ecologically driven decreases in population size to increases; this is likely when: (1) resources are initially below the density that maximizes resource growth, and (2) the evolutionary response decreases the consumer's resource uptake rate. Evolutionary magnification of the ecological decreases in population size can occur when the environmental change is higher trait-independent mortality. Such evolution-driven decreases are most likely when uptake-rate traits increase and the resource is initially below its maximum growth density. It is common for the difference between the new eco-evolutionary equilibrium and the new ecological equilibrium to be larger than that between the original and new ecological equilibrium densities. The relative magnitudes of ecological and evolutionary effects often depend sensitively on the magnitude of the environmental change and the nature of resource growth. © 2013 Elsevier Ltd. All rights reserved.
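For orientation, a canonical consumer-resource pair of the general type analyzed reads as below (a generic logistic-resource form with linear consumption; the paper studies several model variants, so this is a reference form, not its exact equations).

```latex
\frac{dR}{dt} = rR\left(1 - \frac{R}{K}\right) - aRC, \qquad
\frac{dC}{dt} = C\left(\varepsilon a R - d\right)
```

In this form the ecological equilibrium sits at R* = d/(εa), and the abstract's condition that resources lie below the density maximizing resource growth corresponds to R* < K/2, since logistic growth peaks at half the carrying capacity.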
Federating Cyber and Physical Models for Event-Driven Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth
The purpose of this paper is to describe a novel method to improve the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, defined as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the construction of interfaces that link all of the domain models.
NASA Technical Reports Server (NTRS)
Changsheng, LI; Frolking, Steve; Frolking, Tod A.
1992-01-01
Simulations of N2O and CO2 emissions from soils were conducted with a rain-event driven, process-oriented model (DNDC) of nitrogen and carbon cycling processes in soils. The magnitude and trends of simulated N2O (or N2O + N2) and CO2 emissions were consistent with the results obtained in field experiments. The successful simulation of these emissions from the range of soil types examined demonstrates that the DNDC will be a useful tool for the study of linkages among climate, soil-atmosphere interactions, land use, and trace gas fluxes.
Monogamy and haplodiploidy act in synergy to promote the evolution of eusociality.
Fromhage, Lutz; Kokko, Hanna
2011-07-19
In eusocial species, some individuals sacrifice their own reproduction for the benefit of others. The evolutionary transition towards eusociality may have been facilitated by ancestral species having a monogamous mating system (the monogamy hypothesis) or a haplodiploid genetic system (the haplodiploidy hypothesis), or it may have been entirely driven by other (ecological) factors. Here we show, using a model that describes the dynamics of insect colony foundation, growth and death, that monogamy and haplodiploidy facilitate the evolution of eusociality in a novel, mutually reinforcing way. Our findings support the recently questioned importance of relatedness for the evolution of eusociality, and simultaneously highlight the importance of explicitly accounting for the ecological rules of colony foundation, growth and death in models of social evolution.
Extraordinary Oscillations of an Ordinary Forced Pendulum
ERIC Educational Resources Information Center
Butikov, Eugene I.
2008-01-01
Several well-known and newly discovered counterintuitive regular and chaotic modes of the sinusoidally driven rigid planar pendulum are discussed and illustrated by computer simulations. The software supporting the investigation offers many interesting predefined examples that demonstrate various peculiarities of this famous physical model.…
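The model in question is the standard damped, sinusoidally driven rigid pendulum; in common textbook notation (assumed here, not quoted from the article) its equation of motion is:

```latex
\ddot{\theta} + 2\gamma\,\dot{\theta} + \omega_0^{2}\sin\theta = f\cos\omega t
```

Depending on the drive amplitude f and frequency ω, this single equation yields the regular, subharmonic, and chaotic regimes that such simulations explore.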
ISM simulations: an overview of models
NASA Astrophysics Data System (ADS)
de Avillez, M. A.; Breitschwerdt, D.; Asgekar, A.; Spitoni, E.
2015-03-01
Until recently the dynamical evolution of the interstellar medium (ISM) was simulated using collisional ionization equilibrium (CIE) conditions. However, the ISM is a dynamical system, in which the plasma is naturally driven out of equilibrium due to atomic and dynamic processes operating on different timescales. A step forward in the field comprises a multi-fluid approach taking into account the joint thermal and dynamical evolutions of the ISM gas.
Scenario driven data modelling: a method for integrating diverse sources of data and data streams
2011-01-01
Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allows both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created a RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
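A small illustration of the multi-relational directed graph idea using the rdflib library (the predicates and URIs below are invented for the example; the paper defines its own scenario-derived vocabulary).

```python
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/sddm/")
g = Graph()

gene   = URIRef(EX["NDM-1"])
report = URIRef(EX["news-report-42"])

# Triples linking genetics data to a traditional-media data stream.
g.add((gene, EX.resistanceTo, Literal("carbapenem")))
g.add((report, EX.mentionsGene, gene))
g.add((report, EX.publishedIn, Literal("traditional media")))

# A graph traversal answering a hypothetical end-user requirement.
for s, p, o in g.triples((None, EX.mentionsGene, gene)):
    print(f"{s} links a media stream to {o}")
```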
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
Evaluating Predictive Models of Software Quality
NASA Astrophysics Data System (ADS)
Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.
2014-06-01
Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
NASA Astrophysics Data System (ADS)
Perez, J. C.; Chandran, B. D. G.
2017-12-01
In this work we present recent results from high-resolution direct numerical simulations and a phenomenological model that describes the radial evolution of reflection-driven Alfven Wave turbulence in the solar atmosphere and the inner solar wind. The simulations are performed inside a narrow magnetic flux tube that models a coronal hole extending from the solar surface through the chromosphere and into the solar corona to approximately 21 solar radii. The simulations include prescribed empirical profiles that account for the inhomogeneities in density, background flow, and the background magnetic field present in coronal holes. Alfven waves are injected into the solar corona by imposing random, time-dependent velocity and magnetic field fluctuations at the photosphere. The phenomenological model incorporates three important features observed in the simulations: dynamic alignment, weak/strong nonlinear AW-AW interactions, and that the outward-propagating AWs launched by the Sun split into two populations with different characteristic frequencies. Model and simulations are in good agreement and show that when the key physical parameters are chosen within observational constraints, reflection-driven Alfven turbulence is a plausible mechanism for the heating and acceleration of the fast solar wind. By flying a virtual Parker Solar Probe (PSP) through the simulations, we will also establish comparisons between the model and simulations with the kind of single-point measurements that PSP will provide.
Desiderata for computable representations of electronic health records-driven phenotype algorithms.
Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Denny, Joshua C; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A
2015-11-01
Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
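Desideratum (4), implementing phenotype logic as set operations and relational algebra, is perhaps the easiest to illustrate. The sketch below is a minimal illustration, not a PheRM implementation; the patient identifiers and code groups are hypothetical.

```python
# Minimal sketch of desideratum (4): a phenotype case definition expressed as
# set operations over patient IDs returned by structured queries.
diabetes_dx  = {101, 102, 105, 110}   # patients with a diabetes diagnosis code
metformin_rx = {102, 105, 111}        # patients with a metformin prescription
t1dm_dx      = {101}                  # type 1 diabetes codes (exclusion)

# Case definition: (diagnosis AND medication) NOT type-1
cases = (diabetes_dx & metformin_rx) - t1dm_dx
print(sorted(cases))  # [102, 105]
```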
PyEvolve: a toolkit for statistical modelling of molecular evolution.
Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A
2004-01-05
Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood-based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpGs, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter-rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter-rich likelihood functions solvable within hours on multi-CPU hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.
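To show the kind of likelihood calculation such toolkits optimise, here is a self-contained worked example. This is deliberately not PyEvolve's actual API: it computes the Jukes-Cantor likelihood of two aligned sequences as a function of branch length and finds the maximum by grid search.

```python
# Not PyEvolve's API -- a plain illustration of phylogeny-based maximum
# likelihood: Jukes-Cantor probability of two aligned sequences at distance t.
import math

def jc_prob(same, t):
    """P(same base) or P(specific different base) under Jukes-Cantor after time t."""
    p_diff = 0.75 * (1.0 - math.exp(-4.0 * t / 3.0))
    return (1.0 - p_diff) if same else (p_diff / 3.0)

def log_likelihood(seq_a, seq_b, t):
    return sum(math.log(0.25 * jc_prob(a == b, t)) for a, b in zip(seq_a, seq_b))

# Crude grid search for the branch length maximising the likelihood.
seqs = ("ACGTACGTAC", "ACGTACGAAC")
best_t = max((i / 1000.0 for i in range(1, 3000)),
             key=lambda t: log_likelihood(*seqs, t))
print(round(best_t, 3))  # ~0.107 for 1 difference in 10 sites
```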
Predictive assimilation framework to support contaminated site understanding and remediation
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.
2014-12-01
Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence) and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality-controls and stores near real-time environmental data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application which has five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Access to and interaction with PAF is done through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We will present an implementation of PAF which uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado) for which PAF automatically ingests hydrological data and forward models groundwater flow in the saturated zone.
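The five-component loop above maps naturally onto a small orchestration skeleton. This is a hedged sketch only; the component names come from the abstract, but every function body, field name, and number below is a placeholder, not the PAF implementation.

```python
# Skeleton of the five PAF components described above (all bodies are stubs).
def acquire():        return {"head_m": 2.31, "ec_uScm": 840}    # (1) data acquisition
def store(record):    print("stored:", record)                    # (2) data management
def assimilate(rec):  return {"flow_m3d": 5.2 * rec["head_m"]}   # (3) assimilation/modeling
def publish(state):   print("dashboard update:", state)           # (4) visualization/delivery

def orchestrate():                                                 # (5) orchestration
    record = acquire()
    store(record)
    publish(assimilate(record))

orchestrate()
```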
McFadden, David G.; Politi, Katerina; Bhutkar, Arjun; Chen, Frances K.; Song, Xiaoling; Pirun, Mono; Santiago, Philip M.; Kim-Kiselak, Caroline; Platt, James T.; Lee, Emily; Hodges, Emily; Rosebrock, Adam P.; Bronson, Roderick T.; Socci, Nicholas D.; Hannon, Gregory J.; Jacks, Tyler; Varmus, Harold
2016-01-01
Genetically engineered mouse models (GEMMs) of cancer are increasingly being used to assess putative driver mutations identified by large-scale sequencing of human cancer genomes. To accurately interpret experiments that introduce additional mutations, an understanding of the somatic genetic profile and evolution of GEMM tumors is necessary. Here, we performed whole-exome sequencing of tumors from three GEMMs of lung adenocarcinoma driven by mutant epidermal growth factor receptor (EGFR), mutant Kirsten rat sarcoma viral oncogene homolog (Kras), or overexpression of MYC proto-oncogene. Tumors from EGFR- and Kras-driven models exhibited, respectively, 0.02 and 0.07 nonsynonymous mutations per megabase, a dramatically lower average mutational frequency than observed in human lung adenocarcinomas. Tumors from models driven by strong cancer drivers (mutant EGFR and Kras) harbored few mutations in known cancer genes, whereas tumors driven by MYC, a weaker initiating oncogene in the murine lung, acquired recurrent clonal oncogenic Kras mutations. In addition, although EGFR- and Kras-driven models both exhibited recurrent whole-chromosome DNA copy number alterations, the specific chromosomes altered by gain or loss were different in each model. These data demonstrate that GEMM tumors exhibit relatively simple somatic genotypes compared with human cancers of a similar type, making these autochthonous model systems useful for additive engineering approaches to assess the potential of novel mutations on tumorigenesis, cancer progression, and drug sensitivity. PMID:27702896
Model-Driven Architecture for Agent-Based Systems
NASA Technical Reports Server (NTRS)
Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.
2004-01-01
The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The generation of software artifacts is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.
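The fan-out from one component model to several artifacts is the essence of this kind of model-to-text transformation. Below is a minimal sketch under invented assumptions: the spec format, the templates, and the `InventoryPlugin` example are hypothetical, not the Cougaar MDA's actual metamodel.

```python
# Illustrative model-to-text transformation: one component specification
# generates code, documentation, and a test skeleton (all templates invented).
component = {"name": "InventoryPlugin", "ports": ["publishAdd", "subscribe"]}

def gen_code(c):
    body = "\n".join(f"    public void {p}() {{ /* TODO */ }}" for p in c["ports"])
    return f"public class {c['name']} extends ComponentPlugin {{\n{body}\n}}"

def gen_doc(c):
    return f"{c['name']}: exposes {', '.join(c['ports'])}"

def gen_test(c):
    return f"public class {c['name']}Test {{ /* one test per port */ }}"

for artifact in (gen_code(component), gen_doc(component), gen_test(component)):
    print(artifact, end="\n---\n")
```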
Solar Energy Evolution and Diffusion Studies | Solar Research | NREL
Industry-wide studies that use data-driven and evidence-based methods to identify characteristics [...]; developed models of U.S. household PV adoption. The project also conducted two market pilots to test methods.
Real-time divergent evolution in plants driven by pollinators
Gervasi, Daniel D. L.; Schiestl, Florian P
2017-01-01
Pollinator-driven diversification is thought to be a major source of floral variation in plants. Our knowledge of this process is, however, limited to indirect assessments of evolutionary changes. Here, we employ experimental evolution with fast cycling Brassica rapa plants to demonstrate adaptive evolution driven by different pollinators. Our study shows pollinator-driven divergent selection as well as divergent evolution in plant traits. Plants pollinated by bumblebees evolved taller size and more fragrant flowers with increased ultraviolet reflection. Bumblebees preferred bumblebee-pollinated plants over hoverfly-pollinated plants at the end of the experiment, showing that plants had adapted to the bumblebees' preferences. Plants with hoverfly pollination became shorter, had reduced emission of some floral volatiles, but increased fitness through augmented autonomous self-pollination. Our study demonstrates that changes in pollinator communities can have rapid consequences on the evolution of plant traits and mating system. PMID:28291771
Stav, Erlend; Walderhaug, Ståle; Mikalsen, Marius; Hanke, Sten; Benc, Ivan
2013-11-01
The proper use of ICT services can support seniors in living independently for longer. While such services are starting to emerge, current proprietary solutions are often expensive, cover only isolated parts of seniors' needs, and lack support for sharing information between services and between users. For developers, the challenge is that it is complex and time-consuming to develop high-quality, interoperable services, and new techniques are needed to simplify development and reduce development costs. This paper provides the complete view of the experiences gained in the MPOWER project with respect to using model-driven development (MDD) techniques for Service Oriented Architecture (SOA) system development in the Ambient Assisted Living (AAL) domain. To address this challenge, the approach of the European research project MPOWER (2006-2009) was to investigate and record user needs, define a set of reusable software services based on these needs, and then implement pilot systems using these services. Further, a model-driven toolchain covering key development phases was developed to support software developers through this process. Evaluations were conducted both on the technical artefacts (methodology and tools) and on end-user experience from using the pilot systems at trial sites. The outcome of the work on user needs is a knowledge base recorded as a Unified Modeling Language (UML) model. This comprehensive model describes actors, use cases, and features derived from these. The model further includes the design of a set of software services, including full trace information back to the features and use cases motivating their design. Based on the model, the services were implemented for use in SOA systems, and are publicly available as open source software. The services were successfully used in the realization of two pilot applications. There is therefore a direct and traceable link from the user needs of the elderly, through the service design knowledge base, to the service and pilot implementations. The evaluation of the SOA approach on the developers in the project revealed that SOA is useful with respect to job performance and quality. Furthermore, the developers found SOA easy to use and supportive of the development of AAL applications. An important finding is that the developers clearly report that they intend to use SOA in the future, but not for all types of projects. With respect to using model-driven development in web service design and implementation, the developers reported that it was useful. However, the code generated from the models must be correct if the full potential of MDD is to be achieved. The pilots and their evaluation at the trial sites showed that the services of the platform are sufficient to create suitable systems for end users in the domain. A SOA platform with a set of reusable domain services is a suitable foundation for more rapid development and tailoring of assisted living systems covering recurring needs among elderly users. It is feasible to realize a toolchain for model-driven development of SOA applications in the AAL domain, and such a toolchain can be accepted and found useful by software developers. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design
NASA Astrophysics Data System (ADS)
Pache, Charly
2002-01-01
One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between the groups responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed as VBA (Visual Basic for Applications) scripts, are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of SDM's current version. Taking these capabilities and limitations into account, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Tradeoff simulation capabilities, security, reliability, and hardware and software issues will also be thoroughly discussed.
Is a larger refuge always better? Dispersal and dose in pesticide resistance evolution
Takahashi, Daisuke; Yamanaka, Takehiko; Sudo, Masaaki; Andow, David A.
2017-01-01
The evolution of resistance against pesticides is an important problem of modern agriculture. The high‐dose/refuge strategy, which divides the landscape into treated and nontreated (refuge) patches, has proven effective at delaying resistance evolution. However, theoretical understanding is still incomplete, especially for combinations of limited dispersal and partially recessive resistance. We reformulate a two‐patch model based on the Comins model and derive a simple quadratic approximation to analyze the effects of limited dispersal, refuge size, and dominance for high efficacy treatments on the rate of evolution. When a small but substantial number of heterozygotes can survive in the treated patch, a larger refuge always reduces the rate of resistance evolution. However, when dominance is small enough, the evolutionary dynamics in the refuge population, which is indirectly driven by migrants from the treated patch, mainly describes the resistance evolution in the landscape. In this case, for small refuges, increasing the refuge size will increase the rate of resistance evolution. Our analysis distils major driving forces from the model, and can provide a framework for understanding directional selection in source‐sink environments. PMID:28422284
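The two-patch dynamic described above is easy to make concrete numerically. The sketch below is not the authors' Comins-model reformulation or their quadratic approximation; the fitnesses, migration rate, and patch weighting are invented, equal-sized subpopulations are assumed, and it only illustrates how dominance changes the speed of resistance evolution.

```python
# Toy two-patch high-dose/refuge model: selection in the treated patch,
# none in the refuge, with symmetric dispersal each generation.
def select(q, w_rr, w_rs, w_ss):
    """One generation of viability selection on resistance-allele frequency q."""
    p = 1.0 - q
    w_bar = q*q*w_rr + 2*p*q*w_rs + p*p*w_ss
    return (q*q*w_rr + p*q*w_rs) / w_bar

def generations_to_resistance(m=0.05, dominance=0.1, refuge=0.5, q0=1e-3, target=0.5):
    q_t = q_r = q0
    for gen in range(1, 100_001):
        q_t, q_r = (1-m)*q_t + m*q_r, (1-m)*q_r + m*q_t   # dispersal
        q_t = select(q_t, 1.0, dominance, 0.01)           # high dose kills most SS
        # no selection in the refuge patch
        if refuge*q_r + (1-refuge)*q_t >= target:
            return gen
    return None

# Lower dominance (more recessive resistance) delays resistance evolution.
print(generations_to_resistance(dominance=0.05),
      generations_to_resistance(dominance=0.25))
```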
Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J
2014-02-01
A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
KRAS-driven lung adenocarcinoma: combined DDR1/Notch inhibition as an effective therapy
Ambrogio, Chiara; Nadal, Ernest; Villanueva, Alberto; Gómez-López, Gonzalo; Cash, Timothy P; Barbacid, Mariano; Santamaría, David
2016-01-01
Understanding the early evolution of cancer heterogeneity during the initial steps of tumorigenesis can uncover vulnerabilities of cancer cells that may be masked at later stages. We describe a comprehensive approach employing gene expression analysis in early lesions to identify novel therapeutic targets and the use of mouse models to test synthetic lethal drug combinations to treat human Kirsten rat sarcoma viral oncogene homologue (KRAS)-driven lung adenocarcinoma. PMID:27843638
Pedagogical Issues in Object Orientation.
ERIC Educational Resources Information Center
Nerur, Sridhar; Ramanujan, Sam; Kesh, Someswar
2002-01-01
Discusses the need for people with object-oriented (OO) skills, explains benefits of OO in software development, and addresses some of the difficulties in teaching OO. Topics include the evolution of programming languages; differences between OO and traditional approaches; differences from data modeling; and Unified Modeling Language (UML) and…
Changing Needs, Changing Models: Instructional Technology Training at Bronx Community College
ERIC Educational Resources Information Center
Wach, Howard
2007-01-01
In this article Howard Wach describes the gradual evolution of instructional technology faculty development programs at Bronx Community College from "one-shot" two-hour software training sessions toward a comprehensive model that combines intensive summer sessions, academic year implementation, peer mentoring, and accountability. The…
The Ozone Widget Framework: towards modularity of C2 human interfaces
NASA Astrophysics Data System (ADS)
Hellar, David Benjamin; Vega, Laurian C.
2012-05-01
The Ozone Widget Framework (OWF) is a common webtop environment for distribution across the enterprise. A key mission driver for OWF is to enable rapid capability delivery by lowering time-to-market with lightweight components. OWF has been released as Government Open Source Software and has been deployed in a variety of C2 net-centric contexts ranging from real-time analytics and cyber-situational awareness to strategic and operational planning. This paper discusses the current and future evolution of OWF, including the availability of the OZONE Marketplace (OMP), user-activity-driven metrics, and architecture enhancements for accessibility. Together, OWF is moving towards the rapid delivery of modular human interfaces supporting modern and future command and control contexts.
Anatomy of an anesthesia information management system.
Shah, Nirav J; Tremper, Kevin K; Kheterpal, Sachin
2011-09-01
Anesthesia information management systems (AIMS) have become more prevalent as more sophisticated hardware and software have increased usability and reliability. National mandates and incentives have driven adoption as well. AIMS can be developed in one of several software models (Web based, client/server, or incorporated into a medical device). Irrespective of the development model, the best AIMS have a feature set that allows for comprehensive management of workflow for an anesthesiologist. Key features include preoperative, intraoperative, and postoperative documentation; quality assurance; billing; compliance and operational reporting; patient and operating room tracking; and integration with hospital electronic medical records. Copyright © 2011 Elsevier Inc. All rights reserved.
Particle force model effects in a shock-driven multiphase instability
NASA Astrophysics Data System (ADS)
Black, W. J.; Denissen, N.; McFarland, J. A.
2018-05-01
This work presents simulations on a shock-driven multiphase instability (SDMI) at an initial particle volume fraction of 1% with the addition of a suite of particle force models applicable in dense flows. These models include pressure-gradient, added-mass, and interparticle force terms in an effort to capture the effects neighboring particles have in non-dilute flow regimes. Two studies are presented here: the first seeks to investigate the individual contributions of the force models, while the second study focuses on examining the effect of these force models on the hydrodynamic evolution of a SDMI with various particle relaxation times (particle sizes). In the force study, it was found that the pressure gradient and interparticle forces have little effect on the instability under the conditions examined, while the added-mass force decreases the vorticity deposition and alters the morphology of the instability. The relaxation-time study likewise showed a decrease in metrics associated with the evolution of the SDMI for all sizes when the particle force models were included. The inclusion of these models showed significant morphological differences in both the particle and carrier species fields, which increased as particle relaxation times increased.
Partition-based discrete-time quantum walks
NASA Astrophysics Data System (ADS)
Konno, Norio; Portugal, Renato; Sato, Iwao; Segawa, Etsuo
2018-04-01
We introduce a family of discrete-time quantum walks, called the two-partition model, based on two equivalence-class partitions of the computational basis, which establish the notion of local dynamics. This family encompasses most versions of unitary discrete-time quantum walks driven by two local operators studied in the literature, such as the coined model, Szegedy's model, and the 2-tessellable staggered model. We also analyze the connection of those models with the two-step coined model, which is driven by the square of the evolution operator of the standard discrete-time coined walk. We prove formally that the two-step coined model, an extension of Szegedy's model for multigraphs, and the two-tessellable staggered model are unitarily equivalent. Selecting one specific model among those families is then a matter of taste, not of generality.
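The "two-step coined model" is simple to exhibit numerically: build the one-step evolution operator U = S·C of a standard coined walk and square it. The sketch below uses a Hadamard coin on an N-cycle; the graph, coin, and initial state are chosen here purely for illustration and are not from the paper.

```python
# Two-step coined walk on an N-cycle: U = S @ C, evolve with U squared.
import numpy as np

N = 8
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)           # Hadamard coin
C = np.kron(np.eye(N), H)                               # coin on |position>|coin|
S = np.zeros((2 * N, 2 * N))                            # conditional shift
for x in range(N):                                      # basis index = 2*x + coin
    S[2 * ((x + 1) % N), 2 * x] = 1.0                   # coin 0 shifts right
    S[2 * ((x - 1) % N) + 1, 2 * x + 1] = 1.0           # coin 1 shifts left
U2 = (S @ C) @ (S @ C)                                  # two-step coined operator

assert np.allclose(U2 @ U2.conj().T, np.eye(2 * N))     # still unitary
psi = np.zeros(2 * N); psi[0] = 1.0                     # walker at site 0, coin 0
for _ in range(5):
    psi = U2 @ psi
print(np.round((psi.reshape(N, 2) ** 2).sum(axis=1), 3))  # position distribution
```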
On the upscaling of process-based models in deltaic applications
NASA Astrophysics Data System (ADS)
Li, L.; Storms, J. E. A.; Walstra, D. J. R.
2018-03-01
Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing the morphological indicators such as distributary channel networking and delta volumes derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm added noise. The overall results show that the variability of the accelerated models fall within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.
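As a concrete illustration of acceleration by compressing morphological time, consider the widely used idea of multiplying the bed change computed in each hydrodynamic step by a factor f. The toy 1-D diffusion model below is not the authors' Delft3D setup or their exact Time-scale compression method; all rates and the grid are invented, and it only shows why an accelerated run can stay close to a base case.

```python
# Toy bed-evolution model: f compresses morphological time so that
# `steps` hydrodynamic steps represent f * steps of bed change.
def evolve_bed(bed, steps, f=1.0, transport_rate=0.002):
    """1-D diffusive smoothing of a bed profile with acceleration factor f."""
    for _ in range(steps):
        dz = [transport_rate * (bed[i-1] - 2*bed[i] + bed[i+1])
              for i in range(1, len(bed) - 1)]
        for i, d in enumerate(dz, start=1):
            bed[i] += f * d
    return bed

slow = evolve_bed([0, 0, 1, 0, 0], steps=400, f=1)   # base case
fast = evolve_bed([0, 0, 1, 0, 0], steps=100, f=4)   # 75% fewer steps
print([round(a - b, 3) for a, b in zip(slow, fast)])  # small differences only
```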
An Effective Continuum Model for the Gas Evolution in Internal Steam Drives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsimpanogiannis, Ioannis N.; Yortsos, Yanis C.
This report examines the gas phase growth from a supersaturated, slightly compressible, liquid in a porous medium, driven by heat transfer and controlled by the application of a constant-rate decline of the system pressure.
Status of Ongoing Work in Software TRAs/TRLs
2010-04-29
to changes/updates being driven by corporate market dynamics • Changes not under control or under the influence of the PMO! • On programs with long... observed and reported research articles, peer-reviewed white papers, point papers, early conceptual models in academia, experimental basic research
Landform Erosion and Volatile Redistribution on Ganymede and Callisto
NASA Technical Reports Server (NTRS)
Moore, Jeffrey Morgan; Howard, Alan D.; McKinnon, William B.; Schenk, Paul M.; Wood, Stephen E.
2009-01-01
We have been modeling landscape evolution on the Galilean satellites driven by volatile transport. Our work directly addresses some of the most fundamental issues pertinent to deciphering icy Galilean satellite geologic histories by employing techniques currently at the forefront of terrestrial, martian, and icy satellite landscape evolution studies [e.g., 1-6], including modeling of surface and subsurface energy and volatile exchanges, and computer simulation of long-term landform evolution by a variety of processes. A quantitative understanding of the expression and rates of landform erosion, and of volatile redistribution on landforms, is especially essential in interpreting endogenic landforms that have, in many cases, been significantly modified by erosion [e.g., 7-9].
The effect of sheared toroidal rotation on pressure driven magnetic islands in toroidal plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegna, C. C.
2016-05-15
The impact of sheared toroidal rotation on the evolution of pressure driven magnetic islands in tokamak plasmas is investigated using a resistive magnetohydrodynamics model augmented by a neoclassical Ohm's law. Particular attention is paid to the asymptotic matching data as the Mercier indices are altered in the presence of sheared flow. Analysis of the nonlinear island Grad-Shafranov equation shows that sheared flows tend to amplify the stabilizing pressure/curvature contribution to pressure driven islands in toroidal tokamaks relative to the island bootstrap current contribution. As such, sheared toroidal rotation tends to reduce saturated magnetic island widths.
A Model for Mapping Linkages between Health and Education Agencies To Improve School Health.
ERIC Educational Resources Information Center
St. Leger, Lawrence; Nutbeam, Don
2000-01-01
Reviews the evolution of efforts to develop effective, sustainable school health programs, arguing that efforts were significantly driven by public health priorities and have not adequately accounted for educational perspectives. A model illustrating linkages between different school-based inputs and strategies and long-term health and educational…
Wind Turbine Blade CAD Models Used as Scaffolding Technique to Teach Design Engineers
ERIC Educational Resources Information Center
Irwin, John
2013-01-01
The Siemens PLM CAD software NX is commonly used for designing mechanical systems, and in complex systems such as the emerging area of wind power, the ability to have a model controlled by design parameters is a certain advantage. Formula-driven expressions based on the amount of available wind in an area can drive the amount of effective surface…
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
NASA Astrophysics Data System (ADS)
Huang, Shiquan; Yi, Youping; Li, Pengchuan
2011-05-01
In recent years, the multi-scale simulation of metal forming has been gaining significant attention for predicting the whole deformation process and the microstructure evolution of the product. Advances in numerical simulation of metal forming at the macro-scale level are remarkable, and commercial FEM software such as Deform2D/3D has found wide application in the field. However, multi-scale simulation methods have found little application due to the non-linearity of microstructure evolution during forming and the difficulty of modeling at the micro-scale level. This work deals with the modeling of microstructure evolution and a new method of multi-scale simulation of the forging process. The aviation material 7050 aluminum alloy is used as an example for the modeling of microstructure evolution. The corresponding thermal simulation experiments were performed on a Gleeble 1500 machine. The tested specimens were analyzed for modeling of dislocation density and of the nucleation and growth of recrystallization (DRX). A source program using the cellular automaton (CA) method was developed to simulate grain nucleation and growth, in which the change of grain topology caused by the metal deformation was considered. The physical fields at the macro-scale level, such as the temperature, stress and strain fields, which can be obtained with the commercial software Deform 3D, are coupled with the stored deformation energy at the micro-scale level through the dislocation model to realize the multi-scale simulation. The method is illustrated by the forging process simulation of an aircraft wheel hub forging. By coupling the Deform 3D results with the CA results, the forging deformation progress and the microstructure evolution at any point of the forging could be simulated. To verify the simulation, experiments on the aircraft wheel hub forging were carried out in the laboratory, and the comparison of simulation and experimental results is discussed in detail.
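The cellular-automaton ingredient is easy to demonstrate in miniature. The sketch below is a generic nucleation-and-growth CA, not the authors' 7050-alloy model: the grid size, nucleation probability, and growth rule are invented, and no dislocation-energy coupling is included.

```python
# Minimal CA sketch of recrystallization: random nucleation plus growth of
# grains into unrecrystallized neighbors on a periodic 2-D grid.
import random

random.seed(0)
N, STEPS, P_NUCLEATE = 40, 30, 0.002
grid = [[0] * N for _ in range(N)]      # 0 = unrecrystallized, >0 = grain id
next_id = 1

for _ in range(STEPS):
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j]:
                continue
            neighbors = [grid[(i+di) % N][(j+dj) % N]
                         for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            grains = [g for g in neighbors if g]
            if grains:                   # growth: join an adjacent grain
                new[i][j] = random.choice(grains)
            elif random.random() < P_NUCLEATE:
                new[i][j] = next_id      # nucleation of a new grain
                next_id += 1
    grid = new

fraction = sum(1 for row in grid for g in row if g) / (N * N)
print(f"recrystallized fraction: {fraction:.2f}, grains: {next_id - 1}")
```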
New Model for Ionospheric Irregularities at Mars
NASA Astrophysics Data System (ADS)
Keskinen, M. J.
2018-03-01
A new model for ionospheric irregularities at Mars is presented. It is shown that wind-driven currents in the dynamo region of the Martian ionosphere can be unstable to the electromagnetic gradient drift instability. This plasma instability can generate ionospheric density and magnetic field irregularities with scale sizes of approximately 15-20 km down to a few kilometers. We show that the instability-driven magnetic field fluctuation amplitudes relative to background are correlated with the ionospheric density fluctuation amplitudes relative to background. Our results can explain recent observations made by the Mars Atmosphere and Volatile EvolutioN spacecraft in the Martian ionosphere dynamo region.
State-transition diagrams for biologists.
Bersini, Hugues; Klatzmann, David; Six, Adrien; Thomas-Vaslin, Véronique
2012-01-01
It is clearly in the tradition of biologists to conceptualize the dynamical evolution of biological systems in terms of state-transitions of biological objects. This paper is mainly concerned with (but obviously not limited to) the immunological branch of biology and shows how the adoption of UML (Unified Modeling Language) state-transition diagrams can ease the modeling, the understanding, the coding, the manipulation and the documentation of population-based immune software models, generally defined as a set of ordinary differential equations (ODE) describing the evolution in time of populations of various biological objects. Moreover, that same UML adoption naturally entails a far from negligible representational economy, since one graphical item of the diagram might otherwise have to be repeated in various places of the mathematical model. First, the main graphical elements of the UML state-transition diagram and how they can be mapped onto a corresponding ODE mathematical model are presented. Then, two already published immune models of thymocyte behavior and time evolution in the thymus, the first one originally conceived as an ODE population-based model and the second one as an agent-based one, are refactored and expressed in state-transition form so as to make them much easier to understand and their respective code easier to access, modify and run. As an illustrative proof, any immunologist should be able to understand faithfully enough what the two software models are supposed to reproduce and how they execute, with no need to plunge into the Java or Fortran lines.
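The diagram-to-ODE mapping the paper describes can be sketched directly: each labelled transition contributes an outflow to its source state and an inflow to its target. The populations and rates below are invented stand-ins for a thymocyte-style model, not the paper's actual parameter values.

```python
# A state-transition specification compiled into the right-hand side of an
# ODE system dN_s/dt = inflows - outflows (rates are illustrative only).
transitions = [                       # (from, to, rate per cell per day)
    ("DN", "DP", 0.8),                # double-negative -> double-positive
    ("DP", "SP", 0.05),               # double-positive -> single-positive
    ("DP", None, 0.4),                # death removes cells from the system
    ("SP", None, 0.1),
]

def rhs(populations):
    """Time derivative of each population implied by the transition diagram."""
    d = {s: 0.0 for s in populations}
    for src, dst, rate in transitions:
        flow = rate * populations[src]
        d[src] -= flow
        if dst is not None:
            d[dst] += flow
    return d

pops = {"DN": 1000.0, "DP": 0.0, "SP": 0.0}
for _ in range(100):                  # forward Euler, dt = 0.1 day
    deriv = rhs(pops)
    pops = {s: pops[s] + 0.1 * deriv[s] for s in pops}
print({s: round(v, 1) for s, v in pops.items()})
```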
Usability-driven evolution of a space instrument
NASA Astrophysics Data System (ADS)
McCalden, Alec
2012-09-01
The use of resources in the cradle-to-grave timeline of a space instrument might be significantly improved by considering the concept of usability from the start of the mission. The methodology proposed here includes giving early priority in a programme to the iterative development of a simulator that models instrument operation, and allowing this to evolve ahead of the actual instrument specification and fabrication. The advantages include reduction of risk in software development by shifting much of it to earlier in a programme than is typical, plus a test programme that uses and thereby proves the same support systems that may be used for flight. A new development flow for an instrument is suggested, showing how the system engineering phases used by the space agencies could be reworked in line with these ideas. This methodology is also likely to contribute to a better understanding between the various disciplines involved in the creation of a new instrument. The result should better capture the science needs, implement them more accurately with less wasted effort, and more fully allow the best ideas from all team members to be considered.
UWB Tracking Software Development
NASA Technical Reports Server (NTRS)
Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda
2006-01-01
An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
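The core geometry such a two-cluster AOA algorithm must solve is bearing-line intersection. The sketch below illustrates only that geometry under invented positions and angles; it is not the NASA algorithm, whose details the abstract does not give.

```python
# Two clusters at known positions each measure an angle of arrival;
# the target estimate is the intersection of the two bearing lines.
import math

def locate(p1, theta1, p2, theta2):
    """Intersect two bearing lines; angles in radians from the +x axis."""
    t1, t2 = math.tan(theta1), math.tan(theta2)
    x = (p2[1] - p1[1] + t1 * p1[0] - t2 * p2[0]) / (t1 - t2)
    y = p1[1] + t1 * (x - p1[0])
    return x, y

# Clusters at (0,0) and (10,0); target actually at (4,3).
th1 = math.atan2(3 - 0, 4 - 0)
th2 = math.atan2(3 - 0, 4 - 10)
print(locate((0, 0), th1, (10, 0), th2))  # ~ (4.0, 3.0)
```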
Bigdata Driven Cloud Security: A Survey
NASA Astrophysics Data System (ADS)
Raja, K.; Hanifa, Sabibullah Mohamed
2017-08-01
Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front-end, which includes the users' computers and the software required to access the cloud network, and a back-end, which consists of the various computers, servers and database systems that create the cloud. With SaaS (Software as a Service, where end users utilize outsourced software), PaaS (Platform as a Service, where a platform is provided), IaaS (Infrastructure as a Service, where the physical environment is outsourced), and DaaS (Database as a Service, where data can be housed within a cloud), the cloud ecosystem that delivers these services has become a powerful and popular architecture. Many challenges and issues lie in security and threats, the most vital barrier for a cloud computing environment. The main barrier to the adoption of CC in health care relates to data security. When placing and transmitting data using public networks, cyber attacks in any form are anticipated in CC. Hence, cloud service users need to understand the risk of data breaches and the adoption of the service delivery model during deployment. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC, which can be created and deployed easily, since BD evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies. In this purview, MapReduce [12] is a good example of big data processing in a cloud environment, and a model for cloud providers.
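Since the survey cites MapReduce as the canonical cloud big-data model, the textbook word-count example makes the map and reduce phases concrete. This local, single-process version is for illustration only; a real deployment shards both phases across cluster nodes.

```python
# Word count expressed as a map phase emitting (key, 1) pairs and a reduce
# phase summing per key -- run locally here for clarity.
from collections import defaultdict

def map_phase(docs):
    for doc in docs:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    counts = defaultdict(int)
    for key, value in pairs:          # in a cluster, pairs arrive grouped by key
        counts[key] += value
    return dict(counts)

print(reduce_phase(map_phase(["cloud security", "cloud big data"])))
# {'cloud': 2, 'security': 1, 'big': 1, 'data': 1}
```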
A three-phase amplification of the cosmic magnetic field in galaxies
NASA Astrophysics Data System (ADS)
Martin-Alvarez, Sergio; Devriendt, Julien; Slyz, Adrianne; Teyssier, Romain
2018-06-01
Arguably the main challenge of galactic magnetism studies is to explain how the interstellar medium of galaxies reaches energetic equipartition despite the extremely weak cosmic primordial magnetic fields originally predicted to thread the inter-galactic medium. Previous numerical studies of isolated galaxies suggest that a fast dynamo amplification might suffice to bridge the gap spanning many orders of magnitude in strength between the weak early Universe magnetic fields and the ones observed in high redshift galaxies. To better understand their evolution in the cosmological context of hierarchical galaxy growth, we probe the amplification process undergone by the cosmic magnetic field within a spiral galaxy to unprecedented accuracy by means of a suite of constrained transport magnetohydrodynamical adaptive mesh refinement cosmological zoom simulations with different stellar feedback prescriptions. A galactic turbulent dynamo is found to be naturally excited in this cosmological environment, being responsible for most of the amplification of the magnetic energy. Indeed, we find that the magnetic energy spectra of simulated galaxies display telltale inverse cascades. Overall, the amplification process can be divided into three main phases, which are related to different physical mechanisms driving galaxy evolution: an initial collapse phase, an accretion-driven phase, and a feedback-driven phase. While different feedback models affect the magnetic field amplification differently, all tested models prove to be subdominant at early epochs, before the feedback-driven phase is reached. Thus the three-phase evolution paradigm is found to be quite robust vis-a-vis feedback prescriptions.
Experimental investigation of nozzle/plume aerodynamics at hypersonic speeds
NASA Technical Reports Server (NTRS)
Bogdanoff, David W.; Cambier, Jean-Luc
1993-01-01
Work continued on the improvement of the 16-Inch Shock Tunnel. This comprised studies of ways of improving driver gas ignition, an improved driver gas mixing system, an axial translation system for the driver tube, improved diaphragm materials (carbon steel vs. stainless steel), a copper liner for the part of the driven tube near the nozzle, the use of a buffer gas between the driver and driven gases, the use of N2O in the driven tube, the use of a converging driven tube, operation of the facility as a non-reflected shock tunnel and expansion tube, operation with heated hydrogen or helium driver gas, the use of detonations in the driver, and the construction of an enlarged test section. Maintenance and developmental work on the scramjet combustor continued. New software which greatly speeds up data analysis has been written and brought on line. In particular, software which provides very rapid generation of model surface heat flux profiles has been brought on line. A considerable amount of theoretical work was performed in connection with upgrading the 16-Inch Shock Tunnel facility. A one-dimensional Godunov code for very high velocities and any equation of state was written; it is intended to add viscous effects to it for studying the operation of the Shock Tunnel and also of two-stage light gas guns.
A Simple and Accurate Rate-Driven Infiltration Model
NASA Astrophysics Data System (ADS)
Cui, G.; Zhu, J.
2017-12-01
In this study, we develop a novel Rate-Driven Infiltration Model (RDIMOD) for simulating infiltration into soils. Unlike traditional methods, RDIMOD avoids numerically solving the highly non-linear Richards equation or simply modeling with empirical parameters. RDIMOD employs the infiltration rate as model input to simulate the one-dimensional infiltration process by solving an ordinary differential equation. The model can simulate the evolution of the wetting front, infiltration rate, and cumulative infiltration on any surface slope, including the vertical and horizontal directions. Compared to the results from the Richards equation for both vertical and horizontal infiltration, RDIMOD simply and accurately predicts infiltration processes for any type of soil and soil hydraulic model without numerical difficulty. Taking into account its accuracy, capability, and computational effectiveness and stability, RDIMOD can be used in large-scale hydrologic and land-atmosphere modeling.
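To show what a rate-driven ODE formulation looks like in practice, here is a hedged sketch. The paper's exact equation is not reproduced; the sketch assumes a sharp wetting front on a vertical column, so the front depth z obeys dz/dt = i(t)/(theta_s - theta_i) for a prescribed infiltration rate i(t), with all parameter values invented.

```python
# Rate-driven wetting-front sketch: integrate dz/dt = i(t) / (theta_s - theta_i)
# for a measured/prescribed infiltration rate i(t).
from scipy.integrate import solve_ivp

theta_s, theta_i = 0.45, 0.15                     # saturated / initial water content
rate = lambda t: 0.5 * (1.0 + 1.0 / (1.0 + t))    # cm/h, decaying toward 0.5

sol = solve_ivp(lambda t, z: [rate(t) / (theta_s - theta_i)],
                t_span=(0.0, 10.0), y0=[0.0], dense_output=True)
print(round(float(sol.sol(10.0)[0]), 2), "cm wetting-front depth after 10 h")
```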
A two level mutation-selection model of cultural evolution and diversity.
Salazar-Ciudad, Isaac
2010-11-21
Cultural evolution is a complex process that can happen at several levels. At the level of individuals in a population, each human bears a set of cultural traits that he or she can transmit to his or her offspring (vertical transmission) or to other members of his or her society (horizontal transmission). The relative frequency of a cultural trait in a population or society can thus increase or decrease with the relative reproductive success of its bearers (the individual's level) or the relative success of transmission (called the idea's level). This article presents a mathematical model of the interplay between these two levels. The first aim of this article is to explore when cultural evolution is driven by the idea's level, when it is driven by the individual's level, and when it is driven by both. These three possibilities are explored in relation to (a) the amount of interchange of cultural traits between individuals, (b) the selective pressure acting on individuals, (c) the rate of production of new cultural traits, (d) the individual's capacity to remember cultural traits, and (e) the population size. The aim is to explore the conditions in which cultural evolution does not lead to a better adaptation of individuals to the environment. This is to contrast the spread of fitness-enhancing ideas, which make individual bearers better adapted to the environment, with the spread of "selfish" ideas, which spread well simply because they are easy to remember but do not help their individual bearers (and may even hurt them). At the same time this article explores in which conditions the adaptation of individuals is maximal. The second aim is to explore how these factors affect cultural diversity, or the amount of different cultural traits in a population. This study suggests that a larger interchange of cultural traits between populations could lead to cultural evolution not improving the adaptation of individuals to their environment and to a decrease of cultural diversity. Copyright © 2010 Elsevier Ltd. All rights reserved.
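An agent-based caricature of the two levels helps fix the ingredients: selection on individuals (vertical) versus selection on ideas by memorability (horizontal), under a memory limit. This is a toy sketch with invented parameters, not the article's mathematical model.

```python
# Two-level toy: vertical transmission weighted by bearer fitness, horizontal
# transmission biased toward memorable ideas, fixed per-individual memory.
import random

random.seed(1)
MEMORY, POP = 4, 50

def fitness(traits):                 # individual level: adaptive ideas help bearers
    return 1.0 + 0.2 * sum(t["adaptive"] for t in traits)

def new_trait():
    return {"adaptive": random.random() < 0.3, "memorable": random.random()}

population = [[new_trait() for _ in range(MEMORY)] for _ in range(POP)]
for generation in range(100):
    # vertical: offspring copy a parent sampled proportionally to fitness
    weights = [fitness(ind) for ind in population]
    population = [list(random.choices(population, weights)[0]) for _ in range(POP)]
    for ind in population:
        donor = random.choice(population)                # horizontal contact
        idea = max(donor, key=lambda t: t["memorable"])  # idea level wins by memorability
        ind[random.randrange(MEMORY)] = idea             # forget one trait to remember

print(sum(t["adaptive"] for ind in population for t in ind) / (POP * MEMORY))
```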
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
1992-01-01
This report is an attempt to clarify some of the concerns raised about the OMT method, specifically that OMT is weaker than the Booch method in a few key areas. This interim report specifically addresses the following issues: (1) is OMT object-oriented or only data-driven?; (2) can OMT be used as a front-end to implementation in C++?; (3) the inheritance concept in OMT is in contradiction with the 'pure and real' inheritance concept found in object-oriented (OO) design; (4) low support for software life-cycle issues, for project and risk management; (5) uselessness of functional modeling for the ROSE project; and (6) problems with event-driven and simulation systems. The conclusion of this report is that both Booch's method and Rumbaugh's method are good OO methods, each with strengths and weaknesses in different areas of the development process.
Shock Wave Propagation in Cementitious Materials at Micro/Meso Scales
NASA Astrophysics Data System (ADS)
Rajendran, Arunachalam
2015-06-01
The mechanical and constitutive response of materials like cement, and of biomaterials like fish scale and abalone shell, is very complex due to heterogeneities that are inherently present in their nano- and microstructures. The intrinsic constitutive behaviors are driven by the chemical composition and the molecular, micro, and meso structures. Therefore, it becomes important to identify the material genome as the building block of the material. For instance, in cementitious materials, the genome of the C-S-H phase (the glue or paste) that holds together the various clinkers, such as dicalcium silicate, tricalcium silicate, calcium ferroaluminates, and others, is extremely complex. The mechanical behavior of C-S-H-type materials is often influenced by the chemistry and the structures at all length scales from nano to micro. By explicitly modeling the molecular structures using appropriate potentials, it is possible to compute the elastic tensor from molecular dynamics simulations using the all-atom method. The elastic tensors for the C-S-H gel and other clinkers are determined using the software suite "Accelrys Materials Studio." A strain-rate-dependent, fracture-mechanics-based tensile damage model has been incorporated into the ABAQUS finite element code to model spall evolution in the heterogeneous cementitious material, with all constituents explicitly modeled at one-micron element resolution. This paper presents results from nano/micro/meso-scale analyses of shock wave propagation in a heterogeneous cementitious material using both molecular dynamics and finite element codes.
Supporting Source Code Comprehension during Software Evolution and Maintenance
ERIC Educational Resources Information Center
Alhindawi, Nouh
2013-01-01
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…
Model-Driven Development of Interactive Multimedia Applications with MML
NASA Astrophysics Data System (ADS)
Pleuss, Andreas; Hussmann, Heinrich
There is an increasing demand for high-quality interactive applications which combine complex application logic with a sophisticated user interface, making use of individual media objects like graphics, animations, 3D graphics, audio or video. Their development is still challenging as it requires the integration of software design, user interface design, and media design.
Combining Domain-driven Design and Mashups for Service Development
NASA Astrophysics Data System (ADS)
Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni
This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
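The annotation idea at the heart of the metaframework is easy to illustrate, here transposed to Python decorators since Roma itself is Java-based. Everything below is invented for illustration (the `aspect` decorator, the concern names, and the `PhoneNumber` example); it is not the Roma Metaframework API.

```python
# Cross-cutting concerns declared on a plain domain object, then read off by a
# framework pass -- a decorator-based stand-in for Roma-style annotations.
def aspect(**concerns):
    def wrap(cls):
        cls.__aspects__ = concerns
        return cls
    return wrap

@aspect(view="form", security="role:admin", service="rest")
class PhoneNumber:                    # the core domain object stays plain
    def __init__(self, number, operator):
        self.number, self.operator = number, operator

def generate_bindings(cls):
    """Stand-in for the framework pass that turns aspects into scaffolding."""
    return [f"{cls.__name__}: bind {k} -> {v}" for k, v in cls.__aspects__.items()]

print("\n".join(generate_bindings(PhoneNumber)))
```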
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software, which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
Compaction-Driven Evolution of Pluto's Rocky Core: Implications for Water-Rock Interactions
NASA Astrophysics Data System (ADS)
Gabasova, L. R.; Tobie, G.; Choblet, G.
2018-05-01
We model the compaction of Pluto's rocky core after accretion and explore the potential for hydrothermal circulation within the porous layer, as well as examine its effect on core cooling and the persistence of a liquid internal ocean.
Simulating Technology Processes to Foster Learning.
ERIC Educational Resources Information Center
Krumholtz, Nira
1998-01-01
Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)
Is a larger refuge always better? Dispersal and dose in pesticide resistance evolution.
Takahashi, Daisuke; Yamanaka, Takehiko; Sudo, Masaaki; Andow, David A
2017-06-01
The evolution of resistance against pesticides is an important problem of modern agriculture. The high-dose/refuge strategy, which divides the landscape into treated and nontreated (refuge) patches, has proven effective at delaying resistance evolution. However, theoretical understanding is still incomplete, especially for combinations of limited dispersal and partially recessive resistance. We reformulate a two-patch model based on the Comins model and derive a simple quadratic approximation to analyze the effects of limited dispersal, refuge size, and dominance on the rate of evolution under high-efficacy treatments. When a small but substantial number of heterozygotes can survive in the treated patch, a larger refuge always reduces the rate of resistance evolution. However, when dominance is small enough, the evolutionary dynamics in the refuge population, which is indirectly driven by migrants from the treated patch, mainly determines resistance evolution in the landscape. In this case, for small refuges, increasing the refuge size will increase the rate of resistance evolution. Our analysis distils major driving forces from the model, and can provide a framework for understanding directional selection in source-sink environments. © 2017 The Author(s). Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.
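As a toy illustration of the treated/refuge dynamics described above, the sketch below iterates resistance-allele frequencies through one round of selection followed by dispersal in two patches. All fitness values, the dispersal rate, and the refuge fraction are invented placeholders, not the paper's parameterization of the Comins model.

```python
# Illustrative two-patch resistance-evolution recursion (placeholder
# parameters; not the authors' exact model).

def generation(p_t, p_r, w_rr=1.0, w_rs=0.1, w_ss=0.01, m=0.05, refuge=0.2):
    """One generation: p_t, p_r are resistance-allele frequencies in the
    treated and refuge patches; resistance is partially recessive via the
    heterozygote fitness w_rs, and the refuge is untreated."""
    def select(p, w_rr, w_rs, w_ss):
        mean_w = p*p*w_rr + 2*p*(1-p)*w_rs + (1-p)**2*w_ss
        return (p*p*w_rr + p*(1-p)*w_rs) / mean_w
    p_t = select(p_t, w_rr, w_rs, w_ss)       # pesticide kills susceptibles
    # refuge: no treatment, all genotypes equally fit, so p_r is unchanged
    # dispersal: a fraction m of each patch mixes into a common pool
    pool = refuge * p_r + (1 - refuge) * p_t
    p_t = (1 - m) * p_t + m * pool
    p_r = (1 - m) * p_r + m * pool
    return p_t, p_r

p_t = p_r = 1e-4                               # rare resistance allele
for _ in range(50):
    p_t, p_r = generation(p_t, p_r)
print(f"after 50 generations: treated={p_t:.4f}, refuge={p_r:.4f}")
```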
Generic analysis of kinetically driven inflation
NASA Astrophysics Data System (ADS)
Saitou, Rio
2018-04-01
We perform a model-independent analysis of kinetically driven inflation (KDI), which (partially) includes generalized G-inflation and ghost inflation. We evaluate the background evolution by splitting it into the inflationary attractor and the perturbation around it. We also consider the quantum fluctuation of the scalar mode with the usual scaling and derive the spectral index, ignoring contributions from second-order products of slow-roll parameters. Using these formalisms, we find that within our generic framework, models of KDI that possess a shift symmetry of the scalar field cannot create quantum fluctuations consistent with observations. Breaking the shift symmetry, we obtain a few essential conditions for viable models of KDI associated with a graceful exit.
Mathematical Modeling of the Origins of Life
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The emergence of early metabolism - a network of catalyzed chemical reactions that supported self-maintenance, growth, reproduction and evolution of the ancestors of contemporary cells (protocells) - was a critical, but still very poorly understood, step on the path from inanimate to animate matter. Here, it is proposed and tested through mathematical modeling of biochemically plausible systems that the emergence of metabolism and its initial evolution towards higher complexity preceded the emergence of a genome. Even though the formation of protocellular metabolism was driven by non-genomic, highly stochastic processes, the outcome was largely deterministic, strongly constrained by the laws of chemistry. It is shown that such concepts as speciation and fitness to the environment, developed in the context of genomic evolution, also hold in the absence of a genome.
Extended MHD Modeling of Tearing-Driven Magnetic Relaxation
NASA Astrophysics Data System (ADS)
Sauppe, Joshua
2016-10-01
Driven plasma pinch configurations are characterized by the gradual accumulation and episodic release of free energy in discrete relaxation events. The hallmark of this relaxation in a reversed-field pinch (RFP) plasma is flattening of the parallel current density profile effected by a fluctuation-induced dynamo emf in Ohm's law. Nonlinear two-fluid modeling of macroscopic RFP dynamics has shown appreciable coupling of magnetic relaxation and the evolution of plasma flow. Accurate modeling of RFP dynamics requires the Hall effect in Ohm's law as well as first order ion finite Larmor radius (FLR) effects, represented by the Braginskii ion gyroviscous stress tensor. New results find that the Hall dynamo effect from < J × B > / ne can counter the MHD effect from - < V × B > in some of the relaxation events. The MHD effect dominates these events and relaxes the current profile toward the Taylor state, but the opposition of the two dynamos generates plasma flow in the direction of equilibrium current density, consistent with experimental measurements. Detailed experimental measurements of the MHD and Hall emf terms are compared to these extended MHD predictions. Tracking the evolution of magnetic energy, helicity, and hybrid helicity during relaxation identifies the most important contributions in single-fluid and two-fluid models. Magnetic helicity is well conserved relative to the magnetic energy during relaxation. The hybrid helicity is dominated by magnetic helicity in realistic low-beta pinch conditions and is also well conserved. Differences of less than 1 % between magnetic helicity and hybrid helicity are observed with two-fluid modeling and result from cross helicity evolution through ion FLR effects, which have not been included in contemporary relaxation theories. The kinetic energy driven by relaxation in the computations is dominated by velocity components perpendicular to the magnetic field, an effect that had not been predicted. Work performed at University of Wisconsin-Madison. LA-UR-16-24727.
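For orientation, the MHD and Hall dynamo terms discussed above arise from the generalized (two-fluid) Ohm's law, written here in its standard schematic form; the fluctuation-averaged emf terms quoted in the abstract are the mean-field contributions of the V x B and J x B terms.

```latex
% Generalized Ohm's law (electron inertia neglected); the MHD dynamo
% emf comes from -<\delta V \times \delta B> and the Hall dynamo emf
% from <\delta J \times \delta B>/(ne).
\mathbf{E} + \mathbf{V}\times\mathbf{B}
  = \eta\,\mathbf{J}
  + \frac{1}{ne}\left(\mathbf{J}\times\mathbf{B} - \nabla p_e\right)
```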
Two species drag/diffusion model for energetic particle driven modes
NASA Astrophysics Data System (ADS)
Aslanyan, V.; Sharapov, S. E.; Spong, D. A.; Porkolab, M.
2017-12-01
A nonlinear bump-on-tail model for the growth and saturation of energetic particle driven plasma waves has been extended to include two populations of fast particles—one dominated by dynamical friction at the resonance and the other by velocity space diffusion. The resulting temporal evolution of the wave amplitude and frequency depends on the relative weight of the two populations. The two species model is applied to burning plasma with drag-dominated alpha particles and diffusion-dominated ICRH accelerated minority ions, showing the stabilization of bursting modes. The model also suggests an explanation for the recent observations on the TJ-II stellarator, where Alfvén Eigenmodes transition between steady state and bursting as the magnetic configuration is varied.
The reverse evolution from multicellularity to unicellularity during carcinogenesis.
Chen, Han; Lin, Fangqin; Xing, Ke; He, Xionglei
2015-03-09
Theoretical reasoning suggests that cancer may result from a knockdown of the genetic constraints that evolved for the maintenance of metazoan multicellularity. By characterizing the whole-life history of a xenograft tumour, here we show that metastasis is driven by positive selection for general loss-of-function mutations on multicellularity-related genes. Expression analyses reveal mainly downregulation of multicellularity-related genes and an evolving expression profile towards that of embryonic stem cells, the cell type resembling unicellular life in its capacity of unlimited clonal proliferation. Also, the emergence of metazoan multicellularity ~600 Myr ago is accompanied by an elevated birth rate of cancer genes, and there are more loss-of-function tumour suppressors than activated oncogenes in a typical tumour. These data collectively suggest that cancer represents a loss-of-function-driven reverse evolution back to the unicellular 'ground state'. This cancer evolution model may account for inter-/intratumoural genetic heterogeneity, could explain distant-organ metastases and hold implications for cancer therapy.
Genetic Epidemiology and Public Health: The Evolution From Theory to Technology.
Fallin, M Daniele; Duggal, Priya; Beaty, Terri H
2016-03-01
Genetic epidemiology represents a hybrid of epidemiologic designs and statistical models that explicitly consider both genetic and environmental risk factors for disease. It is a relatively new field in public health; the term was first coined only 35 years ago. In this short time, the field has been through a major evolution, changing from a field driven by theory, without the technology for genetic measurement or computational capacity to apply much of the designs and methods developed, to a field driven by rapidly expanding technology in genomic measurement and computational analyses while epidemiologic theory struggles to keep up. In this commentary, we describe 4 different eras of genetic epidemiology, spanning this evolution from theory to technology, what we have learned, what we have added to the broader field of public health, and what remains to be done. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Generative Models in Deep Learning: Constraints for Galaxy Evolution
NASA Astrophysics Data System (ADS)
Turp, Maximilian Dennis; Schawinski, Kevin; Zhang, Ce; Weigel, Anna K.
2018-01-01
New techniques are essential to make advances in the field of galaxy evolution. Recent developments in artificial intelligence and machine learning have proven that these tools can be applied to problems far more complex than simple image recognition. We use these purely data-driven approaches to investigate the process of star formation quenching. We show that Variational Autoencoders provide a powerful method to forward model the process of galaxy quenching. Our results imply that simple changes in specific star formation rate and bulge-to-disk ratio cannot fully describe the properties of the quenched population.
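As an illustration of the method named above, a minimal Variational Autoencoder of the kind used for such forward modeling is sketched below; the layer sizes and the stand-in "galaxy property" input vector are placeholders, not the architecture or data of the paper.

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    """Minimal VAE: encode a galaxy-property vector to a latent Gaussian,
    then decode; all dimensions are illustrative."""
    def __init__(self, n_features=16, n_latent=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.mu = nn.Linear(64, n_latent)
        self.logvar = nn.Linear(64, n_latent)
        self.dec = nn.Sequential(nn.Linear(n_latent, 64), nn.ReLU(),
                                 nn.Linear(64, n_features))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.dec(z), mu, logvar

def loss_fn(x, x_hat, mu, logvar):
    recon = ((x - x_hat) ** 2).sum()                       # reconstruction error
    kl = -0.5 * (1 + logvar - mu**2 - logvar.exp()).sum()  # KL to N(0, I)
    return recon + kl

model = VAE()
x = torch.randn(32, 16)            # stand-in for 32 galaxy feature vectors
x_hat, mu, logvar = model(x)
print(loss_fn(x, x_hat, mu, logvar).item())
```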
General Dynamics of Topology and Traffic on Weighted Technological Networks
NASA Astrophysics Data System (ADS)
Wang, Wen-Xu; Wang, Bing-Hong; Hu, Bo; Yan, Gang; Ou, Qing
2005-05-01
For most technical networks, the interplay of dynamics, traffic, and topology is assumed crucial to their evolution. In this Letter, we propose a traffic-driven evolution model of weighted technological networks. By introducing a general strength-coupling mechanism under which the traffic and topology mutually interact, the model gives power-law distributions of degree, weight, and strength, as confirmed in many real networks. Particularly, depending on a parameter W that controls the total weight growth of the system, the nontrivial clustering coefficient C, degree assortativity coefficient r, and degree-strength correlation are all consistent with empirical evidence.
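The strength-coupling mechanism can be made concrete with a toy implementation following the general recipe described: a new node attaches to targets chosen with probability proportional to strength, and each new link of weight w0 induces an extra total weight W redistributed over the target's existing edges. All numerical choices below are illustrative, not taken from the Letter.

```python
import random

def grow(n_nodes=500, m=2, W=1.0, w0=1.0, seed=0):
    """Toy strength-coupled weighted network growth (schematic)."""
    random.seed(seed)
    w = {(0, 1): w0}          # edge weights
    s = {0: w0, 1: w0}        # node strengths
    for n in range(2, n_nodes):
        chosen = set()
        while len(chosen) < min(m, len(s)):      # strength-preferential pick
            r, acc = random.uniform(0, sum(s.values())), 0.0
            for i, si in s.items():
                acc += si
                if acc >= r:
                    chosen.add(i)
                    break
        s[n] = 0.0
        for i in chosen:
            s_old = s[i]
            for e in [e for e in w if i in e]:   # redistribute induced traffic
                dw = W * w[e] / s_old
                w[e] += dw
                j = e[0] if e[1] == i else e[1]
                s[j] += dw
            s[i] = s_old + W + w0                # induced total + new link
            w[(i, n)] = w0
            s[n] += w0
    return w, s

w, s = grow()
print(f"{len(w)} edges, max strength {max(s.values()):.1f}")
```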
Optimizing Automatic Deployment Using Non-functional Requirement Annotations
NASA Astrophysics Data System (ADS)
Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin
Model-driven development has become common practice in design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers. This abstraction caters for fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.
Microstructure Modeling of Third Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program was to model, validate, and predict the precipitation microstructure evolution, using the PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool was to be further improved and implemented as a useful and user-friendly engineering tool. In this report, the key accomplishments achieved during the third year (2009) of the program are summarized. The activities of this year included: further development of the multistep precipitation simulation framework for gamma prime microstructure evolution during heat treatment; calibration and validation of gamma prime microstructure modeling with supersolvus heat treated LSHR; modeling of the microstructure evolution of the minor phases, particularly carbides, during isothermal aging, representing long-term microstructure stability during thermal exposure; and the implementation of software tools. During the research and development efforts to extend the precipitation microstructure modeling and prediction capability in this 3-year program, we identified a hurdle, related to the slow gamma prime coarsening rate, for which no satisfactory scientific explanation is currently available. It is desirable to raise this issue with the Ni-based superalloys research community, in the hope that a mechanistic understanding and physics-based treatment will eventually overcome the hurdle. In the meantime, an empirical correction factor was developed in this modeling effort to capture the experimental observations.
Component Models for Semantic Web Languages
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Aßmann, Uwe
Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.
El-Atwani, O.; Norris, S. A.; Ludwig, K.; ...
2015-12-16
Several proposed mechanisms and theoretical models exist concerning nanostructure evolution on III-V semiconductors (particularly GaSb) under ion beam irradiation. However, making quantitative contact between experiment on the one hand and model-parameter-dependent predictions from different theories on the other is usually difficult. In this study, we take a different approach and provide an experimental investigation with a range of targets (GaSb, GaAs, GaP) and ion species (Ne, Ar, Kr, Xe) to determine new parametric trends regarding nanostructure evolution. Concurrently, atomistic simulations using the binary collision approximation over the same ion/target combinations were performed to determine parametric trends in several quantities related to existing models. A comparison of experimental and numerical trends reveals that the two are broadly consistent under the assumption that the instabilities are driven by a chemical instability based on phase separation. Furthermore, the atomistic simulations and a survey of material thermodynamic properties suggest that a plausible microscopic mechanism for this process is an ion-enhanced mobility associated with energy deposition by collision cascades.
SPH Modelling of Sea-ice Pack Dynamics
NASA Astrophysics Data System (ADS)
Staroszczyk, Ryszard
2017-12-01
The paper is concerned with the problem of sea-ice pack motion and deformation under the action of wind and water currents. Differential equations describing the dynamics of ice, with its very distinct material responses in converging and diverging flows, express the mass and linear momentum balances on the horizontal plane (the free surface of the ocean). These equations are solved by the fully Lagrangian method of smoothed particle hydrodynamics (SPH). Assuming that the ice behaviour can be approximated by a non-linearly viscous rheology, the proposed SPH model has been used to simulate the evolution of a sea-ice pack driven by wind drag stresses. The results of numerical simulations illustrate the evolution of an ice pack, including variations in ice thickness and ice area fraction in space and time. The effects of different initial ice pack configurations and of different conditions assumed at the coast-ice interface are examined. In particular, the SPH model is applied to a pack flow driven by a vortex wind to demonstrate how well the Lagrangian formulation can capture large deformations and displacements of sea ice.
Different structures formed at HII boundaries
NASA Astrophysics Data System (ADS)
Miao, Jingqi; Cornwall, Paul; Kinnear, Tim
2015-03-01
Hydrodynamic simulations of the evolution of molecular clouds (MCs) at HII boundaries are used to show that the radiation-driven implosion (RDI) model can create almost all of the different morphological structures, such as a single bright-rimmed cloud (BRC), fragment structures and multiple elephant trunk (ET) structures.
Academic Testing and Grading with Spreadsheet Software.
ERIC Educational Resources Information Center
Ho, James K.
1987-01-01
Explains how spreadsheet software can be used in the design and grading of academic tests and in assigning grades. Macro programs and menu-driven software are highlighted and an example using IBM PCs and Lotus 1-2-3 software is given. (Author/LRW)
The role of biotic forces in driving macroevolution: beyond the Red Queen
Voje, Kjetil L.; Holen, Øistein H.; Liow, Lee Hsiang; Stenseth, Nils Chr.
2015-01-01
A multitude of hypotheses claim that abiotic factors are the main drivers of macroevolutionary change. By contrast, Van Valen's Red Queen hypothesis is often put forward as the sole representative of the view that biotic forcing is the main evolutionary driver. This imbalance of hypotheses does not reflect our current knowledge: theoretical work demonstrates the plausibility of biotically driven long-term evolution, whereas empirical work suggests a central role for biotic forcing in macroevolution. We call for a more pluralistic view of how biotic forces may drive long-term evolution that is compatible both with phenotypic stasis in the fossil record and with non-constant extinction rates. Promising avenues of research include contrasting predictions from relevant theories within ecology and macroevolution, as well as embracing both abiotic and biotic proxies while modelling long-term evolutionary data. By fitting models describing hypotheses of biotically driven macroevolution to data, we could dissect their predictions and go beyond pattern description, possibly narrowing the divide between our current understanding of micro- and macroevolution. PMID:25948685
Climate change-driven cliff and beach evolution at decadal to centennial time scales
Erikson, Li; O'Neill, Andrea; Barnard, Patrick; Vitousek, Sean; Limber, Patrick
2017-01-01
Here we develop a computationally efficient method that evolves cross-shore profiles of sand beaches, with or without cliffs, along natural and urban coastal environments and across expansive geographic areas at decadal to centennial time scales, driven by 21st century climate change projections. The model requires projected sea level rise rates, extrema of nearshore wave conditions, bluff recession and shoreline change rates, and cross-shore profiles representing present-day conditions. The model is applied to the ~470-km long coast of the Southern California Bight, USA, using recently available projected nearshore waves and bluff recession and shoreline change rates. The results indicate that eroded cliff material, from unarmored cliffs, contributes 11% to 26% of the total sediment budget. Historical beach nourishment rates will need to increase by more than 30% for a 0.25 m sea level rise (~2044) and by at least 75% by the year 2100 for a 1 m sea level rise, if evolution of the shoreline is to keep pace with rising sea levels.
BioContainers: an open-source and community-driven framework for software standardization.
da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset
2017-08-15
BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software, and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage, and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments, or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
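In practice such containers are pulled and executed through Docker (or rkt); a minimal Python driver is sketched below, with the image name and tag as illustrative placeholders to be replaced by a real entry from the BioContainers registry.

```python
import subprocess

# Hypothetical image name/tag, for illustration only; look up a real,
# current image in the BioContainers registry before use.
image = "biocontainers/blast:v2.2.31_cv2"

subprocess.run(["docker", "pull", image], check=True)
result = subprocess.run(
    ["docker", "run", "--rm", image, "blastp", "-version"],
    capture_output=True, text=True, check=True)
print(result.stdout)
```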
ENES the European Network for Earth System modelling and its infrastructure projects IS-ENES
NASA Astrophysics Data System (ADS)
Guglielmo, Francesca; Joussaume, Sylvie; Parinet, Marie
2016-04-01
The scientific community working on climate modelling is organized within the European Network for Earth System modelling (ENES). In the past decade, several European university departments, research centres, meteorological services, computer centres, and industrial partners engaged in the creation of ENES, by signing a Memorandum of Understanding, with the purpose of working together and cooperating towards the further development of the network. As of 2015, the consortium counts 47 partners. The climate modelling community, and thus ENES, faces challenges which are both science-driven, i.e., analysing the full complexity of the Earth System to improve our understanding and prediction of climate changes, and which have multi-faceted societal implications, as a better representation of climate change on regional scales leads to improved understanding and prediction of impacts and to the development and provision of climate services. ENES, promoting and endorsing projects and initiatives, helps in the development and evaluation of state-of-the-art climate and Earth system models, facilitates model intercomparison studies, encourages exchanges of software and model results, and fosters the use of high-performance computing facilities dedicated to high-resolution multi-model experiments. ENES brings together public and private partners, integrates countries underrepresented in climate modelling studies, and reaches out to different user communities, thus enhancing European expertise and competitiveness. Given this need for sophisticated models, world-class high-performance computers, and state-of-the-art software solutions to make efficient use of models, data and hardware, a key role is played by the constitution and maintenance of a solid infrastructure, developing and providing services to the different user communities. ENES has investigated these infrastructural needs and has received funding from the EU FP7 programme for the IS-ENES (InfraStructure for ENES) phase I and II projects. We present here the case study of ENES, an existing network of institutions brought together toward common goals by a non-binding agreement, and of its two IS-ENES projects. The latter will be discussed in their double role: as a means to provide and maintain the actual infrastructure (hardware, software, skilled human resources, services) needed to achieve ENES scientific goals, fulfilling the aims set in a strategy document, and as a way to give the network a structured mode of working and of interacting with the extended community. The genesis and evolution of the network and the network/project interaction will also be analysed in terms of long-term sustainability.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
NEAMS SOFTWARE V&V PLAN FOR THE MARMOT SOFTWARE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael R Tonks
2014-03-01
In order to ensure the accuracy and quality of the microstructure based materials models being developed in conjunction with MARMOT simulations, MARMOT must undergo exhaustive verification and validation. Only after this process can we confidently rely on the MARMOT code to predict the microstructure evolution within the fuel. Therefore, in this report we lay out a V&V plan for the MARMOT code, highlighting where existing data could be used and where new data is required.
NASA Astrophysics Data System (ADS)
Kumar, Ashish; Dasgupta, Dwaipayan; Maroudas, Dimitrios
2017-07-01
We report a systematic study of complex pattern formation resulting from the driven dynamics of single-layer homoepitaxial islands on surfaces of face-centered-cubic (fcc) crystalline conducting substrates under the action of an externally applied electric field. The analysis is based on an experimentally validated nonlinear model of mass transport via island edge atomic diffusion, which also accounts for edge diffusional anisotropy. We analyze the morphological stability and simulate the field-driven evolution of rounded islands for an electric field oriented along the fast edge diffusion direction. For larger-than-critical island sizes on {110 } and {100 } fcc substrates, we show that multiple necking instabilities generate complex island patterns, including not-simply-connected void-containing islands mediated by sequences of breakup and coalescence events and distributed symmetrically with respect to the electric field direction. We analyze the dependence of the formed patterns on the original island size and on the duration of application of the external field. Starting from a single large rounded island, we characterize the evolution of the number of daughter islands and their average size and uniformity. The evolution of the average island size follows a universal power-law scaling relation, and the evolution of the total edge length of the islands in the complex pattern follows Kolmogorov-Johnson-Mehl-Avrami kinetics. Our study makes a strong case for the use of electric fields, as precisely controlled macroscopic forcing, toward surface patterning involving complex nanoscale features.
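The Kolmogorov-Johnson-Mehl-Avrami kinetics referenced above have the standard transformed-fraction form, quoted here with generic constants rather than the fitted values of the study:

```latex
% KJMA transformed fraction: k is a rate constant and n the Avrami exponent.
X(t) = 1 - \exp\!\left(-k\,t^{\,n}\right)
```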
Nakano, Shogo; Asano, Yasuhisa
2015-02-03
Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
System on chip module configured for event-driven architecture
Robbins, Kevin; Brady, Charles E.; Ashlock, Tad A.
2017-10-17
A system on chip (SoC) module is described herein; the SoC module comprises a processor subsystem and a hardware logic subsystem. The processor subsystem and hardware logic subsystem are in communication with one another and transmit event messages between one another. The processor subsystem executes software actors, while the hardware logic subsystem includes hardware actors; the software actors and hardware actors conform to an event-driven architecture, such that both the software actors and the hardware actors receive and generate event messages.
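As a software-side analogue of the arrangement described, the sketch below models actors as queue-driven workers exchanging event messages; the channels and actor names are illustrative, not the patent's implementation.

```python
import queue
import threading

def actor(name, inbox, outbox):
    """Event-driven actor: block on the inbox, react to each event
    message, and emit resulting events to the outbox."""
    while True:
        event = inbox.get()
        if event == "STOP":
            break
        outbox.put(f"{name} handled {event}")

hw_to_sw = queue.Queue()   # stands in for the hardware-logic -> processor channel
sw_out = queue.Queue()

t = threading.Thread(target=actor, args=("sw_actor", hw_to_sw, sw_out))
t.start()
hw_to_sw.put("sensor_irq")  # a 'hardware actor' raises an event message
hw_to_sw.put("STOP")
t.join()
print(sw_out.get())         # -> "sw_actor handled sensor_irq"
```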
Allowing for Slow Evolution of Background Plasma in the 3D FDTD Plasma, Sheath, and Antenna Model
NASA Astrophysics Data System (ADS)
Smithe, David; Jenkins, Thomas; King, Jake
2015-11-01
We are working to include a slow-time evolution capability for the previously static background plasma parameters in the 3D finite-difference time-domain (FDTD) plasma and sheath model used to model ICRF antennas in fusion plasmas. A key aspect of this is SOL-density time-evolution driven by ponderomotive rarefaction from the strong fields in the vicinity of the antenna. We demonstrate and benchmark a Scalar Ponderomotive Potential method, based on local field amplitudes, which is included in the 3D simulation, and present a more advanced Tensor Ponderomotive Potential approach, which we hope to employ in the future and which should improve the physical fidelity in the highly anisotropic environment of the SOL. Finally, we demonstrate and benchmark slow-time (non-linear) evolution of the RF sheath, and include realistic collisional effects from the neutral gas. Support from US DOE Grants DE-FC02-08ER54953, DE-FG02-09ER55006.
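For reference, the scalar ponderomotive potential such models build on is conventionally the time-averaged quiver energy of a charge in the oscillating field, with the slow density response following a Boltzmann-like factor; these are the standard textbook forms, and the tensor generalization mentioned above is not reproduced here.

```latex
% Scalar ponderomotive potential of a field E oscillating at frequency
% \omega, and the quasi-static density rarefaction it drives:
\Phi_p = \frac{e^{2}\,|E|^{2}}{4\,m\,\omega^{2}},
\qquad
n \simeq n_0 \exp\!\left(-\Phi_p / T\right)
```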
Chakrabortty, S; Sen, M; Pal, P
2014-03-01
A simulation software package (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, against the backdrop of the absence of such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs of the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting a high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R^2 = 0.989) and Willmott d-index (0.989) are indicators of the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and displays the performance of the integrated plant visually on a graphical platform. Performance analysis of the whole system as well as of the individual units is possible using the tool. The software, the first of its kind in its domain and working in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for the removal of arsenic from contaminated groundwater.
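The Willmott d-index quoted above is a standard index of agreement between observed and predicted series; a direct implementation of the generic formula (not the ARRPA code itself):

```python
import numpy as np

def willmott_d(obs, pred):
    """Willmott index of agreement: 1 = perfect match, 0 = no skill."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    o_bar = obs.mean()
    num = ((pred - obs) ** 2).sum()
    den = ((np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2).sum()
    return 1.0 - num / den

print(willmott_d([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # close to 1
```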
NASA Astrophysics Data System (ADS)
Tonini, C.; Mutch, S. J.; Wyithe, J. S. B.; Croton, D. J.
2017-03-01
We investigate the properties of the stellar populations of model galaxies as a function of galaxy evolutionary history and angular momentum content. We use the new semi-analytic model presented in Tonini et al. This new model follows the angular momentum evolution of gas and stars, providing the base for a new star formation recipe, and treatment of the effects of mergers that depends on the central galaxy dynamical structure. We find that the new recipes have the effect of boosting the efficiency of the baryonic cycle in producing and recycling metals, as well as preventing minor mergers from diluting the metallicity of bulges and ellipticals. The model reproduces the stellar mass-stellar metallicity relation for galaxies above 1010 solar masses, including Brightest Cluster Galaxies. Model discs, galaxies dominated by instability-driven components, and merger-driven objects each stem from different evolutionary channels. These model galaxies therefore occupy different loci in the galaxy mass-size relation, which we find to be in accord with the ATLAS 3D classification of disc galaxies, fast rotators and slow rotators. We find that the stellar populations' properties depend on the galaxy evolutionary type, with more evolved stellar populations being part of systems that have lost or dissipated more angular momentum during their assembly history.
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
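Programmatic interaction of the kind these integrations depend on goes through HydroShare's REST API; the sketch below is a bare-bones query in which the endpoint path and response fields are assumptions to be checked against the current API documentation.

```python
import requests

# Assumed endpoint layout and response fields, for illustration only;
# verify against the current HydroShare REST API documentation.
BASE = "https://www.hydroshare.org/hsapi"

resp = requests.get(f"{BASE}/resource/", timeout=30)
resp.raise_for_status()
for res in resp.json().get("results", []):
    print(res.get("resource_id"), res.get("resource_title"))
```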
Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L
2018-01-01
Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
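The "executable requirements" pattern (expected CDS behavior expressed as tables that clinicians can review and a test runner can execute) can be mimicked outside FitNesse. Below is a pytest sketch in which the rule function and every table row are hypothetical stand-ins, not the actual advisory logic.

```python
import pytest

def swallow_screen_required(setting, route, stroke_suspected):
    """Hypothetical stand-in for the CDS rule under test: advise a
    swallowing assessment before oral meds in suspected stroke in the ED."""
    return setting == "ED" and route == "oral" and stroke_suspected

# Each row is one acceptance case, mirroring a clinician-reviewed table:
# (setting, route, stroke_suspected, expected_alert)
CASES = [
    ("ED",     "oral", True,  True),
    ("ED",     "IV",   True,  False),
    ("ED",     "oral", False, False),
    ("Clinic", "oral", True,  False),
]

@pytest.mark.parametrize("setting,route,stroke,expected", CASES)
def test_swallow_screen_rule(setting, route, stroke, expected):
    assert swallow_screen_required(setting, route, stroke) == expected
```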
A Vision on the Status and Evolution of HEP Physics Software Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canal, P.; Elvira, D.; Hatcher, R.
2013-07-28
This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
Direct Numerical Simulation of Fingering Instabilities in Coating Flows
NASA Astrophysics Data System (ADS)
Eres, Murat H.; Schwartz, Leonard W.
1998-11-01
We consider stability and finger formation in free surface flows. Gravity-driven downhill drainage and temperature-gradient-driven climbing flows are two examples of such problems. The former situation occurs when a mound of viscous liquid on a vertical wall is allowed to flow. Constant surface shear stress due to temperature gradients (Marangoni stress) can initiate the latter problem. The evolution equations are derived using the lubrication approximation. We also include the effects of finite contact angles in the evolution equations using a disjoining pressure model. The evolution equations for both problems are solved using an efficient alternating-direction-implicit method. For both problems a one-dimensional base state is established that is steady in a moving reference frame. This base state is unstable to transverse perturbations. The transverse wavenumbers for the most rapidly growing modes are found through direct numerical solution of the nonlinear evolution equations and are compared with published experimental results. For a range of finite equilibrium contact angles, the fingers can grow without limit, leading to semi-infinite steady fingers in a moving coordinate system. A computer-generated movie of the nonlinear simulation results, for several sets of input parameters, will be shown.
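For context, the lubrication-approximation evolution equation for a gravity-driven film of thickness h(x, y, t) takes the familiar schematic form below (gravity plus capillarity only; the Marangoni-stress and disjoining-pressure terms used in the paper are omitted):

```latex
% Thin-film (lubrication) evolution: gravity-driven flux down the wall
% (x direction) plus capillary smoothing; \mu viscosity, \sigma surface tension.
\frac{\partial h}{\partial t}
 + \frac{\partial}{\partial x}\!\left(\frac{\rho g\,h^{3}}{3\mu}\right)
 + \nabla\!\cdot\!\left(\frac{\sigma\,h^{3}}{3\mu}\,\nabla\nabla^{2} h\right) = 0
```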
1990-05-01
Sanders Associates, Inc. A demonstration of knowledge-based support for the evolutionary development of software system requirements using … AN OVERVIEW OF RADC'S KNOWLEDGE BASED SOFTWARE ASSISTANT PROGRAM. Donald M. Elefante, Rome Air Development Center. The Knowledge-Based Software Assistant is a formally based, computer-mediated paradigm for the specification, development, evolution, and long-term…
Multiphysics of bone remodeling: A 2D mesoscale activation simulation.
Spingarn, C; Wagner, D; Rémond, Y; George, D
2017-01-01
In this work, we present an evolutive trabecular model for bone remodeling based on a boundary detection algorithm, accounting for both biology and applied mechanical forces, the latter known to be an important factor in bone evolution. A finite element (FE) numerical model using the Abaqus/Standard® software was used with a UMAT subroutine to solve the governing coupled mechanical-biological non-linear differential equations of the bone evolution model. The simulations present cell activation on a simplified trabecular configuration with a trabecular thickness of 200 µm. For this activation process, the results confirm that the trabeculae are mainly oriented along the active directions of the principal mechanical stresses and according to the principal applied mechanical load directions. The trabecular surface activation is clearly identified and can provide understanding of the different bone cell activations in more complex geometries and load conditions.
2015-01-01
Background Multiscale approaches for integrating submodels of various levels of biological organization into a single model have become a major tool of systems biology. In this paper, we have constructed and simulated a set of multiscale models of spatially distributed microbial communities and studied the influence of unevenly distributed environmental factors on the genetic diversity and evolution of the community members. Results The Haploid Evolutionary Constructor software http://evol-constructor.bionet.nsc.ru/ was expanded by adding a tool for the spatial modeling of a microbial community (1D, 2D and 3D versions). A set of models of spatially distributed communities was built to demonstrate that the spatial distribution of cells affects both the intensity of selection and the rate of evolution. Conclusion In spatially heterogeneous communities, a change in the direction of the environmental flow might be reflected in local irregular population dynamics, while the genetic structure of populations (the frequencies of the alleles) remains stable. Furthermore, in spatially heterogeneous communities, chemotaxis might dramatically affect the evolution of community members. PMID:25708911
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
NASA Technical Reports Server (NTRS)
King, Ellis; Hart, Jeremy; Odegard, Ryan
2010-01-01
The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented, with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation, and recovery interactions with the automation software is presented and discussed as a forward work item.
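The data-driven sequencing idea can be sketched as a mission-event table consumed by a generic engine; every segment, mode, and event name below is invented for illustration and is not Orion flight data.

```python
# Illustrative mission-event table: each entry names a segment, the
# GN&C mode to command, and the event that closes the segment.
MISSION_PLAN = [
    {"segment": "ascent",    "gnc_mode": "ASCENT_GUIDANCE", "exit": "MECO"},
    {"segment": "orbit_ops", "gnc_mode": "ON_ORBIT_HOLD",   "exit": "DEORBIT_BURN"},
    {"segment": "entry",     "gnc_mode": "ENTRY_GUIDANCE",  "exit": "CHUTE_DEPLOY"},
]

def run_sequence(plan, events):
    """Step through mission segments, commanding the next GN&C mode as
    each segment's exit event arrives."""
    idx = 0
    for ev in events:
        if idx >= len(plan):
            break
        print(f"mode={plan[idx]['gnc_mode']} event={ev}")
        if ev == plan[idx]["exit"]:
            idx += 1
    return idx

run_sequence(MISSION_PLAN, ["LIFTOFF", "MECO", "DEORBIT_BURN", "CHUTE_DEPLOY"])
```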
Ten recommendations for software engineering in research.
Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph
2014-01-01
Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.
NASA Technical Reports Server (NTRS)
Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen;
2012-01-01
For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.
Archaeological data reveal slow rates of evolution during plant domestication.
Purugganan, Michael D; Fuller, Dorian Q
2011-01-01
Domestication is an evolutionary process of species divergence in which morphological and physiological changes result from the cultivation/tending of plant or animal species by a mutualistic partner, most prominently humans. Darwin used domestication as an analogy to evolution by natural selection although there is strong debate on whether this process of species evolution by human association is an appropriate model for evolutionary study. There is a presumption that selection under domestication is strong and most models assume rapid evolution of cultivated species. Using archaeological data for 11 species from 60 archaeological sites, we measure rates of evolution in two plant domestication traits--nonshattering and grain/seed size increase. Contrary to previous assumptions, we find the rates of phenotypic evolution during domestication are slow, and significantly lower or comparable to those observed among wild species subjected to natural selection. Our study indicates that the magnitudes of the rates of evolution during the domestication process, including the strength of selection, may be similar to those measured for wild species. This suggests that domestication may be driven by unconscious selection pressures similar to that observed for natural selection, and the study of the domestication process may indeed prove to be a valid model for the study of evolutionary change. © 2010 The Author(s). Evolution© 2010 The Society for the Study of Evolution.
Services Oriented Smart City Platform Based On 3d City Model Visualization
NASA Astrophysics Data System (ADS)
Prandi, F.; Soave, M.; Devigili, F.; Andreolli, M.; De Amicis, R.
2014-04-01
The rapid technological evolution, which is characterizing all the disciplines involved within the wide concept of smart cities, is becoming a key factor to trigger true user-driven innovation. However, to fully develop the Smart City concept to a wide geographical target, an infrastructure is required that allows the integration of heterogeneous geographical information and sensor networks into a common technological ground. In this context, 3D city models will play an increasingly important role in our daily lives and become an essential part of the modern city information infrastructure (Spatial Data Infrastructure). The work presented in this paper describes an innovative Services Oriented Architecture software platform aimed at providing smart-city services on top of 3D urban models. 3D city models are the basis of many applications and can become the platform for integrating city information within the Smart-Cities context. In particular, the paper investigates how the efficient visualisation of 3D city models using different levels of detail (LODs) is one of the pivotal technological challenges in supporting Smart-Cities applications. The goal is to provide the end user with realistic and abstract 3D representations of the urban environment and the possibility to interact with the massive amounts of semantic information contained in the geospatial 3D city model. The proposed solution, using OGC standards and a custom service to provide 3D city models, lets users consume the services and interact with the 3D model via the Web in a more effective way.
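As a minimal illustration of LOD-based serving (the distance thresholds and LOD names below are invented; the paper's actual selection logic is not specified here), a service might choose which representation to stream by camera distance:

```python
# Sketch of distance-based level-of-detail selection for a 3D city model.
LOD_THRESHOLDS = [            # (max viewing distance in metres, LOD to serve)
    (500.0, "LOD3"),          # near: detailed architectural models
    (2000.0, "LOD2"),         # mid: roof shapes, simplified facades
    (float("inf"), "LOD1"),   # far: block models only
]

def select_lod(camera_distance_m):
    for max_dist, lod in LOD_THRESHOLDS:
        if camera_distance_m <= max_dist:
            return lod

assert select_lod(120.0) == "LOD3"
assert select_lod(1500.0) == "LOD2"
assert select_lod(10_000.0) == "LOD1"
```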
Data-Driven Software Framework for Web-Based ISS Telescience
NASA Technical Reports Server (NTRS)
Tso, Kam S.
2005-01-01
Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
Development strategies for the satellite flight software on-board Meteosat Third Generation
NASA Astrophysics Data System (ADS)
Tipaldi, Massimo; Legendre, Cedric; Koopmann, Olliver; Ferraguto, Massimo; Wenker, Ralf; D'Angelo, Gianni
2018-04-01
Nowadays, satellites are becoming increasingly software dependent. Satellite Flight Software (FSW), that is to say, the application software running on the satellite's main On-Board Computer (OBC), plays a relevant role in implementing complex space mission requirements. In this paper, we examine relevant technical approaches and programmatic strategies adopted for the development of the Meteosat Third Generation Satellite (MTG) FSW. To begin with, we present its layered model-based architecture, and the means for ensuring a robust and reliable interaction among the FSW components. Then, we focus on the selection of an effective software development life cycle model. In particular, by combining plan-driven and agile approaches, we can fulfill the need for preliminary SW versions, which can be used for the elicitation of complex system-level requirements as well as for the initial satellite integration and testing activities. Testing is another important aspect: very demanding quality requirements must be fulfilled in satellite SW applications. This manuscript proposes a test automation framework, which uses an XML-based test procedure language independent of the underlying test environment. Finally, a short overview of the MTG FSW sizing and timing budgets concludes the paper.
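To illustrate the idea of a test procedure language that is independent of the underlying test environment, here is a hypothetical sketch: an XML procedure interpreted against a pluggable backend. The tag names and backend interface are invented, not the MTG framework's actual schema.

```python
# Sketch: environment-independent XML test procedures with pluggable backends.
import xml.etree.ElementTree as ET

PROCEDURE = """
<procedure name="tm_check">
  <send telecommand="SET_MODE" argument="SAFE"/>
  <expect parameter="MODE_STATUS" value="SAFE" timeout_s="5"/>
</procedure>
"""

class PrintingEnvironment:
    """Stand-in backend; a real one would talk to an OBC simulator."""
    def send(self, telecommand, argument):
        print(f"TC {telecommand}({argument})")
    def expect(self, parameter, value, timeout_s):
        print(f"wait <= {timeout_s}s for {parameter} == {value}")

def run(procedure_xml, env):
    root = ET.fromstring(procedure_xml)
    for step in root:
        if step.tag == "send":
            env.send(step.get("telecommand"), step.get("argument"))
        elif step.tag == "expect":
            env.expect(step.get("parameter"), step.get("value"),
                       float(step.get("timeout_s")))

run(PROCEDURE, PrintingEnvironment())
```

Swapping the backend (simulator, hardware-in-the-loop, dry run) leaves the procedures themselves untouched, which is the point of the decoupling.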
Is High Primordial Deuterium Consistent with Galactic Evolution?
NASA Astrophysics Data System (ADS)
Tosi, Monica; Steigman, Gary; Matteucci, Francesca; Chiappini, Cristina
1998-05-01
Galactic destruction of primordial deuterium is inevitably linked through star formation to the chemical evolution of the Galaxy. The relatively high present gas content and low metallicity suggest only modest D destruction. In concert with deuterium abundances derived from solar system and/or interstellar observations, this suggests a primordial deuterium abundance in possible conflict with data from some high-redshift, low-metallicity QSO absorbers. We have explored a variety of chemical evolution models including infall of processed material and early, supernovae-driven winds with the aim of identifying models with large D destruction that are consistent with the observations of stellar-produced heavy elements. When such models are confronted with data, we reconfirm that only modest destruction of deuterium (less than a factor of 3) is permitted. When combined with solar system and interstellar data, these results favor the low deuterium abundances derived for the QSO absorbers by Tytler et al.
Coherent population transfer in multi-level Allen-Eberly models
NASA Astrophysics Data System (ADS)
Li, Wei; Cen, Li-Xiang
2018-04-01
We investigate the solvability of multi-level extensions of the Allen-Eberly model and the population transfer yielded by the corresponding dynamical evolution. We demonstrate that, under a matching condition of the frequency, the driven two-level system and its multi-level extensions possess a stationary-state solution in a canonical representation associated with a unitary transformation. As a consequence, we show that the resulting protocol is able to realize complete population transfer in a nonadiabatic manner. Moreover, we explore the imperfect pulsing process with truncation and show that the nonadiabatic effect in the evolution can suppress the cutoff error of the protocol.
NASA Technical Reports Server (NTRS)
Plumlee, G. S.; Ridley, W. I.; Debraal, J. D.; Reed, M. H.
1993-01-01
Chemical reaction path calculations were used to model the minerals that might have formed at or near the Martian surface as a result of volcano or meteorite impact driven hydrothermal systems; weathering at the Martian surface during an early warm, wet climate; and near-zero or sub-zero C brine-regolith reactions in the current cold climate. Although the chemical reaction path calculations carried out do not define the exact mineralogical evolution of the Martian surface over time, they do place valuable geochemical constraints on the types of minerals that formed from an aqueous phase under various surficial and geochemically complex conditions.
General Purpose Data-Driven Monitoring for Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.
2009-01-01
As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault and anomaly detection algorithms and diagnosis tools with executive and adaptive planning functions contained in the flight software on-board the Air Force Research Laboratory TacSat-3 satellite. The TVSM software package will be uploaded after launch to monitor spacecraft subsystems such as power and guidance, navigation, and control (GN&C). It will analyze data in real-time to demonstrate detection of faults and unusual conditions, diagnose problems, and react to threats to spacecraft health and mission goals. The experiment will demonstrate the feasibility and effectiveness of integrated system health management (ISHM) technologies with both ground and on-board experiments.
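To make the clustering idea concrete, here is a toy sketch of the approach the abstract describes: characterize nominal operations data by cluster centroids, then flag live data that falls far from every centroid. The two-regime data, the k-means details, and the threshold are illustrative assumptions, not IMS's actual algorithm or parameters.

```python
# Toy data-driven monitor: learn nominal clusters, flag distant points.
import math
import random

random.seed(0)

def dist(a, b):
    return math.dist(a, b)                     # Euclidean (Python 3.8+)

def mean(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def kmeans(points, k, iters=25, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: dist(p, centroids[j]))].append(p)
        centroids = [mean(g) if g else c for g, c in zip(groups, centroids)]
    return centroids

def is_anomalous(point, centroids, threshold):
    return min(dist(point, c) for c in centroids) > threshold

# Nominal training data: two operating regimes of a 2-parameter system.
nominal = [(1 + random.gauss(0, .05), 2 + random.gauss(0, .05)) for _ in range(100)] \
        + [(4 + random.gauss(0, .05), 1 + random.gauss(0, .05)) for _ in range(100)]
centroids = kmeans(nominal, k=2)
print(is_anomalous((1.0, 2.1), centroids, threshold=0.5))  # False: near nominal
print(is_anomalous((3.0, 3.0), centroids, threshold=0.5))  # True: off-nominal
```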
Integrating MPI and deduplication engines: a software architecture roadmap.
Baksi, Dibyendu
2009-03-01
The objective of this paper is to clarify the major concepts related to the architecture and design of patient identity management software systems so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from the Integrating the Healthcare Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of a platform independent model. This makes the roadmap applicable to any specific solution of MPI, deduplication library or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor-specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.
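To illustrate what a platform-independent model can look like in code, here is a hypothetical sketch of abstract MPI and deduplication-engine interfaces with a PIX-flavoured registration flow written only against those interfaces; all names and the matching logic are invented, not part of the IHE PIX profile or any vendor API.

```python
# Platform-independent model in miniature: logic depends only on abstractions;
# vendor-specific adapters would be the platform-specific models.
from abc import ABC, abstractmethod

class DeduplicationEngine(ABC):
    @abstractmethod
    def candidates(self, demographics: dict) -> list:
        """Return enterprise IDs of records that probabilistically match."""

class MasterPatientIndex(ABC):
    @abstractmethod
    def link(self, local_id: str, enterprise_id: str) -> None: ...
    @abstractmethod
    def lookup(self, local_id: str): ...

def register_patient(local_id, demographics, mpi, dedup):
    """PIX-style flow: dedup proposes matches, the MPI records the linkage."""
    matches = dedup.candidates(demographics)
    enterprise_id = matches[0] if matches else f"ent-{local_id}"
    mpi.link(local_id, enterprise_id)
    return enterprise_id
```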
Brown, Jason L; Bennett, Joseph R; French, Connor M
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is facilitating careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data and environmental data, and careful model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
2010-02-01
…through software-as-a-service (SaaS) (Nitu 2009, Sedayao 2008). In practice, an organization's initial SOA implementation almost never attempts to cover… [Nitu, "Configurability in SaaS (Software as a Service) Applications," Proceedings of the 2nd Annual Conference on India Software Engineering] …and evolution of service-oriented systems. In 2007, the Software Engineering Institute started assembling a SOA Research Agenda based on a…
Convection- and SASI-driven flows in parametrized models of core-collapse supernova explosions
Endeve, E.; Cardall, C. Y.; Budiardja, R. D.; ...
2016-01-21
We present initial results from three-dimensional simulations of parametrized core-collapse supernova (CCSN) explosions obtained with our astrophysical simulation code General Astrophysical Simulation System (GenASIS). We are interested in nonlinear flows resulting from neutrino-driven convection and the standing accretion shock instability (SASI) in the CCSN environment prior to and during the explosion. By varying parameters in our model that control neutrino heating and shock dissociation, our simulations result in convection-dominated and SASI-dominated evolution. We describe this initial set of simulation results in some detail. To characterize the turbulent flows in the simulations, we compute and compare velocity power spectra from convection-dominated and SASI-dominated (both non-exploding and exploding) models. When compared to SASI-dominated models, convection-dominated models exhibit significantly more power on small spatial scales.
Supernova-driven outflows and chemical evolution of dwarf spheroidal galaxies
Qian, Yong-Zhong; Wasserburg, G. J.
2012-01-01
We present a general phenomenological model for the metallicity distribution (MD) in terms of [Fe/H] for dwarf spheroidal galaxies (dSphs). These galaxies appear to have stopped accreting gas from the intergalactic medium and are fossilized systems with their stars undergoing slow internal evolution. For a wide variety of infall histories of unprocessed baryonic matter to feed star formation, most of the observed MDs can be well described by our model. The key requirement is that the fraction of the gas mass lost by supernova-driven outflows is close to unity. This model also predicts a relationship between the total stellar mass and the mean metallicity for dSphs in accord with properties of their dark matter halos. The model further predicts as a natural consequence that the abundance ratios [E/Fe] for elements such as O, Mg, and Si decrease for stellar populations at the higher end of the [Fe/H] range in a dSph. We show that, for infall rates far below the net rate of gas loss to star formation and outflows, the MD in our model is very sharply peaked at one [Fe/H] value, similar to what is observed in most globular clusters. This result suggests that globular clusters may be end members of the same family as dSphs. PMID:22411827
Behavior driven testing in ALMA telescope calibration software
NASA Astrophysics Data System (ADS)
Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang
2016-07-01
The ALMA software development cycle includes well-defined testing stages that involve developers, testers and scientists. We adapted Behavior Driven Development (BDD) to the testing activities applied to the Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases in natural language to specify features and scenarios, which allows participants to share a common language and provides a high-level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it and proposals to expand this technique to other subsystems.
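A minimal, framework-free sketch of the BDD idea follows: the natural-language scenario is kept alongside the code that automates it. Tools such as behave or pytest-bdd map each step phrase to a function; the sketch below hand-rolls that mapping, and the CalibrationSession class and scenario are invented stand-ins, not TELCAL's actual behaviour.

```python
# Hand-rolled BDD-style test (stand-in domain class, not real TELCAL APIs).

class CalibrationSession:
    def __init__(self):
        self.results = []
    def observe(self, source):
        # stand-in for a calibration observation producing a phase solution
        self.results.append({"source": source, "phase_rms": 0.2})

# Feature: phase calibration
#   Scenario: observing a calibrator yields a phase solution
#     Given a new calibration session
#     When the array observes the calibrator "J0006-0623"
#     Then a phase solution is produced

def test_phase_calibration():
    session = CalibrationSession()                 # Given
    session.observe("J0006-0623")                  # When
    assert any(r["source"] == "J0006-0623"         # Then
               for r in session.results)

test_phase_calibration()
```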
Study on Capturing Functional Requirements of the New Product Based on Evolution
NASA Astrophysics Data System (ADS)
Liu, Fang; Song, Liya; Bai, Zhonghang; Zhang, Peng
In order to survive in an increasingly competitive global marketplace, it is important for corporations to forecast the evolutionary direction of new products rapidly and effectively. Most products in the world are developed based on the design of existing products. In product design, capturing functional requirements is a key step. Function is continuously evolving, driven by the evolution of needs and technologies, so the functional requirements of a new product can be forecasted based on the functions of an existing product. Eight laws of function evolution are put forward in this paper. A process model for capturing the functional requirements of a new product based on function evolution is proposed, and an example illustrates the design process.
NASA Astrophysics Data System (ADS)
Zemek, Peter G.; Plowman, Steven V.
2010-04-01
Advances in hardware have miniaturized the emissions spectrometer and associated optics, rendering them easily deployed in the field. Such systems are also suitable for vehicle mounting, and can provide high quality data and concentration information in minutes. Advances in software have accompanied this hardware evolution, enabling the development of portable point-and-click OP-FTIR systems that weigh less than 16 lbs. These systems are ideal for first responders, military, law enforcement, forensics, and screening applications using optical remote sensing (ORS) methodologies. With canned methods and interchangeable detectors, the new generation of OP-FTIR technology is coupled to the latest forward reference-type model software to provide point-and-click technology. These software models have been established for some time. However, refined user-friendly models that use active, passive, and solar occultation methodologies now allow the user to quickly field-screen and quantify plumes, fence-lines, and combustion incident scenarios at high temporal resolution. Synthetic background generation is now redundant as the models use highly accurate instrument line shape (ILS) convolutions and several other parameters, in conjunction with radiative transfer model databases, to model a single calibration spectrum against collected sample spectra. Data retrievals are performed directly on single beam spectra using non-linear classical least squares (NLCLS). Typically, the HITRAN line database is used to generate the initial calibration spectrum contained within the software.
Space station dynamics, attitude control and momentum management
NASA Technical Reports Server (NTRS)
Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi
1989-01-01
The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN&C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility, which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.
Peterson, Daniel A; Hardy, Nate B; Morse, Geoffrey E; Stocks, Ian C; Okusu, Akiko; Normark, Benjamin B
2015-10-01
A jack of all trades can be master of none: this intuitive idea underlies most theoretical models of host-use evolution in plant-feeding insects, yet empirical support for trade-offs in performance on distinct host plants is weak. Trade-offs may influence the long-term evolution of host use while being difficult to detect in extant populations, but host-use evolution may also be driven by adaptations for generalism. Here we used host-use data from insect collection records to parameterize a phylogenetic model of host-use evolution in armored scale insects, a large family of plant-feeding insects with a simple, pathogen-like life history. We found that a model incorporating positive correlations between evolutionary changes in host performance best fit the observed patterns of diaspidid presence and absence on nearly all focal host taxa, suggesting that adaptations to particular hosts also enhance performance on other hosts. In contrast to the widely invoked trade-off model, we advocate a "toolbox" model of host-use evolution in which armored scale insects accumulate a set of independent genetic tools, each of which is under selection for a single function but may be useful on multiple hosts. © 2015 The Author(s).
2014-09-30
…continuation of the evolution of the Regional Oceanic Modeling System (ROMS) as a multi-scale, multi-process model and its utilization for… hydrostatic component of ROMS (Kanarska et al., 2007) is required to increase its efficiency and generality. The non-hydrostatic ROMS involves the solution… instability and wind-driven mixing. For the computational regime where those processes can be partially, but not yet fully, resolved, it will…
Model-driven approach to data collection and reporting for quality improvement
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J.; Majeed, Azeem; Bell, Derek
2014-01-01
Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems or not routinely collected. Furthermore, improvement teams are often restricted in time and funding, thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC), for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed and the benefits and limitations of the approach are discussed. PMID:24874182
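As a rough illustration of what "model-driven" means here, the sketch below derives a data-collection schema from a toy improvement model and computes SPC limits for an individuals (XmR) chart; the model format and the COPD-flavoured field names are invented for illustration, not the actual IDM or WISH schema.

```python
# Sketch: a declarative improvement model drives both schema and SPC report.
import statistics

IDM = {   # hypothetical improvement model for a COPD-style project
    "fields": ["week", "patients_seen", "care_bundle_used"],
    "measure": {"name": "bundle_rate",
                "numerator": "care_bundle_used",
                "denominator": "patients_seen"},
}

def make_schema(model):
    """Derive a flat data-collection schema from the model."""
    return {field: "number" for field in model["fields"]}

def spc_limits(values):
    """Individuals-chart limits: mean +/- 2.66 * mean moving range."""
    moving_ranges = [abs(a - b) for a, b in zip(values[1:], values)]
    centre = statistics.mean(values)
    width = 2.66 * statistics.mean(moving_ranges)
    return centre - width, centre, centre + width

weekly_rates = [0.52, 0.58, 0.61, 0.57, 0.66, 0.71, 0.69]
print(make_schema(IDM))
print(spc_limits(weekly_rates))   # (lower limit, centre line, upper limit)
```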
ERIC Educational Resources Information Center
Data Research Associates, Inc., St. Louis, MO.
The topic of open systems as it relates to the needs of libraries to establish interoperability between dissimilar computer systems can be clarified by an understanding of the background and evolution of the issue. The International Standards Organization developed a model to link dissimilar computers, and this model has evolved into consensus…
Empirical evidence of climate's role in Rocky Mountain landscape evolution
NASA Astrophysics Data System (ADS)
Riihimaki, Catherine A.; Reiners, Peter W.
2012-06-01
Climate may be the dominant factor affecting landscape evolution during the late Cenozoic, but models that connect climate and landscape evolution cannot be tested without precise ages of landforms. Zircon (U-Th)/He ages of clinker, metamorphosed rock formed by burning of underlying coal seams, provide constraints on the spatial and temporal patterns of Quaternary erosion in the Powder River basin of Wyoming and Montana. The age distribution of 86 sites shows two temporal patterns: (1) a bias toward younger ages because of erosion of older clinker and (2) periodic occurrence of coal fires likely corresponding with particular climatic regimes. Statistical t tests of the ages and spectral analyses of the age probability density function indicate that these episodes of frequent coal fires most likely correspond with times of high eccentricity in Earth's orbit, possibly driven by increased seasonality in the region causing increased erosion rates and coal exhumation. Correlation of ages with interglacial time periods is weaker. The correlations between climate and coal fires improve when only samples greater than 50 km from the front of the Bighorn Range, the site of the nearest alpine glaciation, are compared. Together, these results indicate that the interaction between upstream glaciation and downstream erosion is likely not the dominant control on Quaternary landscape evolution in the Powder River basin, particularly since 0.5 Ma. Instead, incision rates are likely controlled by the response of streams to climate shifts within the basin itself, possibly changes in local precipitation rates or frequency-magnitude distributions, with no discernable lag time between climate changes and landscape responses. Clinker ages are consistent with numerical models in which stream erosion is driven by fluctuations in stream power on thousand year timescales within the basins, possibly as a result of changing precipitation patterns, and is driven by regional rock uplift on million year timescales.
NASA Astrophysics Data System (ADS)
Xiong, Ming; Zheng, Huinan; Wu, S. T.; Wang, Yuming; Wang, Shui
2007-11-01
Numerical studies of the interplanetary "multiple magnetic clouds (Multi-MC)" are performed by a 2.5-dimensional ideal magnetohydrodynamic (MHD) model in the heliospheric meridional plane. Both slow MC1 and fast MC2 are initially emerged along the heliospheric equator, one after another with different time intervals. The coupling of two MCs could be considered as the comprehensive interaction between two systems, each comprising of an MC body and its driven shock. The MC2-driven shock and MC2 body are successively involved into interaction with MC1 body. The momentum is transferred from MC2 to MC1. After the passage of MC2-driven shock front, magnetic field lines in MC1 medium previously compressed by MC2-driven shock are prevented from being restored by the MC2 body pushing. MC1 body undergoes the most violent compression from the ambient solar wind ahead, continuous penetration of MC2-driven shock through MC1 body, and persistent pushing of MC2 body at MC1 tail boundary. As the evolution proceeds, the MC1 body suffers from larger and larger compression, and its original vulnerable magnetic elasticity becomes stiffer and stiffer. So there exists a maximum compressibility of Multi-MC when the accumulated elasticity can balance the external compression. This cutoff limit of compressibility mainly decides the maximally available geoeffectiveness of Multi-MC because the geoeffectiveness enhancement of MCs interacting is ascribed to the compression. Particularly, the greatest geoeffectiveness is excited among all combinations of each MC helicity, if magnetic field lines in the interacting region of Multi-MC are all southward. Multi-MC completes its final evolutionary stage when the MC2-driven shock is merged with MC1-driven shock into a stronger compound shock. With respect to Multi-MC geoeffectiveness, the evolution stage is a dominant factor, whereas the collision intensity is a subordinate one. The magnetic elasticity, magnetic helicity of each MC, and compression between each other are the key physical factors for the formation, propagation, evolution, and resulting geoeffectiveness of interplanetary Multi-MC.
Assisted stellar suicide: the wind-driven evolution of the recurrent nova T Pyxidis
NASA Astrophysics Data System (ADS)
Knigge, Ch.; King, A. R.; Patterson, J.
2000-12-01
We show that the extremely high luminosity of the short-period recurrent nova T Pyx in quiescence can be understood if this system is a wind-driven supersoft x-ray source (SSS). In this scenario, a strong, radiation-induced wind is excited from the secondary star and accelerates the binary evolution. The accretion rate is therefore much higher than in an ordinary cataclysmic binary at the same orbital period, as is the luminosity of the white dwarf primary. In the steady state, the enhanced luminosity is just sufficient to maintain the wind from the secondary. The accretion rate and luminosity predicted by the wind-driven model for T Pyx are in good agreement with the observational evidence. X-ray observations with Chandra or XMM may be able to confirm T Pyx's status as a SSS. T Pyx's lifetime in the wind-driven state is on the order of a million years. Its ultimate fate is not certain, but the system may very well end up destroying itself, either via the complete evaporation of the secondary star, or in a Type Ia supernova if the white dwarf reaches the Chandrasekhar limit. Thus either the primary, the secondary, or both may currently be committing assisted stellar suicide.
RCHILD - an R-package for flexible use of the landscape evolution model CHILD
NASA Astrophysics Data System (ADS)
Dietze, Michael
2014-05-01
Landscape evolution models provide powerful approaches to numerically assess earth surface processes, to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most-used models of landscape change in the context of tectonic and geomorphologic process interactions. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help users run CHILD in a flexible, dynamic and user-friendly way. The included functions allow creating maps, real-time scenes, animations and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package, gives a rough overview of the available functions, and uses application examples to illustrate the great potential of numeric modelling of geomorphologic processes.
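RCHILD itself is an R package; purely to illustrate the wrapper pattern it automates (dynamic editing of a text input file followed by a model run), here is a hypothetical sketch in Python with an invented 'KEY: value' input format and an invented binary name.

```python
# Generic wrapper pattern for a command-line model driven by a text input
# file (format and binary name are invented; CHILD's actual syntax differs).
import subprocess

def set_parameter(infile, key, value):
    """Rewrite 'KEY: value' lines in a model input file."""
    with open(infile) as fh:
        lines = fh.read().splitlines()
    out = [f"{key}: {value}" if ln.startswith(key) else ln for ln in lines]
    with open(infile, "w") as fh:
        fh.write("\n".join(out) + "\n")

def run_model(infile):
    subprocess.run(["child", infile], check=True)   # hypothetical binary name

# Iterative scenario: vary an uplift-rate parameter and re-run the model.
# for rate in (0.1, 0.2, 0.5):
#     set_parameter("scenario.in", "UPRATE", rate)
#     run_model("scenario.in")
```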
Modelling and measurements of bunch profiles at the LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papadopoulou, S.; Antoniou, F.; Argyropoulos, T.
The bunch profiles in the LHC are often observed to be non-Gaussian, both at Flat Bottom (FB) and Flat Top (FT) energies. Especially at FT, an evolution of the tail population in time is observed. In this respect, the Monte-Carlo Software for IBS and Radiation effects (SIRE) is used to track different types of beam distributions. The impact of the distribution shape on the evolution of bunch characteristics is studied. The results are compared with observations from the LHC Run 2 data.
Verheggen, Kenneth; Raeder, Helge; Berven, Frode S; Martens, Lennart; Barsnes, Harald; Vaudel, Marc
2017-09-13
Sequence database search engines are bioinformatics algorithms that identify peptides from tandem mass spectra using a reference protein sequence database. Two decades of development, notably driven by advances in mass spectrometry, have provided scientists with more than 30 published search engines, each with its own properties. In this review, we present the common paradigm behind the different implementations, and its limitations for modern mass spectrometry datasets. We also detail how the search engines attempt to alleviate these limitations, and provide an overview of the different software frameworks available to the researcher. Finally, we highlight alternative approaches for the identification of proteomic mass spectrometry datasets, either as a replacement for, or as a complement to, sequence database search engines. © 2017 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, George T
2010-12-14
Widespread research over the past five decades has provided a wealth of experimental data and insight concerning shock hardening and the spallation response of materials subjected to square-topped shock-wave loading profiles. Less quantitative data have been gathered on the effect of direct, in-contact, high explosive (HE)-driven Taylor wave (or triangular-wave) loading profile shock loading on the shock hardening, damage evolution, or spallation response of materials. Explosive loading induces an impulse dubbed a 'Taylor wave'. This is a significantly different loading history than that achieved by a square-topped impulse, in terms of both the pulse duration at a fixed peak pressure and the unloading strain rate from the peak Hugoniot state achieved. The goal of this research is to quantify the influence of shockwave obliquity on the spallation response of copper and tantalum by subjecting plates of each material to HE-driven sweeping detonation-wave loading and quantifying both the wave propagation and the post-mortem damage evolution. This talk will summarize our current understanding of damage evolution during sweeping detonation-wave spallation loading in Cu and Ta and show comparisons to modeling simulations. The spallation responses of Cu and Ta are both shown to be critically dependent on the shockwave profile and the stress state of the shock. Based on variations in the specifics of the shock drive (pulse shape, peak stress, shock obliquity) and sample geometry in Cu and Ta, 'spall strength' varies by over a factor of two and the details of the mechanisms of damage evolution are seen to vary. Simplistic models of spallation, such as P_min based on 1-D square-top shock data, lack the physics to capture the influence of kinetics on damage evolution such as that operative during sweeping detonation loading. Such considerations are important for the development of predictive models of damage evolution and spallation in metals and alloys.
Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R
2012-01-01
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have a solid understanding of specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.
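A minimal flavour of model-driven code generation in this spirit: a declarative smart-space model is turned into device stubs. The model format and the generated API are invented for illustration; ROOD itself works from ontologies and MDA tooling rather than this toy dictionary.

```python
# Toy model-to-code generation: a platform-independent smart-space model
# (invented format) is expanded into sensor polling stubs.
SMART_GYM = {
    "devices": [
        {"name": "heart_rate_band", "kind": "sensor",   "unit": "bpm"},
        {"name": "treadmill_fan",   "kind": "actuator", "unit": "none"},
    ]
}

TEMPLATE = '''\
def read_{name}():
    """Auto-generated stub: poll the {name} {kind} ({unit})."""
    raise NotImplementedError("bind to hardware driver here")
'''

def generate(model):
    return "\n".join(TEMPLATE.format(**dev)
                     for dev in model["devices"] if dev["kind"] == "sensor")

print(generate(SMART_GYM))   # emits source code for each sensor stub
```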
Innovation diffusion on time-varying activity driven networks
NASA Astrophysics Data System (ADS)
Rizzo, Alessandro; Porfiri, Maurizio
2016-01-01
Since its introduction in the 1960s, the theory of innovation diffusion has contributed to the advancement of several research fields, such as marketing management and consumer behavior. The 1969 seminal paper by Bass [F.M. Bass, Manag. Sci. 15, 215 (1969)] introduced a model of product growth for consumer durables, which has been extensively used to predict innovation diffusion across a range of applications. Here, we propose a novel approach to study innovation diffusion, where interactions among individuals are mediated by the dynamics of a time-varying network. Our approach is based on the Bass' model, and overcomes key limitations of previous studies, which assumed timescale separation between the individual dynamics and the evolution of the connectivity patterns. Thus, we do not hypothesize homogeneous mixing among individuals or the existence of a fixed interaction network. We formulate our approach in the framework of activity driven networks to enable the analysis of the concurrent evolution of the interaction and individual dynamics. Numerical simulations offer a systematic analysis of the model behavior and highlight the role of individual activity on market penetration when targeted advertisement campaigns are designed, or a competition between two different products takes place.
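A minimal flavour of the approach can be sketched by coupling Bass-style innovation/imitation dynamics to an activity-driven network, in which links are regenerated at every time step rather than fixed; all parameter values and update rules below are illustrative assumptions, not the authors' model.

```python
# Toy Bass-style diffusion on an activity-driven network. Each step, active
# nodes create m ephemeral links; adoption occurs by spontaneous innovation
# (rate p) and by imitation across those links (rate q).
import random

random.seed(1)
N, m, p, q, steps = 1000, 3, 0.005, 0.3, 60
activity = [random.uniform(0.02, 0.4) for _ in range(N)]   # node firing rates
adopted = [False] * N

for t in range(steps):
    for i in range(N):                       # innovation term
        if not adopted[i] and random.random() < p:
            adopted[i] = True
    for i in range(N):                       # imitation over ephemeral links
        if random.random() < activity[i]:
            for j in random.sample(range(N), m):   # self-links rare, harmless
                if adopted[i] != adopted[j] and random.random() < q:
                    adopted[i] = adopted[j] = True
    if t % 10 == 0:
        print(f"t={t:3d}  adopters={sum(adopted)}")
```

The adoption curve follows the familiar S-shape, but its steepness now depends on the activity distribution rather than on a homogeneous-mixing assumption.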
Asymptotic theory of time-varying social networks with heterogeneous activity and tie allocation.
Ubaldi, Enrico; Perra, Nicola; Karsai, Márton; Vezzani, Alessandro; Burioni, Raffaella; Vespignani, Alessandro
2016-10-24
The dynamics of social networks are driven by the interplay between diverse mechanisms that still challenge our theoretical and modelling efforts. Amongst them, two are known to play a central role in shaping the networks' evolution, namely the heterogeneous propensity of individuals to (i) be socially active and (ii) establish new social relationships with their alters. Here, we empirically characterise these two mechanisms in seven real networks describing temporal human interactions in three different settings: scientific collaborations, Twitter mentions, and mobile phone calls. We find that the individuals' social activity and their strategy in choosing the ties where to allocate their social interactions can be quantitatively described and encoded in a simple stochastic network modelling framework. The Master Equation of the model can be solved in the asymptotic limit. The analytical solutions provide an explicit description of both the system dynamics and the dynamical scaling laws characterising crucial aspects of the evolution of the networks. The analytical predictions accurately match the empirical observations, thus validating the theoretical approach. Our results provide a rigorous dynamical system framework that can be extended to include other processes shaping social dynamics and to generate data-driven predictions for the asymptotic behaviour of social networks.
Stress evolution during caldera collapse
NASA Astrophysics Data System (ADS)
Holohan, E. P.; Schöpfer, M. P. J.; Walsh, J. J.
2015-07-01
The mechanics of caldera collapse are the subject of long-running debate. Particular uncertainties concern how stresses around a magma reservoir relate to fracturing as the reservoir roof collapses, and how roof collapse in turn impacts upon the reservoir. We used two-dimensional Distinct Element Method models to characterise the evolution of stress around a depleting sub-surface magma body during gravity-driven collapse of its roof. These models illustrate how principal stress orientations rotate during progressive deformation so that roof fracturing transitions from initial reverse faulting to later normal faulting. They also reveal four end-member stress paths to fracture, each corresponding to a particular location within the roof. Analysis of these paths indicates that fractures associated with ultimate roof failure initiate in compression (i.e. as shear fractures). We also report on how mechanical and geometric conditions in the roof affect pre-failure unloading and post-failure reloading of the reservoir. In particular, the models show how residual friction within a failed roof could, without friction reduction mechanisms or fluid-derived counter-effects, inhibit a return to a lithostatically equilibrated pressure in the magma reservoir. Many of these findings should be transferable to other gravity-driven collapse processes, such as sinkhole formation, mine collapse and subsidence above hydrocarbon reservoirs.
Identification of tumor evolution patterns by means of inductive logic programming.
Bevilacqua, Vitoantonio; Chiarappa, Patrizia; Mastronardi, Giuseppe; Menolascina, Filippo; Paradiso, Angelo; Tommasi, Stefania
2008-06-01
In considering key events of genomic disorders in the development and progression of cancer, the correlation between genomic instability and carcinogenesis is currently under investigation. In this work, we propose an inductive logic programming approach to the problem of modeling evolution patterns for breast cancer. Using this approach, it is possible to extract fingerprints of stages of the disease that can be used in order to develop and deliver the most adequate therapies to patients. Furthermore, such a model can help physicians and biologists in the elucidation of the molecular dynamics underlying the aberrations-waterfall model behind carcinogenesis. By showing results obtained on a real-world dataset, we give some hints about a further approach to the knowledge-driven validation of such hypotheses.
Statistics of certain models of evolution
NASA Astrophysics Data System (ADS)
Standish, Russell K.
1999-02-01
In a recent paper, Newman [J. Theor. Biol. 189, 235 (1997)] surveys the literature on power-law spectra in evolution and self-organized criticality, and presents a model of his own to arrive at the conclusion that self-organized criticality is not necessary for evolution. Not only did he miss a key model (Ecolab) that has a clear self-organized critical mechanism, but Newman's model also exhibits the same mechanism that gives rise to power-law behavior, as does Ecolab. Newman's model is, in fact, a "mean field" approximation of a self-organized critical system. In this paper, I have also implemented Newman's model using the Ecolab software, removing the restriction that the number of species must remain constant. It turns out that the requirement of constant species number is nontrivial, leading to a global coupling between species that is similar in effect to the species interactions seen in Ecolab. In fact, the model must self-organize to a state where the long-time average of speciations balances that of extinctions; otherwise, the system either collapses or explodes. In view of this, Newman's model does not provide the hoped-for counterexample to the presence of self-organized criticality in evolution, but does provide a simple, almost analytic model that can be used to understand more intricate models such as Ecolab.
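For readers unfamiliar with the model under discussion, the following is a minimal sketch of Newman's coherent-noise mechanism in its original constant-species-number form (parameter values are illustrative); Standish's variant relaxes the fixed-N constraint, which this sketch keeps for simplicity.

```python
# Sketch of Newman's coherent-noise model: each step an environmental stress
# wipes out all species whose tolerance is below it, and a small fraction of
# tolerances mutate. Extinction sizes come out broadly (power-law) distributed.
import random

random.seed(0)
N, f, sigma, steps = 1000, 0.01, 0.05, 2000
tolerance = [random.random() for _ in range(N)]   # tolerances uniform on [0, 1)
extinction_sizes = []

for _ in range(steps):
    eta = random.expovariate(1 / sigma)           # exponential stress, mean sigma
    victims = [i for i, x in enumerate(tolerance) if x < eta]
    extinction_sizes.append(len(victims))
    for i in victims:                             # replace extinct species
        tolerance[i] = random.random()
    for i in random.sample(range(N), int(f * N)): # background mutation
        tolerance[i] = random.random()

big = sum(s > 100 for s in extinction_sizes)
print(f"{big} extinction events exceeded 100 of {N} species")
```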
A 3D Interactive Multi-object Segmentation Tool using Local Robust Statistics Driven Active Contours
Gao, Yi; Kikinis, Ron; Bouix, Sylvain; Shenton, Martha; Tannenbaum, Allen
2012-01-01
Extracting anatomically and functionally significant structures is one of the important tasks for both the theoretical study of medical image analysis and the clinical and practical community. In the past, much work has been dedicated only to algorithmic development. Nevertheless, for clinical end users, a well-designed algorithm with interactive software is necessary for the algorithm to be utilized in their daily work. Furthermore, the software should be open sourced in order to be used and validated by not only the authors but also the entire community. Therefore, the contribution of the present work is twofold: first, we propose a new robust-statistics-based conformal metric and the conformal-area-driven multiple active contour framework, to simultaneously extract multiple targets from MR and CT medical imagery in 3D. Second, an open source, graphically interactive 3D segmentation tool based on the aforementioned contour evolution is implemented and is publicly available to end users on multiple platforms. In using this software for the segmentation task, the process is initiated by user-drawn strokes (seeds) in the target region in the image. Then, local robust statistics are used to describe the object features, and such features are learned adaptively from the seeds under a non-parametric estimation scheme. Subsequently, several active contours evolve simultaneously with their interactions being motivated by the principles of action and reaction — this not only guarantees mutual exclusiveness among the contours, but also no longer relies upon the assumption that the multiple objects fill the entire image domain, which was tacitly or explicitly assumed in many previous works. In doing so, the contours interact and converge to equilibrium at the desired positions of the desired multiple objects. Furthermore, with the aim of not only validating the algorithm and the software, but also demonstrating how the tool is to be used, we provide the reader reproducible experiments that demonstrate the capability of the proposed segmentation tool on several publicly available data sets. PMID:22831773
Constructive neutral evolution: exploring evolutionary theory's curious disconnect.
Stoltzfus, Arlin
2012-10-13
Constructive neutral evolution (CNE) suggests that neutral evolution may follow a stepwise path to extravagance. Whether or not CNE is common, the mere possibility raises provocative questions about causation: in classical neo-Darwinian thinking, selection is the sole source of creativity and direction, the only force that can cause trends or build complex features. However, much of contemporary evolutionary genetics departs from the conception of evolution underlying neo-Darwinism, resulting in a widening gap between what formal models allow, and what the prevailing view of the causes of evolution suggests. In particular, a mutationist conception of evolution as a 2-step origin-fixation process has been a source of theoretical innovation for 40 years, appearing not only in the Neutral Theory, but also in recent breakthroughs in modeling adaptation (the "mutational landscape" model), and in practical software for sequence analysis. In this conception, mutation is not a source of raw materials, but an agent that introduces novelty, while selection is not an agent that shapes features, but a stochastic sieve. This view, which now lays claim to important theoretical, experimental, and practical results, demands our attention. CNE provides a way to explore its most significant implications about the role of variation in evolution. Alex Kondrashov, Eugene Koonin and Johann Peter Gogarten reviewed this article.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Peterson, M.; Larsen, J.
2016-12-01
A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented including a summary of commonly used practices, and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of both stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies free the modeler to take a less technocratic and more human approach to water resources modelling.
McDonald, Thomas O; Michor, Franziska
2017-07-15
SIApopr (Simulating Infinite-Allele populations) is an R package to simulate time-homogeneous and inhomogeneous stochastic branching processes under a very flexible set of assumptions using the speed of C++. The software simulates clonal evolution with the emergence of driver and passenger mutations under the infinite-allele assumption. The software is an application of the Gillespie Stochastic Simulation Algorithm expanded to a large number of cell types and scenarios, with the intention of allowing users to easily modify existing models or create their own. SIApopr is available as an R library on GitHub ( https://github.com/olliemcdonald/siapopr ). Supplementary data are available at Bioinformatics online. Contact: michor@jimmy.harvard.edu. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
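A toy version of the infinite-allele birth-death-mutation loop that such a simulator runs may help fix ideas. SIApopr itself is an R package backed by C++, and none of the names below belong to its API; this is a hedged Python sketch of the Gillespie logic only.

```python
# Toy Gillespie birth-death process with infinite-allele mutation; rates and
# structure are illustrative, not SIApopr's implementation.
import random

def gillespie_branching(birth=1.0, death=0.9, mut=0.01, t_max=10.0):
    clones = {0: 1}                    # allele id -> cell count
    next_id, t = 1, 0.0
    while t < t_max and clones:
        n = sum(clones.values())
        t += random.expovariate(n * (birth + death))   # time to next event
        cid = random.choices(list(clones), weights=list(clones.values()))[0]
        if random.random() < birth / (birth + death):  # division
            if random.random() < mut:                  # new, never-seen allele
                clones[next_id] = 1
                next_id += 1
            else:
                clones[cid] += 1
        else:                                          # death
            clones[cid] -= 1
            if clones[cid] == 0:
                del clones[cid]
    return clones
```

Driver versus passenger mutations would simply be alleles whose birth rates differ, which is where the package's flexibility comes in.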
1992-06-01
presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Keywords: Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile
DynamO: a free O(N) general event-driven molecular dynamics simulator.
Bannerman, M N; Sargant, R; Lue, L
2011-11-30
Molecular dynamics algorithms for systems of particles interacting through discrete or "hard" potentials are fundamentally different from the methods for continuous or "soft" potential systems. Although many software packages have been developed for continuous potential systems, software for discrete potential systems based on event-driven algorithms is relatively scarce and specialized. We present DynamO, a general event-driven simulation package, which displays the optimal O(N) asymptotic scaling of the computational cost with the number of particles N, rather than the O(N log N) scaling found in most standard algorithms. DynamO provides reference implementations of the best available event-driven algorithms. These techniques allow the rapid simulation of both complex and large (>10^6 particles) systems for long times. The performance of the program is benchmarked for elastic hard sphere systems, homogeneous cooling and sheared inelastic hard spheres, and equilibrium Lennard-Jones fluids. This software and its documentation are distributed under the GNU General Public License and can be freely downloaded from http://marcusbannerman.co.uk/dynamo. Copyright © 2011 Wiley Periodicals, Inc.
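The defining trait of event-driven molecular dynamics is that collision times are solved analytically instead of being discovered by time-stepping. The core computation for two hard spheres is a quadratic in time; below is a standard textbook version of it (not DynamO source code).

```python
# Time until two hard spheres of contact distance sigma collide, given their
# relative position r12 and relative velocity v12; returns inf on a miss.
import numpy as np

def collision_time(r12, v12, sigma):
    b = np.dot(r12, v12)
    if b >= 0:                         # spheres are not approaching
        return np.inf
    v2 = np.dot(v12, v12)
    disc = b * b - v2 * (np.dot(r12, r12) - sigma ** 2)
    if disc < 0:                       # trajectories never reach contact
        return np.inf
    return (-b - np.sqrt(disc)) / v2   # earliest positive root
```

An event-driven engine keeps such times for all candidate pairs in a priority queue and jumps the system from one event to the next, which is where the favorable scaling comes from.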
The jABC Approach to Rigorous Collaborative Development of SCM Applications
NASA Astrophysics Data System (ADS)
Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong
Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model-driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building blocks into (flow-)graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in operative practice.
Wichuk, Kristine; Brynjólfsson, Sigurður; Fu, Weiqi
2014-01-01
We recently evaluated the relationship between abiotic environmental stresses and lutein biosynthesis in the green microalga Dunaliella salina and suggested a rational design of stress-driven adaptive evolution experiments for carotenoid production in microalgae. Here, we summarize our recent findings regarding the biotechnological production of carotenoids from microalgae and outline emerging technology in this field. Carotenoid metabolic pathways are characterized in several representative algal species, as they pave the way for biotechnology development. The adaptive evolution strategy is highlighted in connection with enhanced growth rate and carotenoid metabolism. In addition, available genetic modification tools are described, with emphasis on model species. A brief discussion of the role of light as a limiting factor in carotenoid production in microalgae is also included. Overall, our analysis suggests that light-driven metabolism and the photosynthetic efficiency of microalgae in photobioreactors are the main bottlenecks in enhancing the biotechnological potential of carotenoid production from microalgae.
Diffusion and transport in locally disordered driven lattices
NASA Astrophysics Data System (ADS)
Wulf, Thomas; Okupnik, Alexander; Schmelcher, Peter
2016-09-01
We study the effect of disorder on the particle density evolution in a classical Hamiltonian driven lattice setup. If the disorder is localized within a finite sub-domain of the lattice, we show the emergence of strong tails in the density distribution that even increase towards larger positions, yielding a highly non-Gaussian particle density evolution. As the key underlying mechanism, we identify the conversion between different components of the unperturbed system's mixed phase space, induced by the disorder. Based on the introduction of individual conversion rates between chaotic and regular components, we develop a theoretical model that correctly predicts the scaling of the particle density. We also study the effect of disorder on the transport properties and show a significant enhancement of transport for cases of localized disorder, contrasting strongly with the merely weak modification of transport under global disorder.
NASA Technical Reports Server (NTRS)
Schmid, R. M.
1973-01-01
The vestibulo-ocular system is examined from the standpoint of system theory. The evolution of a mathematical model of the vestibulo-ocular system in an attempt to match more and more experimental data is followed step by step. The final model explains many characteristics of the eye movement in vestibularly induced nystagmus. The analysis of the dynamic behavior of the model at the different stages of its development is illustrated in time domain, mainly in a qualitative way.
Data-Driven Approaches for Paraphrasing across Language Variations
ERIC Educational Resources Information Center
Xu, Wei
2014-01-01
Our language changes very rapidly, accompanying political, social and cultural trends, as well as the evolution of science and technology. The Internet, especially the social media, has accelerated this process of change. This poses a severe challenge for both human beings and natural language processing (NLP) systems, which usually only model a…
Dynamic loading and release in Johnson Space Center Lunar regolith simulant
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Jensen, B. J.; Wescott, B. L.; Skinner McKee, T. E.
2011-10-01
The behavior of regolith under dynamic loading is important for the study of planetary evolution, impact cratering, and other topics. Here we present the initial results of explosively driven flier plate experiments and numerical models of compaction and release in samples of the JSC-1A Lunar regolith simulant.
In North America, ammonia (NH3) is increasingly being recognized not only for its role in atmospheric aerosol formation but also as an important component of atmospheric nitrogen deposition. This has been driven by the evolution of policies to protect ecosystems from nitrogen ov...
Irradiation-driven Mass Transfer Cycles in Compact Binaries
NASA Astrophysics Data System (ADS)
Büning, A.; Ritter, H.
2005-08-01
We elaborate on the analytical model of Ritter, Zhang, & Kolb (2000), which describes the basic physics of irradiation-driven mass transfer cycles in semi-detached compact binary systems. In particular, we take into account a contribution to the thermal relaxation of the donor star which is unrelated to irradiation and which was neglected in previous studies. We present results of simulations of the evolution of compact binaries undergoing mass transfer cycles, in particular also of systems with a nuclear-evolved donor star. These computations have been carried out with a stellar evolution code which computes mass transfer implicitly and models irradiation of the donor star in a point source approximation, thereby allowing for much more realistic simulations than were hitherto possible. We find that low-mass X-ray binaries (LMXBs) and cataclysmic variables (CVs) with orbital periods ≲ 6 hr can undergo mass transfer cycles only for low angular momentum loss rates. CVs containing a giant donor or one near the terminal-age main sequence are more stable than previously thought, but can possibly also undergo mass transfer cycles.
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
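For readers unfamiliar with the goal/question/metric (GQM) paradigm, it is simply a small hierarchy linking a measurement goal to the questions that refine it and the metrics that answer them. The record below is a hypothetical illustration of that shape, not an SEL artifact.

```python
# A hypothetical GQM plan expressed as plain data: one goal, refined by
# questions, each answered by concrete metrics collected on projects.
gqm_plan = {
    "goal": "Reduce defects delivered to acceptance test",
    "questions": [
        {"question": "Where are defects introduced?",
         "metrics": ["defects per lifecycle phase", "defects per module"]},
        {"question": "Are inspections effective?",
         "metrics": ["defects found per inspection hour"]},
    ],
}
```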
Milestones in Software Engineering and Knowledge Engineering History: A Comparative Review
del Águila, Isabel M.; Palma, José; Túnez, Samuel
2014-01-01
We present a review of the historical evolution of software engineering, intertwining it with the history of knowledge engineering because "those who cannot remember the past are condemned to repeat it." This retrospective represents a further step toward understanding the current state of both types of engineering; history also holds positive experiences, some of which we would like to remember and to repeat. The two disciplines had parallel and divergent evolutions, but followed a similar pattern. We also define a set of milestones that represent a convergence or divergence of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help in the evolution of the other. PMID:24624046
Galaxy Zoo: Observing secular evolution through bars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheung, Edmond; Faber, S. M.; Koo, David C.
In this paper, we use the Galaxy Zoo 2 data set to study the behavior of bars in disk galaxies as a function of specific star formation rate (SSFR) and bulge prominence. Our sample consists of 13,295 disk galaxies, with an overall (strong) bar fraction of 23.6% ± 0.4%, of which 1154 barred galaxies also have bar length (BL) measurements. These samples are the largest ever used to study the role of bars in galaxy evolution. We find that the likelihood of a galaxy hosting a bar is anticorrelated with SSFR, regardless of stellar mass or bulge prominence. We find that the trends of bar likelihood and BL with bulge prominence are bimodal with SSFR. We interpret these observations using state-of-the-art simulations of bar evolution that include live halos and the effects of gas and star formation. We suggest our observed trends of bar likelihood with SSFR are driven by the gas fraction of the disks, a factor demonstrated to significantly retard both bar formation and evolution in models. We interpret the bimodal relationship between bulge prominence and bar properties as being due to the complicated effects of classical bulges and central mass concentrations on bar evolution and also to the growth of disky pseudobulges by bar evolution. These results represent empirical evidence for secular evolution driven by bars in disk galaxies. This work suggests that bars are not stagnant structures within disk galaxies but are a critical evolutionary driver of their host galaxies in the local universe (z < 1).
The Past, Present, and Future of Demand-Driven Acquisitions in Academic Libraries
ERIC Educational Resources Information Center
Goedeken, Edward A.; Lawson, Karen
2015-01-01
Demand-driven acquisitions (DDA) programs have become a well-established approach toward integrating user involvement in the process of building academic library collections. However, these programs are in a constant state of evolution. A recent iteration in this evolution of ebook availability is the advent of large ebook collections whose…
Inhomogeneous chemical evolution of r-process elements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wehmeyer, B., E-mail: benjamin.wehmeyer@unibas.ch; Thielemann, F.-K.; Pignatari, M.
2016-06-21
We report the results of a galactic chemical evolution (GCE) study for r-process and alpha elements. For this work, we used the inhomogeneous GCE model "ICE", which allows us to keep track of the galactic abundances of elements produced by different astrophysical sites. The main input parameters for this study were: (a) the neutron star merger (NSM) coalescence time scale and the probability of NSMs, and (b) for the sub-class of "magneto-rotationally driven supernovae" ("Jet-SNe"), their occurrence rate in comparison to "standard" supernovae (SNe).
NASA Astrophysics Data System (ADS)
Singh, H.; Donetsky, D.; Liu, J.; Attenkofer, K.; Cheng, B.; Trelewicz, J. R.; Lubomirsky, I.; Stavitski, E.; Frenkel, A. I.
2018-04-01
We report the development, testing, and demonstration of a setup for modulation excitation spectroscopy experiments at the Inner Shell Spectroscopy beamline of National Synchrotron Light Source - II. A computer algorithm and dedicated software were developed for asynchronous data processing and analysis. We demonstrate the reconstruction of X-ray absorption spectra for different time points within the modulation pulse using a model system. This setup and the software are intended for a broad range of functional materials which exhibit structural and/or electronic responses to the external stimulation, such as catalysts, energy and battery materials, and electromechanical devices.
US hydropower resource assessment for Hawaii
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francfort, J.E.
1996-09-01
The US DOE is developing an estimate of the undeveloped hydropower potential in the United States. The Hydropower Evaluation Software (HES) is a computer model developed by INEL for this purpose. HES measures the undeveloped hydropower resources available in the United States using uniform criteria for measurement. The software was tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the PC user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on those attributes, and generate reports. This report describes the resource assessment results for the State of Hawaii.
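The attribute-to-suitability step the abstract describes might look roughly like the following; the multiplicative combination and the attribute names are guesses for illustration, not the actual HES formula.

```python
# Hypothetical suitability calculation: each environmental attribute carries a
# penalty in [0, 1] and can only reduce a site's development suitability.
def suitability_factor(attribute_penalties):
    factor = 1.0
    for penalty in attribute_penalties.values():
        factor *= (1.0 - penalty)
    return factor

site = {"wild_and_scenic_river": 0.9, "threatened_fish": 0.25, "cultural_site": 0.1}
print(round(suitability_factor(site), 3))   # ~0.068 -- a heavily constrained site
```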
3D Modeling of Antenna Driven Slow Waves Excited by Antennas Near the Plasma Edge
NASA Astrophysics Data System (ADS)
Smithe, David; Jenkins, Thomas
2016-10-01
Prior work with the 3D finite-difference time-domain (FDTD) plasma and sheath model used to model ICRF antennas in fusion plasmas has highlighted the possibility of slow wave excitation at the very low end of the SOL density range, and thus the prudent need for a slow-time evolution model to treat SOL density modifications due to the RF itself. At higher frequency, the DIII-D helicon antenna has much easier access to a parasitic slow wave excitation, and in this case the Faraday screen provides the dominant means of controlling the content of the launched mode, with antenna end-effects remaining a concern. In both cases the danger is the same: the slow wave propagates into a lower-hybrid resonance layer a short distance (~cm) away from the antenna, which would parasitically absorb power, transferring energy to the SOL edge plasma, primarily through electron-neutral collisions. We will present 3D modeling of antennas at both ICRF and helicon frequencies. We have added a slow-time evolution capability for the SOL plasma density to include ponderomotive-force-driven rarefaction from the strong fields in the vicinity of the antenna, and show initial application to NSTX antenna geometry and plasma configurations. The model is based on a Scalar Ponderomotive Potential method, using self-consistently computed local field amplitudes from the 3D simulation.
Bar-Code System for a Microbiological Laboratory
NASA Technical Reports Server (NTRS)
Law, Jennifer; Kirschner, Larry
2007-01-01
A bar-code system has been assembled for a microbiological laboratory that must examine a large number of samples. The system includes a commercial bar-code reader, computer hardware and software components, plus custom-designed database software. The software generates a user-friendly, menu-driven interface.
NASA Astrophysics Data System (ADS)
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-05-01
Russian higher education institutions' tradition of teaching large-enrollment classes is impairing students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on specific computational methods and algorithmic approaches, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and possible oversights. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.
Edge-driven microplate kinematics
Schouten, Hans; Klitgord, Kim D.; Gallo, David G.
1993-01-01
It is known from plate tectonic reconstructions that oceanic microplates undergo rapid rotation about a vertical axis and that the instantaneous rotation axes describing the microplate's motion relative to the bounding major plates are frequently located close to its margins with those plates, close to the tips of propagating rifts. We propose a class of edge-driven block models to illustrate how slip across the microplate margins, block rotation, and propagation of rifting may be related to the relative motion of the plates on either side. An important feature of these edge-driven models is that the instantaneous rotation axes are always located on the margins between block and two bounding plates. According to those models the pseudofaults or traces of disrupted seafloor resulting from the propagation of rifting between microplate and major plates may be used independently to approximately trace the continuous kinematic evolution of the microplate back in time. Pseudofault geometries and matching rotations of the Easter microplate show that for most of its 5 m.y. history, block rotation could be driven by the drag of the Nazca and Pacific plates on the microplate's edges rather than by a shear flow of mantle underneath.
Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies
NASA Technical Reports Server (NTRS)
Talabac, Stephen J.
2004-01-01
Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.
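The event-driven concept can be made concrete with a small publish/subscribe sketch: one platform's detection is broadcast, and subscribed platforms switch measurement modes in response. All class and event names below are hypothetical, not NASA flight software.

```python
# Minimal event-driven sensor-web sketch: a detection event triggers other
# platforms to adopt a targeted observing strategy.
class SensorWeb:
    def __init__(self):
        self.platforms = []

    def register(self, platform):
        self.platforms.append(platform)

    def publish(self, event):
        for platform in self.platforms:
            platform.react(event)

class Imager:
    def __init__(self, name):
        self.name, self.mode = name, "survey"

    def react(self, event):
        if event["kind"] == "storm_detected":
            self.mode = "targeted"     # reconfigure to the adaptive strategy
            print(f"{self.name}: retargeting to {event['where']}")

web = SensorWeb()
web.register(Imager("platform-A"))
web.register(Imager("platform-B"))
web.publish({"kind": "storm_detected", "where": (12.3, -45.6)})
```

A model-driven variant would differ only in the event source: a prediction model, rather than a sensor, would publish the trigger.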
An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Follen, Gregory J.
2003-01-01
Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never before thought possible. Full-engine three-dimensional computational fluid dynamics (CFD) propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT).
Data-Driven Decision Making as a Tool to Improve Software Development Productivity
ERIC Educational Resources Information Center
Brown, Mary Erin
2013-01-01
The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…
Constraints on genes shape long-term conservation of macro-synteny in metazoan genomes.
Lv, Jie; Havlak, Paul; Putnam, Nicholas H
2011-10-05
Many metazoan genomes conserve chromosome-scale gene linkage relationships ("macro-synteny") from the common ancestor of multicellular animal life [1-4], but the biological explanation for this conservation is still unknown. Double cut and join (DCJ) is a simple, well-studied model of neutral genome evolution amenable to both simulation and mathematical analysis [5], but as we show here, it is not sufficient to explain long-term macro-synteny conservation. We examine a family of simple (one-parameter) extensions of DCJ to identify models and choices of parameters consistent with the levels of macro- and micro-synteny conservation observed among animal genomes. Our software implements a flexible strategy for incorporating various types of genomic context into the DCJ model ("DCJ-[C]"), and is available as open source software from http://github.com/putnamlab/dcj-c. A simple model of genome evolution, in which DCJ moves are allowed only if they maintain chromosomal linkage among a set of constrained genes, can simultaneously account for the level of macro-synteny conservation and for correlated conservation among multiple pairs of species. Simulations under this model indicate that a constraint on approximately 7% of metazoan genes is sufficient to constrain genome rearrangement to an average rate of 25 inversions and 1.7 translocations per million years.
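A toy rendition of the constrained move set ("DCJ moves are allowed only if they maintain chromosomal linkage among a set of constrained genes") is sketched below. The genome representation, the restriction to translocations, and the rejection rule are simplifications assumed here; this is not the authors' DCJ-[C] code.

```python
# Toy constrained rearrangement: genomes are lists of chromosomes, chromosomes
# are lists of gene names (each with at least two genes).
import random

def chromosome_index(genome, gene):
    for i, chromosome in enumerate(genome):
        if gene in chromosome:
            return i
    raise ValueError(gene)

def try_translocation(genome, constrained_genes):
    """Propose a tail swap between two chromosomes; reject if it breaks linkage."""
    a, b = random.sample(range(len(genome)), 2)
    i = random.randrange(1, len(genome[a]))
    j = random.randrange(1, len(genome[b]))
    proposal = list(genome)
    proposal[a] = genome[a][:i] + genome[b][j:]
    proposal[b] = genome[b][:j] + genome[a][i:]
    # constraint: genes that shared a chromosome must still share one
    before = {chromosome_index(genome, g) for g in constrained_genes}
    after = {chromosome_index(proposal, g) for g in constrained_genes}
    if len(before) == 1 and len(after) > 1:
        return genome                  # reject the move; linkage would break
    return proposal
```

Repeating such proposals over millions of simulated years and scoring synteny against observed genomes is, in spirit, what the reported simulations do.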
NASA Astrophysics Data System (ADS)
Wolfs, Vincent; Willems, Patrick
2015-04-01
Water managers rely increasingly on mathematical simulation models that represent individual parts of the water system, such as the river, sewer system or waste water treatment plant. The current evolution towards integral water management requires the integration of these distinct components, leading to an increased model scale and scope. Besides this growing model complexity, certain applications have gained interest and importance, such as uncertainty and sensitivity analyses, auto-calibration of models and real-time control. All these applications share the need for models with a very limited calculation time, either for performing a large number of simulations or for a long-term simulation followed by statistical post-processing of the results. The commonly applied detailed models that solve (part of) the de Saint-Venant equations are infeasible for these applications and for such integrated modelling, chiefly because of their long simulation times and the inability to couple submodels built in different software environments. Instead, practitioners must use simplified models for these purposes. Such models are characterized by empirical relationships and sacrifice model detail and accuracy for increased computational efficiency. The presented research discusses the development of a flexible integral modelling platform that complies with three key requirements: (1) include a modelling approach for water quantity predictions for rivers, floodplains, sewer systems and rainfall-runoff routing that requires a minimal calculation time; (2) provide fast and semi-automatic model configuration, making maximum use of data from existing detailed models and measurements; (3) have a calculation scheme based on open source code to allow for future extensions or coupling with other models. First, a novel and flexible modular modelling approach based on the storage cell concept was developed. This approach divides each subcomponent, such as the river, sewer or floodplain, into an arrangement of interconnected cells, thereby lumping processes in space and time. Depending on the behaviour of the system to be emulated and the desired level of accuracy, variables of interest can be predicted by adopting and calibrating one of the predefined model structures, such as weir equations, transfer functions (Wolfs et al., 2013) and self-learning structures including neural networks, model trees and fuzzy systems (Wolfs and Willems, 2013, 2014). Next, a software tool was developed to facilitate and speed up model configuration. Close integration is foreseen with the MIKE (DHI) and InfoWorks (Innovyze) software. The tool also automatically sets up the calculation scheme in the C programming language. The developed modelling approach and software were tested extensively on multiple case studies, including flood mapping under uncertainty along a river, real-time control of hydraulic structures to prevent flooding, and quantification of the effect of retention basins on floods in a coupled sewer-river system (De Vleeschauwer et al., 2014). Research is currently being done on extending the modelling approach and accompanying software tool with physicochemical water quality modules. Acknowledgments: This research was supported by the Agency for Innovation by Science and Technology in Flanders (IWT). The authors would like to thank DHI and Innovyze for the MIKE and InfoWorks licenses.
References: • De Vleeschauwer, K., Weustenraad, J., Nolf, C., Wolfs, V., De Meulder, B., Shannon, K., Willems, P. (2014). Green-blue water in the city: quantification of impact of source control versus end-of-pipe solutions on sewer and river floods. Water Science and Technology, 70(11), 1825-1837. • Wolfs, V., Villazon Gomez, M., Willems, P. (2013). Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems. Water Science and Technology, 68(1), 167-175. • Wolfs, V., Willems, P. (2013). A data driven approach using Takagi-Sugeno models for computationally efficient lumped floodplain modeling. Journal of Hydrology, 503, 222-232. • Wolfs, V., Willems, P. (2014). Development of discharge-stage curves affected by hysteresis using time varying models, model tree and neural networks. Environmental Modelling & Software, 55, 107-119.
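The storage-cell concept described in the abstract above fits in a few lines: each cell holds a volume and spills to the next through a calibrated relation such as a weir equation. The sketch below is a minimal illustration under an assumed Q = C·h^1.5 outflow law and invented names; the paper's calibrated model structures (transfer functions, neural networks, and so on) would replace the weir law per cell.

```python
# One explicit time step of a cascade of storage cells with weir-type outflows.
def step_cells(storages, weir_coef, crest, dt, inflow):
    outflows = []
    for i, volume in enumerate(storages):
        head = max(volume - crest[i], 0.0)
        outflows.append(weir_coef[i] * head ** 1.5)   # weir equation Q = C*h^1.5
    updated = list(storages)
    updated[0] += dt * (inflow - outflows[0])
    for i in range(1, len(storages)):
        updated[i] += dt * (outflows[i - 1] - outflows[i])   # cascade downstream
    return updated, outflows
```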
Evolution of weighted complex bus transit networks with flow
NASA Astrophysics Data System (ADS)
Huang, Ailing; Xiong, Jie; Shen, Jinsheng; Guan, Wei
2016-02-01
Study of the intrinsic properties and evolutionary mechanisms of urban public transit networks (PTNs) has great significance for transit planning and control, particularly when passengers' dynamic behaviors are considered. This paper presents an empirical analysis exploring the complex properties of Beijing's weighted bus transit network (BTN) based on passenger flow in L-space, and proposes a bi-level evolution model to simulate the development of transit routes from the viewpoint of complex networks. The model is an iterative process driven by passengers' travel demands and a dual-controlled interest mechanism, composed of passengers' spatio-temporal requirements and the cost constraints of transit agencies. The flow's dynamic behaviors, including the evolution of travel demand, the sectional flow attracted by a new link, and the flow perturbation triggered in nearby routes, are taken into consideration in the evolution process. We present a numerical experiment to validate the model, in which the main parameters are estimated using distribution functions deduced from real-world data. The results show that our model can generate a BTN with complex properties, such as scale-free behavior and the small-world phenomenon, in agreement with our empirical results. Our findings can be exploited to optimize the real BTN's structure and improve the network's robustness.
Macro-actor execution on multilevel data-driven architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaudiot, J.L.; Najjar, W.
1988-12-31
The data-flow model of computation brings high programmability to multiprocessors at the expense of increased overhead. Applying the model at a higher level leads to better performance but also introduces a loss of parallelism. We demonstrate here syntax-directed program decomposition methods for the creation of large macro-actors in numerical algorithms. In order to alleviate some of the problems introduced by the lower-resolution interpretation, we describe a multi-level resolution scheme and analyze the requirements for its actual hardware and software integration.
Evolution of Software-Only-Simulation at NASA IV and V
NASA Technical Reports Server (NTRS)
McCarty, Justin; Morris, Justin; Zemerick, Scott
2014-01-01
Software-Only-Simulations have been an emerging and quickly developing field of study throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from lower-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is presented, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.
An arms race between producers and scroungers can drive the evolution of social cognition
2014-01-01
The “social intelligence hypothesis” states that the need to cope with complexities of social life has driven the evolution of advanced cognitive abilities. It is usually invoked in the context of challenges arising from complex intragroup structures, hierarchies, and alliances. However, a fundamental aspect of group living remains largely unexplored as a driving force in cognitive evolution: the competition between individuals searching for resources (producers) and conspecifics that parasitize their findings (scroungers). In populations of social foragers, abilities that enable scroungers to steal by outsmarting producers, and those allowing producers to prevent theft by outsmarting scroungers, are likely to be beneficial and may fuel a cognitive arms race. Using analytical theory and agent-based simulations, we present a general model for such a race that is driven by the producer–scrounger game and show that the race’s plausibility is dramatically affected by the nature of the evolving abilities. If scrounging and scrounging avoidance rely on separate, strategy-specific cognitive abilities, arms races are short-lived and have a limited effect on cognition. However, general cognitive abilities that facilitate both scrounging and scrounging avoidance undergo stable, long-lasting arms races. Thus, ubiquitous foraging interactions may lead to the evolution of general cognitive abilities in social animals, without the requirement of complex intragroup structures. PMID:24822021
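A hedged, minimal agent-based sketch of the producer-scrounger interaction may clarify the setup: a single "cognition" value stands in for the general cognitive ability discussed above, raising a producer's chance of keeping its find and a scrounger's chance of stealing. The payoffs and functional forms are illustrative choices, not the paper's model.

```python
# Toy producer-scrounger round; assumes the population contains at least one
# producer. Agents are dicts: {"role": "producer"|"scrounger", "cognition": 0..1}.
import random

def play_round(population, finder_share=0.6):
    producers = [a for a in population if a["role"] == "producer"]
    payoffs = []
    for agent in population:
        if agent["role"] == "producer":
            kept_all = random.random() < agent["cognition"]   # avoided theft
            payoffs.append(1.0 if kept_all else finder_share)
        else:
            victim = random.choice(producers)
            stole = random.random() < agent["cognition"] * (1 - victim["cognition"])
            payoffs.append((1.0 - finder_share) if stole else 0.0)
    return payoffs
```

Because the same cognition value pays off in both roles, selection on it does not relax when role frequencies shift, which is the intuition behind the stable arms race the authors report for general abilities.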
Can clues from evolution unlock the molecular development of the cerebellum?
Butts, Thomas; Chaplin, Natalie; Wingate, Richard J T
2011-02-01
The cerebellum sits at the rostral end of the vertebrate hindbrain and is responsible for sensory and motor integration. Owing to its relatively simple architecture, it is one of the most powerful model systems for studying brain evolution and development. Over the last decade, the combination of molecular fate-mapping techniques in the mouse and experimental studies, both in vitro and in vivo, in mouse and chick has significantly advanced our understanding of cerebellar neurogenesis in space and time. In amniotes, the most numerous cell type in the cerebellum, and indeed the brain, is the cerebellar granule neuron, and these cells are born from a transient secondary proliferative zone, the external granule layer (EGL), where proliferation is driven by sonic hedgehog signalling and causes cerebellar foliation. Recent studies in zebrafish and sharks have shown that while the molecular mechanisms of neurogenesis appear conserved across vertebrates, the EGL as a site of shh-driven transit amplification is not, and it is therefore implicated as a key amniote innovation that facilitated the evolution of the elaborate foliated cerebella found in birds and mammals. Elucidating the molecular mechanisms underlying the origin of the EGL in evolution could have significant impacts on our understanding of the molecular details of cerebellar development.
Transportable Applications Environment (TAE) Tenth Users' Conference
NASA Technical Reports Server (NTRS)
Rouff, Chris (Editor); Harris, Elfrieda (Editor); Yeager, Arleen (Editor)
1993-01-01
Conference proceedings are represented in graphic visual-aid form. Presentation and panel discussion topics include user experiences with C++ and Ada; the design and interaction of the user interface; the history and goals of TAE; commercialization and testing of TAE Plus; Computer-Human Interaction Models (CHIMES); data driven objects; item-to-item connections and object dependencies; and integration with other software. There follows a list of conference attendees.
Geography and Similarity of Regional Cuisines in China
Zhu, Yu-Xiao; Huang, Junming; Zhang, Zi-Ke; Zhang, Qian-Ming; Zhou, Tao; Ahn, Yong-Yeol
2013-01-01
Food occupies a central position in every culture and it is therefore of great interest to understand the evolution of food culture. The advent of the World Wide Web and online recipe repositories have begun to provide unprecedented opportunities for data-driven, quantitative study of food culture. Here we harness an online database documenting recipes from various Chinese regional cuisines and investigate the similarity of regional cuisines in terms of geography and climate. We find that geographical proximity, rather than climate proximity, is a crucial factor that determines the similarity of regional cuisines. We develop a model of regional cuisine evolution that provides helpful clues for understanding the evolution of cuisines and cultures. PMID:24260166
The Legacy of Space Shuttle Flight Software
NASA Technical Reports Server (NTRS)
Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.
2011-01-01
The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.
2011-09-01
service-oriented systems • Software-as-a-Service (SaaS) • social network infrastructures • Internet marketing • mobile computing • context awareness...Maintenance and Evolution of Service-Oriented Systems (MESOA 2010), organized by members of the Carnegie Mellon Software Engineering Institute's...The Software Engineering Institute (SEI) started developing a service-oriented architecture (CMU/SEI-2011-SR-008)
Rapid climate change and the rate of adaptation: insight from experimental quantitative genetics.
Shaw, Ruth G; Etterson, Julie R
2012-09-01
Evolution proceeds unceasingly in all biological populations. It is clear that climate-driven evolution has molded plants in deep time and within extant populations. However, it is less certain whether adaptive evolution can proceed sufficiently rapidly to maintain the fitness and demographic stability of populations subjected to exceptionally rapid contemporary climate change. Here, we consider this question, drawing on current evidence on the rate of plant range shifts and the potential for an adaptive evolutionary response. We emphasize advances in understanding based on theoretical studies that model interacting evolutionary processes, and we provide an overview of quantitative genetic approaches that can parameterize these models to provide more meaningful predictions of the dynamic interplay between genetics, demography and evolution. We outline further research that can clarify both the adaptive potential of plant populations as climate continues to change and the role played by ongoing adaptation in their persistence. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
Cheng, Ren-Chung; Kuntner, Matjaž
2014-10-01
Sexual dimorphism describes substantial differences between male and female phenotypes. In spiders, sexual dimorphism research almost exclusively focuses on size, and recent studies have recovered steady evolutionary size increases in females, and independent evolutionary size changes in males. Their discordance is due to negative allometric size patterns caused by different selection pressures on male and female sizes (converse Rensch's rule). Here, we investigated macroevolutionary patterns of sexual size dimorphism (SSD) in Argiopinae, a global lineage of orb-weaving spiders with varying degrees of SSD. We devised a Bayesian and maximum-likelihood molecular species-level phylogeny, and then used it to reconstruct sex-specific size evolution, to examine general hypotheses and different models of size evolution, to test for sexual size coevolution, and to examine allometric patterns of SSD. Our results, revealing ancestral moderate sizes and SSD, failed to reject the Brownian motion model, which suggests a nondirectional size evolution. Contrary to predictions, male and female sizes were phylogenetically correlated, and SSD evolution was isometric. We interpret these results to question the classical explanations of female-biased SSD via fecundity, gravity, and differential mortality. In argiopines, SSD evolution may be driven by these or additional selection mechanisms, but perhaps at different phylogenetic scales. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.
Asymmetric ecological conditions favor Red-Queen type of continued evolution over stasis.
Nordbotten, Jan Martin; Stenseth, Nils C
2016-02-16
Four decades ago, Leigh Van Valen presented the Red Queen's hypothesis to account for evolution of species within a multispecies ecological community [Van Valen L (1973) Evol Theory 1(1):1-30]. The overall conclusion of Van Valen's analysis was that evolution would continue even in the absence of abiotic perturbations. Stenseth and Maynard Smith presented in 1984 [Stenseth NC, Maynard Smith J (1984) Evolution 38(4):870-880] a model for the Red Queen's hypothesis showing that both Red-Queen type of continuous evolution and stasis could result from a model with biotically driven evolution. However, although that contribution demonstrated that both evolutionary outcomes were possible, it did not identify which ecological conditions would lead to each of these evolutionary outcomes. Here, we provide, using a simple, yet general population-biologically founded eco-evolutionary model, such analytically derived conditions: Stasis will predominantly emerge whenever the ecological system contains only symmetric ecological interactions, whereas both Red-Queen and stasis type of evolution may result if the ecological interactions are asymmetrical, and more likely so with increasing degree of asymmetry in the ecological system (i.e., the more trophic interactions, host-pathogen interactions, and the like there are [i.e., +/- type of ecological interactions as well as asymmetric competitive (-/-) and mutualistic (+/+) ecological interactions]). In the special case of no between-generational genetic variance, our results also predict dynamics within these types of purely ecological systems.
EcoliWiki: a wiki-based community resource for Escherichia coli
McIntosh, Brenley K.; Renfro, Daniel P.; Knapp, Gwendowlyn S.; Lairikyengbam, Chanchala R.; Liles, Nathan M.; Niu, Lili; Supak, Amanda M.; Venkatraman, Anand; Zweifel, Adrienne E.; Siegele, Deborah A.; Hu, James C.
2012-01-01
EcoliWiki is the community annotation component of the PortEco (http://porteco.org; formerly EcoliHub) project, an online data resource that integrates information on laboratory strains of Escherichia coli, its phages, plasmids and mobile genetic elements. As one of the early adopters of the wiki approach to model organism databases, EcoliWiki was designed to not only facilitate community-driven sharing of biological knowledge about E. coli as a model organism, but also to be interoperable with other data resources. EcoliWiki content currently covers genes from five laboratory E. coli strains, 21 bacteriophage genomes, F plasmid and eight transposons. EcoliWiki integrates the Mediawiki wiki platform with other open-source software tools and in-house software development to extend how wikis can be used for model organism databases. EcoliWiki can be accessed online at http://ecoliwiki.net. PMID:22064863
NASA Astrophysics Data System (ADS)
Doranti Tiritan, Carolina; Hackspacher, Peter C.; Glasmacher, Ulrich A.
2014-05-01
The Poços de Caldas Plateau in southeastern Brazil is characterized by high-relief topography supported by Precambrian crystalline rocks and by the Poços de Caldas Alkaline Massif (PCAM). Ulbrich et al. (2002) determined that the ages of the predominant PCAM intermediate rocks are constrained to ~83 Ma. In addition, geologic observations indicate that the phonolites, tinguaites and nepheline syenites were emplaced in a continuous and rapid sequence lasting 1 to 2 Ma. The topography is characterized by a dissected plateau with irregular topographic ridges and peaks, with elevations between 900 and 1300 m (a.s.l.) on the metamorphic basement and from 1300 to 1700 m (a.s.l.) in the PCAM region. The aim of this work was therefore to quantify the main processes responsible for the evolution of the landscape by using methods such as low-temperature thermochronology and 3D thermokinematic modeling, obtaining uplift and erosion rates and correlating them with the thermal gradients of the region. The 3D thermokinematic modeling was performed using the software code PECUBE (Braun 2003).
The software architecture to control the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Oya, I.; Füßling, M.; Antonino, P. O.; Conforti, V.; Hagge, L.; Melkumyan, D.; Morgenstern, A.; Tosti, G.; Schwanke, U.; Schwarz, J.; Wegner, P.; Colomé, J.; Lyard, E.
2016-07-01
The Cherenkov Telescope Array (CTA) project is an initiative to build two large arrays of Cherenkov gamma-ray telescopes. CTA will be deployed as two installations, one in the northern and the other in the southern hemisphere, containing dozens of telescopes of different sizes. CTA is a big step forward in the field of ground-based gamma-ray astronomy, not only because of the expected scientific return, but also due to the order-of-magnitude larger scale of the instrument to be controlled. The performance requirements associated with such a large and distributed astronomical installation require a thoughtful analysis to determine the best software solutions. The array control and data acquisition (ACTL) work-package within the CTA initiative will deliver the software to control and acquire the data from the CTA instrumentation. In this contribution we present the current status of the formal ACTL system decomposition into software building blocks and the relationships among them. The system is modelled via the Systems Modelling Language (SysML) formalism. To cope with the complexity of the system, this architecture model is sub-divided into different perspectives. The relationships with the stakeholders and external systems are used to create the first perspective, the context of the ACTL software system. Use cases are employed to describe the interaction of those external elements with the ACTL system and are traced to a hierarchy of functionalities (abstract system functions) describing the internal structure of the ACTL system. These functions are then traced to fully specified logical elements (software components), the deployment of which as technical elements is also described. This modelling approach allows us to decompose the ACTL software into elements to be created and the flow of information within the system, providing us with a clear way to identify sub-system interdependencies. This architectural approach allows us to build the ACTL system model, trace requirements to deliverables (source code, documentation, etc.), and implement a flexible use-case-driven software development approach thanks to the traceability from use cases to the logical software elements. The ALMA Common Software (ACS) container/component framework, used for the control of the Atacama Large Millimeter/submillimeter Array (ALMA), is the basis for the ACTL software and as such is considered an integral part of the software architecture.
The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pordes, Rush; Snider, Erica
LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as metadata and event data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.
Bennett, Joseph R.; French, Connor M.
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations on SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user. PMID:29230356
A Cloud Based Framework For Monitoring And Predicting Subsurface System Behaviour
NASA Astrophysics Data System (ADS)
Versteeg, R. J.; Rodzianko, A.; Johnson, D. V.; Soltanian, M. R.; Dwivedi, D.; Dafflon, B.; Tran, A. P.; Versteeg, O. J.
2015-12-01
Subsurface system behavior is driven and controlled by the interplay of physical, chemical, and biological processes which occur at multiple temporal and spatial scales. Capabilities to monitor, understand and predict this behavior in an effective and timely manner are needed both for scientific purposes and for effective subsurface system management. Such capabilities require three elements: models, data, and an enabling cyberinfrastructure which allows users to use these models and data in an effective manner. Under a DOE Office of Science funded STTR award, Subsurface Insights and LBNL have designed and implemented a cloud based predictive assimilation framework (PAF) which automatically ingests, quality-controls, and stores heterogeneous physical and chemical subsurface data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of subsurface systems. PAF is implemented as a modular cloud based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. On the server side, PAF uses ZF2 (a PHP web application framework) and Python, with both open source (ODM2) and in-house developed data models. On the client side, PAF uses CSS and JavaScript to allow for interactive data visualization and analysis. Client-side modularity (which allows for a responsive interface) is achieved by implementing each core capability of PAF (such as data visualization, user configuration and control, electrical geophysical monitoring, and email/SMS alerts on data streams) as a Single Page Application (SPA). One of the recent enhancements is the full integration of a number of flow, mass transport and parameter estimation codes (e.g., MODFLOW, MT3DMS, PHT3D, TOUGH, PFLOTRAN) in this framework. This integration allows for autonomous and user-controlled modeling of hydrological and geochemical processes. In our presentation we will discuss our software architecture and present the results of using these codes, and the overall performance of our framework, using hydrological, geochemical and geophysical data from the LBNL SFA2 Rifle field site.
Flight dynamics system software development environment (FDS/SDE) tutorial
NASA Technical Reports Server (NTRS)
Buell, John; Myers, Philip
1986-01-01
A sample development scenario using the Flight Dynamics System Software Development Environment (FDS/SDE) is presented. The SDE uses a menu-driven, fill-in-the-blanks format that provides online help at all steps, thus eliminating lengthy training and allowing immediate use of this new software development tool.
X-Ray Probes of Cosmic Star Formation History
NASA Technical Reports Server (NTRS)
Ghosh, Pranab; White, Nicholas E.
2001-01-01
We discuss the imprints left by a cosmological evolution of the star formation rate (SFR) on the evolution of X-ray luminosities L_X of normal galaxies, using the scheme earlier proposed by us, wherein the evolution of L_X of a galaxy is driven by the evolution of its X-ray binary population. As indicated in our earlier work, the profile of L_X with redshift can both serve as a diagnostic probe of the SFR profile and constrain evolutionary models for X-ray binaries. We report here the first calculation of the expected evolution of X-ray luminosities of galaxies, updating our work by using a suite of more recently developed SFR profiles that span the currently plausible range. The first Chandra deep imaging results on L_X evolution are beginning to probe the SFR profile of bright spiral galaxies; the early results are consistent with predictions based on current SFR models. Using these new SFR profiles, the resolution of the "birthrate problem" of low-mass X-ray binaries and recycled, millisecond pulsars in terms of an evolving global SFR is more complete. We discuss the possible impact of the variations in the SFR profile of individual galaxies and galaxy types.
Chaos and unpredictability in evolution.
Doebeli, Michael; Ispolatov, Iaroslav
2014-05-01
The possibility of complicated dynamic behavior driven by nonlinear feedbacks in dynamical systems has revolutionized science in the latter part of the last century. Yet despite examples of complicated frequency dynamics, the possibility of long-term evolutionary chaos is rarely considered. The concept of "survival of the fittest" is central to much evolutionary thinking and embodies a perspective of evolution as a directional optimization process exhibiting simple, predictable dynamics. This perspective is adequate for simple scenarios, when frequency-independent selection acts on scalar phenotypes. However, in most organisms many phenotypic properties combine in complicated ways to determine ecological interactions, and hence frequency-dependent selection. Therefore, it is natural to consider models for evolutionary dynamics generated by frequency-dependent selection acting simultaneously on many different phenotypes. Here we show that complicated, chaotic dynamics of long-term evolutionary trajectories in phenotype space is very common in a large class of such models when the dimension of phenotype space is large, and when there are selective interactions between the phenotypic components. Our results suggest that the perspective of evolution as a process with simple, predictable dynamics covers only a small fragment of long-term evolution. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.
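A minimal caricature of the class of models described, not the authors' actual equations, treats the resident phenotype x as moving along a selection gradient whose cross-terms encode frequency-dependent interactions between phenotypic components. With a random interaction matrix B and a stabilizing cubic term, trajectories in high dimension often fail to settle:

    import random

    random.seed(1)
    d = 10                       # phenotype dimension; chaos becomes common as d grows
    B = [[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]

    def gradient(x):
        # Hypothetical selection gradient: frequency-dependent cross-terms via B,
        # plus a stabilizing cubic term for each phenotypic component.
        return [sum(B[i][j] * x[j] for j in range(d)) - x[i] ** 3
                for i in range(d)]

    x = [0.1] * d
    dt = 0.01
    for step in range(20000):    # Euler integration of dx/dt = gradient(x)
        x = [xi + dt * gi for xi, gi in zip(x, gradient(x))]
    print(["%.2f" % xi for xi in x])

Depending on B and the dimension d, such a trajectory may converge to an equilibrium or keep fluctuating; the paper's claim is that the fluctuating, chaotic outcome becomes the common case as d grows and selective interactions between components strengthen.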
NASA Astrophysics Data System (ADS)
Citrin, J.; Bourdelle, C.; Casson, F. J.; Angioni, C.; Bonanomi, N.; Camenen, Y.; Garbet, X.; Garzotti, L.; Görler, T.; Gürcan, O.; Koechl, F.; Imbeaux, F.; Linder, O.; van de Plassche, K.; Strand, P.; Szepesi, G.; JET Contributors
2017-12-01
Quasilinear turbulent transport models are a successful tool for prediction of core tokamak plasma profiles in many regimes. Their success hinges on the reproduction of local nonlinear gyrokinetic fluxes. We focus on significant progress in the quasilinear gyrokinetic transport model QuaLiKiz (Bourdelle et al 2016 Plasma Phys. Control. Fusion 58 014036), which employs an approximated solution of the mode structures to significantly speed up computation time compared to full linear gyrokinetic solvers. Optimisation of the dispersion relation solution algorithm within integrated modelling applications leads to flux calculations 10^6-10^7 times faster than local nonlinear simulations. This allows tractable simulation of flux-driven dynamic profile evolution including all transport channels: ion and electron heat, main particles, impurities, and momentum. Furthermore, QuaLiKiz now includes the impact of rotation and temperature anisotropy induced poloidal asymmetry on heavy impurity transport, important for W-transport applications. Application within the JETTO integrated modelling code results in 1 s of JET plasma simulation within 10 h using 10 CPUs. Simultaneous predictions of core density, temperature, and toroidal rotation profiles for both JET hybrid and baseline experiments are presented, covering both ion and electron turbulence scales. The simulations are successfully compared to measured profiles, with agreement mostly in the 5%-25% range according to standard figures of merit. QuaLiKiz is now open source and available at www.qualikiz.com.
Model-driven approach to data collection and reporting for quality improvement.
Curcin, Vasa; Woodcock, Thomas; Poots, Alan J; Majeed, Azeem; Bell, Derek
2014-12-01
Continuous data collection and analysis have been shown to be essential to achieving improvement in healthcare. However, the data required for local improvement initiatives are often not readily available from hospital Electronic Health Record (EHR) systems, or are not routinely collected. Furthermore, improvement teams are often restricted in time and funding, thus requiring inexpensive and rapid tools to support their work. Hence, the informatics challenge in healthcare local improvement initiatives consists of providing a mechanism for rapid modelling of the local domain by non-informatics experts, including performance metric definitions, grounded in established improvement techniques. We investigate the feasibility of a model-driven software approach to address this challenge, whereby an improvement model designed by a team is used to automatically generate the required electronic data collection instruments and reporting tools. To that end, we have designed a generic Improvement Data Model (IDM) to capture the data items and quality measures relevant to the project, and constructed Web Improvement Support in Healthcare (WISH), a prototype tool that takes user-generated IDM models and creates a data schema, data collection web interfaces, and a set of live reports, based on Statistical Process Control (SPC), for use by improvement teams. The software has been successfully used in over 50 improvement projects, with more than 700 users. We present in detail the experiences of one of those initiatives, the Chronic Obstructive Pulmonary Disease project in Northwest London hospitals. The specific challenges of improvement in healthcare are analysed, and the benefits and limitations of the approach are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
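The core idea, generating collection and reporting artifacts from a declarative improvement model, can be sketched without any of WISH's internals. In the toy version below every field name and model element is invented for illustration; it derives a SQL table definition and a per-period proportion measure (the kind of series an SPC chart would then wrap with control limits) from a single model object:

    # Hypothetical improvement model: raw data items plus one derived measure.
    idm = {
        "project": "copd_review",
        "items": [
            {"name": "week", "type": "TEXT"},
            {"name": "patients_seen", "type": "INTEGER"},
            {"name": "care_bundle_given", "type": "INTEGER"},
        ],
        "measure": ("bundle_rate", "care_bundle_given", "patients_seen"),
    }

    def generate_schema(model):
        """Derive a SQL table definition from the declarative model."""
        cols = ", ".join(f"{i['name']} {i['type']}" for i in model["items"])
        return f"CREATE TABLE {model['project']} ({cols});"

    def measure_series(model, rows):
        """Compute the declared proportion measure per period."""
        _, num, den = model["measure"]
        return [(r["week"], r[num] / r[den]) for r in rows]

    print(generate_schema(idm))
    print(measure_series(idm, [{"week": "w1", "patients_seen": 20,
                                "care_bundle_given": 15}]))   # [('w1', 0.75)]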
Mizas, Ch; Sirakoulis, G Ch; Mardiris, V; Karafyllidis, I; Glykos, N; Sandaltzopoulos, R
2008-04-01
Change of DNA sequence that fuels evolution is, to a certain extent, a deterministic process because mutagenesis does not occur in an absolutely random manner. So far, it has not been possible to decipher the rules that govern DNA sequence evolution due to the extreme complexity of the entire process. In our attempt to approach this issue we focus solely on the mechanisms of mutagenesis and deliberately disregard the role of natural selection. Hence, in this analysis, evolution refers to the accumulation of genetic alterations that originate from mutations and are transmitted through generations without being subjected to natural selection. We have developed a software tool that allows modelling of a DNA sequence as a one-dimensional cellular automaton (CA) with four states per cell which correspond to the four DNA bases, i.e. A, C, T and G. The four states are represented by numbers of the quaternary number system. Moreover, we have developed genetic algorithms (GAs) in order to determine the rules of CA evolution that simulate the DNA evolution process. Linear evolution rules were considered and square matrices were used to represent them. If DNA sequences of different evolution steps are available, our approach allows the determination of the underlying evolution rule(s). Conversely, once the evolution rules are deciphered, our tool may reconstruct the DNA sequence in any previous evolution step for which the exact sequence information was unknown. The developed tool may be used to test various parameters that could influence evolution. We describe a paradigm relying on the assumption that mutagenesis is governed by a near-neighbour-dependent mechanism. Based on the satisfactory performance of our system in the deliberately simplified example, we propose that our approach could offer a starting point for future attempts to understand the mechanisms that govern evolution. The developed software is open-source and has a user-friendly graphical input interface.
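The encoding is concrete enough to restate in code: bases map to quaternary digits (A=0, C=1, T=2, G=3), and a linear near-neighbour rule updates each cell modulo 4. A minimal sketch with arbitrarily chosen rule coefficients:

    BASES = "ACTG"   # A=0, C=1, T=2, G=3: the four CA states as quaternary digits

    def step(seq, coeffs=(1, 1, 1)):
        """One step of a linear near-neighbour rule: each new base is a mod-4
        linear combination of (left neighbour, cell, right neighbour), with
        periodic boundaries. The coefficients are arbitrary examples."""
        s = [BASES.index(b) for b in seq]
        n = len(s)
        cl, cc, cr = coeffs
        return "".join(BASES[(cl * s[i - 1] + cc * s[i] + cr * s[(i + 1) % n]) % 4]
                       for i in range(n))

    print(step("ACTGACTG"))   # -> AGTCAGTC

Because the rule is linear, one full update is equivalent to multiplying the sequence vector by a banded square matrix over Z_4; searching the entries of such matrices is what the genetic algorithms do, and inverting the matrix (when it is invertible mod 4) is what allows earlier sequence states to be reconstructed.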
Time series analysis for minority game simulations of financial markets
NASA Astrophysics Data System (ADS)
Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy
2003-04-01
The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model, employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbations. When the interval between two successive perturbations is sufficiently large, one can find low-dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the S&P 500 index: stochastic, nonlinear and (unit root) stationary.
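The canonical MG is compact enough to restate as code. In the standard setup sketched below (parameter values arbitrary), an odd number of agents each hold S random strategies mapping the last M outcomes to an action, play their best-scoring strategy, and the minority side wins; the attendance series A(t) is the price-like signal such time series analyses examine:

    import random

    random.seed(0)
    N, M, S, T = 101, 3, 2, 2000   # agents (odd), memory, strategies/agent, rounds
    # A strategy maps each of the 2**M possible histories to an action in {-1, +1}.
    agents = [[{h: random.choice((-1, 1)) for h in range(2 ** M)}
               for _ in range(S)] for _ in range(N)]
    scores = [[0] * S for _ in range(N)]
    history = tuple(random.randint(0, 1) for _ in range(M))
    attendance = []

    for t in range(T):
        k = int("".join(map(str, history)), 2)        # encode history as an integer
        acts = [agent[max(range(S), key=lambda s: scores[i][s])][k]
                for i, agent in enumerate(agents)]    # each plays its best strategy
        A = sum(acts)                                 # aggregate attendance
        attendance.append(A)
        minority = -1 if A > 0 else 1                 # minority wins (N odd => A != 0)
        for i in range(N):                            # update virtual strategy scores
            for s in range(S):
                if agents[i][s][k] == minority:
                    scores[i][s] += 1
        history = history[1:] + (1 if minority > 0 else 0,)

    print(sum(a * a for a in attendance) / T)         # volatility of the signal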
Molecular and Kinetic Models for High-rate Thermal Degradation of Polyethylene
Lane, J. Matthew; Moore, Nathan W.
2018-02-01
Thermal degradation of polyethylene is studied under the extremely high rate temperature ramps expected in laser-driven and X-ray ablation experiments, from 10^10 to 10^14 K/s in isochoric, condensed phases. The molecular evolution and macroscopic state variables are extracted as a function of density from reactive molecular dynamics simulations using the ReaxFF potential. The enthalpy, dissociation onset temperature, bond evolution, and observed cross-linking are shown to be rate dependent. These results are used to parametrize a kinetic rate model for the decomposition and coalescence of hydrocarbons as a function of temperature, temperature ramp rate, and density. In conclusion, the results are contrasted to first-order random-scission macrokinetic models often assumed for pyrolysis of linear polyethylene under ambient conditions.
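For contrast, the ambient-condition baseline mentioned in the conclusion, first-order random scission, is usually written as an Arrhenius rate acting on the undegraded fraction; a generic form (not the authors' fitted parametrization) is

\[ \frac{d\alpha}{dt} = A\, e^{-E_a/(R\,T(t))}\,(1-\alpha), \]

where α is the degraded mass fraction, A the pre-exponential factor, E_a the activation energy, and T(t) the imposed temperature ramp. The paper's point is that at ramp rates of 10^10 to 10^14 K/s this single-step description no longer captures the observed rate-dependent chemistry.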
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
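The chunked-interpolation idea, splitting a domain-wide operation into spatial subsets so that memory never holds the full high-resolution grid, is independent of the ESMF/OCGIS specifics. A schematic sketch (the regridder below is a stand-in block average, not ESMF's conservative scheme):

    import numpy as np

    def interpolate_chunked(src_field, n_chunks, regrid_one):
        """Regrid each spatial chunk independently and stitch the results,
        so peak memory scales with the chunk size, not the full grid."""
        chunks = np.array_split(src_field, n_chunks, axis=0)
        return np.concatenate([regrid_one(c) for c in chunks], axis=0)

    def block_average(chunk):
        # Stand-in regridder: 2x2 block average (assumes even dimensions).
        r, c = chunk.shape
        return chunk.reshape(r // 2, 2, c // 2, 2).mean(axis=(1, 3))

    field = np.random.rand(1000, 800)            # pretend high-resolution grid
    coarse = interpolate_chunked(field, 10, block_average)
    print(coarse.shape)                          # (500, 400)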
Efficient common-envelope ejection through dust-driven winds
NASA Astrophysics Data System (ADS)
Glanz, Hila; Perets, Hagai B.
2018-04-01
Common-envelope evolution (CEE) is the short-lived phase in the life of an interacting binary system during which two stars orbit inside a single shared envelope. Such evolution is thought to lead to the inspiral of the binary, the ejection of the extended envelope and the formation of a remnant short-period binary. However, detailed hydrodynamical models of CEE encounter major difficulties. They show that following the inspiral most of the envelope is not ejected; though it expands to larger separations, it remains bound to the binary. Here we propose that dust-driven winds can be produced following the CEE. These can evaporate the envelope through processes similar to those operating in the ejection of the envelopes of AGB stars. Pulsations in an AGB star drive the expansion of its envelope, allowing the material to cool down to low temperatures and thus enabling dust condensation. Radiation pressure on the dust accelerates it, and through its coupling to the gas it drives winds which eventually completely erode the envelope. We show that the inspiral phase in CE binaries can effectively replace the role of stellar pulsation, drive the CE expansion to scales comparable with those of AGB stars, and give rise to efficient mass loss through dust-driven winds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santillán, David; Juanes, Ruben; Cueto-Felgueroso, Luis
2017-04-20
Propagation of fluid-driven fractures plays an important role in natural and engineering processes, including transport of magma in the lithosphere, geologic sequestration of carbon dioxide, and oil and gas recovery from low-permeability formations, among many others. The simulation of fracture propagation poses a computational challenge as a result of the complex physics of fracture and the need to capture disparate length scales. Phase field models represent fractures as a diffuse interface and enjoy the advantage that fracture nucleation, propagation, branching, or twisting can be simulated without ad hoc computational strategies like remeshing or local enrichment of the solution space. Here we propose a new quasi-static phase field formulation for modeling fluid-driven fracturing in elastic media at small strains. The approach fully couples the fluid flow in the fracture (described via the Reynolds lubrication approximation) and the deformation of the surrounding medium. The flow is solved on a lower dimensionality mesh immersed in the elastic medium. This approach leads to accurate coupling of both physics. We assessed the performance of the model extensively by comparing results for the evolution of fracture length, aperture, and fracture fluid pressure against analytical solutions under different fracture propagation regimes. The excellent performance of the numerical model in all regimes builds confidence in the applicability of phase field approaches to simulate fluid-driven fracture.
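The Reynolds lubrication approximation referenced here reduces the fracture flow to an equation for the aperture w(x, t); in its standard form (fluid viscosity μ, pressure p, injection source Q, leak-off neglected) it reads

\[ \frac{\partial w}{\partial t} = \nabla \cdot \left( \frac{w^{3}}{12\,\mu}\,\nabla p \right) + Q, \]

and the cubic dependence of the flux on the aperture is what couples the flow so strongly to the elastic deformation of the surrounding medium. (This is the textbook form; the paper's exact formulation, including the phase-field coupling, may differ in detail.)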
Opinion evolution in different social acquaintance networks
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhang, Xiao; Wu, Zhan; Wang, Hongwei; Wang, Guohua; Li, Wei
2017-11-01
Social acquaintance networks influenced by social culture and social policy have a great impact on public opinion evolution in daily life. Based on the differences between socio-culture and social policy, three different social acquaintance networks (kinship-priority acquaintance network, independence-priority acquaintance network, and hybrid acquaintance network) incorporating heredity proportion p_h and variation proportion p_v are proposed in this paper. Numerical experiments are conducted to investigate network topology and different phenomena during opinion evolution, using the Deffuant model. We found that in kinship-priority acquaintance networks, similar to the Chinese traditional acquaintance networks, opinions always achieve fragmentation, resulting in the formation of multiple large clusters and many small clusters, due to the fact that individuals believe more in their relatives and live in a relatively closed environment. In independence-priority acquaintance networks, similar to Western acquaintance networks, the results are similar to those in the kinship-priority acquaintance network. In hybrid acquaintance networks, similar to the Chinese modern acquaintance networks, only a few clusters are formed, indicating that in modern China opinions are more likely to reach consensus on a large scale. These results resemble opinion evolution phenomena in modern society, supporting the rationality and applicability of network models combined with social culture and policy. We also found a threshold curve p_v + 2p_h = 2.05 in the results for the final opinion clusters and evolution time. Above the threshold curve, opinions could easily reach consensus. Based on the above experimental results, we propose a culture-policy-driven mechanism for opinion dynamics: opinion dynamics can be driven by different social cultures and policies through the influence of heredity and variation in interpersonal relationship networks. This finding is of great significance for predicting opinion evolution under different acquaintance networks and formulating reasonable policies based on cultural characteristics to guide public opinion.
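Since the experiments use the Deffuant model, the update rule is worth stating: two interacting agents compromise only when their opinions already lie within a confidence bound. The sketch below uses random mixing in place of the paper's acquaintance-network pairing:

    import random

    random.seed(42)

    def deffuant(n=500, eps=0.2, mu=0.5, steps=200000):
        """Bounded-confidence dynamics: two randomly paired agents move
        toward each other only if their opinions differ by less than eps."""
        x = [random.random() for _ in range(n)]
        for _ in range(steps):
            i, j = random.randrange(n), random.randrange(n)
            if i != j and abs(x[i] - x[j]) < eps:
                d = mu * (x[j] - x[i])
                x[i], x[j] = x[i] + d, x[j] - d
        return x

    ops = sorted(deffuant())
    # Opinion groups end up separated by gaps wider than the confidence bound.
    clusters = 1 + sum(1 for a, b in zip(ops, ops[1:]) if b - a > 0.2)
    print(clusters)   # roughly 1/(2*eps) major clusters expected under mixing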
Architectures for Distributed and Complex M-Learning Systems: Applying Intelligent Technologies
ERIC Educational Resources Information Center
Caballe, Santi, Ed.; Xhafa, Fatos, Ed.; Daradoumis, Thanasis, Ed.; Juan, Angel A., Ed.
2009-01-01
Over the last decade, the needs of educational organizations have been changing in accordance with increasingly complex pedagogical models and with the technological evolution of e-learning environments with very dynamic teaching and learning requirements. This book explores state-of-the-art software architectures and platforms used to support…
Strategy evolution driven by switching probabilities in structured multi-agent systems
NASA Astrophysics Data System (ADS)
Zhang, Jianlei; Chen, Zengqiang; Li, Zhiqi
2017-10-01
The evolutionary mechanism driving the commonly seen cooperation among unrelated individuals is puzzling. Related models for evolutionary games on graphs traditionally assume that players imitate their successful neighbours with higher benefits. Notably, an implicit assumption here is that players are always able to acquire the required pay-off information. To relax this restrictive assumption, a contact-based model has been proposed, where switching probabilities between strategies drive the strategy evolution. However, the explicit, quantified relation between a player's switching probability for her strategies and the number of her neighbours remains unknown. This is especially a key point in heterogeneously structured systems, where players may differ in the numbers of their neighbours. Focusing on this, we present an augmented model that introduces an attenuation coefficient, and we evaluate its influence on the evolution dynamics. Results show that the individual influence on others is negatively correlated with the contact numbers specified by the network topologies. Results further provide the conditions under which the coexisting strategies can be calculated analytically.
Morphological Evolution of Pit-Patterned Si(001) Substrates Driven by Surface-Energy Reduction
NASA Astrophysics Data System (ADS)
Salvalaglio, Marco; Backofen, Rainer; Voigt, Axel; Montalenti, Francesco
2017-09-01
Lateral ordering of heteroepitaxial islands can be conveniently achieved by suitable pit-patterning of the substrate prior to deposition. Controlling shape, orientation, and size of the pits is not trivial as, being metastable, they can significantly evolve during deposition/annealing. In this paper, we exploit a continuum model to explore the typical metastable pit morphologies that can be expected on Si(001), depending on the initial depth/shape. Evolution is predicted using a surface-diffusion model, formulated in a phase-field framework, and tackling surface-energy anisotropy. Results are shown to nicely reproduce typical metastable shapes reported in the literature. Moreover, long time scale evolutions of pit profiles with different depths are found to follow a similar kinetic pathway. The model is also exploited to treat the case of heteroepitaxial growth involving two materials characterized by different facets in their equilibrium Wulff's shape. This can lead to significant changes in morphologies, such as a rotation of the pit during deposition as evidenced in Ge/Si experiments.
Linear dynamical modes as new variables for data-driven ENSO forecast
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen
2018-05-01
A new data-driven model for analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data, which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by revealing the system's dominant time scales. The LDMs are used as new variables for empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt, where the El Niño Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs over the traditionally used empirical orthogonal function decomposition is demonstrated for these data. Specifically, it is shown that the new model has a competitive ENSO forecast skill in comparison with the other existing ENSO models.
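The traditional baseline that the LDMs are compared against, empirical orthogonal function (EOF) decomposition, is principal component analysis of the space-time field. A compact sketch of that baseline, with synthetic data standing in for the SST anomaly field:

    import numpy as np

    rng = np.random.default_rng(0)
    T, S = 600, 300                               # time steps x spatial points
    field = rng.standard_normal((T, S)) @ rng.standard_normal((S, S))  # synthetic

    anom = field - field.mean(axis=0)             # anomalies: remove time-mean state
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    eofs = Vt                                     # rows: spatial patterns (EOFs)
    pcs = U * s                                   # columns: PC time series
    explained = s**2 / np.sum(s**2)
    print(explained[:3])                          # variance fraction of leading modes

The LDM approach differs precisely in that its modes are chosen to capture the dominant time scales of the dynamics rather than the directions of maximal variance.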
A Menu-Driven Interface to Unix-Based Resources
Evans, Elizabeth A.
1989-01-01
Unix has often been overlooked in the past as a viable operating system for anyone other than computer scientists. Its terseness, the non-mnemonic nature of its commands, and the lack of user-friendly software to run under it are but a few of the user-related reasons that have been cited. It is, nevertheless, the operating system of choice in many cases. This paper describes a menu-driven interface to Unix which provides more user-friendly access to the software resources available on the computers running under Unix.
NASA Astrophysics Data System (ADS)
Oliveira, Micael
The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides allowing the community to share the burden of developing and maintaining complex pieces of software, these libraries can also become a target for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, Alfred J.; McMillan, Stephen L. W.; Vesperini, Enrico
2013-12-01
We perform a series of simulations of evolving star clusters using the Astrophysical Multipurpose Software Environment (AMUSE), a new community-based multi-physics simulation package, and compare our results to existing work. These simulations model a star cluster beginning with a King model distribution and a selection of power-law initial mass functions, and contain a tidal cutoff. They are evolved using collisional stellar dynamics and include mass loss due to stellar evolution. After establishing that the differences between AMUSE results and results from previous studies are understood, we explored the variation in cluster lifetimes due to the random realization noise introduced by transforming a King model to specific initial conditions. This random realization noise can affect the lifetime of a simulated star cluster by up to 30%. Two modes of star cluster dissolution were identified: a mass evolution curve that contains a runaway cluster dissolution with a sudden loss of mass, and a dissolution mode that does not contain this feature. We refer to these dissolution modes as 'dynamical' and 'relaxation' dominated, respectively. For Salpeter-like initial mass functions, we determined the boundary between these two modes in terms of the dynamical and relaxation timescales.
Dhole, Sumit; Stern, Caitlin A; Servedio, Maria R
2018-04-01
The evolution of mating displays as indicators of male quality has been the subject of extensive theoretical and empirical research for over four decades. Research has also addressed the evolution of female mate choice favoring such indicators. Yet, much debate still exists about whether displays can evolve through the indirect benefits of female mate choice. Here, we use a population genetic model to investigate how the extent to which females can directly detect male quality influences the evolution of female choosiness and male displays. We use a continuum framework that incorporates indicator mechanisms that are traditionally modeled separately. Counter to intuition, we find that intermediate levels of direct detection of male quality can facilitate, rather than impede, the evolution of female choosiness and male displays in broad regions of this continuum. We examine how this evolution is driven by selective forces on genetic quality and on the display, and find that direct detection of male quality results in stronger indirect selection favoring female choosiness. Our results imply that displays may be more likely to evolve when female choosiness has already evolved to discriminate perceptible forms of male quality. They also highlight the importance of considering general female choosiness, as well as preference, in studies of "good genes." © 2018 The Author(s). Evolution © 2018 The Society for the Study of Evolution.
Calibration and analysis of genome-based models for microbial ecology.
Louca, Stilianos; Doebeli, Michael
2015-10-16
Microbial ecosystem modeling is complicated by the large number of unknown parameters and the lack of appropriate calibration tools. Here we present a novel computational framework for modeling microbial ecosystems, which combines genome-based model construction with statistical analysis and calibration to experimental data. Using this framework, we examined the dynamics of a community of Escherichia coli strains that emerged in laboratory evolution experiments, during which an ancestral strain diversified into two coexisting ecotypes. We constructed a microbial community model comprising the ancestral and the evolved strains, which we calibrated using separate monoculture experiments. Simulations reproduced the successional dynamics in the evolution experiments, and pathway activation patterns observed in microarray transcript profiles. Our approach yielded detailed insights into the metabolic processes that drove bacterial diversification, involving acetate cross-feeding and competition for organic carbon and oxygen. Our framework provides a missing link towards a data-driven mechanistic microbial ecology.
Ariane 5-ALF: Evolution of the Ariane 5 Data Handling System
NASA Astrophysics Data System (ADS)
Notebaert, O.; Stransky, Arnaud; Corin, Hans; Hult, Torbjorn; Bonnerot, Georges-Albert
2004-06-01
In the coming years, the Ariane 5 On-Board Computer (OBC) will have to handle mission and performance enhancements, together with the need to significantly reduce costs and to replace obsolescent components. The OBC evolution is naturally driven by these factors, but also needs to consider software system compliance. Indeed, it would be a major concern if the necessary change of the underlying HW were to imply new development of the flight software, mission database and ground control system. The Ariane 5 SW uses the Ada language, which enables verifiable definition of the interfaces and provides a standardized level of real-time behavior. To enforce portability, it has a layered architecture that clearly separates application SW and data from the lower-level software. In addition, the on-board mission data is managed through the extraction of an image of the systems database located in a structured memory area (the exchange memory). Used for all interchanges between the system application software and the launcher's subsystems and peripherals, the exchange memory is the virtual view of the Ariane 5 system from the flight SW standpoint. Thanks to these early architectural and structural choices, portability on future hardware is theoretically guaranteed whenever the exchange memory data structures and the service layer interfaces remain stable. The ALF working group has defined and manufactured a mock-up that fulfils these architectural constraints with a completely new on-board computer, featuring improvements such as a microprocessor replacement as well as an advanced integrated I/O controller for access to the system data bus. Lower-level SW has been prototyped on this new hardware in order to provide the same level of services as the current one while completely hiding the underlying HW/SW implementation from the rest of the system. Functional and performance evaluation of this platform, consolidated at system level, will show the potential benefits and the limits of such an approach.
Group invariant solution for a pre-existing fracture driven by a power-law fluid in permeable rock
NASA Astrophysics Data System (ADS)
Fareo, A. G.; Mason, D. P.
2016-06-01
Group invariant analytical and numerical solutions for the evolution of a two-dimensional fracture with nonzero initial length in permeable rock and driven by an incompressible non-Newtonian fluid of power-law rheology are obtained. The effect of fluid leak-off on the evolution of the power-law fluid fracture is investigated.
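For reference, a power-law (Ostwald-de Waele) fluid relates shear stress to shear rate as

\[ \tau = K\,\dot{\gamma}^{\,n}, \]

with consistency index K and flow-behaviour index n (n = 1 recovers a Newtonian fluid, n < 1 is shear-thinning); it is this exponent that enters the similarity scalings underlying the group invariant solutions.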
Zueva, Ksenia J; Lumme, Jaakko; Veselov, Alexey E; Kent, Matthew P; Lien, Sigbjørn; Primmer, Craig R
2014-01-01
Mechanisms of host-parasite co-adaptation have long been of interest in evolutionary biology; however, determining the genetic basis of parasite resistance has been challenging. Current advances in genome technologies provide new opportunities for obtaining a genome-scale view of the action of parasite-driven natural selection in wild populations and thus facilitate the search for specific genomic regions underlying inter-population differences in pathogen response. European populations of Atlantic salmon (Salmo salar L.) exhibit natural variance in susceptibility levels to the ectoparasite Gyrodactylus salaris Malmberg 1957, ranging from resistance to extreme susceptibility, and are therefore a good model for studying the evolution of virulence and resistance. However, distinguishing the molecular signatures of genetic drift and environment-associated selection in small populations such as land-locked Atlantic salmon populations presents a challenge, specifically in the search for pathogen-driven selection. We used a novel genome-scan analysis approach that enabled us to i) identify signals of selection in salmon populations affected by varying levels of genetic drift and ii) separate potentially selected loci into the categories of pathogen (G. salaris)-driven selection and selection acting upon other environmental characteristics. A total of 4631 single nucleotide polymorphisms (SNPs) were screened in Atlantic salmon from 12 different northern European populations. We identified three genomic regions potentially affected by parasite-driven selection, as well as three regions presumably affected by salinity-driven directional selection. Functional annotation of candidate SNPs is consistent with the role of the detected genomic regions in immune defence and, implicitly, in osmoregulation. These results provide new insights into the genetic basis of pathogen susceptibility in Atlantic salmon and will enable future searches for the specific genes involved.
NASA Astrophysics Data System (ADS)
Yetemen, O.; Saco, P. M.
2016-12-01
Orography-induced precipitation and its implications for vegetation dynamics and landscape morphology have long been documented in the literature. However, a numerical framework that integrates a range of ecohydrologic and geomorphic processes to explore the coupled ecohydro-geomorphic response of catchments where pronounced orographic precipitation prevails has been missing. In this study, our aim is to realistically represent orographic-precipitation-driven ecohydrologic dynamics in a landscape evolution model (LEM). The model is used to investigate how ecohydro-geomorphic differences caused by differential precipitation patterns on the leeward and windward sides of low-relief landscapes lead to differences in the organization of modelled topography, soil moisture and plant biomass. We use the CHILD LEM equipped with a vegetation dynamics component that explicitly tracks above- and below-ground biomass, and a precipitation forcing component that simulates rainfall as a function of elevation and orientation. The preliminary results of the model show how the competition between increased shear stress through runoff production and an enhanced resistance force due to denser canopy cover shapes the landscape. Moreover, orographic precipitation leads not only to the migration of the divide between leeward and windward slopes but also to a change in the concavity of streams. These results clearly demonstrate the strong coupling between landform evolution and climate processes.
Mission Services Evolution Center Message Bus
NASA Technical Reports Server (NTRS)
Mayorga, Arturo; Bristow, John O.; Butschky, Mike
2011-01-01
The Goddard Mission Services Evolution Center (GMSEC) Message Bus is a robust, lightweight, fault-tolerant middleware implementation that supports all messaging capabilities of the GMSEC API. This architecture is a distributed software system that routes messages based on message subject names and knowledge of the locations in the network of the interested software components.
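Subject-based routing of this kind is easy to sketch. The toy below is a generic illustration with invented subject names, not the GMSEC API itself:

    import fnmatch

    class ToyMessageBus:
        """Toy subject-based router: components subscribe with subject
        patterns and receive every message whose subject matches."""
        def __init__(self):
            self.subs = []                       # list of (pattern, callback)

        def subscribe(self, pattern, callback):
            self.subs.append((pattern, callback))

        def publish(self, subject, payload):
            for pattern, callback in self.subs:
                if fnmatch.fnmatch(subject, pattern):
                    callback(subject, payload)

    bus = ToyMessageBus()
    bus.subscribe("GMSEC.MISSION.SAT1.TLM.*",
                  lambda s, p: print("archiver received", s, p))
    bus.publish("GMSEC.MISSION.SAT1.TLM.TEMP", {"value": 21.4})

Routing on hierarchical subject names, rather than on component locations, is what lets such a bus stay fault-tolerant: any component can be restarted elsewhere as long as it re-subscribes to the same subjects.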
An Event-driven, Value-based, Pull Systems Engineering Scheduling Approach
2012-03-01
Systems engineering in rapid response environments has been difficult, particularly in settings where large, complex brownfield systems or systems of systems exist and are constantly being updated with both short- and long-term software enhancements.
The Use of Model-Driven Methodologies and Processes in Aegis Development
2011-05-17
Distribution Statement A: Approved for public release; distribution unlimited. Presented at the 23rd Systems and Software…
The evolution of sex chromosomes in organisms with separate haploid sexes.
Immler, Simone; Otto, Sarah Perin
2015-03-01
The evolution of dimorphic sex chromosomes is driven largely by the evolution of reduced recombination and the subsequent accumulation of deleterious mutations. Although these processes are increasingly well understood in diploid organisms, the evolution of dimorphic sex chromosomes in haploid organisms (U/V) has been virtually unstudied theoretically. We analyze a model to investigate the evolution of linkage between fitness loci and the sex-determining region in U/V species. In a second step, we test how prone nonrecombining regions are to degeneration due to accumulation of deleterious mutations. Our modeling predicts that the decay of recombination on the sex chromosomes and the addition of strata via fusions will be just as much a part of the evolution of haploid sex chromosomes as in diploid sex chromosome systems. Reduced recombination is broadly favored, as long as there is some fitness difference between haploid males and females. The degeneration of the sex-determining region due to the accumulation of deleterious mutations is expected to be slower in haploid organisms because of the absence of masking. Nevertheless, balancing selection often drives greater differentiation between the U/V sex chromosomes than in X/Y and Z/W systems. We summarize empirical evidence for haploid sex chromosome evolution and discuss our predictions in light of these findings. © 2015 The Author(s).
LAPSUS: soil erosion - landscape evolution model
NASA Astrophysics Data System (ADS)
van Gorp, Wouter; Temme, Arnaud; Schoorl, Jeroen
2015-04-01
LAPSUS is a soil erosion and landscape evolution model capable of simulating the evolution of a gridded DEM through multiple water-, mass-movement- and human-driven processes on multiple temporal and spatial scales. It is able to deal with a variety of human landscape interventions, such as land-use management and tillage, and it can model their interactions with natural processes. The complex, spatially explicit feedbacks the model simulates demonstrate the importance of the spatial interaction of human activity and erosion-deposition patterns. In addition, LAPSUS can model shallow landsliding, slope collapse, creep, solifluction, biological and frost weathering, and fluvial behaviour. Furthermore, an algorithm to deal with natural depressions has been added, and event-based modelling with an improved infiltration description and dust deposition has been pursued. LAPSUS has been used for case studies in many parts of the world and is continuously developing and expanding; it is now available for third-party and educational use. It has a comprehensive user interface and is accompanied by a manual and exercises. The LAPSUS model is highly suitable for quantifying and understanding catchment-scale erosion processes. More information and a download link are available at www.lapsusmodel.nl.
NASA Technical Reports Server (NTRS)
1979-01-01
Program elements of the power module (PM) system are identified, structured, and defined according to the planned work breakdown structure. Efforts required to design, develop, manufacture, test, check out, launch and operate a protoflight assembled 25 kW, 50 kW and 100 kW PM include the preparation and delivery of related software, government-furnished equipment, space support equipment, ground support equipment, launch site verification software, orbital verification software, and all related data items.
Numerical simulation of plasma processes driven by transverse ion heating
NASA Technical Reports Server (NTRS)
Singh, Nagendra; Chan, C. B.
1993-01-01
The plasma processes driven by transverse ion heating in a diverging flux tube are investigated with numerical simulation. The heating is found to drive a host of plasma processes, in addition to the well-known phenomenon of ion conics. The downward electric field near the reverse shock generates a double-streaming situation consisting of two upflowing ion populations with different average flow velocities. The electric field in the reverse shock region is modulated by the ion-ion instability driven by the multistreaming ions. The oscillating fields in this region have the possibility of heating electrons. These results from the simulations are compared with results from a previous study based on a hydrodynamical model. Effects of the spatial resolution provided by the simulations on the evolution of the plasma are discussed.
Comet Gas and Dust Dynamics Modeling
NASA Technical Reports Server (NTRS)
Von Allmen, Paul A.; Lee, Seungwon
2010-01-01
This software models the gas and dust dynamics of the comet coma (the head region of a comet) in order to support the Microwave Instrument for Rosetta Orbiter (MIRO) project. MIRO will study the evolution of comet 67P/Churyumov-Gerasimenko's coma system. The instrument will measure surface temperature, gas production rates and relative abundances, and velocity and excitation temperatures of each species, along with their spatial and temporal variability. This software will use these measurements to improve the understanding of coma dynamics. The modeling tool solves the equation of motion of a dust particle, the energy balance equation of the dust particle, the continuity equation for the dust and gas flow, and the dust and gas mixture energy equation. By solving these equations numerically, the software calculates the temperature and velocity of gas and dust as a function of time for a given initial gas and dust production rate and a dust characteristic parameter that measures the ability of a dust particle to adjust its velocity to the local gas velocity. The software is written in a modular manner, thereby allowing the addition of more dynamics equations as needed. All of the numerical algorithms are implemented in-house and no third-party libraries are used.
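The heart of the dust equation of motion is velocity relaxation toward the local gas flow. A schematic integration (the coupling constant beta below is an invented stand-in for the paper's dust characteristic parameter, and the gas velocity profile is a toy):

    def integrate_dust(v_gas_of_r, r0, v0, beta, dt=1.0, steps=5000):
        """Schematic dust dynamics: the dust velocity relaxes toward the
        local gas velocity at a rate set by the coupling parameter beta."""
        r, v = r0, v0
        for _ in range(steps):
            v += beta * (v_gas_of_r(r) - v) * dt   # drag toward local gas flow
            r += v * dt                            # advect the dust grain
        return r, v

    # Toy gas profile: speeds up from ~0 at the surface toward 800 m/s far out.
    gas = lambda r: 800.0 * (1.0 - 2000.0 / max(r, 2000.0))
    print(integrate_dust(gas, r0=2000.0, v0=1.0, beta=0.01))

A small beta (poor coupling, e.g. large grains) leaves the dust lagging well below the gas speed, which is exactly the behaviour the dust characteristic parameter is meant to quantify.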
NASA Astrophysics Data System (ADS)
Gallet, Florian; Bolmont, Emeline; Mathis, Stéphane; Charbonnel, Corinne; Amard, Louis; Alibert, Yann
2017-10-01
Close-in planets represent a large fraction of the population of confirmed exoplanets. To understand the dynamical evolution of these planets, star-planet interactions must be taken into account. In particular, the dependence of the tidal interactions on the structural parameters of the star, its rotation, and its metallicity should be treated in the models. We quantify how the tidal dissipation in the convective envelope of rotating low-mass stars evolves in time. We also investigate the possible consequences of this evolution on planetary orbital evolution. In Gallet et al. (2017) and Bolmont et al. (2017) we generalized the work of Bolmont & Mathis (2016) by following the orbital evolution of close-in planets using the new tidal dissipation predictions for advanced phases of stellar evolution and non-solar metallicity. We find that during the pre-main sequence the evolution of tidal dissipation is controlled by the evolution of the internal structure of the star through the stellar contraction. On the main sequence, tidal dissipation is strongly driven by the evolution of the surface rotation, which is affected by magnetized stellar-wind braking. Finally, during the more evolved phases, the tidal dissipation sharply decreases as the radiative core retreats in mass and radius towards the red-giant branch. Using an orbital evolution model, we also show that changing the metallicity leads to different orbital evolutions (e.g., planets migrate farther out from an initially fast-rotating metal-rich star). By using this model, we qualitatively reproduced the observational trends of the population of hot Jupiters with the metallicity of their host stars. However, more work remains to be done to quantitatively fit our results to the observations.
NASA Astrophysics Data System (ADS)
Duan, Huaiyu; Fuller, George M.; Carlson, J.; Qian, Yong-Zhong
2006-11-01
We present results of large-scale numerical simulations of the evolution of neutrino and antineutrino flavors in the region above the late-time post-supernova-explosion proto-neutron star. Our calculations are the first to allow explicit flavor evolution histories on different neutrino trajectories and to self-consistently couple flavor development on these trajectories through forward-scattering-induced quantum coupling. Employing the atmospheric-scale neutrino mass-squared difference (|δm^2| ≃ 3×10^-3 eV^2) and values of θ_13 allowed by current bounds, we find transformation of neutrino and antineutrino flavors over broad ranges of energy and luminosity in roughly the "bi-polar" collective mode. We find that this large-scale flavor conversion, largely driven by the flavor off-diagonal neutrino-neutrino forward scattering potential, sets in much closer to the proto-neutron star than simple estimates based on flavor-diagonal potentials and Mikheyev-Smirnov-Wolfenstein evolution would indicate. In turn, this suggests that models of r-process nucleosynthesis sited in the neutrino-driven wind could be affected substantially by active-active neutrino flavor mixing, even with the small measured neutrino mass-squared differences.
N-body simulations of collective effects in spiral and barred galaxies
NASA Astrophysics Data System (ADS)
Zhang, X.
2016-10-01
We present gravitational N-body simulations of the secular morphological evolution of disk galaxies induced by density wave modes. In particular, we address the demands that collective effects place on the choice of simulation parameters. We show that the common practice of using a large gravity softening parameter was responsible for the failure of past simulations to correctly model the secular evolution process in galaxies, even when the choice of basic state allows an unstable mode to emerge, a prerequisite for obtaining the coordinated radial mass flow pattern needed for secular evolution of galaxies along the Hubble sequence. We also demonstrate that the secular evolution rates measured in our improved simulations agree to an impressive degree with the corresponding rates predicted by the recently advanced theories of dynamically driven secular evolution of galaxies. Besides having direct implications for the cosmological evolution of galaxies, the results of the current work also shed light on the general question of how irreversibility emerges from a nominally reversible physical system.
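For context, the softening parameter criticized above enters the pairwise force law. Below is a minimal sketch of Plummer-softened accelerations (G = 1, direct summation); this is a generic illustration, not the paper's code.

```python
import numpy as np

def softened_accel(pos, mass, eps):
    # Plummer-softened pairwise gravitational accelerations (G = 1).
    # eps is the gravity softening length discussed above; making it too
    # large smooths away the short-wavelength collective response.
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                      # vectors to all other bodies
        r2 = (dr ** 2).sum(axis=1) + eps ** 2  # softened squared distance
        r2[i] = np.inf                         # exclude self-interaction
        acc[i] = (mass[:, None] * dr / r2[:, None] ** 1.5).sum(axis=0)
    return acc

# Example: 1000 random bodies with two softening choices for comparison.
rng = np.random.default_rng(0)
pos = rng.normal(size=(1000, 3))
mass = np.full(1000, 1.0 / 1000)
a_small = softened_accel(pos, mass, eps=0.01)
a_large = softened_accel(pos, mass, eps=0.5)   # heavily smoothed forces
```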
3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models
NASA Astrophysics Data System (ADS)
Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.
2013-07-01
Cultural heritage managers in general, and information users in particular, are not usually accustomed to dealing with high-technology hardware and software. Information providers of metric surveys, on the contrary, are most of the time applying the latest developments in real-life conservation and restoration projects. This paper addresses the software issue of handling and managing 3D point clouds and (photorealistic) 3D models, bridging the gap between information users and information providers as regards the management of the information they share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to handle, manage and easily create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of real documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis is placed on highlighting the features of the new user-friendly software for managing virtual projects. Furthermore, the ease of creating controlled interactive animations (both walk-through and fly-through) by the user, either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve, others facilitate ambiguity and uncertainty by…
An MDA Based Ontology Platform: AIR
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
In the past few years, software engineering has witnessed two major shifts: model-driven engineering has entered the mainstream, and some leading development tools have become open and extensible. AI has always been a spring of new ideas that have been adopted in software engineering, but most of its gems have stayed buried in laboratories, available only to a limited number of AI practitioners. Should AI tools be integrated into mainstream tools, and could it be done? We think that it is feasible, and that both communities can benefit from this integration. In fact, some efforts in this direction have already been made, both by major industrial standardization bodies such as the OMG, and by academic laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierra Thermal /Fluid Team
The SIERRA Low Mach Module: Fuego, along with the SIERRA Participating Media Radiation Module: Syrinx (henceforth referred to as Fuego and Syrinx, respectively), are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.
Corredor, Iván; Bernardos, Ana M.; Iglesias, Josué; Casar, José R.
2012-01-01
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices, with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym. PMID:23012544
A Monte Carlo model for 3D grain evolution during welding
NASA Astrophysics Data System (ADS)
Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena
2017-09-01
Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
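As one concrete piece of this, a weld-pool boundary specified by a Bézier curve can be evaluated with de Casteljau's algorithm. A short sketch follows; the control points are illustrative, not taken from the SPPARKS weld model.

```python
import numpy as np

def bezier(control_pts, t):
    # Evaluate a Bezier curve at parameter t in [0, 1] by repeated
    # linear interpolation (de Casteljau's algorithm).
    pts = np.asarray(control_pts, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Hypothetical control points for a "wide and shallow" pool cross-section
# given as (x, depth) pairs.
ctrl = [(0.0, 0.0), (0.5, -0.2), (1.5, -0.2), (2.0, 0.0)]
boundary = np.array([bezier(ctrl, t) for t in np.linspace(0.0, 1.0, 50)])
```

Moving the control points is all it takes to sweep between the narrow-deep and wide-shallow pool shapes the abstract mentions.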
Launch Site Computer Simulation and its Application to Processes
NASA Technical Reports Server (NTRS)
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation changes can be made and processes perfected before they are implemented.
Activity-Centered Domain Characterization for Problem-Driven Scientific Visualization
Marai, G. Elisabeta
2018-01-01
Although visualization design models exist in the literature in the form of higher-level methodological frameworks, these models do not present a clear methodological prescription for the domain characterization step. This work presents a framework and end-to-end model for requirements engineering in problem-driven visualization application design. The framework and model are based on the activity-centered design paradigm, which is an enhancement of human-centered design. The proposed activity-centered approach focuses on user tasks and activities, and allows an explicit link between the requirements engineering process with the abstraction stage—and its evaluation—of existing, higher-level visualization design models. In a departure from existing visualization design models, the resulting model: assigns value to a visualization based on user activities; ranks user tasks before the user data; partitions requirements in activity-related capabilities and nonfunctional characteristics and constraints; and explicitly incorporates the user workflows into the requirements process. A further merit of this model is its explicit integration of functional specifications, a concept this work adapts from the software engineering literature, into the visualization design nested model. A quantitative evaluation using two sets of interdisciplinary projects supports the merits of the activity-centered model. The result is a practical roadmap to the domain characterization step of visualization design for problem-driven data visualization. Following this domain characterization model can help remove a number of pitfalls that have been identified multiple times in the visualization design literature. PMID:28866550
Quasi-static evolution of coronal magnetic fields
NASA Technical Reports Server (NTRS)
Longcope, D. W.; Sudan, R. N.
1992-01-01
A formalism is developed to describe the purely quasi-static part of the evolution of a coronal loop driven by its footpoints. This is accomplished under assumptions of a long, thin loop. The quasi-static equations reveal the possibility for sudden 'loss of equilibrium' at which time the system evolves dynamically rather than quasi-statically. Such quasi-static crises produce high-frequency Alfven waves and, in conjunction with Alfven wave dissipation models, form a viable coronal heating mechanism. Furthermore, an approximate solution to the quasi-static equations by perturbation method verifies the development of small-scale spatial current structure.
Yi, Jinhua; Yu, Hongliu; Zhang, Ying; Hu, Xin; Shi, Ping
2015-12-01
The present paper proposes a central-driven structure for an upper limb rehabilitation robot, in order to reduce the volume of the robotic arm and to reduce the influence of motor noise, radiation and other adverse factors on patients with upper limb dysfunction. The forward and inverse kinematics equations were obtained using the Denavit-Hartenberg (D-H) parameter method. A motion simulation was performed in SolidWorks to obtain the angle-time curve of each joint and the position-time curve of the handle under a prescribed rehabilitation path. Experimental results showed that the central-driven structure design is sound, as the handle could follow the prescribed rehabilitation path. The kinematics equations were validated, with an error of less than 3° when comparing the angle-time curves obtained from calculation with those from the motion simulation.
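For readers unfamiliar with the D-H method, forward kinematics chains one homogeneous transform per joint. Below is a minimal sketch; the parameter values are purely illustrative, since the robot's actual D-H table is not given in the abstract.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    # Homogeneous transform for one joint from its classic D-H parameters:
    # joint angle theta, link offset d, link length a, link twist alpha.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Forward kinematics: chain one transform per joint (illustrative values).
T = np.eye(4)
for theta, d, a, alpha in [(0.3, 0.0, 0.25, 0.0), (0.6, 0.0, 0.20, 0.0)]:
    T = T @ dh_transform(theta, d, a, alpha)
end_effector_position = T[:3, 3]   # handle position in the base frame
```

Inverse kinematics then solves the reverse problem: finding joint angles that place the handle on the prescribed rehabilitation path.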
Arnoldt, Hinrich; Strogatz, Steven H; Timme, Marc
2015-01-01
It has been hypothesized that in the era just before the last universal common ancestor emerged, life on earth was fundamentally collective. Ancient life forms shared their genetic material freely through massive horizontal gene transfer (HGT). At a certain point, however, life made a transition to the modern era of individuality and vertical descent. Here we present a minimal model for stochastic processes potentially contributing to this hypothesized "Darwinian transition." The model suggests that HGT-dominated dynamics may have been intermittently interrupted by selection-driven processes during which genotypes became fitter and decreased their inclination toward HGT. Stochastic switching in the population dynamics with three-point (hypernetwork) interactions may have destabilized the HGT-dominated collective state and essentially contributed to the emergence of vertical descent and the first well-defined species in early evolution. A systematic nonlinear analysis of the stochastic model dynamics covering key features of evolutionary processes (such as selection, mutation, drift and HGT) supports this view. Our findings thus suggest a viable direction out of early collective evolution, potentially enabling the start of individuality and vertical Darwinian evolution.
NASA Astrophysics Data System (ADS)
Szillat, F.; Mayr, S. G.
2011-09-01
Self-organized pattern formation during physical vapor deposition of organic materials onto rough inorganic substrates is characterized by a complex morphological evolution as a function of film thickness. We employ a combined experimental-theoretical study using atomic force microscopy and numerically solved continuum rate equations to address morphological evolution in the model system: poly(bisphenol A carbonate) on polycrystalline Cu. As the key ingredients for pattern formation, (i) curvature and interface potential driven surface diffusion, (ii) deposition noise, and (iii) interface boundary effects are identified. Good agreement of experiments and theory, fitting only the Hamaker constant and diffusivity within narrow physical parameter windows, corroborates the underlying physics and paves the way for computer-assisted interface engineering.
The evolution of CMS software performance studies
NASA Astrophysics Data System (ADS)
Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.
2011-12-01
CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these issues. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.
Alternative modeling methods for plasma-based Rf ion sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veitzer, Seth A., E-mail: veitzer@txcorp.com; Kundrapu, Madhusudhan, E-mail: madhusnk@txcorp.com; Stoltz, Peter H., E-mail: phstoltz@txcorp.com
Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H{sup −} source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD), extended, gas dynamic, and Hall MHD, and two-fluid MHD models. We show recent results on modeling the internal antenna H{sup −} ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim. We demonstrate plasma temperature equilibration in two-temperature MHD models for the SNS source and present simulation results demonstrating plasma evolution over many Rf periods for different plasma temperatures. We perform the calculations in parallel, on unstructured meshes, using finite-volume solvers in order to obtain results in reasonable time.
A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty
Friedel, Michael J.
2011-01-01
This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
An Analysis of Category Management of Service Contracts
2017-12-01
…management teams a way to make informed, data-driven decisions. Data-driven decisions derived from clustering not only align with Category… savings. Furthermore, this methodology provides a data-driven visualization to inform sound business decisions on potential Category Management… Category Management initiatives. The Maptitude software will allow future research to collect data and develop visualizations to inform Category…
CASE tools and UML: state of the ART.
Agarwal, S
2001-05-01
With increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of art in computer-aided software engineering (CASE) tools and unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.
A non-linear dimension reduction methodology for generating data-driven stochastic input models
NASA Astrophysics Data System (ADS)
Ganapathysubramanian, Baskar; Zabaras, Nicholas
2008-06-01
Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F: M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low-dimensional input stochastic models to represent thermal diffusivity in two-phase microstructures. This model is used in analyzing the effect of topological variations of two-phase microstructures on the evolution of temperature in heat conduction processes.
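The graph-based isometric construction F: M → A is in the spirit of manifold-learning methods such as Isomap. As a stand-in sketch (not the authors' implementation), scikit-learn's Isomap performs a comparable graph-geodesic nonlinear dimension reduction:

```python
import numpy as np
from sklearn.manifold import Isomap

# Stand-in for the set M: microstructure samples living in a
# high-dimensional input space R^n (random data here, for illustration).
rng = np.random.default_rng(0)
samples = rng.normal(size=(200, 1024))

# Map M to a low-dimensional region A in R^d (d << n). Isomap builds a
# neighborhood graph and preserves graph-geodesic distances, the same
# general idea as the isometric construction described above.
d = 3
A = Isomap(n_neighbors=10, n_components=d).fit_transform(samples)
print(A.shape)   # (200, d)
```

The embedding dimension d plays the role of the reduced stochastic dimension that feeds the sparse grid collocation step.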
The Impact of Modeling Assumptions in Galactic Chemical Evolution Models
NASA Astrophysics Data System (ADS)
Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.
2017-02-01
We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields. OMEGA is part of the NuGrid chemical evolution package and is publicly available online at http://nugrid.github.io/NuPyCEE.
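The parameter-recovery step can be illustrated with a generic ensemble MCMC sketch using emcee. Here omega_predict is a hypothetical toy stand-in for a chemical-evolution code; it is not OMEGA's real interface, and the data are mock values.

```python
import numpy as np
import emcee

def omega_predict(n_ia_norm, outflow_strength):
    # Hypothetical stand-in for a chemical-evolution code: returns a
    # mock abundance trend on a [Fe/H] grid. This is NOT OMEGA's API.
    x = np.linspace(-3.0, 0.0, 20)
    return 0.4 - 0.1 * outflow_strength * x - 0.2 * n_ia_norm * (x + 3.0)

obs = omega_predict(1.0, 0.5) + 0.05 * np.random.randn(20)  # mock "data"
obs_err = np.full(20, 0.05)

def log_prob(params):
    n_ia, outflow = params
    if not (0.0 < n_ia < 10.0 and 0.0 <= outflow < 10.0):
        return -np.inf                      # bounded flat prior
    resid = (omega_predict(n_ia, outflow) - obs) / obs_err
    return -0.5 * np.sum(resid ** 2)        # Gaussian log-likelihood

ndim, nwalkers = 2, 32
p0 = np.abs(np.random.randn(nwalkers, ndim)) + 0.5
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000)
```

The degeneracy the paper reports would show up here as distinct, mutually exclusive posterior modes depending on which inflow/outflow prescription supplies the forward model.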
NASA Astrophysics Data System (ADS)
Carr, Michael J.; Gazel, Esteban
2017-04-01
We provide here an open version of Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility; a petrologic mixing program using least squares; and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers, histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For peer-reviewed publications, some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
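The simplest of the numerical models mentioned, two-endmember magma mixing, is linear in elemental concentrations. A minimal sketch with illustrative values follows; isotope-ratio mixing, by contrast, traces curves, which is one reason a program like Igpet fits hyperbolae.

```python
import numpy as np

def mix(c1, c2, f):
    # Two-endmember mixing: elemental concentrations combine linearly
    # with the mass fraction f of endmember 1.
    return f * np.asarray(c1) + (1.0 - f) * np.asarray(c2)

# Illustrative trace-element concentrations (ppm) for two endmembers.
basalt = [400.0, 25.0]   # Sr, Nd
crust = [250.0, 40.0]    # Sr, Nd
for f in (0.25, 0.5, 0.75):
    sr, nd = mix(basalt, crust, f)
    print(f"f = {f:.2f}: Sr = {sr:.0f} ppm, Nd = {nd:.1f} ppm")
```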
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
Asymmetric ecological conditions favor Red-Queen type of continued evolution over stasis
Nordbotten, Jan Martin; Stenseth, Nils C.
2016-01-01
Four decades ago, Leigh Van Valen presented the Red Queen’s hypothesis to account for evolution of species within a multispecies ecological community [Van Valen L (1973) Evol Theory 1(1):1–30]. The overall conclusion of Van Valen’s analysis was that evolution would continue even in the absence of abiotic perturbations. Stenseth and Maynard Smith presented in 1984 [Stenseth NC, Maynard Smith J (1984) Evolution 38(4):870–880] a model for the Red Queen’s hypothesis showing that both Red-Queen type of continuous evolution and stasis could result from a model with biotically driven evolution. However, although that contribution demonstrated that both evolutionary outcomes were possible, it did not identify which ecological conditions would lead to each of these evolutionary outcomes. Here, we provide, using a simple, yet general population-biologically founded eco-evolutionary model, such analytically derived conditions: Stasis will predominantly emerge whenever the ecological system contains only symmetric ecological interactions, whereas both Red-Queen and stasis type of evolution may result if the ecological interactions are asymmetrical, and more likely so with increasing degree of asymmetry in the ecological system (i.e., the more trophic interactions, host–pathogen interactions, and the like there are [i.e., +/− type of ecological interactions as well as asymmetric competitive (−/−) and mutualistic (+/+) ecological interactions]). In the special case of no between-generational genetic variance, our results also predict dynamics within these types of purely ecological systems. PMID:26831108
Table-driven software architecture for a stitching system
NASA Technical Reports Server (NTRS)
Thrash, Patrick J. (Inventor); Miller, Jeffrey L. (Inventor); Pallas, Ken (Inventor); Trank, Robert C. (Inventor); Fox, Rhoda (Inventor); Korte, Mike (Inventor); Codos, Richard (Inventor); Korolev, Alexandre (Inventor); Collan, William (Inventor)
2001-01-01
Native code for a CNC stitching machine is generated by generating a geometry model of a preform; generating tool paths from the geometry model, the tool paths including stitching instructions for making stitches; and generating additional instructions indicating thickness values. The thickness values are obtained from a lookup table. When the stitching machine runs the native code, it accesses a lookup table to determine a thread tension value corresponding to the thickness value. The stitching machine accesses another lookup table to determine a thread path geometry value corresponding to the thickness value.
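The table-driven idea reduces to interval lookups keyed on thickness. A minimal sketch follows, with hypothetical breakpoints and values; the patent's actual tables are not reproduced here.

```python
import bisect

# Hypothetical thickness breakpoints (mm) and the values each band maps to.
THICKNESS = [2.0, 4.0, 6.0, 8.0]                 # sorted breakpoints
TENSION = [1.2, 1.8, 2.5, 3.1]                   # thread tension per band
PATH_GEOM = ["short", "short", "long", "long"]   # thread path geometry

def lookup(thickness):
    # Mirror the two lookups the machine performs while running native
    # code: thickness -> thread tension, thickness -> path geometry.
    i = min(bisect.bisect_left(THICKNESS, thickness), len(THICKNESS) - 1)
    return TENSION[i], PATH_GEOM[i]

tension, geometry = lookup(5.3)   # -> (2.5, "long")
```

Keeping the breakpoints and values in external tables is what lets the machine's behavior be retuned without regenerating the native code, which is the essence of the table-driven architecture.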
NASA Astrophysics Data System (ADS)
Graeser, Oliver
This thesis comprises three parts, reporting research results in Fluid Dynamics (Part I), Particle Separation (Part II) and Co-evolving Networks (Part III). Part I deals with the simulation of fluid dynamics using the lattice-Boltzmann method. Microfluidic devices often feature two-dimensional, repetitive arrays. Flows through such devices are pressure-driven and confined by solid walls. We have defined new adaptive generalised periodic boundary conditions to represent the effects of outer solid walls, and are thus able to exploit the periodicity of the array by simulating the flow through one unit cell in lieu of the entire device. The so-calculated fully developed flow describes the flow through the entire array accurately, but with computational requirements that are reduced according to the dimensions of the array. Part II discusses the problem of separating macromolecules like proteins or DNA coils. The reliable separation of such molecules is a crucial task in molecular biology. The use of Brownian ratchets as mechanisms for the separation of such particles has been proposed and discussed during the last decade. Pressure-driven flows have so far been dismissed as possible driving forces for Brownian ratchets, as they do not generate ratchet asymmetry. We propose a microfluidic design that uses pressure-driven flows to create asymmetry and hence allows particle separation. The dependence of the asymmetry on various factors of the microfluidic geometry is discussed. We further exemplify the feasibility of our approach using Brownian dynamics simulations of particles of different sizes in such a device. The results show that ratchet-based particle separation using flows as the driving force is possible. Simulation results and ratchet theory predictions are in excellent agreement. Part III deals with the co-evolution of networks and dynamic models. A group of agents occupies the nodes of a network, which defines the relationship between these agents. The evolution of the agents is defined by the rules of the dynamic model and depends on the relationship between agents, i.e., the state of the network. In return, the evolution of the network depends on the state of the dynamic model. The concept is introduced through the adaptive SIS model. We show that the previously used criterion determining the critical infected fraction, i.e., the number of infected agents required to sustain the epidemic, is inappropriate for this model. We introduce a different criterion and show that the critical infected fraction so determined is in good agreement with results obtained by numerical simulations. We further discuss the concept of co-evolving dynamics using the Snowdrift Game as a model paradigm. Co-evolution occurs through agents cutting dissatisfied links and rewiring to other agents at random. The effect of co-evolution on the emergence of cooperation is discussed using a mean-field theory and numerical simulations. A transition between a connected and a disconnected, highly cooperative state of the system is observed, and explained using the mean-field model. Quantitative deviations regarding the level of cooperation in the disconnected regime can be fully resolved through an improved mean-field theory that includes the effect of random fluctuations into its model.
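The adaptive SIS mechanism described in Part III, in which susceptible agents cut links to infected neighbors and rewire to others at random, can be sketched as follows; the rates are illustrative placeholders rather than the thesis's parameters.

```python
import random
import networkx as nx

def adaptive_sis_step(G, state, p_infect, p_recover, p_rewire):
    # One sweep: along each S-I link, the susceptible end either rewires
    # away from its infected neighbor or risks infection; infected nodes
    # then recover independently.
    for u, v in list(G.edges()):
        if {state[u], state[v]} == {"S", "I"}:
            s_node = u if state[u] == "S" else v
            if random.random() < p_rewire:
                G.remove_edge(u, v)
                others = [n for n in G if state[n] == "S" and n != s_node
                          and not G.has_edge(s_node, n)]
                if others:
                    G.add_edge(s_node, random.choice(others))
            elif random.random() < p_infect:
                state[s_node] = "I"
    for n in G:
        if state[n] == "I" and random.random() < p_recover:
            state[n] = "S"

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
state = {n: "I" if random.random() < 0.1 else "S" for n in G}
for _ in range(50):
    adaptive_sis_step(G, state, p_infect=0.1, p_recover=0.02, p_rewire=0.3)
```

It is this feedback of the epidemic state on the network topology that shifts the critical infected fraction away from the static-network prediction, as the thesis discusses.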
Landscape co-evolution and river discharge.
NASA Astrophysics Data System (ADS)
van der Velde, Ype; Temme, Arnaud
2015-04-01
Fresh water is crucial for society and ecosystems. However, our ability to secure fresh water resources under climatic and anthropogenic change is impaired by the complexity of interactions between human society, ecosystems, soils, and topography. These interactions cause landscape properties to co-evolve, continuously changing the flow paths of water through the landscape. These co-evolution driven flow path changes and their effect on river runoff are, to date, poorly understood. In this presentation we introduce a spatially distributed landscape evolution model that incorporates growing vegetation and its effect on evapotranspiration, interception, infiltration, soil permeability, groundwater-surface water exchange and erosion. This landscape-scale (10 km²) model is calibrated to evolve towards well-known empirical organising principles such as the Budyko curve and Hack's law under different climate conditions. To understand how positive and negative feedbacks within the model structure form complex landscape patterns of forests and peat bogs that resemble observed landscapes under humid and boreal climates, we analysed the effects of individual processes on the spatial distribution of vegetation and river peak and mean flows. Our results show that river peak flows and droughts in particular decrease with increasing evolution of the landscape, a result that has direct implications for flood management.
Modeling of DNA and Protein Organization Levels with Cn3D Software
ERIC Educational Resources Information Center
Stasinakis, Panagiotis K.; Nicolaou, Despoina
2017-01-01
The molecular structure of living organisms and the complex interactions amongst its components are the basis for the diversity observed at the macroscopic level. Proteins and nucleic acids are some of the major molecular components, and play a key role in several biological functions, such as those of development and evolution. This article…
Software Development in the Water Sciences: a view from the divide (Invited)
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2013-12-01
While training in statistical methods is an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.
Intermittent dynamics in complex systems driven to depletion.
Escobar, Juan V; Pérez Castillo, Isaac
2018-03-19
When complex systems are driven to depletion by some external factor, their non-stationary dynamics can present intermittent behaviour between relative tranquility and bursts of activity whose consequences are often catastrophic. To understand and ultimately be able to predict such dynamics, we propose an underlying mechanism based on sharp thresholds of a local generalized energy density that naturally leads to negative feedback. We find a transition from a continuous regime to an intermittent one, in which avalanches can be predicted despite the stochastic nature of the process. This model may have applications in many natural and social complex systems where a rapid depletion of resources or generalized energy drives the dynamics. In particular, we show how this model accurately describes the time evolution and avalanches present in a real social system.
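A minimal sketch of a threshold-plus-depletion mechanism of this general kind follows; the dynamics and all parameter values are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def depletion_avalanches(n=100, threshold=1.0, reserve=500.0,
                         drive=0.05, seed=0):
    # Sites carry a local generalized energy density; crossing a sharp
    # threshold topples a site, shedding part of its load onto a random
    # neighbor, while the external drive exhausts a finite reserve.
    rng = np.random.default_rng(seed)
    e = rng.uniform(0.0, threshold, n)
    sizes = []
    while reserve > 0.0:
        e[rng.integers(n)] += drive      # drive one site from the reserve
        reserve -= drive                 # depletion: the drive runs down
        size = 0
        while e.max() >= threshold:      # avalanche of threshold crossings
            j = int(e.argmax())
            e[j] -= threshold
            e[rng.integers(n)] += 0.9 * threshold * rng.random()
            size += 1
        sizes.append(size)
    return sizes                         # avalanche-size time series

sizes = depletion_avalanches()
```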
EON: software for long time simulations of atomic scale systems
NASA Astrophysics Data System (ADS)
Chill, Samuel T.; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme
2014-07-01
The EON software is designed for simulations of the state-to-state evolution of atomic scale systems over timescales greatly exceeding that of direct classical dynamics. States are defined as collections of atomic configurations from which a minimization of the potential energy gives the same inherent structure. The time evolution is assumed to be governed by rare events, where transitions between states are uncorrelated and infrequent compared with the timescale of atomic vibrations. Several methods for calculating the state-to-state evolution have been implemented in EON, including parallel replica dynamics, hyperdynamics and adaptive kinetic Monte Carlo. Global optimization methods, including simulated annealing, basin hopping and minima hopping are also implemented. The software has a client/server architecture where the computationally intensive evaluations of the interatomic interactions are calculated on the client-side and the state-to-state evolution is managed by the server. The client supports optimization for different computer architectures to maximize computational efficiency. The server is written in Python so that developers have access to the high-level functionality without delving into the computationally intensive components. Communication between the server and clients is abstracted so that calculations can be deployed on a single machine, clusters using a queuing system, large parallel computers using a message passing interface, or within a distributed computing environment. A generic interface to the evaluation of the interatomic interactions is defined so that empirical potentials, such as in LAMMPS, and density functional theory as implemented in VASP and GPAW can be used interchangeably. Examples are given to demonstrate the range of systems that can be modeled, including surface diffusion and island ripening of adsorbed atoms on metal surfaces, molecular diffusion on the surface of ice and global structural optimization of nanoparticles.
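The state-to-state evolution EON manages rests on rare-event kinetics. A generic kinetic Monte Carlo step, which picks a process proportionally to its rate and draws an exponential residence time, looks like the sketch below; this is an illustration of the method, not EON's actual code.

```python
import math
import random

def kmc_step(rates):
    # Pick an escape process with probability proportional to its rate,
    # then advance the clock by an exponential residence time.
    total = sum(rates)
    r, acc = random.random() * total, 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r <= acc:
            chosen = i
            break
    dt = -math.log(1.0 - random.random()) / total
    return chosen, dt

# Rates for the escape processes out of the current state; in adaptive
# KMC these would come from saddle-point searches. Values illustrative.
process, dt = kmc_step([1.0e6, 4.0e5, 2.5e5])
```

In EON's client/server split, the expensive part, discovering the processes and their rates, runs on the clients, while steps like this are cheap bookkeeping on the server.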
Human factors for capacity building: lessons learned from the OpenMRS implementers network.
Seebregts, C J; Mamlin, B W; Biondich, P G; Fraser, H S F; Wolfe, B A; Jazayeri, D; Miranda, J; Blaya, J; Sinha, C; Bailey, C T; Kanter, A S
2010-01-01
The overall objective of this project was to investigate ways to strengthen the OpenMRS community by (i) developing capacity and implementing a network focusing specifically on the needs of OpenMRS implementers, (ii) strengthening community-driven aspects of OpenMRS and providing a dedicated forum for implementation-specific issues, and; (iii) providing regional support for OpenMRS implementations as well as mentorship and training. The methods used included (i) face-to-face networking using meetings and workshops; (ii) online collaboration tools, peer support and mentorship programmes; (iii) capacity and community development programmes, and; (iv) community outreach programmes. The community-driven approach, combined with a few simple interventions, has been a key factor in the growth and success of the OpenMRS Implementers Network. It has contributed to implementations in at least twenty-three different countries using basic online tools; and provided mentorship and peer support through an annual meeting, workshops and an internship program. The OpenMRS Implementers Network has formed collaborations with several other open source networks and is evolving regional OpenMRS Centres of Excellence to provide localized support for OpenMRS development and implementation. These initiatives are increasing the range of functionality and sustainability of open source software in the health domain, resulting in improved adoption and enterprise-readiness. Social organization and capacity development activities are important in growing a successful community-driven open source software model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, A. M.; Zingale, M.; Nonaka, A.
2016-08-10
The dynamics of helium shell convection driven by nuclear burning establish the conditions for runaway in the sub-Chandrasekhar-mass, double-detonation model for SNe Ia, as well as for a variety of other explosive phenomena. We explore these convection dynamics for a range of white dwarf core and helium shell masses in three dimensions using the low Mach number hydrodynamics code MAESTRO. We present calculations of the bulk properties of this evolution, including time-series evolution of global diagnostics, lateral averages of the 3D state, and the global 3D state. We find a variety of outcomes, including quasi-equilibrium, localized runaway, and convective runaway. Our results suggest that the double-detonation progenitor model is promising and that 3D dynamic convection plays a key role.
Modeling Day-to-day Flow Dynamics on Degradable Transport Network
Gao, Bo; Zhang, Ronghui; Lou, Xiaoming
2016-01-01
Stochastic link capacity degradations are common phenomena in transport networks and can cause travel time variations, which in turn affect travelers’ daily route choice behaviors. This paper formulates a deterministic dynamic model to capture the day-to-day (DTD) flow evolution process in the presence of stochastic link capacity degradations. The aggregated network flow dynamics are driven by travelers’ learning of uncertain travel times and their choice of risky routes. This paper applies an exponential-smoothing filter to describe travelers’ learning of travel time variations, and formulates a risk attitude parameter updating equation to reflect travelers’ endogenous risk attitude evolution. In addition, this paper conducts theoretical analyses of several significant mathematical characteristics implied in the proposed DTD model, including fixed point existence, uniqueness, stability and irreversibility. Numerical experiments are used to demonstrate the effectiveness of the DTD model and verify some important dynamic system properties. PMID:27959903
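The exponential-smoothing learning filter referred to above has a one-line form. A minimal sketch, with an illustrative smoothing weight rather than the paper's calibrated value:

```python
def update_perceived_time(perceived, experienced, alpha=0.3):
    # Exponential smoothing: today's perception blends yesterday's
    # perception with the newly experienced travel time.
    return alpha * experienced + (1.0 - alpha) * perceived

# Day-to-day updating of one route's perceived travel time (minutes).
perceived = 20.0
for experienced in [22.0, 25.0, 19.0, 21.0]:
    perceived = update_perceived_time(perceived, experienced)
```

Small alpha makes travelers slow to revise their beliefs after a capacity degradation; large alpha makes flows react sharply day to day.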
NASA Astrophysics Data System (ADS)
Zhang, W.; Wang, S.; Ma, Z. W.
2017-06-01
The influences of helical driven currents on nonlinear resistive tearing mode evolution and saturation are studied by using a three-dimensional toroidal resistive magnetohydrodynamic code (CLT). We consider three types of helical driven currents: stationary, time-dependent in amplitude, and time-dependent in thickness. It is found that the helical driven current is much more efficient than the Gaussian driven current used in our previous study [S. Wang et al., Phys. Plasmas 23(5), 052503 (2016)]. The stationary helical driven current cannot persistently control tearing mode instabilities. For the time-dependent helical driven current with f_cd = 0.01 and δ_cd < 0.04, the island size can be reduced to its saturated level, which is about one third of the initial island size. However, if the total driven current increases to about 7% of the total plasma current, tearing mode instabilities will rebound again due to the excitation of the triple tearing mode. For the helical driven current with time-dependent strength and thickness, the reduction speed of the radial perturbation component of the magnetic field increases with an increase in the driven current and then saturates at a quite low level. The tearing mode is always controlled, even for a large driven current.
The evolution of sensory divergence in the context of limited gene flow in the bumblebee bat
Puechmaille, Sébastien J.; Gouilh, Meriadeg Ar; Piyapan, Piyathip; Yokubol, Medhi; Mie, Khin Mie; Bates, Paul J.; Satasook, Chutamas; Nwe, Tin; Bu, Si Si Hla; Mackie, Iain J.; Petit, Eric J.; Teeling, Emma C.
2011-01-01
The sensory drive theory of speciation predicts that populations of the same species inhabiting different environments can differ in sensory traits, and that this sensory difference can ultimately drive speciation. However, even in the best-known examples of sensory ecology driven speciation, it is uncertain whether the variation in sensory traits is the cause or the consequence of a reduction in levels of gene flow. Here we show strong genetic differentiation, no gene flow and large echolocation differences between the allopatric Myanmar and Thai populations of the world's smallest mammal, Craseonycteris thonglongyai, and suggest that geographic isolation most likely preceded sensory divergence. Within the geographically continuous Thai population, we show that geographic distance has a primary role in limiting gene flow rather than echolocation divergence. In line with sensory-driven speciation models, we suggest that in C. thonglongyai, limited gene flow creates the suitable conditions that favour the evolution of sensory divergence via local adaptation. PMID:22146392
In Review (Geology): Alpine Landscape Evolution Dominated by Cirque Retreat
NASA Technical Reports Server (NTRS)
Oskin, Michael; Burbank, Doug
2005-01-01
Despite the abundance in alpine terrain of glacially dissected landscapes, the magnitude and geometry of glacial erosion can rarely be defined. In the eastern Kyrgyz Range, a widespread unconformity exhumed as a geomorphic surface provides a regional datum with which to calibrate erosion. As tectonically driven surface uplift has progressively pushed this surface into the zone of ice accumulation, glacial erosion has overprinted the landscape. With as little as 500 m of incision into rocks underlying the unconformity, distinctive glacial valleys display their deepest incision adjacent to cirque headwalls. The expansion of north-facing glacial cirques at the expense of south-facing valleys has driven the drainage divide southwards at rates up to 2 to 3 times the rate of valley incision. Existing ice-flux-based glacial erosion rules incompletely model expansion of glacial valleys via cirque retreat into the low-gradient unconformity remnants. Local processes that either directly sap cirque headwalls or inhibit erosion down-glacier appear to control, at least initially, alpine landscape evolution.
NASA Astrophysics Data System (ADS)
Melon Fuksman, J. D.; Becerra, L.; Bianco, C. L.; Karlica, M.; Kovacevic, M.; Moradi, R.; Muccino, M.; Pisani, G. B.; Primorac, D.; Rueda, J. A.; Ruffini, R.; Vereshchagin, G. V.; Wang, Y.
2018-01-01
The binary-driven hypernova (BdHN) model has been introduced in the past years to explain a subfamily of gamma-ray bursts (GRBs) with energies E_iso ≥ 10⁵² erg associated with type Ic supernovae. Such BdHNe have as progenitor a tight binary system composed of a carbon-oxygen (CO) core and a neutron star undergoing an induced gravitational collapse to a black hole, triggered by the CO core explosion as a supernova (SN). This collapse produces an optically thick e+e- plasma, which expands and impacts onto the SN ejecta. This process is here considered as a candidate for the production of X-ray flares, which are frequently observed following the prompt emission of GRBs. In this work we follow the evolution of the e+e- plasma as it interacts with the SN ejecta, by solving the equations of relativistic hydrodynamics numerically. Our results are compatible with the Lorentz factors estimated for the sources that produce the flares, typically Γ ≲ 4.
Multiple secondary islands formation in nonlinear evolution of double tearing mode simulations
NASA Astrophysics Data System (ADS)
Guo, W.; Ma, J.; Yu, Z.
2017-03-01
A new numerical code solving the conservative perturbed resistive magnetohydrodynamic (MHD) model is developed. Numerical tests of the ideal Kelvin-Helmholtz instability and the resistive double tearing mode (DTM) show its capability in solving linear and nonlinear MHD instabilities. The nonlinear DTM evolution in 2D geometry is numerically investigated with low guiding field B_z0, short half-distance y_0 between the equilibrium current sheets, and small resistivity η. The interaction of islands on the two initial current sheets may generate an unstable flow-driven current sheet with a high length-to-thickness aspect ratio (α), and multiple secondary islands can form. In general, the length-to-thickness aspect ratio α and the number of secondary islands increase with decreasing guide field B_z0, decreasing half-distance y_0, and increasing Lundquist number of the flow-driven current sheet S_L, although the dependence may be non-monotonic. The reconnection rate dependence on S_L, B_z0, and y_0 is also investigated.
Discovering objects in a blood recipient information system.
Qiu, D; Junghans, G; Marquardt, K; Kroll, H; Mueller-Eckhardt, C; Dudeck, J
1995-01-01
Application of object-oriented (OO) methodologies has been generally considered a solution to the problem of improving the software development process and managing the so-called software crisis. Among them, object-oriented analysis (OOA) is the most essential and is a vital prerequisite for the successful use of other OO methodologies. Though a good deal of OOA methods have already been published, the most important aspect common to all these methods, discovering the object classes truly relevant to the given problem domain, has remained a subject of intensive research. In this paper, using the successful development of a blood recipient information system as an example, we present our approach, which is based on the conceptual framework of responsibility-driven OOA. In the discussion, we also suggest that it may be inadequate to simply attribute the software crisis to the waterfall model of the software development life-cycle. We are convinced that the real causes of the failure of some software and information systems should be sought in the methodologies used in some crucial phases of the software development process. Furthermore, a software system can also fail if object classes essential to the problem domain are not discovered, implemented and visualized, so that the real-world situation cannot be faithfully traced by it.
NASA Astrophysics Data System (ADS)
Liu, D.; Tian, F.; Lin, M.; Sivapalan, M.
2015-02-01
The complex interactions and feedbacks between humans and water are critically important issues but remain poorly understood in the newly proposed discipline of socio-hydrology (Sivapalan et al., 2012). An exploratory model with the appropriate level of simplification can be valuable for improving our understanding of the co-evolution and self-organization of socio-hydrological systems driven by interactions and feedbacks operating at different scales. In this study, a simplified conceptual socio-hydrological model based on logistic growth curves is developed for the Tarim River basin in western China and is used to illustrate the explanatory power of such a co-evolutionary model. The study area is the main stream of the Tarim River, which is divided into two modeling units. The socio-hydrological system is composed of four sub-systems, i.e., the hydrological, ecological, economic, and social sub-systems. In each modeling unit, the hydrological equation focusing on water balance is coupled to the other three evolutionary equations to represent the dynamics of the social sub-system (denoted by population), the economic sub-system (denoted by irrigated crop area ratio), and the ecological sub-system (denoted by natural vegetation cover), each of which is expressed in terms of a logistic growth curve. Four feedback loops are identified to represent the complex interactions among different sub-systems and different spatial units, of which two are inner loops occurring within each separate unit and the other two are outer loops linking the two modeling units. The feedback mechanisms are incorporated into the constitutive relations for model parameters, i.e., the colonization and mortality rates in the logistic growth curves that are jointly determined by the state variables of all sub-systems. The co-evolution of the Tarim socio-hydrological system is then analyzed with this conceptual model to gain insights into the overall system dynamics and its sensitivity to the external drivers and internal system variables. The results show a costly pendulum swing between a balanced distribution of socio-economic and natural ecological resources among the upper and lower reaches and a highly skewed distribution towards the upper reach. This evolution is principally driven by attitudinal changes within water resources management policies, reflecting society's evolving awareness of ecological and environmental concerns.
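Each sub-system's logistic growth, with rates modulated by the other sub-systems' states, can be sketched as below. The coupling signs and magnitudes are illustrative assumptions, not the calibrated Tarim model.

```python
def logistic_step(x, r, K, dt=0.1):
    # Explicit Euler step of logistic growth dx/dt = r * x * (1 - x / K).
    return x + r * x * (1.0 - x / K) * dt

# Toy coupling: population drives irrigated-area growth, and irrigation
# upstream suppresses the vegetation growth rate downstream.
pop, irr, veg = 0.1, 0.2, 0.8
for _ in range(500):
    r_veg = 0.05 * (1.0 - 0.8 * irr)       # less water, slower regrowth
    pop = logistic_step(pop, r=0.03, K=1.0)
    irr = logistic_step(irr, r=0.04 * pop, K=1.0)
    veg = logistic_step(veg, r=r_veg, K=1.0)
```

In the full model such couplings enter through the colonization and mortality rates, and a second modeling unit closes the outer feedback loops between the upper and lower reaches.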
A Modern Picture of Barred Galaxy Dynamics
NASA Astrophysics Data System (ADS)
Petersen, Michael; Weinberg, Martin; Katz, Neal
2018-01-01
Observations of disk galaxies suggest that bars are responsible for altering global galaxy parameters (e.g., structure, gas fraction, star formation rate). The canonical understanding of the mechanisms underpinning bar-driven secular dynamics in disk galaxies has largely been built upon the analysis of linear theory, despite galactic bars being clearly demonstrated to be nonlinear phenomena in N-body simulations. We present simulations of barred Milky Way-like galaxy models designed to elucidate nonlinear barred galaxy dynamics. We have developed two new methodologies for analyzing N-body simulations that combine the strengths of analytic linear theory and brute-force simulation analysis: orbit family identification and multicomponent torque analysis. The software will be offered publicly to the community for use in their own simulation analyses. The orbit classifier reveals that the details of kinematic components in galactic disks (e.g., the bar, bulge, thin disk, and thick disk components) are powerful discriminators of evolutionary paradigms (i.e., violent instabilities versus secular evolution) as well as of the basic parameters of the dark matter halo (mass distribution, angular momentum distribution). Multicomponent torque analysis provides a thorough accounting of the transfer of angular momentum between orbits, global patterns, and distinct components in order to better explain the underlying physics that governs the secular evolution of barred disk galaxies. Using these methodologies, we are able to identify the successes and failures of linear theory and traditional N-body simulations en route to a detailed understanding of the control bars exert over secular evolution in galaxies. We present explanations for physical and velocity structures seen in observations of barred galaxies, alongside predictions for how these structures vary with dynamical properties from galaxy to galaxy as well as over the lifetime of a galaxy, finding that the transfer of angular momentum through previously unidentified channels can more fully explain the observed dynamics.
Technology-driven dietary assessment: a software developer’s perspective
Buday, Richard; Tapia, Ramsey; Maze, Gary R.
2015-01-01
Dietary researchers need new software to improve nutrition data collection and analysis, but creating information technology is difficult. Software development projects may be unsuccessful due to inadequate understanding of needs, management problems, technology barriers or legal hurdles. Cost overruns and schedule delays are common. Barriers facing scientific researchers developing software include workflow, cost, schedule, and team issues. Different methods of software development and the role that intellectual property rights play are discussed. A dietary researcher must carefully consider multiple issues to maximize the likelihood of success when creating new software. PMID:22591224
NASA Astrophysics Data System (ADS)
Kumlander, Deniss
The globalization of company operations and competition between software vendors demand improved quality of delivered software at decreased overall cost. At the same time, these trends introduce many problems into the software development process, since they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position that increases its productivity, bridging communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.
The advanced software development workstation project
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Pitman, Charles L.
1991-01-01
The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE), with the emphasis on those advanced methods, tools, and processes that will benefit all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS), which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phases of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge-based expert assistance.
Data-Driven Modeling of Complex Systems by means of a Dynamical ANN
NASA Astrophysics Data System (ADS)
Seleznev, A.; Mukhin, D.; Gavrilov, A.; Loskutov, E.; Feigin, A.
2017-12-01
Data-driven methods for the modeling and prognosis of complex dynamical systems are becoming more and more popular in various fields due to the growth of high-resolution data. We distinguish two basic steps in such an approach: (i) determining the phase subspace of the system, or embedding, from available time series, and (ii) constructing an evolution operator acting in this reduced subspace. In this work we suggest a novel approach that combines these two steps by means of the construction of an artificial neural network (ANN) with a special topology. The proposed ANN-based model, on the one hand, projects the data onto a low-dimensional manifold and, on the other hand, models a dynamical system on this manifold. In effect, this is a recurrent multilayer ANN that has internal dynamics and is capable of generating time series. A key point of the proposed methodology is the optimization of the model, which allows us to avoid overfitting: we use a Bayesian criterion to optimize the ANN structure and to estimate both the degree of nonlinearity of the evolution operator and the complexity of the nonlinear manifold onto which the data are projected. The proposed modeling technique is applied to the analysis of high-dimensional dynamical systems: the Lorenz'96 model of atmospheric turbulence, producing high-dimensional space-time chaos, and a quasi-geostrophic three-layer model of the Earth's atmosphere with natural orography, describing the dynamics of synoptic vortexes as well as mesoscale blocking systems. The possibility of applying the proposed methodology to the analysis of real measured data is also discussed. The study was supported by the Russian Science Foundation (grant #16-12-10198).
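No implementation details are given in the abstract; the sketch below, in Python with PyTorch, shows one minimal way to combine a low-dimensional embedding with a learned evolution operator in a single network trained for one-step prediction. The architecture, sizes, and training loop are illustrative assumptions, not the authors' ANN (in particular, the Bayesian structure optimization is omitted).

```python
# Sketch (assumption): an autoencoder whose latent state is advanced by a
# learned one-step evolution operator, trained to predict x[t+1] from x[t].
import torch
import torch.nn as nn

class LatentDynamicsNet(nn.Module):
    def __init__(self, dim_x=40, dim_z=3):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(dim_x, 32), nn.Tanh(),
                                    nn.Linear(32, dim_z))   # embedding
        self.evolve = nn.Sequential(nn.Linear(dim_z, 32), nn.Tanh(),
                                    nn.Linear(32, dim_z))   # evolution operator
        self.decode = nn.Sequential(nn.Linear(dim_z, 32), nn.Tanh(),
                                    nn.Linear(32, dim_x))   # back to data space

    def forward(self, x):
        return self.decode(self.evolve(self.encode(x)))     # predicted next state

net = LatentDynamicsNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(500, 40)           # placeholder time series, shape (T, dim_x)
for _ in range(200):               # one-step-ahead prediction loss
    loss = nn.functional.mse_loss(net(x[:-1]), x[1:])
    opt.zero_grad(); loss.backward(); opt.step()
```

Once trained, iterating the latent `evolve` step produces free-running forecasts, which is the sense in which such a network has internal dynamics.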
Positioning performance of the NTCM model driven by GPS Klobuchar model parameters
NASA Astrophysics Data System (ADS)
Hoque, Mohammed Mainul; Jakowski, Norbert; Berdermann, Jens
2018-03-01
Users of the Global Positioning System (GPS) utilize the Ionospheric Correction Algorithm (ICA), also known as the Klobuchar model, for correcting ionospheric signal delay or range error. Recently, we developed an ionosphere correction algorithm called the NTCM-Klobpar model for single-frequency GNSS applications. The model is driven by a parameter computed from the GPS Klobuchar model and can consequently be used instead of the GPS Klobuchar model for ionospheric corrections. In the presented work we compare the positioning solutions obtained using NTCM-Klobpar with those using the Klobuchar model. Our investigation using worldwide ground GPS data from a quiet and a perturbed ionospheric and geomagnetic activity period of 17 days each shows that the 24-hour prediction performance of NTCM-Klobpar is better than that of the GPS Klobuchar model in global average. The root mean squared deviation of the 3D position errors is found to be about 0.24 and 0.45 m less for NTCM-Klobpar compared to the GPS Klobuchar model during the quiet and perturbed conditions, respectively. The presented algorithm has the potential to continuously improve the accuracy of GPS single-frequency mass-market devices with only little software modification.
Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk
2014-10-20
Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data with machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually performed only once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach to ensure high software quality. Architecturally, CDM's design is split over a number of modules to ensure future extensibility. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
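CDM's own interfaces are not described in the abstract; the following Python sketch illustrates the learning-curve monitoring idea in general terms using scikit-learn as a stand-in (the estimator, data, and metric are placeholders):

```python
# Sketch: monitor predictive performance as patient inclusions grow,
# to judge when further data collection stops paying off.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 8), cv=5, scoring="roc_auc")

for n, s in zip(sizes, val_scores.mean(axis=1)):
    print(f"{n:4d} patients: cross-validated AUC = {s:.3f}")
# A flattening curve suggests enrollment can be terminated.
```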
Introducing Risk Management Techniques Within Project Based Software Engineering Courses
NASA Astrophysics Data System (ADS)
Port, Daniel; Boehm, Barry
2002-03-01
In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and we have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule-as-an-independent-variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.
Mechanical model for filament buckling and growth by phase ordering.
Rey, Alejandro D; Abukhdeir, Nasser M
2008-02-05
A mechanical model of open filament shape and growth driven by phase ordering is formulated. For a given phase-ordering driving force, the model output is the filament shape evolution and the filament end-point kinematics. The linearized model for the slope of the filament is the Cahn-Hilliard model of spinodal decomposition, where the buckling corresponds to concentration fluctuations. Two modes are predicted: (i) sequential growth and buckling and (ii) simultaneous buckling and growth. The relation among the maximum buckling rate, filament tension, and matrix viscosity is given. These results contribute to ongoing work in smectic A filament buckling.
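For reference, the linearized stability result alluded to takes the standard Cahn-Hilliard form; the notation below is generic and not necessarily that of the paper:

```latex
% Cahn-Hilliard equation for an order parameter c(x,t), with mobility M,
% free-energy density f(c), and gradient-energy coefficient \kappa:
\[
  \frac{\partial c}{\partial t}
  = M \,\nabla^2\!\left( f'(c) - \kappa \nabla^2 c \right).
\]
% Linearizing about c_0 with c = c_0 + \delta c\, e^{ikx + \omega t} gives
\[
  \omega(k) = -M k^2 \left( f''(c_0) + \kappa k^2 \right),
\]
% so fluctuations grow when f''(c_0) < 0 (the spinodal region), with the
% fastest-growing wavenumber k_max = \sqrt{-f''(c_0)/(2\kappa)}, the
% analogue of the filament's maximum buckling rate.
```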
NASA Astrophysics Data System (ADS)
Kapser, Stefan; Balden, Martin; Fiorini da Silva, Tiago; Elgeti, Stefan; Manhard, Armin; Schmid, Klaus; Schwarz-Selinger, Thomas; von Toussaint, Udo
2018-05-01
Low-energy-plasma-driven deuterium permeation through tungsten at 300 K and 450 K has been investigated. Microstructural analysis by scanning electron microscopy, assisted by focused ion beam, revealed sub-surface damage evolution only at 300 K. This damage evolution was correlated with a significant evolution of the deuterium amount retained below the plasma-exposed surface. Although both of these phenomena were observed for 300 K exposure temperature only, the deuterium permeation flux at both exposure temperatures was indistinguishable within the experimental uncertainty. The permeation flux was used to estimate the maximum ratio of solute-deuterium to tungsten atoms during deuterium-plasma exposure at both temperatures and thus in the presence and absence of damage evolution. Diffusion-trapping simulations revealed the proximity of damage evolution to the implantation surface as the reason for an only insignificant decrease of the permeation flux.
Solar Wind Driven Autoregression Model for Ionospheric Short Term Forecast (SWIF)
2008-06-01
…for GCAM; previous two months for TSAR). However, the performance of the ARTIST autoscaling software largely determines the accuracy of … the SWIF synthesis. Figure 17: Athens Digisonde ionograms autoscaled with ARTIST4.0 (top) and ARTIST4.5 (bottom) during fall quiet intervals. Figure 18: the same during summer quiet intervals. Figure 19: …
Theory-Driven Models for Correcting Fight or Flight Imbalance in Gulf War Illness
2011-09-01
… dynamics of these systems to reset control of the HPA-immune axis to normal. We have completed the negotiation of sub-awards to the CFIDS Association … We propose that severe physical or psychological insult to the endocrine and immune systems can displace these from a normal regulatory equilibrium.
1993-11-01
Recover Nitramine Oxidizers from Solid Propellants Using Liquid Ammonia * Coaxial Engine for Ducted Hybrid and Gel Bi-propulsion Systems * Ultraviolet Surface Optical Testing Device * Electron Beam Driven Negative Ion Source * Method of Manufacturing Hybrid Fiber-Reinforced Composite Nozzle Materials * Modeling Software FRED (Partner: Industry) * Class V Driving Simulation (Partner: Academia) * Combustion and Tribology (Partner: Academia) * Hybrid Electric Drive/High…
A new model for biological effects of radiation and the driven force of molecular evolution
NASA Astrophysics Data System (ADS)
Wada, Takahiro; Manabe, Yuichiro; Nakajima, Hiroo; Tsunoyama, Yuichi; Bando, Masako
We propose a new mathematical model to estimate the biological effects of radiation, which we call the Whack-A-Mole (WAM) model. A special feature of the WAM model is that it involves the dose rate of radiation as a key ingredient. We succeeded in reproducing the experimental data of various species concerning radiation-induced mutation frequencies. From the analysis of the mega-mouse experiments, we obtained a mutation rate per base pair per year for mice that is consistent with the so-called molecular clock of evolutionary genetics, 10^-9 mutations/base-pair/year. Another important quantity is the equivalent dose rate for the whole spontaneous mutation, d_eff. The value of d_eff for mice is 1.1×10^-3 Gy/hour, which is larger than the dose rate of natural radiation (10^-6 to 10^-7 Gy/hour) by several orders of magnitude (a factor of roughly 10^3 to 10^4). We also analyzed Drosophila data and obtained essentially the same numbers. This clearly indicates that natural radiation is not the dominant driving force of molecular evolution; we should instead look for other factors, such as miscopying of DNA in the duplication process. We believe this is the first quantitative proof of the small contribution of natural radiation to molecular evolution.
Evidence for adaptive radiation from a phylogenetic study of plant defenses
Agrawal, Anurag A.; Fishbein, Mark; Halitschke, Rayko; Hastings, Amy P.; Rabosky, Daniel L.; Rasmann, Sergio
2009-01-01
One signature of adaptive radiation is a high level of trait change early during the diversification process and a plateau toward the end of the radiation. Although the study of the tempo of evolution has historically been the domain of paleontologists, recently developed phylogenetic tools allow for the rigorous examination of trait evolution in a tremendous diversity of organisms. Enemy-driven adaptive radiation was a key prediction of Ehrlich and Raven's coevolutionary hypothesis [Ehrlich PR, Raven PH (1964) Evolution 18:586–608], yet has remained largely untested. Here we examine patterns of trait evolution in 51 North American milkweed species (Asclepias), using maximum likelihood methods. We study 7 traits of the milkweeds, ranging from seed size and foliar physiological traits to defense traits (cardenolides, latex, and trichomes) previously shown to impact herbivores, including the monarch butterfly. We compare the fit of simple random-walk models of trait evolution to models that incorporate stabilizing selection (the Ornstein-Uhlenbeck process), as well as time-varying rates of trait evolution. Early bursts of trait evolution were implicated for 2 traits, while stabilizing selection was implicated for several others. We further modeled the relationship between trait change and species diversification while allowing rates of trait evolution to vary during the radiation. Species-rich lineages underwent a proportionately greater decline in latex and cardenolides relative to species-poor lineages, and the rate of trait change was most rapid early in the radiation. An interpretation of this result is that reduced investment in defensive traits accelerated diversification, and disproportionately so, early in the adaptive radiation of milkweeds. PMID:19805160
UML Profiles for Design Decisions and Non-Functional Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Gorton, Ian
2007-06-30
A software architecture is composed of a collection of design decisions. Each design decision helps or hinders certain Non-Functional Requirements (NFRs). Current software architecture views focus on expressing components and connectors in the system. Design decisions and their relationships with non-functional requirements are often captured in separate design documentation and not explicitly expressed in any views. This disassociation makes architecture comprehension and architecture evolution harder. In this paper, we propose a UML profile for modeling design decisions and an associated UML profile for modeling non-functional requirements in a generic way. The two UML profiles treat design decisions and non-functional requirements as first-class elements. Modeled design decisions always refer to existing architectural elements and thus maintain traceability between the two. We provide a mechanism for checking consistency over this traceability. An exemplar is given.
Computational modeling of drug-resistant bacteria. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDougall, Preston
2015-03-12
Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology, and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.
The Interior and Orbital Evolution of Charon as Preserved in Its Geologic Record
NASA Technical Reports Server (NTRS)
Rhoden, Alyssa Rose; Henning, Wade; Hurford, Terry A.; Hamilton, Douglas P.
2014-01-01
Pluto and its largest satellite, Charon, currently orbit in a mutually synchronous state; both bodies continuously show the same face to one another. This orbital configuration is a natural end-state for bodies that have undergone tidal dissipation. In order to achieve this state, both bodies would have experienced tidal heating and stress, with the extent of tidal activity controlled by the orbital evolution of Pluto and Charon and by the interior structure and rheology of each body. As the secondary, Charon would have experienced a larger tidal response than Pluto, which may have manifested as observable tectonism. Unfortunately, there are few constraints on the interiors of Pluto and Charon. In addition, the pathway by which Charon came to occupy its present orbital state is uncertain. If Charon's orbit experienced a high-eccentricity phase, as suggested by some orbital evolution models, tidal effects would have likely been more significant. Therefore, we determine the conditions under which Charon could have experienced tidally-driven geologic activity and the extent to which upcoming New Horizons spacecraft observations could be used to constrain Charon's internal structure and orbital evolution. Using plausible interior structure models that include an ocean layer, we find that tidally-driven tensile fractures would likely have formed on Charon if its eccentricity were on the order of 0.01, especially if Charon were orbiting closer to Pluto than at present. Such fractures could display a variety of azimuths near the equator and near the poles, with the range of azimuths in a given region dependent on longitude; east-west-trending fractures should dominate at mid-latitudes. The fracture patterns we predict indicate that Charon's surface geology could provide constraints on the thickness and viscosity of Charon's ice shell at the time of fracture formation.
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1988-01-01
The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
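ATAMM itself is a graph-theoretic model whose rules the abstract only names; as a generic illustration of executing a decision-free, large-grained dataflow graph on a small pool of computing elements, here is a toy Python list scheduler (the graph, task times, and processor count are invented):

```python
# Sketch: greedy list-scheduling of a decision-free dataflow graph onto a
# fixed pool of processors. Not the ATAMM model itself.
import heapq

tasks = {"A": 3, "B": 2, "C": 4, "D": 1}          # task -> execution time
deps  = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
n_proc = 2

done, running, events, finish = set(), set(), [], {}
clock = 0.0
while len(done) < len(tasks):
    # dispatch every task whose inputs are complete, while processors remain
    for t in tasks:
        if (t not in done and t not in running and len(running) < n_proc
                and all(d in done for d in deps[t])):
            running.add(t)
            heapq.heappush(events, (clock + tasks[t], t))
    clock, t = heapq.heappop(events)              # advance to next completion
    running.remove(t)
    done.add(t)
    finish[t] = clock
print(finish)  # with 2 processors: A ends at 3, B at 5, C at 7, D at 8
```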
Dynamic Performance of Subway Vehicle with Linear Induction Motor System
NASA Astrophysics Data System (ADS)
Wu, Pingbo; Luo, Ren; Hu, Yan; Zeng, Jing
The light rail vehicle with a Linear Induction Motor (LIM) bogie, a new type of urban rail transit vehicle, has the advantages of low cost, wide applicability, low noise, simple maintenance, and good dynamic behavior. This kind of vehicle, supported and guided by the wheels and rails, is not driven by wheel/rail adhesion force but by the electromagnetic force between the LIM and the reaction plate. In this paper, three different types of suspension and their characteristics are discussed, considering the interactions both between wheel and rail and between the LIM and the reaction plate. A nonlinear mathematical model of the vehicle with the LIM bogie is set up using the software SIMPACK, and the electromechanical model is set up in Simulink. The running behavior of the LIM vehicle is then simulated, and the influence of the suspension on the vehicle dynamic performance is investigated.
A Cohesive Zone Approach for Fatigue-Driven Delamination Analysis in Composite Materials
NASA Astrophysics Data System (ADS)
Amiri-Rad, Ahmad; Mashayekhi, Mohammad
2017-08-01
A new model for the prediction of fatigue-driven delamination in laminated composites is proposed using cohesive interface elements. The presented model provides a link between the damage evolution rate of the cohesive elements and the crack growth rate of the Paris law. This is beneficial since no additional material parameters are required and the well-known Paris law constants are used. The link between the cohesive zone method and fracture mechanics is achieved without use of an effective length, which has led to more accurate results. The problem of the unknown failure path in the calculation of the energy release rate is solved by imposing a condition on the damage model that leads to a completely vertical failure path. A global measure of energy release rate is used for the whole cohesive zone, which is computationally more efficient compared to previous similar models. The performance of the proposed model is investigated by simulation of well-known delamination tests and comparison against experimental data from the literature.
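In generic notation (symbols assumed here, not taken from the paper), the Paris-law link the authors describe can be written as:

```latex
% Paris law written in terms of the energy release rate range \Delta G:
\[
  \frac{da}{dN} = C\,(\Delta G)^{m},
\]
% where a is the crack length, N the load-cycle count, and C, m the Paris
% constants. A fatigue cohesive-zone model enforces consistency by choosing
% the cyclic damage rate of the interface elements, dD/dN, such that the
% cohesive zone advances at the same rate:
\[
  \frac{da}{dN}\bigg|_{\text{cohesive}}
  = \frac{da}{dN}\bigg|_{\text{Paris}} .
\]
```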
Particle acceleration and transport at a 2D CME-driven shock using the HAFv3 and PATH Code
NASA Astrophysics Data System (ADS)
Li, G.; Ao, X.; Fry, C. D.; Verkhoglyadova, O. P.; Zank, G. P.
2012-12-01
We study particle acceleration at a 2D CME-driven shock and the subsequent transport in the inner heliosphere (up to 2 AU) by coupling the kinematic Hakamada-Akasofu-Fry version 3 (HAFv3) solar wind model (Hakamada and Akasofu, 1982; Fry et al., 2003) with the Particle Acceleration and Transport in the Heliosphere (PATH) model (Zank et al., 2000; Li et al., 2003, 2005; Verkhoglyadova et al., 2009). HAFv3 provides the evolution of a two-dimensional shock geometry and other plasma parameters, which are fed into the PATH model to investigate the effect of a varying shock geometry on particle acceleration and transport. The transport module of the PATH model is parallelized and utilizes state-of-the-art GPU computation techniques to achieve a rapid physics-based numerical description of interplanetary energetic particles. Together with the fast execution of the HAFv3 model, the coupled code gives us the ability to nowcast/forecast the interplanetary radiation environment.
CHEMICAL EVOLUTION LIBRARY FOR GALAXY FORMATION SIMULATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saitoh, Takayuki R., E-mail: saitoh@elsi.jp
We have developed a software library for chemical evolution simulations of galaxy formation under the simple stellar population (SSP) approximation. In this library, all of the necessary components concerning chemical evolution, such as initial mass functions, stellar lifetimes, yields from Type II and Type Ia supernovae, asymptotic giant branch stars, and neutron star mergers, are compiled from the literature. Various models are pre-implemented in this library so that users can choose their favorite combination of models. Subroutines of this library return released energy and the masses of individual elements depending on a given event type. Since the redistribution manner of these quantities depends on the implementation of users' simulation codes, this library leaves it up to the simulation code. As demonstrations, we carry out both one-zone, closed-box simulations and 3D simulations of a collapsing gas and dark matter system using this library. In these simulations, we can easily compare the impact of individual models on the chemical evolution of galaxies, just by changing the control flags and parameters of the library. Since this library only deals with the part of chemical evolution under the SSP approximation, any simulation codes that use the SSP approximation—namely, particle-based and mesh codes, as well as semianalytical models—can use it. This library is named "CELib" after the term "Chemical Evolution Library" and is made available to the community.
Numerical modeling of the Madison Dynamo Experiment.
NASA Astrophysics Data System (ADS)
Bayliss, R. A.; Wright, J. C.; Forest, C. B.; O'Connell, R.
2002-11-01
The growth, saturation, and turbulent evolution of the Madison dynamo experiment are investigated numerically using a 3-D pseudo-spectral simulation of the MHD equations; results of the simulations will be compared to results obtained from the experiment. The code, Dynamo (Fortran90), allows for full evolution of the magnetic and velocity fields. The induction equation governing B and the curl of the momentum equation governing V are solved separately or simultaneously. The code uses a spectral representation of the vector fields via spherical harmonic basis functions in longitude and latitude, and fourth-order finite differences in the radial direction. The magnetic field evolution has been benchmarked against the laminar kinematic dynamo predicted by M.L. Dudley and R.W. James (Time-dependent kinematic dynamos with stationary flows, Proc. R. Soc. Lond. A 425, p. 407, 1989). Power balance in the system has been verified in mechanically driven and perturbed hydrodynamic, kinematic, and dynamic cases. Evolution of the vacuum magnetic field has been added to facilitate comparison with the experiment. Modeling of the Madison Dynamo eXperiment will be presented.
SRMS History, Evolution and Lessons Learned
NASA Technical Reports Server (NTRS)
Jorgensen, Glenn; Bains, Elizabeth
2011-01-01
Early in the development of the Space Shuttle, it became clear that NASA needed a method of deploying and retrieving payloads from the payload bay. The Shuttle Remote Manipulator System (SRMS) was developed to fill this need. The 50-foot-long robotic arm is an anthropomorphic design consisting of three electromechanical joints, six degrees of freedom, and two boom segments. Its composite boom construction provided the lightweight solution needed for space operations. Additionally, a method of capturing payloads with the arm was required, and a unique End Effector was developed using an electromechanical snare mechanism. The SRMS is operated using a Displays and Controls Panel and hand controllers located within the aft crew compartment of the shuttle. Although the SRMS was originally conceived to deploy and retrieve payloads, its generic capabilities allowed it to perform many other functions not originally conceived of. Over the years it has been used for deploying and retrieving constrained and free-flying payloads, maneuvering and supporting EVA astronauts, satellite repair, International Space Station construction, and as a viewing aid for on-orbit International Space Station operations. After the Columbia accident, a robotically compatible Orbiter Boom Sensor System (OBSS) was developed and used in conjunction with the SRMS to scan the Thermal Protection System (TPS) of the shuttle. These scans ensure there is not a breach of the TPS prior to shuttle re-entry. Ground operations and pre-mission simulation, analysis, and planning played a major role in the success of the SRMS program. A Systems Engineering Simulator (SES) was developed to provide a utility complementary to open-loop engineering simulations. This system provided a closed-loop, real-time, pilot-driven simulation giving visual feedback, display and control panel interaction, and integration with other vehicle systems, such as GN&C. It has been useful for many more applications than traditional training. Evolution of the simulations, guided by the Math Model Working Group, showed the utility of input from multiple modeling groups with a structured forum for discussion. There were many unique development challenges in the areas of hardware, software, certification, modeling, and simulation. Over the years, upgrades and enhancements were implemented to increase the capability, performance, and safety of the SRMS. The history and evolution of the SRMS program provided many lessons learned that can be used for future space robotic systems.
NASA Astrophysics Data System (ADS)
Morse, P. E.; Reading, A. M.; Lueg, C.
2014-12-01
Pattern recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of, and interaction with, data visualization software can augment, select, interrupt, and modify computational routines and facilitate processes of pattern and significant-feature recognition for subsequent human analysis, machine learning, expert, and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates human-computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the Model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g., chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HIDs). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: 1] how to visually animate data over time, 2] how to rapidly deploy unconventional parametrically driven data visualisations, and 3] how to construct and explore novel interaction models that capture the activity of the end user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may themselves be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural and physical sciences. Fig. 1: Wave height data from an oceanographic Wave Rider Buoy; colors/radii are driven by wave height data.
Independent evolution of the sexes promotes amphibian diversification.
De Lisle, Stephen P; Rowe, Locke
2015-03-22
Classic ecological theory predicts that the evolution of sexual dimorphism constrains diversification by limiting the morphospace available for speciation. Alternatively, sexual selection may lead to the evolution of reproductive isolation and increased diversification. We test the contrasting predictions of these hypotheses by examining the relationship between sexual dimorphism and diversification in amphibians. Our analysis shows that the evolution of sexual size dimorphism (SSD) is associated with increased diversification and speciation, contrary to the ecological theory. Further, this result is unlikely to be explained by traditional sexual selection models, because variation in amphibian SSD is unlikely to be driven entirely by sexual selection. We suggest that relaxing a central assumption of classic ecological models, namely that the sexes share a common adaptive landscape, leads to the alternative hypothesis that independent evolution of the sexes may promote diversification. Once the constraints of sexual conflict are relaxed, the sexes can explore morphospace that would otherwise be inaccessible. Consistent with this novel hypothesis, the evolution of SSD in amphibians is associated with reduced current extinction threat status, and with an historical reduction in extinction rate. Our work reconciles conflicting predictions from ecological and evolutionary theory and illustrates that the ability of the sexes to evolve independently is associated with a spectacular vertebrate radiation.
NASA Astrophysics Data System (ADS)
Robinet, A.; Castelle, B.; Idier, D.; Le Cozannet, G.; Déqué, M.; Charles, E.
2016-12-01
Modeling studies addressing daily to interannual coastal evolution typically relate shoreline change to waves, currents, and sediment transport through complex processes and feedbacks. For wave-dominated environments, the main driver (waves) is controlled by the regional atmospheric circulation. Here a simple weather regime-driven shoreline model is developed for a 15-year shoreline dataset (2000-2014) collected at Truc Vert beach, Bay of Biscay, SW France. In all, 16 weather regimes (four per season) are considered. The centroids and occurrences are computed using the ERA-40 and ERA-Interim reanalyses, applying k-means and EOF methods to the anomalies of the 500-hPa geopotential height over the North Atlantic Basin. The weather regime-driven shoreline model explains 70% of the observed interannual shoreline variability. The application of a proven wave-driven equilibrium shoreline model to the same period shows that both models have similar skill at the interannual scale. The relation between the weather regimes and the wave climate in the Bay of Biscay is investigated, and the primary weather regimes impacting shoreline change are identified. For instance, the winter zonal regime, characterized by a strengthening of the pressure gradient between the Iceland low and the Azores high, is associated with high-energy wave conditions and is found to drive an increase in the shoreline erosion rate. The study demonstrates the predictability of interannual shoreline change from a limited number of weather regimes, which opens new perspectives for shoreline change modeling and encourages long-term shoreline monitoring programs.
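As a generic illustration of the regime-classification step (EOF truncation followed by k-means, as named in the abstract), here is a minimal Python sketch with random placeholder data standing in for the reanalysis fields; in the study this is done season by season, four regimes each:

```python
# Sketch: classify daily 500-hPa geopotential height anomaly maps into
# weather regimes via EOF truncation (PCA) followed by k-means.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

n_days, n_grid = 3000, 40 * 60                 # days x (lat*lon grid points)
z500_anom = np.random.randn(n_days, n_grid)    # placeholder anomaly fields

pca = PCA(n_components=10).fit(z500_anom)      # EOF truncation
pcs = pca.transform(z500_anom)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pcs)

regimes = km.labels_                           # regime index for each day
occurrence = np.bincount(regimes) / n_days
print("regime occurrence frequencies:", occurrence)
# km.cluster_centers_ are the regime centroids in EOF space; projecting
# them back through pca.components_ recovers the regime anomaly maps.
```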
Observation-Driven Configuration of Complex Software Systems
NASA Astrophysics Data System (ADS)
Sage, Aled
2010-06-01
The ever-increasing complexity of software systems makes them hard to comprehend, predict, and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
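As a toy illustration of applying a Taguchi-style orthogonal array to software configuration (the factors, levels, and response values below are invented, not DC-Directory's):

```python
# Sketch: an L4(2^3) orthogonal array over three two-level configuration
# factors, with main-effect estimation from a measured response (e.g. mean
# request latency). Factor names and measurements are hypothetical.
import numpy as np

L4 = np.array([[0, 0, 0],        # each row: one experiment run
               [0, 1, 1],        # columns: cache_size, thread_pool, sync_mode
               [1, 0, 1],
               [1, 1, 0]])
factors = ["cache_size", "thread_pool", "sync_mode"]
latency = np.array([41.0, 35.5, 38.2, 30.9])   # measured response per run

for j, name in enumerate(factors):
    lo = latency[L4[:, j] == 0].mean()
    hi = latency[L4[:, j] == 1].mean()
    print(f"{name}: level-0 mean {lo:.1f}, level-1 mean {hi:.1f}, "
          f"effect {hi - lo:+.1f}")
# The factor with the largest |effect| is the most promising tuning knob;
# a full factorial would need 8 runs, the orthogonal array only 4.
```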
QSAR and 3D-QSAR studies applied to compounds with anticonvulsant activity.
Garro Martinez, Juan C; Vega-Hissi, Esteban G; Andrada, Matías F; Estrada, Mario R
2015-01-01
Quantitative structure-activity relationships (QSAR and 3D-QSAR) have been applied over the last decade to obtain reliable statistical models for the prediction of the anticonvulsant activities of new chemical entities. However, despite the large amount of information on QSAR, no recent review has presented and discussed these data in detail. In this review, the authors provide a detailed discussion of QSAR studies applied to compounds with anticonvulsant activity published between 2003 and 2013. They also evaluate the mathematical approaches and the main software used to develop QSAR and 3D-QSAR models. QSAR methodologies continue to attract the attention of researchers and provide valuable information for the development of new potentially active compounds, including those with anticonvulsant activity. This has been helped in part by improvements in the size and performance of computers, the development of specific software, and the development of novel molecular descriptors, which have given rise to new and more predictive QSAR models. The extensive development of descriptors, and of the ways in which descriptor values are derived, has allowed the evolution of QSAR methods. This evolution could strengthen QSAR methods as an important tool in the research and development of new and more potent anticonvulsant agents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weis, M. R.; Zhang, P.; Lau, Y. Y., E-mail: yylau@umich.edu
2014-12-15
Using the ideal magnetohydrodynamic model, we calculate the temporal evolution of initial ripples on the boundaries of a planar plasma slab that is subjected to the magneto-Rayleigh-Taylor instability. The plasma slab consists of three regions. We assume that in each region the plasma density is constant with an arbitrary value and the magnetic field is also constant with an arbitrary magnitude and an arbitrary direction parallel to the interfaces. Thus, the instability may be driven by a combination of magnetic pressure and kinetic pressure. The general dispersion relation is derived, together with the feedthrough factor between the two interfaces. The temporal evolution is constructed from the superposition of the eigenmodes. Previously established results are recovered in the various limits. Numerical examples are given on the temporal evolution of ripples on the interfaces of the finite plasma slab.
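For orientation, a well-known special case (not the paper's three-region result) is the classical magneto-Rayleigh-Taylor dispersion relation for two semi-infinite plasmas:

```latex
% Classical MRT growth rate for two semi-infinite, incompressible plasmas
% (densities \rho_1 < \rho_2, uniform field B parallel to the interface,
% perturbation wavevector k, gravity/effective acceleration g):
\[
  \gamma^2 \;=\; g k \,\frac{\rho_2 - \rho_1}{\rho_2 + \rho_1}
  \;-\; \frac{2\,(\mathbf{k}\cdot\mathbf{B})^2}{\mu_0\,(\rho_1 + \rho_2)} ,
\]
% so magnetic tension stabilizes short wavelengths whenever k is not
% perpendicular to B; the finite-slab dispersion relation generalizes this
% and adds the feedthrough coupling between the two interfaces.
```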
How Complex, Probable, and Predictable is Genetically Driven Red Queen Chaos?
Duarte, Jorge; Rodrigues, Carla; Januário, Cristina; Martins, Nuno; Sardanyés, Josep
2015-12-01
Coevolution between two antagonistic species has been widely studied theoretically for both ecologically and genetically driven Red Queen dynamics. A typical outcome of these systems is an oscillatory behavior causing an endless series of one species' adaptation and the other's counter-adaptation. More recently, a mathematical model combining a three-species food chain system with an adaptive dynamics approach revealed genetically driven chaotic Red Queen coevolution. In the present article, we analyze this mathematical model, mainly focusing on the impact of the species' rates of evolution (mutation rates) on the dynamics. First, we analytically prove the boundedness of the trajectories of the chaotic attractor. The complexity of the coupling between the dynamical variables is quantified using observability indices. Using symbolic dynamics theory, we quantify the complexity of genetically driven Red Queen chaos by computing the topological entropy of existing one-dimensional iterated maps using Markov partitions. Codimension-two bifurcation diagrams are also built from the period ordering of the orbits of the maps. We then study the predictability of the Red Queen chaos, found in narrow regions of mutation rates. To extend the previous analyses, we also computed the likelihood of finding chaos in a given region of the parameter space while varying other model parameters simultaneously. Such analyses allowed us to compute a mean predictability measure for the system in the explored region of the parameter space. We found that genetically driven Red Queen chaos, although restricted to small regions of the analyzed parameter space, can be highly unpredictable.
Adaptive Multiscale Modeling of Geochemical Impacts on Fracture Evolution
NASA Astrophysics Data System (ADS)
Molins, S.; Trebotich, D.; Steefel, C. I.; Deng, H.
2016-12-01
Understanding fracture evolution is essential for many subsurface energy applications, including subsurface storage, shale gas production, fracking, CO2 sequestration, and geothermal energy extraction. Geochemical processes in particular play a significant role in the evolution of fractures through dissolution-driven widening, fines migration, and/or fracture sealing due to precipitation. One obstacle to understanding and exploiting geochemical fracture evolution is that it is a multiscale process. However, current geochemical modeling of fractures cannot capture this multiscale nature of geochemical and mechanical impacts on fracture evolution, and is limited to either a continuum or a pore-scale representation. Conventional continuum-scale models treat fractures as preferential flow paths, with their permeability evolving as a function (often, a cubic law) of the fracture aperture. This approach has the limitation that it oversimplifies flow within the fracture, omitting pore-scale effects while also assuming well-mixed conditions. More recently, pore-scale models along with advanced characterization techniques have allowed for accurate simulations of flow and reactive transport within the pore space (Molins et al., 2014, 2015). However, these models, even with high performance computing, are currently limited in their ability to treat tractable domain sizes (Steefel et al., 2013). Thus, there is a critical need to develop an adaptive modeling capability that can account for separate properties and processes, emergent and otherwise, in the fracture and the rock matrix at different spatial scales. Here we present an adaptive modeling capability that treats geochemical impacts on fracture evolution within a single multiscale framework. Model development makes use of the high performance simulation capability, Chombo-Crunch, leveraged by high resolution characterization and experiments. The modeling framework is based on the adaptive capability in Chombo, which not only enables mesh refinement but also refinement of the model (pore scale or continuum Darcy scale) in a dynamic way, such that the appropriate model is used only when and where it is needed. Explicit flux matching provides coupling between the scales.
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
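CVNetica drives Netica's BN engine specifically; as a language-level illustration of the underlying idea (held-out skill versus model complexity), here is a Python sketch using scikit-learn as a stand-in, not the Netica or CVNetica API:

```python
# Sketch: k-fold cross-validation to detect overfitting as model complexity
# grows. Here "complexity" is a decision tree's depth, standing in for the
# level of discretization of a Bayesian network.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=8, noise=10.0,
                       random_state=0)
for depth in (2, 4, 8, 16):
    model = DecisionTreeRegressor(max_depth=depth, random_state=0)
    skill = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"max_depth={depth:2d}: mean held-out R^2 = {skill.mean():.3f}")
# Held-out skill that peaks and then declines with complexity is the
# signature of overfitting that cross-validation is designed to expose.
```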
Improving the Transparency of IAEA Safeguards Reporting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toomey, Christopher; Hayman, Aaron M.; Wyse, Evan T.
2011-07-17
In 2008, the Standing Advisory Group on Safeguards Implementation (SAGSI) indicated that the International Atomic Energy Agency's (IAEA) Safeguards Implementation Report (SIR) has not kept pace with the evolution of safeguards, and provided the IAEA with a set of recommendations for improvement. The SIR is the primary mechanism for providing an overview of safeguards implementation in a given year and reporting on the annual safeguards findings and conclusions drawn by the Secretariat. As the IAEA transitions to State-level safeguards approaches, SIR reporting must adapt to reflect these evolutionary changes. This evolved report will better reflect the IAEA's transition to a more qualitative and information-driven approach, based upon State-as-a-whole considerations. This paper applies SAGSI's recommendations to the development of multiple models for an evolved SIR and finds that an SIR repurposed as a 'safeguards portal' could significantly enhance information delivery, clarity, and transparency. In addition, this paper finds that the 'portal concept' also appears to have value as a standardized information presentation and analysis platform for use by Country Officers, for continuity-of-knowledge purposes, and by the IAEA Secretariat in the safeguards conclusion process. Accompanying this paper is a fully functional prototype of the 'portal' concept, built using commercial software and IAEA Annual Report data.
Exploring galaxy evolution with latent space walks
NASA Astrophysics Data System (ADS)
Schawinski, Kevin; Turp, Dennis; Zhang, Ce
2018-01-01
We present a new approach that uses artificial intelligence to perform data-driven forward modeling of astrophysical phenomena. We describe how a variational autoencoder can be used to encode galaxies into a latent space, independently manipulate properties such as the specific star formation rate, and return them to real space. Such transformations can be used to forward-model phenomena using data as the only constraints. We demonstrate the utility of this approach on the question of the quenching of star formation in galaxies.
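The paper's networks are not reproduced here; the following self-contained numpy sketch illustrates the latent-space attribute walk the abstract describes, with linear stand-ins for a trained encoder/decoder and random placeholders for the galaxy data:

```python
# Sketch: move a sample along an attribute direction in latent space.
# encode/decode are linear stand-ins for a trained VAE encoder/decoder.
import numpy as np

rng = np.random.default_rng(0)
D, d = 100, 8                        # data dim (e.g. pixels), latent dim
W = rng.standard_normal((d, D))      # "encoder" weights (placeholder)
Wd = np.linalg.pinv(W)               # "decoder" as pseudo-inverse

encode = lambda x: W @ x
decode = lambda z: Wd @ z

# attribute direction: difference of latent means of two labelled groups,
# e.g. high- vs low-specific-star-formation-rate galaxies
z_high = encode(rng.standard_normal((D, 50))).mean(axis=1)
z_low  = encode(rng.standard_normal((D, 50))).mean(axis=1)
direction = z_high - z_low

x = rng.standard_normal(D)           # one galaxy image (placeholder)
for alpha in (0.0, 0.5, 1.0):        # walk along the attribute direction
    x_mod = decode(encode(x) + alpha * direction)
    print(f"alpha={alpha}: first pixels {np.round(x_mod[:3], 2)}")
```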
NASA Astrophysics Data System (ADS)
Ulmer, Christopher J.; Motta, Arthur T.
2017-11-01
The development of TEM-visible damage in materials under irradiation at cryogenic temperatures cannot be explained using classical rate theory modeling with thermally activated reactions, since at low temperatures thermal reaction rates are too low. Although point defect mobility approaches zero at low temperature, the thermal spikes induced by displacement cascades enable some atom mobility as each spike cools. In this work a model is developed to calculate "athermal" reaction rates from the atomic mobility within the irradiation-induced thermal spikes, including both displacement cascades and electronic stopping. The athermal reaction rates are added to a simple rate theory cluster dynamics model to allow the simulation of microstructure evolution during irradiation at cryogenic temperatures. The rate theory model is applied to the in-situ irradiation of ZrC and compares well with observations at cryogenic temperatures. The results show that the addition of the thermal spike model makes it possible to rationalize microstructure evolution in the low-temperature regime.
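As a minimal sketch of rate-theory cluster dynamics with an added athermal (spike-driven) reaction channel, in Python with invented rate constants (not fitted to ZrC):

```python
# Sketch: two-population cluster dynamics (vacancies v, interstitials i)
# with generation G, athermal recombination driven by thermal spikes, and
# loss to sinks. All constants are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

G = 1e-4        # defect generation rate
k_ath = 1e2     # athermal recombination coefficient (cascade mixing)
k_sink = 1e-1   # sink strength term

def rhs(t, y):
    v, i = y
    recomb = k_ath * v * i           # athermal: active even at cryogenic T
    return [G - recomb - k_sink * v,
            G - recomb - k_sink * i]

sol = solve_ivp(rhs, (0.0, 1e4), [0.0, 0.0], rtol=1e-8)
print("steady-state v, i ~", sol.y[:, -1])
```

The point of the construction is that the v*i recombination term does not require thermally activated migration, so defect populations still evolve when the thermally activated rates are frozen out.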
NASA Astrophysics Data System (ADS)
Kumar, Ashish; Dasgupta, Dwaipayan; Maroudas, Dimitrios
We report a systematic study of complex pattern formation resulting from the driven dynamics of single-layer homoepitaxial islands on face-centered cubic (FCC) crystalline conducting substrate surfaces under the action of an externally applied electric field. The analysis is based on an experimentally validated nonlinear model of mass transport via island edge atomic diffusion, which also accounts for edge diffusional anisotropy. We analyze the morphological stability and simulate the field-driven evolution of rounded islands for an electric field oriented along the fast diffusion direction. For larger than critical island sizes on {110} and {100} FCC substrates, we show that multiple necking instabilities generate complex island patterns, including void-containing islands, mediated by sequences of breakup and coalescence events and distributed symmetrically with respect to the electric field direction. We analyze the dependence of the formed patterns on the original island size and on the duration of application of the external field. Starting from a single large rounded island, we characterize the evolution of the number of daughter islands and their average size and uniformity. The analysis reveals that the pattern formation kinetics follows a universal scaling relation. Work supported by the Division of Materials Sciences & Engineering, Office of Basic Energy Sciences, U.S. Department of Energy (Award No. DE-FG02-07ER46407).
Purely Dry Mergers do not Explain the Observed Evolution of Massive Early-type Galaxies since z ~ 1
NASA Astrophysics Data System (ADS)
Sonnenfeld, Alessandro; Nipoti, Carlo; Treu, Tommaso
2014-05-01
Several studies have suggested that the observed size evolution of massive early-type galaxies (ETGs) can be explained as a combination of dry mergers and progenitor bias, at least since z ~ 1. In this paper we carry out a new test of the dry-merger scenario based on recent lensing measurements of the evolution of the mass density profile of ETGs. We construct a theoretical model for the joint evolution of the size and the mass density profile slope γ' driven by dry mergers occurring at rates given by cosmological simulations. Such a dry-merger model predicts a strong decrease of γ' with cosmic time, inconsistent with the almost constant γ' inferred from observations in the redshift range 0 < z < 1. We then show with a simple toy model that a modest amount of cold gas in the mergers—consistent with the upper limits on recent star formation in ETGs—is sufficient to reconcile the model with measurements of γ'. By fitting for the amount of gas accreted during mergers, we find that models with dissipation are consistent with observations of the evolution in both size and density slope, if ~4% of the total final stellar mass arises from the gas accreted since z ~ 1. Purely dry merger models are ruled out at >99% CL. We thus suggest a scenario where the outer regions of massive ETGs grow by accretion of stars and dark matter, while small amounts of dissipation and nuclear star formation conspire to keep the mass density profile constant and approximately isothermal.
Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.
2015-01-01
The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined the evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether the evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, the evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530
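For readers unfamiliar with the approach, a toy Markov-chain simulation of crown-height evolution might look like the sketch below; the discrete states, the absorbing hypselodont class, and all transition probabilities are invented and are not the authors' fitted model.

```python
# Toy Markov-chain sketch: lineages step between discrete crown-height
# classes, with hypselodonty absorbing. Probabilities are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
states = ["brachydont", "mesodont", "hypsodont", "hypselodont"]
# Row-stochastic transition matrix per time step (invented values).
P = np.array([
    [0.95, 0.05, 0.00, 0.00],
    [0.02, 0.93, 0.05, 0.00],
    [0.00, 0.02, 0.93, 0.05],
    [0.00, 0.00, 0.00, 1.00],  # continuous growth, once gained, is retained
])

def simulate(n_lineages=1000, n_steps=48):
    """Evolve lineages for n_steps (e.g. 1-Myr steps across the Cenozoic)."""
    state = np.zeros(n_lineages, dtype=int)  # all start low-crowned
    for _ in range(n_steps):
        for i in range(n_lineages):
            state[i] = rng.choice(4, p=P[state[i]])
    return np.bincount(state, minlength=4) / n_lineages

print(dict(zip(states, simulate())))  # fraction of lineages per class
```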
First steps of processing VLBI data of space probes with VieVS
NASA Astrophysics Data System (ADS)
Plank, L.; Böhm, J.; Schuh, H.
2011-07-01
Since 2008 the VLBI group at the Institute of Geodesy and Geophysics (IGG) of the Vienna University of Technology has been developing the Vienna VLBI Software VieVS, which is capable of processing geodetic VLBI data in NGS format. We are constantly upgrading the software, e.g. by developing a scheduling tool or extending it from single-session solutions to a so-called global solution, allowing the joint analysis of many sessions covering several years. In this presentation we report on first steps to enable the processing of space VLBI data with the software. Driven by the recently increasing number of space VLBI applications, our goal is the geodetic usage of such data, primarily concerning frame ties between various reference frames, e.g. by connecting the dynamic reference frame of a space probe with the kinematically defined International Celestial Reference Frame (ICRF). The main extensions with respect to the existing VieVS are the treatment of fast-moving targets, the implementation of a delay model for radio emitters at finite distances, and the adequate mathematical model and adjustment of the particular unknowns. So far, work has been done for two mission scenarios: on the one hand, differential VLBI (D-VLBI) data from the two sub-satellites of the Japanese lunar mission SELENE were processed; on the other hand, VLBI observations of GNSS satellites were modelled in VieVS. Besides some general aspects, we give details on the calculation of the theoretical delay (delay model for moving sources at finite distances) and its realization in VieVS. First results with real data and comparisons with best-fit mission orbit data are also presented.
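For context, the essential departure from the standard far-field VLBI model is that a source at finite distance produces a curved wavefront, so the delay is the light-travel-time difference between the stations rather than a plane-wave projection. A leading-order vacuum form, omitting the relativistic and propagation corrections a real model must include, is:

```latex
% Leading-order geometric delay for a radio source at finite distance
% (vacuum case only; operational models add relativistic and propagation
% corrections). Source position \vec{s}, station positions \vec{x}_1, \vec{x}_2:
\tau = \frac{\lvert \vec{x}_2 - \vec{s} \rvert - \lvert \vec{x}_1 - \vec{s} \rvert}{c}
```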
Bhaskar, Anand; Javanmard, Adel; Courtade, Thomas A; Tse, David
2017-03-15
Genetic variation in human populations is influenced by geographic ancestry due to spatial locality in historical mating and migration patterns. Spatial population structure in genetic datasets has been traditionally analyzed using either model-free algorithms, such as principal components analysis (PCA) and multidimensional scaling, or using explicit spatial probabilistic models of allele frequency evolution. We develop a general probabilistic model and an associated inference algorithm that unify the model-based and data-driven approaches to visualizing and inferring population structure. Our spatial inference algorithm can also be effectively applied to the problem of population stratification in genome-wide association studies (GWAS), where hidden population structure can create fictitious associations when population ancestry is correlated with both the genotype and the trait. Our algorithm Geographic Ancestry Positioning (GAP) relates local genetic distances between samples to their spatial distances, and can be used for visually discerning population structure as well as accurately inferring the spatial origin of individuals on a two-dimensional continuum. On both simulated and several real datasets from diverse human populations, GAP exhibits substantially lower error in reconstructing spatial ancestry coordinates compared to PCA. We also develop an association test that uses the ancestry coordinates inferred by GAP to accurately account for ancestry-induced correlations in GWAS. Based on simulations and analysis of a dataset of 10 metabolic traits measured in a Northern Finland cohort, which is known to exhibit significant population structure, we find that our method has superior power to current approaches. Our software is available at https://github.com/anand-bhaskar/gap.
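The core idea of relating genetic to spatial distances can be illustrated with a toy embedding. The sketch below applies classical multidimensional scaling to a hypothetical genetic distance matrix; it is not the GAP algorithm itself, which is built on an explicit probabilistic model.

```python
# Toy sketch: embed samples in 2D so genetic distances approximate spatial
# distances, via classical (Torgerson) MDS. The distance matrix is invented.
import numpy as np

def classical_mds(d, k=2):
    """Embed a symmetric distance matrix d into k dimensions."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    idx = np.argsort(vals)[::-1][:k]         # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Hypothetical genetic distance matrix for 4 samples (symmetric, zero diagonal).
d_gen = np.array([[0.0, 0.2, 0.9, 1.0],
                  [0.2, 0.0, 0.8, 0.9],
                  [0.9, 0.8, 0.0, 0.3],
                  [1.0, 0.9, 0.3, 0.0]])
print(classical_mds(d_gen))  # inferred 2D "ancestry coordinates"
```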
NASA Astrophysics Data System (ADS)
Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.
2016-12-01
As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the geological candidate formations rock salt, claystone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large-scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences on the elastic response as well as on primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in claystone. These processes are also considered in the near-field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, through verification and validation, to repository-scale application simulations using methods of high performance computing.
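As one concrete example of the temperature-dependent creep behaviour mentioned, a standard Norton-type power-law relation, commonly used for secondary creep of rock salt, is shown below; the formulation actually implemented in OpenGeoSys may differ.

```latex
% Norton-type power-law creep with Arrhenius temperature dependence
% (a common choice for secondary creep of rock salt; shown for context,
% not necessarily the paper's exact visco-plastic formulation):
\dot{\varepsilon}_{\mathrm{cr}} = A\,\sigma_{\mathrm{eq}}^{\,n}\,
  \exp\!\left(-\frac{Q}{R\,T}\right)
```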
Using Real and Simulated TNOs to Constrain the Outer Solar System
NASA Astrophysics Data System (ADS)
Kaib, Nathan
2018-04-01
Over the past 2-3 decades our understanding of the outer solar system’s history and current state has evolved dramatically. An explosion in the number of detected trans-Neptunian objects (TNOs) coupled with simultaneous advances in numerical models of orbital dynamics has driven this rapid evolution. However, successfully constraining the orbital architecture and evolution of the outer solar system requires accurately comparing simulation results with observational datasets. This process is challenging because observed datasets are influenced by orbital discovery biases as well as TNO size and albedo distributions. Meanwhile, such influences are generally absent from numerical results. Here I will review recent work I and others have undertaken using numerical simulations in concert with catalogs of observed TNOs to constrain the outer solar system’s current orbital architecture and past evolution.
ERIC Educational Resources Information Center
Ortega, Ryan A.; Brame, Cynthia J.
2015-01-01
Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…
Yates, John R
2015-11-01
Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.
Evolution of Secondary Phases Formed upon Solidification of a Ni-Based Alloy
NASA Astrophysics Data System (ADS)
Zuo, Qiang; Liu, Feng; Wang, Lei; Chen, Changfeng
2013-07-01
The solidification of UNS N08028 alloy subjected to different cooling rates was studied; primary austenite dendrites form predominantly, and different amounts of sigma phase form in the interdendritic regions. The solidification path and elemental segregation upon solidification were simulated using the CALPHAD method, employing the THERMO-CALC software package and two classical segregation models to approximate the real process. It is thus revealed that the interdendritic sigma phase forms via a eutectic reaction at the last stage of solidification. On this basis, an analytical model was developed to predict the evolution of the nonequilibrium eutectic phase, while the isolated morphology of the sigma phase can be described using divorced eutectic theory. The size, fraction, and morphology of the sigma phase were quantitatively studied in a series of experiments; the results are in good agreement with the model prediction.
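The abstract does not name the two classical segregation models; the lever rule and the Scheil-Gulliver equation are the usual pair. The latter, assuming no diffusion in the solid and complete mixing in the liquid, reads:

```latex
% Scheil-Gulliver microsegregation relation (one of the two classical limits,
% alongside the lever rule); k is the partition coefficient, C_0 the nominal
% composition, f_s the solid fraction:
C_s = k\, C_0\, (1 - f_s)^{\,k-1}
```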
Spatial evolutionary epidemiology of spreading epidemics
Lion, S; Gandon, S
2016-10-26
Most spatial models of host-parasite interactions either neglect the possibility of pathogen evolution or consider that this process is slow enough for epidemiological dynamics to reach an equilibrium on a fast timescale. Here, we propose a novel approach to jointly model the epidemiological and evolutionary dynamics of spatially structured host and pathogen populations. Starting from a multi-strain epidemiological model, we use a combination of spatial moment equations and quantitative genetics to analyse the dynamics of mean transmission and virulence in the population. A key insight of our approach is that, even in the absence of long-term evolutionary consequences, spatial structure can affect the short-term evolution of pathogens because of the build-up of spatial differentiation in mean virulence. We show that spatial differentiation is driven by a balance between epidemiological and genetic effects, and this quantity is related to the effect of kin competition discussed in previous studies of parasite evolution in spatially structured host populations. Our analysis can be used to understand and predict the transient evolutionary dynamics of pathogens and the emergence of spatial patterns of phenotypic variation. PMID:27798295
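The quantitative-genetics step in such analyses typically builds on a Price-equation-style relation for the change of a mean trait; a generic, non-spatial form is shown below, the paper's spatial moment equations extending it with spatial covariance terms not reproduced here.

```latex
% Generic (non-spatial) Price-equation form for the change of a mean trait
% \bar{z}, with w the per-capita growth rate; shown only to indicate the
% kind of relation the quantitative-genetics machinery builds on:
\frac{d\bar{z}}{dt} = \operatorname{Cov}(w, z)
```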
Modelling opinion formation driven communities in social networks
NASA Astrophysics Data System (ADS)
Iñiguez, Gerardo; Barrio, Rafael A.; Kertész, János; Kaski, Kimmo K.
2011-09-01
In a previous paper we proposed a model to study the dynamics of opinion formation in human societies by a co-evolution process involving two distinct time scales of fast transaction and slower network evolution dynamics. In the transaction dynamics we take into account short-range interactions as discussions between individuals and long-range interactions to describe the attitude to the overall mood of society. The latter is handled by a uniformly distributed parameter α, assigned randomly to each individual as a quenched personal bias. The network evolution dynamics is realised by rewiring the societal network due to state variable changes resulting from the transaction dynamics. The main consequence of this complex dynamics is that communities emerge in the social network for a range of values of the ratio between the time scales. In this paper we focus our attention on the attitude parameter α and its influence on the formation of opinions and the size of the resulting communities. We present numerical studies and extract interesting features of the model that can be interpreted in terms of social behaviour.
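A toy implementation conveying the two-timescale structure described above might look like the following sketch; the update rules, parameter values, and rewiring criterion are invented for illustration and do not reproduce the authors' model.

```python
# Toy two-timescale sketch: fast opinion updates with a personal bias alpha
# toward the overall mood, plus slower rewiring of discordant links.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n, g = 100, nx.erdos_renyi_graph(100, 0.05, seed=1)
x = rng.uniform(-1, 1, n)        # opinion state variables
alpha = rng.uniform(0, 1, n)     # quenched personal bias toward societal mood

def fast_step(dt=0.1):
    """Short-range neighbour coupling plus long-range coupling to the mean."""
    mood = x.mean()
    for i in range(n):
        nbrs = list(g.neighbors(i))
        local = np.mean(x[nbrs]) - x[i] if nbrs else 0.0
        x[i] = np.clip(x[i] + dt * (local + alpha[i] * (mood - x[i])), -1, 1)

def slow_rewire(k=5):
    """Cut a few maximally discordant links and add random replacements."""
    edges = sorted(g.edges, key=lambda e: -abs(x[e[0]] - x[e[1]]))[:k]
    g.remove_edges_from(edges)
    for _ in edges:
        u, v = rng.choice(n, 2, replace=False)
        g.add_edge(u, v)

for step in range(1000):
    fast_step()
    if step % 20 == 0:           # ratio of timescales controls community size
        slow_rewire()
print(nx.number_connected_components(g))
```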
The scatter and evolution of the global hot gas properties of simulated galaxy cluster populations
NASA Astrophysics Data System (ADS)
Le Brun, Amandine M. C.; McCarthy, Ian G.; Schaye, Joop; Ponman, Trevor J.
2017-04-01
We use the cosmo-OverWhelmingly Large Simulation (cosmo-OWLS) suite of cosmological hydrodynamical simulations to investigate the scatter and evolution of the global hot gas properties of large simulated populations of galaxy groups and clusters. Our aim is to compare the predictions of different physical models and to explore the extent to which commonly adopted assumptions in observational analyses (e.g. self-similar evolution) are violated. We examine the relations between (true) halo mass and the X-ray temperature, X-ray luminosity, gas mass, Sunyaev-Zel'dovich (SZ) flux, the X-ray analogue of the SZ flux (YX) and the hydrostatic mass. For the most realistic models, which include active galactic nuclei (AGN) feedback, the slopes of the various mass-observable relations deviate substantially from the self-similar ones, particularly at late times and for low-mass clusters. The amplitude of the mass-temperature relation shows negative evolution with respect to the self-similar prediction (i.e. slower than the prediction) for all models, driven by an increase in non-thermal pressure support at higher redshifts. The AGN models predict strong positive evolution of the gas mass fractions at low halo masses. The SZ flux and YX show positive evolution with respect to self-similarity at low mass but negative evolution at high mass. The scatter about the relations is well approximated by log-normal distributions, with widths that depend mildly on halo mass. The scatter decreases significantly with increasing redshift. The exception is the hydrostatic mass-halo mass relation, for which the scatter increases with redshift. Finally, we discuss the relative merits of various hot gas-based mass proxies.
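For reference, the self-similar expectations against which such simulated relations are commonly compared take the standard forms below (one common convention, with E(z) = H(z)/H_0 and M_Δ the mass within overdensity Δ):

```latex
% Standard self-similar scalings (one common convention; the bolometric
% X-ray luminosity form assumes pure bremsstrahlung):
k_B T \propto \left[M_\Delta\, E(z)\right]^{2/3},
\qquad
L_X^{\mathrm{bol}} \propto M_\Delta^{4/3}\, E(z)^{7/3}
```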
Ha, Minsu; Nehm, Ross H.; Urban-Lurain, Mark; Merrill, John E.
2011-01-01
Our study explored the prospects and limitations of using machine-learning software to score introductory biology students' written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors' and nonmajors' written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of "near-perfect" agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations. PMID:22135372
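The agreement benchmark used in the study is straightforward to compute; the sketch below evaluates Cohen's kappa between hypothetical human and machine binary concept scores (the scores are invented for illustration).

```python
# Minimal sketch of the agreement statistic: Cohen's kappa between
# human- and computer-generated presence/absence concept scores.
from sklearn.metrics import cohen_kappa_score

human = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]    # expert: concept present/absent
machine = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]  # model prediction per response

kappa = cohen_kappa_score(human, machine)
print(f"kappa = {kappa:.2f}")  # kappa > 0.80 was the 'near-perfect' benchmark
```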
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, William; Bauer, Greg; Bates, Brian; Williamson, Cathleen
2017-04-01
Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, these publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team adopted in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.
Hardware Evolution of Closed-Loop Controller Designs
NASA Technical Reports Server (NTRS)
Gwaltney, David; Ferguson, Ian
2002-01-01
The poster presentation will outline ongoing efforts at NASA MSFC to employ various Evolvable Hardware experimental platforms in the evolution of digital and analog circuitry for application to automatic control. It will include information concerning the application of commercially available hardware and software, along with the use of the JPL-developed FPTA2 integrated circuit and supporting JPL-developed software. Results to date will be presented.
Computer Aided Teaching of Digital Signal Processing.
ERIC Educational Resources Information Center
Castro, Ian P.
1990-01-01
Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…
A new practice-driven approach to develop software in a cyber-physical system environment
NASA Astrophysics Data System (ADS)
Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei
2016-02-01
A cyber-physical system (CPS) is an emerging kind of system that cannot work efficiently without proper software handling of its data and business logic; software and middleware are the soul of a CPS. Software development for CPS is a critical issue because of the complexity of large-scale realistic systems. Furthermore, the object-oriented approach (OOA) often used to develop CPS software needs some improvements to match the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It originates from practice and has evolved within software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way, and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference is that the proposed approach places different emphases and measures on every stage, which better suits the characteristics of event-driven CPS. In CPS software development, one should focus on events more than on functions or objects. A case study of a smart home system is designed to demonstrate the effectiveness of the approach, which also proves easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.
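A minimal sketch of the event-oriented emphasis the approach advocates, in the spirit of the smart home case study, might look like the following; the event names and handlers are hypothetical.

```python
# Tiny publish/subscribe dispatcher: CPS components react to named events.
# Event names and handlers below are hypothetical illustrations.
from collections import defaultdict
from typing import Callable

class EventBus:
    """Route published events to all subscribed handlers."""
    def __init__(self):
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, **payload) -> None:
        for handler in self._handlers[event]:
            handler(**payload)

bus = EventBus()
# Requirement analysis in an event-oriented way: enumerate events first,
# then attach the business logic (detailed design) as handlers.
bus.subscribe("motion_detected", lambda room: print(f"lights on in {room}"))
bus.subscribe("temperature_high", lambda value: print(f"cooling: {value} C"))

bus.publish("motion_detected", room="kitchen")
bus.publish("temperature_high", value=29.5)
```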
Evolution of jets driven by relativistic radiation hydrodynamics as Long and Low Luminosity GRBs
NASA Astrophysics Data System (ADS)
Rivera-Paleo, F. J.; Guzmán, F. S.
2018-06-01
We present numerical simulations of jets modeled with Relativistic Radiation Hydrodynamics (RRH) that evolve across two environments: i) a stratified surrounding medium and ii) a 16TI progenitor model. We consider opacities consistent with various processes of interaction between the fluid and radiation, specifically free-free, bound-free, bound-bound and electron scattering. We explore various initial conditions, with different radiation energy densities of the beam in hydrodynamical and radiation pressure dominated scenarios, considering only highly relativistic jets. In order to investigate the impact of the radiation field on the evolution of the jets, we compare our results with purely hydrodynamical jets. Comparing among jets driven by RRH, we find that radiation pressure dominated jets propagate slightly faster than gas pressure dominated ones. Finally, we construct the luminosity light curves (LCs) associated with the two cases. The construction of LCs uses the fluxes of the radiation field, which is fully coupled to the hydrodynamics equations during the evolution. The main properties of the jets propagating in the stratified surrounding medium are that the LCs show the same order of magnitude as the gamma-ray luminosity of typical Long Gamma-Ray Bursts, 10^50-10^54 erg/s, and that the difference between the radiation and gas temperatures is of nearly one order of magnitude. For jets breaking out of the progenitor star model, the LCs are of the order of magnitude of low-luminosity GRBs, 10^46-10^49 erg/s, and the difference between the gas and radiation temperatures is of four orders of magnitude, a case far from thermal equilibrium.
Energy and Water Projects
2017-03-21
...included reduced system energy use and cost as well as improved performance driven by autonomous commissioning and optimized system control. In the end...improve system performance and reduce energy use and cost. However, implementing these solutions into the extremely heterogeneous and often