The emotion system promotes diversity and evolvability
Giske, Jarl; Eliassen, Sigrunn; Fiksen, Øyvind; Jakobsen, Per J.; Aksnes, Dag L.; Mangel, Marc; Jørgensen, Christian
2014-01-01
Studies on the relationship between the optimal phenotype and its environment have had limited focus on genotype-to-phenotype pathways and their evolutionary consequences. Here, we study how multi-layered trait architecture and its associated constraints prescribe diversity. Using an idealized model of the emotion system in fish, we find that trait architecture yields genetic and phenotypic diversity even in the absence of frequency-dependent selection or environmental variation. That is, for a given environment, phenotype frequency distributions are predictable while gene pools are not. The conservation of phenotypic traits among these genetically different populations is due to the multi-layered trait architecture, in which one adaptation at a higher architectural level can be achieved by several different adaptations at a lower level. Our results emphasize the role of convergent evolution and the organismal level of selection. While trait architecture makes individuals more constrained than what has been assumed in optimization theory, the resulting populations are genetically more diverse and adaptable. The emotion system in animals may thus have evolved by natural selection because it simultaneously enhances three important functions: the behavioural robustness of individuals, the evolvability of gene pools, and the rate of evolutionary innovation at several architectural levels. PMID:25100697
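The abstract's central claim, that one higher-level adaptation can be reached by many different lower-level gene combinations, can be sketched with a toy two-layer trait architecture. This is a hypothetical illustration, not the authors' actual fish model; the gene ranges, threshold, and trait labels are invented for the example.

```python
from itertools import product

# Hypothetical two-layer trait architecture: lower-level "genes" g1 and g2
# set an intermediate trait, and selection sees only the higher-level
# phenotype derived from it.
def intermediate(g1, g2):
    # Many lower-level combinations yield the same intermediate value.
    return g1 + g2

def phenotype(g1, g2):
    # Selection acts on a thresholded, higher-level trait.
    return "bold" if intermediate(g1, g2) >= 3 else "shy"

genotypes = list(product(range(4), repeat=2))  # all (g1, g2) pairs
bold = [g for g in genotypes if phenotype(*g) == "bold"]

# Several genetically different populations share one phenotype, so the
# phenotype distribution is predictable while the gene pool is not.
print(len(bold), "distinct genotypes encode the 'bold' phenotype")  # → 10
```

Under this mapping, selection fixes the phenotype frequency but leaves ten interchangeable genotypes behind it, which is the many-to-one degeneracy the paper attributes to multi-layered architectures.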
Maximizing the Adjacent Possible in Automata Chemistries.
Hickinbotham, Simon; Clark, Edward; Nellis, Adam; Stepney, Susan; Clarke, Tim; Young, Peter
2016-01-01
Automata chemistries are good vehicles for experimentation in open-ended evolution, but they are by necessity complex systems whose low-level properties require careful design. To aid the process of designing automata chemistries, we develop an abstract model that classifies the features of a chemistry from a physical (bottom up) perspective and from a biological (top down) perspective. There are two levels: things that can evolve, and things that cannot. We equate the evolving level with biology and the non-evolving level with physics. We design our initial organisms in the biology, so they can evolve. We design the physics to facilitate evolvable biologies. This architecture leads to a set of design principles that should be observed when creating an instantiation of the architecture. These principles are Everything Evolves, Everything's Soft, and Everything Dies. To evaluate these ideas, we present experiments in the recently developed Stringmol automata chemistry. We examine the properties of Stringmol with respect to the principles, and so demonstrate the usefulness of the principles in designing automata chemistries.
NASA Technical Reports Server (NTRS)
Moeller, Robert C.; Borden, Chester; Spilker, Thomas; Smythe, William; Lock, Robert
2011-01-01
The JPL Rapid Mission Architecture (RMA) capability is a novel collaborative team-based approach to generate new mission architectures, explore broad trade space options, and conduct architecture-level analyses. RMA studies address feasibility and identify best candidates to proceed to further detailed design studies. Development of RMA first began at JPL in 2007 and has evolved to address the need for rapid, effective early mission architectural development and trade space exploration as a precursor to traditional point design evaluations. The RMA approach integrates a small team of architecture-level experts (typically 6-10 people) to generate and explore a wide-ranging trade space of mission architectures driven by the mission science (or technology) objectives. Group brainstorming and trade space analyses are conducted at a higher level of assessment across multiple mission architectures and systems to enable rapid assessment of a set of diverse, innovative concepts. This paper describes the overall JPL RMA team, process, and high-level approach. Some illustrative results from previous JPL RMA studies are discussed.
Mars Surface Habitability Options
NASA Technical Reports Server (NTRS)
Howe, A. Scott; Simon, Matthew; Smitherman, David; Howard, Robert; Toups, Larry; Hoffman, Stephen J.
2015-01-01
This paper reports on current habitability concepts for an Evolvable Mars Campaign (EMC) prepared by the NASA Human Spaceflight Architecture Team (HAT). For many years NASA has investigated alternative human Mars missions, examining different mission objectives, trajectories, vehicles, and technologies; the combinations of which have been referred to as reference missions or architectures. At the highest levels, decisions regarding the timing and objectives for a human mission to Mars continue to evolve while at the lowest levels, applicable technologies continue to advance. This results in an on-going need for assessments of alternative system designs such as the habitat, a significant element in any human Mars mission scenario, to provide meaningful design sensitivity characterizations to assist decision-makers regarding timing, objectives, and technologies. As a subset of the Evolvable Mars Campaign activities, the habitability team builds upon results from past studies and recommends options for Mars surface habitability compatible with updated technologies.
Designing bioinspired composite reinforcement architectures via 3D magnetic printing
NASA Astrophysics Data System (ADS)
Martin, Joshua J.; Fiore, Brad E.; Erb, Randall M.
2015-10-01
Discontinuous fibre composites represent a class of materials that are strong, lightweight and have remarkable fracture toughness. These advantages partially explain the abundance and variety of discontinuous fibre composites that have evolved in the natural world. Many natural structures out-perform their conventional synthetic counterparts due, in part, to the more elaborate reinforcement architectures that occur in natural composites. Here we present an additive manufacturing approach that combines real-time colloidal assembly with existing additive manufacturing technologies to create highly programmable discontinuous fibre composites. This technology, termed '3D magnetic printing', has enabled us to recreate complex bioinspired reinforcement architectures that deliver enhanced material performance compared with monolithic structures. Further, we demonstrate that we can now design and evolve elaborate reinforcement architectures that are not found in nature, demonstrating a high level of possible customization in discontinuous fibre composites with arbitrary geometries.
The architecture of a video image processor for the space station
NASA Technical Reports Server (NTRS)
Yalamanchili, S.; Lee, D.; Fritze, K.; Carpenter, T.; Hoyme, K.; Murray, N.
1987-01-01
The architecture of a video image processor for space station applications is described. The architecture was derived from a study of the requirements of algorithms that are necessary to produce the desired functionality of many of these applications. Architectural options were selected based on a simulation of the execution of these algorithms on various architectural organizations. A great deal of emphasis was placed on the ability of the system to evolve and grow over the lifetime of the space station. The result is a hierarchical parallel architecture that is characterized by high-level-language programmability, modularity, and extensibility, and that can meet the required performance goals.
An Intelligent Propulsion Control Architecture to Enable More Autonomous Vehicle Operation
NASA Technical Reports Server (NTRS)
Litt, Jonathan S.; Sowers, T. Shane; Simon, Donald L.; Owen, A. Karl; Rinehart, Aidan W.; Chicatelli, Amy K.; Acheson, Michael J.; Hueschen, Richard M.; Spiers, Christopher W.
2018-01-01
This paper describes an intelligent propulsion control architecture that coordinates with the flight control to reduce the amount of pilot intervention required to operate the vehicle. Objectives of the architecture include the ability to: automatically recognize the aircraft operating state and flight phase; configure engine control to optimize performance with knowledge of engine condition and capability; enhance aircraft performance by coordinating propulsion control with flight control; and recognize off-nominal propulsion situations and respond to them autonomously. The hierarchical intelligent propulsion system control can be decomposed into a propulsion system level and an individual engine level. The architecture is designed to be flexible to accommodate evolving requirements, adapt to technology improvements, and maintain safety.
Enabling Communication and Navigation Technologies for Future Near Earth Science Missions
NASA Technical Reports Server (NTRS)
Israel, David J.; Heckler, Gregory; Menrad, Robert; Hudiburg, John; Boroson, Don; Robinson, Bryan; Cornwell, Donald
2016-01-01
In 2015, the Earth Regimes Network Evolution Study (ERNESt) proposed an architectural concept and technologies that evolve to enable space science and exploration missions out to the 2040 timeframe. The architectural concept evolves the current instantiations of the Near Earth Network and Space Network with new technologies into a global network that provides communication and navigation services to a wide range of space users in the near Earth domain. The technologies included High Rate Optical Communications, Optical Multiple Access (OMA), Delay Tolerant Networking (DTN), User Initiated Services (UIS), and advanced Position, Navigation, and Timing technology. This paper describes the key technologies and their current technology readiness levels. Examples of science missions that could be enabled by the technologies and the projected operational benefits of the architecture concept to missions are also described.
The flight telerobotic servicer: From functional architecture to computer architecture
NASA Technical Reports Server (NTRS)
Lumia, Ronald; Fiala, John
1989-01-01
After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.
Managing changes in the enterprise architecture modelling context
NASA Astrophysics Data System (ADS)
Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya
2016-07-01
Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has, however, been limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.
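The ripple-effect computation the abstract describes can be sketched as a reachability search over a component dependency graph. This is a minimal illustration, not the ChangeAwareHierarchicalEA language or its Alloy-based repair machinery; the component names and dependency map are invented.

```python
from collections import deque

# Hypothetical EA fragment: each component lists the components it
# depends on, across business-process and IT levels.
depends_on = {
    "billing_process": ["crm_app"],
    "crm_app": ["customer_db"],
    "reporting_app": ["customer_db"],
    "customer_db": [],
}

def impacted(primary, deps):
    """Return all components transitively affected by changing `primary`."""
    # Invert the dependency map: who depends on whom.
    dependants = {c: [] for c in deps}
    for c, targets in deps.items():
        for t in targets:
            dependants[t].append(c)
    # Breadth-first search from the primary change.
    seen, queue = {primary}, deque([primary])
    while queue:
        for d in dependants[queue.popleft()]:
            if d not in seen:
                seen.add(d)
                queue.append(d)
    return seen - {primary}

# A primary change to the database ripples up two levels.
print(sorted(impacted("customer_db", depends_on)))
# → ['billing_process', 'crm_app', 'reporting_app']
```

Change propagation then amounts to deciding, for each impacted component, which secondary edits restore the consistency rules; the paper's contribution is generating those repair plans interactively rather than by hand.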
HACC: Extreme Scaling and Performance Across Diverse Architectures
NASA Astrophysics Data System (ADS)
Habib, Salman; Morozov, Vitali; Frontiere, Nicholas; Finkel, Hal; Pope, Adrian; Heitmann, Katrin
2013-11-01
Supercomputing is evolving towards hybrid and accelerator-based architectures with millions of cores. The HACC (Hardware/Hybrid Accelerated Cosmology Code) framework exploits this diverse landscape at the largest scales of problem size, obtaining high scalability and sustained performance. Developed to satisfy the science requirements of cosmological surveys, HACC melds particle and grid methods using a novel algorithmic structure that flexibly maps across architectures, including CPU/GPU, multi/many-core, and Blue Gene systems. We demonstrate the success of HACC on two very different machines, the CPU/GPU system Titan and the BG/Q systems Sequoia and Mira, attaining unprecedented levels of scalable performance. We demonstrate strong and weak scaling on Titan, obtaining up to 99.2% parallel efficiency, evolving 1.1 trillion particles. On Sequoia, we reach 13.94 PFlops (69.2% of peak) and 90% parallel efficiency on 1,572,864 cores, with 3.6 trillion particles, the largest cosmological benchmark yet performed. HACC design concepts are applicable to several other supercomputer applications.
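The performance figures quoted for HACC can be checked against the standard definition of fraction-of-peak: sustained PFlops divided by the machine's theoretical peak. The ~20.13 PFlops peak for Sequoia below is a published machine spec, not a number from this abstract.

```python
# Fraction of theoretical peak, the metric behind the "69.2% of peak"
# figure reported for HACC on Sequoia.
def fraction_of_peak(sustained_pflops, peak_pflops):
    return sustained_pflops / peak_pflops

# Sequoia: 13.94 PFlops sustained; ~20.13 PFlops theoretical peak
# (assumed machine spec, not stated in the abstract).
print(round(100 * fraction_of_peak(13.94, 20.13), 1), "% of peak")  # → 69.2
```

The same bookkeeping applies to the scaling claims: weak-scaling parallel efficiency compares time per step at fixed work per core as the core count grows, so 90% efficiency on 1,572,864 cores means the per-step time grew only ~11% over the ideal.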
An Evolvable Multi-Agent Approach to Space Operations Engineering
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Stoica, Adrian
1999-01-01
A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.
Gene Architectures that Minimize Cost of Gene Expression.
Frumkin, Idan; Schirman, Dvir; Rotman, Aviv; Li, Fangfei; Zahavi, Liron; Mordret, Ernest; Asraf, Omer; Wu, Song; Levy, Sasha F; Pilpel, Yitzhak
2017-01-05
Gene expression burdens cells by consuming resources and energy. While numerous studies have investigated regulation of expression level, little is known about gene design elements that govern expression costs. Here, we ask how cells minimize production costs while maintaining a given protein expression level and whether there are gene architectures that optimize this process. We measured fitness of ∼14,000 E. coli strains, each expressing a reporter gene with a unique 5' architecture. By comparing cost-effective and ineffective architectures, we found that cost per protein molecule could be minimized by lowering transcription levels, regulating translation speeds, and utilizing amino acids that are cheap to synthesize and that are less hydrophobic. We then examined natural E. coli genes and found that highly expressed genes have evolved more forcefully to minimize costs associated with their expression. Our study thus elucidates gene design elements that improve the economy of protein expression in natural and heterologous systems.
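The cost factors the abstract identifies, transcription level, translation yield, and amino acid synthesis cost, can be combined in a toy accounting model. This is a hedged illustration, not the paper's fitness assay; the mRNA cost and the per-residue ATP figures are rough approximations.

```python
# Approximate amino acid synthesis costs in ATP equivalents (rounded
# ballpark values, assumed for illustration).
AA_COST = {"G": 12, "A": 12, "L": 27, "F": 52}

def cost_per_protein(mrna_count, proteins_per_mrna, sequence,
                     cost_per_mrna=100):
    """Toy cost model: transcription burden amortized over translation
    yield, plus per-residue synthesis cost of the protein itself."""
    transcription = mrna_count * cost_per_mrna
    proteins = mrna_count * proteins_per_mrna
    synthesis = sum(AA_COST[aa] for aa in sequence)
    return transcription / proteins + synthesis

# Same protein output, two architectures: many mRNAs translated once
# vs. few mRNAs translated often. The low-transcription design is cheaper.
high_txn = cost_per_protein(100, 1, "GALF")   # → 203.0
low_txn = cost_per_protein(10, 10, "GALF")    # → 113.0
print(high_txn, low_txn)
```

The model reproduces the abstract's qualitative finding: for a fixed protein level, shifting the burden from transcription to translation, and choosing cheap amino acids, lowers the cost per molecule.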
A Practical Software Architecture for Virtual Universities
ERIC Educational Resources Information Center
Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun
2006-01-01
This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…
The Evolution of Mission Architectures for Human Lunar Exploration
NASA Technical Reports Server (NTRS)
Everett, S. F.
1995-01-01
Defining transportation architectures for the human exploration of the Moon is a complex task due to the multitude of mission scenarios available. The mission transportation architecture recently proposed for the First Lunar Outpost (FLO) was not designed from carefully predetermined mission requirements and goals, but evolved from an initial set of requirements, which were continually modified as studies revealed that some early assumptions were not optimal. This paper focuses on the mission architectures proposed for FLO and investigates how these transportation architectures evolved. The strengths and weaknesses of three distinct mission architectures are compared, namely (1) Lunar Orbit Rendezvous, (2) staging from the cislunar libration point, and (3) direct descent to the lunar surface. In addition, several new and revolutionary architectures are discussed.
Nonlinear Dynamics in Gene Regulation Promote Robustness and Evolvability of Gene Expression Levels.
Steinacher, Arno; Bates, Declan G; Akman, Ozgur E; Soyer, Orkun S
2016-01-01
Cellular phenotypes underpinned by regulatory networks need to respond to evolutionary pressures to allow adaptation, but at the same time be robust to perturbations. This creates a conflict in which mutations affecting regulatory networks must both generate variance and be tolerated at the phenotype level. Here, we perform mathematical analyses and simulations of regulatory networks to better understand the potential trade-off between robustness and evolvability. Examining the phenotypic effects of mutations, we find an inverse correlation between robustness and evolvability that breaks only with nonlinearity in the network dynamics, through the creation of regions presenting sudden changes in phenotype with small changes in genotype. For genotypes embedding low levels of nonlinearity, robustness and evolvability correlate negatively and almost perfectly. By contrast, genotypes embedding nonlinear dynamics allow expression levels to be robust to small perturbations, while generating high diversity (evolvability) under larger perturbations. Thus, nonlinearity breaks the robustness-evolvability trade-off in gene expression levels by allowing disparate responses to different mutations. Using analytical derivations of robustness and system sensitivity, we show that these findings extend to a large class of gene regulatory network architectures and also hold for experimentally observed parameter regimes. Further, the effect of nonlinearity on the robustness-evolvability trade-off is ensured as long as key parameters of the system display specific relations irrespective of their absolute values. We find that within this parameter regime genotypes display low and noisy expression levels. Our results provide a possible solution to the robustness-evolvability trade-off, suggest an explanation for the ubiquity of nonlinear dynamics in gene expression networks, and generate useful guidelines for the design of synthetic gene circuits.
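The central contrast, nonlinear responses that are insensitive to small perturbations yet highly responsive to large ones, can be sketched with a Hill-type regulatory function. The operating point and Hill coefficient below are invented for the illustration, not taken from the paper.

```python
# Compare a linear regulatory response with a steep nonlinear (Hill)
# response under small vs. large perturbations of the input.
def linear(x):
    return 0.5 * x

def hill(x, K=1.0, n=8):
    # Steep sigmoidal response, switch-like around x = K.
    return x**n / (K**n + x**n)

def shift(f, x, dx):
    """Change in output when the input is perturbed by dx."""
    return abs(f(x + dx) - f(x))

# Near the operating point x = 0.5, a small perturbation barely moves
# the nonlinear output (robustness) ...
assert shift(hill, 0.5, 0.05) < shift(linear, 0.5, 0.05)
# ... while a large perturbation that crosses the threshold moves it
# much further than the linear response does (evolvability).
assert shift(hill, 0.5, 0.7) > shift(linear, 0.5, 0.7)
print("nonlinear response: robust to small, responsive to large perturbations")
```

The linear response scales every perturbation by the same factor, so its robustness and evolvability are rigidly coupled; the sigmoid decouples them, which is the trade-off-breaking mechanism the abstract describes.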
Evolution of Bow-Tie Architectures in Biology
Friedlander, Tamar; Mayo, Avraham E.; Tlusty, Tsvi; Alon, Uri
2015-01-01
Bow-tie or hourglass structure is a common architectural feature found in many biological systems. A bow-tie in a multi-layered structure occurs when intermediate layers have far fewer components than the input and output layers. Examples include metabolism, where a handful of building blocks mediate between multiple input nutrients and multiple output biomass components, and signaling networks, where information from numerous receptor types passes through a small set of signaling pathways to regulate multiple output genes. Little is known, however, about how bow-tie architectures evolve. Here, we address the evolution of bow-tie architectures using simulations of multi-layered systems evolving to fulfill a given input-output goal. We find that bow-ties spontaneously evolve when the information in the evolutionary goal can be compressed. Mathematically speaking, bow-ties evolve when the rank of the input-output matrix describing the evolutionary goal is deficient. The maximal compression possible (the rank of the goal) determines the size of the narrowest part of the network, that is, the bow-tie. A further requirement is that a process is active to reduce the number of links in the network, such as product-rule mutations; otherwise a non-bow-tie solution is found in the evolutionary simulations. This offers a mechanism to understand a common architectural principle of biological systems, and a way to quantitate the effective rank of the goals under which they evolved. PMID:25798588
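The rank argument can be made concrete with a small numerical example: a rank-deficient goal matrix factors through an intermediate layer whose width equals the rank, which is the bow-tie waist. The goal matrix below is invented for illustration.

```python
import numpy as np

# Hypothetical input-output goal: 3 inputs -> 3 outputs, but the rows
# are all multiples of one another, so the matrix has rank 1.
G = np.array([[2., 4., 6.],
              [1., 2., 3.],
              [3., 6., 9.]])

r = np.linalg.matrix_rank(G)           # → 1

# A rank-r factorization G = B @ A routes all information through an
# intermediate layer of only r components: the bow-tie waist.
U, s, Vt = np.linalg.svd(G)
A = np.diag(s[:r]) @ Vt[:r]            # inputs -> waist   (r x 3)
B = U[:, :r]                           # waist  -> outputs (3 x r)

assert np.allclose(B @ A, G)           # same goal through a 1-node waist
print("goal rank (minimum waist size):", r)
```

A full-rank goal would admit no such compression, matching the abstract's claim that bow-ties only evolve when the goal's rank is deficient.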
Architectural Lessons: Look Back In Order To Move Forward
NASA Astrophysics Data System (ADS)
Huang, T.; Djorgovski, S. G.; Caltagirone, S.; Crichton, D. J.; Hughes, J. S.; Law, E.; Pilone, D.; Pilone, T.; Mahabal, A.
2015-12-01
True elegance of scalable and adaptable architecture is not about incorporating the latest and greatest technologies. Its elegance is measured by its ability to scale and adapt as its operating environment evolves over time. Architecture is the link that bridges people, process, policies, interfaces, and technologies. Architectural development begins by observing the relationships that really matter to the problem domain. It proceeds with the creation of a single, shared, evolving pattern language, which everyone contributes to, and everyone can use [C. Alexander, 1979]. Architects are true artists. Like all masterpieces, the value and strength of an architecture are measured not by volumes of publications but by its ability to evolve. An architect must look back in order to move forward. This talk discusses some prior works, including an onboard data analysis system, a knowledgebase system, and a cloud-based Big Data platform, as enablers to help shape the new generation of Earth Science projects at NASA and EarthCube, where a community-driven architecture is the key to enabling data-intensive science. [C. Alexander, The Timeless Way of Building, Oxford University Press, 1979.]
Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank
2006-04-01
This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
SEL Ada reuse analysis and representations
NASA Technical Reports Server (NTRS)
Kester, Rush
1990-01-01
Overall, it was revealed that the pattern of Ada reuse has evolved from initial reuse of utility components into reuse of generalized application architectures. Utility components were both domain-independent utilities, such as queues and stacks, and domain-specific utilities, such as those that implement spacecraft orbit and attitude mathematical functions and physics or astronomical models. The level of reuse was significantly increased with the development of a generalized telemetry simulator architecture. The use of Ada generics significantly increased the level of verbatim reuse, because generics make it possible to parameterize the aspects of a design that are configurable during reuse. A key factor in implementing generalized architectures was the ability to use generic subprogram parameters to tailor parts of the algorithm embedded within the architecture. The use of object-oriented design (in which objects model real-world entities) significantly improved modularity for reuse. Encapsulating into packages the data and operations associated with common real-world entities creates natural building blocks for reuse.
Alternative Architectures for Distributed Work in the National Airspace System
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Billings, Charles E.; Chapman, Roger; Obradovich, Heintz; McCoy, C. Elaine; Orasanu, Judith
2000-01-01
The architecture for the National Airspace System (NAS) in the United States has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual. This paper characterizes a number of different subsystems that have been recently incorporated in the NAS. The goal of this discussion is to begin to identify the critical parameters defining the differences among alternative architectures in terms of the locus of control and in terms of access to relevant data and knowledge. At an abstract level, this analysis can be described as an effort to describe alternative "rules of the game" for the NAS.
An Approach for On-Board Software Building Blocks Cooperation and Interfaces Definition
NASA Astrophysics Data System (ADS)
Pascucci, Dario; Campolo, Giovanni; Candia, Sante; Lisio, Giovanni
2010-08-01
This paper provides insight into the avionic SW architecture developed by Thales Alenia Space Italy (TAS-I) to achieve structuring of the OBSW as a set of self-standing and re-usable building blocks. It first describes the underlying framework for building-block cooperation, which is based on ECSS-E-70 packet forwarding (for service requests to a building block) and standard parameter exchange for data communication. It then discusses the high level of flexibility and scalability of the resulting architecture, reporting as an example an implementation of the Failure Detection, Isolation and Recovery (FDIR) function that exploits the proposed architecture. The presented approach evolves from the avionic SW architecture developed in the scope of the PRIMA project (Multi-Purpose Italian Re-configurable Platform) and has been adopted for the Sentinel-1 Avionic Software (ASW).
Han, Chang S; Dingemanse, Niels J
2017-10-11
Empirical studies imply that sex-specific genetic architectures can resolve evolutionary conflicts between males and females, and thereby facilitate the evolution of sexual dimorphism. Sex-specificity of behavioural genetic architectures has, however, rarely been considered. Moreover, as the expression of genetic (co)variances is often environment-dependent, general inferences on sex-specific genetic architectures require estimates of quantitative genetics parameters under multiple conditions. We measured exploration and aggression in pedigreed populations of southern field crickets (Gryllus bimaculatus) raised on either naturally balanced (free-choice) or imbalanced (protein-deprived) diets. For each dietary condition, we measured for each behavioural trait (i) level of sexual dimorphism, (ii) level of sex-specificity of survival selection gradients, (iii) level of sex-specificity of additive genetic variance, and (iv) strength of the cross-sex genetic correlation. We report here evidence for sexual dimorphism in behaviour as well as sex-specificity in the expression of genetic (co)variances as predicted by theory. The additive genetic variances of exploration and aggression were significantly greater in males compared with females. Cross-sex genetic correlations were highly positive for exploration but deviating (significantly) from one for aggression; findings were consistent across dietary treatments. This suggests that these genetic architectures characterize the sexually dimorphic focal behaviours across various key environmental conditions in the wild. Our finding also highlights that sexual conflict can be resolved by evolving sexually independent genetic architectures.
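The key statistic in this record, the cross-sex genetic correlation, is the additive genetic covariance between the sexes scaled by each sex's additive genetic variance. The numbers below are invented to illustrate the bookkeeping, not estimates from the study.

```python
import math

# Cross-sex genetic correlation: r_mf = cov(m, f) / sqrt(V_m * V_f),
# computed from (hypothetical) additive genetic (co)variances.
def cross_sex_r(cov_mf, var_m, var_f):
    return cov_mf / math.sqrt(var_m * var_f)

# r near +1 (as reported for exploration) implies a shared genetic
# basis; r well below 1 (as for aggression) leaves room for the sexes
# to evolve independently.
exploration_like = cross_sex_r(0.9, 1.2, 0.8)
aggression_like = cross_sex_r(0.3, 1.2, 0.8)
print(round(exploration_like, 2), round(aggression_like, 2))  # → 0.92 0.31
```

A correlation pinned at one means any selection response in one sex drags the other along; the deviation from one observed for aggression is what allows sexually independent evolution.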
Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Bradish, Martin A.; Juergens, Jeffrey R.; Lewis, Michael J.; Vrnak, Daniel R.
2011-01-01
This Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept document was developed as a first step in developing the Component-Level Electronic-Assembly Repair (CLEAR) System Architecture (NASA/TM-2011-216956). The CLEAR operational concept defines how the system will be used by the Constellation Program and what needs it meets. The document creates scenarios for major elements of the CLEAR architecture. These scenarios are generic enough to apply to near-Earth, Moon, and Mars missions. The CLEAR operational concept involves basic assumptions about the overall program architecture and interactions with the CLEAR system architecture. The assumptions include spacecraft and operational constraints for near-Earth orbit, Moon, and Mars missions. This document addresses an incremental development strategy where capabilities evolve over time, but it is structured to prevent obsolescence. The approach minimizes flight hardware by exploiting Internet-like telecommunications that enables CLEAR capabilities to remain on Earth and to be uplinked as needed. To minimize crew time and operational cost, CLEAR exploits offline development and validation to support online teleoperations. Operational concept scenarios are developed for diagnostics, repair, and functional test operations. Many of the supporting functions defined in these operational scenarios are further defined as technologies in NASA/TM-2011-216956.
Updated Mars Mission Architectures Featuring Nuclear Thermal Propulsion
NASA Technical Reports Server (NTRS)
Rodriguez, Mitchell A.; Percy, Thomas K.
2017-01-01
Nuclear thermal propulsion (NTP) can potentially enable routine human exploration of Mars and the solar system. By using nuclear fission instead of a chemical combustion process, and using hydrogen as the propellant, NTP systems promise rocket efficiencies roughly twice that of the best chemical rocket engines currently available. The most recent major Mars architecture study featuring NTP was the Design Reference Architecture 5.0 (DRA 5.0), performed in 2009. Currently, the predominant transportation options being considered are solar electric propulsion (SEP) and chemical propulsion; however, given NTP's capabilities, an updated architectural analysis is needed. This paper provides a top-level overview of several different architectures featuring updated NTP performance data. New architectures presented include a proposed update to the DRA 5.0 as well as an investigation of architectures based on the current Evolvable Mars Campaign, which is the focus of NASA's current analyses for the Journey to Mars. Architectures investigated leverage the latest information relating to NTP performance and design considerations and address new support elements not available at the time of DRA 5.0, most notably the Orion crew module and the Space Launch System (SLS). The paper provides a top level quantitative comparison of key performance metrics as well as a qualitative discussion of improvements and key challenges still to be addressed. Preliminary results indicate that the updated NTP architectures can significantly reduce the campaign mass and subsequently the costs for assembly and number of launches.
ERIC Educational Resources Information Center
Wines, James
1975-01-01
De-architecturization is art about architecture, a catalyst suggesting that public art does not have to respond to formalist doctrine; but rather, may evolve from the informational reservoirs of the city environment, where phenomenology and structure become the fabric of its existence. (Author/RK)
2017-01-31
mapping critical business workflows and then optimizing them with appropriate evolutionary technology choices is often called “Product Line Architecture... technologies, products, services, and processes, and the USG evaluates them against its 360° requirements objectives, and refines them as appropriate, clarity...in rapidly evolving technological domains (e.g. by applying best commercial practices for open standard product line architecture.) An MP might be
Human Mars Entry, Descent, and Landing Architecture Study Overview
NASA Technical Reports Server (NTRS)
Cianciolo, Alicia D.; Polsgrove, Tara T.
2016-01-01
The Entry, Descent, and Landing (EDL) Architecture Study is a multi-NASA center activity to analyze candidate EDL systems as they apply to human Mars landing in the context of the Evolvable Mars Campaign. The study, led by the Space Technology Mission Directorate (STMD), is performed in conjunction with NASA's Science Mission Directorate and the Human Architecture Team, sponsored by NASA's Human Exploration and Operations Mission Directorate. The primary objective is to prioritize future STMD EDL technology investments by (1) generating Phase A-level designs for selected concepts to deliver 20 t human-class payloads, (2) developing a parameterized mass model for each concept capable of examining payloads between 5 and 40 t, and (3) evaluating integrated system performance using trajectory simulations. This paper summarizes the initial study results.
Concept of Operations for the ESC Product Line Approach.
1996-08-30
production of the application. Product Line Engineering Center (PLEC) defines and evolves product line architectures with the SAG. The PLEC is also tasked... PLEC, SAG, and PLAS and offers scenarios for asset and system development. • Section 4 outlines the ESC Product Line transition strategy. • Section...Line or System Needs User Select PLEC; Assess PL architecture Product Line Architecture Development ments; architecture selection Architecture
OFMspert: An architecture for an operator's associate that evolves to an intelligent tutor
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1991-01-01
With the emergence of new technology for both human-computer interaction and knowledge-based systems, a range of opportunities exist to enhance the effectiveness and efficiency of controllers of high-risk engineering systems. The design of an architecture for an operator's associate is described. This associate is a stand-alone model-based system designed to interact with operators of complex dynamic systems, such as airplanes, manned space systems, and satellite ground control systems, in ways comparable to those of a human assistant. The operator function model expert system (OFMspert) architecture and the design and empirical validation of OFMspert's understanding component are described. The design and validation of OFMspert's interactive and control components are also described. Current work is also described in which OFMspert provides the foundation for an intelligent tutor that evolves into an assistant as operator expertise evolves from novice to expert.
Integrated System Health Management (ISHM): Systematic Capability Implementation
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Holland, Randy; Schmalzwel, John; Duncavage, Dan
2006-01-01
This paper provides a credible approach for implementation of ISHM capability in any system. The requirements and processes to implement ISHM capability are unique in that a credible capability is initially implemented at a low level, and it evolves to achieve higher levels by incremental augmentation. In contrast, typical capabilities, such as thrust of an engine, are implemented once at full Functional Capability Level (FCL), which is not designed to change during the life of the product. The paper describes core ingredients (e.g., technologies, architectures) and when and how ISHM capabilities may be implemented. A specific architecture/taxonomy/ontology will be described, as well as a prototype software environment that supports development of ISHM capability. This paper will address implementation of system-wide ISHM as a core capability, and ISHM for specific subsystems as expansions and evolution, but always focusing on achieving an integrated capability.
A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises
NASA Astrophysics Data System (ADS)
Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.
2012-04-01
The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange.
The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.
Explainable expert systems: A research program in information processing
NASA Technical Reports Server (NTRS)
Paris, Cecile L.
1993-01-01
Our work in Explainable Expert Systems (EES) had two goals: to extend and enhance the range of explanations that expert systems can offer, and to ease their maintenance and evolution. As suggested in our proposal, these goals are complementary because they place similar demands on the underlying architecture of the expert system: they both require the knowledge contained in a system to be explicitly represented, in a high-level declarative language and in a modular fashion. With these two goals in mind, the Explainable Expert Systems (EES) framework was designed to remedy limitations to explainability and evolvability that stem from related fundamental flaws in the underlying architecture of current expert systems.
Mars Hybrid Propulsion System Trajectory Analysis. Part I; Crew Missions
NASA Technical Reports Server (NTRS)
Chai, Patrick R.; Merrill, Raymond G.; Qu, Min
2015-01-01
NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and electric propulsion systems are used to send crew and cargo to Mars destinations such as Phobos, Deimos, the surface of Mars, and other orbits around Mars. By combining chemical and electrical propulsion into a single spaceship and applying each where it is more effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical architecture without significant increases in flight times. This paper provides the analysis of the interplanetary segments of the three Evolvable Mars Campaign crew missions to Mars using the hybrid transportation architecture. The trajectory analysis provides departure and arrival dates and propellant needs for the three crew missions that are used by the campaign analysis team for campaign build-up and logistics aggregation analysis. Sensitivity analyses were performed to investigate the impact of mass growth, departure window, and propulsion system performance on the hybrid transportation architecture. The results and system analysis from this paper contribute to analyses of the other Human Spaceflight Architecture Team tasks and feed into the definition of the Evolvable Mars Campaign.
Genetic architecture and the evolution of sex.
Lohaus, Rolf; Burch, Christina L; Azevedo, Ricardo B R
2010-01-01
Theoretical investigations of the advantages of sex have tended to treat the genetic architecture of organisms as static and have not considered that genetic architecture might coevolve with reproductive mode. As a result, some potential advantages of sex may have been missed. Using a gene network model, we recently showed that recombination imposes selection for robustness to mutation and that negative epistasis can evolve as a by-product of this selection. These results motivated a detailed exploration of the mutational deterministic hypothesis, a hypothesis in which the advantage of sex depends critically on epistasis. We found that sexual populations do evolve higher mean fitness and lower genetic load than asexual populations at equilibrium, and, under moderate stabilizing selection and large population size, these equilibrium sexual populations resist invasion by asexuals. However, we found no evidence that these long- and short-term advantages to sex were explained by the negative epistasis that evolved in our experiments. The long-term advantage of sex was that sexual populations evolved a lower deleterious mutation rate, but this property was not sufficient to account for the ability of sexual populations to resist invasion by asexuals. The ability to resist asexual invasion was acquired simultaneously with an increase in recombinational robustness that minimized the cost of sex. These observations provide the first direct evidence that sexual reproduction does indeed select for conditions that favor its own maintenance. Furthermore, our results highlight the importance of considering a dynamic view of the genetic architecture to understand the evolution of sex and recombination.
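The gene network model summarized above lends itself to a compact illustration. Below is a minimal, Wagner-style sketch (not the authors' actual implementation; all parameter values, dimensions, and function names are illustrative): regulatory dynamics iterate to a fixed point, and free recombination shuffles whole rows, i.e. one gene's regulatory inputs, between two parent networks.

```python
import numpy as np

def develop(w, s0, max_steps=100):
    """Iterate the regulatory dynamics s_{t+1} = sign(W @ s_t).

    Returns the equilibrium expression state, or None if the network
    fails to reach a fixed point (developmentally unstable)."""
    s = s0.copy()
    for _ in range(max_steps):
        s_next = np.sign(w @ s)
        if np.array_equal(s_next, s):
            return s
        s = s_next
    return None

def recombine(w_mother, w_father, rng):
    """Free recombination between rows: each gene's regulatory
    inputs (one matrix row) are inherited from either parent."""
    pick = rng.integers(0, 2, size=w_mother.shape[0]).astype(bool)
    return np.where(pick[:, None], w_mother, w_father)

rng = np.random.default_rng(0)
n = 10
s0 = np.sign(rng.standard_normal(n))
w1 = rng.standard_normal((n, n))
w2 = rng.standard_normal((n, n))
child = recombine(w1, w2, rng)
print(develop(child, s0) is not None)  # is the recombinant offspring stable?
```

In this setting, "robustness to mutation" and "recombinational robustness" can be measured as the fraction of perturbed or recombinant offspring that still develop to a stable equilibrium.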
HACC: Simulating sky surveys on state-of-the-art supercomputing architectures
NASA Astrophysics Data System (ADS)
Habib, Salman; Pope, Adrian; Finkel, Hal; Frontiere, Nicholas; Heitmann, Katrin; Daniel, David; Fasel, Patricia; Morozov, Vitali; Zagaris, George; Peterka, Tom; Vishwanath, Venkatram; Lukić, Zarija; Sehrish, Saba; Liao, Wei-keng
2016-01-01
Current and future surveys of large-scale cosmic structure are associated with a massive and complex datastream to study, characterize, and ultimately understand the physics behind the two major components of the 'Dark Universe', dark energy and dark matter. In addition, the surveys also probe primordial perturbations and carry out fundamental measurements, such as determining the sum of neutrino masses. Large-scale simulations of structure formation in the Universe play a critical role in the interpretation of the data and extraction of the physics of interest. Just as survey instruments continue to grow in size and complexity, so do the supercomputers that enable these simulations. Here we report on HACC (Hardware/Hybrid Accelerated Cosmology Code), a recently developed and evolving cosmology N-body code framework, designed to run efficiently on diverse computing architectures and to scale to millions of cores and beyond. HACC can run on all current supercomputer architectures and supports a variety of programming models and algorithms. It has been demonstrated at scale on Cell- and GPU-accelerated systems, standard multi-core node clusters, and Blue Gene systems. HACC's design allows for ease of portability, and at the same time, high levels of sustained performance on the fastest supercomputers available. We present a description of the design philosophy of HACC, the underlying algorithms and code structure, and outline implementation details for several specific architectures. We show selected accuracy and performance results from some of the largest high resolution cosmological simulations so far performed, including benchmarks evolving more than 3.6 trillion particles.
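HACC's actual force solvers (hybrid particle-mesh schemes tuned per architecture) are far beyond a short sketch, but the core time-stepping idea of any gravitational N-body code can be illustrated with a toy direct-summation leapfrog integrator. This is purely illustrative, with assumed units (G = 1) and an O(N^2) force loop that no production code would use at scale:

```python
import numpy as np

def leapfrog_step(pos, vel, mass, dt, soft=0.05):
    """One kick-drift-kick step of a direct-summation N-body
    integrator (toy version; softening avoids close-encounter
    singularities)."""
    def accel(p):
        d = p[None, :, :] - p[:, None, :]        # pairwise separations
        r2 = (d ** 2).sum(-1) + soft ** 2        # softened squared distances
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)            # no self-force
        return (d * inv_r3[..., None] * mass[None, :, None]).sum(1)
    vel = vel + 0.5 * dt * accel(pos)            # kick
    pos = pos + dt * vel                         # drift
    vel = vel + 0.5 * dt * accel(pos)            # kick
    return pos, vel

rng = np.random.default_rng(0)
n = 64
pos = rng.standard_normal((n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```

The pairwise forces are antisymmetric, so total momentum is conserved to rounding error, one of the standard sanity checks for an N-body integrator.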
HACC: Simulating sky surveys on state-of-the-art supercomputing architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Pope, Adrian; Finkel, Hal
2016-01-01
Current and future surveys of large-scale cosmic structure are associated with a massive and complex datastream to study, characterize, and ultimately understand the physics behind the two major components of the ‘Dark Universe’, dark energy and dark matter. In addition, the surveys also probe primordial perturbations and carry out fundamental measurements, such as determining the sum of neutrino masses. Large-scale simulations of structure formation in the Universe play a critical role in the interpretation of the data and extraction of the physics of interest. Just as survey instruments continue to grow in size and complexity, so do the supercomputers that enable these simulations. Here we report on HACC (Hardware/Hybrid Accelerated Cosmology Code), a recently developed and evolving cosmology N-body code framework, designed to run efficiently on diverse computing architectures and to scale to millions of cores and beyond. HACC can run on all current supercomputer architectures and supports a variety of programming models and algorithms. It has been demonstrated at scale on Cell- and GPU-accelerated systems, standard multi-core node clusters, and Blue Gene systems. HACC’s design allows for ease of portability, and at the same time, high levels of sustained performance on the fastest supercomputers available. We present a description of the design philosophy of HACC, the underlying algorithms and code structure, and outline implementation details for several specific architectures. We show selected accuracy and performance results from some of the largest high resolution cosmological simulations so far performed, including benchmarks evolving more than 3.6 trillion particles.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
... Services Delivery Architecture Recommendations are included in the TOR deliverables. The Concept of Use for... operating picture for evolving global ATM concepts. The AIS and MET Services Delivery Architecture... provides recommended alternatives for AIS and MET data delivery architectures. The Concept of Use and...
Creating a New Architecture for the Learning College
ERIC Educational Resources Information Center
O'Banion, Terry
2007-01-01
The publication of "A Nation at Risk" in 1983 triggered a series of major reform efforts in education that are still evolving. As part of the reform efforts, leaders began to refer to a Learning Revolution that would "place learning first by overhauling the traditional architecture of education." The old architecture--time-bound, place-bound,…
Enabling the On-line Intrinsic Evolution of Analog Controllers
NASA Technical Reports Server (NTRS)
Gwaltney, David A.; Ferguson, Michael I.
2005-01-01
The intrinsic evolution of analog controllers to provide closed-loop control of the speed of a DC motor has been previously demonstrated at NASA Marshall Space Flight Center. A side effect of the evolutionary process is that during evolution there are necessarily poor configurations to be evaluated that could cause damage to the plant. This paper concerns the development and implementation of a safe Evolvable Analog Controller (EAC) architecture able to evolve controllers on-line even in the presence of these poor configurations. The EAC concept is discussed and experimental results are presented that show the feasibility of the approach. This EAC architecture represents the first in a series of steps required to make deployment of an evolvable controller a reality.
Enabling the On-Line Intrinsic Evolution of Analog Controllers
NASA Technical Reports Server (NTRS)
Gwaltney, David A.; Ferguson, Michael I.
2005-01-01
The intrinsic evolution of analog controllers to provide closed-loop control of the speed of a DC motor has been previously demonstrated at NASA Marshall Space Flight Center. A side effect of the evolutionary process is that during evolution there are necessarily poor configurations to be evaluated that could cause damage to the plant. This paper concerns the development and implementation of a safe Evolvable Analog Controller (EAC) architecture able to evolve controllers on-line even in the presence of these poor configurations. The EAC concept is discussed and experimental results are presented that show the feasibility of the approach. This EAC architecture represents the first in a series of steps required to make deployment of an evolvable controller a reality.
Specification and Design of Electrical Flight System Architectures with SysML
NASA Technical Reports Server (NTRS)
McKelvin, Mark L., Jr.; Jimenez, Alejandro
2012-01-01
Modern space flight systems are required to perform more complex functions than previous generations to support space missions. This demand is driving the trend to deploy more electronics to realize system functionality. The traditional approach for the specification, design, and deployment of electrical system architectures in space flight systems includes the use of informal definitions and descriptions that are often embedded within loosely coupled but highly interdependent design documents. Traditional methods become inefficient to cope with increasing system complexity, evolving requirements, and the ability to meet project budget and time constraints. Thus, there is a need for more rigorous methods to capture the relevant information about the electrical system architecture as the design evolves. In this work, we propose a model-centric approach to support the specification and design of electrical flight system architectures using the System Modeling Language (SysML). In our approach, we develop a domain specific language for specifying electrical system architectures, and we propose a design flow for the specification and design of electrical interfaces. Our approach is applied to a practical flight system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu; Dagle, Jeffery E.
2008-07-31
The infrastructure of phasor measurement has evolved over the last two decades from isolated measurement units to networked measurement systems with footprints beyond individual utility companies. This is, to a great extent, a bottom-up self-evolving process, except for some local systems built by design. Given that the number of phasor measurement units (PMUs) in the system is small (currently 70 each in the western and eastern interconnections), the current phasor network architecture works just fine. However, the architecture will become a bottleneck when large numbers of PMUs are installed (e.g., >1000-10000). The need for phasor architecture design has yet to be addressed. This paper reviews the current phasor networks and investigates future architectures, as related to the efforts undertaken by the North America SynchroPhasor Initiative (NASPI). It then presents staged system tests to evaluate the performance of phasor networks, which is a common practice in the Western Electricity Coordinating Council (WECC) system. This is followed by field measurement evaluation and the implications of phasor quality issues for phasor applications.
Enabling Communication and Navigation Technologies for Future Near Earth Science Missions
NASA Technical Reports Server (NTRS)
Israel, David J.; Heckler, Greg; Menrad, Robert J.; Hudiburg, John J.; Boroson, Don M.; Robinson, Bryan S.; Cornwell, Donald M.
2016-01-01
In 2015, the Earth Regimes Network Evolution Study (ERNESt) Team proposed a fundamentally new architectural concept, with enabling technologies, that defines an evolutionary pathway out to the 2040 timeframe in which an increasing user community comprising more diverse space science and exploration missions can be supported. The architectural concept evolves the current instantiations of the Near Earth Network and Space Network through implementation of select technologies, resulting in a global communication and navigation network that provides communication and navigation services to a wide range of space users in the Near Earth regime, defined as an Earth-centered sphere with a radius of 2 million km. The enabling technologies include: High Rate Optical Communications, Optical Multiple Access (OMA), Delay Tolerant Networking (DTN), User Initiated Services (UIS), and advanced Position, Navigation, and Timing technology (PNT). This paper describes this new architecture, the key technologies that enable it, and their current technology readiness levels. Examples of science missions that could be enabled by the technologies and the projected operational benefits of the architecture concept to missions are also described.
The ADEPT Framework for Intelligent Autonomy
NASA Technical Reports Server (NTRS)
Ricard, Michael; Kolitz, Stephan
2003-01-01
This paper describes the design and implementation of Draper Laboratory's All-Domain Execution and Planning Technology (ADEPT) architecture for intelligent autonomy. Intelligent autonomy is the ability to plan and execute complex activities in a manner that provides rapid, effective response to stochastic and dynamic mission events. Thus, intelligent autonomy enables the high-level reasoning and adaptive behavior for an unmanned vehicle that is provided by an operator in man-in-the-loop systems. Draper's intelligent autonomy architecture has evolved over a decade and a half, beginning in the mid-1980s and culminating in an operational experiment funded under DARPA's Autonomous Minehunting and Mapping Technologies (AMMT) unmanned undersea vehicle program. ADEPT continues to be refined through its application to current programs that involve air vehicles, satellites, and higher-level planning used to direct multiple vehicles. The objective of ADEPT is to solidify a proven, dependable software approach that can be quickly applied to new vehicles and domains. The architecture can be viewed as a hierarchical extension of the sense-think-act paradigm of intelligence and has strong parallels with the military's Observe-Orient-Decide-Act (OODA) loop. The key elements of the architecture are planning and decision-making nodes comprising modules for situation assessment, plan generation, plan implementation, and coordination. A reusable, object-oriented software framework has been developed that implements these functions. As the architecture is applied to new areas, only the application-specific software needs to be developed. This paper describes the core architecture in detail and discusses how this has been applied in the undersea, air, ground, and space domains.
System design in an evolving system-of-systems architecture and concept of operations
NASA Astrophysics Data System (ADS)
Rovekamp, Roger N., Jr.
Proposals for space exploration architectures have increased in complexity and scope. Constituent systems (e.g., rovers, habitats, in-situ resource utilization facilities, transfer vehicles, etc.) must meet the needs of these architectures by performing in multiple operational environments and across multiple phases of the architecture's evolution. This thesis proposes an approach for using system-of-systems engineering principles in conjunction with system design methods (e.g., multi-objective optimization, genetic algorithms, etc.) to create system design options that perform effectively at both the system and system-of-systems levels, across multiple concepts of operations, and over multiple architectural phases. The framework is presented by way of an application problem that investigates the design of power systems within a power sharing architecture for use in a human Lunar Surface Exploration Campaign. A computer model has been developed that uses candidate power grid distribution solutions for a notional lunar base. The agent-based model utilizes virtual control agents to manage the interactions of various exploration and infrastructure agents. The philosophy behind the model is based both on lunar power supply strategies proposed in literature, as well as on the author's own approaches for power distribution strategies of future lunar bases. In addition to proposing a framework for system design, further implications of system-of-systems engineering principles are briefly explored, specifically as they relate to producing more robust cross-cultural system-of-systems architecture solutions.
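The multi-objective, multi-phase evaluation described above can be illustrated with a toy sketch (the element names, numbers, and objectives below are hypothetical, not taken from the thesis): candidate power-system sizings are scored across two architecture phases and filtered to a Pareto front over two minimized objectives.

```python
# Candidate solar-array / battery sizings (kW, kWh); illustrative numbers.
designs = [(a, b) for a in (5, 10, 15, 20) for b in (20, 40, 60)]

def objectives(design, phases=((8, 30), (14, 50))):
    """Score one design across architecture phases.

    Each phase is (avg demand kW, eclipse energy need kWh); both
    objectives (a crude mass proxy, worst-case shortfall across
    phases) are to be minimized."""
    array_kw, batt_kwh = design
    mass = 50 * array_kw + 10 * batt_kwh          # mass proxy, kg
    shortfall = max(max(d - array_kw, 0) + max(e - batt_kwh, 0)
                    for d, e in phases)
    return mass, shortfall

def pareto_front(cands):
    """Keep designs not weakly dominated in both objectives."""
    scored = [(c, objectives(c)) for c in cands]
    front = []
    for c, f in scored:
        dominated = any(all(g <= fo for g, fo in zip(f2, f)) and f2 != f
                        for _, f2 in scored)
        if not dominated:
            front.append(c)
    return front

print(pareto_front(designs))
```

A genetic algorithm would replace the exhaustive candidate list with an evolving population, but the phase-spanning objective function and the dominance filter play the same roles.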
Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...
In-Space Transportation for NASA's Evolvable Mars Campaign
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; McGuire, Melissa; Polsgrove, Tara
2015-01-01
As the nation embarks on a new and bold journey to Mars, significant work is being done to determine what that mission and those architectural elements will look like. The Evolvable Mars Campaign, or EMC, is being evaluated as a potential approach to getting humans to Mars. Built on the premise of leveraging current technology investments and maximizing element commonality to reduce cost and development schedule, the EMC transportation architecture is focused on developing the elements required to move crew and equipment to Mars as efficiently and effectively as possible both from a performance and a programmatic standpoint. Over the last 18 months the team has been evaluating potential options for those transportation elements. One of the key aspects of the EMC is leveraging investments being made today in missions like the Asteroid Redirect Mission (ARM), using derived versions of the Solar Electric Propulsion (SEP) propulsion systems and coupling them with other chemical propulsion elements that maximize commonality across the architecture between both transportation and Mars operations elements. This paper outlines the broad trade space being evaluated, including the different technologies being assessed for transportation elements and how those elements are assembled into an architecture. Impacts to potential operational scenarios at Mars are also investigated. Trades are being made on the size and power level of the SEP vehicle for delivering cargo as well as the size of the chemical propulsion systems and various mission aspects including in-space assembly and sequencing. Maximizing payload delivery to Mars with the SEP vehicle will better support the operational scenarios at Mars by enabling the delivery of landers and habitation elements that are appropriately sized for the mission.
The purpose of this investigation is not to find the solution but rather a suite of solutions with potential application to the challenge of sending cargo and crew to Mars. The goal is that, by building an architecture intelligently with all aspects considered, the sustainable Mars program wisely invests limited resources enabling a long-term human Mars exploration program.
How cancer shapes evolution, and how evolution shapes cancer
Casás-Selves, Matias; DeGregori, James
2013-01-01
Evolutionary theories are critical for understanding cancer development at the level of species as well as at the level of cells and tissues, and for developing effective therapies. Animals have evolved potent tumor suppressive mechanisms to prevent cancer development. These mechanisms were initially necessary for the evolution of multi-cellular organisms, and became even more important as animals evolved large bodies and long lives. Indeed, the development and architecture of our tissues were evolutionarily constrained by the need to limit cancer. Cancer development within an individual is also an evolutionary process, which in many respects mirrors species evolution. Species evolve by mutation and selection acting on individuals in a population; tumors evolve by mutation and selection acting on cells in a tissue. The processes of mutation and selection are integral to the evolution of cancer at every step of multistage carcinogenesis, from tumor genesis to metastasis. Factors associated with cancer development, such as aging and carcinogens, have been shown to promote cancer evolution by impacting both mutation and selection processes. While there are therapies that can decimate a cancer cell population, unfortunately, cancers can also evolve resistance to these therapies, leading to the resurgence of treatment-refractory disease. Understanding cancer from an evolutionary perspective can allow us to appreciate better why cancers predominantly occur in the elderly, and why other conditions, from radiation exposure to smoking, are associated with increased cancers. Importantly, the application of evolutionary theory to cancer should engender new treatment strategies that could better control this dreaded disease. PMID:23705033
Implications of behavioral architecture for the evolution of self-organized division of labor.
Duarte, A; Scholtens, E; Weissing, F J
2012-01-01
Division of labor has been studied separately from a proximate self-organization and an ultimate evolutionary perspective. We aim to bring together these two perspectives. So far this has been done by choosing a behavioral mechanism a priori and considering the evolution of the properties of this mechanism. Here we use artificial neural networks to allow for a more open architecture. We study whether emergent division of labor can evolve in two different network architectures: a simple feedforward network, and a more complex network that includes the possibility of self-feedback from previous experiences. We focus on two aspects of division of labor: worker specialization and the ratio of work performed for each task. Colony fitness is maximized by both reducing idleness and achieving a predefined optimal work ratio. Our results indicate that architectural constraints play an important role for the outcome of evolution. With the simplest network, only genetically determined specialization is possible. This imposes several limitations on worker specialization. Moreover, in order to minimize idleness, networks evolve a biased work ratio, even when an unbiased work ratio would be optimal. By adding self-feedback to the network we increase the network's flexibility, and worker specialization evolves under a wider parameter range. Optimal work ratios are more easily achieved with the self-feedback network, but still provide a challenge when combined with worker specialization. PMID:22457609
A parallel unbalanced digitization architecture to reduce the dynamic range of multiple signals
NASA Astrophysics Data System (ADS)
Vallérian, Mathieu; HuÅ£u, Florin; Villemaud, Guillaume; Miscopein, Benoît; Risset, Tanguy
2016-05-01
Technologies employed in urban sensor networks are permanently evolving, so the gateways used to collect data in such networks have to be very flexible in order to remain compliant with new communication standards. A convenient way to do this is to digitize all the received signals in one shot and then perform the signal processing digitally, as is done in software-defined radio (SDR). Signals can be emitted with very different features (bandwidth, modulation type, and power level) in response to the various propagation conditions. Their difference in power levels is a problem when digitizing them together, as no current commercial analog-to-digital converter (ADC) provides fine enough resolution to cover the high dynamic range between the weakest possible signal and a much stronger signal received at the same time. This paper presents an RF front-end receiver architecture capable of handling this problem by using two ADCs of lower resolution. The architecture is validated through a set of simulations using Keysight's ADS software. The main validation criterion is the bit error rate comparison with a classical receiver.
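The abstract does not give the exact recombination scheme, so as an illustration only, the following Python sketch shows one classic way two lower-resolution ADCs can jointly cover a wide dynamic range: a coarse converter spans the full scale and a fine converter digitizes the residual (a sub-ranging arrangement; the paper's parallel unbalanced architecture differs in detail, and all signal parameters here are made up).

```python
import numpy as np

def quantize(x, full_scale, bits):
    """Ideal uniform quantizer, clipped to +/- full_scale."""
    step = 2 * full_scale / (2 ** bits)
    return np.clip(np.round(x / step) * step, -full_scale, full_scale)

fs = 1e6                                        # sample rate (arbitrary)
t = np.arange(0, 1e-3, 1 / fs)
strong = 1.0 * np.sin(2 * np.pi * 50e3 * t)     # full-scale blocker
weak = 1e-4 * np.sin(2 * np.pi * 120e3 * t)     # -80 dB weak signal
x = strong + weak                               # composite received signal

# Path 1: a coarse 8-bit ADC sees the full composite signal.
coarse = quantize(x, full_scale=1.0, bits=8)
# Path 2: a fine 8-bit ADC digitizes only the small residual left over
# after the coarse estimate is removed (an idealized stand-in for the
# analog stage that would do this subtraction in hardware).
fine = quantize(x - coarse, full_scale=2 ** -7, bits=8)

recombined = coarse + fine                      # ~16-bit effective result
```

The recombined output tracks the composite signal far more closely than the coarse path alone, which is the point of splitting the dynamic range across two converters.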
Uncoupling File System Components for Bridging Legacy and Modern Storage Architectures
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Tilmes, C.; Prathapan, S.; Earp, D. N.; Ashkar, J. S.
2016-12-01
Long-running Earth science projects can span decades of architectural changes in both processing and storage environments. As storage architecture designs change over decades, such projects need to adjust their tools, systems, and expertise to properly integrate the new technologies with their legacy systems. Traditional file systems lack the support needed to accommodate such hybrid storage infrastructure, forcing more complex tool development to encompass every storage architecture used for the project. The MODIS Adaptive Processing System (MODAPS) and the Level 1 and Atmospheres Archive and Distribution System (LAADS) are an example of a project spanning several decades that has evolved into a hybrid storage architecture. MODAPS/LAADS has developed the Lightweight Virtual File System (LVFS), which ensures seamless integration of all the different storage architectures, from standard block-based POSIX-compliant storage disks, to object-based architectures such as the S3-compliant HGST Active Archive System, to Seagate Kinetic disks utilizing the Kinetic protocol. With LVFS, all analysis and processing tools used for the project continue to function unmodified regardless of the underlying storage architecture, enabling MODAPS/LAADS to easily integrate any new storage architecture without the costly need to modify existing tools. Most file systems are designed as a single application responsible for using metadata to organize the data into a tree, determining the location for data storage, and providing a method of data retrieval. We will show how LVFS' unique approach of treating these components in a loosely coupled fashion enables it to merge different storage architectures into a single uniform storage system which bridges the underlying hybrid architecture.
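LVFS internals are not described in the abstract; purely as an illustration of the loosely coupled idea, this Python sketch separates the namespace (the metadata tree) from pluggable storage backends, so a block-store stand-in and an object-store stand-in can sit behind one interface. All class and method names here are hypothetical, not the real LVFS API.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Minimal storage interface: where the bytes live is the backend's business."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class BlockBackend(StorageBackend):
    """Stand-in for a POSIX block store (a dict instead of a real disk)."""
    def __init__(self):
        self._blocks = {}
    def put(self, key, data):
        self._blocks[key] = data
    def get(self, key):
        return self._blocks[key]

class ObjectBackend(StorageBackend):
    """Stand-in for an S3/Kinetic-style object store."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

class VirtualFS:
    """Namespace component: maps paths to (backend, key) pairs without
    caring which storage architecture actually holds the bytes."""
    def __init__(self):
        self._mounts = {}   # path prefix -> backend
        self._index = {}    # path -> (backend, key)
    def mount(self, prefix: str, backend: StorageBackend):
        self._mounts[prefix] = backend
    def write(self, path: str, data: bytes):
        # Route to the backend with the longest matching mount prefix.
        prefix = max((p for p in self._mounts if path.startswith(p)), key=len)
        backend = self._mounts[prefix]
        backend.put(path, data)
        self._index[path] = (backend, path)
    def read(self, path: str) -> bytes:
        backend, key = self._index[path]
        return backend.get(key)

fs = VirtualFS()
fs.mount("/archive/", ObjectBackend())
fs.mount("/", BlockBackend())
fs.write("/archive/granule.hdf", b"L1B data")   # lands in the object store
fs.write("/tmp/scratch.bin", b"work file")      # lands in the block store
```

Tools call only `read`/`write`, so swapping or adding a backend never touches them, which is the property the abstract attributes to LVFS.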
Source technology as the foundation for modern infra-red counter measures (IRCM)
NASA Astrophysics Data System (ADS)
Grasso, Robert J.
2010-10-01
Protection of military aircraft from IR-guided threats is paramount to ensure the survivability of aircrews and platforms, and to ensure mission success. At the foundation of all IRCM systems is the source: the component that provides the in-band radiant energy required for threat defeat. As such, source technology has evolved alongside the evolving IRCM systems architectures: 1) "hot brick" omni-directional sources; 2) arc lamps; and 3) lasers. Lasers, as IRCM sources, continue to evolve to meet the challenges of ever-evolving threats, superior techniques, economy of installation, and superior source technology. Lasers represent the single greatest advance in IRCM source technology and have been used to great effect in all modern IRCM systems, evolving from frequency-doubled CO2 lasers, to solid-state lasers with OPOs, to semiconductor lasers including Quantum Cascade Lasers (QCLs); these last devices represent the latest advance in IRCM source technology, offering all-band coverage, architectural simplicity, and economy of scale. While QCLs represent the latest advance in IRCM laser technology, fiber lasers show much promise in addressing multi-band operation as well as the ability to be coherently combined to achieve even greater output capability. Ultra-short-pulse lasers are also evolving to become practical for IRCM applications. Stay tuned.
Policy-Based Middleware for QoS Management and Signaling in the Evolved Packet System
NASA Astrophysics Data System (ADS)
Good, Richard; Gouveia, Fabricio; Magedanz, Thomas; Ventura, Neco
The 3GPP are currently finalizing their Evolved Packet System (EPS), with the Evolved Packet Core (EPC) central to this framework. The EPC is a simplified, flat, all-IP architecture that supports mobility between heterogeneous access networks and incorporates an evolved QoS concept based on the 3GPP Policy Control and Charging (PCC) framework. The IP Multimedia Subsystem (IMS) is an IP service element within the EPS, introduced for the rapid provisioning of innovative multimedia services. The evolved PCC framework extends the scope of operation and defines new interactions - in particular, the S9 reference point is introduced to facilitate inter-domain PCC communication. This paper proposes an enhancement to the IMS/PCC framework that uses SIP routing information to discover signaling and media paths. This mechanism uses standardized IMS/PCC operations and allows applications to effectively issue resource requests from their home domain, enabling QoS connectivity across multiple domains. Because the mechanism operates at the service control layer, it does not require any significant transport layer modifications or the sharing of potentially sensitive internal topology information. The evolved PCC architecture and inter-domain route discovery mechanisms were implemented in an evaluation testbed and performed favorably without adversely affecting the end-user experience.
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
Ervin Zube and landscape architecture
Paul H. Gobster
2002-01-01
As he grew in his knowledge about the landscape through his involvement in it as a person, student, practitioner, teacher, program director, and researcher, Ervin Zube's ideas about what landscape architecture is and should be continually evolved. He was a prolific writer whose publications span a broad range of audiences, and his contributions to ...
Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bawej, Tomasz; et al.
2014-01-01
TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running in a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability without the software being adequately aware about the IRQ (Interrupt Request), CPU and memory affinities. During the first long shutdown of LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software attempts to wrap the low-level socket library to ease higher-level programming with an API based on an asynchronous event driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations, that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved and the performance measurements of the software in the context of the CMS distributed event building.
Complex network view of evolving manifolds
NASA Astrophysics Data System (ADS)
da Silva, Diamantino C.; Bianconi, Ginestra; da Costa, Rui A.; Dorogovtsev, Sergey N.; Mendes, José F. F.
2018-03-01
We study complex networks formed by triangulations and higher-dimensional simplicial complexes representing closed evolving manifolds. In particular, for triangulations, the set of possible transformations of these networks is restricted by the condition that at each step, all the faces must be triangles. Stochastic application of these operations leads to random networks with different architectures. We perform extensive numerical simulations and explore the geometries of growing and equilibrium complex networks generated by these transformations and their local structural properties. This characterization includes the Hausdorff and spectral dimensions of the resulting networks, their degree distributions, and various structural correlations. Our results reveal a rich zoo of architectures and geometries of these networks, some of which appear to be small worlds while others are finite dimensional with Hausdorff dimension equal to or higher than the original dimensionality of their simplices. The range of spectral dimensions of the evolving triangulations turns out to be from about 1.4 to infinity. Our models include simplicial complexes representing manifolds with evolving topologies, for example, an h-holed torus with a progressively growing number of holes. This evolving graph demonstrates features of a small-world network and has a particularly heavy-tailed degree distribution.
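As a toy illustration of such triangulation-preserving transformations (not the paper's full move set), the following Python sketch grows a network by repeatedly inserting a new vertex into a randomly chosen triangular face and connecting it to the face's three corners, as in random Apollonian networks; every face remains a triangle throughout.

```python
import random
random.seed(1)

# Start from the surface of a tetrahedron: 4 vertices, 4 triangular faces.
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
edges = {(i, j) for f in faces for i in f for j in f if i < j}
n = 4  # next vertex label / current vertex count

for _ in range(100):
    # Growth move: place a new vertex inside a random face and connect it
    # to the face's three corners; one triangle becomes three.
    a, b, c = faces.pop(random.randrange(len(faces)))
    faces += [(a, b, n), (a, c, n), (b, c, n)]
    edges |= {(a, n), (b, n), (c, n)}
    n += 1

degree = {v: 0 for v in range(n)}
for i, j in edges:
    degree[i] += 1
    degree[j] += 1
```

Each move adds one vertex, three edges, and a net two faces, so the structural counts are fully determined even though the resulting geometry (and degree distribution) is random.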
Exploration Space Suit Architecture and Destination Environmental-Based Technology Development
NASA Technical Reports Server (NTRS)
Hill, Terry R.; McFarland, Shane M.; Korona, F. Adam
2013-01-01
This paper continues forward where EVA Space Suit Architecture: Low Earth Orbit Vs. Moon Vs. Mars left off in the development of a space suit architecture that is modular in design and could be reconfigured prior to launch or during any given mission depending on the tasks or destination. This paper addresses the space suit system architecture and technologies required based on human exploration (EVA) destinations, and describes how these systems should evolve to meet the future exploration EVA needs of the US human space flight program. A series of exercises and analyses provided a strong indication that the Constellation Program space suit architecture, with its maximum reuse of technology and functionality across a range of mission profiles and destinations, is postured to provide a viable solution for future space exploration missions. The destination environmental analysis demonstrates that the modular architecture approach could provide the lowest mass and mission cost for the protection of the crew, given any human mission outside of low-Earth orbit. Additionally, some of the high-level trades presented here provide a review of the environmental and non-environmental design drivers that will become increasingly important as humans venture farther from Earth. This paper demonstrates a logical clustering of destination design environments that allows a focused approach to technology prioritization, development, and design that will maximize the return on investment, largely independent of any particular design reference mission.
NASA Technical Reports Server (NTRS)
Sjauw, Waldy K.; McGuire, Melissa L.; Freeh, Joshua E.
2016-01-01
Recent NASA interest in human missions to Mars has led to an Evolvable Mars Campaign by the agency's Human Architecture Team. SEP-based systems are employed to deliver the crew return propulsion stages and the Mars surface landers because their high specific impulse enables missions requiring less propellant, albeit with longer transfer times. The Earth departure trajectories start from an SLS launch vehicle delivery orbit and are spiral shaped because of the low SEP thrust. Previous studies have led to interest in assessing how trip time divides between the Earth departure and interplanetary legs of the mission for a representative SEP cargo vehicle.
Combining Solar Electric Propulsion and Chemical Propulsion for Crewed Missions to Mars
NASA Technical Reports Server (NTRS)
Percy, Tom; McGuire, Melissa; Polsgrove, Tara
2015-01-01
This paper documents the results of an investigation of human Mars mission architectures that leverage near-term technology investments and infrastructures resulting from the planned Asteroid Redirect Robotic Mission (ARRM), including high-power Solar Electric Propulsion (SEP) and a human presence in Lunar Distant Retrograde Orbit (LDRO). The architectures investigated use a combination of SEP and chemical propulsion elements. Through this combination of propulsion technologies, these architectures take advantage of the high efficiency SEP propulsion system to deliver cargo, while maintaining the faster trip times afforded by chemical propulsion for crew transport. Evolved configurations of the Asteroid Redirect Vehicle (ARV) are considered for cargo delivery. Sensitivities to SEP system design parameters, including power level and propellant quantity, are presented. For the crew delivery, liquid oxygen and methane stages were designed using engines common to future human Mars landers. Impacts of various Earth departure orbits, Mars loiter orbits, and Earth return strategies are presented. The use of the Space Launch System for delivery of the various architecture elements was also investigated and launch vehicle manifesting, launch scheduling and mission timelines are also discussed. The study results show that viable Mars architecture can be constructed using LDRO and SEP in order to take advantage of investments made in the ARRM mission.
A Principled Approach to the Specification of System Architectures for Space Missions
NASA Technical Reports Server (NTRS)
McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad
2015-01-01
Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
Secure Remote Access Issues in a Control Center Environment
NASA Technical Reports Server (NTRS)
Pitts, Lee; McNair, Ann R. (Technical Monitor)
2002-01-01
The ISS finally reached an operational state and exists for local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems by internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to a system which supports local and remote access to the ISS over broad geographic regions. This has been accomplished through the use of an evolved security strategy, PKI, and custom design. Through this paper, descriptions of the migration process and the lessons learned are presented. This will include product decision criteria, rationale, and the use of commodity products in the end architecture. This paper will also stress the need for interoperability of various products and the effects of seemingly insignificant details.
Secure Payload Access to the International Space Station
NASA Technical Reports Server (NTRS)
Pitts, R. Lee; Reid, Chris
2002-01-01
The ISS finally reached an operational state and exists for local and remote users. Onboard payload systems are managed by the Huntsville Operations Support Center (HOSC). Users access HOSC systems by internet protocols in support of daily operations, preflight simulation, and test. In support of this diverse user community, a modern security architecture has been implemented. The architecture has evolved over time from an isolated but open system to a system which supports local and remote access to the ISS over broad geographic regions. This has been accomplished through the use of an evolved security strategy, PKI, and custom design. Through this paper, descriptions of the migration process and the lessons learned are presented. This will include product decision criteria, rationale, and the use of commodity products in the end architecture. This paper will also stress the need for interoperability of various products and the effects of seemingly insignificant details.
The genetic architecture of gene expression levels in wild baboons.
Tung, Jenny; Zhou, Xiang; Alberts, Susan C; Stephens, Matthew; Gilad, Yoav
2015-02-25
Primate evolution has been argued to result, in part, from changes in how genes are regulated. However, we still know little about gene regulation in natural primate populations. We conducted an RNA sequencing (RNA-seq)-based study of baboons from an intensively studied wild population. We performed complementary expression quantitative trait locus (eQTL) mapping and allele-specific expression analyses, discovering substantial evidence for, and surprising power to detect, genetic effects on gene expression levels in the baboons. eQTL were most likely to be identified for lineage-specific, rapidly evolving genes; interestingly, genes with eQTL significantly overlapped between baboons and a comparable human eQTL data set. Our results suggest that genes vary in their tolerance of genetic perturbation, and that this property may be conserved across species. Further, they establish the feasibility of eQTL mapping using RNA-seq data alone, and represent an important step towards understanding the genetic architecture of gene expression in primates. DOI: http://dx.doi.org/10.7554/eLife.04729.001 PMID:25714927
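The eQTL mapping described above boils down, per gene-SNP pair, to testing for an association between allele dosage and expression level. A minimal sketch follows, using simulated data and ordinary least squares rather than the mixed models typically used for structured wild populations; the effect size and sample size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 500                                         # individuals
maf = 0.3                                       # minor allele frequency
genotype = rng.binomial(2, maf, size=n)         # 0/1/2 allele dosage per individual
beta = 0.8                                      # true eQTL effect size
expression = beta * genotype + rng.normal(0.0, 1.0, size=n)

# Simple eQTL association test: regress expression on dosage (with intercept).
X = np.column_stack([np.ones(n), genotype])
coef, *_ = np.linalg.lstsq(X, expression, rcond=None)
beta_hat = coef[1]                              # estimated per-allele effect

# t-statistic for the dosage coefficient.
resid = expression - X @ coef
sigma2 = resid @ resid / (n - 2)
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
tstat = beta_hat / se
```

A genome-wide scan repeats this test across many gene-SNP pairs and corrects for multiple testing; the allele-specific expression analysis in the paper is a complementary within-individual test not sketched here.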
A Successful Component Architecture for Interoperable and Evolvable Ground Data Systems
NASA Technical Reports Server (NTRS)
Smith, Danford S.; Bristow, John O.; Wilmot, Jonathan
2006-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has adopted an open architecture approach for satellite control centers and is now realizing benefits beyond those originally envisioned. The Goddard Mission Services Evolution Center (GMSEC) architecture utilizes standardized interfaces and a middleware software bus to allow functional components to be easily integrated. This paper presents the GMSEC architectural goals and concepts, the capabilities enabled and the benefits realized by adopting this framework approach. NASA experiences with applying the GMSEC architecture on multiple missions are discussed. The paper concludes with a summary of lessons learned, future directions for GMSEC and the possible applications beyond NASA GSFC.
Options for a lunar base surface architecture
NASA Technical Reports Server (NTRS)
Roberts, Barney B.
1992-01-01
The Planet Surface Systems Office at the NASA Johnson Space Center has participated in an analysis of the Space Exploration Initiative architectures described in the Synthesis Group report. This effort involves a Systems Engineering and Integration effort to define point designs for evolving lunar and Mars bases that support substantial science, exploration, and resource production objectives. The analysis addresses systems-level designs; element requirements and conceptual designs; assessments of precursor and technology needs; and overall programmatics and schedules. This paper focuses on the results of the study of the Space Resource Utilization Architecture. This architecture develops the capability to extract useful materials from the indigenous resources of the Moon and Mars. On the Moon, a substantial infrastructure is emplaced which can support a crew of up to twelve. Two major process lines are developed: one produces oxygen, ceramics, and metals; the other produces hydrogen, helium, and other volatiles. The Moon is also used for a simulation of a Mars mission. Significant science capabilities are established in conjunction with resource development. Exploration includes remote global surveys and piloted sorties of local and regional areas. Science accommodations include planetary science, astronomy, and biomedical research. Greenhouses are established to provide a substantial amount of food needs.
Engineering Design Graphics: Into the 21st Century
ERIC Educational Resources Information Center
Harris, La Verne Abe; Meyers, Frederick
2007-01-01
Graphical plans for construction of machinery and architecture have evolved over the last 6,000 years beginning from hieroglyphics to drawings on printable media, from the "Golden Age" of engineering graphics to the innovation of computer graphics and prototyping. The evolution of engineering design graphics as a profession has also evolved. Years…
Plant Nitrogen Acquisition Under Low Availability: Regulation of Uptake and Root Architecture
Kiba, Takatoshi; Krapp, Anne
2016-01-01
Nitrogen availability is a major factor determining plant growth and productivity. Plants acquire nitrogen nutrients from the soil through their roots mostly in the form of ammonium and nitrate. Since these nutrients are scarce in natural soils, plants have evolved adaptive responses to cope with the environment. One of the most important responses is the regulation of nitrogen acquisition efficiency. This review provides an update on the molecular determinants of two major drivers of the nitrogen acquisition efficiency: (i) uptake activity (e.g. high-affinity nitrogen transporters) and (ii) root architecture (e.g. low-nitrogen-availability-specific regulators of primary and lateral root growth). Major emphasis is laid on the regulation of these determinants by nitrogen supply at the transcriptional and post-transcriptional levels, which enables plants to optimize nitrogen acquisition efficiency under low nitrogen availability. PMID:27025887
Ape parasite origins of human malaria virulence genes
Larremore, Daniel B.; Sundararaman, Sesh A.; Liu, Weimin; Proto, William R.; Clauset, Aaron; Loy, Dorothy E.; Speede, Sheri; Plenderleith, Lindsey J.; Sharp, Paul M.; Hahn, Beatrice H.; Rayner, Julian C.; Buckee, Caroline O.
2015-01-01
Antigens encoded by the var gene family are major virulence factors of the human malaria parasite Plasmodium falciparum, exhibiting enormous intra- and interstrain diversity. Here we use network analysis to show that var architecture and mosaicism are conserved at multiple levels across the Laverania subgenus, based on var-like sequences from eight single-species and three multi-species Plasmodium infections of wild-living or sanctuary African apes. Using select whole-genome amplification, we also find evidence of multi-domain var structure and synteny in Plasmodium gaboni, one of the ape Laverania species most distantly related to P. falciparum, as well as a new class of Duffy-binding-like domains. These findings indicate that the modular genetic architecture and sequence diversity underlying var-mediated host-parasite interactions evolved before the radiation of the Laverania subgenus, long before the emergence of P. falciparum. PMID:26456841
Securing the Global Airspace System Via Identity-Based Security
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2015-01-01
Current telecommunications systems have very good security architectures that include authentication and authorization as well as accounting. These three features enable an edge system to obtain access into a radio communication network, request specific Quality-of-Service (QoS) requirements and ensure proper billing for service. Furthermore, the links are secure. Widely used telecommunication technologies are Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX). This paper provides a system-level view of network-centric operations for the global airspace system and the problems and issues with deploying new technologies into the system. The paper then focuses on applying the basic security architectures of commercial telecommunication systems and deployment of federated Authentication, Authorization and Accounting systems to provide a scalable, evolvable, reliable and maintainable solution to enable a globally deployable identity-based secure airspace system.
The NASA Space Communications Data Networking Architecture
NASA Technical Reports Server (NTRS)
Israel, David J.; Hooke, Adrian J.; Freeman, Kenneth; Rush, John J.
2006-01-01
The NASA Space Communications Architecture Working Group (SCAWG) has recently been developing an integrated agency-wide space communications architecture in order to provide the necessary communication and navigation capabilities to support NASA's new Exploration and Science Programs. A critical element of the space communications architecture is the end-to-end Data Networking Architecture, which must provide a wide range of services required for missions ranging from planetary rovers to human spaceflight, and from sub-orbital space to deep space. Requirements for a higher degree of user autonomy and interoperability between a variety of elements must be accommodated within an architecture that necessarily features minimum operational complexity. The architecture must also be scalable and evolvable to meet mission needs for the next 25 years. This paper will describe the recommended NASA Data Networking Architecture, present some of the rationale for the recommendations, and will illustrate an application of the architecture to example NASA missions.
Security Policy for a Generic Space Exploration Communication Network Architecture
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Sheehe, Charles J.; Vaden, Karl R.
2016-01-01
This document is one of three. It describes various security mechanisms and a security policy profile for a generic space-based communication architecture. Two other documents accompany this document- an Operations Concept (OpsCon) and a communication architecture document. The OpsCon should be read first followed by the security policy profile described by this document and then the architecture document. The overall goal is to design a generic space exploration communication network architecture that is affordable, deployable, maintainable, securable, evolvable, reliable, and adaptable. The architecture should also require limited reconfiguration throughout system development and deployment. System deployment includes subsystem development in a factory setting, system integration in a laboratory setting, launch preparation, launch, and deployment and operation in space.
Viability of a Reusable In-Space Transportation System
NASA Technical Reports Server (NTRS)
Jefferies, Sharon A.; McCleskey, Carey M.; Nufer, Brian M.; Lepsch, Roger A.; Merrill, Raymond G.; North, David D.; Martin, John G.; Komar, David R.
2015-01-01
The National Aeronautics and Space Administration (NASA) is currently developing options for an Evolvable Mars Campaign (EMC) that expands human presence from Low Earth Orbit (LEO) into the solar system and to the surface of Mars. The Hybrid in-space transportation architecture is one option being investigated within the EMC. The architecture enables return of the entire in-space propulsion stage and habitat to cis-lunar space after a round trip to Mars. This concept of operations opens the door for a fully reusable Mars transportation system from cis-lunar space to a Mars parking orbit and back. This paper explores the reuse of in-space transportation systems, with a focus on the propulsion systems. It begins by examining why reusability should be pursued and defines reusability in space-flight context. A range of functions and enablers associated with preparing a system for reuse are identified and a vision for reusability is proposed that can be advanced and implemented as new capabilities are developed. Following this, past reusable spacecraft and servicing capabilities, as well as those currently in development are discussed. Using the Hybrid transportation architecture as an example, an assessment of the degree of reusability that can be incorporated into the architecture with current capabilities is provided and areas for development are identified that will enable greater levels of reuse in the future. Implications and implementation challenges specific to the architecture are also presented.
Kokkos: Enabling manycore performance portability through polymorphic memory access patterns
Carter Edwards, H.; Trott, Christian R.; Sunderland, Daniel
2014-07-22
The manycore revolution can be characterized by increasing thread counts, decreasing memory per thread, and diversity of continually evolving manycore architectures. High performance computing (HPC) applications and libraries must exploit increasingly finer levels of parallelism within their codes to sustain scalability on these devices. We found that a major obstacle to performance portability is the diverse and conflicting set of constraints on memory access patterns across devices. Contemporary portable programming models address manycore parallelism (e.g., OpenMP, OpenACC, OpenCL) but fail to address memory access patterns. The Kokkos C++ library enables applications and domain libraries to achieve performance portability on diverse manycore architectures by unifying abstractions for both fine-grain data parallelism and memory access patterns. In this paper we describe Kokkos' abstractions, summarize its application programmer interface (API), present performance results for unit-test kernels and mini-applications, and outline an incremental strategy for migrating legacy C++ codes to Kokkos. Furthermore, the Kokkos library is under active research and development to incorporate capabilities from new generations of manycore architectures, and to address a growing list of applications and domain libraries.
Comparing the OpenMP, MPI, and Hybrid Programming Paradigm on an SMP Cluster
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; anMey, Dieter; Hatay, Ferhat F.
2003-01-01
With the advent of parallel hardware and software technologies users are faced with the challenge to choose a programming paradigm best suited for the underlying computer architecture. With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors (SMP), parallel programming techniques have evolved to support parallelism beyond a single level. Which programming paradigm is the best will depend on the nature of the given problem, the hardware architecture, and the available software. In this study we will compare different programming paradigms for the parallelization of a selected benchmark application on a cluster of SMP nodes. We compare the timings of different implementations of the same CFD benchmark application employing the same numerical algorithm on a cluster of Sun Fire SMP nodes. The rest of the paper is structured as follows: In section 2 we briefly discuss the programming models under consideration. We describe our compute platform in section 3. The different implementations of our benchmark code are described in section 4 and the performance results are presented in section 5. We conclude our study in section 6.
Evaluation of an Atmosphere Revitalization Subsystem for Deep Space Exploration Missions
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Abney, Morgan B.; Conrad, Ruth E.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Knox, James C.; Newton, Robert L.; Parrish, Keith J.; Takada, Kevin C.;
2015-01-01
An Atmosphere Revitalization Subsystem (ARS) suitable for deployment aboard deep space exploration mission vehicles has been developed and functionally demonstrated. This modified ARS process design architecture was derived from the International Space Station's (ISS) basic ARS. Primary functions considered in the architecture include trace contaminant control, carbon dioxide removal, carbon dioxide reduction, and oxygen generation. Candidate environmental monitoring instruments were also evaluated. The process architecture rearranges unit operations and employs equipment operational changes to reduce mass, simplify, and improve the functional performance for trace contaminant control, carbon dioxide removal, and oxygen generation. Results from integrated functional demonstration are summarized and compared to the performance observed during previous testing conducted on an ISS-like subsystem architecture and a similarly evolved process architecture. Considerations for further subsystem architecture and process technology development are discussed.
Operational Concepts for a Generic Space Exploration Communication Network Architecture
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Vaden, Karl R.; Jones, Robert E.; Roberts, Anthony M.
2015-01-01
This document is one of three. It describes the Operational Concept (OpsCon) for a generic space exploration communication architecture. The purpose of this particular document is to identify communication flows and data types. Two other documents accompany this document, a security policy profile and a communication architecture document. The operational concepts should be read first followed by the security policy profile and then the architecture document. The overall goal is to design a generic space exploration communication network architecture that is affordable, deployable, maintainable, securable, evolvable, reliable, and adaptable. The architecture should also require limited reconfiguration throughout system development and deployment. System deployment includes: subsystem development in a factory setting, system integration in a laboratory setting, launch preparation, launch, and deployment and operation in space.
Dynamics and design principles of a basic regulatory architecture controlling metabolic pathways.
Chin, Chen-Shan; Chubukov, Victor; Jolly, Emmitt R; DeRisi, Joe; Li, Hao
2008-06-17
The dynamic features of a genetic network's response to environmental fluctuations represent essential functional specifications and thus may constrain the possible choices of network architecture and kinetic parameters. To explore the connection between dynamics and network design, we have analyzed a general regulatory architecture that is commonly found in many metabolic pathways. Such architecture is characterized by a dual control mechanism, with end product feedback inhibition and transcriptional regulation mediated by an intermediate metabolite. As a case study, we measured with high temporal resolution the induction profiles of the enzymes in the leucine biosynthetic pathway in response to leucine depletion, using an automated system for monitoring protein expression levels in single cells. All the genes in the pathway are known to be coregulated by the same transcription factors, but we observed drastically different dynamic responses for enzymes upstream and immediately downstream of the key control point-the intermediate metabolite alpha-isopropylmalate (alphaIPM), which couples metabolic activity to transcriptional regulation. Analysis based on genetic perturbations suggests that the observed dynamics are due to differential regulation by the leucine branch-specific transcription factor Leu3, and that the downstream enzymes are strictly controlled and highly expressed only when alphaIPM is available. These observations allow us to build a simplified mathematical model that accounts for the observed dynamics and can correctly predict the pathway's response to new perturbations. Our model also suggests that transient dynamics and steady state can be separately tuned and that the high induction levels of the downstream enzymes are necessary for fast leucine recovery. It is likely that principles emerging from this work can reveal how gene regulation has evolved to optimize performance in other metabolic pathways with similar architecture.
Space station needs, attributes and architectural options. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1983-01-01
The uses alignment plan was implemented. The existing data bank was used to define a large number of station requirements. Ten to 20 valid mission scenarios were developed. Architectural options as they are influenced by communications operations, subsystem evolvability, and required technology growth are defined. Costing of evolutionary concepts, alternative approaches, and options, was based on minimum design details.
Decision Aids Using Heterogeneous Intelligence Analysis
2010-08-20
developing a Geocultural service, a software framework and inferencing engine for the Transparent Urban Structures program. The scope of the effort...has evolved as the program has matured and is including multiple data sources, as well as interfaces out to the ONR architectural framework. Tasks...
Benefits of Mars ISRU Regolith Water Processing: A Case Study for the NASA Evolvable Mars Campaign
NASA Technical Reports Server (NTRS)
Kleinhenz, Julie; Paz, Aaron; Mueller, Robert
2016-01-01
ISRU of Mars resources was baselined in the 2009 Design Reference Architecture (DRA) 5.0, but only for oxygen production using atmospheric CO2. The methane (LCH4) needed for ascent propulsion of the Mars Ascent Vehicle (MAV) would need to be brought from Earth. However, extracting water from the Martian regolith enables the production of both oxygen and methane from Mars resources. Water resources could also be used for other applications, including life support, radiation shielding, and plant growth. Water extraction was not baselined in DRA 5.0 due to perceived difficulties and complexity in processing regolith. The NASA Evolvable Mars Campaign (EMC) requested studies to look at the quantitative benefits and trades of using Mars water ISRU. Phase 1 examined architecture scenarios for regolith water retrieval and was completed in October 2015. Phase 2 is a deep dive of one architecture concept to look at end-to-end system size, mass, and power of a LCH4/LO2 ISRU production system.
System analysis of graphics processor architecture using virtual prototyping
NASA Astrophysics Data System (ADS)
Hancock, William R.; Groat, Jeff; Steeves, Todd; Spaanenburg, Henk; Shackleton, John
1995-06-01
Honeywell has been actively involved in the definition of the next generation display processors for military and commercial cockpits. A major concern is how to achieve super graphics workstation performance in avionics applications. Most notable are requirements for low volume, low power, harsh environmental conditions, real-time performance and low cost. This paper describes the application of VHDL to the system analysis tasks associated with achieving these goals in a cost effective manner. The paper will describe the top level architecture identified to provide the graphical and video processing power needed to drive future high resolution display devices and to generate more natural panoramic 3D formats. The major discussion, however, will be on the use of VHDL to model the processing elements and customized pipelines needed to realize the architecture and for doing the complex system tradeoff studies necessary to achieve a cost effective implementation. New software tools have been developed to allow 'virtual' prototyping in the VHDL environment. This results in a hardware/software codesign using VHDL performance and functional models. This unique architectural tool allows simulation and tradeoffs within a standard and tightly integrated toolset, which eventually will be used to specify and design the entire system from the top level requirements and system performance to the lowest level individual ASICs. New processing elements, algorithms, and standard graphical inputs can be designed, tested and evaluated without costly hardware prototyping using the innovative 'virtual' prototyping techniques which are evolving on this project. In addition, virtual prototyping of the display processor does not bind the preliminary design to point solutions as a physical prototype will. When the development schedule is known, one can extrapolate processing element performance and design the system around the most current technology.
NASA Astrophysics Data System (ADS)
Bejan, Adrian
2017-03-01
This review covers two aspects of "evolution" in thermodynamics. First, with the constructal law, thermodynamics is becoming the domain of physics that accounts for the phenomenon of evolution in nature, in general. Second, thermodynamics (and science generally) is the evolving add-on that empowers humans to predict the future and move more easily on earth, farther and longer in time. The part of nature that thermodynamics represents is this: nothing moves by itself unless it is driven by power, which is then destroyed (dissipated) during movement. Nothing evolves unless it flows and has the freedom to change its architecture such that it provides greater and easier access to the available space. Thermodynamics is the modern science of heat and work and their usefulness, which comes from converting the work (power) into movement (life) in flow architectures that evolve over time to facilitate movement. I also review the rich history of the science, and I clarify misconceptions regarding the second law, entropy, disorder, and the arrow of time, and the supposed analogy between heat and work.
Silicon Nanophotonics for Many-Core On-Chip Networks
NASA Astrophysics Data System (ADS)
Mohamed, Moustafa
The number of cores in many-core architectures is scaling to unprecedented levels, requiring ever-increasing communication capacity. Traditionally, architects follow the path of higher throughput at the expense of latency. This trend has become problematic for performance in many-core architectures. Moreover, power consumption is increasing with system scaling, mandating nontraditional solutions. Nanophotonics can address these problems, offering benefits in the three frontiers of many-core processor design: latency, bandwidth, and power. Nanophotonics leverages circuit-switching flow control, allowing low latency; in addition, the power consumption of optical links is significantly lower compared to their electrical counterparts at intermediate and long links. Finally, through wave division multiplexing, we can keep the high bandwidth trends without sacrificing throughput. This thesis focuses on realizing nanophotonics for communication in many-core architectures at different design levels, considering reliability challenges that our fabrication and measurements reveal. First, we study how to design on-chip networks for low latency, low power, and high bandwidth by exploiting the full potential of nanophotonics. The design process considers device-level limitations and capabilities on one hand, and system-level demands in terms of power and performance on the other hand. The design involves the choice of devices, designing the optical link, the topology, the arbitration technique, and the routing mechanism. Next, we address the problem of reliability in on-chip networks. Reliability not only degrades performance but can block communication. Hence, we propose a reliability-aware design flow and present a reliability management technique based on this flow to address reliability in the system. In the proposed flow, reliability is modeled and analyzed at the device, architecture, and system levels.
Our reliability management technique is superior to existing solutions in terms of power and performance. In fact, our solution can scale to a thousand cores with low overhead.
Control Center Technology Conference Proceedings
NASA Technical Reports Server (NTRS)
1991-01-01
Conference papers and presentations are compiled and cover evolving architectures and technologies applicable to flight control centers. Advances by NASA Centers and the aerospace industry are presented.
Diverse Cis-Regulatory Mechanisms Contribute to Expression Evolution of Tandem Gene Duplicates
Baudouin-Gonzalez, Luís; Santos, Marília A; Tempesta, Camille; Sucena, Élio; Roch, Fernando; Tanaka, Kohtaro
2017-01-01
Pairs of duplicated genes generally display a combination of conserved expression patterns inherited from their unduplicated ancestor and newly acquired domains. However, how the cis-regulatory architecture of duplicated loci evolves to produce these expression patterns is poorly understood. We have directly examined the gene-regulatory evolution of two tandem duplicates, the Drosophila Ly6 genes CG9336 and CG9338, which arose at the base of the drosophilids between 40 and 60 Ma. Comparing the expression patterns of the two paralogs in four Drosophila species with that of the unduplicated ortholog in the tephritid Ceratitis capitata, we show that they diverged from each other as well as from the unduplicated ortholog. Moreover, the expression divergence appears to have occurred close to the duplication event and also more recently in a lineage-specific manner. The comparison of the tissue-specific cis-regulatory modules (CRMs) controlling the paralog expression in the four Drosophila species indicates that diverse cis-regulatory mechanisms, including the novel tissue-specific enhancers, differential inactivation, and enhancer sharing, contributed to the expression evolution. Our analysis also reveals a surprisingly variable cis-regulatory architecture, in which the CRMs driving conserved expression domains change in number, location, and specificity. Altogether, this study provides a detailed historical account that uncovers a highly dynamic picture of how the paralog expression patterns and their underlying cis-regulatory landscape evolve. We argue that our findings will encourage studying cis-regulatory evolution at the whole-locus level to understand how interactions between enhancers and other regulatory levels shape the evolution of gene expression. PMID:28961967
Kiefer, Gundolf; Lehmann, Helko; Weese, Jürgen
2006-04-01
Maximum intensity projections (MIPs) are an important visualization technique for angiographic data sets. Efficient data inspection requires frame rates of at least five frames per second at preserved image quality. Despite the advances in computer technology, this task remains a challenge. On the one hand, the sizes of computed tomography and magnetic resonance images are increasing rapidly. On the other hand, rendering algorithms do not automatically benefit from the advances in processor technology, especially for large data sets. This is due to the faster evolving processing power and the slower evolving memory access speed, which is bridged by hierarchical cache memory architectures. In this paper, we investigate memory access optimization methods and use them for generating MIPs on general-purpose central processing units (CPUs) and graphics processing units (GPUs), respectively. These methods can work on any level of the memory hierarchy, and we show that properly combined methods can optimize memory access on multiple levels of the hierarchy at the same time. We present performance measurements to compare different algorithm variants and illustrate the influence of the respective techniques. On current hardware, the efficient handling of the memory hierarchy for CPUs improves the rendering performance by a factor of 3 to 4. On GPUs, we observed that the effect is even larger, especially for large data sets. The methods can easily be adjusted to different hardware specifics, although their impact can vary considerably. They can also be used for other rendering techniques than MIPs, and their use for more general image processing tasks could be investigated in the future.
Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)
2002-01-01
Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role in the development of new missions, resulting in a complex interplay of a broad range of factors in the mission development and planning of new missions. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.
Deep-Space Optical Communications: Visions, Trends, and Prospects
NASA Technical Reports Server (NTRS)
Cesarone, R. J.; Abraham, D. S.; Shambayati, S.; Rush, J.
2011-01-01
Current key initiatives in deep-space optical communications are treated in terms of historical context, contemporary trends, and prospects for the future. An architectural perspective focusing on high-level drivers, systems, and related operations concepts is provided. Detailed subsystem and component topics are not addressed. A brief overview of past ideas and architectural concepts sets the stage for current developments. Current requirements that might drive a transition from radio frequencies to optical communications are examined. These drivers include mission demand for data rates and/or data volumes; spectrum to accommodate such data rates; and desired power, mass, and cost benefits. As is typical, benefits come with associated challenges. For optical communications, these include atmospheric effects, link availability, pointing, and background light. The paper describes how NASA's Space Communication and Navigation Office will respond to the drivers, achieve the benefits, and mitigate the challenges, as documented in its Optical Communications Roadmap. Some nontraditional architectures and operations concepts are advanced in an effort to realize benefits and mitigate challenges as quickly as possible. Radio frequency communications is considered as both a competitor to and a partner with optical communications. The paper concludes with some suggestions for two affordable first steps that can yet evolve into capable architectures that will fulfill the vision inherent in optical communications.
Surface protection in bio-shields via a functional soft skin layer: Lessons from the turtle shell.
Shelef, Yaniv; Bar-On, Benny
2017-09-01
The turtle shell is a functional bio-shielding element, which has evolved naturally to provide protection against predator attacks that involve biting and clawing. The near-surface architecture of the turtle shell includes a soft bi-layer skin coating - rather than a hard exterior - which functions as a first line of defense against surface damage. This architecture represents a novel type of bio-shielding configuration, namely, an inverse structural-mechanical design, rather than the hard-coated bio-shielding elements identified so far. In the current study, we used experimentally based structural modeling and FE simulations to analyze the mechanical significance of this unconventional protection architecture in terms of resistance to surface damage upon extensive indentations. We found that the functional bi-layer skin of the turtle shell, which provides graded (soft-softer-hard) mechanical characteristics to the bio-shield exterior, serves as a bumper-buffer mechanism. This material-level adaptation protects the inner core from the highly localized indentation loads via stress delocalization and extensive near-surface plasticity. The newly revealed functional bi-layer coating architecture can potentially be adapted, using synthetic materials, to considerably enhance the surface load-bearing capabilities of various engineering configurations.
NASA Astrophysics Data System (ADS)
Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario
2010-08-01
The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) providing the on-board control functions. In recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining a more efficient and safe ASW production, with positive implications also on the software validation activities. This paper is dedicated to the characterisation activity which has been performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it has changed following the introduction of the new design features.
NASA Technical Reports Server (NTRS)
Keyes, Jennifer; Troutman, Patrick A.; Saucillo, Rudolph; Cirillo, William M.; Cavanaugh, Steve; Stromgren, Chel
2006-01-01
The NASA Langley Research Center (LaRC) Systems Analysis & Concepts Directorate (SACD) began studying human exploration missions beyond low Earth orbit (LEO) in 1999. This included participation in NASA's Decadal Planning Team (DPT), the NASA Exploration Team (NExT), Space Architect studies and Revolutionary Aerospace Systems Concepts (RASC) architecture studies that were used in formulating the new Vision for Space Exploration. In May of 2005, NASA initiated the Exploration Systems Architecture Study (ESAS). The primary outputs of the ESAS activity were concepts and functional requirements for the Crew Exploration Vehicle (CEV), its supporting launch vehicle infrastructure, and identification of supporting technology requirements and investments. An exploration systems analysis capability has evolved to support these functions in the past and continues to evolve to support anticipated future needs. SACD had significant roles in supporting the ESAS study team. SACD personnel performed the liaison function between the ESAS team and the Shuttle/Station Configuration Options Team (S/SCOT), an agency-wide team charged with using the Space Shuttle to complete the International Space Station (ISS) by the end of Fiscal Year (FY) 2010. The most significant of the identified issues involved the ability of the Space Shuttle system to achieve the desired number of flights in the proposed time frame. SACD, with support from the Kennedy Space Center, performed analysis showing that, without significant investments in improving the shuttle processing flow, there was almost no possibility of completing the 28-flight sequence by the end of 2010. SACD performed numerous Lunar Surface Access Module (LSAM) trades to define top-level element requirements and establish architecture propellant needs.
Configuration trades were conducted to determine the impact of varying degrees of segmentation of the living capabilities of the combined descent stage, ascent stage, and other elements. The technology assessment process was developed and implemented by SACD as the ESAS architecture was refined. SACD implemented a rigorous and objective process which included (a) establishing architectural functional needs, (b) collection, synthesis and mapping of technology data, and (c) performing an objective decision analysis resulting in technology development investment recommendations. The investment recommendation provided budget, schedule, and center/program allocations to develop required technologies for the exploration architecture, as well as the identification of other investment opportunities to maximize performance and flexibility while minimizing cost and risk. A summary of the trades performed and methods utilized by SACD for the Exploration Systems Architecture Study (ESAS) activity is presented along with how SACD is currently supporting the implementation of the Vision for Space Exploration.
EarthCube as an information resource marketplace; the GEAR Project conceptual design
NASA Astrophysics Data System (ADS)
Richard, S. M.; Zaslavsky, I.; Gupta, A.; Valentine, D.
2015-12-01
Geoscience Architecture for Research (GEAR) is approaching EarthCube design as a complex and evolving socio-technical federation of systems. EarthCube is intended to support the science research enterprise, for which there is no centralized command and control, requirements are a moving target, the function and behavior of the system must evolve and adapt as new scientific paradigms emerge, and system participants are conducting research that inherently implies seeking new ways of doing things. EarthCube must address evolving user requirements and enable domain and project systems developed under different management and for different purposes to work together. The EC architecture must focus on creating a technical environment that enables new capabilities by combining existing and newly developed resources in various ways, and encourages development of new resource designs intended for re-use and interoperability. In a sense, instead of a single architecture design, GEAR provides a way to accommodate multiple designs tuned to different tasks. This agile, adaptive, evolutionary software development style is based on a continuously updated portfolio of compatible components that enable new sub-system architecture. System users make decisions about which components to use in this marketplace based on performance, satisfaction, and impact metrics collected continuously to evaluate components, determine priorities, and guide resource allocation decisions by the system governance agency. EC is designed as a federation of independent systems, and although the coordinator of the EC system may be named an enterprise architect, the focus of the role needs to be organizing resources, assessing their readiness for interoperability with the existing EC component inventory, managing dependencies between transient subsystems, developing mechanisms for stakeholder engagement and inclusion, and negotiating standard interfaces, rather than actually specifying components.
Composition of components will be developed by projects that involve both domain scientists and CI experts for specific research problems. We believe an agile, marketplace type approach is an essential architectural strategy for EarthCube.
A flexible architecture for advanced process control solutions
NASA Astrophysics Data System (ADS)
Faron, Kamyar; Iourovitski, Ilia
2005-05-01
Advanced Process Control (APC) is now mainstream practice in the semiconductor manufacturing industry. Over the past decade and a half APC has evolved from a "good idea" and "wouldn't it be great" concept to mandatory manufacturing practice. APC developments have primarily dealt with two major thrusts, algorithms and infrastructure, and often the line between them has been blurred. The algorithms have evolved from very simple single-variable solutions to sophisticated and cutting-edge adaptive multivariable (input and output) solutions. Spending patterns in recent times have demanded that the economics of a comprehensive APC infrastructure be completely justified for any and all cost-conscious manufacturers. There are studies suggesting integration costs as high as 60% of the total APC solution costs. Such cost-prohibitive figures clearly diminish the return on APC investments. This has limited the acceptance and development of pure APC infrastructure solutions for many fabs. Modern APC solution architectures must satisfy a wide array of requirements, from very manual R&D environments to very advanced and automated "lights out" manufacturing facilities. A majority of commercially available control solutions and most in-house developed solutions lack important attributes of scalability, flexibility, and adaptability, and hence require significant resources for integration, deployment, and maintenance. Many APC improvement efforts have been abandoned or delayed due to legacy systems and inadequate architectural design. Recent advancements (Service Oriented Architectures) in the software industry have delivered ideal technologies for delivering scalable, flexible, and reliable solutions that can seamlessly integrate into any fab's existing systems and business practices. In this publication we shall evaluate the various attributes of the architectures required by fabs and illustrate the benefits of a Service Oriented Architecture to satisfy these requirements.
Blue Control Technologies has developed an advanced service-oriented architecture Run-to-Run control system which addresses these requirements.
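The single-variable control algorithms the abstract alludes to can be made concrete with the classic EWMA run-to-run scheme from the APC literature. Below is a minimal sketch in Python; this is a generic textbook formulation for illustration only, not Blue Control Technologies' product, and the process gain, drift rate, and tuning values are all assumptions.

```python
def ewma_r2r_controller(target, model_gain, lam, runs, process):
    """EWMA run-to-run control: after each run, update the estimated
    process offset and compute the next recipe setting.

    target:     desired process output (e.g. a film thickness)
    model_gain: assumed linear gain from recipe setting to output
    lam:        EWMA weight on the newest measurement (0 < lam <= 1)
    runs:       number of runs to simulate
    process:    function mapping (run index, recipe) -> measured output
    """
    offset = 0.0
    outputs = []
    for k in range(runs):
        recipe = (target - offset) / model_gain
        y = process(k, recipe)
        # EWMA update of the offset estimate from the model residual
        offset = lam * (y - model_gain * recipe) + (1 - lam) * offset
        outputs.append(y)
    return outputs

# Hypothetical drifting process: output = gain * recipe + slow linear drift
drifting = lambda k, u: 2.0 * u + 0.05 * k
ys = ewma_r2r_controller(target=10.0, model_gain=2.0, lam=0.3,
                         runs=30, process=drifting)
print([round(y, 3) for y in ys[-3:]])  # outputs hold near the 10.0 target
```

The controller tracks the drift with a small steady lag (about drift-step/λ per run here), which is the usual trade-off when tuning the EWMA weight.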
The NBS-LRR architectures of plant R-proteins and metazoan NLRs evolved in independent events
Urbach, Jonathan M.; Ausubel, Frederick M.
2017-01-01
There are intriguing parallels between plants and animals, with respect to the structures of their innate immune receptors, that suggest universal principles of innate immunity. The cytosolic nucleotide binding site–leucine rich repeat (NBS-LRR) resistance proteins of plants (R-proteins) and the so-called NOD-like receptors of animals (NLRs) share a domain architecture that includes a STAND (signal transduction ATPases with numerous domains) family NTPase followed by a series of LRRs, suggesting inheritance from a common ancestor with that architecture. Focusing on the STAND NTPases of plant R-proteins, animal NLRs, and their homologs that represent the NB-ARC (nucleotide-binding adaptor shared by APAF-1, certain R gene products and CED-4) and NACHT (named for NAIP, CIIA, HET-E, and TEP1) subfamilies of the STAND NTPases, we analyzed the phylogenetic distribution of the NBS-LRR domain architecture, used maximum-likelihood methods to infer a phylogeny of the NTPase domains of R-proteins, and reconstructed the domain structure of the protein containing the common ancestor of the STAND NTPase domain of R-proteins and NLRs. Our analyses reject monophyly of plant R-proteins and NLRs and suggest that the protein containing the last common ancestor of the STAND NTPases of plant R-proteins and animal NLRs (and, by extension, all NB-ARC and NACHT domains) possessed a domain structure that included a STAND NTPase paired with a series of tetratricopeptide repeats. These analyses reject the hypothesis that the domain architecture of R-proteins and NLRs was inherited from a common ancestor and instead suggest the domain architecture evolved at least twice. It remains unclear whether the NBS-LRR architectures were innovations of plants and animals themselves or were acquired by one or both lineages through horizontal gene transfer. PMID:28096345
Whitlock, Alexander O. B.; Peck, Kayla M.; Azevedo, Ricardo B. R.; Burch, Christina L.
2016-01-01
Sex is ubiquitous in the natural world, but the nature of its benefits remains controversial. Previous studies have suggested that a major advantage of sex is its ability to eliminate interference between selection on linked mutations, a phenomenon known as Hill–Robertson interference. However, those studies may have missed both important advantages and important disadvantages of sexual reproduction because they did not allow the distributions of mutational effects and interactions (i.e., the genetic architecture) to evolve. Here we investigate how Hill–Robertson interference interacts with an evolving genetic architecture to affect the evolutionary origin and maintenance of sex by simulating evolution in populations of artificial gene networks. We observed a long-term advantage of sex—equilibrium mean fitness of sexual populations exceeded that of asexual populations—that did not depend on population size. We also observed a short-term advantage of sex—sexual modifier mutations readily invaded asexual populations—that increased with population size, as was observed in previous studies. We show that the long- and short-term advantages of sex were both determined by differences between sexual and asexual populations in the evolutionary dynamics of two properties of the genetic architecture: the deleterious mutation rate (Ud) and recombination load (LR). These differences resulted from a combination of selection to minimize LR, which is experienced only by sexuals, and Hill–Robertson interference experienced primarily by asexuals. In contrast to the previous studies, in which Hill–Robertson interference had only a direct impact on the fitness advantages of sex, the impact of Hill–Robertson interference in our simulations was mediated additionally by an indirect impact on the efficiency with which selection acted to reduce Ud. PMID:27098911
Cyberinfrastructure for Airborne Sensor Webs
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.
2009-01-01
Since 2004 the NASA Airborne Science Program has been prototyping and using infrastructure that enables researchers to interact with each other and with their instruments via network communications. This infrastructure uses satellite links and an evolving suite of applications and services that leverage open-source software. The use of these tools has increased near-real-time situational awareness during field operations, resulting in productivity improvements and the collection of better data. This paper describes the high-level system architecture and major components, with example highlights from the use of the infrastructure. The paper concludes with a discussion of ongoing efforts to transition to operational status.
Cis-Lunar Reusable In-Space Transportation Architecture for the Evolvable Mars Campaign
NASA Technical Reports Server (NTRS)
McVay, Eric S.; Jones, Christopher A.; Merrill, Raymond G.
2016-01-01
Human exploration missions to Mars or other destinations in the solar system require large quantities of propellant to enable the transportation of required elements from Earth's sphere of influence to Mars. Current and proposed launch vehicles are incapable of launching all of the requisite mass on a single vehicle; hence, multiple launches and in-space aggregation are required to perform a Mars mission. This study examines the potential of reusable chemical propulsion stages based in cis-lunar space to meet the transportation objectives of the Evolvable Mars Campaign and identifies cis-lunar propellant supply requirements. These stages could be supplied with fuel and oxidizer delivered to cis-lunar space, either launched from Earth or other inner solar system sources such as the Moon or near Earth asteroids. The effects of uncertainty in the model parameters are evaluated through sensitivity analysis of key parameters including the liquid propellant combination, inert mass fraction of the vehicle, change in velocity margin, and change in payload masses. The outcomes of this research include a description of the transportation elements, the architecture that they enable, and an option for a campaign that meets the objectives of the Evolvable Mars Campaign. This provides a more complete understanding of the propellant requirements, as a function of time, that must be delivered to cis-lunar space. Over the selected sensitivity ranges for the current payload and schedule requirements of the 2016 point of departure of the Evolvable Mars Campaign destination systems, the resulting propellant delivery quantities are between 34 and 61 tonnes per year of hydrogen and oxygen propellant, or between 53 and 76 tonnes per year of methane and oxygen propellant, or between 74 and 92 tonnes per year of hypergolic propellant. These estimates can guide future propellant manufacture and/or delivery architectural analysis.
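The sensitivity of propellant quantity to the propellant combination follows directly from the rocket equation. A minimal sketch of such an estimate is below; this is not the study's model, the payload mass and delta-v values are illustrative assumptions, and inert stage mass is ignored for simplicity.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_mass(dry_mass_t, delta_v_ms, isp_s):
    """Tsiolkovsky rocket equation solved for propellant mass.

    dry_mass_t: payload plus stage dry mass in tonnes (inert mass lumped in)
    delta_v_ms: required change in velocity, m/s
    isp_s:      specific impulse of the propellant combination, seconds
    """
    exhaust_velocity = isp_s * G0
    return dry_mass_t * (math.exp(delta_v_ms / exhaust_velocity) - 1.0)

# Illustrative burn from cis-lunar space (assumed numbers, not the paper's):
for isp, label in [(450.0, "hydrogen/oxygen"),
                   (360.0, "methane/oxygen"),
                   (320.0, "hypergolic")]:
    mp = propellant_mass(dry_mass_t=40.0, delta_v_ms=4000.0, isp_s=isp)
    print(f"{label}: {mp:.1f} t propellant")
```

Even this toy calculation reproduces the qualitative ordering in the abstract: lower specific impulse combinations demand more tonnes of propellant delivered to cis-lunar space per mission.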
A Reference Architecture for Space Information Management
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.
2006-01-01
We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.
Innovative telescope architectures for future large space observatories
NASA Astrophysics Data System (ADS)
Polidan, Ronald S.; Breckinridge, James B.; Lillie, Charles F.; MacEwen, Howard A.; Flannery, Martin R.; Dailey, Dean R.
2016-10-01
Over the past few years, we have developed a concept for an evolvable space telescope (EST) that is assembled on orbit in three stages, growing from a 4×12-m telescope in Stage 1, to a 12-m filled aperture in Stage 2, and then to a 20-m filled aperture in Stage 3. Stage 1 is launched as a fully functional telescope and begins gathering science data immediately after checkout on orbit. This observatory is then periodically augmented in space with additional mirror segments, structures, and newer instruments to evolve the telescope over the years to a 20-m space telescope. We discuss the EST architecture, the motivation for this approach, and the benefits it provides over current approaches to building and maintaining large space observatories.
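The three-stage growth can be put in terms of light-collecting area. The quick sketch below treats Stage 1 as a 4 x 12 m rectangular strip and Stages 2 and 3 as filled circular apertures; these geometric idealizations are simplifying assumptions, not the EST design's actual segment layout.

```python
import math

def rect_area(w_m, h_m):
    """Area of a rectangular mirror strip, m^2."""
    return w_m * h_m

def filled_aperture_area(diameter_m):
    """Area of a filled circular aperture, m^2."""
    return math.pi * (diameter_m / 2.0) ** 2

stage1 = rect_area(4.0, 12.0)        # 4 x 12 m strip
stage2 = filled_aperture_area(12.0)  # 12-m filled aperture
stage3 = filled_aperture_area(20.0)  # 20-m filled aperture

print(f"Stage 1: {stage1:.0f} m^2")  # 48
print(f"Stage 2: {stage2:.0f} m^2")  # 113
print(f"Stage 3: {stage3:.0f} m^2")  # 314
```

Under these assumptions each augmentation multiplies the collecting area by roughly 2.4 to 2.8, which is the kind of incremental capability growth the evolvable approach is meant to deliver.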
NASA Astrophysics Data System (ADS)
Baynes, K.; Gilman, J.; Pilone, D.; Mitchell, A. E.
2015-12-01
The NASA EOSDIS (Earth Observing System Data and Information System) Common Metadata Repository (CMR) is a continuously evolving metadata system that merges all existing capabilities and metadata from the EOS ClearingHOuse (ECHO) and the Global Change Master Directory (GCMD) systems. This flagship catalog has been developed against several key requirements: fast search and ingest performance; the ability to integrate heterogeneous external inputs and outputs; high availability and resiliency; scalability; and evolvability and expandability. This talk will focus on the advantages and potential challenges of tackling these requirements using a microservices architecture, which decomposes system functionality into smaller, loosely-coupled, individually-scalable elements that communicate via well-defined APIs. In addition, time will be spent examining specific elements of the CMR architecture and identifying opportunities for future integrations.
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Labarta, Jesus; Gimenez, Judit
2004-01-01
With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors, parallel programming techniques have evolved that support parallelism beyond a single level. When comparing the performance of applications based on different programming paradigms, it is important to differentiate between the influence of the programming model itself and other factors, such as implementation-specific behavior of the operating system (OS) or architectural issues. Rewriting a large scientific application in order to employ a new programming paradigm is usually a time-consuming and error-prone task. Before embarking on such an endeavor it is important to determine that there is really a gain that would not be possible with the current implementation. A detailed performance analysis is crucial to clarify these issues. The multilevel programming paradigms considered in this study are hybrid MPI/OpenMP, MLP, and nested OpenMP. The hybrid MPI/OpenMP approach is based on using MPI [7] for the coarse-grained parallelization and OpenMP [9] for fine-grained loop level parallelism. The MPI programming paradigm assumes a private address space for each process. Data is transferred by explicitly exchanging messages via calls to the MPI library. This model was originally designed for distributed memory architectures but is also suitable for shared memory systems. The second paradigm under consideration is MLP, which was developed by Taft. The approach is similar to MPI/OpenMP, using a mix of coarse-grain process level parallelization and loop level OpenMP parallelization. As is the case with MPI, a private address space is assumed for each process. The MLP approach was developed for ccNUMA architectures and explicitly takes advantage of the availability of shared memory. A shared memory arena which is accessible by all processes is required. Communication is done by reading from and writing to the shared memory.
Natural energy and vernacular architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fathy, H.
1986-01-01
This volume presents insights into the indigenous architectural forms in hot arid climates. The author presents his extensive research on climate control, particularly in the Middle East, to demonstrate the advantages of many locally available building materials and traditional building methods. He suggests improved uses of natural energy that can bridge the gap between traditional achievements and modern needs. He argues that various architectural forms in these climates have evolved intuitively from scientifically valid concepts. Such forms combine comfort and beauty, social and physical functionality. He notes that, in substituting modern materials, architects have sometimes ignored the environmental context of traditional architecture. As a result, individuals may find themselves physically and psychologically uncomfortable in modern structures. His approach, informed by a sensitive humanism, demonstrates the ways in which traditional architectural forms can be of use in solving problems facing contemporary architecture, in particular the critical housing situation in the Third World.
NASA Astrophysics Data System (ADS)
Domercant, Jean Charles
The combination of today's national security environment and mandated acquisition policies makes it necessary for military systems to interoperate with each other to greater degrees. This growing interdependency results in complex Systems-of-Systems (SoS) that only continue to grow in complexity to meet evolving capability needs. Thus, timely and affordable acquisition becomes more difficult, especially in the face of mounting budgetary pressures. To counter this, architecting principles must be applied to SoS design. The research objective is to develop an Architecture Real Options Complexity-Based Valuation Methodology (ARC-VM) suitable for acquisition-level decision making, where there is a stated desire for more informed tradeoffs between cost, schedule, and performance during the early phases of design. First, a framework is introduced to measure architecture complexity as it directly relates to military SoS. Development of the framework draws upon a diverse set of disciplines, including Complexity Science, software architecting, measurement theory, and utility theory. Next, a Real Options based valuation strategy is developed using techniques established for financial stock options that have recently been adapted for use in business and engineering decisions. The derived complexity measure provides architects with an objective measure of complexity that focuses on relevant complex system attributes. These attributes are related to the organization and distribution of SoS functionality and the sharing and processing of resources. The use of Real Options provides the necessary conceptual and visual framework to quantifiably and traceably combine measured architecture complexity, time-valued performance levels, as well as programmatic risks and uncertainties. An example suppression of enemy air defenses (SEAD) capability demonstrates the development and usefulness of the resulting architecture complexity & Real Options based valuation methodology. 
Different portfolios of candidate system types are used to generate an array of architecture alternatives that are then evaluated using an engagement model. This performance data is combined with both measured architecture complexity and programmatic data to assign an acquisition value to each alternative. This proves useful when selecting alternatives most likely to meet current and future capability needs.
Genetic Architecture of Conspicuous Red Ornaments in Female Threespine Stickleback
Yong, Lengxob; Peichel, Catherine L.; McKinnon, Jeffrey S.
2015-01-01
Explaining the presence of conspicuous female ornaments that take the form of male-typical traits has been a longstanding challenge in evolutionary biology. Such female ornaments have been proposed to evolve via both adaptive and nonadaptive evolutionary processes. Determining the genetic underpinnings of female ornaments is important for elucidating the mechanisms by which such female traits arise and persist in natural populations, but detailed information about their genetic basis is still scarce. In this study, we investigated the genetic architecture of two ornaments, the orange-red throat and pelvic spine, in the threespine stickleback (Gasterosteus aculeatus). Throat coloration is male-specific in ancestral marine populations but has evolved in females in some derived stream populations, whereas sexual dimorphism in pelvic spine coloration is variable among populations. We find that ornaments share a common genetic architecture between the sexes. At least three independent genomic regions contribute to red throat coloration, and harbor candidate genes related to pigment production and pigment cell differentiation. One of these regions is also associated with spine coloration, indicating that both ornaments might be mediated partly via pleiotropic genetic mechanisms. PMID:26715094
Mars Hybrid Propulsion System Trajectory Analysis. Part II; Cargo Missions
NASA Technical Reports Server (NTRS)
Chai, Patrick R.; Merrill, Raymond G.; Qu, Min
2015-01-01
NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and electric propulsion systems are used to send crew and cargo to Mars destinations such as Phobos, Deimos, the surface of Mars, and other orbits around Mars. By combining chemical and electrical propulsion into a single spaceship and applying each where it is more effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical architecture without significant increases in flight times. This paper shows the feasibility of the hybrid transportation architecture to pre-deploy cargo to Mars and Phobos in support of the Evolvable Mars Campaign crew missions. The analysis shows that the hybrid propulsion stage is able to deliver all of the current manifested payload to Phobos and Mars through the first three crew missions. The conjunction class trajectory also allows the hybrid propulsion stage to return to Earth in a timely fashion so it can be reused for additional cargo deployment. The 1,100 days total trip time allows the hybrid propulsion stage to deliver cargo to Mars every other Earth-Mars transit opportunity. For the first two Mars surface missions in the Evolvable Mars Campaign, the short trip time allows the hybrid propulsion stage to be reused for three round-trip journeys to Mars, which matches the hybrid propulsion stage's designed lifetime of three round-trip crew missions to the Martian sphere of influence.
Developing Architectures and Technologies for an Evolvable NASA Space Communication Infrastructure
NASA Technical Reports Server (NTRS)
Bhasin, Kul; Hayden, Jeffrey
2004-01-01
Space communications architecture concepts play a key role in the development and deployment of NASA's future exploration and science missions. Once a mission is deployed, the communication link to the user needs to provide maximum information delivery and flexibility to handle the expected large and complex data sets and to enable direct interaction with the spacecraft and experiments. In human and robotic missions, communication systems need to offer maximum reliability with robust two-way links for software uploads and virtual interactions. Identifying the capabilities to cost-effectively meet the demanding space communication needs of 21st century missions, proper formulation of the requirements for these missions, and identifying the early technology developments that will be needed can only be resolved with architecture design. This paper will describe the development of evolvable space communication architecture models and the technologies needed to support Earth sensor web and collaborative observation formation missions; robotic scientific missions for detailed investigation of planets, moons, and small bodies in the solar system; human missions for exploration of the Moon, Mars, Ganymede, Callisto, and asteroids; human settlements in space, on the Moon, and on Mars; and great in-space observatories for observing other star systems and the universe. The resulting architectures will enable the reliable, multipoint, high data rate capabilities needed on demand to provide continuous, maximum coverage of areas of concentrated activities, such as in the vicinity of outposts in space, on the Moon, or on Mars.
Rapid prototyping and evaluation of programmable SIMD SDR processors in LISA
NASA Astrophysics Data System (ADS)
Chen, Ting; Liu, Hengzhu; Zhang, Botao; Liu, Dongpei
2013-03-01
With the development of international wireless communication standards, there is an increase in the computational requirements for baseband signal processors. Time-to-market pressure makes it impossible to completely redesign new processors for each evolving standard. Due to their high flexibility and low power, software-defined radio (SDR) digital signal processors have been proposed as a promising technology to replace traditional ASIC and FPGA approaches. In addition, computation-intensive functions process large amounts of data in parallel, which fosters the development of single instruction multiple data (SIMD) architectures in SDR platforms. A new way must therefore be found to prototype SDR processors efficiently. In this paper we present a bit- and cycle-accurate model of a programmable SIMD SDR processor in the machine description language LISA. LISA is an architecture description language for instruction-set architectures that enables rapid modeling at the architectural level. To evaluate the capability of the proposed processor, three common baseband functions, FFT, FIR digital filtering and matrix multiplication, have been mapped onto the SDR platform. Analytical results showed that the SDR processor achieved up to a 47.1% performance improvement relative to the comparison processor.
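Of the three mapped baseband kernels, the FIR filter is the simplest to state. Below is a scalar reference sketch in Python of the computation a SIMD datapath would vectorize; it is an illustration of the algorithm only, not the paper's LISA implementation.

```python
def fir_filter(x, b):
    """Direct-form FIR filter: y[n] = sum_k b[k] * x[n-k]."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        # Multiply-accumulate over the filter taps
        for k, coeff in enumerate(b):
            if n - k >= 0:
                acc += coeff * x[n - k]
        y.append(acc)
    return y

# Two-tap moving-average filter applied to a unit impulse:
print(fir_filter([1.0, 0.0, 0.0, 0.0], [0.5, 0.5]))  # [0.5, 0.5, 0.0, 0.0]
```

Each output sample's multiply-accumulate is independent of the others, so many samples (or taps) can be computed in lockstep, which is precisely the data-level parallelism a SIMD SDR processor exploits.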
A VHDL Core for Intrinsic Evolution of Discrete Time Filters with Signal Feedback
NASA Technical Reports Server (NTRS)
Gwaltney, David A.; Dutton, Kenneth
2005-01-01
The design of an Evolvable Machine VHDL Core is presented, representing a discrete-time processing structure capable of supporting control system applications. This VHDL Core is implemented in an FPGA and is interfaced with an evolutionary algorithm implemented in firmware on a Digital Signal Processor (DSP) to create an evolvable system platform. The salient features of this architecture are presented. The capability to implement IIR filter structures is presented along with the results of the intrinsic evolution of a filter. The robustness of the evolved filter design is tested and its unique characteristics are described.
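The paper's platform couples an FPGA filter structure to an evolutionary algorithm running on a DSP; the shape of that loop can be sketched in software. Below is a minimal (1+λ) evolution of a first-order IIR filter's two coefficients toward a target impulse response. This is purely illustrative, assuming a software fitness evaluation and a toy first-order structure, whereas the actual core evolves richer filter structures intrinsically in hardware.

```python
import random

def iir_impulse_response(a, b, n=16):
    """First-order IIR y[n] = b*x[n] + a*y[n-1], driven by a unit impulse."""
    y, prev = [], 0.0
    for i in range(n):
        prev = b * (1.0 if i == 0 else 0.0) + a * prev
        y.append(prev)
    return y

def error(coeffs, target):
    """Sum of squared deviations from the target impulse response."""
    resp = iir_impulse_response(*coeffs)
    return sum((r - t) ** 2 for r, t in zip(resp, target))

def evolve(target, generations=200, lam=8, sigma=0.1, seed=0):
    """(1+lambda) evolution strategy over the (a, b) coefficient pair."""
    rng = random.Random(seed)
    best = (0.0, 0.0)  # start from a flat (all-zero) filter
    for _ in range(generations):
        children = [(best[0] + rng.gauss(0, sigma),
                     best[1] + rng.gauss(0, sigma)) for _ in range(lam)]
        # Elitist selection: the parent survives unless a child is better
        best = min(children + [best], key=lambda c: error(c, target))
    return best

target = iir_impulse_response(0.5, 1.0)  # "unknown" response to match
evolved = evolve(target)
print(evolved, error(evolved, target))
```

Because selection is elitist, the fitness of the evolving filter never degrades between generations, mirroring the robustness testing of the evolved design described in the abstract.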
NASA Technical Reports Server (NTRS)
Smith, Phillip J.; Billings, Charles; McCoy, C. Elaine; Orasanu, Judith
1999-01-01
The air traffic management system in the United States is an example of a distributed problem solving system. It has elements of both cooperative and competitive problem-solving. This system includes complex organizations such as Airline Operations Centers (AOCs), the FAA Air Traffic Control Systems Command Center (ATCSCC), and traffic management units (TMUs) at enroute centers and TRACONs, all of which have a major focus on strategic decision-making. It also includes individuals concerned more with tactical decisions (such as air traffic controllers and pilots). The architecture for this system has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual operator, and to provide redundancy (both human and technological) to serve as a safety net to catch the slips or mistakes that any one person or entity might make. Currently, major changes are being considered for this architecture, especially with respect to the locus of control, in an effort to improve efficiency and safety. This paper uses a series of case studies to help evaluate some of these changes from the perspective of system complexity, and to point out possible alternative approaches that might be taken to improve system performance. The paper illustrates the need to maintain a clear understanding of what is required to assure a high level of performance when alternative system architectures and decompositions are developed.
NASA Astrophysics Data System (ADS)
King, Nelson E.; Liu, Brent; Zhou, Zheng; Documet, Jorge; Huang, H. K.
2005-04-01
Grid Computing represents the latest and most exciting technology to evolve from the familiar realm of parallel, peer-to-peer and client-server models, and it can address the problem of fault-tolerant storage for backup and recovery of clinical images. We have researched and developed a novel Data Grid testbed involving several federated PAC systems based on grid architecture. By integrating a grid computing architecture into the DICOM environment, a failed PACS archive can recover its image data from others in the federation in a timely and seamless fashion. The design reflects the five-layer architecture of grid computing: Fabric, Resource, Connectivity, Collective, and Application Layers. The testbed Data Grid architecture, representing three federated PAC systems (the Fault-Tolerant PACS archive server at the Image Processing and Informatics Laboratory, Marina del Rey; the clinical PACS at Saint John's Health Center, Santa Monica; and the clinical PACS at the Healthcare Consultation Center II, USC Health Science Campus), will be presented. The successful demonstration of the Data Grid in the testbed will provide an understanding of the Data Grid concept for clinical image data backup, establish benchmarks for performance against future grid technology improvements, and serve as a road map for expanded research into large enterprise- and federation-level data grids that guarantee 99.999% uptime.
National Launch System comparative economic analysis
NASA Technical Reports Server (NTRS)
Prince, A.
1992-01-01
Results are presented from an analysis of the economic benefits (or losses), in the form of life-cycle cost savings, resulting from the development of the National Launch System (NLS) family of launch vehicles. The analysis was carried out by comparing various NLS-based architectures with the current Shuttle/Titan IV fleet. The basic methodology behind this NLS analysis was to develop a set of annual payload requirements for the Space Station Freedom and LEO, to design launch vehicle architectures around these requirements, and to perform life-cycle cost analyses on all of the architectures. An SEI requirement was included. Launch failure costs were estimated and combined with the relative reliability assumptions to measure the effects of losses. Based on the analysis, a Shuttle/NLS architecture evolving into a pressurized-logistics-carrier/NLS architecture appears to offer the best long-term cost benefit.
Recognition of degraded handwritten digits using dynamic Bayesian networks
NASA Astrophysics Data System (ADS)
Likforman-Sulem, Laurence; Sigelle, Marc
2007-01-01
We investigate in this paper the application of dynamic Bayesian networks (DBNs) to the recognition of handwritten digits. The main idea is to couple two separate HMMs into various architectures. First, a vertical HMM and a horizontal HMM are built, observing the evolving streams of image columns and image rows respectively. Then, two coupled architectures are proposed to model interactions between these two streams and to capture the 2D nature of character images. Experiments performed on the MNIST handwritten digit database show that coupled architectures yield better recognition performance than non-coupled ones. Additional experiments conducted on artificially degraded (broken) characters demonstrate that coupled architectures cope better with such degradation than both non-coupled architectures and discriminative methods such as SVMs.
Reynolds, Lucy
2012-01-01
July 2011 marked the 40th anniversary of social marketing. However, while the previous Labour administration dedicated sustained resources and support to developing the field of social marketing, this was followed by a time of uncertainty during the Coalition Government's ascent to power. This paper explores the potential future position of social marketing within David Cameron's evolving public health landscape, outlining areas of synergy between social marketing's key features, and the coalition's emergent public health architecture. The paper concludes with an exploration of the development opportunities nascent within social marketing, suggesting that support for the new commissioners (GP and local authority), and an enhanced emphasis on evaluation of financial and social outcomes, will be required if the evidence base for strong practice is to continue to grow and evolve.
A new evolutionary system for evolving artificial neural networks.
Yao, X; Liu, Y
1997-01-01
This paper presents a new evolutionary system, EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP). Unlike most previous studies on evolving ANNs, this paper puts its emphasis on evolving ANN behaviors. The five mutation operators proposed in EPNet reflect this emphasis. Close behavioral links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN architectures and connection weights (including biases) simultaneously in order to reduce noise in fitness evaluation. The parsimony of evolved ANNs is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem, medical diagnosis problems, the Australian credit card assessment problem, and the Mackey-Glass time series prediction problem. The experimental results show that EPNet can produce very compact ANNs with good generalization ability in comparison with other algorithms.
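A drastically reduced sketch of the EP style underlying EPNet (Gaussian mutation plus rank-based survival; EPNet's architecture mutations, partial training and node splitting are all omitted, and the fixed network and toy task below are our own choices, not from the paper):

```python
import math, random

def net(w, x1, x2):
    """Tiny fixed 2-2-1 network with tanh units; w holds 9 weights."""
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

XOR = [(0, 0, -1), (0, 1, 1), (1, 0, 1), (1, 1, -1)]

def error(w):
    # Sum of squared errors over the four XOR patterns.
    return sum((net(w, a, b) - t) ** 2 for a, b, t in XOR)

random.seed(0)
pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(20)]
for _ in range(400):
    # Each parent yields one offspring by Gaussian weight mutation;
    # the best half of parents plus offspring survives (elitist EP).
    kids = [[wi + random.gauss(0, 0.3) for wi in w] for w in pop]
    pop = sorted(pop + kids, key=error)[:20]
print(error(pop[0]))  # error of the best evolved network
```

Note there is no crossover, matching EP's mutation-only character; EPNet's behavioral-link idea corresponds to the small, incremental mutations here.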
NASA Astrophysics Data System (ADS)
Fanget, Anne-Sophie; Berné, Serge; Jouet, Gwénaël; Bassetti, Maria-Angela; Dennielou, Bernard; Maillet, Grégoire M.; Tondut, Mathieu
2014-05-01
The modern Rhone delta in the Gulf of Lions (NW Mediterranean) is a typical wave-dominated delta that developed after the stabilization of relative sea level following the last deglacial sea-level rise. Similar to most other deltas worldwide, it displays several stacked parasequences and lobes that reflect the complex interaction between accommodation, sediment supply and autogenic processes on the architecture of a wave-dominated delta. The interpretation of a large set of newly acquired very high-resolution seismic and sedimentological data, well constrained by 14C dates, provides a refined three-dimensional image of the detailed architecture (seismic bounding surfaces, sedimentary facies) of the Rhone subaqueous delta, and allows us to propose a scenario for delta evolution during the last deglaciation and Holocene. The subaqueous delta consists of “parasequence-like” depositional wedges, a few metres to 20-30 m in thickness. These wedges first back-stepped inland toward the NW in response to combined global sea-level rise and overall westward oceanic circulation, at a time when sediment supply could not keep pace with rapid absolute (eustatic) sea-level rise. At the Younger Dryas-Preboreal transition, more rapid sea-level rise led to the formation of a major flooding surface (equivalent to a wave ravinement surface). After stabilization of global sea level in the mid-Holocene, accommodation became the leading factor in controlling delta architecture. An eastward shift of depocentres occurred, probably favoured by higher subsidence rate within the thick Messinian Rhone valley fill. The transition between transgressive (backstepping geometry) and regressive (prograding geometry) (para)sequences resulted in creation of a Maximum Flooding Surface (MFS) that differs from a “classical” MFS described in the literature. It consists of a coarse-grained interval incorporating reworked shoreface material within a silty clay matrix. 
This distinct lithofacies results from condensation/erosion, which appears to be an important process even within supply-dominated deltaic systems, due to avulsion of distributaries. The age of the MFS varies along-strike between ca. 7.8 and 5.6 kyr cal. BP in relation to the position of depocentres and climatically controlled sediment supply. The last rapid climate change of the Holocene, the Little Ice Age (1250-1850 AD), had a distinct stratigraphic influence on the architecture and lithofacies of the Rhone subaqueous delta through the progradation of two deltaic lobes. In response to changes in sediment supply linked to rapid climate changes (and to anthropogenic factors), the Rhone delta evolved from wave-dominated to fluvial-dominated, and then to wave-dominated again.
Distributed numerical controllers
NASA Astrophysics Data System (ADS)
Orban, Peter E.
2001-12-01
While the basic principles of Numerical Controllers (NCs) have not changed much over the years, their implementation has changed tremendously. NC equipment has evolved from yesterday's hard-wired specialty control apparatus to today's graphics-intensive, networked, increasingly PC-based open systems, controlling a wide variety of industrial equipment with positioning needs. One of the newest trends in NC technology is the distributed implementation of the controllers. Distributed implementation promises robustness, lower implementation costs, and a scalable architecture. Historically, partitioning has been done along the hierarchical levels, moving individual modules into self-contained units. The paper discusses various NC architectures, the underlying technology for distributed implementation, and relevant design issues. First, the functional requirements of individual NC modules are analyzed. Module functionality, cycle times, and data requirements are examined. Next, the infrastructure for distributed node implementation is reviewed. Various communication protocols and distributed real-time operating system issues are investigated and compared. Finally, a different, vertical system partitioning, offering true scalability and reconfigurability, is presented.
Exploration Space Suit Architecture and Destination Environmental-Based Technology Development
NASA Technical Reports Server (NTRS)
Hill, Terry R.; Korona, F. Adam; McFarland, Shane
2012-01-01
This paper continues forward where EVA Space Suit Architecture: Low Earth Orbit vs. Moon vs. Mars [1] left off in the development of a space suit architecture that is modular in design and could be reconfigured prior to launch, or during any given mission, depending on the tasks or destination. This paper addresses the space suit system architecture and the technologies required for the human exploration extravehicular activity (EVA) destinations under consideration, and describes how they should evolve to meet the future exploration EVA needs of the US human space flight program. In looking toward future US space exploration and a space suit architecture with maximum reuse of technology and functionality across a range of mission profiles and destinations, a series of exercises and analyses have provided a strong indication that the Constellation Program (CxP) space suit architecture is postured to provide a viable solution for future exploration missions. The destination environmental analysis presented in this paper demonstrates that the modular architecture approach could provide the lowest mass and mission cost for the protection of the crew for any human mission outside of low-Earth orbit (LEO). Additionally, some of the high-level trades presented here provide a review of the environmental and non-environmental design drivers that will become increasingly important the farther away from Earth humans venture. This paper demonstrates a logical clustering of destination design environments that allows a focused approach to technology prioritization, development, and design that will maximize the return on investment, independent of any particular program, and provide architecture and design solutions for space suit systems in time or ahead of need dates for any particular crewed flight program in the future. 
The approach to space suit design and interface definition discussion will show how the architecture is very adaptable to programmatic and funding changes with minimal redesign effort such that the modular architecture can be quickly and efficiently honed into a specific mission point solution if required. Additionally, the modular system will allow for specific technology incorporation and upgrade as required with minimal redesign of the system.
Framework for architecture-independent run-time reconfigurable applications
NASA Astrophysics Data System (ADS)
Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.
2000-10-01
Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing, and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
2007-04-01
Services and System Capabilities; Enterprise Rules and Standards for Interoperability; Navy, AF, Army, TRANSCOM, DFAS, DLA Enterprise Shared Services and System... Where commonality among components exists, there are also opportunities for identifying and leveraging shared services. A service-oriented architecture... and (3) shared services. The BMA federation strategy, according to these officials, is the first mission area federation strategy, and it is their
VLBA Archive &Distribution Architecture
NASA Astrophysics Data System (ADS)
Wells, D. C.
1994-01-01
Signals from the 10 antennas of NRAO's VLBA [Very Long Baseline Array] are processed by a Correlator. The complex fringe visibilities produced by the Correlator are archived on magnetic cartridges using a low-cost architecture which is capable of scaling and evolving. Archive files are copied to magnetic media to be distributed to users in FITS format, using the BINTABLE extension. Archive files are labelled using SQL INSERT statements, in order to bind the DBMS-based archive catalog to the archive media.
Multiscale structural gradients enhance the biomechanical functionality of the spider fang
Bar-On, Benny; Barth, Friedrich G.; Fratzl, Peter; Politi, Yael
2014-01-01
The spider fang is a natural injection needle, hierarchically built from a complex composite material comprising multiscale architectural gradients. Considering its biomechanical function, the spider fang has to sustain significant mechanical loads. Here we apply experiment-based structural modelling of the fang, followed by analytical mechanical description and Finite-Element simulations, the results of which indicate that the naturally evolved fang architecture results in highly adapted effective structural stiffness and damage resilience. The analysis methods and physical insights of this work are potentially important for investigating and understanding the architecture and structural motifs of sharp-edge biological elements such as stingers, teeth, claws and more. PMID:24866935
Development of an unmanned maritime system reference architecture
NASA Astrophysics Data System (ADS)
Duarte, Christiane N.; Cramer, Megan A.; Stack, Jason R.
2014-06-01
The concept of operations (CONOPS) for unmanned maritime systems (UMS) continues to envision systems that are multi-mission, re-configurable and capable of acceptable performance over a wide range of environmental and contextual variability. Key enablers for these concepts of operation are an autonomy module which can execute different mission directives and a mission payload consisting of re-configurable sensor or effector suites. This level of modularity in mission payloads enables affordability, flexibility (i.e., more capability with future platforms) and scalability (i.e., force multiplication). The modularity in autonomy facilitates rapid technology integration, prototyping, testing and leveraging of state-of-the-art advances in autonomy research. Capability drivers imply a requirement to maintain an open architecture design for both research and acquisition programs. As the maritime platforms become more stable in their design (e.g. unmanned surface vehicles, unmanned underwater vehicles), future developments are able to focus on more capable sensors and more robust autonomy algorithms. To respond to Fleet needs, given an evolving threat, programs will want to interchange the latest sensor or a new and improved algorithm in a cost-effective and efficient manner. To make this possible, the programs need a reference architecture that defines, for technology providers, where their piece fits and how to integrate it successfully. With these concerns in mind, the US Navy established the Unmanned Maritime Systems Reference Architecture (UMS-RA) Working Group in August 2011. This group consists of Department of Defense and industry participants working on defining a reference architecture for autonomous operations of maritime systems. This paper summarizes its efforts to date.
NASA Technical Reports Server (NTRS)
Sherwood, Brent
2006-01-01
This paper develops a conceptual model, adapted from the way research and development non-profits and universities tend to be organized, that could help amplify the reach and effectiveness of the international space architecture community. The model accommodates current activities and published positions, and increases involvement by allocating accountability for necessary professional and administrative activities. It coordinates messaging and other outreach functions to improve brand management. It increases sustainability by balancing volunteer workload. And it provides an open-ended structure that can be modified gracefully as needs, focus, and context evolve. Over the past 20 years, Space Architecture has attained some early signs of legitimacy as a discipline: an active, global community of practicing and publishing professionals; university degree programs; a draft undergraduate curriculum; and formal committee establishment within multiple professional organizations. However, the nascent field has few outlets for expression in built architecture, which exacerbates other challenges the field is experiencing in adolescence: obtaining recognition and inclusion as a unique contributor by the established aerospace profession; organizing and managing outreach by volunteers; striking a balance between setting admittance or performance credentials and attaining a critical mass of members; and knowing what to do, beyond sharing common interests, to actually increase the market demand for space architecture. This organizational model is offered up for consideration, debate, and toughening by the space architecture community at large.
Griswold, Cortland K
2015-12-21
Epistatic gene action occurs when mutations or alleles interact to produce a phenotype. Theoretically and empirically, it is of interest to know whether gene interactions can facilitate the evolution of diversity. In this paper, we explore how epistatic gene action affects the additive (heritable) genetic component of multivariate trait variation, as well as how it affects the evolvability of multivariate traits. The analysis involves a sexually reproducing and recombining population. Our results indicate that under stabilizing selection a population with a mixed additive and epistatic genetic architecture can have greater multivariate additive genetic variation and evolvability than a population with a purely additive genetic architecture. That greater multivariate additive genetic variation can occur with epistasis contrasts with previous theory indicating that univariate additive genetic variation is decreased by epistasis under stabilizing selection. In a multivariate setting, epistasis leads to less relative covariance among individuals in their genotypic as well as their breeding values, which facilitates the maintenance of additive genetic variation and increases a population's evolvability. Our analysis links the combinatorial nature of epistatic genetic effects to the ancestral graph structure of a population to provide insight into the consequences of epistasis for multivariate trait variation and evolution.
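The additive-versus-epistatic distinction can be made concrete with a toy simulation (this is our illustration, not the paper's multivariate model): estimate the additive genetic variance V_A by regressing phenotype on allele counts at each locus, for one purely additive and one purely epistatic architecture. Loci are generated independently below, so summing per-locus regression contributions is valid.

```python
import random

def additive_variance(genos, phenos):
    """Estimate V_A as the summed variance explained by per-locus
    least-squares regressions of phenotype on allele count
    (valid when loci are independent, as they are below)."""
    n = len(phenos)
    mp = sum(phenos) / n
    va = 0.0
    for l in range(len(genos[0])):
        xs = [g[l] for g in genos]
        mx = sum(xs) / n
        vx = sum((x - mx) ** 2 for x in xs) / n
        cov = sum((x - mx) * (p - mp) for x, p in zip(xs, phenos)) / n
        if vx > 0:
            va += cov ** 2 / vx   # variance explained at this locus
    return va

random.seed(2)
# 5000 diploid individuals, 2 loci, allele counts 0/1/2 drawn uniformly.
genos = [[random.randint(0, 2) for _ in range(2)] for _ in range(5000)]
add_ph = [g[0] + g[1] for g in genos]                 # purely additive
epi_ph = [(g[0] % 2) ^ (g[1] % 2) for g in genos]     # XOR-type epistasis
print(additive_variance(genos, add_ph))  # near 4/3: all variance additive
print(additive_variance(genos, epi_ph))  # near 0: variance hidden in epistasis
```

Under the XOR-type architecture almost none of the substantial phenotypic variance is additive, illustrating why the mapping from gene action to heritable variation, the paper's subject, is not one-to-one.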
Hripcsak, George
1997-01-01
An information system architecture defines the components of a system and the interfaces among the components. A good architecture is essential for creating an Integrated Advanced Information Management System (IAIMS) that works as an integrated whole yet is flexible enough to accommodate many users and roles, multiple applications, changing vendors, evolving user needs, and advancing technology. Modularity and layering promote flexibility by reducing the complexity of a system and by restricting the ways in which components may interact. Enterprise-wide mediation promotes integration by providing message routing, support for standards, dictionary-based code translation, a centralized conceptual data schema, business rule implementation, and consistent access to databases. Several IAIMS sites have adopted a client-server architecture, and some have adopted a three-tiered approach, separating user interface functions, application logic, and repositories. PMID:9067884
Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)
NASA Astrophysics Data System (ADS)
Cheetham, B. W.
2017-10-01
Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.
NREL Workshop Convenes Industry Experts on Cybersecurity and an Evolving
silos in a field that demands greater collaboration, and the benefits of systemic security architecture... groups to identify possible solutions to the challenges in securing DERs, from a technology, business, and
Analysis of NASA communications (Nascom) II network protocols and performance
NASA Technical Reports Server (NTRS)
Omidyar, Guy C.; Butler, Thomas E.
1991-01-01
The NASA Communications (Nascom) Division of the Mission Operations and Data Systems Directorate is to undertake a major initiative to develop the Nascom II (NII) network to achieve its long-range service objectives for operational data transport to support the Space Station Freedom Program, the Earth Observing System, and other projects. NII is the Nascom ground communications network being developed to accommodate the operational traffic of the mid-1990s and beyond. The authors describe various baseline protocol architectures based on current and evolving technologies. They address the internetworking issues suggested for reliable transfer of data over heterogeneous segments. They also describe the NII architecture, topology, system components, and services. A comparative evaluation of the current and evolving technologies was made, and suggestions for further study are described. It is shown that the direction of the NII configuration and the subsystem component design will clearly depend on the advances made in the area of broadband integrated services.
NASA Astrophysics Data System (ADS)
Trani, L.; Spinuso, A.; Galea, M.; Atkinson, M.; Van Eck, T.; Vilotte, J.
2011-12-01
The data bonanza generated by today's digital revolution is forcing scientists to rethink their methodologies and working practices. Traditional approaches to knowledge discovery are pushed to their limit and struggle to keep pace with the data flows produced by modern systems. This work shows how the ADMIRE data-intensive architecture supports seismologists by enabling them to focus on their scientific goals and questions, abstracting away the underlying technology platform that enacts their data integration and analysis tasks. ADMIRE accomplishes this partly by recognizing three different types of expert, whose interactions require clearly defined interfaces: the domain expert, who is the application specialist; the data-analysis expert, who is a specialist in extracting information from data; and the data-intensive engineer, who develops the infrastructure for data-intensive computation. In order to provide a context in which each category of expert may flourish, ADMIRE uses a three-level architecture. The upper (tool) level supports the work of both domain and data-analysis experts, housing an extensive and evolving set of portals, tools and development environments. The lower (enactment) level houses a large and dynamic community of providers delivering data and data-intensive enactment environments as an evolving infrastructure that supports all of the work underway in the upper layer; most data-intensive engineers work here. The crucial innovation lies in the middle level, a gateway that is a tightly defined and stable interface through which the two diverse and dynamic upper and lower layers communicate. This is a minimal and simple protocol and language (DISPEL), ultimately to be controlled by standards, so that the upper and lower communities may invest, secure in the knowledge that changes in this interface will be carefully managed. We implemented a well-established procedure for processing seismic ambient noise on the prototype architecture. 
The primary goal was to evaluate its capabilities for large-scale integration and analysis of distributed data. A secondary goal was to gauge its potential and the added value that it might bring to the seismological community. Though still in its infant state, the architecture met the demands of our use case and promises to cater for our future requirements. We shall continue to develop its capabilities as part of an EU funded project VERCE - Virtual Earthquake and Seismology Research Community for Europe. VERCE aims to significantly advance our understanding of the Earth in order to aid society in its management of natural resources and hazards. Its strategy is to enable seismologists to fully exploit the under-utilized wealth of seismic data, and key to this is a data-intensive computation framework adapted to the scale and diversity of the community. This is a first step in building a data-intensive highway for geoscientists, smoothing their travel from the primary sources of data to new insights and rapid delivery of actionable information.
User assembly and servicing system for Space Station, an evolving architecture approach
NASA Technical Reports Server (NTRS)
Lavigna, Thomas A.; Cline, Helmut P.
1988-01-01
On-orbit assembly and servicing of a variety of scientific and applications hardware systems is expected to be one of the Space Station's primary functions. The hardware to be serviced will include the attached payloads resident on the Space Station, the free-flying satellites and co-orbiting platforms brought to the Space Station, and the polar orbiting platforms. The requirements for assembling and servicing such a broad spectrum of missions have led to the development of an Assembly and Servicing System Architecture that is composed of a complex array of support elements. This array comprises US elements, both Space Station and non-Space Station, and elements provided by Canada to the Space Station Program. For any given servicing or assembly mission, the necessary support elements will be employed in an integrated manner to satisfy the mission-specific needs. The structure of the User Assembly and Servicing System Architecture, and the manner in which it will evolve throughout the duration of the phased Space Station Program, are discussed. Particular emphasis is placed upon the requirements to be accommodated in each phase, and the development of a logical progression of capabilities to meet these requirements.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.
EHR standards--A comparative study.
Blobel, Bernd; Pharow, Peter
2006-01-01
For ensuring quality and efficiency of patient care, the care paradigm is moving from organization-centered, through process-controlled, towards personal care. This paradigm change in health systems leads to new paradigms for analyzing, designing, implementing and deploying the supporting health information systems, including EHR systems as the core application in a distributed eHealth environment. The paper defines the architectural paradigm for future-proof EHR systems. It compares advanced EHR architectures, referencing them against the Generic Component Model. The paper also introduces the evolving paradigm of autonomous computing for self-organizing health information systems.
Exploring with PAM: Prospecting ANTS Missions for Solar System Surveys
NASA Technical Reports Server (NTRS)
Clark, P. E.; Rilee, M. L.; Curtis, S. A.
2003-01-01
ANTS (Autonomous Nano-Technology Swarm), a large (1000-member) swarm of nano- to pico-class (10 to 1 kg) totally autonomous spacecraft, is being developed as a NASA advanced mission concept. ANTS, based on a hierarchical insect social order, uses an evolvable, self-similar, hierarchical neural system in which individual spacecraft represent the highest-level nodes. ANTS uses swarm intelligence attained through collective, cooperative interactions of the nodes at all levels of the system. At the highest levels, this can take the form of cooperative, collective behavior among the individual spacecraft in a very large constellation. The ANTS neural architecture is designed for totally autonomous operation of complex systems, including spacecraft constellations. The ANTS concept has a number of possible applications. A version of ANTS designed for surveying and determining the resource potential of the asteroid belt, called PAM (Prospecting ANTS Mission), is examined here.
Benefits of Using a Mars Forward Strategy for Lunar Surface Systems
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Griffin, Brand; Smitherman, David; Maples, Dauphne
2009-01-01
This paper identifies potential risk reduction, cost savings and programmatic procurement benefits of a Mars Forward Lunar Surface System architecture that provides commonality or evolutionary development paths for lunar surface system elements applicable to Mars surface systems. The objective of this paper is to identify the potential benefits for incorporating a Mars Forward development strategy into the planned Project Constellation Lunar Surface System Architecture. The benefits include cost savings, technology readiness, and design validation of systems that would be applicable to lunar and Mars surface systems. The paper presents a survey of previous lunar and Mars surface systems design concepts and provides an assessment of previous conclusions concerning those systems in light of the current Project Constellation Exploration Architectures. The operational requirements for current Project Constellation lunar and Mars surface system elements are compared and evaluated to identify the potential risk reduction strategies that build on lunar surface systems to reduce the technical and programmatic risks for Mars exploration. Risk reduction for rapidly evolving technologies is achieved through systematic evolution of technologies and components based on Moore's Law superimposed on the typical NASA systems engineering project development "V-cycle" described in NASA NPR 7120.5. Risk reduction for established or slowly evolving technologies is achieved through a process called the Mars-Ready Platform strategy in which incremental improvements lead from the initial lunar surface system components to Mars-Ready technologies. The potential programmatic benefits of the Mars Forward strategy are provided in terms of the transition from the lunar exploration campaign to the Mars exploration campaign. 
By utilizing a sequential combined procurement strategy for lunar and Mars exploration surface systems, the overall budget wedges for exploration systems are reduced and the costly technological development gap between the lunar and Mars programs can be eliminated. This provides a sustained level of technological competitiveness as well as maintaining a stable engineering and manufacturing capability throughout the entire duration of Project Constellation.
Evolving Digital Ecological Networks
Wagner, Aaron P.; Ofria, Charles
2013-01-01
“It is hard to realize that the living world as we know it is just one among many possibilities” [1]. Evolving digital ecological networks are webs of interacting, self-replicating, and evolving computer programs (i.e., digital organisms) that experience the same major ecological interactions as biological organisms (e.g., competition, predation, parasitism, and mutualism). Despite being computational, these programs evolve quickly in an open-ended way, and starting from only one or two ancestral organisms, the formation of ecological networks can be observed in real-time by tracking interactions between the constantly evolving organism phenotypes. These phenotypes may be defined by combinations of logical computations (hereafter tasks) that digital organisms perform and by expressed behaviors that have evolved. The types and outcomes of interactions between phenotypes are determined by task overlap for logic-defined phenotypes and by responses to encounters in the case of behavioral phenotypes. Biologists use these evolving networks to study active and fundamental topics within evolutionary ecology (e.g., the extent to which the architecture of multispecies networks shape coevolutionary outcomes, and the processes involved). PMID:23533370
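The abstract above states that interaction types between logic-defined phenotypes are determined by task overlap. A minimal sketch of that idea, with invented function and task names (this is not the API of any digital-evolution platform such as Avida):

```python
# Hypothetical sketch: classify the ecological interaction between two
# digital-organism phenotypes from the overlap of the logic tasks they
# perform. Task names and categories are invented for illustration.

def interaction(tasks_a, tasks_b):
    """Classify the interaction implied by task (i.e., resource) overlap."""
    shared = tasks_a & tasks_b
    if not shared:
        return "neutral"          # no resource in common
    if tasks_a == tasks_b:
        return "competition"      # identical niches, full overlap
    return "partial_competition"  # niches overlap only on the shared tasks

a = {"NOT", "AND", "XOR"}
b = {"AND", "OR"}
print(interaction(a, b))  # partial_competition
```

In a real system the outcome would also depend on evolved behavioral responses to encounters, as the abstract notes; set overlap captures only the logic-defined part.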
OSD CALS Architecture Master Plan Study. Concept Paper. Indexing. Volume 30
DOT National Transportation Integrated Search
1989-06-01
An index identifies and reference information which is exchanged between multiple users and systems. The increased automation that will take place as CALS evolves will dictate an increased use of indexes for the successful exchange of information. Th...
Move-tecture: A Conceptual Framework for Designing Movement in Architecture
NASA Astrophysics Data System (ADS)
Yilmaz, Irem
2017-10-01
Along with the technological improvements of our age, it is now possible for movement to become one of the basic components of architectural space. Accordingly, the architectural construction of movement changes both our architectural production practices and our understanding of architectural space. However, existing design concepts and approaches are insufficient to discuss and understand this change. This study therefore aims to form a conceptual framework for the relationship between architecture and movement. To this end, the conceptualization of move-tecture is developed to investigate the architectural construction of movement and the potential for spatial creation through architecturally constructed movement. Move-tecture is a conceptualization that treats movement as a basic component of spatial creation. It presents the framework of a qualitative categorization of the design of moving architectural structures. This categorization is, however, a flexible one that can evolve with the expanding possibilities of architectural design and changing living conditions. With this understanding, six categories are defined within the context of the article: Topological Organization, Choreographic Formation, Kinetic Structuring, Corporeal Constitution, Technological Configuration and Interactional Patterning. In line with these categories, a multifaceted perspective on moving architectural structures is promoted. Such an understanding is intended to constitute a new initiative in the design practices carried out in this area and to provide a conceptual basis for the discussions to be developed.
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly; Ganesan, Dharma; Stratton, William C.; Sibol, Deane E.
2008-01-01
Analyze, visualize, and evaluate structure and behavior using static and dynamic information, for individual systems as well as systems of systems. Next steps: refine software tool support; apply the approach to other systems; and apply it earlier in the system life cycle.
Transitioning from analog to digital communications: An information security perspective
NASA Technical Reports Server (NTRS)
Dean, Richard A.
1990-01-01
A summary is given of the government's perspective on evolving digital communications as they affect secure voice users and approaches for operating during a transition period to an all digital world. An integrated architecture and a mobile satellite interface are discussed.
Genetic architecture of evolved tolerance to PCBs in the estuarine fish Fundulus heteroclitus
Populations of Atlantic killifish (F. heteroclitus) resident to coastal estuarine habitats contaminated with halogenated aromatic hydrocarbons (HAHs) exhibit heritable resistance to the early life-stage toxicity associated with these compounds. Beyond our knowledge of the aryl hy...
Fox, Charles W; Wagner, James D; Cline, Sara; Thomas, Frances Ann; Messina, Frank J
2009-05-01
Independent populations subjected to similar environments often exhibit convergent evolution. An unresolved question is the frequency with which such convergence reflects parallel genetic mechanisms. We examined the convergent evolution of egg-laying behavior in the seed-feeding beetle Callosobruchus maculatus. Females avoid ovipositing on seeds bearing conspecific eggs, but the degree of host discrimination varies among geographic populations. In a previous experiment, replicate lines switched from a small host to a large one evolved reduced discrimination after 40 generations. We used line crosses to determine the genetic architecture underlying this rapid response. The most parsimonious genetic models included dominance and/or epistasis for all crosses. The genetic architecture underlying reduced discrimination in two lines was not significantly different from the architecture underlying differences between geographic populations, but the architecture underlying the divergence of a third line differed from all others. We conclude that convergence of this complex trait may in some cases involve parallel genetic mechanisms.
Kii, Isao; Nishiyama, Takashi; Li, Minqi; Matsumoto, Ken-ichi; Saito, Mitsuru; Amizuka, Norio; Kudo, Akira
2010-01-01
Extracellular matrix (ECM) underlies a complicated multicellular architecture that is subjected to significant forces from the mechanical environment. Although various components of the ECM have been enumerated, the mechanisms that evolve the sophisticated ECM architecture remain to be addressed. Here we show that periostin, a matricellular protein, promotes incorporation of tenascin-C into the ECM and organizes a meshwork architecture of the ECM. We found that both periostin-null mice and tenascin-C-null mice exhibited a similar phenotype, confined tibial periostitis, which possibly corresponds to medial tibial stress syndrome in human sports injuries. Periostin possessed adjacent domains that bind to tenascin-C and to other ECM proteins, fibronectin and type I collagen, respectively. These adjacent domains functioned as a bridge between tenascin-C and the ECM, which increased the deposition of tenascin-C on the ECM. The deposition of hexabrachions of tenascin-C may stabilize bifurcations of the ECM fibrils, which are integrated into the extracellular meshwork architecture. This study suggests a role for periostin in the adaptation of the ECM architecture to the mechanical environment. PMID:19887451
Kii, Isao; Nishiyama, Takashi; Li, Minqi; Matsumoto, Ken-Ichi; Saito, Mitsuru; Amizuka, Norio; Kudo, Akira
2010-01-15
Extracellular matrix (ECM) underlies a complicated multicellular architecture that is subjected to significant forces from the mechanical environment. Although various components of the ECM have been enumerated, the mechanisms that evolve the sophisticated ECM architecture remain to be addressed. Here we show that periostin, a matricellular protein, promotes incorporation of tenascin-C into the ECM and organizes a meshwork architecture of the ECM. We found that both periostin-null mice and tenascin-C-null mice exhibited a similar phenotype, confined tibial periostitis, which possibly corresponds to medial tibial stress syndrome in human sports injuries. Periostin possessed adjacent domains that bind to tenascin-C and to other ECM proteins, fibronectin and type I collagen, respectively. These adjacent domains functioned as a bridge between tenascin-C and the ECM, which increased the deposition of tenascin-C on the ECM. The deposition of hexabrachions of tenascin-C may stabilize bifurcations of the ECM fibrils, which are integrated into the extracellular meshwork architecture. This study suggests a role for periostin in the adaptation of the ECM architecture to the mechanical environment.
Investigation of rat exploratory behavior via evolving artificial neural networks.
Costa, Ariadne de Andrade; Tinós, Renato
2016-09-01
Neuroevolution is the use of evolutionary computation to define the architecture and/or to train artificial neural networks (ANNs). This strategy has been employed to investigate the behavior of rats in the elevated plus-maze, a widely used tool for studying anxiety in mice and rats. Here we propose a neuroevolutionary model in which both the weights and the architecture of artificial neural networks (our virtual rats) are evolved by a genetic algorithm. This model improves on a previous model that evolved only the weights of the ANN. To compare the two models, we analyzed traditional measures of anxiety behavior, such as the time spent in and the number of entries into the open and closed arms of the maze. When compared to real rat data, our findings suggest that the results from the model introduced here are statistically better than those from other models in the literature; neuroevolution of the architecture is thus clearly important for the development of the virtual rats. Moreover, this technique allowed us to assess the importance of the different sensory units and of the number of hidden neurons (acting as memory) in the ANNs.
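To make the technique concrete, here is a minimal, self-contained sketch of a genetic algorithm that evolves both the architecture (a mask enabling or disabling hidden units) and the weights of a tiny feed-forward network. The task (XOR), network sizes, mutation rates, and all names are illustrative assumptions, not the authors' model:

```python
# Toy neuroevolution sketch: a GA evolves a hidden-unit mask (architecture)
# and real-valued weights of a small network on XOR. All parameters are
# invented for illustration; this is not the paper's virtual-rat model.
import math
import random

random.seed(0)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
MAX_HIDDEN = 4
N_W = MAX_HIDDEN * 3 + MAX_HIDDEN + 1  # (2 weights + bias) per hidden unit,
                                       # then hidden->output weights + bias

def forward(genome, x):
    mask, w = genome
    idx, h = 0, []
    for j in range(MAX_HIDDEN):
        s = w[idx] * x[0] + w[idx + 1] * x[1] + w[idx + 2]
        idx += 3
        h.append(math.tanh(s) if mask[j] else 0.0)  # disabled units output 0
    out = sum(w[idx + j] * h[j] for j in range(MAX_HIDDEN)) + w[idx + MAX_HIDDEN]
    return 1 / (1 + math.exp(-out))

def fitness(genome):
    return -sum((forward(genome, x) - y) ** 2 for x, y in XOR)

def mutate(genome):
    mask, w = list(genome[0]), list(genome[1])
    if random.random() < 0.1:                       # architecture mutation
        j = random.randrange(MAX_HIDDEN)
        mask[j] = 1 - mask[j]
    for i in range(len(w)):                         # weight mutation
        if random.random() < 0.2:
            w[i] += random.gauss(0, 0.5)
    return (mask, w)

pop = [([1] * MAX_HIDDEN, [random.gauss(0, 1) for _ in range(N_W)])
       for _ in range(60)]
for gen in range(300):                              # elitist selection
    pop.sort(key=fitness, reverse=True)
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(40)]

best = max(pop, key=fitness)
print("squared error:", round(-fitness(best), 3))
```

The key point mirrored from the abstract is that the genome carries two layers, a structural part (the mask) and a parametric part (the weights), and both are under selection at once.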
Architecture of Allosteric Materials and Edge Modes
NASA Astrophysics Data System (ADS)
Yan, Le; Ravasio, Riccardo; Brito, Carolina; Wyart, Matthieu
Allostery, a long-range elasticity-mediated interaction, remains the biggest mystery decades after its discovery in proteins. We introduce a numerical scheme to evolve functional materials that can accomplish a specified mechanical task. In this scheme, the number of solutions, their spatial architectures and the correlations among them can be computed. As an example, we consider an ``allosteric'' task, which requires the material to respond specifically to a stimulus at a distant active site. We find that functioning materials evolve a less-constrained trumpet-shaped region connecting the stimulus and active sites and that the amplitude of the elastic response varies non-monotonically along the trumpet. As previously shown for some proteins, we find that correlations appearing during evolution alone are sufficient to identify key aspects of this design. Finally, we show that the success of this architecture stems from the emergence of soft edge modes recently found to appear near the surface of marginally connected materials. Overall, our in silico evolution experiment offers a new window to study the relationship between structure, function, and correlations emerging during evolution. L.Y. was supported in part by the National Science Foundation under Grant No. NSF PHY11-25915. M.W. thanks the Swiss National Science Foundation for support under Grant No. 200021-165509 and the Simons Foundation Grant (#454953 Matthieu Wyart).
Experience in running relational databases on clustered storage
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, Ruben; Potocky, Miroslav
2015-12-01
For the past eight years, the CERN IT Database group has based its backend storage on a NAS (Network-Attached Storage) architecture, providing database access via the NFS (Network File System) protocol. In the last two and a half years, our storage has evolved from a scale-up architecture to a scale-out one. This paper describes our setup and a set of functionalities providing key features to other services, such as Database on Demand [1] and the CERN Oracle backup and recovery service. It also outlines a possible evolutionary trend that storage for databases could follow.
Rollout Strategy to Implement Interoperable Traceability in the Seafood Industry.
Gooch, Martin; Dent, Benjamin; Sylvia, Gilbert; Cusack, Christopher
2017-08-01
Verifying the accuracy and rigor of data exchanged within and between businesses for the purposes of traceability rests on the existence of effective and efficient interoperable information systems that meet users' needs. Interoperability, particularly given the complexities intrinsic to the seafood industry, requires that the systems used by businesses operating along the supply chain share a common technology architecture that is robust, resilient, and evolves as industry needs change. Technology architectures are developed by engaging industry stakeholders in understanding why an architecture is required, the benefits it provides to the industry and to individual businesses and supply chains, and how the architecture will translate into practical results. This article begins by reiterating the benefits that the global seafood industry can capture by implementing interoperable chain-length traceability and the reasons for basing the architecture on a peer-to-peer networked database concept rather than more traditional centralized or linear approaches. A summary of capabilities that already exist within the seafood industry and that the proposed architecture uses is discussed, and a strategy for implementing the architecture is presented. The 6-step strategy is presented in the form of a critical path.
Rübben, Albert; Nordhoff, Ole
2013-01-01
Most clinically distinguishable malignant tumors are characterized by specific mutations, specific patterns of chromosomal rearrangements and a predominant mechanism of genetic instability, but it remains unsolved whether modifications of cancer genomes can be explained solely by mutations and selection through the cancer microenvironment. It has been suggested that the internal dynamics of genomic modifications, as opposed to external evolutionary forces, have a significant and complex impact on Darwinian species evolution. A similar situation can be expected for somatic cancer evolution, as molecular key mechanisms encountered in species evolution also constitute prevalent mutation mechanisms in human cancers. This assumption is developed into a systems approach of carcinogenesis which focuses on possible inner constraints of the genome architecture on lineage selection during somatic cancer evolution. The proposed systems approach can be considered an analogy to the concept of evolvability in species evolution. The principal hypothesis is that permissive or restrictive effects of the genome architecture on lineage selection during somatic cancer evolution exist and have a measurable impact. The systems approach postulates three classes of lineage selection effects of the genome architecture on somatic cancer evolution: i) effects mediated by changes of fitness of cells of a cancer lineage, ii) effects mediated by changes of mutation probabilities and iii) effects mediated by changes of gene designation and physical and functional genome redundancy. Physical genome redundancy is the copy number of identical genetic sequences. Functional genome redundancy of a gene or a regulatory element is defined as the number of different genetic elements, regardless of copy number, coding for the same specific biological function within a cancer cell. 
Complex interactions of the genome architecture on lineage selection may be expected when modifications of the genome architecture have multiple and possibly opposed effects which manifest themselves at disparate times and progression stages. Dissection of putative mechanisms mediating constraints exerted by the genome architecture on somatic cancer evolution may provide an algorithm for understanding and predicting as well as modifying somatic cancer evolution in individual patients. PMID:23336076
Plasma Oscillation Characterization of NASA's HERMeS Hall Thruster via High Speed Imaging
NASA Technical Reports Server (NTRS)
Huang, Wensheng; Kamhawi, Hani; Haag, Thomas W.
2016-01-01
For missions beyond low Earth orbit, spacecraft size and mass can be dominated by onboard chemical propulsion systems and propellants that may constitute more than 50 percent of the spacecraft mass. This impact can be substantially reduced through the utilization of Solar Electric Propulsion (SEP) due to its substantially higher specific impulse. Studies performed for NASA's Human Exploration and Operations Mission Directorate and Science Mission Directorate have demonstrated that a 50kW-class SEP capability can be enabling for both near term and future architectures and science missions. A high-power SEP element is integral to the Evolvable Mars Campaign, which presents an approach to establish an affordable evolutionary human exploration architecture. To enable SEP missions at the power levels required for these applications, an in-space demonstration of an operational 50kW-class SEP spacecraft has been proposed as a SEP Technology Demonstration Mission (TDM). In 2010 NASA's Space Technology Mission Directorate (STMD) began developing high-power electric propulsion technologies. The maturation of these critical technologies has made mission concepts utilizing high-power SEP viable.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investment, such as rewriting in modern languages, new data constructs, etc., and would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used for multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach and up to 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.
A review of experimental techniques to produce a nacre-like structure.
Corni, I; Harvey, T J; Wharton, J A; Stokes, K R; Walsh, F C; Wood, R J K
2012-09-01
The performance of man-made materials can be improved by exploring new structures inspired by the architecture of biological materials. Natural materials, such as nacre (mother-of-pearl), can have outstanding mechanical properties due to their complicated architecture and hierarchical structure at the nano-, micro- and meso-levels which have evolved over millions of years. This review describes the numerous experimental methods explored to date to produce composites with structures and mechanical properties similar to those of natural nacre. The materials produced have sizes ranging from nanometres to centimetres, processing times varying from a few minutes to several months and a different range of mechanical properties that render them suitable for various applications. For the first time, these techniques have been divided into those producing bulk materials, coatings and free-standing films. This is due to the fact that the material's application strongly depends on its dimensions and different results have been reported by applying the same technique to produce materials with different sizes. The limitations and capabilities of these methodologies have been also described.
Space Telescope Sensitivity and Controls for Exoplanet Imaging
NASA Technical Reports Server (NTRS)
Lyon, Richard G.; Clampin, Mark
2012-01-01
Herein we address design considerations and outline requirements for space telescopes with capabilities for high contrast imaging of exoplanets. The approach taken is to identify the span of potentially detectable Earth-sized terrestrial planets in the habitable zone of the nearest stars within 30 parsecs and estimate their inner working angles, flux ratios, SNR, sensitivities, wavefront error requirements and sensing and control times parametrically versus aperture size. We consider 1, 2, 4, 8 and 16-meter diameter telescope apertures. The achievable science, range of telescope architectures, and the coronagraphic approach are all active areas of research and are all subject to change in a rapidly evolving field. Thus, presented is a snapshot of our current understanding with the goal of limiting the choices to those that appear currently technically feasible. We describe the top-level metrics of inner working angle, contrast and photometric throughput and explore how they are related to the range of target stars. A critical point is that for each telescope architecture and coronagraphic choice the telescope stability requirements have differing impacts on the design for open versus closed-loop sensing and control.
NASA Astrophysics Data System (ADS)
Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.
2012-04-01
The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure that is supported by an agile knowledge base for critical decision support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Furthermore, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich, OGC SWE compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture that enables fast response to evolving subscribers' requirements as a tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialize in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge-base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.
Network architecture test-beds as platforms for ubiquitous computing.
Roscoe, Timothy
2008-10-28
Distributed systems research, and in particular ubiquitous computing, has traditionally assumed the Internet as a basic underlying communications substrate. Recently, however, the networking research community has come to question the fundamental design or 'architecture' of the Internet. This has been led by two observations: first, that the Internet as it stands is now almost impossible to evolve to support new functionality; and second, that modern applications of all kinds now use the Internet rather differently, and frequently implement their own 'overlay' networks above it to work around its perceived deficiencies. In this paper, I discuss recent academic projects to allow disruptive change to the Internet architecture, and also outline a radically different view of networking for ubiquitous computing that such proposals might facilitate.
Healthy eating design guidelines for school architecture.
Huang, Terry T-K; Sorensen, Dina; Davis, Steven; Frerichs, Leah; Brittin, Jeri; Celentano, Joseph; Callahan, Kelly; Trowbridge, Matthew J
2013-01-01
We developed a new tool, Healthy Eating Design Guidelines for School Architecture, to provide practitioners in architecture and public health with a practical set of spatially organized and theory-based strategies for making school environments more conducive to learning about and practicing healthy eating by optimizing physical resources and learning spaces. The design guidelines, developed through multidisciplinary collaboration, cover 10 domains of the school food environment (eg, cafeteria, kitchen, garden) and 5 core healthy eating design principles. A school redesign project in Dillwyn, Virginia, used the tool to improve the schools' ability to adopt a healthy nutrition curriculum and promote healthy eating. The new tool, now in a pilot version, is expected to evolve as its components are tested and evaluated through public health and design research.
The nuclear envelope as an integrator of nuclear and cytoplasmic architecture.
Crisp, Melissa; Burke, Brian
2008-06-18
Initially perceived as little more than a container for the genome, our view of the nuclear envelope (NE) and its role in defining global nuclear architecture has evolved significantly in recent years. The recognition that certain human diseases arise from defects in NE components has provided new insight into its structural and regulatory functions. In particular, NE defects associated with striated muscle disease have been shown to cause structural perturbations not just of the nucleus itself but also of the cytoplasm. It is now becoming increasingly apparent that these two compartments display co-dependent mechanical properties. The identification of cytoskeletal binding complexes that localize to the NE now reveals a molecular framework that can seamlessly integrate nuclear and cytoplasmic architecture.
Corwin, John; Silberschatz, Avi; Miller, Perry L; Marenco, Luis
2007-01-01
Data sparsity and schema evolution issues affecting clinical informatics and bioinformatics communities have led to the adoption of vertical or object-attribute-value-based database schemas to overcome limitations posed when using conventional relational database technology. This paper explores these issues and discusses why biomedical data are difficult to model using conventional relational techniques. The authors propose a solution to these obstacles based on a relational database engine using a sparse, column-store architecture. The authors provide benchmarks comparing the performance of queries and schema-modification operations using three different strategies: (1) the standard conventional relational design; (2) past approaches used by biomedical informatics researchers; and (3) their sparse, column-store architecture. The performance results show that their architecture is a promising technique for storing and processing many types of data that are not handled well by the other two semantic data models.
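The entity-attribute-value (EAV) pattern mentioned above can be illustrated with a tiny, self-contained example. This is a sketch of the generic EAV idea only, not the authors' sparse column-store engine, and the table and attribute names are invented:

```python
# Illustrative EAV sketch: sparse clinical attributes stored as rows, so a
# new attribute needs no ALTER TABLE. Toy example with invented names; not
# the benchmarked column-store architecture from the paper.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE eav (entity INTEGER, attr TEXT, value TEXT)")
rows = [
    (1, "diagnosis", "asthma"),
    (1, "age", "34"),
    (2, "diagnosis", "copd"),
    (2, "smoker", "yes"),   # sparse: entity 1 simply has no 'smoker' row
]
con.executemany("INSERT INTO eav VALUES (?, ?, ?)", rows)

# Pivot back to a conventional row view with conditional aggregation.
pivoted = con.execute("""
    SELECT entity,
           MAX(CASE WHEN attr='diagnosis' THEN value END) AS diagnosis,
           MAX(CASE WHEN attr='smoker'    THEN value END) AS smoker
    FROM eav GROUP BY entity ORDER BY entity
""").fetchall()
print(pivoted)  # [(1, 'asthma', None), (2, 'copd', 'yes')]
```

The pivot query also shows the cost the paper is addressing: reconstructing a conventional row view from EAV data requires one conditional aggregate per attribute, which is what motivates engine-level support such as a sparse column store.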
Geotechnical engineering practices in Canada and Europe
DOT National Transportation Integrated Search
2011-12-01
This report describes Machine-to-Machine service architecture and how it is evolving over the next several years. Nearly 50 billion Machine-to-Machine (M2M) devices are predicted to be deployed by all sectors by 2025. The largest impediment to M2M de...
MSAT signalling and network management architectures
NASA Technical Reports Server (NTRS)
Garland, Peter; Keelty, J. Malcolm
1989-01-01
Spar Aerospace has been active in the design and definition of mobile satellite systems since the mid-1970s. In work sponsored by the Canadian Department of Communications, various payload configurations have evolved. In addressing the payload configuration, the requirements of the mobile user, the service provider and the satellite operator have always been the most important considerations. The current Spar 11-beam satellite design is reviewed, and its capabilities to provide flexibility and potential for network growth within the WARC87 allocations are explored. To enable the full capabilities of the payload to be realized, a large amount of ground-based switching and network management infrastructure will be required when the space segment becomes available. Early indications were that a single custom-designed Demand Assignment Multiple Access (DAMA) switch should be implemented to provide efficient use of the space segment. As MSAT has evolved into a multiple-service concept supporting many service providers, this architecture should be reviewed. Some possible signalling and network management solutions are explored.
Hybridization Reveals the Evolving Genomic Architecture of Speciation
Kronforst, Marcus R.; Hansen, Matthew E.B.; Crawford, Nicholas G.; Gallant, Jason R.; Zhang, Wei; Kulathinal, Rob J.; Kapan, Durrell D.; Mullen, Sean P.
2014-01-01
SUMMARY The rate at which genomes diverge during speciation is unknown, as are the physical dynamics of the process. Here, we compare full genome sequences of 32 butterflies, representing five species from a hybridizing Heliconius butterfly community, to examine genome-wide patterns of introgression and infer how divergence evolves during the speciation process. Our analyses reveal that initial divergence is restricted to a small fraction of the genome, largely clustered around known wing-patterning genes. Over time, divergence evolves rapidly, due primarily to the origin of new divergent regions. Furthermore, divergent genomic regions display signatures of both selection and adaptive introgression, demonstrating the link between microevolutionary processes acting within species and the origin of species across macroevolutionary timescales. Our results provide a uniquely comprehensive portrait of the evolving species boundary due to the role that hybridization plays in reducing the background accumulation of divergence at neutral sites. PMID:24183670
Utilizing data grid architecture for the backup and recovery of clinical image data.
Liu, Brent J; Zhou, M Z; Documet, J
2005-01-01
Grid Computing represents the latest and most exciting technology to evolve from the familiar realm of parallel, peer-to-peer and client-server models. However, there has been limited investigation into the impact of this emerging technology in medical imaging and informatics. In particular, PACS technology, an established clinical image repository system, while having matured significantly during the past ten years, still remains weak in the area of clinical image data backup. Current solutions are expensive or time consuming and the technology is far from foolproof. Many large-scale PACS archive systems still encounter downtime for hours or days, which has the critical effect of crippling daily clinical operations. In this paper, a review of current backup solutions will be presented along with a brief introduction to grid technology. Finally, research and development utilizing the grid architecture for the recovery of clinical image data, in particular, PACS image data, will be presented. The focus of this paper is centered on applying a grid computing architecture to a DICOM environment since DICOM has become the standard for clinical image data and PACS utilizes this standard. A federation of PACS can be created allowing a failed PACS archive to recover its image data from others in the federation in a seamless fashion. The design reflects the five-layer architecture of grid computing: Fabric, Resource, Connectivity, Collective, and Application Layers. The testbed Data Grid is composed of one research laboratory and two clinical sites. The Globus 3.0 Toolkit (Co-developed by the Argonne National Laboratory and Information Sciences Institute, USC) for developing the core and user level middleware is utilized to achieve grid connectivity. 
The successful implementation and evaluation of utilizing data grid architecture for clinical PACS data backup and recovery will provide an understanding of the methodology for using Data Grid in clinical image data backup for PACS, as well as establishment of benchmarks for performance from future grid technology improvements. In addition, the testbed can serve as a road map for expanded research into large enterprise and federation level data grids to guarantee CA (Continuous Availability, 99.999% up time) in a variety of medical data archiving, retrieval, and distribution scenarios.
Modeling the evolution of protein domain architectures using maximum parsimony.
Fong, Jessica H; Geer, Lewis Y; Panchenko, Anna R; Bryant, Stephen H
2007-02-09
Domains are basic evolutionary units of proteins and most proteins have more than one domain. Advances in domain modeling and collection are making it possible to annotate a large fraction of known protein sequences by a linear ordering of their domains, yielding their architecture. Protein domain architectures link evolutionarily related proteins and underscore their shared functions. Here, we attempt to better understand this association by identifying the evolutionary pathways by which extant architectures may have evolved. We propose a model of evolution in which architectures arise through rearrangements of inferred precursor architectures and acquisition of new domains. These pathways are ranked using a parsimony principle, whereby scenarios requiring the fewest number of independent recombination events, namely fission and fusion operations, are assumed to be more likely. Using a data set of domain architectures present in 159 proteomes that represent all three major branches of the tree of life allows us to estimate the history of over 85% of all architectures in the sequence database. We find that the distribution of rearrangement classes is robust with respect to alternative parsimony rules for inferring the presence of precursor architectures in ancestral species. Analyzing the most parsimonious pathways, we find 87% of architectures to gain complexity over time through simple changes, among which fusion events account for 5.6 times as many architectures as fission. Our results may be used to compute domain architecture similarities, for example, based on the number of historical recombination events separating them. Domain architecture "neighbors" identified in this way may lead to new insights about the evolution of protein function.
Modeling the Evolution of Protein Domain Architectures Using Maximum Parsimony
Fong, Jessica H.; Geer, Lewis Y.; Panchenko, Anna R.; Bryant, Stephen H.
2007-01-01
Domains are basic evolutionary units of proteins and most proteins have more than one domain. Advances in domain modeling and collection are making it possible to annotate a large fraction of known protein sequences by a linear ordering of their domains, yielding their architecture. Protein domain architectures link evolutionarily related proteins and underscore their shared functions. Here, we attempt to better understand this association by identifying the evolutionary pathways by which extant architectures may have evolved. We propose a model of evolution in which architectures arise through rearrangements of inferred precursor architectures and acquisition of new domains. These pathways are ranked using a parsimony principle, whereby scenarios requiring the fewest number of independent recombination events, namely fission and fusion operations, are assumed to be more likely. Using a data set of domain architectures present in 159 proteomes that represent all three major branches of the tree of life allows us to estimate the history of over 85% of all architectures in the sequence database. We find that the distribution of rearrangement classes is robust with respect to alternative parsimony rules for inferring the presence of precursor architectures in ancestral species. Analyzing the most parsimonious pathways, we find 87% of architectures to gain complexity over time through simple changes, among which fusion events account for 5.6 times as many architectures as fission. Our results may be used to compute domain architecture similarities, for example, based on the number of historical recombination events separating them. Domain architecture “neighbors” identified in this way may lead to new insights about the evolution of protein function. PMID:17166515
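The parsimony principle over fusion events can be illustrated with a toy search (not the paper's actual algorithm): find the fewest fusion (concatenation) events needed to assemble a target domain architecture from a set of precursor architectures. The domain names and the PRECURSORS set are invented for the example:

```python
from functools import lru_cache

# Toy precursor set: architectures are tuples of domain names.
PRECURSORS = {("A",), ("B", "C"), ("A", "B"), ("C",)}

@lru_cache(maxsize=None)
def min_fusions(target):
    """Minimum number of fusion events to build target, or None if impossible."""
    if target in PRECURSORS:
        return 0  # already exists as a precursor; no event needed
    best = None
    for i in range(1, len(target)):  # try every split into left + right
        left, right = min_fusions(target[:i]), min_fusions(target[i:])
        if left is not None and right is not None:
            cost = left + right + 1  # one fusion joins the two parts
            best = cost if best is None else min(best, cost)
    return best

print(min_fusions(("A", "B", "C")))  # fuse ("A",) with ("B","C"): 1 event
```

The real analysis also scores fission events and weighs alternative precursor inferences across 159 proteomes; this sketch only shows why "fewest independent events" yields a well-defined ranking.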
Development and Evaluation of a Faculty Designed Courseware
ERIC Educational Resources Information Center
Sternberger, Carol
2006-01-01
Electronic delivery of courses presents an evolving process and one that necessitates a change in the architecture of learning designs. Moving beyond the adaptation of familiar methodology for electronic delivery is challenging and requires innovation. The inclusion of interactive courseware in an electronically delivered course addresses varied…
The Technology of Teaching Young Handicapped Children.
ERIC Educational Resources Information Center
Bijou, Sidney W.
To fabricate a technology for teaching young school children with serious behavior problems, classroom materials, curriculum format, and teaching procedures were developed, and problems that evolve from the technology investigated. Two classrooms were architecturally designed to provide the basic needs of a special classroom and to facilitate…
Key Facts about Higher Education in Washington
ERIC Educational Resources Information Center
Washington Higher Education Coordinating Board, 2011
2011-01-01
Since its establishment in the 1860s, Washington's higher education system has evolved rapidly to meet a myriad of state needs in fields as diverse as agriculture, bioscience, chemistry, environmental sciences, engineering, medicine, law, business, computer science, and architecture. Today, higher education, like other vital state functions, faces…
NASA Astrophysics Data System (ADS)
Lewe, Jung-Ho
The National Transportation System (NTS) is undoubtedly a complex system-of-systems---a collection of diverse 'things' that evolve over time, organized at multiple levels, to achieve a range of possibly conflicting objectives, and never quite behaving as planned. The purpose of this research is to develop a virtual transportation architecture for the ultimate goal of formulating an integrated decision-making framework. The foundational endeavor begins with creating an abstraction of the NTS with the belief that a holistic frame of reference is required to properly study such a multi-disciplinary, trans-domain system. The culmination of the effort produces the Transportation Architecture Field (TAF) as a mental model of the NTS, in which the relationships between four basic entity groups are identified and articulated. This entity-centric abstraction framework underpins the construction of a virtual NTS couched in the form of an agent-based model. The transportation consumers and the service providers are identified as adaptive agents that apply a set of preprogrammed behavioral rules to achieve their respective goals. The transportation infrastructure and multitude of exogenous entities (disruptors and drivers) in the whole system can also be represented without resorting to an extremely complicated structure. The outcome is a flexible, scalable, computational model that allows for examination of numerous scenarios which involve the cascade of interrelated effects of aviation technology, infrastructure, and socioeconomic changes throughout the entire system.
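The entity-centric, rule-based agent idea can be sketched with a toy market of consumers and providers. The Provider class, the behavioural rules, and all numbers below are invented for illustration and are not the TAF model:

```python
# Consumer agents pick the cheaper of two transport providers each round;
# providers adjust price with demand. Preprogrammed rules, emergent dynamics.
class Provider:
    def __init__(self, name, price):
        self.name, self.price, self.customers = name, price, 0

    def adjust(self):
        # Simple rule: raise price when busy, cut it when idle.
        self.price *= 1.05 if self.customers > 5 else 0.95
        self.customers = 0

providers = [Provider("air", 100.0), Provider("rail", 80.0)]
for _round in range(10):
    for _ in range(10):  # ten consumer agents per round
        cheapest = min(providers, key=lambda p: p.price)
        cheapest.customers += 1
    for p in providers:
        p.adjust()

print(sorted(p.name for p in providers))  # → ['air', 'rail']
```

Even this two-rule system produces oscillating, self-correcting prices, hinting at how a full agent-based NTS model can yield cascading effects no single equation prescribes.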
NASA Technical Reports Server (NTRS)
King, Ellis; Hart, Jeremy; Odegard, Ryan
2010-01-01
The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
Evolving neural networks for strategic decision-making problems.
Kohl, Nate; Miikkulainen, Risto
2009-04-01
Evolution of neural networks, or neuroevolution, has been a successful approach to many low-level control problems such as pole balancing, vehicle control, and collision warning. However, certain types of problems-such as those involving strategic decision-making-have remained difficult for neuroevolution to solve. This paper evaluates the hypothesis that such problems are difficult because they are fractured: The correct action varies discontinuously as the agent moves from state to state. A method for measuring fracture using the concept of function variation is proposed and, based on this concept, two methods for dealing with fracture are examined: neurons with local receptive fields, and refinement based on a cascaded network architecture. Experiments in several benchmark domains are performed to evaluate how different levels of fracture affect the performance of neuroevolution methods, demonstrating that these two modifications improve performance significantly. These results form a promising starting point for expanding neuroevolution to strategic tasks.
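The notion of fracture can be illustrated with a crude proxy (our own construction, inspired by but not identical to the paper's function-variation measure): the total variation of the optimal action sampled along a path through state space:

```python
def total_variation(actions):
    """Sum of absolute changes between consecutive optimal actions."""
    return sum(abs(b - a) for a, b in zip(actions, actions[1:]))

smooth = [0, 0, 1, 1, 1, 2, 2]      # action changes gradually with state
fractured = [0, 2, 0, 2, 1, 0, 2]   # action jumps discontinuously

print(total_variation(smooth), total_variation(fractured))  # 2 10
```

A high score means the correct action varies discontinuously from state to state, which is exactly the regime where the paper finds standard neuroevolution struggles and local receptive fields or cascaded refinement help.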
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
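Divisive normalization itself is a standard computation; a minimal sketch (the sigma parameter and example values are arbitrary) shows how the same option receives a different relative code in different choice sets:

```python
def divisive_normalization(values, sigma=1.0):
    """Scale each option's value by sigma plus the summed value of the set."""
    denom = sigma + sum(values)
    return [v / denom for v in values]

# The same option (value 4) is coded differently in different contexts:
print(divisive_normalization([4, 1]))     # relatively strong coding
print(divisive_normalization([4, 1, 5]))  # a third option suppresses it
```

This context dependence of the normalized code is the mechanism the abstract invokes to explain behavioral violations of classical normative theory, such as preference shifts when irrelevant alternatives are added.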
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Willoughby, John K.; Gardner, Jo A.; Shinkle, Gerald L.
1993-01-01
In 1992, NASA made the decision to evolve a Consolidated Planning System (CPS) by adding the Space Transportation System (STS) requirements to the Space Station Freedom (SSF) planning software. This paper describes this evolutionary process, which began with a series of six-month design-build-test cycles, using a domain-independent architecture and a set of developmental tools known as the Advanced Scheduling Environment. It is shown that, during these tests, the CPS could be used at multiple organizational levels of planning and for integrating schedules from geographically distributed (including international) planning environments. The potential for using the CPS for other planning and scheduling tasks in the SSF program is being currently examined.
Interplanetary laser ranging - an emerging technology for planetary science missions
NASA Astrophysics Data System (ADS)
Dirkx, D.; Vermeersen, L. L. A.
2012-09-01
Interplanetary laser ranging (ILR) is an emerging technology for very high accuracy distance determination between Earth-based stations and spacecraft or landers at interplanetary distances. It has evolved from laser ranging to Earth-orbiting satellites, modified with active laser transceiver systems at both ends of the link instead of the passive space-based retroreflectors. It has been estimated that this technology can be used for mm- to cm-level accuracy range determination at interplanetary distances [2, 7]. Work is being performed in the ESPaCE project [6] to evaluate in detail the potential and limitations of this technology by means of bottom-up laser link simulation, allowing for a reliable performance estimate from mission architecture and hardware characteristics.
NASA Astrophysics Data System (ADS)
Häner, R.; Wächter, J.
2012-04-01
The project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme, aims at establishing a network of dedicated, autonomous legacy systems for large-scale concurrent management of natural crises utilising heterogeneous information resources. TRIDEC's architecture reflects the System-of-Systems (SoS) approach, which is based on task-oriented systems cooperatively interacting as a collective in a common environment. The design of the TRIDEC-SoS follows the principles of service-oriented and event-driven architectures (SOA & EDA), with a strong focus on loose coupling of the systems. The SoS approach in combination with SOA and EDA has the distinction of being able to provide novel and coherent behaviours and features resulting from a process of dynamic self-organisation. Self-organisation is a process without the need for a central or external coordinator controlling it through orchestration. It is the result of enacted concurrent tasks in a collaborative environment of geographically distributed systems. Although the individual systems act completely autonomously, their interactions expose emergent structures of an evolving nature. Particularly important is the fact that SoS are inherently able to evolve on all facets of intelligent information management. This includes adaptive properties, e.g. seamless integration of new resource types or the adoption of new fields in natural crisis management. In the case of TRIDEC, with various heterogeneous participants involved, concurrent information processing is of fundamental importance because of the achievable improvements regarding cooperative decision making. Collaboration within TRIDEC will be implemented with choreographies and conversations.
Choreographies specify the expected behaviour between two or more participants; conversations describe the message exchange between all participants, emphasising their logical relation. The TRIDEC choreography will be based on the definition of Behavioural Interfaces and Service Level Agreements, which describe the interactions of all participants involved in the collaborative process by binding the tasks of dedicated systems to high-level business processes. All methods of a Behavioural Interface can be assigned dynamically to the activities of a business process. This makes it possible to utilise a system during the run-time of a business process and thus, for example, enables task balancing or the delegation of responsibilities. Since the individual parts of a SoS are normally managed independently and operate autonomously because of their geographical distribution, it is of vital importance to ensure the reliability (robustness and correctness) of their interactions, which will be achieved by applying the Design by Contract (DbC) approach to the TRIDEC architecture. The key challenge for TRIDEC is establishing a reliable adaptive system which exposes emergent behaviour, for example intelligent monitoring strategies or dynamic system adaptations even in the case of partial system failures. It is essential for TRIDEC that, for example, redundant parts of the system can take over tasks from defective components in a process of re-organising its network.
Urban Landscape Architecture in the Reshaping of the Contemporary Cityscape
NASA Astrophysics Data System (ADS)
Ananiadou-Tzimopoulou, Maria; Bourlidou, Anastasia
2017-10-01
The contemporary urban landscape is the evolving image of dynamic social, economic and ecological changes and heterogeneity. It constitutes the mirror of history, natural and cultural, urban processes, as well as locations of hybrid character, such as degraded and fragmented spaces within the urban fabric or at the city boundaries: areas in between, infrastructures, post-industrial and waterfront sites, but also potential grounds for urban development. Along with the awakening of global ecological awareness and the ongoing discussion on sustainability issues, the cityscape, with its new attributes, constitutes a challenging field of research and planning for various disciplines beyond landscape architecture, such as architecture, planning, ecology, environment and engineering. This paper focuses on the role of urban landscape architecture, via its theory and practice, in the reshaping of the city territory. It aspires to broaden the discussion concerning the upgrading of contemporary cities, aiming firstly at the determination of a wider vocabulary for the urban landscape and its design, and secondly at highlighting landscape architecture’s contribution to the sustainable perspective of urban design and planning. The methodology is based on comparative research implemented both on a theoretical level and on a level of applied work. Urban landscape architecture is described through theory and practice, along with correlative approaches deriving mainly from landscape urbanism and secondarily from the field of architecture. Urban landscape is approached as a socio-ecological and perceptually legible entity, a territory of culture, process and production; operating as an entity of ecological, infrastructural systems and planning needs, it is also regarded as a precedent for urban development.
Furthermore, the research is supported by selected European and international urban landscape projects, presented in a cohesive multiscalar approach, from the node to the region. Theory is reflected upon: a/ smaller-scale projects (cultural landscapes), b/ infrastructural projects, c/ extended process territories and d/ grand metropolitan projects. The particular case studies constitute representative design approaches dealing with urban complexity and are hierarchized on qualitative criteria, spatial and functional; they are indicative of the spectrum of project scale and type of intervention (redesign, reclamation, reuse, planning), but also of the project’s operational value (cultural, infrastructural, strategic). They stress the importance of landscape’s flexible and open-ended nature and ultimately underline the crucial role of urban landscape architecture, within transdisciplinarity and sustainable design strategies, in the regeneration of the contemporary cityscape.
A Method for Aligning Acquisition Strategies and Software Architectures
2014-09-01
Lukatskaya, Maria R.; Bak, Seong-Min; Yu, Xiqian; ...
2015-05-28
The field of supercapacitors (electrochemical capacitors) is constantly evolving. The global motivation is to create devices that possess a significant energy density without compromising the power density. To achieve this goal, new materials must be discovered and complex electrode architectures developed.
Athens Junior High School, Athens, Tennessee. Profile of a Significant School.
ERIC Educational Resources Information Center
Justus, John E., Ed.
This article describes a school which is the product of architectural design evolving from educational specifications. The building demonstrates the relationship of pupil learning to the component systems of the building design. Brief explanations and illustrations (from the planning and design stage) are made for an instructional materials…
Systems and Algorithms for Automated Collaborative Observation Using Networked Robotic Cameras
ERIC Educational Resources Information Center
Xu, Yiliang
2011-01-01
The development of telerobotic systems has evolved from Single Operator Single Robot (SOSR) systems to Multiple Operator Multiple Robot (MOMR) systems. The relationship between human operators and robots follows the master-slave control architecture and the requests for controlling robot actuation are completely generated by human operators. …
Healthy Eating Design Guidelines for School Architecture
Huang, Terry T-K; Sorensen, Dina; Davis, Steven; Frerichs, Leah; Brittin, Jeri; Celentano, Joseph; Callahan, Kelly
2013-01-01
We developed a new tool, Healthy Eating Design Guidelines for School Architecture, to provide practitioners in architecture and public health with a practical set of spatially organized and theory-based strategies for making school environments more conducive to learning about and practicing healthy eating by optimizing physical resources and learning spaces. The design guidelines, developed through multidisciplinary collaboration, cover 10 domains of the school food environment (eg, cafeteria, kitchen, garden) and 5 core healthy eating design principles. A school redesign project in Dillwyn, Virginia, used the tool to improve the schools’ ability to adopt a healthy nutrition curriculum and promote healthy eating. The new tool, now in a pilot version, is expected to evolve as its components are tested and evaluated through public health and design research. PMID:23449281
NASA Technical Reports Server (NTRS)
Barnes, Jeffrey M.
2011-01-01
All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.
ANTS: Applying A New Paradigm for Lunar and Planetary Exploration
NASA Technical Reports Server (NTRS)
Clark, P. E.; Curtis, S. A.; Rilee, M. L.
2002-01-01
ANTS (Autonomous Nano- Technology Swarm), a mission architecture consisting of a large (1000 member) swarm of picoclass (1 kg) totally autonomous spacecraft with both adaptable and evolvable heuristic systems, is being developed as a NASA advanced mission concept, and is here examined as a paradigm for lunar surface exploration. As the capacity and complexity of hardware and software, demands for bandwidth, and the sophistication of goals for lunar and planetary exploration have increased, greater cost constraints have led to fewer resources and thus, the need to operate spacecraft with less frequent human contact. At present, autonomous operation of spacecraft systems allows great capability of spacecraft to 'safe' themselves and survive when conditions threaten spacecraft safety. To further develop spacecraft capability, NASA is at the forefront of development of new mission architectures which involve the use of Intelligent Software Agents (ISAs), performing experiments in space and on the ground to advance deliberative and collaborative autonomous control techniques. Selected missions in current planning stages require small groups of spacecraft weighing tens, instead of hundreds, of kilograms to cooperate at a tactical level to select and schedule measurements to be made by appropriate instruments onboard. Such missions will be characterizing rapidly unfolding real-time events on a routine basis. The next level of development, which we are considering here, is in the use of autonomous systems at the strategic level, to explore the remote terranes, potentially involving large surveys or detailed reconnaissance.
Inter-computer communication architecture for a mixed redundancy distributed system
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Adams, Stuart J.
1987-01-01
The triply redundant intercomputer network for the Advanced Information Processing System (AIPS), an architecture developed to serve as the core avionics system for a broad range of aerospace vehicles, is discussed. The AIPS intercomputer network provides a high-speed, Byzantine-fault-resilient communication service between processing sites, even in the presence of arbitrary failures of simplex and duplex processing sites on the IC network. The IC network contention poll has evolved from the Laning Poll. An analysis of the failure modes and effects, together with a simulation of the AIPS contention poll, demonstrates the robustness of the system.
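The masking effect of triple redundancy (not the AIPS Byzantine-resilient protocol itself, which is considerably more involved) can be sketched as a simple majority vote across redundant channels:

```python
from collections import Counter

def majority_vote(channels):
    """Return the value agreed by a strict majority of redundant channels."""
    value, count = Counter(channels).most_common(1)[0]
    if count <= len(channels) // 2:
        raise RuntimeError("no majority: too many faulty channels")
    return value

print(majority_vote([42, 42, 7]))  # one faulty channel is outvoted → 42
```

Three channels tolerate one arbitrary fault this way; tolerating a Byzantine fault in the presence of asymmetric (two-faced) failures, as AIPS must, requires interactive-consistency exchanges beyond a single vote.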
Network simulations of optical illusions
NASA Astrophysics Data System (ADS)
Shinbrot, Troy; Lazo, Miguel Vivar; Siu, Theo
We examine a dynamical network model of visual processing that reproduces several aspects of a well-known optical illusion, including subtle dependencies on curvature and scale. The model uses a genetic algorithm to construct the percept of an image, and we show that this percept evolves dynamically so as to produce the illusions reported. We find that the perceived illusions are hardwired into the model architecture and we propose that this approach may serve as an archetype to distinguish behaviors that are due to nature (i.e. a fixed network architecture) from those subject to nurture (that can be plastically altered through learning).
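The percept-construction genetic algorithm is not reproduced here; as a hedged stand-in, a minimal (1+1) evolutionary loop evolves a bit string toward a fixed target "percept", illustrating the basic mechanism of mutate-and-select:

```python
import random

random.seed(1)
TARGET = [1, 0, 1, 1, 0, 1, 0, 0]  # invented stand-in for a percept

def fitness(genome):
    """Number of positions matching the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

# (1+1) evolutionary loop: mutate each bit with small probability,
# keep the child only if it is no worse than the parent.
genome = [random.randint(0, 1) for _ in TARGET]
for _ in range(500):
    child = [1 - g if random.random() < 0.125 else g for g in genome]
    if fitness(child) >= fitness(genome):
        genome = child

print(fitness(genome))  # fitness never decreases; typically ends at the maximum
```

The model in the abstract evolves network percepts rather than bit strings, but the same dynamic applies: the final percept is shaped jointly by the selection loop and by what the fixed architecture can express, which is the paper's nature-versus-nurture distinction.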
Using JWST Heritage to Enable a Future Large Ultra-Violet Optical Infrared Telescope
NASA Technical Reports Server (NTRS)
Feinberg, Lee
2016-01-01
To the extent it makes sense, leverage JWST knowledge, designs, architectures, GSE. Develop a scalable design reference mission (9.2 meter). Do just enough work to understand launch break points in aperture size. Demonstrate 10 pm stability is achievable on a design reference mission. Make design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so you can evolve the instruments. Keep it room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Andrea Beth
2004-07-01
This is a case study of the NuMAC nuclear accountability system developed at a private fuel fabrication facility. This paper investigates nuclear material accountability and safeguards by researching expert knowledge applied in the system design and development. Presented is a system developed to detect and deter the theft of weapons grade nuclear material. Examined is the system architecture that includes: issues for the design and development of the system; stakeholder issues; how the system was built and evolved; software design, database design, and development tool considerations; security and computing ethics. (author)
Evolving Our Evaluation of Luminous Environments
NASA Technical Reports Server (NTRS)
Clark, Toni
2016-01-01
The advance in solid state light emitting technologies and optics for lighting and visual communication necessitates the evaluation of how NASA envisions spacecraft lighting architectures and how NASA uses industry standards for the design and evaluation of lighting systems. Current NASA lighting standards and requirements for existing architectures focus on the separate ability of a lighting system to throw light against a surface or the ability of a display system to provide the appropriate visual contrast. This project investigated large luminous surface lamps as an alternative or supplement to overhead lighting. The efficiency of the technology was evaluated for uniformity and power consumption.
Utopian Kinetic Structures and Their Impact on the Contemporary Architecture
NASA Astrophysics Data System (ADS)
Cudzik, Jan; Nyka, Lucyna
2017-10-01
This paper examines the relationships between twentieth-century utopian concepts of movable structures and the kinematic solutions implemented in contemporary architectural projects, with the aim of determining the impact of early architectural conceptions on today's solutions. It points out close links between the imagination of artists and architects working in the 1960s and 1970s and the solutions implemented by contemporary architects. The research method is based on comparative analyses of architectural forms with adopted kinematic solutions, drawing on studies of archive drawings and the examination of theoretical concepts. The research covers the different forms of mobility that evolved in the 1960s and 1970s. Many of them, usually based on simple forms of movement, were realized; the more complicated ones remained in the sphere of utopian visionary architecture, as such projects often exceeded the technical limitations and the capabilities of design tools of their time. Finally, after some decades, with the development of innovative architectural design tools and new building technologies, many early visions materialized into architectural forms. In conclusion, this research indicates that modern kinematic design solutions are often based on conceptual designs formed in the second half of the twentieth century.
Material Properties of the Posterior Human Sclera
Grytz, Rafael; Fazio, Massimo A.; Girard, Michael J.A.; Libertiaux, Vincent; Bruno, Luigi; Gardiner, Stuart; Girkin, Christopher A.; Downs, J. Crawford
2013-01-01
To characterize the material properties of posterior and peripapillary sclera from human donors, and to investigate the macro- and micro-scale strains as potential control mechanisms governing mechanical homeostasis. Posterior scleral shells from 9 human donors aged 57–90 years were subjected to intraocular pressure (IOP) elevations from 5 to 45 mmHg and the resulting full-field displacements were recorded using laser speckle interferometry. Eye-specific finite element models were generated based on experimentally measured scleral shell surface geometry and thickness. Inverse numerical analyses were performed to identify material parameters for each eye by matching experimental deformation measurements to model predictions using a microstructure-based constitutive formulation that incorporates the crimp response and anisotropic architecture of scleral collagen fibrils. The material property fitting produced models that fit both the overall and local deformation responses of posterior scleral shells very well. The nonlinear stiffening of the sclera with increasing IOP was well reproduced by the uncrimping of scleral collagen fibrils, and a circumferentially-aligned ring of collagen fibrils around the scleral canal was predicted in all eyes. Macroscopic in-plane strains were significantly higher in the peripapillary region than in the mid-periphery. In contrast, the meso- and micro-scale strains at the collagen network and collagen fibril level were not significantly different between regions. The elastic response of the posterior human sclera can be characterized by the anisotropic architecture and crimp response of scleral collagen fibrils. The similar collagen fibril strains in the peripapillary and mid-peripheral regions support the notion that the scleral collagen architecture including the circumpapillary ring of collagen fibrils evolved to establish optimal load bearing conditions at the collagen fibril level. PMID:23684352
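The inverse identification step can be illustrated with a minimal sketch. The one-parameter pressure-displacement relation below is a hypothetical stand-in for the study's finite element predictions (the real work fits a microstructure-based constitutive law), but it shows the same pattern: synthesize or measure displacements, then search for the material parameter that minimizes the misfit.

```python
# Minimal illustration of inverse material-parameter identification.
# The toy relation u = IOP / k is an assumption of this sketch, not the
# paper's constitutive model.

def predict(k, iops):
    """Toy 'model': displacement inversely proportional to stiffness k."""
    return [p / k for p in iops]

def sse(k, iops, u_obs):
    """Sum of squared errors between model and observed displacements."""
    return sum((um - uo) ** 2 for um, uo in zip(predict(k, iops), u_obs))

def fit_stiffness(iops, u_obs, lo=1.0, hi=100.0, iters=60):
    """Golden-section search for the stiffness minimizing the misfit."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if sse(c, iops, u_obs) < sse(d, iops, u_obs):
            b = d
        else:
            a = c
    return (a + b) / 2

iops = [5, 15, 25, 35, 45]       # pressure steps, mmHg
k_true = 12.5                    # hypothetical 'true' stiffness
u_obs = predict(k_true, iops)    # synthetic 'measurements'
k_fit = fit_stiffness(iops, u_obs)
```

With noise-free synthetic data the search recovers the generating parameter; the eye-specific fits in the study face the same optimization problem with far richer models and measured displacements.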
Argento, G; de Jonge, N; Söntjens, S H M; Oomens, C W J; Bouten, C V C; Baaijens, F P T
2015-06-01
The anisotropic collagen architecture of an engineered cardiovascular tissue has a major impact on its in vivo mechanical performance. This evolving collagen architecture is determined by initial scaffold microstructure and mechanical loading. Here, we developed and validated a theoretical and computational microscale model to quantitatively understand the interplay between scaffold architecture and mechanical loading on collagen synthesis and degradation. Using input from experimental studies, we hypothesize that both the microstructure of the scaffold and the loading conditions influence collagen turnover. The evaluation of the mechanical and topological properties of in vitro engineered constructs reveals that the formation of extracellular matrix layers on top of the scaffold surface influences the mechanical anisotropy of the construct. Results show that the microscale model can successfully capture the collagen arrangement between the fibers of an electrospun scaffold under static and cyclic loading conditions. Contact guidance by the scaffold, and not applied load, dominates the collagen architecture. Therefore, when the collagen grows inside the pores of the scaffold, pronounced scaffold anisotropy guarantees the development of a construct that mimics the mechanical anisotropy of the native cardiovascular tissue.
The Psychosemantics of Free Riding: Dissecting the Architecture of a Moral Concept
Delton, Andrew W.; Cosmides, Leda; Guemo, Marvin; Robertson, Theresa E.; Tooby, John
2012-01-01
For collective action to evolve and be maintained by selection, the mind must be equipped with mechanisms designed to identify free riders—individuals who do not contribute to a collective project but still benefit from it. Once identified, free riders must be either punished or excluded from future collective actions. But what criteria does the mind use to categorize someone as a free rider? An evolutionary analysis suggests that failure to contribute is not sufficient. Failure to contribute can occur by intention or accident, but the adaptive threat is posed by those who are motivated to benefit themselves at the expense of cooperators. In 6 experiments, we show that only individuals with exploitive intentions were categorized as free riders, even when holding their actual level of contribution constant (Studies 1 and 2). In contrast to an evolutionary model, rational choice and reinforcement theory suggest that different contribution levels (leading to different payoffs for their cooperative partners) should be key. When intentions were held constant, however, differences in contribution level were not used to categorize individuals as free riders, although some categorization occurred along a competence dimension (Study 3). Free rider categorization was not due to general tendencies to categorize (Study 4) or to mechanisms that track a broader class of intentional moral violations (Studies 5A and 5B). The results reveal the operation of an evolved concept with features tailored for solving the collective action problems faced by ancestral hunter-gatherers. PMID:22268815
A portable approach for PIC on emerging architectures
NASA Astrophysics Data System (ADS)
Decyk, Viktor
2016-03-01
A portable approach for designing Particle-in-Cell (PIC) algorithms on emerging exascale computers is based on the recognition that 3 distinct programming paradigms are needed. They are: low level vector (SIMD) processing, middle level shared memory parallel programming, and high level distributed memory programming. In addition, there is a memory hierarchy associated with each level. Such algorithms can be initially developed using vectorizing compilers, OpenMP, and MPI. This is the approach recommended by Intel for the Phi processor. These algorithms can then be translated and possibly specialized to other programming models and languages, as needed. For example, the vector processing and shared memory programming might be done with CUDA instead of vectorizing compilers and OpenMP, but generally the algorithm itself is not greatly changed. The UCLA PICKSC web site at http://www.idre.ucla.edu/ contains example open source skeleton codes (mini-apps) illustrating each of these three programming models, individually and in combination. Fortran2003 now supports abstract data types, and design patterns can be used to support a variety of implementations within the same code base. Fortran2003 also supports interoperability with C so that implementations in C languages are also easy to use. Finally, main codes can be translated into dynamic environments such as Python, while still taking advantage of high-performing compiled languages. Parallel languages are still evolving with interesting developments in Coarray Fortran, UPC, and OpenACC, among others, and these can also be supported within the same software architecture. Work supported by NSF and DOE Grants.
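The core PIC loop the abstract refers to can be sketched in a few lines: deposit particle charge onto a grid, derive a field, and push the particles. The serial Python below is an illustration only; all numerical choices (grid size, nearest-grid-point deposition, the finite-difference stand-in for a Poisson solve) are assumptions of this sketch. In a production code it is exactly these loops that get vectorized (SIMD), tiled across shared-memory threads, and domain-decomposed over MPI ranks.

```python
import random

# Minimal 1D electrostatic PIC sketch with periodic boundaries.
NX, L, NP, DT, QM = 32, 32.0, 256, 0.1, -1.0
dx = L / NX

random.seed(1)
x = [random.uniform(0, L) for _ in range(NP)]   # particle positions
v = [random.gauss(0, 1) for _ in range(NP)]     # particle velocities

def deposit(x):
    """Nearest-grid-point charge deposition."""
    rho = [0.0] * NX
    for xi in x:
        rho[int(xi / dx) % NX] += 1.0 / NP
    return rho

def field(rho):
    """Toy E-field from a centered difference of the charge density
    (a real code would solve Poisson's equation instead)."""
    return [(rho[(i - 1) % NX] - rho[(i + 1) % NX]) / (2 * dx)
            for i in range(NX)]

def push(x, v, e):
    """Velocity and position update, periodic in x."""
    for i in range(NP):
        v[i] += QM * e[int(x[i] / dx) % NX] * DT
        x[i] = (x[i] + v[i] * DT) % L

for _ in range(10):
    push(x, v, field(deposit(x)))
```

Each of the three functions maps naturally to a different level of the hierarchy: `push` is the vectorizable inner loop, `deposit` is where shared-memory tiling avoids write conflicts, and the grid itself is what gets partitioned across distributed-memory nodes.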
PyEvolve: a toolkit for statistical modelling of molecular evolution.
Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A
2004-01-05
Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences, ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpGs, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field.
The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-cpu hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.
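To make the substitution-model machinery concrete without reproducing PyEvolve's own API, the sketch below implements the simplest such model, Jukes-Cantor (JC69), whose maximum-likelihood distance has a closed form: d = -(3/4) ln(1 - 4p/3) for an observed proportion p of differing sites. The example sequences are invented for illustration.

```python
import math

# Jukes-Cantor (JC69) substitution model: the probability that a site
# differs after evolutionary distance d, and the inverse (ML distance
# from an observed proportion of differing sites). This is a generic
# illustration of likelihood-based distance estimation, not the
# PyEvolve API itself.

def jc69_p_diff(d):
    """Probability a site differs between two sequences at distance d."""
    return 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))

def jc69_distance(seq_a, seq_b):
    """ML distance estimate d = -(3/4) ln(1 - 4p/3)."""
    assert len(seq_a) == len(seq_b)
    p = sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# Hypothetical aligned sequences; 2 of 14 sites differ.
d = jc69_distance("GATTACAGATTACA", "GATTCCAGTTTACA")
```

Richer models (the dinucleotide CpG model in the abstract, codon models) have no closed-form inverse, which is why toolkits like PyEvolve optimize the likelihood numerically over many parameters.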
The growing need for microservices in bioinformatics.
Williams, Christopher L; Sica, Jeffrey C; Killen, Robert T; Balis, Ulysses G J
2016-01-01
Within the information technology (IT) industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise's overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Bioinformatics relies on a nimble IT framework that can adapt to changing requirements. To present a well-established software design and deployment strategy as a solution for current challenges within bioinformatics.
Use of the microservices framework is an effective methodology for the fabrication and implementation of reliable and innovative software, made possible in a highly collaborative setting.
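The microservice idea can be shown in miniature with nothing but the standard library: one process with one narrow responsibility, exposed over HTTP so other pipeline stages can call it without sharing code or state. The GC-content endpoint and its path format are hypothetical choices for this sketch, not from the paper.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A microservice in miniature: a single-purpose service (here, a
# hypothetical GC-content calculation) with one narrow HTTP endpoint.

class GCHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Assumed path format for this sketch: /gc/<sequence>
        seq = self.path.rsplit("/", 1)[-1].upper()
        gc = sum(b in "GC" for b in seq) / max(len(seq), 1)
        body = json.dumps({"sequence": seq, "gc": gc}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), GCHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A pipeline stage acting as the client of the service.
url = f"http://127.0.0.1:{server.server_port}/gc/GATTACA"
reply = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
```

The point of the architecture is that this service can be rewritten, redeployed, or scaled independently of every other stage, as long as the small HTTP contract is preserved.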
Panmictic and Clonal Evolution on a Single Patchy Resource Produces Polymorphic Foraging Guilds
Getz, Wayne M.; Salter, Richard; Lyons, Andrew J.; Sippl-Swezey, Nicolas
2015-01-01
We develop a stochastic, agent-based model to study how genetic traits and experiential changes in the state of agents and available resources influence individuals’ foraging and movement behaviors. These behaviors are manifest as decisions on when to stay and exploit a current resource patch or move to a particular neighboring patch, based on information of the resource qualities of the patches and the anticipated level of intraspecific competition within patches. We use a genetic algorithm approach and an individual’s biomass as a fitness surrogate to explore the foraging strategy diversity of evolving guilds under clonal versus hermaphroditic sexual reproduction. We first present the resource exploitation processes, movement on cellular arrays, and genetic algorithm components of the model. We then discuss their implementation on the Nova software platform. This platform seamlessly combines the dynamical systems modeling of consumer-resource interactions with agent-based modeling of individuals moving over a landscape, using an architecture that makes transparent the following four hierarchical simulation levels: (1) within-patch consumer-resource dynamics; (2) within-generation movement and competition mitigation processes; (3) across-generation evolutionary processes; and (4) multiple runs to generate the statistics needed for comparative analyses. The focus of our analysis is on the question of how the biomass production efficiency and the diversity of guilds of foraging strategy types, exploiting resources over a patchy landscape, evolve under clonal versus random hermaphroditic sexual reproduction. Our results indicate greater biomass production efficiency under clonal reproduction only at higher population densities, and demonstrate that polymorphisms evolve and are maintained under random mating systems.
The latter result questions the notion that some type of associative mating structure is needed to maintain genetic polymorphisms among individuals exploiting a common patchy resource on an otherwise spatially homogeneous landscape. PMID:26274613
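The clonal arm of such a study reduces, at its core, to a genetic algorithm with selection and mutation but no recombination. The sketch below is a deliberately stripped-down illustration: genotypes are patch-preference vectors, the fitness function is a toy surrogate for foraged biomass, and the patch qualities are invented numbers, none of which come from the paper.

```python
import random

# Minimal clonal genetic algorithm: keep the best half each generation
# and fill the population with mutated clones of the survivors.

random.seed(7)
RESOURCES = [0.1, 0.9, 0.4, 0.6]   # hypothetical patch qualities
POP, GENS, SIGMA = 40, 60, 0.05

def fitness(genotype):
    """Higher when patch preferences track resource quality (a toy
    surrogate for biomass accumulated while foraging)."""
    return -sum((g - r) ** 2 for g, r in zip(genotype, RESOURCES))

def mutate(genotype):
    """Clonal offspring: small Gaussian perturbation, clipped to [0,1]."""
    return [min(1.0, max(0.0, g + random.gauss(0, SIGMA)))
            for g in genotype]

pop = [[random.random() for _ in RESOURCES] for _ in range(POP)]
initial_best = max(fitness(g) for g in pop)

for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]                     # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in parents]

final_best = max(fitness(g) for g in pop)
```

The sexual-reproduction variant in the paper would replace `mutate(random.choice(parents))` with recombination between two parents, which is exactly the change whose population-level consequences the study measures.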
Najafpour, Mohammad Mahdi
2011-01-01
The oxygen evolving complex in photosystem II, which induces the oxidation of water to dioxygen in plants, algae and certain bacteria, contains a cluster of one calcium and four manganese ions. It serves as a model to split water by sunlight. Reports on the mechanism and structure of photosystem II provide a more detailed architecture of the oxygen evolving complex and the surrounding amino acids. One challenge in this field is the development of artificial model compounds to study the oxygen evolution reaction outside the complicated environment of the enzyme. Calcium-manganese oxides as structural and functional models for the active site of photosystem II are explained and reviewed in this paper. Because of the related structures of these calcium-manganese oxides and the catalytic centers of the active site of the oxygen evolving complex of photosystem II, the study may help in understanding the mechanism of oxygen evolution by the oxygen evolving complex of photosystem II. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Simon, Matthew A.; Toups, Larry; Howe, A. Scott; Wald, Samuel I.
2015-01-01
The Evolvable Mars Campaign (EMC) is the current NASA Mars mission planning effort which seeks to establish sustainable, realistic strategies to enable crewed Mars missions in the mid-2030s timeframe. The primary outcome of the Evolvable Mars Campaign is not to produce "The Plan" for sending humans to Mars, but instead its intent is to inform the Human Exploration and Operations Mission Directorate near-term key decisions and investment priorities to prepare for those types of missions. The FY'15 EMC effort focused upon analysis of integrated mission architectures to identify technically appealing transportation strategies, logistics build-up strategies, and vehicle designs for reaching and exploring Mars moons and Mars surface. As part of the development of this campaign, long duration habitats are required which are capable of supporting crew with limited resupply and crew abort during the Mars transit, Mars moons, and Mars surface segments of EMC missions. In particular, the EMC design team sought to design a single, affordable habitation system whose manufactured units could be outfitted uniquely for each of these missions and reused for multiple crewed missions. This habitat system must provide all of the functionality to safely support 4 crew for long durations while meeting mass and volume constraints for each of the mission segments set by the chosen transportation architecture and propulsion technologies. This paper describes several proposed long-duration habitation strategies to enable the Evolvable Mars Campaign through improvements in mass, cost, and reusability, and presents results of analysis to compare the options and identify promising solutions. 
The concepts investigated include several monolithic concepts: monolithic clean sheet designs, and concepts which leverage the co-manifested payload capability of NASA's Space Launch System (SLS) to deliver habitable elements within the Universal Payload Adaptor between the SLS upper stage and the Orion/Service module on the top of the vehicle. Multiple modular habitat options for Mars surface and in-space missions are also considered with various functionality and volume splits between modules to find the best balance of reducing the single largest mass which must be delivered to a destination and reducing the number of separate elements which must be launched. Analysis results presented for each of these concepts in this paper include mass/volume/power sizing using parametric sizing tools, identification of unique operational constraints, and limited comments on the additional impacts of reusability/dormancy on system design. Finally, recommendations will be made for promising solutions which will be carried forward for consideration in the Evolvable Mars Campaign work.
A Robust Scalable Transportation System Concept
NASA Technical Reports Server (NTRS)
Hahn, Andrew; DeLaurentis, Daniel
2006-01-01
This report documents the 2005 Revolutionary System Concept for Aeronautics (RSCA) study entitled "A Robust, Scalable Transportation System Concept". The objective of the study was to generate, at a high-level of abstraction, characteristics of a new concept for the National Airspace System, or the new NAS, under which transportation goals such as increased throughput, delay reduction, and improved robustness could be realized. Since such an objective can be overwhelmingly complex if pursued at the lowest levels of detail, instead a System-of-Systems (SoS) approach was adopted to model alternative air transportation architectures at a high level. The SoS approach allows the consideration of not only the technical aspects of the NAS, but also incorporates policy, socio-economic, and alternative transportation system considerations into one architecture. While the representations of the individual systems are basic, the higher level approach allows for ways to optimize the SoS at the network level, determining the best topology (i.e. configuration of nodes and links). The final product (concept) is a set of rules of behavior and network structure that not only satisfies national transportation goals, but represents the high impact rules that accomplish those goals by getting the agents to "do the right thing" naturally. The novel combination of Agent Based Modeling and Network Theory provides the core analysis methodology in the System-of-Systems approach. Our method of approach is non-deterministic, which means, fundamentally, it asks and answers different questions than deterministic models. The nondeterministic method is necessary primarily due to our marriage of human systems with technological ones in a partially unknown set of future worlds. Our goal is to understand and simulate how the SoS, with its human and technological components combined, evolves.
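Determining the best topology at the network level comes down to scoring candidate node-link configurations against a metric. As a minimal illustration (the metric and the toy topologies are assumptions of this sketch, standing in for the study's richer throughput and robustness measures), the code below scores two candidate six-node networks by average shortest-path length computed with breadth-first search.

```python
from collections import deque

# Score candidate network topologies by average shortest-path length
# (BFS hop count over all ordered node pairs); lower is better.

def avg_path_length(adj):
    """Mean BFS distance over all ordered node pairs of a connected,
    undirected graph given as an adjacency dict."""
    n = len(adj)
    total = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
    return total / (n * (n - 1))

n = 6
# Candidate 1: ring (each node linked to its two neighbors).
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
# Candidate 2: hub-and-spoke (node 0 linked to all others).
star = {0: list(range(1, n))}
star.update({i: [0] for i in range(1, n)})
```

On this metric the hub-and-spoke wins (average 1.67 hops vs 1.8 for the ring), though a real SoS analysis would trade that against robustness: the hub is a single point of failure, which is exactly the kind of tension the agent-based simulation explores.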
Overview of the LINCS architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.; Watson, R.W.
1982-01-13
Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years into a computer network based resource sharing environment. The increasing use of low cost and high performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost effective, reliable, and human engineered applications. We believe the answer lies in developing a layered, communication oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.
Agents Control in Intelligent Learning Systems: The Case of Reactive Characteristics
ERIC Educational Resources Information Center
Laureano-Cruces, Ana Lilia; Ramirez-Rodriguez, Javier; de Arriaga, Fernando; Escarela-Perez, Rafael
2006-01-01
Intelligent learning systems (ILSs) have evolved in the last few years basically because of influences received from multi-agent architectures (MAs). Conflict resolution among agents has been a very important problem for multi-agent systems, with specific features in the case of ILSs. The literature shows that ILSs with cognitive or pedagogical…
Enabling Future Robotic Missions with Multicore Processors
NASA Technical Reports Server (NTRS)
Powell, Wesley A.; Johnson, Michael A.; Wilmot, Jonathan; Some, Raphael; Gostelow, Kim P.; Reeves, Glenn; Doyle, Richard J.
2011-01-01
Recent commercial developments in multicore processors (e.g. Tilera, Clearspeed, HyperX) have provided an option for high performance embedded computing that rivals the performance attainable with FPGA-based reconfigurable computing architectures. Furthermore, these processors offer more straightforward and streamlined application development by allowing the use of conventional programming languages and software tools in lieu of hardware design languages such as VHDL and Verilog. With these advantages, multicore processors can significantly enhance the capabilities of future robotic space missions. This paper will discuss these benefits, along with onboard processing applications where multicore processing can offer advantages over existing or competing approaches. This paper will also discuss the key architectural features of current commercial multicore processors. In comparison to the current state of the art, the features and advancements necessary for spaceflight multicore processors will be identified. These include power reduction, radiation hardening, inherent fault tolerance, and support for common spacecraft bus interfaces. Lastly, this paper will explore how multicore processors might evolve with advances in electronics technology and how avionics architectures might evolve once multicore processors are inserted into NASA robotic spacecraft.
φ-evo: A program to evolve phenotypic models of biological networks.
Henry, Adrien; Hemery, Mathieu; François, Paul
2018-06-01
Molecular networks are at the core of most cellular decisions, but are often difficult to comprehend. Reverse engineering of network architecture from their functions has proved fruitful to classify and predict the structure and function of molecular networks, suggesting new experimental tests and biological predictions. We present φ-evo, an open-source program to evolve in silico phenotypic networks performing a given biological function. We include implementations for evolution of biochemical adaptation, adaptive sorting for immune recognition, metazoan development (somitogenesis, hox patterning), as well as Pareto evolution. We detail the program architecture based on C, Python 3, and a Jupyter interface for project configuration and network analysis. We illustrate the predictive power of φ-evo by first recovering the asymmetrical structure of the lac operon regulation from an objective function with symmetrical constraints. Second, we use the problem of hox-like embryonic patterning to show how a single effective fitness can emerge from multi-objective (Pareto) evolution. φ-evo provides an efficient approach and user-friendly interface for the phenotypic prediction of networks and the numerical study of evolution itself.
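The Pareto-evolution step mentioned in the abstract has a simple core: a candidate network survives selection unless some other candidate is at least as good on every objective and strictly better on at least one. The sketch below implements that non-dominated filter; the objective pairs are hypothetical numbers (lower is better on both axes), not outputs of φ-evo.

```python
# Pareto (non-dominated) selection for multi-objective evolution.
# Objective vectors are tuples; lower is better on every axis.

def dominates(a, b):
    """True if a is no worse than b everywhere and better somewhere."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(scores):
    """Candidates not dominated by any other candidate."""
    return [s for s in scores
            if not any(dominates(o, s) for o in scores if o != s)]

# Hypothetical (fitness error, network size) pairs for candidate networks.
candidates = [(0.1, 9), (0.3, 4), (0.2, 6), (0.4, 5), (0.1, 12)]
front = pareto_front(candidates)
```

Keeping the whole front, rather than collapsing the objectives into one weighted score up front, is what lets a single effective fitness emerge from the evolutionary dynamics, as the hox-patterning example in the paper illustrates.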
Phylogeny of metabolic networks: a spectral graph theoretical approach.
Deyasi, Krishanu; Banerjee, Anirban; Deb, Bony
2015-10-01
Many methods have been developed for finding the commonalities between different organisms in order to study their phylogeny. The structure of metabolic networks also reveals valuable insights into metabolic capacity of species as well as into the habitats where they have evolved. We constructed metabolic networks of 79 fully sequenced organisms and compared their architectures. We used spectral density of normalized Laplacian matrix for comparing the structure of networks. The eigenvalues of this matrix reflect not only the global architecture of a network but also the local topologies that are produced by different graph evolutionary processes like motif duplication or joining. A divergence measure on spectral densities is used to quantify the distances between various metabolic networks, and a split network is constructed to analyse the phylogeny from these distances. In our analysis, we focused on the species that belong to different classes, but appear more related to each other in the phylogeny. We tried to explore whether they have evolved under similar environmental conditions or have similar life histories. With this focus, we have obtained interesting insights into the phylogenetic commonality between different organisms.
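The object being compared across species is the normalized Laplacian L = I - D^(-1/2) A D^(-1/2), whose eigenvalue spectrum encodes both global architecture and local topology. The pure-Python sketch below constructs it for a small undirected graph given as an edge list (the paper's metabolic networks are of course far larger, and the divergence between smoothed spectral densities is a further step not shown here).

```python
# Normalized Laplacian L = I - D^(-1/2) A D^(-1/2) for an undirected
# graph given as an edge list on nodes 0..n-1.

def normalized_laplacian(n, edges):
    adj = [[0.0] * n for _ in range(n)]
    for u, v in edges:
        adj[u][v] = adj[v][u] = 1.0
    deg = [sum(row) for row in adj]
    # Entry (i, j): 1 on the diagonal, -A_ij / sqrt(d_i d_j) off it;
    # isolated nodes contribute zero off-diagonal terms.
    return [[(1.0 if i == j and deg[i] else 0.0) -
             (adj[i][j] / (deg[i] * deg[j]) ** 0.5
              if deg[i] and deg[j] else 0.0)
             for j in range(n)] for i in range(n)]

# 4-node path graph 0-1-2-3.
L = normalized_laplacian(4, [(0, 1), (1, 2), (2, 3)])
trace = sum(L[i][i] for i in range(4))
```

Two standard sanity checks apply: the matrix is symmetric, and for a graph with no isolated nodes its trace (the sum of its eigenvalues) equals the number of nodes, which is why spectral densities of different-sized networks can be compared on a common footing.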
Mask data processing in the era of multibeam writers
NASA Astrophysics Data System (ADS)
Abboud, Frank E.; Asturias, Michael; Chandramouli, Maesh; Tezuka, Yoshihiro
2014-10-01
Mask writers' architectures have evolved through the years in response to ever tightening requirements for better resolution, tighter feature placement, improved CD control, and tolerable write time. The unprecedented extension of optical lithography and the myriad of Resolution Enhancement Techniques have tasked current mask writers with ever increasing shot count and higher dose, and therefore, increasing write time. Once again, we see the need for a transition to a new type of mask writer based on massively parallel architecture. These platforms offer a step function improvement in both dose and the ability to process massive amounts of data. The higher dose and almost unlimited appetite for edge corrections open new windows of opportunity to further push the envelope. These architectures are also naturally capable of producing curvilinear shapes, making the need to approximate a curve with multiple Manhattan shapes unnecessary.
Positioning navigation and timing service applications in cyber physical systems
NASA Astrophysics Data System (ADS)
Qu, Yi; Wu, Xiaojing; Zeng, Lingchuan
2017-10-01
The positioning, navigation, and timing (PNT) architecture was discussed in detail: its history, evolution, current status, and future plans were presented; the main technologies were listed; the advantages and limitations of most technologies were compared; novel approaches were introduced; and future capabilities were sketched. The concept of the cyber-physical system (CPS) was described and its primary features were interpreted. Then the three-layer architecture of CPS was illustrated. Next, CPS requirements on PNT services were analyzed, including requirements on position reference and time reference, temporal-spatial error monitoring, dynamic services, real-time services, autonomous services, security services, and standard services. Finally, the challenges faced by PNT applications in CPS were summarized. The conclusions are expected to facilitate PNT applications in CPS, and furthermore to provide references for the design and implementation of both architectures.
Reducing Development and Operations Costs using NASA's "GMSEC" Systems Architecture
NASA Technical Reports Server (NTRS)
Smith, Dan; Bristow, John; Crouse, Patrick
2007-01-01
This viewgraph presentation reviews the role of the Goddard Mission Services Evolution Center (GMSEC) in reducing development and operations costs in handling the massive data from NASA missions. The goals of GMSEC systems architecture development are to (1) simplify integration and development, (2) facilitate technology infusion over time, (3) support evolving operational concepts, and (4) allow for a mix of heritage, COTS, and new components. The first three missions (i.e., the Tropical Rainfall Measuring Mission (TRMM), Small Explorer (SMEX) missions - SWAS, TRACE, SAMPEX, and the ST5 3-Satellite Constellation System) each selected a different telemetry and command system. These results show that GMSEC's message-bus, component-based framework architecture is well proven and provides significant benefits over traditional flight and ground data system designs. The missions benefit through an increased set of product options, enhanced automation, lower cost, and new mission-enabling operations concept options.
A modeling process to understand complex system architectures
NASA Astrophysics Data System (ADS)
Robinson, Santiago Balestrini
2009-12-01
In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, often characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first of the main hypotheses, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) that the capability can be defined as a cycle of functions, and (2) that it must be possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations.
Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the ABM), DiMA is able to identify which of any two architectures is better more than 98% of the time. The second objective led to Hypothesis B, the second of the main hypotheses. This hypothesis stated that by studying the functional relations, the most critical entities composing the architecture could be identified. The critical entities are those that, when their behavior varies slightly, cause the behavior of the overall architecture to vary greatly. These are the entities that must be modeled more carefully and where modeling effort should be expended. This hypothesis was tested by simplifying agent-based models to the non-trivial minimum and executing a large number of different simulations in order to obtain statistically significant results. The tests were conducted by evolving the complex model without any error induced, and then evolving the model once again for each ranking, assigning error to any of the nodes with a probability inversely proportional to the ranking. The results from this hypothesis test indicate that, depending on the structural characteristics of the functional relations, it is useful to use one of the two intelligent rankings tested, or it is best to expend effort equally amongst all the entities. Random ranking always performed worse than uniform ranking, indicating that if modeling effort is to be prioritized amongst the entities composing the large-scale system architecture, it should be prioritized intelligently. The benefit threshold between intelligent prioritization and no prioritization lies on the large-scale system's chaotic boundary. If the large-scale system behaves chaotically, small variations in any of the entities tend to have a great impact on the behavior of the entire system.
Therefore, even low ranking entities can still affect the behavior of the model greatly, and error should not be concentrated in any one entity. It was discovered that the threshold can be identified from studying the structure of the networks, in particular the cyclicity, the Off-diagonal Complexity, and the Digraph Algebraic Connectivity. (Abstract shortened by UMI.)
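The random-functional-network test behind Hypothesis A can be sketched as a Monte Carlo estimate (illustrative only; the entity classes, edge-probability matrices, and sample size below are invented, not taken from the DiMA implementation):

```python
import numpy as np

def cycle_closes(p_edge, cycle, rng):
    """Sample one random functional network: each hand-off edge along the
    capability cycle exists independently with the given probability."""
    hops = list(zip(cycle, cycle[1:] + cycle[:1]))
    return all(rng.random() < p_edge[a][b] for a, b in hops)

def compare_architectures(p_a, p_b, cycle, n=20_000, seed=0):
    """Estimate, for each architecture, the fraction of sampled random
    functional networks in which the capability cycle is satisfied."""
    rng = np.random.default_rng(seed)
    sat_a = sum(cycle_closes(p_a, cycle, rng) for _ in range(n)) / n
    rng = np.random.default_rng(seed)
    sat_b = sum(cycle_closes(p_b, cycle, rng) for _ in range(n)) / n
    return sat_a, sat_b

# Hypothetical example: entity classes 0=sensor, 1=decision node, 2=effector;
# the capability is the cycle sense -> decide -> act -> sense.
p_a = np.array([[0, .9, 0], [0, 0, .9], [.9, 0, 0]])
p_b = np.array([[0, .6, 0], [0, 0, .6], [.6, 0, 0]])
sat_a, sat_b = compare_architectures(p_a, p_b, cycle=[0, 1, 2])
```

Ranking two candidate architectures by these satisfaction rates avoids running a full constructive simulation for the comparison.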
Cost/Effort Drivers and Decision Analysis
NASA Technical Reports Server (NTRS)
Seidel, Jonathan
2010-01-01
Engineering trade study analyses demand consideration of performance, cost, and schedule impacts across the spectrum of alternative concepts and in direct reference to product requirements. Prior to detailed design, requirements are too often ill-defined (only goals) and prone to creep, extending well beyond the Systems Requirements Review. Though the lack of engineering design and definitive requirements inhibits the ability to perform detailed cost analyses, affordability trades still comprise the foundation of these future product decisions and must evolve in concert. This presentation excerpts results of the recent NASA subsonic Engine Concept Study for an Advanced Single Aisle Transport to demonstrate an affordability evaluation of performance characteristics and the subsequent impacts on engine architecture decisions. Applying the Process Based Economic Analysis Tool (PBEAT), development cost, production cost, and operation and support costs were considered in a traditional weighted ranking of the following system-level figures of merit: mission fuel burn, take-off noise, NOx emissions, and cruise speed. Weighting factors were varied to ascertain the architecture ranking sensitivities to these performance figures of merit with companion cost considerations. A more detailed examination of supersonic variable cycle engine cost is also briefly presented, with observations and recommendations for further refinements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.
Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E
2016-09-03
The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.
PMID:27598208
Bieri, Jonas; Kawecki, Tadeusz J
2003-02-01
We investigated the genetic architecture underlying differentiation in fitness-related traits between two pairs of populations of the seed beetle Callosobruchus maculatus (Coleoptera: Bruchidae). These populations had geographically distant (> 2000 km) origins but evolved in a uniform laboratory environment for 120 generations. For each pair of populations (Nigeria x Yemen and Cameroon x Uganda) we estimated the means of five fitness-related characters and a measure of fitness (net reproductive rate R0) in each of the parental populations and 12 types of hybrids (two F1 and two F2 lines and eight backcrosses). Models containing up to nine composite genetic parameters were fitted to the means of the 14 lines. The patterns of line means for all traits in the Nigeria x Yemen cross and for four traits (larval survival, developmental rate, female body weight, and fecundity) in the Cameroon x Uganda cross were best explained by models including additive, dominance, and maternal effects, but excluding epistasis. We did not find any evidence for outbreeding depression for any trait. An epistatic component of divergence was detected for egg-hatching success and R0 in the Cameroon x Uganda cross, but its sign was opposite to that expected under outbreeding depression, that is, additive x additive epistasis had a positive effect on the performance of F2 hybrids. All traits except fecundity showed a pattern of heterosis. A large difference in egg-hatching success between the two reciprocal F1 lines in that cross was best explained as fertilization incompatibility between Cameroon females and sperm carrying Uganda genes. The results suggest that these populations have not converged to the same life-history phenotype and genetic architecture, despite 120 generations of uniform natural selection. However, the absence of outbreeding depression implies that they did not evolve toward different adaptive peaks.
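The line-cross fitting described above can be illustrated with a minimal joint-scaling sketch (additive and dominance effects only, using the textbook Mather-Jinks coefficients; the paper's full models also included maternal and epistatic parameters, and the line means below are hypothetical):

```python
import numpy as np

# Coefficients of the composite genetic effects [m, a, d] expected for each
# line type under the standard additive-dominance parameterization:
# e.g. E[P1] = m + a, E[F1] = m + d, E[F2] = m + d/2.
DESIGN = {
    "P1": [1.0,  1.0, 0.0],
    "P2": [1.0, -1.0, 0.0],
    "F1": [1.0,  0.0, 1.0],
    "F2": [1.0,  0.0, 0.5],
    "B1": [1.0,  0.5, 0.5],
    "B2": [1.0, -0.5, 0.5],
}

def fit_line_cross(means):
    """Ordinary least-squares estimate of the mean (m), additive (a), and
    dominance (d) composite effects from observed line means."""
    X = np.array([DESIGN[k] for k in means])
    y = np.array([means[k] for k in means])
    params, *_ = np.linalg.lstsq(X, y, rcond=None)
    return dict(zip(["m", "a", "d"], params))

# Hypothetical line means generated from m=10, a=2, d=1.
fit = fit_line_cross(
    {"P1": 12.0, "P2": 8.0, "F1": 11.0, "F2": 10.5, "B1": 11.5, "B2": 9.5}
)
```

Adding maternal or epistatic columns to the design matrix, and weighting by the sampling variances of the line means, extends this to the nine-parameter models the abstract mentions.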
Integrating Existing Simulation Components into a Cohesive Simulation System
NASA Technical Reports Server (NTRS)
McLaughlin, Brian J.; Barrett, Larry K.
2012-01-01
A tradition of leveraging the re-use of components to help manage costs has evolved in the development of complex systems. This tradition continues in the Joint Polar Satellite System (JPSS) Program with the cloning of the Suomi National Polar-orbiting Partnership (NPP) satellite for the JPSS-1 mission, including the instrument complement. One benefit of re-use on a mission is the availability of existing simulation assets from the systems that were previously built. An issue arises in the continual shift of technology over a long mission, or multi-mission, lifecycle. As the missions mature, the requirements for the observatory simulations evolve. The challenge in this environment becomes re-using the existing components in that ever-changing landscape. To meet this challenge, the system must: establish an operational architecture that minimizes impacts on the implementation of individual components, consolidate the satisfaction of new high-impact requirements into system-level infrastructure, and build in a long-term view of system adaptation that spans the full lifecycle of the simulation system. The Flight Vehicle Test Suite (FVTS) within the JPSS Program is defining and executing this approach to ensure a robust simulation capability for the JPSS multi-mission environment.
Evolution of the Power Processing Units Architecture for Electric Propulsion at CRISA
NASA Astrophysics Data System (ADS)
Palencia, J.; de la Cruz, F.; Wallace, N.
2008-09-01
Since 2002, the team formed by EADS Astrium CRISA, Astrium GmbH Friedrichshafen, and QinetiQ has participated in several flight programs where Electric Propulsion based on Kaufman-type Ion Thrusters is the baseline concept. In 2002, CRISA won the contract for the development of the Ion Propulsion Control Unit (IPCU) for GOCE. This unit, together with the T5 thruster by QinetiQ, provides near-perfect atmospheric drag compensation, offering thrust levels in the range of 1 to 20 mN. By the end of 2003, CRISA started the adaptation of the IPCU concept to the QinetiQ T6 Ion Thruster for the Alphabus program. This paper shows how the Power Processing Unit design evolved over time, including the current developments.
TetrUSS Capabilities for S and C Applications
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Parikh, Paresh
2004-01-01
TetrUSS is a suite of loosely coupled computational fluid dynamics software that is packaged into a complete flow analysis system. The system components consist of tools for geometry setup, grid generation, flow solution, visualization, and various utilities. Development began in 1990, and it has evolved into a proven and stable system for Euler and Navier-Stokes analysis and design of unconventional configurations. It is 1) well developed and validated, 2) has a broad base of support, and 3) is presently a workhorse code because of the level of confidence that has been established through wide use. The entire system can now run on Linux or Mac architectures. In the following slides, I will highlight more of the features of the VGRID and USM3D codes.
Interior design for passive solar homes
NASA Astrophysics Data System (ADS)
Breen, J. C.
1981-07-01
The increasing emphasis on refinement of passive solar systems has brought recognition to interior design as an integral part of passive solar architecture. Interior design can be used as a fine-tuning tool, minimizing many of the problems associated with passive solar energy use in residential buildings. In addition, treatment of interior space in solar model homes may be a prime factor in determining sales success. A new style of interior design is evolving in response to changes in buildings that incorporate passive solar design features. The psychology behind passive solar architecture is reflected in interiors, and selection of interior components increasingly depends on the functional suitability of various interior elements.
Environmentally stable seed source for high power ultrafast laser
NASA Astrophysics Data System (ADS)
Samartsev, Igor; Bordenyuk, Andrey; Gapontsev, Valentin
2017-02-01
We present an environmentally stable Yb ultrafast ring oscillator utilizing a new method of passive mode-locking. The laser uses an all-fiber architecture, which makes it insensitive to environmental factors such as temperature, humidity, vibrations, and shocks. The new mode-locking method uses crossed bandpass transmittance filters in a ring architecture to discriminate against CW lasing. A broadband pulse evolves from cavity noise under amplification, with each pass through a filter causing strong spectral broadening. The laser is self-starting. It generates transform-limited, spectrally flat pulses of 1 - 50 nm width at a 6 - 15 MHz repetition rate and pulse energies of 0.2 - 15 nJ at 1010 - 1080 nm CWL.
Medicaid information technology architecture: an overview.
Friedman, Richard H
2006-01-01
The Medicaid Information Technology Architecture (MITA) is a roadmap and tool-kit for States to transform their Medicaid Management Information System (MMIS) into an enterprise-wide, beneficiary-centric system. MITA will enable State Medicaid agencies to align their information technology (IT) opportunities with their evolving business needs. It also addresses long-standing issues of interoperability, adaptability, and data sharing, including clinical data, across organizational boundaries by creating models based on nationally accepted technical standards. Perhaps most significantly, MITA allows State Medicaid Programs to actively participate in the DHHS Secretary's vision of a transparent health care market that utilizes electronic health records (EHRs), ePrescribing and personal health records (PHRs).
Space Telecommunications Radio System (STRS) Architecture Goals/Objectives and Level 1 Requirements
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Johnson, Sandra K.; VanDerAar, Lisa
2007-01-01
The Space Telecommunications Radio System (STRS) Architecture Requirements Document provides the basis for the development of an open architecture for NASA Software Defined Radios (SDRs) for space use. The main objective of this document is to evaluate the goals, objectives, and high-level (Level 1) requirements that have bearing on the design of the architecture. The goals and objectives provide broad, fundamental direction and purpose. The high-level (Level 1) requirements are intended to guide the broader and longer-term aspects of the SDR architecture and provide guidance for the development of Level 2 requirements.
Kitano, Hiroaki
2004-11-01
Robustness is a ubiquitously observed property of biological systems. It is considered to be a fundamental feature of complex evolvable systems. It is attained by several underlying principles that are universal to both biological organisms and sophisticated engineering systems. Robustness facilitates evolvability and robust traits are often selected by evolution. Such a mutually beneficial process is made possible by specific architectural features observed in robust systems. But there are trade-offs between robustness, fragility, performance and resource demands, which explain system behaviour, including the patterns of failure. Insights into inherent properties of robust systems will provide us with a better understanding of complex diseases and a guiding principle for therapy design.
Solar-terrestrial data access distribution and archiving
NASA Technical Reports Server (NTRS)
1984-01-01
It is recommended that a central data catalog and data access network (CDC/DAN) for solar-terrestrial research be established, initially as a NASA pilot program. The system is envisioned to be flexible and to evolve as funds permit, starting as a catalog and growing into an access network for high-resolution data. The report describes the various functional requirements for the CDC/DAN, but does not specify the hardware and software architectures, as these are constantly evolving. The importance of a steering committee, working with the CDC/DAN organization, to provide scientific guidelines for the data catalog and for data storage, access, and distribution is also stressed.
Information Quality Evaluation of C2 Systems at Architecture Level
2014-06-01
Capability evaluation of C2 systems at the architecture level is necessary and important for improving system capability at the stage of architecture design. This paper proposes a method for information quality evaluation of C2 systems at the architecture level, based on architecture models of C2 systems, which can help to identify the key factors impacting information quality and improve system capability at the stage of architecture design of C2 systems. First, the information quality model is ...
Mission Systems Open Architecture Science and Technology (MOAST) program
NASA Astrophysics Data System (ADS)
Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.
2017-04-01
The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-Resiliency; (3) Emerging Concepts and Technologies; (4) Risk Reduction Studies and Experimentation; and (5) Advanced Technology Demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications. Of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program also aims to develop and demonstrate cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and ensure continued mission operations. The focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission System and UCI standards. AFRL is also developing code-generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.
ATLAST and JWST Segmented Telescope Design Considerations
NASA Technical Reports Server (NTRS)
Feinberg, Lee
2016-01-01
To the extent it makes sense, leverage JWST (James Webb Space Telescope) knowledge, designs, and architectures. GSE (Ground Support Equipment) is a good starting point. Develop a full end-to-end architecture that closes. Try to avoid recreating the wheel except where needed. Optimize from there (mainly for stability and coronagraphy). Develop a scalable design reference mission (9.2 meters). Do just enough work to understand launch break points in aperture size. Demonstrate that 10 pm (picometer) stability is achievable on a design reference mission. A really key design driver is the most robust stability possible! Make the design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so the instruments can evolve. Keep it at room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.
The NASA/OAST telerobot testbed architecture
NASA Technical Reports Server (NTRS)
Matijevic, J. R.; Zimmerman, W. F.; Dolinsky, S.
1989-01-01
Through a phased development such as a laboratory-based research testbed, the NASA/OAST Telerobot Testbed provides an environment for system test and demonstration of the technology which will usefully complement, significantly enhance, or even replace manned space activities. By integrating advanced sensing, robotic manipulation and intelligent control under human-interactive supervision, the Testbed will ultimately demonstrate execution of a variety of generic tasks suggestive of space assembly, maintenance, repair, and telescience. The Testbed system features a hierarchical layered control structure compatible with the incorporation of evolving technologies as they become available. The Testbed system is physically implemented in a computing architecture which allows for ease of integration of these technologies while preserving the flexibility for test of a variety of man-machine modes. The development currently in progress on the functional and implementation architectures of the NASA/OAST Testbed and capabilities planned for the coming years are presented.
Space/ground systems as cooperating agents
NASA Technical Reports Server (NTRS)
Grant, T. J.
1994-01-01
Within NASA and the European Space Agency (ESA) it is agreed that autonomy is an important goal for the design of future spacecraft and that this requires on-board artificial intelligence. NASA emphasizes deep space and planetary rover missions, while ESA considers on-board autonomy as an enabling technology for missions that must cope with imperfect communications. ESA's attention is on the space/ground system. A major issue is the optimal distribution of intelligent functions within the space/ground system. This paper describes the multi-agent architecture for space/ground systems (MAASGS) which would enable this issue to be investigated. A MAASGS agent may model a complete spacecraft, a spacecraft subsystem or payload, a ground segment, a spacecraft control system, a human operator, or an environment. The MAASGS architecture has evolved through a series of prototypes. The paper recommends that the MAASGS architecture should be implemented in the operational Dutch Utilization Center.
Description of the SSF PMAD DC testbed control system data acquisition function
NASA Technical Reports Server (NTRS)
Baez, Anastacio N.; Mackin, Michael; Wright, Theodore
1992-01-01
The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced scale representation of the end to end, sources to loads, Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for the Space Station Freedom. A key capability of the testbed is its ability to be configured to address system level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground based systems is evolving. Initially, ground based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF Ground based Control Center Operation. The lower level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. 
Data requirements are dictated by the control system algorithms being implemented at each level. A functional description of the various levels of the testbed control system architecture, the data acquisition function, and the status of its implementation is presented.
Urošević, Vladimir; Mitić, Marko
2014-01-01
Successful service integration in policy and practice requires technology innovation and service process innovation to be pursued and implemented at the same time. The SmartCare project (partially EC-funded under the CIP ICT PSP Program) aims to achieve this through the development, piloting and evaluation of ICT-based services that horizontally integrate health and social care in ten pilot regions, including the Kraljevo region in Serbia. In the joint consolidation phase, the project identified and adopted two generic highest-level common thematic pathways: integrated support for long-term care and integrated support after hospital discharge. A common set of standard functional specifications for an open ICT platform enabling the delivery of integrated care is being defined around the challenges of data sharing, coordination and communication in these two formalized pathways. Implementation and system integration at the technology and architecture levels are to be based on open standards and multivendor interoperability, leveraging the evolving open-specification technology foundations developed in relevant projects across the European Research Area.
The kinetics of pre-mRNA splicing in the Drosophila genome and the influence of gene architecture.
Pai, Athma A; Henriques, Telmo; McCue, Kayla; Burkholder, Adam; Adelman, Karen; Burge, Christopher B
2017-12-27
Production of most eukaryotic mRNAs requires splicing of introns from pre-mRNA. The splicing reaction requires definition of splice sites, which are initially recognized in either intron-spanning ('intron definition') or exon-spanning ('exon definition') pairs. To understand how exon and intron length and splice site recognition mode impact splicing, we measured splicing rates genome-wide in Drosophila, using metabolic labeling/RNA sequencing and new mathematical models to estimate rates. We found that the modal intron length range of 60-70 nt represents a local maximum of splicing rates, but that much longer exon-defined introns are spliced even faster and more accurately. We observed unexpectedly low variation in splicing rates across introns in the same gene, suggesting the presence of gene-level influences, and we identified multiple gene-level variables associated with splicing rate. Together, our data suggest that developmental and stress response genes may have preferentially evolved exon definition in order to enhance the rate or accuracy of splicing.
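The rate-estimation idea behind this record can be sketched with a toy first-order model (our illustration with invented numbers and noise, not the authors' actual estimation framework): if splicing is a single-exponential process, the unspliced fraction of newly made transcripts decays as U(t) = exp(-k t), and k can be recovered from labeling time points by a log-linear fit.

```python
import numpy as np

# Toy sketch with invented numbers -- not the authors' estimation framework.
# Under first-order kinetics the unspliced fraction of new transcripts decays
# as U(t) = exp(-k t); recover k by a log-linear least-squares fit.
rng = np.random.default_rng(0)
k_true = 0.7                                   # splicing rate, per minute (hypothetical)
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # metabolic labeling times (min)
U = np.exp(-k_true * t) * rng.normal(1.0, 0.02, t.size)   # noisy unspliced fractions
k_hat = -np.polyfit(t, np.log(U), 1)[0]        # negative slope of log U vs t
half_life = np.log(2) / k_hat                  # splicing half-life (min)
```

The fit is deliberately minimal; the paper's models additionally account for transcription elongation and read-count statistics, which a plain log-linear fit ignores.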
NASA Technical Reports Server (NTRS)
Chien, E. S. K.; Marinho, J. A.; Russell, J. E., Sr.
1988-01-01
The Cellular Access Digital Network (CADN) is the access vehicle through which cellular technology is brought into the mainstream of the evolving integrated telecommunications network. Beyond the integrated end-to-end digital access and per call network services provisioning of the Integrated Services Digital Network (ISDN), the CADN engenders the added capability of mobility freedom via wireless access. One key element of the CADN network architecture is the standard user to network interface that is independent of RF transmission technology. Since the Mobile Satellite System (MSS) is envisioned to not only complement but also enhance the capabilities of the terrestrial cellular telecommunications network, compatibility and interoperability between terrestrial cellular and mobile satellite systems are vitally important to provide an integrated moving telecommunications network of the future. From a network standpoint, there exist very strong commonalities between the terrestrial cellular system and the mobile satellite system. Therefore, the MSS architecture should be designed as an integral part of the CADN. This paper describes the concept of the CADN, the functional architecture of the MSS, and the user-network interface signaling protocols.
Chen, Da; Zheng, Xiaoyu
2018-06-14
Nature has evolved a recurring strategy to achieve unusual mechanical properties by coupling variable elastic moduli, from a few GPa down to below a kPa, within a single tissue. The ability to produce multi-material, three-dimensional (3D) micro-architectures with high fidelity incorporating dissimilar components has been a major challenge in man-made materials. Here we show multi-modulus metamaterials whose architectural element is comprised of encoded elasticity ranging from rigid to soft. We found that, in contrast to ordinary architected materials whose negative Poisson's ratio is dictated by their geometry, this type of metamaterial can display Poisson's ratios from extreme negative to zero, independent of its 3D micro-architecture. The resulting low-density metamaterials are capable of achieving functionally graded, distributed strain amplification within uniform micro-architectures. Simultaneous tuning of Poisson's ratio and moduli within the 3D multi-materials could open up a broad array of material-by-design applications, ranging from flexible armor and artificial muscles to actuators and bio-mimetic materials.
Connecting Architecture and Implementation
NASA Astrophysics Data System (ADS)
Buchgeher, Georg; Weinreich, Rainer
Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.
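The code-centric extraction that this record contrasts with ADLs can be illustrated with a minimal sketch (the module names and the layering rule below are hypothetical): recover a low-level architectural view, here module dependencies, directly from source and check it against a declared constraint.

```python
import ast

# Hypothetical module names and layering rule -- a minimal sketch of extracting
# a low-level architecture view (module dependencies) straight from code.
def imported_modules(source):
    """Return top-level modules imported by a Python source string."""
    deps = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            deps.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            deps.add(node.module.split(".")[0])
    return deps

allowed = {"ui": {"core"}}                 # declared rule: 'ui' may use only 'core'
ui_source = "import core\nimport db\n"     # toy source for a module in the 'ui' layer
violations = imported_modules(ui_source) - allowed["ui"]   # -> {'db'}
```

This works for abstractions that map onto language constructs (modules, classes), which is exactly the limitation the paper notes: higher-level architectural abstractions have no such direct code representation.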
NASA Astrophysics Data System (ADS)
Solomon, D.; van Dijk, A.
The "2002 ESA Lunar Architecture Workshop" (June 3-16, ESTEC, Noordwijk, NL and V2_Lab, Rotterdam, NL) is the first-of-its-kind workshop for exploring the design of extra-terrestrial (infra)structures for human exploration of the Moon and Earth-like planets, introducing architecture's current line of research and adopting architectural criteria. The workshop intends to inspire, engage and challenge 30-40 European masters students from the fields of aerospace engineering, civil engineering, architecture, and art to design, validate and build models of (infra)structures for Lunar exploration. The workshop also aims to open up new physical and conceptual terrain for an architectural agenda within the field of space exploration. A sound introduction to the issues, conditions, resources, technologies, and architectural strategies will initiate the workshop participants into the context of lunar architecture scenarios. In my paper and presentation about the development of the ideology behind this workshop, I will comment on the following questions: * Can the contemporary architectural agenda offer solutions that affect the scope of space exploration? It certainly has had an impression on urbanization and colonization of previously sparsely populated parts of Earth. * Does the current line of research in architecture offer any useful strategies for combining scientific interests, commercial opportunity, and public space? What can be learned from 'state of the art' architecture that blends commercial and public programmes within one location? * Should commercial 'colonisation' projects in space be required to provide public space in a location where all humans present are likely to be there in a commercial context? Is the wave in Koolhaas' new Prada flagship store just a gesture to public space, or does this new concept in architecture and shopping evolve the public space?
* What can we learn about designing (infra)structures on the Moon or any other space context that will be useful on Earth on a conceptual and practical level? * In what ways could architecture's field of reference offer building on the Moon (and other celestial bodies) a paradigm shift? In addition to their models and designs, workshop participants will begin authoring a design recommendation for the building of (infra)structures and habitats on celestial bodies, in particular the Moon and Mars. The design recommendation, a substantiated aesthetic code of conduct (not legally binding), will address long-term planning and incorporate issues of sustainability, durability, bio-diversity, infrastructure, change, and techniques that lend themselves to Earth-bound applications. It will also address the cultural implications architectural design might have within the context of space exploration. The design recommendation will ultimately be presented for peer review to both the space and architecture communities. What would the endorsement from the architectural community of such a document mean to the space community? The Lunar Architecture Workshop is conceptualised, produced and organised by (in alphabetical order): Alexander van Dijk, Art Race in Space; Barbara Imhof, ESCAPE*spHERE, Vienna University of Technology, Institute for Design and Building Construction, Vienna; Bernard Foing, ESA SMART1 Project Scientist; Susmita Mohanty, MoonFront, LLC; Hans Schartner, Vienna University of Technology, Institute for Design and Building Construction; Debra Solomon, Art Race in Space, Dutch Art Institute; Paul van Susante, Lunar Explorers Society. Workshop locations: ESTEC, Noordwijk, NL and V2_Lab, Rotterdam, NL. Workshop dates: June 3-16, 2002 (a Call for Participation will be made in March-April 2002).
A mathematical framework for modelling cambial surface evolution using a level set method
Sellier, Damien; Plank, Michael J.; Harrington, Jonathan J.
2011-01-01
Background and Aims During their lifetime, tree stems take a series of successive nested shapes. Individual tree growth models traditionally focus on apical growth and architecture. However, cambial growth, which is distributed over a surface layer wrapping the whole organism, equally contributes to plant form and function. This study aims at providing a framework to simulate how organism shape evolves as a result of a secondary growth process that occurs at the cellular scale. Methods The development of the vascular cambium is modelled as an expanding surface using the level set method. The surface consists of multiple compartments following distinct expansion rules. Growth behaviour can be formulated as a mathematical function of surface state variables and independent variables to describe biological processes. Key Results The model was coupled to an architectural model and to a forest stand model to simulate cambium dynamics and wood formation at the scale of the organism. The model is able to simulate competition between cambia, surface irregularities and local features. Predicting the shapes associated with arbitrarily complex growth functions does not add complexity to the numerical method itself. Conclusions Despite their slenderness, it is sometimes useful to conceive of trees as expanding surfaces. The proposed mathematical framework provides a way to integrate through time and space the biological and physical mechanisms underlying cambium activity. It can be used either to test growth hypotheses or to generate detailed maps of wood internal structure. PMID:21470972
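The level set idea in this record can be illustrated with a minimal numerical sketch (a generic upwind scheme on a toy circular front with invented parameters; the paper's cambium model tracks multiple surface compartments with biologically derived growth functions).

```python
import numpy as np

# Generic upwind level set sketch on a toy circular front (invented parameters;
# the paper's model tracks multiple cambial compartments with biological growth
# functions).  The front phi(x, y) = 0 expands with speed F >= 0 under
#     dphi/dt = -F * |grad phi|.
def evolve_front(phi, F, dx, dt, steps):
    for _ in range(steps):
        # Godunov upwind differences for outward motion (F >= 0)
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2
                       + np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
        phi = phi - dt * F * grad
    return phi

n = 101
x = np.linspace(-2.0, 2.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi0 = np.sqrt(X**2 + Y**2) - 1.0       # signed distance to a unit circle
phi1 = evolve_front(phi0, F=0.5, dx=x[1] - x[0], dt=0.01, steps=100)
# after t = 1.0 at speed 0.5 the zero level set sits near radius 1.5
```

As the abstract notes, replacing the uniform speed F with an arbitrary function of surface state variables changes nothing in the numerical machinery, which is the method's main appeal.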
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
Saranummi, Niilo
2005-01-01
The PICNIC architecture aims at supporting inter-enterprise integration and the facilitation of collaboration between healthcare organisations. The concept of a Regional Health Economy (RHE) is introduced to illustrate the varying nature of inter-enterprise collaboration between healthcare organisations collaborating in providing health services to citizens and patients in a regional setting. The PICNIC architecture comprises a number of PICNIC IT Services and the interfaces between them, and presents a way to assemble these into a functioning Regional Health Care Network meeting the needs and concerns of its stakeholders. The PICNIC architecture is presented through a number of views relevant to different stakeholder groups. The stakeholders of the first view are national and regional health authorities and policy makers. The view describes how the architecture enables the implementation of national and regional health policies, strategies and organisational structures. The stakeholders of the second view, the service viewpoint, are the care providers, health professionals, patients and citizens. The view describes how the architecture supports and enables regional care delivery and process management, including continuity of care (shared care) and citizen-centred health services. The stakeholders of the third view, the engineering view, are those who design, build and implement the RHCN. The view comprises four sub-views: software engineering, IT services engineering, security and data. The proposed architecture is grounded in the mainstream evolution of distributed computing environments and is realised using the web services approach. A number of well-established technology platforms and generic standards exist that can be used to implement the software components. The software components specified in PICNIC are implemented in Open Source.
Crew Integration & Automation Testbed and Robotic Follower Programs
2001-05-30
[Extraction residue from briefing charts; recoverable fragments only:] "Evolving Technologies for Reduced Crew Operation"; Vehicle Tech Demo #1 (VTT); Vehicle Tech Demo #2 (CAT ATD); Two Man Transition; Future Combat…; Simulation; Advanced Electronic Architecture; Concept Vehicle Shown with Onboard Safety Driver; Advanced Interfaces; CAT ATD Exit Criteria; Provide 1000 Hz control loop for critical real-time tasks; CAT Workload IPT Process and Product Schedule; Crew Task List; Task Timelines; Workload Analysis.
Towards Wearable Cognitive Assistance
2013-12-01
Keywords: mobile computing, cloud… It presents a multi-tiered mobile system architecture that offers tight end-to-end latency bounds on compute-intensive cognitive assistance… to an entire neighborhood or an entire city is extremely expensive and time-consuming. Physical infrastructure in public spaces tends to evolve very
2016-03-01
Infrastructure to Support Mobile Devices (Takai, 2012, p. 2). The objectives needed in order to meet this goal are to: evolve spectrum management, expand infrastructure to support wireless capabilities, and establish a mobile device security architecture (Takai, 2012, p. 2). By expanding infrastructure to… often used on Mobile Ad-Hoc Networks (MANETs). MANETs are infrastructure-less networks that include, but are not limited to, mobile devices. These
ERIC Educational Resources Information Center
Helps, Richard
2010-01-01
A major challenge for Information Technology (IT) programs is that the rapid pace of evolution of computing technology leads to frequent redesign of IT courses. The problem is exacerbated by several factors. Firstly, the changing technology is the subject matter of the discipline and is also frequently used to support instruction; secondly, this…
NASA Astrophysics Data System (ADS)
Hall, Justin R.; Hastrup, Rolf C.
The United States Space Exploration Initiative (SEI) calls for the charting of a new and evolving manned course to the Moon, Mars, and beyond. This paper discusses key challenges in providing effective deep space telecommunications, navigation, and information management (TNIM) architectures and designs for Mars exploration support. The fundamental objectives are to provide the mission with means to monitor and control mission elements, acquire engineering, science, and navigation data, compute state vectors and navigate, and move these data efficiently and automatically between mission nodes for timely analysis and decision-making. Although these objectives do not depart, fundamentally, from those evolved over the past 30 years in supporting deep space robotic exploration, there are several new issues. This paper focuses on summarizing new requirements, identifying related issues and challenges, responding with concepts and strategies which are enabling, and, finally, describing candidate architectures, and driving technologies. The design challenges include the attainment of: 1) manageable interfaces in a large distributed system, 2) highly unattended operations for in-situ Mars telecommunications and navigation functions, 3) robust connectivity for manned and robotic links, 4) information management for efficient and reliable interchange of data between mission nodes, and 5) an adequate Mars-Earth data rate.
NASA IVHM Technology Experiment for X-vehicles (NITEX)
NASA Technical Reports Server (NTRS)
Hayden, Sandra; Bajwa, Anupa
2001-01-01
The purpose of the NASA IVHM Technology Experiment for X-vehicles (NITEX) is to advance the development of selected IVHM technologies in a flight environment and to demonstrate the potential for reusable launch vehicle ground processing savings. The technologies to be developed and demonstrated include system-level and detailed diagnostics for real-time fault detection and isolation, prognostics for fault prediction, automated maintenance planning based on diagnostic and prognostic results, and a microelectronics hardware platform. Complete flight IVHM consists of advanced sensors, distributed data acquisition, data processing that includes model-based diagnostics, prognostics and vehicle autonomy for control or suggested action, and advanced data storage. Complete ground IVHM consists of evolved control room architectures and advanced applications including automated maintenance planning and automated ground support equipment. This experiment will advance the development of a subset of complete IVHM.
Mars Surface Tunnel Element Concept
NASA Technical Reports Server (NTRS)
Rucker, Michelle A.
2016-01-01
How crews get into or out of their ascent vehicle has profound implications for Mars surface architecture. Extravehicular Activity (EVA) hatches and airlocks have the benefit of relatively low mass and high Technology Readiness Level (TRL), but waste consumables with a volume depressurization for every ingress/egress. Perhaps the biggest drawback to EVA hatches or airlocks is that they make it difficult to keep Martian dust from being tracked back into the ascent vehicle, in violation of planetary protection protocols. Suit ports offer the promise of dust mitigation by keeping dusty suits outside the cabin, but require significant cabin real estate, are relatively high mass, and current operational concepts still require an EVA hatch to get the suits outside for the first EVA and back inside after the final EVA. This is primarily because current designs don't provide enough structural support to protect the suits from ascent/descent loads or potential thruster plume impingement. For architectures involving more than one surface element, such as an ascent vehicle and a rover or surface habitat, a retractable tunnel is an attractive option. By pushing spacesuit don/doff and EVA operations to an element that remains on the surface, ascent vehicle mass and dust can be minimized. What's more, retractable tunnels provide operational flexibility by allowing surface assets to be re-configured or built up over time. Retractable tunnel functional requirements and design concepts being developed as part of the National Aeronautics and Space Administration's (NASA) Evolvable Mars Campaign (EMC) work will add a new ingress/egress option to the surface architecture trade space.
NASA Astrophysics Data System (ADS)
Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.
2017-02-01
The increasing system complexity of FPGA-based hardware designs and the shortening of time-to-market have motivated the adoption of new design methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs that can serve the needs of safety-critical applications requiring both high performance and high reliability. However, a reliability evaluation of HLS-based designs under soft errors has not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system up to 5,000 times by changing its architecture, with a small impact on the cross section (an increase of up to 8 times), while still increasing the Mean Workload Between Failures (MWBF) of the system.
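The trade-off quoted at the end of this record can be made concrete with back-of-envelope arithmetic (the numbers below are invented placeholders, not the paper's measurements): MWBF scales as throughput over error rate, the error rate is cross section times particle flux, so a 5000x speedup that costs an 8x cross-section increase still multiplies MWBF by 5000/8 = 625.

```python
# Back-of-envelope arithmetic with invented placeholder numbers (not the
# paper's measurements).  MWBF ~ throughput / error rate, where the error
# rate is cross section times particle flux.
def mwbf(executions_per_s, cross_section_cm2, flux_per_cm2_s):
    failure_rate = cross_section_cm2 * flux_per_cm2_s   # failures per second
    return executions_per_s / failure_rate              # workloads per failure

base = mwbf(1.0e3, 1.0e-10, 10.0)    # hypothetical processor-based baseline
hls = mwbf(5.0e6, 8.0e-10, 10.0)     # 5000x faster, 8x larger cross section
gain = hls / base                    # 5000 / 8 = 625
```

The flux cancels in the ratio, which is why the relative MWBF gain depends only on the speedup and the cross-section penalty.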
A holistic view of nitrogen acquisition in plants.
Kraiser, Tatiana; Gras, Diana E; Gutiérrez, Alvaro G; González, Bernardo; Gutiérrez, Rodrigo A
2011-02-01
Nitrogen (N) is the mineral nutrient required in the greatest amount and its availability is a major factor limiting growth and development of plants. As sessile organisms, plants have evolved different strategies to adapt to changes in the availability and distribution of N in soils. These strategies include mechanisms that act at different levels of biological organization from the molecular to the ecosystem level. At the molecular level, plants can adjust their capacity to acquire different forms of N in a range of concentrations by modulating the expression and function of genes in different N uptake systems. Modulation of plant growth and development, most notably changes in the root system architecture, can also greatly impact plant N acquisition in the soil. At the organism and ecosystem levels, plants establish associations with diverse microorganisms to ensure adequate nutrition and N supply. These different adaptive mechanisms have been traditionally discussed separately in the literature. To understand plant N nutrition in the environment, an integrated view of all pathways contributing to plant N acquisition is required. Towards this goal, in this review the different mechanisms that plants utilize to maintain an adequate N supply are summarized and integrated.
A holistic view of nitrogen acquisition in plants
Kraiser, Tatiana; Gras, Diana E.; Gutiérrez, Alvaro G.; González, Bernardo; Gutiérrez, Rodrigo A.
2011-01-01
Nitrogen (N) is the mineral nutrient required in the greatest amount and its availability is a major factor limiting growth and development of plants. As sessile organisms, plants have evolved different strategies to adapt to changes in the availability and distribution of N in soils. These strategies include mechanisms that act at different levels of biological organization from the molecular to the ecosystem level. At the molecular level, plants can adjust their capacity to acquire different forms of N in a range of concentrations by modulating the expression and function of genes in different N uptake systems. Modulation of plant growth and development, most notably changes in the root system architecture, can also greatly impact plant N acquisition in the soil. At the organism and ecosystem levels, plants establish associations with diverse microorganisms to ensure adequate nutrition and N supply. These different adaptive mechanisms have been traditionally discussed separately in the literature. To understand plant N nutrition in the environment, an integrated view of all pathways contributing to plant N acquisition is required. Towards this goal, in this review the different mechanisms that plants utilize to maintain an adequate N supply are summarized and integrated. PMID:21239377
Blankers, T; Lübke, A K; Hennig, R M
2015-09-01
Studying the genetic architecture of sexual traits provides insight into the rate and direction at which traits can respond to selection. Traits associated with few loci and limited genetic and phenotypic constraints tend to evolve at high rates typically observed for secondary sexual characters. Here, we examined the genetic architecture of song traits and female song preferences in the field crickets Gryllus rubens and Gryllus texensis. Song and preference data were collected from both species and interspecific F1 and F2 hybrids. We first analysed phenotypic variation to examine interspecific differentiation and trait distributions in parental and hybrid generations. Then, the relative contribution of additive and additive-dominance variation was estimated. Finally, phenotypic variance-covariance (P) matrices were estimated to evaluate the multivariate phenotype available for selection. Song traits and preferences had unimodal trait distributions, and hybrid offspring were intermediate with respect to the parents. We uncovered additive and dominance variation in song traits and preferences. For two song traits, we found evidence for X-linked inheritance. On the one hand, the observed genetic architecture does not suggest rapid divergence, although sex linkage may have allowed for somewhat higher evolutionary rates. On the other hand, P matrices revealed that multivariate variation in song traits aligned with major dimensions in song preferences, suggesting a strong selection response. We also found strong covariance between the main traits that are sexually selected and traits that are not directly selected by females, providing an explanation for the striking multivariate divergence in male calling songs despite limited divergence in female preferences. © 2015 European Society For Evolutionary Biology.
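The P-matrix alignment test mentioned in this record can be sketched on simulated data (the shared latent axis below is invented; the study used measured song and preference traits): estimate each phenotypic covariance matrix and compare their leading eigenvectors.

```python
import numpy as np

# Simulated sketch with an invented shared axis (the study used measured song
# and preference traits): compare the leading eigenvectors of the two
# phenotypic covariance (P) matrices via |cos(angle)|.
rng = np.random.default_rng(1)
axis = np.array([0.8, 0.6])          # hypothetical shared latent direction
songs = rng.normal(size=(200, 1)) * axis + rng.normal(scale=0.1, size=(200, 2))
prefs = rng.normal(size=(200, 1)) * axis + rng.normal(scale=0.1, size=(200, 2))

def leading_axis(traits):
    vals, vecs = np.linalg.eigh(np.cov(traits.T))   # P matrix eigen-decomposition
    return vecs[:, np.argmax(vals)]                 # axis of greatest variance

alignment = abs(leading_axis(songs) @ leading_axis(prefs))  # near 1 when aligned
```

An alignment near 1 corresponds to the paper's finding that multivariate song variation lies along the major dimensions of preference variation, the condition for a strong selection response.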
Atkinson, Elizabeth G.; Rogers, Jeffrey; Mahaney, Michael C.; Cox, Laura A.; Cheverud, James M.
2015-01-01
Folding of the primate brain cortex allows for improved neural processing power by increasing cortical surface area for the allocation of neurons. The arrangement of folds (sulci) and ridges (gyri) across the cerebral cortex is thought to reflect the underlying neural network. Gyrification, an adaptive trait with a unique evolutionary history, is affected by genetic factors different from those affecting brain volume. Using a large pedigreed population of ∼1000 Papio baboons, we address critical questions about the genetic architecture of primate brain folding, the interplay between genetics, brain anatomy, development, patterns of cortical–cortical connectivity, and gyrification’s potential for future evolution. Through Mantel testing and cluster analyses, we find that the baboon cortex is quite evolvable, with high integration between the genotype and phenotype. We further find significantly similar partitioning of variation between cortical development, anatomy, and connectivity, supporting the predictions of tension-based models for sulcal development. We identify a significant, moderate degree of genetic control over variation in sulcal length, with gyrus-shape features being more susceptible to environmental effects. Finally, through QTL mapping, we identify novel chromosomal regions affecting variation in brain folding. The most significant QTL contain compelling candidate genes, including gene clusters associated with Williams and Down syndromes. The QTL distribution suggests a complex genetic architecture for gyrification with both polygeny and pleiotropy. Our results provide a solid preliminary characterization of the genetic basis of primate brain folding, a unique and biomedically relevant phenotype with significant implications in primate brain evolution. PMID:25873632
Sensor Open System Architecture (SOSA) evolution for collaborative standards development
NASA Astrophysics Data System (ADS)
Collier, Charles Patrick; Lipkin, Ilya; Davidson, Steven A.; Baldwin, Rusty; Orlovsky, Michael C.; Ibrahim, Tim
2017-04-01
The Sensor Open System Architecture (SOSA) is a C4ISR-focused technical and economic collaborative effort between the Air Force, Navy, Army, the Department of Defense (DoD), industry, and other governmental agencies to develop (and incorporate) a technical Open Systems Architecture standard in order to maximize C4ISR sub-system, system, and platform affordability, re-configurability, and hardware/software/firmware re-use. The SOSA effort will effectively create an operational and technical framework for the integration of disparate payloads into C4ISR systems, with a focus on the development of a modular decomposition (defining functions and behaviors) and associated key interfaces (physical and logical) for a common multi-purpose architecture for radar, EO/IR, SIGINT, EW, and communications. SOSA addresses hardware, software, and mechanical/electrical interfaces. The modular decomposition will produce a set of reusable components, interfaces, and sub-systems that engender reusable capabilities. This, in effect, creates a realistic and affordable ecosystem enabling mission effectiveness through systematic re-use of all available re-composed hardware, software, and electrical/mechanical base components and interfaces. To this end, SOSA will leverage existing standards as much as possible and evolve the SOSA architecture through modification, reuse, and enhancements to achieve C4ISR goals. This paper will present accomplishments over the first year of the SOSA initiative.
Architectural Strategies for Enabling Data-Driven Science at Scale
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.
2017-12-01
The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. The historical assumption across the NASA mission and science data lifecycle, that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from the stewardship of data measurements to the use of computational methodologies that derive insight from data, possibly fused with other data sets.
This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.
Dover, John A; Burmeister, Alita R; Molineux, Ian J; Parent, Kristin N
2016-09-19
Genomic architecture is the framework within which genes and regulatory elements evolve and where specific constructs may constrain or potentiate particular adaptations. One such construct is evident in phages that use a headful packaging strategy that results in progeny phage heads packaged with DNA until full rather than encapsidating a simple unit-length genome. Here, we investigate the evolution of the headful packaging phage Sf6 in response to barriers that impede efficient phage adsorption to the host cell. Ten replicate populations evolved faster Sf6 life cycles by parallel mutations found in a phage lysis gene and/or by large, 1.2- to 4.0-kb deletions that remove a mobile genetic IS911 element present in the ancestral phage genome. The fastest life cycles were found in phages that acquired both mutations. No mutations were found in genes encoding phage structural proteins, which were a priori expected from the experimental design that imposed a challenge for phage adsorption by using a Shigella flexneri host lacking receptors preferred by Sf6. We used DNA sequencing, molecular approaches, and physiological experiments on 82 clonal isolates taken from all 10 populations to reveal the genetic basis of the faster Sf6 life cycle. The majority of our isolates acquired deletions in the phage genome. Our results suggest that deletions are adaptive and can influence the duration of the phage life cycle while acting in conjunction with other lysis time-determining point mutations. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Evolving the Reuse Process at the Flight Dynamics Division (FDD) Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Condon, S.; Seaman, C.; Basili, Victor; Kraft, S.; Kontio, J.; Kim, Y.
1996-01-01
This paper presents the interim results from the Software Engineering Laboratory's (SEL) Reuse Study. The team conducting this study has, over the past few months, been studying the Generalized Support Software (GSS) domain asset library and architecture, and the various processes associated with it. In particular, we have characterized the process used to configure GSS-based attitude ground support systems (AGSS) to support satellite missions at NASA's Goddard Space Flight Center. To do this, we built detailed models of the tasks involved, the people who perform these tasks, and the interdependencies and information flows among these people. These models were based on information gleaned from numerous interviews with people involved in this process at various levels. We also analyzed effort data in order to determine the cost savings in moving from actual development of AGSSs to support each mission (which was necessary before GSS was available) to configuring AGSS software from the domain asset library. While characterizing the GSS process, we became aware of several interesting factors which affect the successful continued use of GSS. Many of these issues fall under the subject of evolving technologies, which were not available at the inception of GSS but are now. Some of these technologies could be incorporated into the GSS process, thus making the whole asset library more usable. Other technologies are being considered as an alternative to the GSS process altogether. In this paper, we outline some of the issues we will be considering in our continued study of GSS and the impact of evolving technologies.
Evolvable synthetic neural system
NASA Technical Reports Server (NTRS)
Curtis, Steven A. (Inventor)
2009-01-01
An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.
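The coupling the abstract describes, in which each neural basis function pairs a high-level heuristic system with a low-level autonomic one behind an evolvable interface, and systems compose into a hierarchy, can be sketched minimally as follows. All class and method names are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the hierarchy described in the abstract.
# Names and behaviors are hypothetical, not the patent's design.

class NeuralBasisFunction:
    """Couples a heuristic (high-level) and an autonomic (low-level) function."""

    def __init__(self, heuristic, autonomic):
        self.heuristic = heuristic    # high-level function, e.g. planning
        self.autonomic = autonomic    # low-level function, e.g. reflexes

    def respond(self, stimulus):
        # The interface routes the stimulus to both subsystems.
        return (self.heuristic(stimulus), self.autonomic(stimulus))

class EvolvableSyntheticNeuralSystem:
    """Holds neural basis functions and, optionally, child systems in a hierarchy."""

    def __init__(self, basis_functions, children=()):
        self.basis_functions = list(basis_functions)
        self.children = list(children)

    def respond(self, stimulus):
        out = [bf.respond(stimulus) for bf in self.basis_functions]
        for child in self.children:
            out.extend(child.respond(stimulus))
        return out

nbf = NeuralBasisFunction(lambda s: s * 2, lambda s: s + 1)
system = EvolvableSyntheticNeuralSystem([nbf])
print(system.respond(3))  # [(6, 4)]
```

The point of the sketch is only the structural claim of the abstract: high- and low-level functions sit behind one interface, and whole systems nest hierarchically.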
A systems approach to animal communication
Barron, Andrew B.; Balakrishnan, Christopher N.; Hauber, Mark E.; Hoke, Kim L.
2016-01-01
Why animal communication displays are so complex and how they have evolved are active foci of research with a long and rich history. Progress towards an evolutionary analysis of signal complexity, however, has been constrained by a lack of hypotheses to explain similarities and/or differences in signalling systems across taxa. To address this, we advocate incorporating a systems approach into studies of animal communication—an approach that includes comprehensive experimental designs and data collection in combination with the implementation of systems concepts and tools. A systems approach evaluates overall display architecture, including how components interact to alter function, and how function varies in different states of the system. We provide a brief overview of the current state of the field, including a focus on select studies that highlight the dynamic nature of animal signalling. We then introduce core concepts from systems biology (redundancy, degeneracy, pluripotentiality, and modularity) and discuss their relationships with system properties (e.g. robustness, flexibility, evolvability). We translate systems concepts into an animal communication framework and accentuate their utility through a case study. Finally, we demonstrate how consideration of the system-level organization of animal communication poses new practical research questions that will aid our understanding of how and why animal displays are so complex. PMID:26936240
A systems approach to animal communication.
Hebets, Eileen A; Barron, Andrew B; Balakrishnan, Christopher N; Hauber, Mark E; Mason, Paul H; Hoke, Kim L
2016-03-16
Why animal communication displays are so complex and how they have evolved are active foci of research with a long and rich history. Progress towards an evolutionary analysis of signal complexity, however, has been constrained by a lack of hypotheses to explain similarities and/or differences in signalling systems across taxa. To address this, we advocate incorporating a systems approach into studies of animal communication--an approach that includes comprehensive experimental designs and data collection in combination with the implementation of systems concepts and tools. A systems approach evaluates overall display architecture, including how components interact to alter function, and how function varies in different states of the system. We provide a brief overview of the current state of the field, including a focus on select studies that highlight the dynamic nature of animal signalling. We then introduce core concepts from systems biology (redundancy, degeneracy, pluripotentiality, and modularity) and discuss their relationships with system properties (e.g. robustness, flexibility, evolvability). We translate systems concepts into an animal communication framework and accentuate their utility through a case study. Finally, we demonstrate how consideration of the system-level organization of animal communication poses new practical research questions that will aid our understanding of how and why animal displays are so complex. © 2016 The Author(s).
A computer architecture for intelligent machines
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Saridis, G. N.
1992-01-01
The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.
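The information-theoretic measure of uncertainty mentioned above can be illustrated with Shannon entropy evaluated over a level's action probabilities; the probability values below are invented for illustration, not drawn from the paper.

```python
import math

# Shannon entropy as a measure of uncertainty at one level of the
# intelligent machine (illustrative values only).
def entropy(probs):
    """H(p) = -sum p_i * log2(p_i), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A level choosing evenly between two actions carries maximal uncertainty;
# a nearly deterministic execution level carries very little.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.99, 0.01]) < 0.1)
```

Under the principle of increasing precision with decreasing intelligence, lower (execution) levels would operate at low entropy while higher levels tolerate more uncertainty.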
Analytical Design of Evolvable Software for High-Assurance Computing
2001-02-14
Mathematical expression for the Total Sum of Squares, which measures the variability that results when all values are treated as a combined sample coming from... primarily interested in background on software design and high-assurance computing, research in software architecture generation or evaluation... respectively. Those readers solely interested in the validation of a software design approach should at the minimum read Chapter 6 followed by Chapter
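Assuming the Total Sum of Squares named in the snippet carries its usual statistical definition, the sum of squared deviations of all pooled values from their combined mean, it can be sketched as:

```python
# Sketch of the Total Sum of Squares under the standard definition:
# TSS = sum over all values of (value - pooled mean)^2.
def total_sum_of_squares(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

# Deviations from the mean of 4.0 are -2, 0, +2, so TSS = 4 + 0 + 4.
print(total_sum_of_squares([2.0, 4.0, 6.0]))  # 8.0
```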
Liquefaction and Storage of In-Situ Oxygen on the Surface of Mars
NASA Technical Reports Server (NTRS)
Hauser, Daniel M.; Johnson, Wesley L.; Sutherlin, Steven G.
2016-01-01
ISRU is currently base-lined for the production of oxygen on the Martian surface in the Evolvable Mars Campaign; over 50% of return vehicle mass is oxygen for propulsion. There are two key cryogenic fluid-thermal technologies that need to be investigated to enable these architectures: high-lift refrigeration systems, and thermal insulation systems, either lightweight vacuum jackets or soft vacuum insulation systems.
Research Notes - An Introduction to Openness and Evolvability Assessment
2016-08-01
importance of different business and technical characteristics that combine to achieve an open solution. The complexity of most large-scale systems of... (process characteristic) Granularity of the architecture (size of functional blocks), Modularity (cohesion and coupling), Support for multiple... Description) OV-3 (Operational Information Exchange Matrix), SV-1 (Systems Interface Description), TV-1 (Technical Standards Profile). Note that there
NASA's Advanced Exploration Systems Mars Transit Habitat Refinement Point of Departure Design
NASA Technical Reports Server (NTRS)
Simon, Matthew; Latorella, Kara; Martin, John; Cerro, Jeff; Lepsch, Roger; Jefferies, Sharon; Goodliff, Kandyce; McCleskey, Carey; Smitherman, David; Stromgren, Chel
2017-01-01
This paper describes the recently developed point of departure design for a long duration, reusable Mars Transit Habitat, which was established during a 2016 NASA habitat design refinement activity supporting the definition of NASA's Evolvable Mars Campaign. As part of its development of sustainable human Mars mission concepts achievable in the 2030s, the Evolvable Mars Campaign has identified desired durations and mass/dimensional limits for long duration Mars habitat designs to enable the currently assumed solar electric and chemical transportation architectures. The Advanced Exploration Systems Mars Transit Habitat Refinement Activity brought together habitat subsystem design expertise from across NASA to develop an increased fidelity, consensus design for a transit habitat within these constraints. The resulting design and data (including a mass equipment list) contained in this paper are intended to help teams across the agency and potential commercial, academic, or international partners understand: 1) the current architecture/habitat guidelines and assumptions, 2) performance targets of such a habitat (particularly in mass, volume, and power), 3) the driving technology/capability developments and architectural solutions which are necessary for achieving these targets, and 4) mass reduction opportunities and research/design needs to inform the development of future research and proposals. Data presented includes: an overview of the habitat refinement activity including motivation and process when informative; full documentation of the baseline design guidelines and assumptions; detailed mass and volume breakdowns; a moderately detailed concept of operations; a preliminary interior layout design with rationale; a list of the required capabilities necessary to enable the desired mass; and identification of any worthwhile trades/analyses which could inform future habitat design efforts. 
As a whole, the data in the paper show that a transit habitat meeting the 43-metric-ton launch mass/trans-Mars injection burn limit specified by the Evolvable Mars Campaign is achievable near the desired timeframe with moderate strategic investments, including maintainable life support systems, repurposable structures and packaging, and lightweight exercise modalities. It also identifies operational and technological options to reduce this mass to less than 41 metric tons, including staging of launch structure/packaging and alternate structural materials.
Gemballa, Sven; Hagen, Katja
2004-01-01
Recent studies have revealed the 3D morphology and collagen fiber architecture of myosepta in teleostome fishes. Here we present the first data set on the myoseptal structure of a representative of the chondrichthyan clade. We investigate the series of myosepta in the ratfish Chimaera monstrosa (Holocephali) from the anterior to the posterior body using microdissections of cleared and stained specimens, polarized light microscopy of excised myosepta, and histology. The features of the myoseptal system of Chimaera are compared to data from closely related vertebrate groups and are mapped onto a phylogenetic tree to further clarify the characteristics of the myoseptal series in the gnathostome ancestor. The 3D morphology and collagen fiber architecture of the myoseptal series in C. monstrosa resembles that of Teleostomi (Actinopterygii+Sarcopterygii) with regard to several features. Our comparative analysis reveals that some of them have evolved in the gnathostome stem lineage. (1) A series of epineural and epaxial lateral tendons (LTs) along the whole body, and a series of epipleural and hypaxial LTs in the postanal region evolved in the gnathostome stem lineage. (2) The LTs increase in length towards the posterior body (three-fold in Chimaera). Data on Chimaera and some comparative data on actinopterygian fishes indicate that LTs also increase in thickness towards the posterior body, but further data are necessary to test whether this holds true generally. (3) Another conspicuous apomorphic gnathostome feature is represented by multi-layer structures of myosepta. These are formed along the vertebral column by converging medial regions of successive sloping parts of myosepta. (4) The dorsalmost and ventralmost flanking parts of myosepta bear a set of mediolaterally oriented collagen fibers that are present in all gnathostomes but are lacking in outgroups. 
Preanal hypaxial myosepta are clearly different from epaxial myosepta and postanal hypaxial myosepta in terms of their collagen fiber architecture. In Chimaera, preanal hypaxial myosepta consist of an array of mediolaterally oriented collagen fibers closely resembling the condition in other gnathostome groups and in petromyzontids. Only one series of tendons, the myorhabdoid tendons of the flanking parts of myosepta, have evolved in the stem lineage of Myopterygii (Gnathostomata+Petromyzontida). Similar to LTs, the tendons of this series also increase in length towards the posterior body. In combination with other studies, the present study provides a framework for the design of morphologically based experiments and modeling to further address the function of myosepta and myoseptal tendons in gnathostomes.
Memristor-Based Synapse Design and Training Scheme for Neuromorphic Computing Architecture
2012-06-01
system level built upon the conventional Von Neumann computer architecture [2][3]. Developing the neuromorphic architecture at chip level by... creation of memristor-based neuromorphic computing architecture (contract FA8750-11-2-0046). Rather than the existing crossbar-based neuron network designs, we focus on memristor
NASA Astrophysics Data System (ADS)
Wallace, K. A.; Abriola, L.; Chen, M.; Ramsburg, A.; Pennell, K. D.; Christ, J.
2009-12-01
Multiphase, compositional simulators were employed to investigate the spill characteristics and subsurface properties that lead to pool-dominated, dense non-aqueous phase liquid (DNAPL) source zone architectures. DNAPL pools commonly form at textural interfaces where low permeability lenses restrict the vertical migration of DNAPL, allowing for DNAPL to accumulate, reaching high saturation. Significant pooling has been observed in bench-scale experiments and field settings. However, commonly employed numerical simulations rarely predict the pooling suspected in the field. Given the importance of pooling on the efficacy of mass recovery and the down-gradient contaminant signal, it is important to understand the predominant factors affecting the creation of pool-dominated source zones and their subsequent mass discharge. In this work, contaminant properties, spill characteristics and subsurface permeability were varied to investigate the factors contributing to the development of a pool-dominated source zone. DNAPL infiltration and entrapment simulations were conducted in two- and three-dimensional domains using the University of Texas Chemical Compositional (UTCHEM) simulator. A modified version of MT3DMS was then used to simulate DNAPL dissolution and mass discharge. Numerical mesh size was varied to investigate the importance of numerical model parameters on simulations results. The temporal evolution of commonly employed source zone architecture metrics, such as the maximum DNAPL saturation, first and second spatial moments, and fraction of DNAPL mass located in pools, was monitored to determine how the source zone architecture evolved with time. Mass discharge was monitored to identify the link between source zone architecture and down-gradient contaminant flux. Contaminant characteristics and the presence of extensive low permeability lenses appeared to have the most influence on the development of a pool-dominated source zone. 
The link between DNAPL mass recovery and contaminant mass discharge was significantly influenced by the fraction of mass residing in DNAPL pools. The greater the fraction of mass residing in DNAPL pools the greater the likelihood for significant reductions in contaminant mass discharge at modest levels of mass removal. These results will help guide numerical and experimental studies on the remediation of pool-dominated source zones and will likely guide future source zone characterization efforts.
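One of the source zone architecture metrics named above, the fraction of DNAPL mass located in pools, can be sketched from a gridded saturation field. This is an illustrative sketch only, not UTCHEM or MT3DMS code; the porosity and pooling threshold are assumed values, and cells are assumed to have uniform volume.

```python
import numpy as np

# Illustrative metric: fraction of DNAPL mass residing in pools,
# where "pooled" cells are those whose saturation exceeds an assumed
# residual (threshold) saturation. Uniform cell volumes are assumed,
# so per-cell mass is proportional to saturation * porosity.
def pool_mass_fraction(saturation, porosity=0.35, s_residual=0.15):
    mass = saturation * porosity                  # per-cell mass proxy
    pooled = mass[saturation > s_residual].sum()  # mass in pooled cells
    return pooled / mass.sum()

# Low saturations in the upper cells, a high-saturation pool below
# (as would form above a low-permeability lens).
sat = np.array([[0.05, 0.10],
                [0.40, 0.60]])
print(pool_mass_fraction(sat))
```

Tracking this fraction through time, alongside the first and second spatial moments, is how the simulations above characterize whether a source zone is pool-dominated.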
NASA Astrophysics Data System (ADS)
Castiglione, Steven Louis
As scientific research trends towards trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating said species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NP). The first portion of the research focuses on the development of a part-per-billion level HPLC method for a substituted phenazine-class pharmaceutical impurity. The development of this method was required due to the need for a rapid methodology to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, an approach of increasingly sensitive quantitative methods was required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regards to accuracy, precision, and linear adherence, as well as ancillary benefits and detriments -- e.g., one method in this evolution demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for the determination of NP sizes employs transmission electron microscopy (TEM), which requires sample drying without particle size alteration and which, in many cases, may prove infeasible due to cost or availability.
The feasibility of an HPLC method for NP size characterizations evolved across three methods, each employing a different approach for size resolution. These methods were evaluated primarily for sensitivity, which proved to be a substantial hurdle to further development, but does not appear to deter future research efforts.
Assured Mission Support Space Architecture (AMSSA) study
NASA Technical Reports Server (NTRS)
Hamon, Rob
1993-01-01
The assured mission support space architecture (AMSSA) study was conducted with the overall goal of developing a long-term requirements-driven integrated space architecture to provide responsive and sustained space support to the combatant commands. Although derivation of an architecture was the focus of the study, there are three significant products from the effort. The first is a philosophy that defines the necessary attributes for the development and operation of space systems to ensure an integrated, interoperable architecture that, by design, provides a high degree of combat utility. The second is the architecture itself; based on an interoperable system-of-systems strategy, it reflects a long-range goal for space that will evolve as user requirements adapt to a changing world environment. The third product is the framework of a process that, when fully developed, will provide essential information to key decision makers for space systems acquisition in order to achieve the AMSSA goal. It is a categorical imperative that military space planners develop space systems that will act as true force multipliers. AMSSA provides the philosophy, process, and architecture that, when integrated with the DOD requirements and acquisition procedures, can yield an assured mission support capability from space to the combatant commanders. An important feature of the AMSSA initiative is the participation by every organization that has a role or interest in space systems development and operation. With continued community involvement, the concept of the AMSSA will become a reality. In summary, AMSSA offers a better way to think about space (philosophy) that can lead to the effective utilization of limited resources (process) with an infrastructure designed to meet the future space needs (architecture) of our combat forces.
Roofline model toolkit: A practical tool for architectural and program analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo, Yu Jung; Williams, Samuel; Van Straalen, Brian
We present preliminary results of the Roofline Toolkit for multicore, many core, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented micro benchmarks implemented with Message Passing Interface (MPI) and OpenMP used to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory managed mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
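The Roofline model underlying the toolkit bounds attainable performance by the lesser of peak compute throughput and bandwidth-limited throughput at a kernel's arithmetic intensity. The sketch below uses hypothetical machine parameters, not measurements from the paper.

```python
# Minimal sketch of the Roofline performance bound. Peak values here
# are hypothetical machine parameters, not toolkit measurements.

def roofline_bound(arithmetic_intensity, peak_gflops, peak_gbps):
    """Attainable GFLOP/s = min(peak compute, intensity * peak bandwidth).

    arithmetic_intensity: FLOPs performed per byte of memory traffic.
    """
    return min(peak_gflops, arithmetic_intensity * peak_gbps)

# A kernel doing 0.25 FLOPs/byte on a 200 GFLOP/s, 100 GB/s machine
# is memory-bound at 0.25 * 100 = 25 GFLOP/s.
print(roofline_bound(0.25, 200.0, 100.0))  # 25.0
# At 10 FLOPs/byte the same machine is compute-bound at its peak.
print(roofline_bound(10.0, 200.0, 100.0))  # 200.0
```

The microbenchmarks described in the abstract exist precisely to measure the empirical peaks (per memory level, with and without SIMD or ILP) that feed such a bound.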
EDOS Evolution to Support NASA Future Earth Sciences Missions
NASA Technical Reports Server (NTRS)
Cordier, Guy R.; McLemore, Bruce; Wood, Terri; Wilkinson, Chris
2010-01-01
This paper presents a ground system architecture to service future NASA decadal missions, and in particular the high rate science data downlinks, by evolving EDOS's current infrastructure and upgrading high rate network lines. The paper will also cover EDOS participation to date in formulation and operations concepts for the respective missions to understand the particular mission needs and derived requirements such as data volumes, downlink rates, data encoding, and data latencies. Future decadal requirements such as onboard data recorder management and file protocols drive the need to emulate these requirements within the ground system. The EDOS open system modular architecture is scalable to accommodate additional missions using the current sites' antennas as well as future sites, while meeting data security requirements and fulfilling mission objectives.
Challenges for deep space communications in the 1990s
NASA Technical Reports Server (NTRS)
Dumas, Larry N.; Hornstein, Robert M.
1991-01-01
The discussion of NASA's Deep Space Network (DSN) examines the evolving character of aerospace missions and the corresponding changes in the DSN architecture. Deep space missions are reviewed, and it is noted that the two 34-m and the 70-m antenna subnets of the DSN are heavily loaded and more use is expected. High operational workload and the challenge of network cross-support are the design drivers for a flexible DSN architecture configuration. Incorporated in the design are antenna arraying for aperture augmentation, beam-waveguide antennas for frequency agility, and connectivity with non-DSN sites for cross-support. Compatibility between spacecraft and ground-facility designs is important for establishing common international standards of communication and data-system specification.
Considerations for the Next Revision of NASA's Space Telecommunications Radio System Architecture
NASA Technical Reports Server (NTRS)
Johnson, Sandra K.; Handler, Louis M.; Briones, Janette C.
2016-01-01
Development of NASA's Software Defined Radio architecture, the Space Telecommunication Radio System (STRS), was initiated in 2004 with a goal of reducing the cost, risk, and schedule of implementing Software Defined Radios (SDR) for National Aeronautics and Space Administration (NASA) space missions. Since STRS was first flown in 2012 on three Software Defined Radios on the Space Communication and Navigation (SCaN) Testbed, only minor changes have been made to the architecture. Multiple entities have since implemented the architecture and provided significant feedback for consideration for the next revision of the standard. The focus for the first set of updates to the architecture is items that enhance application portability. Items that require modifications to existing applications before migrating to the updated architecture will only be considered if there are compelling reasons to make the change. The significant suggestions that were further evaluated for consideration include expanding and clarifying the timing Application Programming Interfaces (APIs), improving handle name and identification (ID) definitions and use, and multiple items related to implementation of STRS Devices. In addition to ideas suggested while implementing STRS, SDR technology has evolved significantly, and its impact on the architecture needs to be considered. These changes include incorporating cognitive concepts - learning from past decisions and making new decisions that the radio can act upon. SDRs are also being developed that do not contain a General Purpose Module, which is currently required for the platform to be STRS compliant. The purpose of this paper is to discuss the comments received, provide a summary of the evaluation considerations, and examine planned dispositions.
SpaceCubeX: A Framework for Evaluating Hybrid Multi-Core CPU FPGA DSP Architectures
NASA Technical Reports Server (NTRS)
Schmidt, Andrew G.; Weisz, Gabriel; French, Matthew; Flatley, Thomas; Villalpando, Carlos Y.
2017-01-01
The SpaceCubeX project is motivated by the need for high performance, modular, and scalable on-board processing to help scientists answer critical 21st century questions about global climate change, air quality, ocean health, and ecosystem dynamics, while adding new capabilities such as low-latency data products for extreme event warnings. These goals translate into on-board processing throughput requirements that are on the order of 100 to 1,000 times greater than those of previous Earth Science missions for standard processing, compression, storage, and downlink operations. To study possible future architectures to achieve these performance requirements, the SpaceCubeX project provides an evolvable testbed and framework that enables a focused design space exploration of candidate hybrid CPU/FPGA/DSP processing architectures. The framework includes ArchGen, an architecture generator tool populated with candidate architecture components, performance models, and IP cores, that allows an end user to specify the type, number, and connectivity of a hybrid architecture. The framework requires minimal extensions to integrate new processors, such as the anticipated High Performance Spaceflight Computer (HPSC), reducing time to initiate benchmarking by months. To evaluate the framework, we leverage a wide suite of high performance embedded computing benchmarks and Earth science scenarios to ensure robust architecture characterization. We report on our project's Year 1 efforts and demonstrate the capabilities across four simulation testbed models: a baseline SpaceCube 2.0 system, a dual ARM A9 processor system, a hybrid quad ARM A53 and FPGA system, and a hybrid quad ARM A53 and DSP system.
Concurrent approach for evolving compact decision rule sets
NASA Astrophysics Data System (ADS)
Marmelstein, Robert E.; Hammack, Lonnie P.; Lamont, Gary B.
1999-02-01
The induction of decision rules from data is important to many disciplines, including artificial intelligence and pattern recognition. To improve the state of the art in this area, we introduced the genetic rule and classifier construction environment (GRaCCE). It was previously shown that GRaCCE consistently evolved decision rule sets from data which were significantly more compact than those produced by other methods (such as decision tree algorithms). The primary disadvantage of GRaCCE, however, is its relatively poor run-time execution performance. In this paper, a concurrent version of the GRaCCE architecture is introduced, which improves the efficiency of the original algorithm. A prototype of the algorithm is tested on an in-house parallel processor configuration and the results are discussed.
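The concurrent pattern this abstract describes, farming a genetic algorithm's costly fitness evaluations out to a worker pool, can be sketched in a few lines. This is a toy illustration, not GRaCCE: the bitstring "rule set" representation and the compactness objective below are stand-ins invented for the example.

```python
# Toy sketch of concurrent fitness evaluation in a genetic algorithm.
# The representation and objective are hypothetical stand-ins; GRaCCE's
# actual rule encoding is not reproduced here.
import random
from concurrent.futures import ThreadPoolExecutor

random.seed(1)

def fitness(rules):
    # Placeholder objective: prefer compact rule sets (few active bits).
    return -sum(rules)

def evolve(pop_size=20, length=16, generations=30):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    with ThreadPoolExecutor() as pool:
        for _ in range(generations):
            # The costly step is evaluated concurrently across the pool;
            # Executor.map preserves input order, so results line up with pop.
            scores = list(pool.map(fitness, pop))
            ranked = [p for _, p in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[: pop_size // 2]       # elitist selection
            children = []
            for p in parents:
                child = p[:]
                child[random.randrange(length)] ^= 1  # single-bit mutation
                children.append(child)
            pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because only the evaluation step is parallelized, the genetic operators stay sequential and deterministic; this mirrors the usual place concurrency is introduced when evaluation dominates run time.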
Enterprise-wide PACS: beyond radiology, an architecture to manage all medical images.
Bandon, David; Lovis, Christian; Geissbühler, Antoine; Vallée, Jean-Paul
2005-08-01
Picture archiving and communication systems (PACS) have the vocation to manage all medical images acquired within the hospital. To address the various situations encountered in the imaging specialties, the traditional architecture used for the radiology department has to evolve. We present our preliminary results toward an enterprise-wide PACS intended to support all kinds of image production in medicine, from biomolecular images to whole-body pictures. Our solution is based on an existing radiologic PACS system from which images are distributed through an electronic patient record to all care facilities. This platform is enriched with a flexible integration framework supporting Digital Imaging and Communications in Medicine (DICOM) and DICOM-XML formats. In addition, a highly customizable generic workflow engine is used to drive work processes. Echocardiology; hematology; ear, nose, and throat; and dermatology (including wound follow-up) are the first implemented extensions outside of radiology. We also propose a global strategy for further developments based on three possible architectures for an enterprise-wide PACS.
LHCb Kalman Filter cross architecture studies
NASA Astrophysics Data System (ADS)
Cámpora Pérez, Daniel Hugo
2017-10-01
The 2020 upgrade of the LHCb detector will vastly increase the rate of collisions the Online system needs to process in software, in order to filter events in real time. 30 million collisions per second will pass through a selection chain, where each step is executed conditional on its prior acceptance. The Kalman Filter is a fit applied to all reconstructed tracks which, due to its time characteristics and early execution in the selection chain, consumes 40% of the whole reconstruction time in the current trigger software. This makes the Kalman Filter a time-critical component as the LHCb trigger evolves into a full software trigger in the Upgrade. I present a new Kalman Filter algorithm for LHCb that can efficiently make use of any kind of SIMD processor, and explain its design in depth. Performance benchmarks are compared across a variety of hardware architectures, including x86_64, Power8 and the Intel Xeon Phi accelerator, and the suitability of these architectures for efficiently performing the LHCb reconstruction process is determined.
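The filter at the heart of this track fit can be illustrated with a minimal one-dimensional sketch. This is not the LHCb implementation, which is multidimensional and SIMD-vectorized; the constant-state model, noise variances and measurements below are hypothetical choices for illustration.

```python
# Minimal scalar Kalman filter sketch (illustrative only; the LHCb track
# fit operates on multidimensional track states).

def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One predict/update cycle for a constant-state model.

    x, p : prior state estimate and its variance
    z    : new measurement
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: the state model is constant, so only the variance grows.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)           # gain in [0, 1]
    x = x + k * (z - x)       # corrected state estimate
    p = (1.0 - k) * p         # corrected (reduced) variance
    return x, p

# Filter a noisy sequence of measurements of a roughly constant true value.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0, 0.95]:
    x, p = kalman_step(x, p, z)
```

Note how the gain shrinks as the variance falls: early measurements move the estimate strongly, later ones only nudge it, which is the property that makes the fit converge.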
Evolution of a protein folding nucleus.
Xia, Xue; Longo, Liam M; Sutherland, Mason A; Blaber, Michael
2016-07-01
The folding nucleus (FN) is a cryptic element within protein primary structure that enables an efficient folding pathway and is the postulated heritable element in the evolution of protein architecture; however, almost nothing is known regarding how the FN structurally changes as complex protein architecture evolves from simpler peptide motifs. We report characterization of the FN of a designed purely symmetric β-trefoil protein by ϕ-value analysis. We compare the structure and folding properties of key foldable intermediates along the evolutionary trajectory of the β-trefoil. The results show structural acquisition of the FN during gene fusion events, incorporating novel turn structure created by gene fusion. Furthermore, the FN is adjusted by circular permutation in response to destabilizing functional mutation. FN plasticity by way of circular permutation is made possible by the intrinsic C3 cyclic symmetry of the β-trefoil architecture, identifying a possible selective advantage that helps explain the prevalence of cyclic structural symmetry in the proteome. © 2015 The Protein Society.
On developing the local research environment of the 1990s - The Space Station era
NASA Technical Reports Server (NTRS)
Chase, Robert; Ziel, Fred
1989-01-01
A requirements analysis for the Space Station's polar platform data system has been performed. Based upon this analysis, cluster, layered-cluster, and layered-modular implementations of one specific module within the Eos Data and Information System (EosDIS), an active database for satellite remote sensing research, have been developed. It is found that a distributed system based on a layered-modular architecture and employing current-generation workstation technologies has the requisite attributes ascribed by the remote sensing research community. However, benchmark testing, probabilistic analysis, failure analysis and user-survey analysis show that this architecture presents some operational shortcomings that will not be alleviated by new hardware or software developments. Consequently, the potential of a fully modular layered architectural design for meeting the needs of Eos researchers has also been evaluated, concluding that it would be well suited to the evolving requirements of this multidisciplinary research community.
NASA Technical Reports Server (NTRS)
Colloredo, Scott; Gray, James A.
2011-01-01
The impending conclusion of the Space Shuttle Program and the Constellation Program cancellation unveiled in the FY2011 President's budget created a large void for human spaceflight capability and specifically launch activity from the Florida Launch Site (FLS). This void created an opportunity to re-architect the launch site to be more accommodating to future NASA heavy lift and the commercial space industry. The goal is to evolve the heritage capabilities into a more affordable and flexible launch complex. This case study will discuss the FLS architecture evolution, from the trade studies to select primary launch site locations for future customers, to improving infrastructure; promoting environmental remediation/compliance; improving offline processing, manufacturing and recovery; developing range interface and control services with the US Air Force; and developing modernization efforts for the Launch Pad, Vehicle Assembly Building, Mobile Launcher, and supporting infrastructure. The architecture studies will steer how to best invest limited modernization funding from initiatives like the 21st Century Launch Complex (21st CLC) program and other potential funding.
Architecture for hospital information integration
NASA Astrophysics Data System (ADS)
Chimiak, William J.; Janariz, Daniel L.; Martinez, Ralph
1999-07-01
The ongoing integration of hospital information systems (HIS) continues. Data storage systems, data networks and computers improve, databases grow and health-care applications increase. Some computer operating systems continue to evolve and some fade. Health care delivery now depends on this computer-assisted environment. As a result, harmonization of the various hospital information systems becomes increasingly difficult. The purpose of this paper is to present an architecture for HIS integration that is computer-language-neutral and computer-hardware-neutral for informatics applications. The proposed architecture builds upon the work done at the University of Arizona on middleware, the work of the National Electrical Manufacturers Association, and the American College of Radiology. It is a fresh approach that allows applications engineers to access medical data easily and thus concentrate on the application techniques in which they are expert, without struggling with medical information syntaxes. The HIS can be modeled using a hierarchy of information sub-systems, facilitating its understanding. The architecture includes the resulting information model along with a strict but intuitive application programming interface, managed by CORBA. The CORBA requirement facilitates interoperability. It should also reduce software and hardware development times.
ERIC Educational Resources Information Center
van Maanen, Leendert; van Rijn, Hedderik; Taatgen, Niels
2012-01-01
This article discusses how sequential sampling models can be integrated in a cognitive architecture. The new theory Retrieval by Accumulating Evidence in an Architecture (RACE/A) combines the level of detail typically provided by sequential sampling models with the level of task complexity typically provided by cognitive architectures. We will use…
Crosstalk and the evolvability of intracellular communication.
Rowland, Michael A; Greenbaum, Joseph M; Deeds, Eric J
2017-07-10
Metazoan signalling networks are complex, with extensive crosstalk between pathways. It is unclear what pressures drove the evolution of this architecture. We explore the hypothesis that crosstalk allows different cell types, each expressing a specific subset of signalling proteins, to activate different outputs when faced with the same inputs, responding differently to the same environment. We find that the pressure to generate diversity leads to the evolution of networks with extensive crosstalk. Using available data, we find that human tissues exhibit higher levels of diversity between cell types than networks with random expression patterns or networks with no crosstalk. We also find that crosstalk and differential expression can influence drug activity: no protein has the same impact on two tissues when inhibited. In addition to providing a possible explanation for the evolution of crosstalk, our work indicates that consideration of cellular context will likely be crucial for targeting signalling networks.
Optimization of Orchestral Layouts Based on Instrument Directivity Patterns
NASA Astrophysics Data System (ADS)
Stroud, Nathan Paul
The experience of hearing an exceptional symphony orchestra perform in an excellent concert hall can be profound and moving, causing a level of excitement not often reached for listeners. Romantic period style orchestral music, recognized for validating the use of intense emotion for aesthetic pleasure, was the last significant development in the history of the orchestra. In an age where orchestral popularity is waning, the possibility of evolving the orchestral sound in our modern era exists through the combination of our current understanding of instrument directivity patterns and their interaction with architectural acoustics. With the aid of wave field synthesis (WFS), newly proposed variations on orchestral layouts are tested virtually using a 64-channel WFS array. Each layout is objectively and subjectively compared to determine which layout could optimize the sound of the orchestra and revitalize the excitement of the performance.
Genomics-Based Security Protocols: From Plaintext to Cipherprotein
NASA Technical Reports Server (NTRS)
Shaw, Harry; Hussein, Sayed; Helgert, Hermann
2011-01-01
The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
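For comparison with the DNA-inspired construction described above, a conventional keyed-HMAC (RFC 2104) can be computed with Python's standard library alone, which fits the paper's premise of authentication without a public key infrastructure. The key and message below are placeholders.

```python
# Sketch of a standard keyed-HMAC using only the Python standard library.
# This is the conventional construction, not the genomics-based HMAC of
# the paper; key and message are invented placeholders.
import hashlib
import hmac

key = b"shared-secret-key"           # pre-shared key (no PKI required)
message = b"sensor telemetry frame"

# Sender computes the tag and transmits (message, tag).
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver, holding the same key, recomputes the tag and compares in
# constant time to avoid timing side channels.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
authentic = hmac.compare_digest(tag, expected)
```

Any tampering with the message (or use of a different key) changes the recomputed tag, so `authentic` becomes false; this is the behavior the DNA-inspired scheme must also provide.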
Learning and optimization with cascaded VLSI neural network building-block chips
NASA Technical Reports Server (NTRS)
Duong, T.; Eberhardt, S. P.; Tran, M.; Daud, T.; Thakoor, A. P.
1992-01-01
To demonstrate the versatility of the building-block approach, two neural network applications were implemented on cascaded analog VLSI chips. Weights were implemented using 7-bit multiplying digital-to-analog converter (MDAC) synapse circuits, with 31 x 32 and 32 x 32 synapses per chip. A novel learning algorithm compatible with analog VLSI was applied to the two-input parity problem. The algorithm combines a dynamically evolving architecture with limited gradient-descent backpropagation for efficient and versatile supervised learning. To implement the learning algorithm in hardware, synapse circuits were paralleled for additional quantization levels. The hardware-in-the-loop learning system allocated 2-5 hidden neurons for parity problems. Also, a 7 x 7 assignment problem was mapped onto a cascaded 64-neuron fully connected feedback network. In 100 randomly selected problems, the network found optimal or good solutions in most cases, with settling times in the range of 7-100 microseconds.
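The 7-bit MDAC synapses imply that each continuous weight must be quantized to a small set of signed codes. A minimal sketch of such a mapping might look as follows; the symmetric range, scale factor and rounding rule are assumptions for illustration, not the chip's actual transfer function.

```python
# Illustrative quantization of a continuous weight to a signed 7-bit DAC
# code, and reconstruction of the analog value it represents. The mapping
# is a hypothetical example, not the documented MDAC characteristic.

def to_mdac_level(w, w_max=1.0, bits=7):
    """Map weight w in [-w_max, w_max] to an integer DAC code."""
    levels = 2 ** (bits - 1) - 1              # 63 codes per polarity
    w = max(-w_max, min(w_max, w))            # clip to representable range
    return round(w / w_max * levels)

def from_mdac_level(level, w_max=1.0, bits=7):
    """Reconstruct the analog weight represented by a DAC code."""
    levels = 2 ** (bits - 1) - 1
    return level / levels * w_max

q = to_mdac_level(0.5)       # quantized integer code
w = from_mdac_level(q)       # reconstructed weight, within one step of 0.5
```

The step size here is 1/63 of full scale, which is why the abstract notes that synapse circuits were paralleled when the learning algorithm needed finer quantization levels than a single 7-bit synapse provides.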
Nanomechanical strength mechanisms of hierarchical biological materials and tissues.
Buehler, Markus J; Ackbarow, Theodor
2008-12-01
Biological protein materials (BPMs), intriguing hierarchical structures formed by assembly of chemical building blocks, are crucial for critical functions of life. The structural details of BPMs are fascinating: They represent a combination of universally found motifs such as alpha-helices or beta-sheets with highly adapted protein structures such as cytoskeletal networks or spider silk nanocomposites. BPMs combine properties like strength and robustness, self-healing ability, adaptability, changeability, evolvability and others into multi-functional materials at a level unmatched in synthetic materials. The ability to achieve these properties depends critically on the particular traits of these materials, first and foremost their hierarchical architecture and seamless integration of material and structure, from nano to macro. Here, we provide a brief review of this field and outline new research directions, along with a review of recent research results in the development of structure-property relationships of biological protein materials exemplified in a study of vimentin intermediate filaments.
Mars Ascent Vehicle Design for Human Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Thomas, Dan; Sutherlin, Steven; Stephens, Walter; Rucker, Michelle
2015-01-01
In NASA's Evolvable Mars Campaign, transportation architectures for human missions to Mars rely on a combination of solar electric propulsion and chemical propulsion systems. Minimizing the Mars ascent vehicle (MAV) mass is critical in reducing the overall lander mass and also eases the requirements placed on the transportation stages. This paper presents the results of a conceptual design study to obtain a minimal MAV configuration, including subsystem designs and mass summaries.
Seed coat thickness in the evolution of angiosperms.
Coen, Olivier; Magnani, Enrico
2018-05-05
The seed habit represents a remarkable evolutionary advance in plant sexual reproduction. Since the Paleozoic, seeds carry a seed coat that protects, nourishes and facilitates the dispersal of the fertilization product(s). The seed coat architecture evolved to adapt to different environments and reproductive strategies in part by modifying its thickness. Here, we review the great natural diversity observed in seed coat thickness among angiosperms and its molecular regulation in Arabidopsis.
Collaborative Oceanography and Virtual Experiments
2013-09-30
Observing Laboratory (EOL), but also contains an internal architecture which will allow it to evolve into a collaborative communication tool... an additional 50,988 "common" products were generated (241 plot types), along with 47,600 overlays (101 plot types). From 52 non-EOL sources... 24,471 products were collected, and from 1486 EOL data collections, 643,263 "federated" products were indexed and made available through itop.org
Systemic Operational Design: Epistemological Bumpf or the Way Ahead for Operational Design?
2006-05-25
facilitating the design of such architectural frames (meta-concepts), they are doomed to be trapped in a simplistic structuralist approach."1... systems theory and complexity theory. SOD emerged and evolved in response to inherent challenges in the contemporary Israeli security environment... discussed in subsequent chapters. Theory. Theory is critical to this examination of the CEOD approach and SOD because theory underpins and informs
Data Serving Climate Simulation Science at the NASA Center for Climate Simulation
NASA Technical Reports Server (NTRS)
Salmon, Ellen M.
2011-01-01
The NASA Center for Climate Simulation (NCCS) provides high performance computational resources, a multi-petabyte archive, and data services in support of climate simulation research and other NASA-sponsored science. This talk describes the NCCS's data-centric architecture and processing, which are evolving in anticipation of researchers' growing requirements for higher resolution simulations and increased data sharing among NCCS users and the external science community.
Transforming System Engineering through Model-Centric Engineering
2015-01-31
story that is being applied and evolved on the Jupiter Europa Orbiter (JEO) project [75], and we summarize some aspects of it here, because it goes beyond... JEO Jupiter Europa Orbiter project at NASA/JPL; JSF Joint Strike Fighter; JPL Jet Propulsion Laboratory of NASA; Linux An operating system created by... Adaptation of Flight-Critical Systems, Digital Avionics Systems Conference, 2009. [75] Rasmussen, R., R. Shishko, Jupiter Europa Orbiter Architecture
Kimmel, Charles B.; Cresko, William A.; Phillips, Patrick C.; Ullmann, Bonnie; Currey, Mark; von Hippel, Frank; Kristjánsson, Bjarni K.; Gelmond, Ofer; McGuigan, Katrina
2014-01-01
Evolution of similar phenotypes in independent populations is often taken as evidence of adaptation to the same fitness optimum. However, the genetic architecture of traits might cause evolution to proceed more often toward particular phenotypes, and less often toward others, independently of the adaptive value of the traits. Freshwater populations of Alaskan threespine stickleback have repeatedly evolved the same distinctive opercle shape after divergence from an oceanic ancestor. Here we demonstrate that this pattern of parallel evolution is widespread, distinguishing oceanic and freshwater populations across the Pacific Coast of North America and Iceland. We test whether this parallel evolution reflects genetic bias by estimating the additive genetic variance-covariance matrix (G) of opercle shape in an Alaskan oceanic (putative ancestral) population. We find significant additive genetic variance for opercle shape and that G has the potential to be biasing, because of the existence of regions of phenotypic space with low additive genetic variation. However, evolution did not occur along major eigenvectors of G, rather it occurred repeatedly in the same directions of high evolvability. We conclude that the parallel opercle evolution is most likely due to selection during adaptation to freshwater habitats, rather than due to biasing effects of opercle genetic architecture. PMID:22276538
Firewall systems: the next generation
NASA Astrophysics Data System (ADS)
McGhie, Lynda L.
1996-01-01
To be competitive in today's globally connected marketplace, a company must ensure that its internal network security methodologies and supporting policies are current and reflect an overall understanding of today's technology and its resultant threats. Further, an integrated approach to information security should ensure that new ways of sharing information and doing business are accommodated, such as electronic commerce, high-speed public broadband network services, and the federally sponsored National Information Infrastructure. There are many challenges, and success is determined by the establishment of a solid baseline security architecture that accommodates today's external connectivity requirements, provides transitional solutions that integrate with evolving and dynamic technologies, and ultimately acknowledges both the strategic and tactical goals of an evolving network security architecture and firewall system. This paper explores the evolution of external network connectivity requirements, the associated challenges, and the subsequent development and evolution of firewall security systems. It makes the assumption that a firewall is a set of integrated and interoperable components coming together to form a 'SYSTEM', and that it must be designed, implemented and managed as such. A progressive firewall model is used to illustrate the evolution of firewall systems from earlier models utilizing separate physical networks to today's multi-component firewall systems enabling secure heterogeneous and multi-protocol interfaces.
Directed evolution of the TALE N-terminal domain for recognition of all 5' bases.
Lamb, Brian M; Mercer, Andrew C; Barbas, Carlos F
2013-11-01
Transcription activator-like effector (TALE) proteins can be designed to bind virtually any DNA sequence. General guidelines for the design of TALE DNA-binding domains suggest that the 5'-most base of the DNA sequence bound by the TALE (the N0 base) should be a thymine. We quantified the N0 requirement by analysis of the activities of TALE transcription factors (TALE-TF), TALE recombinases (TALE-R) and TALE nucleases (TALENs) with each DNA base at this position. In the absence of a 5' T, we observed decreases of up to >1000-fold in TALE-TF activity, up to 100-fold in TALE-R activity and up to 10-fold in TALEN activity compared with target sequences containing a 5' T. To develop TALE architectures that recognize all possible N0 bases, we used structure-guided library design coupled with TALE-R activity selections to evolve novel TALE N-terminal domains that accommodate any N0 base. A G-selective domain and broadly reactive domains were isolated and characterized. The engineered TALE domains selected in the TALE-R format demonstrated modularity and were active in TALE-TF and TALEN architectures. Evolved N-terminal domains provide effective and unconstrained TALE-based targeting of any DNA sequence as TALE binding proteins and designer enzymes.
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.
2014-01-01
A Large Civil Tiltrotor (LCTR) conceptual design was developed as part of the NASA Heavy Lift Rotorcraft Systems Investigation in order to establish a consistent basis for evaluating the benefits of advanced technology for large tiltrotors. The concept has since evolved into the second-generation LCTR2, designed to carry 90 passengers for 1,000 nautical miles at 300 knots, with vertical takeoff and landing capability. This paper explores gas turbine component performance and cycle parameters to quantify performance gains possible for additional improvements in component and material performance beyond those identified in previous LCTR2 propulsion studies and to identify additional research areas. The vehicle-level characteristics from this advanced technology generation 2 propulsion architecture will help set performance levels as additional propulsion and power systems are conceived to meet ever-increasing requirements for mobility and comfort, while reducing energy use, cost, noise and emissions. The Large Civil Tiltrotor vehicle and mission will be discussed as a starting point for this effort. A few, relevant engine and component technology studies, including previous LCTR2 engine study results will be summarized to help orient the reader on gas turbine engine architecture, performance and limitations. Study assumptions and methodology used to explore engine design and performance, as well as assess vehicle sizing and mission performance will then be discussed. Individual performance for present and advanced engines, as well as engine performance effects on overall vehicle size and mission fuel usage, will be given. All results will be summarized to facilitate understanding the importance and interaction of various component and system performance on overall vehicle characteristics.
Electrical Grounding Architecture for Unmanned Spacecraft
NASA Technical Reports Server (NTRS)
1998-01-01
This handbook is approved for use by NASA Headquarters and all NASA Centers and is intended to provide a common framework for consistent practices across NASA programs. This handbook was developed to describe electrical grounding design architecture options for unmanned spacecraft. This handbook is written for spacecraft system engineers, power engineers, and electromagnetic compatibility (EMC) engineers. Spacecraft grounding architecture is a system-level decision which must be established at the earliest point in spacecraft design. All other grounding design must be coordinated with and be consistent with the system-level architecture. This handbook assumes that there is no one single 'correct' design for spacecraft grounding architecture. There have been many successful satellite and spacecraft programs from NASA, using a variety of grounding architectures with different levels of complexity. However, some design principles learned over the years apply to all types of spacecraft development. This handbook summarizes those principles to help guide spacecraft grounding architecture design for NASA and others.
An ISRU Propellant Production System to Fully Fuel a Mars Ascent Vehicle
NASA Technical Reports Server (NTRS)
Kleinhenz, Julie; Paz, Aaron
2017-01-01
ISRU of Mars resources was baselined in the 2009 Design Reference Architecture (DRA) 5.0, but only for oxygen production using atmospheric CO2; the methane (LCH4) needed for ascent propulsion of the Mars Ascent Vehicle (MAV) would have to be brought from Earth. However, extracting water from the Martian regolith enables the production of both oxygen and methane from Mars resources, and water could also be used for other applications including life support, radiation shielding, and plant growth. Water extraction was not baselined in DRA 5.0 due to perceived difficulties and complexity in processing regolith. The NASA Evolvable Mars Campaign (EMC) requested studies to look at the quantitative benefits and trades of using Mars water ISRU. Phase 1 examined architecture scenarios for regolith water retrieval and was completed in October 2015. Phase 2 is a deep dive into one architecture concept to determine the end-to-end size, mass, and power of an LCH4/LO2 ISRU production system. The Evolvable Mars Campaign assumes a pre-deployed Mars ascent vehicle (MAV), 4 crew members, and oxygen/methane propellants. The study generates a system model to roll up the mass and power of a full ISRU system and enable parametric trade studies, leveraging models from previous studies and technology development programs and anchored with mass and power performance from existing hardware; whenever possible, reference-able (published) numbers were used for traceability. A modular approach allows subsystem trades and parametric studies. Propellant mass needs were taken from the most recently published MAV study: Polsgrove, T. et al. (2015), AIAA 2015-4416. MAV engines operate at mixture ratios (oxygen:methane) between 3:1 and 3.5:1, whereas the Sabatier reactor produces at a 4:1 ratio; therefore, methane production is the driving requirement, and excess oxygen will be produced.
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Fleischer, Jens; Herrnkind, Stefan; Schwarting, Herrmann
2010-05-01
The German Indonesian Tsunami Early Warning System (GITEWS) is a multifaceted system consisting of various sensor types, such as seismometers, sea level sensors or GPS stations, and processing components, all with their own system behavior and proprietary data structures. To operate a warning chain, from measurements up to warning products, all components have to interact correctly, both syntactically and semantically. In designing the system, great emphasis was laid on conformity to the Sensor Web Enablement (SWE) specification by the Open Geospatial Consortium (OGC). The technical infrastructure, the so-called Tsunami Service Bus (TSB), follows the blueprint of Service Oriented Architectures (SOA). The TSB is an integration concept (SWE) where functionality (observe, task, notify, alert, and process) is grouped around business processes (Monitoring, Decision Support, Sensor Management) and packaged as interoperable services (SAS, SOS, SPS, WNS). The benefits of using a flexible architecture together with SWE lead to an open integration platform: • accessing and controlling heterogeneous sensors in a uniform way (Functional Integration) • assigning functionality to distinct services (Separation of Concerns) • allowing resilient relationships between systems (Loose Coupling) • integrating services so that they can be accessed from everywhere (Location Transparency) • enabling infrastructures which integrate heterogeneous applications (Encapsulation) • allowing combination of services (Orchestration) and data exchange within business processes. Warning systems will evolve over time: new sensor types might be added, old sensors will be replaced and processing components will be improved. From a collection of a few basic services it shall be possible to compose more complex functionality essential for specific warning systems.
Given these requirements, a flexible infrastructure is a prerequisite for sustainable systems, and their architecture must be tailored for evolution. The use of well-known techniques and widely used open-source software implementing industrial standards reduces the impact of service modifications, allowing the evolution of a system as a whole. GITEWS implemented a solution to feed sensor raw data from any (remote) system into the infrastructure. Specific dispatchers enable plugging in sensor-type-specific processing without changing the architecture. Client components don't need to be adjusted if new sensor types or individual sensors are added to the system, because they access them via standardized services. One of the outstanding features of service-oriented architectures is the possibility to compose new services from existing ones. This so-called orchestration allows the definition of new warning processes which can be adapted easily to new requirements. This approach has the following advantages: • Implementing SWE makes it possible to establish the "detection" and integration of sensors via the internet. Thus a system of systems combining early warning functionality at different levels of detail is feasible. • Any institution can add both its own components and components from third parties if they are developed in conformance with SOA principles. In a federation, an institution keeps the ownership of its data and decides which data are provided by a service and when. • A system can be deployed at minor cost as a core for in-house development at any institution, thus enabling autonomous early warning or monitoring systems. The presentation covers both the design and various instantiations (live demonstration) of the GITEWS architecture. Experiences concerning the design and complexity of SWE will be addressed in detail.
Substantial attention is devoted to the techniques and methods of extending the architecture, adapting proprietary components to SWE services and encodings, and orchestrating them in high-level workflows and processes. Furthermore, the potential of the architecture concerning adaptive behavior, collaboration across boundaries and semantic interoperability will be addressed.
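The dispatcher idea described above, plugging in sensor-type-specific processing without touching the dispatch code or its clients, can be sketched as a small registry. The class and sensor names here are illustrative; this is not the actual TSB API.

```python
# Minimal sketch of a sensor-type dispatcher (names are illustrative, not the
# real TSB interfaces): processors register per sensor type, so adding a new
# sensor type needs no change to the dispatcher or to client components.
class Dispatcher:
    def __init__(self):
        self._processors = {}

    def register(self, sensor_type, processor):
        self._processors[sensor_type] = processor

    def dispatch(self, sensor_type, raw_data):
        # Clients use one uniform call; sensor-specific logic is looked up.
        return self._processors[sensor_type](raw_data)

bus = Dispatcher()
bus.register("seismometer", lambda d: {"kind": "seismic", "peak": max(d)})
bus.register("sea_level", lambda d: {"kind": "tide", "mean": sum(d) / len(d)})

print(bus.dispatch("seismometer", [0.1, 0.7, 0.3]))  # {'kind': 'seismic', 'peak': 0.7}
```

The same registration step is where a production system would instead wire in SWE-conformant services, keeping clients decoupled from sensor specifics.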
Histone variant innovation in a rapidly evolving chordate lineage.
Moosmann, Alexandra; Campsteijn, Coen; Jansen, Pascal Wtc; Nasrallah, Carole; Raasholm, Martina; Stunnenberg, Henk G; Thompson, Eric M
2011-07-15
Histone variants alter the composition of nucleosomes and play crucial roles in transcription, chromosome segregation, DNA repair, and sperm compaction. Modification of metazoan histone variant lineages occurs on a background of genome architecture that shows global similarities from sponges to vertebrates, but the urochordate, Oikopleura dioica, a member of the sister group to vertebrates, exhibits profound modification of this ancestral architecture. We show that a histone complement of 47 gene loci encodes 31 histone variants, grouped in distinct sets of developmental expression profiles throughout the life cycle. A particularly diverse array of 15 male-specific histone variants was uncovered, including a testes-specific H4t, the first metazoan H4 sequence variant reported. Universal histone variants H3.3, CenH3, and H2A.Z are present but O. dioica lacks homologs of macroH2A and H2AX. The genome encodes many H2A and H2B variants and the repertoire of H2A.Z isoforms is expanded through alternative splicing, incrementally regulating the number of acetylatable lysine residues in the functionally important N-terminal "charge patch". Mass spectrometry identified 40 acetylation, methylation and ubiquitylation posttranslational modifications (PTMs) and showed that hallmark PTMs of "active" and "repressive" chromatin were present in O. dioica. No obvious reduction in silent heterochromatic marks was observed despite high gene density in this extraordinarily compacted chordate genome. These results show that histone gene complements and their organization differ considerably even over modest phylogenetic distances. Substantial innovation among all core and linker histone variants has evolved in concert with adaptation of specific life history traits in this rapidly evolving chordate lineage.
Architectural Design for a Mars Communications and Navigation Orbital Infrastructure
NASA Technical Reports Server (NTRS)
Ceasrone, R. J.; Hastrup, R. C.; Bell, D. J.; Roncoli, R. B.; Nelson, K.
1999-01-01
The planet Mars has become the focus of an intensive series of missions that span decades of time, a wide array of international agencies and an evolution from robotics to humans. The number of missions to Mars at any one time, and over a period of time, is unprecedented in the annals of space exploration. To meet the operational needs of this exploratory fleet will require the implementation of new architectural concepts for communications and navigation. To this end, NASA's Jet Propulsion Laboratory has begun to define and develop a Mars communications and navigation orbital infrastructure. This architecture will make extensive use of assets at Mars, as well as use of traditional Earth-based assets, such as the Deep Space Network, DSN. Indeed, the total system can be thought of as an extension of DSN nodes and services to the Mars in-situ region. The concept has been likened to the beginnings of an interplanetary Internet that will bring the exploration of Mars right into our living rooms. The paper will begin with a high-level overview of the concept for the Mars communications and navigation infrastructure. Next, the mission requirements will be presented. These will include the relatively near-term needs of robotic landers, rovers, ascent vehicles, balloons, airplanes, and possibly orbiting, arriving and departing spacecraft. Requirements envisioned for the human exploration of Mars will also be described. The important Mars orbit design trades on telecommunications and navigation capabilities will be summarized, and the baseline infrastructure will be described. A roadmap of NASA's plan to evolve this infrastructure over time will be shown. Finally, launch considerations and delivery to Mars will be briefly treated.
Extending XNAT Platform with an Incremental Semantic Framework
Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael
2017-01-01
Informatics increases the yield from neuroscience due to improved data handling. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities; however, distributed data integration is still difficult because of the need for explicit agreements on disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but their application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia-related diseases. PMID:28912709
The kinetics of pre-mRNA splicing in the Drosophila genome and the influence of gene architecture
Pai, Athma A; Henriques, Telmo; McCue, Kayla; Burkholder, Adam; Adelman, Karen
2017-01-01
Production of most eukaryotic mRNAs requires splicing of introns from pre-mRNA. The splicing reaction requires definition of splice sites, which are initially recognized in either intron-spanning (‘intron definition’) or exon-spanning (‘exon definition’) pairs. To understand how exon and intron length and splice site recognition mode impact splicing, we measured splicing rates genome-wide in Drosophila, using metabolic labeling/RNA sequencing and new mathematical models to estimate rates. We found that the modal intron length range of 60–70 nt represents a local maximum of splicing rates, but that much longer exon-defined introns are spliced even faster and more accurately. We observed unexpectedly low variation in splicing rates across introns in the same gene, suggesting the presence of gene-level influences, and we identified multiple gene level variables associated with splicing rate. Together our data suggest that developmental and stress response genes may have preferentially evolved exon definition in order to enhance the rate or accuracy of splicing. PMID:29280736
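The rate estimation described above can be illustrated with a minimal first-order kinetics sketch. The study's mathematical models are more elaborate; this assumes only that the unspliced fraction of labeled transcripts decays as exp(-k·t), and the timepoints and fractions are made-up numbers.

```python
import math

# Minimal first-order sketch (the study's models are more elaborate): with
# metabolic labeling, the unspliced fraction of nascent transcripts decays as
# exp(-k * t); two timepoints give a rate estimate and a splicing half-life.
def splicing_rate(unspliced_frac_t1, unspliced_frac_t2, t1, t2):
    """Per-minute splicing rate k from unspliced fractions at two label times."""
    return (math.log(unspliced_frac_t1) - math.log(unspliced_frac_t2)) / (t2 - t1)

def half_life(k):
    """Time for the unspliced fraction to halve."""
    return math.log(2) / k

k = splicing_rate(0.8, 0.4, 5.0, 10.0)  # hypothetical fractions at 5 and 10 min
print(round(half_life(k), 1))           # 5.0 (minutes): fraction halved in 5 min
```

Fitting such a rate per intron, then comparing rates within and between genes, is the kind of analysis that reveals the gene-level effects the abstract describes.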
Genetics of reproduction and regulation of honey bee (Apis mellifera L.) social behavior
Page, Robert E.; Rueppell, Olav; Amdam, Gro V.
2014-01-01
Honey bees form complex societies with a division of labor for reproduction, nutrition, nest construction and maintenance, and defense. How does such a division of labor evolve? Tasks performed by worker honey bees are distributed in time and space. There is no central control over behavior and there is no central genome on which selection can act and effect adaptive change. For 22 years we have been asking these questions by selecting on a single social trait associated with nutrition: the amount of surplus pollen (a source of protein) that is stored in combs of the nest. Forty-two generations of selection have revealed changes at biological levels extending from the society down to the level of the gene. We show how we constructed this vertical understanding of social evolution using behavioral and anatomical analyses, physiology, genetic mapping, and gene knockdowns. We map out the phenotypic and genetic architectures of food storage and foraging behavior and show how they are linked through broad epistasis and pleiotropy affecting a reproductive regulatory network that influences foraging behavior. PMID:22934646
A computer architecture for intelligent machines
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Saridis, G. N.
1991-01-01
The Theory of Intelligent Machines proposes a hierarchical organization for the functions of an autonomous robot based on the Principle of Increasing Precision With Decreasing Intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed in recent years. A computer architecture that implements the lower two levels of the intelligent machine is presented. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Details of Execution Level controllers for motion and vision systems are addressed, as well as the Petri net transducer software used to implement Coordination Level functions. Extensions to UNIX and VxWorks operating systems which enable the development of a heterogeneous, distributed application are described. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.
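The information-theoretic flavor of the Principle of Increasing Precision With Decreasing Intelligence can be illustrated with a toy entropy calculation. This is our illustration, not Saridis's actual formulation, and the probability distributions are invented for the example.

```python
import math

# Illustrative sketch (not the theory's actual formulation): measure each
# level's uncertainty as the Shannon entropy of its decision distribution.
# High levels weigh many options (high entropy, low precision); the execution
# level is nearly deterministic (low entropy, high precision).
def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

levels = {
    "organization": [0.25, 0.25, 0.25, 0.25],  # many equally plausible plans
    "coordination": [0.7, 0.2, 0.1],           # fewer plausible schedules
    "execution":    [0.99, 0.01],              # nearly deterministic control
}
for name, dist in levels.items():
    print(name, round(entropy(dist), 2))       # entropy falls level by level
```

The monotone drop in entropy from organization to execution is the pattern the analytic formulation quantifies for each level of the intelligent machine.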
NASA Technical Reports Server (NTRS)
Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik
2011-01-01
Topics covered: (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance. (3) Strategic Elements: (3a) Architectural Principles, (3b) Architecture Board, (3c) Architecture Compliance. (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level.
NASA's Space Launch System: Moving Toward the Launch Pad
NASA Technical Reports Server (NTRS)
Creech, Stephen D.; May, Todd A.
2013-01-01
The National Aeronautics and Space Administration's (NASA's) Space Launch System (SLS) Program, managed at the Marshall Space Flight Center (MSFC), is making progress toward delivering a new capability for human space flight and scientific missions beyond Earth orbit. Designed with the goals of safety, affordability, and sustainability in mind, the SLS rocket will launch the Orion Multi-Purpose Crew Vehicle (MPCV), equipment, supplies, and major science missions for exploration and discovery. Supporting Orion's first autonomous flight to lunar orbit and back in 2017 and its first crewed flight in 2021, the SLS will evolve into the most powerful launch vehicle ever flown via an upgrade approach that will provide building blocks for future space exploration. NASA is working to deliver this new capability in an austere economic climate, a fact that has inspired the SLS team to find innovative solutions to the challenges of designing, developing, fielding, and operating the largest rocket in history. This paper will summarize the planned capabilities of the vehicle, the progress the SLS Program has made in the 2 years since the Agency formally announced its architecture in September 2011, the path it is following to reach the launch pad in 2017 and then to evolve the 70 metric ton (t) initial lift capability to 130-t lift capability after 2021. The paper will explain how, to meet the challenge of a flat funding curve, an architecture was chosen that combines the use and enhancement of legacy systems and technology with strategic new developments that will evolve the launch vehicle's capabilities. This approach reduces the time and cost of delivering the initial 70 t Block 1 vehicle, and reduces the number of parallel development investments required to deliver the evolved 130 t Block 2 vehicle. 
The paper will outline the milestones the program has already reached, from developmental milestones such as the manufacture of the first flight hardware, to life-cycle milestones such as the vehicle's Preliminary Design Review (PDR). The paper will also discuss the remaining challenges both in delivering the 70-t vehicle and in evolving its capabilities to the 130-t vehicle, and how NASA plans to accomplish these goals. As this paper will explain, SLS is making measurable progress toward becoming a global infrastructure asset for robotic and human scouts of all nations by harnessing business and technological innovations to deliver sustainable solutions for space exploration.
Back to the future: virtualization of the computing environment at the W. M. Keck Observatory
NASA Astrophysics Data System (ADS)
McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.
2014-07-01
Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch-free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost-savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.
Evolutionary genomics of LysM genes in land plants.
Zhang, Xue-Cheng; Cannon, Steven B; Stacey, Gary
2009-08-03
The ubiquitous LysM motif recognizes peptidoglycan, chitooligosaccharides (chitin) and, presumably, other structurally-related oligosaccharides. LysM-containing proteins were first shown to be involved in bacterial cell wall degradation and, more recently, were implicated in perceiving chitin (one of the established pathogen-associated molecular patterns) and lipo-chitin (nodulation factors) in flowering plants. However, the majority of LysM genes in plants remain functionally uncharacterized and the evolutionary history of complex LysM genes remains elusive. We show that LysM-containing proteins display a wide range of complex domain architectures. However, only a simple core architecture is conserved across kingdoms. Each individual kingdom appears to have evolved a distinct array of domain architectures. We show that early plant lineages acquired four characteristic architectures and progressively lost several primitive architectures. We report plant LysM phylogenies and associated gene, protein and genomic features, and infer the relative timing of duplications of LYK genes. We report a domain architecture catalogue of LysM proteins across all kingdoms. The unique pattern of LysM protein domain architectures indicates the presence of distinctive evolutionary paths in individual kingdoms. We describe a comparative and evolutionary genomics study of LysM genes in the plant kingdom. One of the two groups of tandemly arrayed plant LYK genes likely resulted from an ancient genome duplication followed by local genomic rearrangement, while the origin of the other groups of tandemly arrayed LYK genes remains obscure. Given the fact that no animal LysM motif-containing genes have been functionally characterized, this study provides clues to functional characterization of plant LysM genes and is also informative with regard to evolutionary and functional studies of animal LysM genes.
NASA Technical Reports Server (NTRS)
Watzin, James G.; Burt, Joseph; Tooley, Craig
2004-01-01
The Vision for Space Exploration calls for undertaking lunar exploration activities to enable sustained human and robotic exploration of Mars and beyond, including more distant destinations in the solar system. In support of this vision, the Robotic Lunar Exploration Program (RLEP) is expected to execute a series of robotic missions to the Moon, starting in 2008, in order to pave the way for further human space exploration. This paper will give an introduction to the RLEP program office, its role and its goals, and the approach it is taking to executing the charter of the program. The paper will also discuss candidate architectures that are being studied as a framework for defining the RLEP missions and the context in which they will evolve.
Using Real and Simulated TNOs to Constrain the Outer Solar System
NASA Astrophysics Data System (ADS)
Kaib, Nathan
2018-04-01
Over the past 2-3 decades our understanding of the outer solar system’s history and current state has evolved dramatically. An explosion in the number of detected trans-Neptunian objects (TNOs) coupled with simultaneous advances in numerical models of orbital dynamics has driven this rapid evolution. However, successfully constraining the orbital architecture and evolution of the outer solar system requires accurately comparing simulation results with observational datasets. This process is challenging because observed datasets are influenced by orbital discovery biases as well as TNO size and albedo distributions. Meanwhile, such influences are generally absent from numerical results. Here I will review recent work I and others have undertaken using numerical simulations in concert with catalogs of observed TNOs to constrain the outer solar system’s current orbital architecture and past evolution.
Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi
1989-01-01
Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing to perform for a high level application (e.g., object recognition). An IVS normally involves algorithms from low level, intermediate level, and high level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues are addressed in parallel architectures and parallel algorithms for integrated vision systems.
Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...
2015-07-13
Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.
The K9 On-Board Rover Architecture
NASA Technical Reports Server (NTRS)
Bresina, John L.; Bualat, Maria; Fair, Michael; Washington, Richard; Wright, Anne
2006-01-01
This paper describes the software architecture of NASA Ames Research Center's K9 rover. The goal of the onboard software architecture team was to develop a modular, flexible framework that would allow both high- and low-level control of the K9 hardware. Examples of low-level control are the simple drive or pan/tilt commands which are handled by the resource managers, and examples of high-level control are the command sequences which are handled by the conditional executive. In between these two control levels are complex behavioral commands which are handled by the pilot, such as drive to goal with obstacle avoidance or visually servo to a target. This paper presents the design of the architecture as of Fall 2000. We describe the state of the architecture implementation as well as its current evolution. An early version of the architecture was used for K9 operations during a dual-rover field experiment conducted by NASA Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) from May 14 to May 16, 2000.
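The three control levels described above can be sketched as a layered composition. The class and method names here are ours for illustration, not the actual K9 interfaces; a real pilot would interleave obstacle avoidance and sensing rather than blindly stepping.

```python
# Illustrative sketch of the layering (names are ours, not the K9 API):
# resource managers expose simple commands, the pilot composes them into
# behavioral commands, and the executive runs command sequences.
class DriveManager:                       # low level: simple resource commands
    def drive(self, meters):
        return f"drove {meters} m"

class Pilot:                              # middle level: behavioral commands
    def __init__(self, drive_mgr):
        self.drive_mgr = drive_mgr

    def drive_to_goal(self, distance):
        # A real pilot would interleave obstacle avoidance; we just step.
        return [self.drive_mgr.drive(1) for _ in range(distance)]

class Executive:                          # high level: command sequences
    def run(self, pilot, plan):
        return [step for goal in plan for step in pilot.drive_to_goal(goal)]

log = Executive().run(Pilot(DriveManager()), [2, 1])
print(log)  # ['drove 1 m', 'drove 1 m', 'drove 1 m']
```

Keeping each level behind its own interface is what lets the executive stay unchanged when a new behavior is added to the pilot or a new device to the resource managers.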
Evolution of Perianth and Stamen Characteristics with Respect to Floral Symmetry in Ranunculales
Damerval, Catherine; Nadot, Sophie
2007-01-01
Background and Aims Floral symmetry presents two main states in angiosperms, namely polysymmetry and monosymmetry. Monosymmetry is thought to have evolved several times independently from polysymmetry, possibly in co-adaptation with specialized pollinators. Monosymmetry commonly refers to the perianth, even though associated androecium modifications have been reported. The evolution of perianth symmetry is examined with respect to traits of flower architecture in the Ranunculales, the sister group to all other eudicots, which present a large diversity of floral forms. Methods Characters considered were perianth merism, calyx, corolla and androecium symmetry, number of stamens and spurs. Character evolution was optimized on a composite phylogenetic tree of Ranunculales using maximum parsimony. Key Results The ancestral state for merism could not be inferred because the basalmost Eupteleaceae lack a perianth and have a variable number of stamens. The Papaveraceae are dimerous, and the five other families share a common trimerous ancestor. Shifts from trimery to dimery (or the reverse) are observed. Pentamery evolved in Ranunculaceae. Ranunculales, except Eupteleaceae, present a polysymmetric ancestral state. Monosymmetry evolved once within Papaveraceae, Ranunculaceae and Menispermaceae (female flowers only). Oligandry is the ancestral state for all Ranunculales, and polyandry evolved several times independently, in Papaveraceae, Menispermaceae, Berberidaceae and Ranunculaceae, with two reversions to oligandry in the latter. The ancestral state for androecium symmetry is ambiguous for the Ranunculales, while polysymmetry evolved immediately after the divergence of Eupteleaceae. A disymmetric androecium evolved in Papaveraceae. The ancestral state for spurs is none. Multiple spurs evolved in Papaveraceae, Berberidaceae and Ranunculaceae, and single spurs occur in Papaveraceae and Ranunculaceae.
Conclusions The evolution of symmetry appears disconnected from changes in merism and stamen number, although monosymmetry never evolved in the context of an open ground plan. In bisexual species, monosymmetry evolved coincidently with single spurs, allowing us to propose an evolutionary scenario for Papaveraceae. PMID:17428835
1996-04-30
CJCS Chairman of the Joint Chiefs of Staff; CMP Configuration Management Plan; COTS Commercial-off-the-Shelf; DA Data Administrator; DASD (IM) Deputy...Staff (CJCS) representing the unified combatant commands. "Technical: The system can evolve (migrate) to be supported by the integrated, standards...s) (PSAs), or CJCS, having functional responsibility for the missions and functions supported by the system, with the participation of affected DoD
2002-09-01
...initiatives. The federal government has 55 databases that deal with security threats, but inter-agency access depends on establishing agreements through...which that information can be shared. True cooperation also will require government-wide commitment to enterprise architecture, integrated
Automation and robotics for Space Station in the twenty-first century
NASA Technical Reports Server (NTRS)
Willshire, K. F.; Pivirotto, D. L.
1986-01-01
Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.
Future Directions for Astronomical Image Display
NASA Technical Reports Server (NTRS)
Mandel, Eric
2000-01-01
In the "Future Directions for Astronomical Image Display" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible, cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.
Ross, Emma
2017-05-26
Management, coordination and logistics were critical for responding effectively to the Ebola outbreak in Sierra Leone, and the duration of the epidemic provided a rare opportunity to study the management of an outbreak that endured long enough for the response to mature. This qualitative study examines the structures and systems used to manage the response, and how and why they changed and evolved. It also discusses the quality of relationships between key responders and their impact. Early coordination mechanisms failed and the President took operational control away from the Ministry of Health and Sanitation and established a National Ebola Response Centre, headed by the Minister of Defence, and District Ebola Response Centres. British civilian and military personnel were deeply embedded in this command and control architecture and, together with the United Nations Mission for Ebola Emergency Response lead, were the dominant coordination partners at the national level. Coordination, politics and tensions in relationships hampered the response, but as the response mechanisms matured, coordination improved and rifts healed. Simultaneously setting up new organizations, processes and plans as well as attempting to reconcile different cultures, working practices and personalities in such an emergency was bound to be challenging.This article is part of the themed issue 'The 2013-2016 West African Ebola epidemic: data, decision-making and disease control'. © 2017 The Author(s).
Page, Robert E; Scheiner, Ricarda; Erber, Joachim; Amdam, Gro V
2006-01-01
How does complex social behavior evolve? What are the developmental building blocks of division of labor and specialization, the hallmarks of insect societies? Studies have revealed the developmental origins in the evolution of division of labor and specialization in foraging worker honeybees, the hallmarks of complex insect societies. Selective breeding for a single social trait, the amount of surplus pollen stored in the nest (pollen hoarding), revealed a phenotypic architecture of correlated traits at multiple levels of biological organization in facultatively sterile female worker honeybees. Verification of this phenotypic architecture in "wild-type" bees provided strong support for a "pollen foraging syndrome" that involves increased sensorimotor responses, motor activity, associative learning, reproductive status, and rates of behavioral development, as well as foraging behavior. This set of traits guided further research into reproductive regulatory systems that were co-opted by natural selection during the evolution of social behavior. Division of labor, characterized by changes in the tasks performed by bees as they age, is controlled by hormones linked to ovary development. Foraging specialization on nectar and pollen also results from different reproductive states of bees: nectar foragers engage in pre-reproductive behavior, foraging for nectar for self-maintenance, while pollen foragers perform foraging tasks associated with reproduction and maternal care, collecting protein.
Paccard, Antoine; Van Buskirk, Josh; Willi, Yvonne
2016-05-01
Species distribution limits are hypothesized to be caused by small population size and limited genetic variation in ecologically relevant traits, but earlier studies have not evaluated genetic variation in multivariate phenotypes. We asked whether populations at the latitudinal edges of the distribution have altered quantitative genetic architecture of ecologically relevant traits compared with midlatitude populations. We calculated measures of evolutionary potential in nine Arabidopsis lyrata populations spanning the latitudinal range of the species in eastern and midwestern North America. Environments at the latitudinal extremes have reduced water availability, and therefore plants were assessed under wet and dry treatments. We estimated genetic variance-covariance (G-) matrices for 10 traits related to size, development, and water balance. Populations at southern and northern distribution edges had reduced levels of genetic variation across traits, but their G-matrices were more spherical; G-matrix orientation was unrelated to latitude. As a consequence, the predicted short-term response to selection was at least as strong in edge populations as in central populations. These results are consistent with genetic drift eroding variation and reducing the effectiveness of correlational selection at distribution margins. We conclude that genetic variation of isolated traits poorly predicts the capacity to evolve in response to multivariate selection and that the response to selection may frequently be greater than expected at species distribution margins because of genetic drift.
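The predicted short-term response to selection discussed above follows the multivariate breeder's equation, R = Gβ: the response depends on the whole G-matrix, not on single-trait variances in isolation. A minimal numerical sketch (the G-matrices and selection gradient below are hypothetical illustrations, not estimates from the A. lyrata study):

```python
import numpy as np

# Hypothetical 3-trait G-matrix for a "central" population:
# high genetic variances with strong trait correlations.
G_central = np.array([[1.0, 0.6, 0.3],
                      [0.6, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])

# Hypothetical "edge" population: less variance overall,
# but a more spherical (less correlated) matrix.
G_edge = np.array([[0.6, 0.1, 0.05],
                   [0.1, 0.6, 0.1],
                   [0.05, 0.1, 0.6]])

beta = np.array([0.5, -0.2, 0.3])  # illustrative selection gradient

# Predicted per-generation response vector: R = G @ beta
R_central = G_central @ beta
R_edge = G_edge @ beta

def response_along_beta(G, beta):
    """Magnitude of the predicted response projected onto the
    direction of selection (beta' G beta / |beta|)."""
    return beta @ G @ beta / np.linalg.norm(beta)

print(R_central, R_edge)
print(response_along_beta(G_central, beta),
      response_along_beta(G_edge, beta))
```

With these toy numbers the edge population, despite lower total variance, loses less of its response to unfavourable trait correlations, so its response along β remains a sizeable fraction of the central population's — the qualitative pattern the study describes.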
Innovative fiber-laser architecture-based compact wind lidar
NASA Astrophysics Data System (ADS)
Prasad, Narasimha S.; Tracy, Allen; Vetorino, Steve; Higgins, Richard; Sibell, Russ
2016-03-01
This paper describes an innovative, compact and eyesafe coherent lidar system developed for use in wind and wake vortex sensing applications. This advanced lidar system is field ruggedized with reduced size, weight, and power consumption (SWaP), configured based on an all-fiber and modular architecture. The all-fiber architecture is developed using a fiber seed laser that is coupled to uniquely configured fiber amplifier modules and associated photonic elements, including an integrated 3D scanner. The scanner provides user-programmable, continuous 360 degree azimuth and 180 degree elevation scan angles. The system architecture eliminates free-space beam alignment issues and allows plug and play operation using graphical user interface software modules. Besides its all-fiber architecture, the lidar system also provides pulsewidth agility to aid in improving range resolution. Operating at 1.54 microns and with a PRF of up to 20 kHz, the wind lidar is air cooled with overall dimensions of 30" x 46" x 60" and is designed as a Class 1 system. This lidar is capable of measuring wind velocities greater than 120 +/- 0.2 m/s over ranges greater than 10 km and with a range resolution of less than 15 m. This compact and modular system is anticipated to provide mobility, reliability, and ease of field deployment for wind and wake vortex measurements. The current lidar architecture is amenable to trace gas sensing and as such is being evolved for airborne and space-based platforms. In this paper, the key features of the wind lidar instrumentation and its functionality are discussed, followed by results of recent wind forecast measurements on a wind farm.
SAMS--a systems architecture for developing intelligent health information systems.
Yılmaz, Özgün; Erdur, Rıza Cenk; Türksever, Mustafa
2013-12-01
In this paper, SAMS, a novel health information system architecture for developing intelligent health information systems, is proposed, and some strategies for developing such systems are discussed. Systems fulfilling this architecture will be able to store electronic health records of patients using OWL ontologies, share patient records among different hospitals and provide physicians with expertise to assist them in making decisions. The system is intelligent because it is rule-based, makes use of rule-based reasoning and has the ability to learn and evolve itself. The learning capability is provided by extracting rules from decisions previously given by physicians and then adding the extracted rules to the system. The proposed system is novel and original in all of these aspects. As a case study, a system conforming to the SAMS architecture was implemented for use by dentists in the dental domain. The use of the developed system is described with a scenario. For evaluation, the developed dental information system will be used and tried by a group of dentists. The development of this system demonstrates the applicability of the SAMS architecture. By getting decision support from a system derived from this architecture, the cognitive gap between experienced and inexperienced physicians can be compensated for. Thus, patient satisfaction can be achieved, inexperienced physicians are supported in decision making and the personnel can improve their knowledge. A physician can diagnose a case which he/she has never diagnosed before using this system. With the help of this system, it will be possible to store general domain knowledge in the system, reducing the personnel's need for medical guideline documents.
Critical Review of NOAA's Observation Requirements Process
NASA Astrophysics Data System (ADS)
LaJoie, M.; Yapur, M.; Vo, T.; Templeton, A.; Bludis, D.
2017-12-01
NOAA's Observing Systems Council (NOSC) maintains a comprehensive database of user observation requirements. The requirements collection process engages NOAA subject matter experts to document and effectively communicate the specific environmental observation measurements (parameters and attributes) needed to produce operational products and pursue research objectives. Documenting user observation requirements in a structured and standardized framework enables NOAA to assess its needs across organizational lines in an impartial, objective, and transparent manner. This structure provides the foundation for: selecting, designing, developing, and acquiring observing technologies, systems and architectures; budget and contract formulation and decision-making; and assessing in a repeatable fashion the productivity, efficiency and optimization of NOAA's observing system enterprise. User observation requirements are captured independently from observing technologies; therefore, they can be addressed by a variety of current or expected observing capabilities and can be remapped flexibly to new and evolving technologies. NOAA's current inventory of user observation requirements was collected over a ten-year period, during which there have been many changes in policies, mission priorities, and funding levels. In light of these changes, the NOSC initiated a critical, in-depth review during 2017 to examine all aspects of user observation requirements and associated processes. This presentation provides background on the NOAA requirements process, major milestones and outcomes of the critical review, and plans for evolving and connecting observing requirements processes in the next year.
Uncovering the genetic signature of quantitative trait evolution with replicated time series data.
Franssen, S U; Kofler, R; Schlötterer, C
2017-01-01
The genetic architecture of adaptation in natural populations has not yet been resolved: it is not clear to what extent the spread of beneficial mutations (selective sweeps) or the response of many quantitative trait loci drives adaptation to environmental changes. Although much attention has been given to the genomic footprint of selective sweeps, the importance of selection on quantitative traits is still not well studied, as the associated genomic signature is extremely difficult to detect. We propose 'Evolve and Resequence' as a promising tool to study polygenic adaptation of quantitative traits in evolving populations. Simulating replicated time series data, we show that adaptation to a new intermediate trait optimum has three characteristic phases that are reflected on the genomic level: (1) directional frequency changes towards the new trait optimum, (2) plateauing of allele frequencies when the new trait optimum has been reached and (3) subsequent divergence between replicated trajectories ultimately leading to the loss or fixation of alleles while the trait value does not change. We explore these three phases across the relevant population genetic parameters to provide expectations for various experimental evolution designs. Remarkably, over a broad range of parameters the trajectories of selected alleles display a pattern across replicates which differs both from neutrality and from directional selection. We conclude that replicated time series data from experimental evolution studies provide a promising framework to study polygenic adaptation from whole-genome population genetics data.
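The three phases described above can be reproduced with a toy replicated simulation: additive loci under Gaussian stabilizing selection around a shifted trait optimum, plus binomial (Wright-Fisher) drift. This is an illustrative sketch, not the authors' simulation code; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_replicate(n_loci=20, pop=1000, gens=200, opt=5.0, w=2.0):
    """One replicate: frequencies of equal-effect additive loci evolving
    toward a shifted trait optimum under Gaussian stabilizing selection
    (toy deterministic response plus binomial drift)."""
    p = np.full(n_loci, 0.1)           # starting allele frequencies
    effects = np.ones(n_loci)          # equal additive effects
    traj = [p.copy()]
    for _ in range(gens):
        z = 2 * p @ effects            # mean trait value (diploid)
        beta = (opt - z) / w**2        # directional selection gradient
        # deterministic frequency change, then binomial drift
        p = np.clip(p + beta * effects * p * (1 - p), 0, 1)
        p = rng.binomial(2 * pop, p) / (2 * pop)
        traj.append(p.copy())
    return np.array(traj)

# Replicated trajectories: phase 1 = directional shift toward the
# optimum, phase 2 = plateau once it is reached, phase 3 = divergence
# among replicates by drift while the trait value stays put.
reps = [simulate_replicate() for _ in range(3)]
```

Plotting each replicate's per-locus trajectories (e.g. with matplotlib) shows the initial parallel rise, the plateau, and the replicate-specific losses and fixations that make the polygenic signature hard to detect from a single time series.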
Assuring SS7 dependability: A robustness characterization of signaling network elements
NASA Astrophysics Data System (ADS)
Karmarkar, Vikram V.
1994-04-01
Current and evolving telecommunication services will rely on signaling network performance and reliability properties to build competitive call and connection control mechanisms under increasing demands on flexibility without compromising on quality. The dimensions of signaling dependability most often evaluated are the Rate of Call Loss and End-to-End Route Unavailability. A third dimension of dependability that captures the concern about large or catastrophic failures can be termed Network Robustness. This paper is concerned with the dependability aspects of the evolving Signaling System No. 7 (SS7) networks and attempts to strike a balance between the probabilistic and deterministic measures that must be evaluated to accomplish a risk-trend assessment to drive architecture decisions. Starting with high-level network dependability objectives and field experience with SS7 in the U.S., potential areas of growing stringency in network element (NE) dependability are identified to improve against current measures of SS7 network quality, as per-call signaling interactions increase. A sensitivity analysis is presented to highlight the impact due to imperfect coverage of duplex network component or element failures (i.e., correlated failures), to assist in the setting of requirements on NE robustness. A benefit analysis, covering several dimensions of dependability, is used to generate the domain of solutions available to the network architect in terms of network and network element fault tolerance that may be specified to meet the desired signaling quality goals.
Simultaneous Transmit and Receive Performance of an 8-channel Digital Phased Array
2017-01-16
Lincoln Laboratory Lexington, Massachusetts, USA Abstract—The Aperture-Level Simultaneous Transmit and Receive (ALSTAR) architecture enables extremely...In [1], the Aperture-Level Simultaneous Transmit and Receive (ALSTAR) architecture was proposed for achieving STAR using a fully digital phased array...Aperture-Level Simultaneous Transmit and Receive (ALSTAR) architecture enables STAR functionality in a digital phased array without the use of specialized
Frances: A Tool for Understanding Computer Architecture and Assembly Language
ERIC Educational Resources Information Center
Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh
2012-01-01
Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…
Architecture is Elementary: Visual Thinking through Architectural Concepts.
ERIC Educational Resources Information Center
Winters, Nathan B.
This book presents very basic but important concepts about architecture and outlines some of the most important concepts used by great architects. These concepts are taught at levels of perceptual maturity applicable to adults and children alike and progress from levels one through seven as the concepts become progressively intertwined. The…
Digital Historic Urban Landscape Methodology for Heritage Impact Assessment of Singapore
NASA Astrophysics Data System (ADS)
Widodo, J.; Wong, Y. C.; Ismail, F.
2017-08-01
Using the case study of Singapore's existing heritage websites, this research probes the circumstances of the emerging technology and practice of consuming heritage architecture on a digital platform. Despite the diverse objectives, technology is assumed to help deliver greater interpretation through the use of new and high technology emphasising experience and providing visual fidelity. However, success has been limited, as technology alone is insufficient to present the past from multiple perspectives. Existing projects currently provide linear narratives developed through a top-down approach that treats end-users as individual entities and limits heritage to a consumable product. Through this research, we hope to uncover a better experience of digital heritage architecture in which interpretation is an evolving, participatory and contributory process that allows public participation and, together with effective presentation, cultural learning and embodiment, enhances the end-users' interpretation of digital heritage architecture. Additionally, this research seeks to establish an inventory in the form of a digital platform that adopts the Historic Urban Landscape (HUL) approach in the Singapore context to deepen the public's understanding of architectural as well as cultural heritage through an intercultural and intergenerational dialogue. Through HUL, this research hopes to better shape conservation strategies and urban planning.
Exploration Space Suit Architecture: Destination Environmental-Based Technology Development
NASA Technical Reports Server (NTRS)
Hill, Terry R.
2010-01-01
This paper picks up where EVA Space Suit Architecture: Low Earth Orbit Vs. Moon Vs. Mars (Hill, Johnson, IEEEAC paper #1209) left off in the development of a space suit architecture that is modular in design and interfaces and could be reconfigured to meet the mission, or during any given mission depending on the tasks or destination. This paper will walk through the continued development of a space suit system architecture and how it should evolve to meet the future exploration EVA needs of the United States space program. In looking forward to future US space exploration and determining how the work performed to date in the CxP would map to a future space suit architecture with maximum re-use of technology and functionality, a series of thought exercises and analyses has provided a strong indication that the CxP space suit architecture is well postured to provide a viable solution for future exploration missions. Through the destination environmental analysis that is presented in this paper, the modular architecture approach provides the lowest mass and lowest mission cost for the protection of the crew for any human mission outside of low Earth orbit. Some of the studies presented here provide a look at, and validation of, the non-environmental design drivers that will become ever-increasingly important the further away from Earth humans venture and the longer they are away. Additionally, the analysis demonstrates a logical clustering of design environments that allows a very focused approach to technology prioritization, development and design that will maximize the return on investment independent of any particular program and provide architecture and design solutions for space suit systems in time or ahead of being required for any particular manned flight program in the future.
The new approach to space suit design and interface definition the discussion will show how the architecture is very adaptable to programmatic and funding changes with minimal redesign effort required such that the modular architecture can be quickly and efficiently honed into a specific mission point solution if required.
Recent Developments in Hardware-in-the-Loop Formation Navigation and Control
NASA Technical Reports Server (NTRS)
Mitchell, Jason W.; Luquette, Richard J.
2005-01-01
The Formation Flying Test-Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) provides a hardware-in-the-loop test environment for formation navigation and control. The facility is evolving as a modular, hybrid, dynamic simulation facility for end-to-end guidance, navigation, and control (GN&C) design and analysis of formation flying spacecraft. The core capabilities of the FFTB, as a platform for testing critical hardware and software algorithms in-the-loop, are reviewed with a focus on many recent improvements. Two significant upgrades to the FFTB are a message-oriented middleware (MOM) architecture, and a software crosslink for inter-spacecraft ranging. The MOM architecture provides a common messaging bus for software agents, easing integration, and supporting the GSFC Mission Services Evolution Center (GMSEC) architecture via a software bridge. Additionally, the FFTB's hardware capabilities are expanding. Recently, two Low-Power Transceivers (LPTs) with ranging capability have been introduced into the FFTB. The LPT crosslinks will be connected to a modified Crosslink Channel Simulator (CCS), which applies realistic space-environment effects to the Radio Frequency (RF) signals produced by the LPTs.
Predicting Instability Timescales in Closely-Packed Planetary Systems
NASA Astrophysics Data System (ADS)
Tamayo, Daniel; Hadden, Samuel; Hussain, Naireen; Silburt, Ari; Gilbertson, Christian; Rein, Hanno; Menou, Kristen
2018-04-01
Many of the multi-planet systems discovered around other stars are maximally packed. This implies that simulations with masses or orbital parameters too far from the actual values will destabilize on short timescales; thus, long-term dynamics allows one to constrain the orbital architectures of many closely packed multi-planet systems. A central challenge in such efforts is the large computational cost of N-body simulations, which preclude a full survey of the high-dimensional parameter space of orbital architectures allowed by observations. I will present our recent successes in training machine learning models capable of reliably predicting orbital stability a million times faster than N-body simulations. By engineering dynamically relevant features that we feed to a gradient-boosted decision tree algorithm (XGBoost), we are able to achieve a precision and recall of 90% on a holdout test set of N-body simulations. This opens a wide discovery space for characterizing new exoplanet discoveries and for elucidating how orbital architectures evolve through time as the next generation of spaceborne exoplanet surveys prepare for launch this year.
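As a rough illustration of the approach, the sketch below trains a gradient-boosted classifier on synthetic "engineered features" (a separation in mutual Hill radii and an eccentricity proxy) with a toy stability rule. It uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost, and neither the features nor the data correspond to the authors' N-body training set.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Synthetic stand-in for dynamically relevant features.
n = 2000
hill_sep = rng.uniform(2, 20, n)    # planet spacing in mutual Hill radii
ecc = rng.uniform(0, 0.3, n)        # eccentricity proxy
noise = rng.normal(0, 0.5, n)

# Toy labeling rule: widely spaced, low-eccentricity systems are stable.
stable = (hill_sep - 30 * ecc + noise > 5).astype(int)

X = np.column_stack([hill_sep, ecc])
X_tr, X_te, y_tr, y_te = train_test_split(X, stable, random_state=0)

# Gradient-boosted decision trees: fast surrogate for N-body stability.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(precision_score(y_te, pred), recall_score(y_te, pred))
```

Once trained, evaluating the surrogate on a candidate orbital architecture costs microseconds rather than the CPU-hours of an N-body integration, which is what makes a full survey of the allowed parameter space tractable.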
Performance Assessment of the Exploration Water Recovery System
NASA Technical Reports Server (NTRS)
Carter, D. Layne; Tabb, David; Perry, Jay
2008-01-01
A new water recovery system architecture designed to fulfill the National Aeronautics and Space Administration's (NASA) Space Exploration Policy has been tested at the Marshall Space Flight Center (MSFC). This water recovery system architecture evolved from the current state-of-the-art system developed for the International Space Station (ISS). Through novel integration of proven technologies for air and water purification, this system promises to improve on existing system optimization. The novel aspect of the system is twofold. First, volatile organic compounds (VOCs) are removed from the cabin air via catalytic oxidation in the vapor phase, prior to their absorption into the aqueous phase. Second, vapor compression distillation (VCD) technology processes the condensate and hygiene waste streams in addition to the urine waste stream. Oxidation kinetics dictate that removing VOCs from the vapor phase is more efficient. Treating the various waste streams by VCD reduces the load on the expendable ion exchange and adsorption media that follow, as well as on the aqueous-phase catalytic oxidation process further downstream. This paper documents the results of testing this new architecture.
Architecture and material properties of diatom shells provide effective mechanical protection
NASA Astrophysics Data System (ADS)
Hamm, Christian E.; Merkel, Rudolf; Springer, Olaf; Jurkojc, Piotr; Maier, Christian; Prechtel, Kathrin; Smetacek, Victor
2003-02-01
Diatoms are the major contributors to phytoplankton blooms in lakes and in the sea and hence are central in aquatic ecosystems and the global carbon cycle. All free-living diatoms differ from other phytoplankton groups in having silicified cell walls in the form of two `shells' (the frustule) of manifold shape and intricate architecture whose function and role, if any, in contributing to the evolutionary success of diatoms is under debate. We explored the defence potential of the frustules as armour against predators by measuring their strength. Real and virtual loading tests (using calibrated glass microneedles and finite element analysis) were performed on centric and pennate diatom cells. Here we show that the frustules are remarkably strong by virtue of their architecture and the material properties of the diatom silica. We conclude that diatom frustules have evolved as mechanical protection for the cells because exceptional force is required to break them. The evolutionary arms race between diatoms and their specialized predators will have had considerable influence in structuring pelagic food webs and biogeochemical cycles.
System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures
NASA Technical Reports Server (NTRS)
Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger
2007-01-01
This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and use as a realistic test bed for Exploration automation technology research and development.
Autonomous control systems - Architecture and fundamental issues
NASA Technical Reports Server (NTRS)
Antsaklis, P. J.; Passino, K. M.; Wang, S. J.
1988-01-01
A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system and it allows interaction with the pilot and crew/ground station, and the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement and that intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the 'intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).
Sustaining Human Presence on Mars Using ISRU and a Reusable Lander
NASA Technical Reports Server (NTRS)
Arney, Dale C.; Jones, Christopher A.; Klovstad, Jordan J.; Komar, D.R.; Earle, Kevin; Moses, Robert; Shyface, Hilary R.
2015-01-01
This paper presents an analysis of the impact of ISRU (In-Situ Resource Utilization), reusability, and automation on sustaining a human presence on Mars, which requires a transition from Earth dependence to Earth independence. The study analyzes the surface and transportation architectures and compares campaigns, revealing the importance of ISRU and reusability. A reusable Mars lander, Hercules, eliminates the need to deliver a new descent and ascent stage with each cargo and crew delivery to Mars, reducing the mass delivered from Earth. As part of an evolvable transportation architecture, this investment is key to enabling continuous human presence on Mars. The extensive use of ISRU reduces the logistics supply chain from Earth in order to support population growth at Mars. Reliable and autonomous systems, in conjunction with robotics, are required to enable ISRU architectures, as systems must operate and maintain themselves while the crew is not present. A comparison of Mars campaigns is presented to show the impact of adding these investments and their ability to contribute to sustaining a human presence on Mars.
Genetic basis of sexual dimorphism in the threespine stickleback Gasterosteus aculeatus
Leinonen, T; Cano, J M; Merilä, J
2011-01-01
Sexual dimorphism (SD) in morphological, behavioural and physiological features is common, but the genetics of SD in the wild has seldom been studied in detail. We investigated the genetic basis of SD in morphological traits of threespine stickleback (Gasterosteus aculeatus) by conducting a large breeding experiment with fish from an ancestral marine population that acts as a source of morphological variation. We also examined the patterns of SD in a set of 38 wild populations from different habitats to investigate the relationship between the genetic architecture of SD of the marine ancestral population in relation to variation within and among natural populations. The results show that genetic architecture in terms of heritabilities, additive genetic variances and covariances (as well as correlations) is very similar in the two sexes in spite of the fact that many of the traits express significant SD. Furthermore, population differences in threespine stickleback body shape and armour SD appear to have evolved despite constraints imposed by genetic architecture. This implies that constraints for the evolution of SD imposed by strong genetic correlations are not as severe and absolute as commonly thought. PMID:20700139
Advanced control architecture for autonomous vehicles
NASA Astrophysics Data System (ADS)
Maurer, Markus; Dickmanns, Ernst D.
1997-06-01
An advanced control architecture for autonomous vehicles is presented. The hierarchical architecture consists of four levels: a vehicle level, a control level, a rule-based level and a knowledge-based level. A special focus is on forms of internal representation, which have to be chosen adequately for each level. The control scheme is applied to VaMP, a Mercedes passenger car which autonomously performs missions on German freeways. VaMP perceives the environment with its sense of vision and conventional sensors. It controls its actuators for locomotion and attention focusing. Modules for perception, cognition and action are discussed.
Wu, Xinru; Tang, Ding; Li, Ming; Wang, Kejian; Cheng, Zhukuan
2013-01-01
Tiller angle and leaf angle are two important components of rice (Oryza sativa) plant architecture that play a crucial role in determining grain yield. Here, we report the cloning and characterization of the Loose Plant Architecture1 (LPA1) gene in rice, the functional ortholog of the AtIDD15/SHOOT GRAVITROPISM5 (SGR5) gene in Arabidopsis (Arabidopsis thaliana). LPA1 regulates tiller angle and leaf angle by controlling the adaxial growth of tiller node and lamina joint. LPA1 was also found to affect shoot gravitropism. Expression pattern analysis suggested that LPA1 influences plant architecture by affecting the gravitropism of leaf sheath pulvinus and lamina joint. However, LPA1 only influences gravity perception or signal transduction in coleoptile gravitropism by regulating the sedimentation rate of amyloplasts, distinct from the actions of LAZY1. LPA1 encodes a plant-specific INDETERMINATE DOMAIN protein and defines a novel subfamily of 28 INDETERMINATE DOMAIN proteins with several unique conserved features. LPA1 is localized in the nucleus and functions as an active transcriptional repressor, an activity mainly conferred by a conserved ethylene response factor-associated amphiphilic repression-like motif. Further analysis suggests that LPA1 participates in a complicated transcriptional and protein interaction network and has evolved novel functions distinct from SGR5. This study not only facilitates the understanding of gravitropism mechanisms but also generates a useful genetic material for rice breeding. PMID:23124325
A new flight control and management system architecture and configuration
NASA Astrophysics Data System (ADS)
Kong, Fan-e.; Chen, Zongji
2006-11-01
The advanced fighter should possess capabilities such as supersonic cruising, stealth, agility, STOVL (Short Take-Off Vertical Landing), and powerful communication and information processing. For this purpose, it is not enough to improve only the aerodynamic and propulsion systems; it is also necessary to enhance the control system. A complete flight control system provides not only autopilot, auto-throttle and control augmentation, but also management of the given mission. The F-22 and JSF possess outstanding flight control systems built on the Pave Pillar and Pave Pace avionics architectures, but their control architectures are not sufficiently integrated. The main purpose of this paper is to build a novel fighter control system architecture. A control system constructed on this architecture should be integrated, inexpensive, fault-tolerant, safe, reliable and effective, and it will take charge of both flight control and mission management. To this end, the paper proceeds as follows. First, based on human nervous control, a three-level hierarchical control architecture is proposed. At the top of the architecture, the decision level is in charge of decision-making; in the middle, the organization & coordination level schedules resources, monitors the states of the fighter, switches control modes, etc.; and at the bottom, the execution level holds the concrete drive and measurement functions. Then, according to their function and resources, all tasks involving flight control and mission management are assigned to individual levels. Finally, to validate the three-level architecture, a physical configuration is presented. The configuration is distributed and applies recent advances in the information technology industry such as line-replaceable modules and cluster technology.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
The evolutionary dynamics of haplodiploidy: Genome architecture and haploid viability
Blackmon, Heath; Hardy, Nate B.; Ross, Laura
2015-01-01
Haplodiploid reproduction, in which males are haploid and females are diploid, is widespread among animals, yet we understand little about the forces responsible for its evolution. The current theory is that haplodiploidy has evolved through genetic conflicts, as it provides a transmission advantage to mothers. Male viability is thought to be a major limiting factor; diploid individuals tend to harbor many recessive lethal mutations. This theory predicts that the evolution of haplodiploidy is more likely in male heterogametic lineages with few chromosomes, as genes on the X chromosome are often expressed in a haploid environment, and the fewer the chromosome number, the greater the proportion of the total genome that is X‐linked. We test this prediction with comparative phylogenetic analyses of mites, among which haplodiploidy has evolved repeatedly. We recover a negative correlation between chromosome number and haplodiploidy, find evidence that low chromosome number evolved prior to haplodiploidy, and that it is unlikely that diplodiploidy has reevolved from haplodiploid lineages of mites. These results are consistent with the predicted importance of haploid male viability. PMID:26462452
Neural codes of seeing architectural styles
Choo, Heeyoung; Nasar, Jack L.; Nikrahei, Bardia; Walther, Dirk B.
2017-01-01
Images of iconic buildings, such as the CN Tower, instantly transport us to specific places, such as Toronto. Despite the substantial impact of architectural design on people’s visual experience of built environments, we know little about its neural representation in the human brain. In the present study, we have found patterns of neural activity associated with specific architectural styles in several high-level visual brain regions, but not in primary visual cortex (V1). This finding suggests that the neural correlates of the visual perception of architectural styles stem from style-specific complex visual structure beyond the simple features computed in V1. Surprisingly, the network of brain regions representing architectural styles included the fusiform face area (FFA) in addition to several scene-selective regions. Hierarchical clustering of error patterns further revealed that the FFA participated to a much larger extent in the neural encoding of architectural styles than entry-level scene categories. We conclude that the FFA is involved in fine-grained neural encoding of scenes at a subordinate-level, in our case, architectural styles of buildings. This study for the first time shows how the human visual system encodes visual aspects of architecture, one of the predominant and longest-lasting artefacts of human culture. PMID:28071765
Greaves, Mel; Maley, Carlo C.
2012-01-01
Cancers evolve by a reiterative process of clonal expansion, genetic diversification and clonal selection within the adaptive landscapes of tissue ecosystems. The dynamics are complex with highly variable patterns of genetic diversity and resultant clonal architecture. Therapeutic intervention may decimate cancer clones, and erode their habitats, but inadvertently provides potent selective pressure for the expansion of resistant variants. The inherently Darwinian character of cancer lies at the heart of therapeutic failure but perhaps also holds the key to more effective control. PMID:22258609
Schadow, Gunther; Dhaval, Rakesh; McDonald, Clement J; Ragg, Susanne
2006-01-01
We present the architecture and approach of an evolving campus-wide information service for tissues with clinical and data annotations to be used and contributed to by clinical researchers across the campus. The services provided include specimen tracking, long term data storage, and computational analysis services. The project is conceived and sustained by collaboration among researchers on the campus as well as participation in standards organizations and national collaboratives.
Digital Waveguide Architectures for Virtual Musical Instruments
NASA Astrophysics Data System (ADS)
Smith, Julius O.
Digital sound synthesis has become a standard staple of modern music studios, videogames, personal computers, and hand-held devices. As processing power has increased over the years, sound synthesis implementations have evolved from dedicated chip sets, to single-chip solutions, and ultimately to software implementations within processors used primarily for other tasks (such as for graphics or general purpose computing). With the cost of implementation dropping closer and closer to zero, there is increasing room for higher quality algorithms.
Evolution of a designless nanoparticle network into reconfigurable Boolean logic
NASA Astrophysics Data System (ADS)
Bose, S. K.; Lawrence, C. P.; Liu, Z.; Makarenko, K. S.; van Damme, R. M. J.; Broersma, H. J.; van der Wiel, W. G.
2015-12-01
Natural computers exploit the emergent properties and massive parallelism of interconnected networks of locally active components. Evolution has resulted in systems that compute quickly and that use energy efficiently, utilizing whatever physical properties are exploitable. Man-made computers, on the other hand, are based on circuits of functional units that follow given design rules. Hence, physical processes that could potentially be exploited to solve a problem, such as capacitive crosstalk, are left out. Until now, designless nanoscale networks of inanimate matter that exhibit robust computational functionality had not been realized. Here we artificially evolve the electrical properties of a disordered nanomaterials system (by optimizing the values of control voltages using a genetic algorithm) to perform computational tasks reconfigurably. We exploit the rich behaviour that emerges from interconnected metal nanoparticles, which act as strongly nonlinear single-electron transistors, and find that this nanoscale architecture can be configured in situ into any Boolean logic gate. This universal, reconfigurable gate would require about ten transistors in a conventional circuit. Our system meets the criteria for the physical realization of (cellular) neural networks: universality (arbitrary Boolean functions), compactness, robustness and evolvability, which implies scalability to perform more advanced tasks. Our evolutionary approach works around device-to-device variations and the accompanying uncertainties in performance. Moreover, it bears a great potential for more energy-efficient computation, and for solving problems that are very hard to tackle in conventional architectures.
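The voltage-optimization step in the abstract above can be sketched as a genetic algorithm that tunes a vector of control voltages until a black-box device reproduces a target truth table. This is a minimal illustration only: the `device_output` surrogate below is an assumed toy nonlinearity standing in for the physical nanoparticle network, and the GA parameters are arbitrary.

```python
import random

TRUTH_TABLE = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}  # target: NAND

def device_output(inputs, voltages):
    # Toy nonlinear surrogate for the disordered network's response
    # (an assumption for illustration, not the real device physics).
    a, b = inputs
    s = sum(v * ((-1) ** i) * (1 + a * i - b * (i % 3))
            for i, v in enumerate(voltages))
    return 1 if s > 0 else 0

def fitness(voltages):
    # Number of truth-table rows the configured device gets right.
    return sum(device_output(io, voltages) == out
               for io, out in TRUTH_TABLE.items())

def evolve(pop_size=50, genes=6, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TRUTH_TABLE):
            break
        parents = pop[: pop_size // 2]  # elitist truncation selection
        children = [
            [(x + y) / 2 + rng.gauss(0, 0.1)          # blend crossover
             for x, y in zip(*rng.sample(parents, 2))]  # + Gaussian mutation
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the surrogate is linear in the voltages with input-dependent coefficients, a fully correct NAND configuration exists (e.g. `[1, 0, 1, 0, 0, 0.6]`), and elitism guarantees the best fitness never decreases across generations.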
Directed evolution of the TALE N-terminal domain for recognition of all 5′ bases
Lamb, Brian M.; Mercer, Andrew C.; Barbas, Carlos F.
2013-01-01
Transcription activator-like effector (TALE) proteins can be designed to bind virtually any DNA sequence. General guidelines for design of TALE DNA-binding domains suggest that the 5′-most base of the DNA sequence bound by the TALE (the N0 base) should be a thymine. We quantified the N0 requirement by analysis of the activities of TALE transcription factors (TALE-TF), TALE recombinases (TALE-R) and TALE nucleases (TALENs) with each DNA base at this position. In the absence of a 5′ T, we observed decreases in TALE activity up to >1000-fold in TALE-TF activity, up to 100-fold in TALE-R activity and up to 10-fold reduction in TALEN activity compared with target sequences containing a 5′ T. To develop TALE architectures that recognize all possible N0 bases, we used structure-guided library design coupled with TALE-R activity selections to evolve novel TALE N-terminal domains to accommodate any N0 base. A G-selective domain and broadly reactive domains were isolated and characterized. The engineered TALE domains selected in the TALE-R format demonstrated modularity and were active in TALE-TF and TALEN architectures. Evolved N-terminal domains provide effective and unconstrained TALE-based targeting of any DNA sequence as TALE binding proteins and designer enzymes. PMID:23980031
Evolution and genome architecture in fungal plant pathogens.
Möller, Mareike; Stukenbrock, Eva H
2017-12-01
The fungal kingdom comprises some of the most devastating plant pathogens. Sequencing the genomes of fungal pathogens has shown a remarkable variability in genome size and architecture. Population genomic data enable us to understand the mechanisms and the history of changes in genome size and adaptive evolution in plant pathogens. Although transposable elements predominantly have negative effects on their host, fungal pathogens provide prominent examples of advantageous associations between rapidly evolving transposable elements and virulence genes that cause variation in virulence phenotypes. By providing homogeneous environments at large regional scales, managed ecosystems, such as modern agriculture, can be conducive for the rapid evolution and dispersal of pathogens. In this Review, we summarize key examples from fungal plant pathogen genomics and discuss evolutionary processes in pathogenic fungi in the context of molecular evolution, population genomics and agriculture.
A Sustained Proximity Network for Multi-Mission Lunar Exploration
NASA Technical Reports Server (NTRS)
Soloff, Jason A.; Noreen, Gary; Deutsch, Leslie; Israel, David
2005-01-01
The Vision for Space Exploration calls for an aggressive sequence of robotic missions beginning in 2008 to prepare for a human return to the Moon by 2020, with the goal of establishing a sustained human presence beyond low Earth orbit. A key enabler of exploration is reliable, available communication and navigation capabilities to support both human and robotic missions. An adaptable, sustainable communication and navigation architecture has been developed by Goddard Space Flight Center and the Jet Propulsion Laboratory to support human and robotic lunar exploration through the next two decades. A key component of the architecture is scalable deployment, with the infrastructure evolving as needs emerge, allowing NASA and its partner agencies to deploy an interoperable communication and navigation system in an evolutionary way, enabling cost effective, highly adaptable systems throughout the lunar exploration program.
Systemic risk on different interbank network topologies
NASA Astrophysics Data System (ADS)
Lenzu, Simone; Tedeschi, Gabriele
2012-09-01
In this paper we develop an interbank market with heterogeneous financial institutions that enter into lending agreements on different network structures. Credit relationships (links) evolve endogenously via a fitness mechanism based on agents' performance. By changing the agents' trust in their neighbors' performance, interbank linkages self-organize themselves into very different network architectures, ranging from random to scale-free topologies. We study which network architecture can make the financial system more resilient to random attacks and how systemic risk spreads over the network. To perturb the system, we generate a random attack via a liquidity shock. The hit bank is not automatically eliminated, but its failure is endogenously driven by its incapacity to raise liquidity in the interbank network. Our analysis shows that a random financial network can be more resilient than a scale free one in case of agents' heterogeneity.
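The fitness-driven link formation described above can be sketched as a simple rewiring rule: each bank compares a randomly drawn candidate lender against its current partner and switches with a probability that grows with the fitness difference. This is an illustrative assumption, not the paper's exact equations; the logistic rule and the trust parameter `beta` are stand-ins for the fitness mechanism.

```python
import math
import random

def rewire(fitness, partner, beta, rng):
    """One round of link updating: each bank i draws a random candidate
    and switches its lending link to it with logistic probability in the
    fitness difference. Larger beta = more trust in observed performance."""
    n = len(fitness)
    new_partner = partner[:]
    for i in range(n):
        candidate = rng.randrange(n)
        if candidate == i:
            continue
        delta = fitness[candidate] - fitness[new_partner[i]]
        if rng.random() < 1.0 / (1.0 + math.exp(-beta * delta)):
            new_partner[i] = candidate
    return new_partner

rng = random.Random(0)
n = 100
bank_fitness = [rng.random() for _ in range(n)]   # exogenous performance scores
partner = [(i + 1) % n for i in range(n)]         # start from a ring of links

for _ in range(50):
    partner = rewire(bank_fitness, partner, beta=10.0, rng=rng)

# With high beta, links concentrate on high-fitness banks, pushing the
# in-degree distribution toward a hub-dominated, scale-free-like topology.
in_degree = [partner.count(j) for j in range(n)]
```

Sweeping `beta` from 0 (no trust: partners stay effectively random) upward reproduces, in miniature, the random-to-scale-free transition the abstract describes.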
Evolving concepts of lunar architecture: The potential of subselene development
NASA Technical Reports Server (NTRS)
Daga, Andrew W.; Daga, Meryl A.; Wendel, Wendel R.
1992-01-01
In view of the superior environmental and operational conditions that are thought to exist in lava tubes, popular visions of permanent settlements built upon the lunar surface may prove to be entirely romantic. The factors that will ultimately come together to determine the design of a lunar base are complex and interrelated, and they call for a radical architectural solution. Whether lunar surface-deployed superstructures can answer these issues is called into question. One particularly troublesome concern in any lunar base design is the need for vast amounts of space, and the ability of man-made structures to provide such volumes in a reliable pressurized habitat is doubtful. An examination of several key environmental design issues suggests that the alternative mode of subselene development may offer the best opportunity for an enduring and humane settlement.
WDM-PON Architecture for FTTx Networks
NASA Astrophysics Data System (ADS)
Iannone, E.; Franco, P.; Santoni, S.
Broadband services for residential users in European countries have until now largely relied on xDSL technologies, while FTTx technologies have been mainly exploited in Asia and North America. The increasing bandwidth demand and the growing penetration of new services are pushing the deployment of optical access networks, and major European operators are now announcing FTTx projects. While FTTH is recognized as the target solution to bring broadband services to residential users, the identification of an FTTx evolutionary path able to seamlessly migrate to FTTH is key to enabling a massive deployment, easing the huge investments needed. WDM-PON architecture is an interesting solution that is able to accommodate the strategic need of building a new fiber-based access infrastructure with the possibility of adapting investments to actual demands and evolving to FTTH without requiring further interventions on fiber infrastructures.
Space Telecommunications Radio System STRS Cognitive Radio
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Handler, Louis M.
2013-01-01
Radios today are evolving from awareness toward cognition. A software defined radio (SDR) provides the most capability for integrating autonomic decision making ability and allows the incremental evolution toward a cognitive radio. This cognitive radio technology will impact NASA space communications in areas such as spectrum utilization, interoperability, network operations, and radio resource management over a wide range of operating conditions. NASA's cognitive radio will build upon the infrastructure being developed by Space Telecommunication Radio System (STRS) SDR technology. This paper explores the feasibility of inserting cognitive capabilities in the NASA STRS architecture and the interfaces between the cognitive engine and the STRS radio. The STRS architecture defines methods that can inform the cognitive engine about the radio environment so that the cognitive engine can learn autonomously from experience, and take appropriate actions to adapt the radio operating characteristics and optimize performance.
An Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Bull, James B.; Lanzi, Raymond J.
2007-01-01
The Autonomous Flight Safety System (AFSS) being developed by NASA's Goddard Space Flight Center's Wallops Flight Facility and Kennedy Space Center has completed two successful developmental flights and is preparing for a third. AFSS has been demonstrated to be a viable architecture for implementation of a completely vehicle based system capable of protecting life and property in the event of an errant vehicle by terminating the flight or initiating other actions. It is capable of replacing current human-in-the-loop systems or acting in parallel with them. AFSS is configured prior to flight in accordance with a specific rule set agreed upon by the range safety authority and the user to protect the public and assure mission success. This paper discusses the motivation for the project, describes the method of development, and presents an overview of the evolving architecture and the current status.
DFT algorithms for bit-serial GaAs array processor architectures
NASA Technical Reports Server (NTRS)
Mcmillan, Gary B.
1988-01-01
Systems and Processes Engineering Corporation (SPEC) has developed an innovative array processor architecture for computing Fourier transforms and other commonly used signal processing algorithms. This architecture is designed to extract the highest possible array performance from state-of-the-art GaAs technology. SPEC's architectural design includes a high performance RISC processor implemented in GaAs, along with a Floating Point Coprocessor and a unique Array Communications Coprocessor, also implemented in GaAs technology. Together, these data processors represent the latest in technology, both from an architectural and implementation viewpoint. SPEC has examined numerous algorithms and parallel processing architectures to determine the optimum array processor architecture. SPEC has developed an array processor architecture with integral communications ability to provide maximum node connectivity. The Array Communications Coprocessor embeds communications operations directly in the core of the processor architecture. A Floating Point Coprocessor architecture has been defined that utilizes Bit-Serial arithmetic units, operating at very high frequency, to perform floating point operations. These Bit-Serial devices reduce the device integration level and complexity to a level compatible with state-of-the-art GaAs device technology.
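As a correctness reference for what the array processor above accelerates, the discrete Fourier transform it computes is X[k] = Σₙ x[n]·e^(−2πikn/N). The sketch below is plain O(N²) Python for illustration only; it does not model the bit-serial GaAs arithmetic units or the array communications architecture described in the abstract.

```python
import cmath

def dft(x):
    """Direct discrete Fourier transform: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N).

    A plain O(N^2) reference implementation of the transform the
    hardware computes; real array processors use factored (FFT-style)
    algorithms mapped onto parallel nodes.
    """
    n = len(x)
    return [
        sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
        for k in range(n)
    ]

# An impulse transforms to a flat spectrum; a constant to a single DC bin.
impulse_spectrum = dft([1, 0, 0, 0])
constant_spectrum = dft([1, 1, 1, 1])
```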
Anatomic pathology laboratory information systems: a review.
Park, Seung Lyung; Pantanowitz, Liron; Sharma, Gaurav; Parwani, Anil Vasdev
2012-03-01
The modern anatomic pathology laboratory depends on a reliable information infrastructure to register specimens, record gross and microscopic findings, regulate laboratory workflow, formulate and sign out report(s), disseminate them to the intended recipients across the whole health system, and support quality assurance measures. This infrastructure is provided by the Anatomical Pathology Laboratory Information Systems (APLIS), which have evolved over decades and now are beginning to support evolving technologies like asset tracking and digital imaging. As digital pathology transitions from "the way of the future" to "the way of the present," the APLIS continues to be one of the key effective enablers of the scope and practice of pathology. In this review, we discuss the evolution, necessary components, architecture and functionality of the APLIS that are crucial to today's practicing pathologist and address the demands of emerging trends on the future APLIS.
Orion FSW V and V and Kedalion Engineering Lab Insight
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.
2010-01-01
NASA, along with its prime Orion contractor and its subcontractors, is adapting an avionics system paradigm borrowed from the manned commercial aircraft industry for use in manned space flight systems. Integrated Modular Avionics (IMA) techniques have been proven as a robust avionics solution for manned commercial aircraft (B737/777/787, MD 10/90). This presentation will outline current approaches to adapt IMA, along with its heritage FSW V&V paradigms, into NASA's manned space flight program for Orion. NASA's Kedalion engineering analysis lab is on the forefront of validating many of these contemporary IMA based techniques. Kedalion has already validated many of the proposed Orion FSW V&V paradigms using Orion's precursory Flight Test Article (FTA) Pad Abort 1 (PA-1) program. The Kedalion lab will evolve its architectures, tools, and techniques in parallel with the evolving Orion program.
Solving the Software Legacy Problem with RISA
NASA Astrophysics Data System (ADS)
Ibarra, A.; Gabriel, C.
2012-09-01
Nowadays hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment's lifetime. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple data processing software and infrastructure life-cycles, using Java applications and web-services wrappers to existing software. This architecture employs embedded SAS in virtual machines, assuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.
IRAF and STSDAS under the new ALPHA architecture
NASA Technical Reports Server (NTRS)
Zarate, N. R.
1992-01-01
Digital's next generation RISC architecture, known as ALPHA, presents many IRAF system portability questions and challenges to both site managers and end users. DEC promises to support the ULTRIX, VMS, and OSF/1 operating systems, which should allow IRAF to be ported to the new architecture at either the program executable level (using VEST), or at the source level, where IRAF can be tuned for greater performance. These notes highlight some of the details of porting IRAF to OpenVMS on the ALPHA architecture.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
... Engineering, Architectural Services, Design Policies and Construction Standards. AGENCY: Rural Utilities... engineering services and architectural services for transactions above the established threshold dollar levels... Code of Federal Regulations as follows: PART 1724--ELECTRIC ENGINEERING, ARCHITECTURAL SERVICES AND...
Conservatism and novelty in the genetic architecture of adaptation in Heliconius butterflies.
Huber, B; Whibley, A; Poul, Y L; Navarro, N; Martin, A; Baxter, S; Shah, A; Gilles, B; Wirth, T; McMillan, W O; Joron, M
2015-05-01
Understanding the genetic architecture of adaptive traits has been at the centre of modern evolutionary biology since Fisher; however, evaluating how the genetic architecture of ecologically important traits influences their diversification has been hampered by the scarcity of empirical data. Now, high-throughput genomics facilitates the detailed exploration of variation in the genome-to-phenotype map among closely related taxa. Here, we investigate the evolution of wing pattern diversity in Heliconius, a clade of neotropical butterflies that have undergone an adaptive radiation for wing-pattern mimicry and are influenced by distinct selection regimes. Using crosses between natural wing-pattern variants, we used genome-wide restriction site-associated DNA (RAD) genotyping, traditional linkage mapping and multivariate image analysis to study the evolution of the architecture of adaptive variation in two closely related species: Heliconius hecale and H. ismenius. We implemented a new morphometric procedure for the analysis of whole-wing pattern variation, which allows visualising spatial heatmaps of genotype-to-phenotype association for each quantitative trait locus separately. We used the H. melpomene reference genome to fine-map variation for each major wing-patterning region uncovered, evaluated the role of candidate genes and compared genetic architectures across the genus. Our results show that, although the loci responding to mimicry selection are highly conserved between species, their effect size and phenotypic action vary throughout the clade. Multilocus architecture is ancestral and maintained across species under directional selection, whereas the single-locus (supergene) inheritance controlling polymorphism in H. numata appears to have evolved only once. Nevertheless, the conservatism in the wing-patterning toolkit found throughout the genus does not appear to constrain phenotypic evolution towards local adaptive optima.
3D level set methods for evolving fronts on tetrahedral meshes with adaptive mesh refinement
Morgan, Nathaniel Ray; Waltz, Jacob I.
2017-03-02
The level set method is commonly used to model dynamically evolving fronts and interfaces. In this work, we present new methods for evolving fronts with a specified velocity field or in the surface normal direction on 3D unstructured tetrahedral meshes with adaptive mesh refinement (AMR). The level set field is located at the nodes of the tetrahedral cells and is evolved using new upwind discretizations of Hamilton–Jacobi equations combined with a Runge–Kutta method for temporal integration. The level set field is periodically reinitialized to a signed distance function using an iterative approach with a new upwind gradient. We discuss the details of these level set and reinitialization methods. Results from a range of numerical test problems are presented.
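The upwind evolution scheme the abstract describes can be illustrated in one dimension. The sketch below is a minimal first-order upwind advection of a nodal level set field, assuming a uniform 1D grid and a prescribed velocity field; the paper's 3D tetrahedral discretization, AMR, and Runge–Kutta integration are not reproduced.

```python
import numpy as np

def evolve_level_set(phi, vel, dx, dt, steps):
    """First-order upwind advection of a 1D level set field phi with a
    nodal velocity field vel (illustrative 1D simplification)."""
    phi = phi.copy()
    for _ in range(steps):
        # One-sided differences, selected by the sign of the velocity (upwinding).
        dminus = np.empty_like(phi)
        dplus = np.empty_like(phi)
        dminus[1:] = (phi[1:] - phi[:-1]) / dx
        dminus[0] = dminus[1]
        dplus[:-1] = (phi[1:] - phi[:-1]) / dx
        dplus[-1] = dplus[-2]
        grad = np.where(vel > 0, dminus, dplus)
        phi = phi - dt * vel * grad
    return phi

# Transport a signed-distance front, initially at x = 0.3, rightward at unit
# speed for a total time of 0.2 (CFL number 0.5).
x = np.linspace(0.0, 1.0, 101)
phi0 = x - 0.3
phi = evolve_level_set(phi0, np.ones_like(x), dx=0.01, dt=0.005, steps=40)
# The zero crossing of phi should now sit near x = 0.5.
```

Because the initial profile is linear, the upwind gradient is exact here; for curved fronts the scheme is first-order accurate and higher-order discretizations (as in the paper) reduce the smearing.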
An object-oriented software approach for a distributed human tracking motion system
NASA Astrophysics Data System (ADS)
Micucci, Daniela L.
2003-06-01
Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
Reconfigurable Autonomy for Future Planetary Rovers
NASA Astrophysics Data System (ADS)
Burroughes, Guy
Extra-terrestrial planetary rover systems are uniquely remote: communication is constrained, the environment is uncertain, physical resources are limited, and a high level of fault tolerance and resistance to hardware degradation is required. This thesis presents a novel self-reconfiguring autonomous software architecture designed to meet the needs of extraterrestrial planetary environments. At runtime it can safely reconfigure low-level control systems, high-level decisional autonomy systems, and managed software architecture. The architecture can perform automatic Verification and Validation of self-reconfiguration at run-time, and enables a system to be self-optimising, self-protecting, and self-healing. A novel self-monitoring system, which is non-invasive, efficient, tunable, and autonomously deploying, is also presented. The architecture was validated through the use-case of a highly autonomous extra-terrestrial planetary exploration rover. Three major forms of reconfiguration were demonstrated and tested: first, high level adjustment of system internal architecture and goal; second, software module modification; and third, low level alteration of hardware control in response to degradation of hardware and environmental change. The architecture was demonstrated to be robust and effective in a Mars sample return mission use-case testing the operational aspects of a novel, reconfigurable guidance, navigation, and control system for a planetary rover, all operating in concert through a scenario that required reconfiguration of all elements of the system.
Instruction-level performance modeling and characterization of multimedia applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Y.; Cameron, K.W.
1999-06-01
One of the challenges for characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction-level. This paper presents a novel technique of characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from some multimedia applications, such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and architectural bottleneck for each application. This technique also provides predictive insight of future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effect on processor utilization without memory influence. They derive formulas for calculating CPI_0, CPI without memory effect, and they quantify utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results provide promise in code characterization, and empirical/analytical modeling.
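The decomposition of measured CPI into a memory-free component (the abstract's CPI_0) can be sketched directly from counter readings. All counter values below are hypothetical, and real counters and their exact semantics vary by microarchitecture; this is only an illustration of the arithmetic.

```python
def cpi_from_counters(total_cycles, instructions, mem_stall_cycles):
    """Split measured CPI into a memory-free component (CPI_0) and a
    memory-stall component, from hypothetical hardware-counter readings."""
    cpi = total_cycles / instructions          # overall cycles per instruction
    cpi_mem = mem_stall_cycles / instructions  # CPI contribution of memory stalls
    cpi0 = cpi - cpi_mem                       # CPI without memory effect
    return cpi, cpi0

cpi, cpi0 = cpi_from_counters(total_cycles=2_000_000,
                              instructions=1_000_000,
                              mem_stall_cycles=500_000)
# cpi == 2.0, cpi0 == 1.5
```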
Considerations for Architecture Level Trade Studies for Environmental Sensors
NASA Technical Reports Server (NTRS)
Peterson, Craig
2010-01-01
Comparisons of key characteristics of environmental sensors such as technology readiness levels, mass, power, volume, and detection capabilities are essential for initial trade studies to determine likely candidates for further development and evaluation. However, these trade studies only provide part of the information necessary to make selection decisions. Ultimately, the sensors must be judged based on the overall system architectures and operational scenarios for which they are intended. This means that additional characteristics, such as architectural needs for redundancy, operational lifetime, ability to maintain calibration, and repair and replacement strategies, among others, must also be considered. Given that these characteristics can be extremely time-consuming and costly to obtain, careful planning is essential to minimize the effort involved. In this paper, an approach is explored for determining an effective yet comprehensive set of architecture level trades which is minimally impacted by the inevitable changes in operational (mission) scenarios. The approach will also identify and integrate the various facilities and opportunities required to obtain the desired architecture level trade information.
Top-level modeling of an ALS system utilizing object-oriented techniques
NASA Astrophysics Data System (ADS)
Rodriguez, L. F.; Kang, S.; Ting, K. C.
The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here is the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
NASA Astrophysics Data System (ADS)
Masset, Frédéric
2015-09-01
GFARGO is a GPU version of FARGO. It is written in C and C for CUDA and runs only on NVIDIA’s graphics cards. Though it corresponds to the standard, isothermal version of FARGO, not all functionalities of the CPU version have been translated to CUDA. The code is available in single and double precision versions, the latter compatible with FERMI architectures. GFARGO can run on a graphics card connected to the display, allowing the user to see in real time how the fields evolve.
Speed challenge: a case for hardware implementation in soft-computing
NASA Technical Reports Server (NTRS)
Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.
2000-01-01
For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the potential enabling technology promise, has been creation of a niche that imparts orders of magnitude speed advantage by implementation in parallel processing hardware with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware with selected application examples requiring real time response capabilities.
Homeland security in the USA: past, present, and future.
Kemp, Roger L
2012-01-01
This paper examines the evolving and dynamic field of homeland security in the USA. Included in this analysis is the evolution of the creation of the Department of Homeland Security, an overview of the National Warning System, a summary of citizen support groups, and how the field of homeland security has had an impact on the location and architecture of public buildings and facilities. Also included are website directories of citizen support groups and federal agencies related to the field of homeland security.
A study of space station needs, attributes and architectural options
NASA Technical Reports Server (NTRS)
1983-01-01
The mission requirements, economic benefits, and time table of deployment of the space station are discussed. It is concluded that: (1) mission requirements overwhelmingly support the need for a space station; (2) a single space station is the way to begin; (3) the space station must evolve its capability; (4) the orbit transfer vehicle aspect of the space station will provide significant economic benefit; and (5) an early, affordable, effective way to start the space station program is needed.
2003-09-01
resolution M&S concept for integrating heterogeneous M&S into the hierarchy has existed since the early 1980s [DH92a, DH92b]...groups [PAD78]. The need for credible M&S grew in the Nation’s private and public sectors. By 1980, information from computer-based simulations...(formal) identified in [GMS+96 and RPG00]. We noted that systemic issues identified by reports, studies, and assessments in the early 1980s
Towards 100,000 CPU Cycle-Scavenging by Genetic Algorithms
NASA Technical Reports Server (NTRS)
Globus, Al; Biegel, Bryan A. (Technical Monitor)
2001-01-01
We examine a web-centric design using standard tools such as web servers, web browsers, PHP, and mySQL. We also consider the applicability of Information Power Grid tools such as the Globus (no relation to the author) Toolkit. We intend to implement this architecture with JavaGenes running on at least two cycle-scavengers: Condor and United Devices. JavaGenes, a genetic algorithm code written in Java, will be used to evolve multi-species reactive molecular force field parameters.
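A minimal sketch of the kind of genetic algorithm JavaGenes distributes, assuming a simple bit-string genome and a caller-supplied fitness function; the actual JavaGenes force-field parameter encoding and the Condor/United Devices distribution layer are not modelled here.

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60,
           mutation_rate=0.05, seed=1):
    """Minimal generational genetic algorithm over bit-string genomes:
    truncation selection, one-point crossover, per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ 1 if rng.random() < mutation_rate else g
                     for g in child]              # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # "one-max": maximise the number of 1 bits
```

In a cycle-scavenging deployment, each worker would evaluate `fitness` for a slice of the population and report scores back to the server; the evolutionary loop itself is unchanged.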
The OpenMP Implementation of NAS Parallel Benchmarks and its Performance
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry
1999-01-01
As the new ccNUMA architecture became popular in recent years, parallel programming with compiler directives on these machines has evolved to accommodate new needs. In this study, we examine the effectiveness of OpenMP directives for parallelizing the NAS Parallel Benchmarks. Implementation details will be discussed and performance will be compared with the MPI implementation. We have demonstrated that OpenMP can achieve very good results for parallelization on a shared memory system, but effective use of memory and cache is very important.
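A back-of-the-envelope way to reason about the shared-memory speedups such a study measures is Amdahl's law. The sketch below is illustrative only, with made-up parameters; as the abstract notes, real OpenMP performance also hinges on effective use of memory and cache.

```python
def amdahl_speedup(parallel_fraction, threads):
    """Ideal speedup under Amdahl's law for a code whose parallelizable
    fraction is covered by OpenMP directives (illustrative; ignores
    memory, cache, and scheduling overheads)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / threads)

# A hypothetical 95%-parallel benchmark on 8 threads:
# speedup = 1 / (0.05 + 0.95/8), i.e. just under 6x rather than 8x.
```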
Application of the SCADA system in wastewater treatment plants.
Dieu, B
2001-01-01
The implementation of the SCADA system has a positive impact on the operations, maintenance, process improvement and savings for the City of Houston's Wastewater Operations branch. This paper will discuss the system's evolution, the external/internal architecture, and the human-machine-interface graphical design. Finally, it will demonstrate the system's successes in monitoring the City's sewage and sludge collection/distribution systems, wet-weather facilities and wastewater treatment plants, complying with the USEPA requirements on the discharge, and effectively reducing the operations and maintenance costs.
Pathogen trafficking pathways and host phosphoinositide metabolism.
Weber, Stefan S; Ragaz, Curdin; Hilbi, Hubert
2009-03-01
Phosphoinositide (PI) glycerolipids are key regulators of eukaryotic signal transduction, cytoskeleton architecture and membrane dynamics. The host cell PI metabolism is targeted by intracellular bacterial pathogens, which evolved intricate strategies to modulate uptake processes and vesicle trafficking pathways. Upon entering eukaryotic host cells, pathogenic bacteria replicate in distinct vacuoles or in the host cytoplasm. Vacuolar pathogens manipulate PI levels to mimic or modify membranes of subcellular compartments and thereby establish their replicative niche. Legionella pneumophila, Brucella abortus, Mycobacterium tuberculosis and Salmonella enterica translocate effector proteins into the host cell, some of which anchor to the vacuolar membrane via PIs or enzymatically turnover PIs. Cytoplasmic pathogens target PI metabolism at the plasma membrane, thus modulating their uptake and antiapoptotic signalling pathways. Employing this strategy, Shigella flexneri directly injects a PI-modifying effector protein, while Listeria monocytogenes exploits PI metabolism indirectly by binding to transmembrane receptors. Thus, regardless of the intracellular lifestyle of the pathogen, PI metabolism is critically involved in the interactions with host cells.
Nonlinear feedback drives homeostatic plasticity in H2O2 stress response
Goulev, Youlian; Morlot, Sandrine; Matifas, Audrey; Huang, Bo; Molin, Mikael; Toledano, Michel B; Charvin, Gilles
2017-01-01
Homeostatic systems that rely on genetic regulatory networks are intrinsically limited by the transcriptional response time, which may restrict a cell’s ability to adapt to unanticipated environmental challenges. To bypass this limitation, cells have evolved mechanisms whereby exposure to mild stress increases their resistance to subsequent threats. However, the mechanisms responsible for such adaptive homeostasis remain largely unknown. Here, we used live-cell imaging and microfluidics to investigate the adaptive response of budding yeast to temporally controlled H2O2 stress patterns. We demonstrate that acquisition of tolerance is a systems-level property resulting from nonlinearity of H2O2 scavenging by peroxiredoxins and our study reveals that this regulatory scheme induces a striking hormetic effect of extracellular H2O2 stress on replicative longevity. Our study thus provides a novel quantitative framework bridging the molecular architecture of a cellular homeostatic system to the emergence of nonintuitive adaptive properties. DOI: http://dx.doi.org/10.7554/eLife.23971.001 PMID:28418333
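The claimed nonlinearity of H2O2 scavenging can be caricatured with a saturable (Michaelis–Menten-like) removal term: below saturation the scavengers hold the internal level low, while influxes approaching the scavenging capacity let it climb steeply. The toy model below is an assumption for illustration, not the paper's fitted model, and all parameter values are arbitrary.

```python
def simulate_h2o2(influx, vmax=2.0, km=0.5, dt=0.01, t_end=20.0):
    """Forward-Euler integration of a toy intracellular H2O2 balance,
        dH/dt = influx - vmax * H / (km + H),
    with saturable (hence nonlinear) peroxiredoxin-like scavenging."""
    h = 0.0
    for _ in range(int(t_end / dt)):
        h += dt * (influx - vmax * h / (km + h))
    return h

low = simulate_h2o2(influx=0.5)   # well below capacity: H2O2 stays low
high = simulate_h2o2(influx=1.9)  # near saturation: H2O2 climbs steeply
```

The disproportionate jump from `low` to `high` for a modest change in influx is the kind of nonlinear, threshold-like response the abstract attributes to peroxiredoxin scavenging.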
Disorder in convergent floral nanostructures enhances signalling to bees
NASA Astrophysics Data System (ADS)
Moyroud, Edwige; Wenzel, Tobias; Middleton, Rox; Rudall, Paula J.; Banks, Hannah; Reed, Alison; Mellers, Greg; Killoran, Patrick; Westwood, M. Murphy; Steiner, Ullrich; Vignolini, Silvia; Glover, Beverley J.
2017-10-01
Diverse forms of nanoscale architecture generate structural colour and perform signalling functions within and between species. Structural colour is the result of the interference of light from approximately regular periodic structures; some structural disorder is, however, inevitable in biological organisms. Is this disorder functional and subject to evolutionary selection, or is it simply an unavoidable outcome of biological developmental processes? Here we show that disordered nanostructures enable flowers to produce visual signals that are salient to bees. These disordered nanostructures (identified in most major lineages of angiosperms) have distinct anatomies but convergent optical properties; they all produce angle-dependent scattered light, predominantly at short wavelengths (ultraviolet and blue). We manufactured artificial flowers with nanoscale structures that possessed tailored levels of disorder in order to investigate how foraging bumblebees respond to this optical effect. We conclude that floral nanostructures have evolved, on multiple independent occasions, an effective degree of relative spatial disorder that generates a photonic signature that is highly salient to insect pollinators.
Challenges and dreams: physics of weak interactions essential to life
Chien, Peter; Gierasch, Lila M.
2014-01-01
Biological systems display stunning capacities to self-organize. Moreover, their subcellular architectures are dynamic and responsive to changing needs and conditions. Key to these properties are manifold weak “quinary” interactions that have evolved to create specific spatial networks of macromolecules. These specific arrangements of molecules enable signals to be propagated over distances much greater than molecular dimensions, create phase separations that define functional regions in cells, and amplify cellular responses to changes in their environments. A major challenge is to develop biochemical tools and physical models to describe the panoply of weak interactions operating in cells. We also need better approaches to measure the biases in the spatial distributions of cellular macromolecules that result from the integrated action of multiple weak interactions. Partnerships between cell biologists, biochemists, and physicists are required to deploy these methods. Together these approaches will help us realize the dream of understanding the biological “glue” that sustains life at a molecular and cellular level. PMID:25368424
Dual response to nest flooding during monsoon in an Indian ant
Kolay, Swetashree; Annagiri, Sumana
2015-01-01
Flooding causes destruction of shelter and disruption of activity in animals occupying subterranean nests. To ensure their survival, organisms have evolved various responses to combat this problem. In this study we examine the response of an Indian ant, Diacamma indicum, to nest flooding during the monsoon season. Based on characterization of nest location, architecture and the response of these ants to different levels of flooding in their natural habitat as well as in the laboratory, we infer that they exhibit a dual response. On the one hand, the challenges presented by monsoon are dealt with by occupying shallow nests and modifying the entrance with decorations and soil mounds. On the other hand, inundated nests are evacuated and the ants occupy shelters at higher elevations. We conclude that focused studies of the monsoon biology of species that dwell in such climatic conditions may help us appreciate how organisms deal with, and adapt to, extreme seasonal changes. PMID:26349015
Minie, Mark; Bowers, Stuart; Tarczy-Hornoch, Peter; Roberts, Edward; James, Rose A.; Rambo, Neil; Fuller, Sherrilynne
2006-01-01
Setting: The University of Washington Health Sciences Libraries and Information Center BioCommons serves the bioinformatics needs of researchers at the university and in the vibrant for-profit and not-for-profit biomedical research sector in the Washington area and region. Program Components: The BioCommons comprises services addressing internal University of Washington, not-for-profit, for-profit, and regional and global clientele. The BioCommons is maintained and administered by the BioResearcher Liaison Team. The BioCommons architecture provides a highly flexible structure for adapting to rapidly changing resources and needs. Evaluation Mechanisms: BioCommons uses Web-based pre- and post-course evaluations and periodic user surveys to assess service effectiveness. Recent surveys indicate substantial usage of BioCommons services and a high level of effectiveness and user satisfaction. Next Steps/Future Directions: BioCommons is developing novel collaborative Web resources to distribute bioinformatics tools and is experimenting with Web-based competency training in bioinformation resource use. PMID:16888667
Stacked STN LCDs for true-color projection systems
NASA Astrophysics Data System (ADS)
Gulick, Paul E.; Conner, Arlie R.
1991-08-01
The demand for a true color LCD projection panel for use with standard overhead projectors has been around ever since the first monochrome OHP projection panel was introduced in 1986. The monochrome panels evolved along with the LCD technology from the first blue- and-yellow mode units to black-and-white with levels of gray, and to yellow-and-magenta panels with limited intermediate color shades known as pseudo-color. Finally, a novel solution has been implemented using a stack of custom designed STN panels, making possible true color LCD projection panels that are reasonably priced, available in high volume and quite acceptable in overall image quality. This stacked technology relies on the inherent birefringence colors of each layer to switch between white (passing all wavelengths) and a subtractive color primary (passing all wavelengths but red, green, or blue) so the full spectrum can be projected. Standard gray-scale techniques expand the displayable color palette to almost 5,000 colors and beyond. The same technology can also be applied to various self-contained projection architectures.
A Conserved Developmental Mechanism Builds Complex Visual Systems in Insects and Vertebrates
Joly, Jean-Stéphane; Recher, Gaelle; Brombin, Alessandro; Ngo, Kathy; Hartenstein, Volker
2016-01-01
The visual systems of vertebrates and many other bilaterian clades consist of complex neural structures guiding a wide spectrum of behaviors. Homologies at the level of cell types and even discrete neural circuits have been proposed, but many questions of how the architecture of visual neuropils evolved among different phyla remain open. In this review we argue that the profound conservation of genetic and developmental steps generating the eye and its target neuropils in fish and fruit flies supports a homology between some core elements of bilaterian visual circuitries. Fish retina and tectum, and fly optic lobe, develop from a partitioned, unidirectionally proliferating neurectodermal domain that combines slowly dividing neuroepithelial stem cells and rapidly amplifying progenitors with shared genetic signatures to generate large numbers and different types of neurons in a temporally ordered way. This peculiar ‘conveyor belt neurogenesis’ could play an essential role in generating the topographically ordered circuitry of the visual system. PMID:27780043
Viral infection and human disease - insights from minimotifs
Kadaveru, Krishna; Vyas, Jay; Schiller, Martin R.
2008-01-01
Short functional peptide motifs cooperate in many molecular functions including protein interactions, protein trafficking, and posttranslational modifications. Viruses exploit these motifs as a principal mechanism for hijacking cells and many motifs are necessary for the viral life-cycle. A virus can accommodate many short motifs in its small genome size providing a plethora of ways for the virus to acquire host molecular machinery. Host enzymes that act on motifs such as kinases, proteases, and lipidation enzymes, as well as protein interaction domains, are commonly mutated in human disease, suggesting that the short peptide motif targets of these enzymes may also be mutated in disease; however, this is not observed. How can we explain why viruses have evolved to be so dependent on motifs, yet these motifs, in general do not seem to be as necessary for human viability? We propose that short motifs are used at the system level. This system architecture allows viruses to exploit a motif, whereas the viability of the host is not affected by mutation of a single motif. PMID:18508672
Fernandes, Noemi M; Vizzoni, Vinicius F; Borges, Bárbara do N; A G Soares, Carlos; Silva-Neto, Inácio D da; S Paiva, Thiago da
2018-04-18
The odontostomatids are among the least studied ciliates, possibly due to their small sizes, restriction to anaerobic environments and difficulty in culturing. Consequently, their phylogenetic affinities to other ciliate taxa are still poorly understood. In the present study, we analyzed newly obtained ribosomal gene sequences of the odontostomatids Discomorphella pedroeneasi and Saprodinium dentatum, together with sequences from the literature, including Epalxella antiquorum and a large assemblage of ciliate sequences representing the major recognized classes. The results show that D. pedroeneasi and S. dentatum form a deep-diverging branch related to metopid and clevelandellid armophoreans, corroborating the old literature. However, E. antiquorum clustered with the morphologically discrepant plagiopylids, indicating that either the complex odontostomatid body architecture evolved convergently, or the positioning of E. antiquorum as a plagiopylid is artifactual. A new ciliate class, Odontostomatea n. cl., is proposed based on molecular analyses and comparative morphology of odontostomatids with related taxa.
NASA's Space Launch System: Moving Toward the Launch Pad
NASA Technical Reports Server (NTRS)
Creech, Stephen D.; May, Todd
2013-01-01
The National Aeronautics and Space Administration's (NASA's) Space Launch System (SLS) Program, managed at the Marshall Space Flight Center, is making progress toward delivering a new capability for human space flight and scientific missions beyond Earth orbit. Developed with the goals of safety, affordability, and sustainability in mind, the SLS rocket will launch the Orion Multi-Purpose Crew Vehicle (MPCV), equipment, supplies, and major science missions for exploration and discovery. Supporting Orion's first autonomous flight to lunar orbit and back in 2017 and its first crewed flight in 2021, the SLS will evolve into the most powerful launch vehicle ever flown, via an upgrade approach that will provide building blocks for future space exploration and development. NASA is working to develop this new capability in an austere economic climate, a fact which has inspired the SLS team to find innovative solutions to the challenges of designing, developing, fielding, and operating the largest rocket in history. This paper will summarize the planned capabilities of the vehicle, the progress the SLS program has made in the 2 years since the Agency formally announced its architecture in September 2011, and the path the program is following to reach the launch pad in 2017 and then to evolve the 70 metric ton (t) initial lift capability to 130-t lift capability. The paper will explain how, to meet the challenge of a flat funding curve, an architecture was chosen which combines the use and enhancement of legacy systems and technology with strategic new development projects that will evolve the capabilities of the launch vehicle. This approach reduces the time and cost of delivering the initial 70 t Block 1 vehicle, and reduces the number of parallel development investments required to deliver the evolved version of the vehicle. 
The paper will outline the milestones the program has already reached, from developmental milestones such as the manufacture of the first flight hardware and the record-breaking testing of the J-2X engine, to life-cycle milestones such as the vehicle's Preliminary Design Review. The paper will also discuss the remaining challenges in both delivering the 70 t vehicle and in evolving its capabilities to the 130 t vehicle, and how the program plans to accomplish these goals. As this paper will explain, SLS is making measurable progress toward becoming a global infrastructure asset for robotic and human scouts of all nations by harnessing business and technological innovations to deliver sustainable solutions for space exploration.
NASA Astrophysics Data System (ADS)
Zhang, Daili
Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system-level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system-level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method, as an implementation of distributed intelligent control, has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent-to-agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings.
First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications are made for critical agents and are organized into logical rings. This architecture maintains clear guidelines for complexity decomposition and also increases the robustness of the whole system. Multiple Sectioned Dynamic Bayesian Networks (MSDBNs), as a distributed dynamic probabilistic inference engine, can be embedded into the control architecture to handle uncertainties of general large-scale complex systems. MSDBNs decompose a large knowledge-based system into many agents. Each agent holds its partial perspective of a large problem domain by representing its knowledge as a Dynamic Bayesian Network (DBN). Each agent accesses local evidence from its corresponding local sensors and communicates with other agents through finite message passing. If the distributed agents can be organized into a tree structure satisfying the running intersection property and d-sep set requirements, globally consistent inferences are achievable in a distributed way. By using different frequencies for local DBN agent belief updating and global system belief updating, the scheme balances the communication cost against the global consistency of inferences. In this dissertation, a fully factorized Boyen-Koller (BK) approximation algorithm is used for local DBN agent belief updating, and the static Junction Forest Linkage Tree (JFLT) algorithm is used for global system belief updating. MSDBNs assume a static structure and a stable communication network for the whole system. However, for a real system, sub-Bayesian networks as nodes could be lost, and the communication network could be shut down due to partial damage in the system.
Therefore, on-line and automatic MSDBNs structure formation is necessary for making robust state estimations and increasing survivability of the whole system. A Distributed Spanning Tree Optimization (DSTO) algorithm, a Distributed D-Sep Set Satisfaction (DDSSS) algorithm, and a Distributed Running Intersection Satisfaction (DRIS) algorithm are proposed in this dissertation. Combining these three distributed algorithms and a Distributed Belief Propagation (DBP) algorithm in MSDBNs makes state estimations robust to partial damage in the whole system. Combining the distributed control architecture design and the distributed inference engine design leads to a process of control system design for a general large-scale complex system. As applications of the proposed methodology, the control system designs of a simplified ship chilled water system and a notional ship chilled water system have been demonstrated step by step. Simulation results not only show that the proposed methodology gives a clear guideline for control system design for general large-scale complex systems with dynamic and uncertain environments, but also indicate that the combination of MSDBNs and HyMABC can provide excellent performance for controlling general large-scale complex systems.
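The fully factorized Boyen-Koller projection named in the dissertation admits a compact illustration. The sketch below uses a hypothetical two-variable binary DBN (the transition tables and variable names are invented for illustration, not taken from the work): one exact joint update is computed, then the joint is projected back onto per-variable marginals, which is the BK approximation that keeps local belief updating cheap.

```python
# Minimal sketch of fully factorized Boyen-Koller (BK) belief updating.
# The toy model here (two binary variables, invented transition tables)
# is an illustrative assumption, not the dissertation's system. BK keeps
# the belief as a product of per-variable marginals: perform one exact
# joint update, then project onto the marginals, discarding the
# correlations so that storage stays linear in the number of variables.

def bk_step(marg_a, marg_b, trans_a, trans_b):
    """One BK update. marg_* are [P(0), P(1)]; trans_*[(a, b)][v] is
    P(next = v | previous slice = (a, b))."""
    joint = {}
    for a in (0, 1):
        for b in (0, 1):
            p_prev = marg_a[a] * marg_b[b]        # factorized prior
            for na in (0, 1):
                for nb in (0, 1):
                    p = p_prev * trans_a[(a, b)][na] * trans_b[(a, b)][nb]
                    joint[(na, nb)] = joint.get((na, nb), 0.0) + p
    # projection step: retain only the marginals (the BK approximation)
    new_a = [sum(p for (na, _), p in joint.items() if na == v) for v in (0, 1)]
    new_b = [sum(p for (_, nb), p in joint.items() if nb == v) for v in (0, 1)]
    return new_a, new_b

# toy dynamics: both variables tend to copy the previous value of A
trans_a = {(a, b): ([0.9, 0.1] if a == 0 else [0.1, 0.9])
           for a in (0, 1) for b in (0, 1)}
trans_b = trans_a
ma, mb = bk_step([0.5, 0.5], [0.5, 0.5], trans_a, trans_b)
```

The projection loses the correlation between the two variables but returns properly normalized marginals, which is the trade-off the dissertation balances against global junction-forest updates.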
A run-time control architecture for the JPL telerobot
NASA Technical Reports Server (NTRS)
Balaram, J.; Lokshin, A.; Kreutz, K.; Beahan, J.
1987-01-01
An architecture for implementing the process-level decision making for a hierarchically structured telerobot currently being implemented at the Jet Propulsion Laboratory (JPL) is described. Constraints on the architecture design, architecture partitioning concepts, and a detailed description of the existing and proposed implementations are provided.
A Geo-Distributed System Architecture for Different Domains
NASA Astrophysics Data System (ADS)
Moßgraber, Jürgen; Middleton, Stuart; Tao, Ran
2013-04-01
The presentation will describe work on the system-of-systems (SoS) architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". In this project we deal with two use-cases: Natural Crisis Management (e.g. Tsunami Early Warning) and Industrial Subsurface Development (e.g. drilling for oil). These use-cases seem to be quite different at first sight but share a lot of similarities, like managing and looking up available sensors, extracting data from them and annotating it semantically, intelligently managing the data (a big-data problem), running mathematical analysis algorithms on the data and finally providing decision support on this basis. The main challenge was to create a generic architecture which fits both use-cases. The requirements on the architecture are manifold and the whole spectrum of a modern, geo-distributed and collaborative system comes into play. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. The most important architectural challenges we needed to address are:
1. Build a scalable communication layer for a system-of-systems
2. Build a resilient communication layer for a system-of-systems
3. Efficiently publish large volumes of semantically rich sensor data
4. Scalable and high-performance storage of large distributed datasets
5. Handling federated multi-domain heterogeneous data
6. Discovery of resources in a geo-distributed SoS
7. Coordination of work between geo-distributed systems
The design decisions made for each of them will be presented.
These developed concepts are also applicable to the requirements of the Future Internet (FI) and Internet of Things (IoT) which will provide services like smart grids, smart metering, logistics and environmental monitoring.
Genetics of reproduction and regulation of honeybee (Apis mellifera L.) social behavior.
Page, Robert E; Rueppell, Olav; Amdam, Gro V
2012-01-01
Honeybees form complex societies with a division of labor for reproduction, nutrition, nest construction and maintenance, and defense. How do such societies evolve? Tasks performed by worker honeybees are distributed in time and space. There is no central control over behavior and there is no central genome on which selection can act and effect adaptive change. For 22 years, we have been addressing these questions by selecting on a single social trait associated with nutrition: the amount of surplus pollen (a source of protein) that is stored in the combs of the nest. Forty-two generations of selection have revealed changes at biological levels extending from the society down to the level of the gene. We show how we constructed this vertical understanding of social evolution using behavioral and anatomical analyses, physiology, genetic mapping, and gene knockdowns. We map out the phenotypic and genetic architectures of food storage and foraging behavior and show how they are linked through broad epistasis and pleiotropy affecting a reproductive regulatory network that influences foraging behavior. This is remarkable because worker honeybees have reduced reproductive organs and are normally sterile; however, the reproductive regulatory network has been co-opted for behavioral division of labor.
Rueppell, Olav
2014-01-01
Social evolution has influenced every aspect of contemporary honey bee biology, but the details are difficult to reconstruct. The reproductive ground plan hypothesis of social evolution proposes that central regulators of the gonotropic cycle of solitary insects have been coopted to coordinate social complexity in honey bees, such as the division of labor among workers. The predicted trait associations between reproductive physiology and social behavior have been identified in the context of the pollen hoarding syndrome, a larger suite of interrelated traits. The genetic architecture of this syndrome is characterized by a partially overlapping genetic architecture with several consistent, pleiotropic QTL. Despite these central QTL and an integrated hormonal regulation, separate aspects of the pollen hoarding syndrome may evolve independently due to peripheral QTL and additionally segregating genetic variance. The characterization of the pollen hoarding syndrome has also demonstrated that this syndrome involves many non-behavioral traits, which may be the case for numerous “behavioral” syndromes. Furthermore, the genetic architecture of the pollen hoarding syndrome has implications for breeding programs for improving honey health and other desirable traits: If these traits are comparable to the pollen hoarding syndrome, consistent pleiotropic QTL will enable marker assisted selection, while sufficient additional genetic variation may permit the dissociation of trade-offs for efficient multiple trait selection. PMID:25506100
NASA Astrophysics Data System (ADS)
Martinez, Maria Isabel
2003-11-01
This thesis establishes a methodology that incorporates the latest procedures used in architectural acoustics for the study of open spaces of this general type, and definitions are given for the acoustic variables of interest. The "Juego de Pelota" (ball game) sites are the only ceremonial sites built specifically for the performance of a fertility ritual, and are ideal for the study of prehispanic architectural topographies. Analysis of the acoustic properties of such sites revealed that the topographical characteristics of the elevation profiles of these architectural structures determine the acoustic behavior of these spaces. Such profiles are classified into three basic types: (i) inclined profile, (ii) terraced profile, and (iii) mixed profile. The terraced profiles are the most efficient, and the mixed profiles are the least efficient, in regard to acoustics. The consideration of the acoustic behavior of architectural structures intended for the "Ball Game," as the designs evolved over time, leads to the conclusion that acoustical sensations that contributed effectively to the characteristic mystical atmosphere of the ceremonial rituals were characteristic only of those sites constructed in the "classical" period. Thesis advisors: Jaime Navarro and Juan J. Sendra. Copies of this thesis written in Spanish may be obtained by contacting the advisor, Jaime Navarro, E.T.S. de Arquitectura de Sevilla, Dpto. de Construcciones Arquitectonicas I, Av. Reina Mercedes, 2, 41012 Sevilla, Spain. E-mail address: jnavarro@us.es
A Proposed Information Architecture for Telehealth System Interoperability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, S.; Craft, R.L.; Parks, R.C.
1999-04-07
Telemedicine technology is rapidly evolving. Whereas early telemedicine consultations relied primarily on video conferencing, consultations today may utilize video conferencing, medical peripherals, store-and-forward capabilities, electronic patient record management software, and/or a host of other emerging technologies. These remote care systems rely increasingly on distributed, collaborative information technology during the care delivery process, in its many forms. While these leading-edge systems are bellwethers for highly advanced telemedicine, the remote care market today is still immature. Most telemedicine systems are custom-designed and do not interoperate with other commercial offerings. Users are limited to a set of functionality that a single vendor provides and must often pay high prices to obtain this functionality, since vendors in this marketplace must deliver entire systems in order to compete. Besides increasing corporate research and development costs, this inhibits the ability of the user to make intelligent purchasing decisions regarding best-of-breed technologies. We propose a secure, object-oriented information architecture for telemedicine systems that promotes plug-and-play interaction between system components through standardized interfaces, communication protocols, messaging formats, and data definitions. In this architecture, each component functions as a black box, and components plug together in a lego-like fashion to achieve the desired device or system functionality. The architecture will support various ongoing standards work in the medical device arena.
Padhi, Radhakant; Unnikrishnan, Nishant; Wang, Xiaohua; Balakrishnan, S N
2006-12-01
Even though dynamic programming offers an optimal control solution in a state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computations and storage requirements in solving optimal control problems. In this paper, an improvement to the AC architecture, called the "Single Network Adaptive Critic" (SNAC), is presented. This approach is applicable to a wide class of nonlinear systems where the optimal control (stationary) equation can be explicitly expressed in terms of the state and costate variables. The selection of this terminology is guided by the fact that it eliminates the use of one neural network (namely the action network) that is part of a typical dual-network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, a lower computational load, and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems have been solved with the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life Micro-Electro-Mechanical Systems (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.
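The SNAC idea of mapping the state directly to the next costate can be sketched on a scalar linear-quadratic problem. Everything below is an illustrative assumption: a one-weight linear "critic" stands in for the neural network, and the plant constants are invented, but the training signal is the genuine costate equation and the control comes from the stationarity condition, so no action network is needed.

```python
# Hedged sketch of the Single Network Adaptive Critic (SNAC) scheme on
# a scalar LQ problem: dynamics x' = a*x + b*u, stage cost q*x^2 + r*u^2.
# A one-weight linear critic (an illustrative stand-in for the paper's
# neural network) predicts the costate: lambda_{k+1} ~= w * x_k.
# The control follows from the stationarity condition
#   r*u_k + b*lambda_{k+1} = 0,
# so the usual action network is eliminated, which is the SNAC point.

a, b, q, r = 0.9, 1.0, 1.0, 1.0
w = 0.0                                # critic weight, trained below

for _ in range(2000):
    x = 1.0                            # representative training state
    lam = w * x                        # critic's prediction of lambda_{k+1}
    u = -(b / r) * lam                 # control from stationarity
    x_next = a * x + b * u
    # costate equation supplies the target:
    #   lambda_{k+1} = q*x_{k+1} + a*lambda_{k+2},  lambda_{k+2} ~= w*x_{k+1}
    target = q * x_next + a * (w * x_next)
    w += 0.1 * (target - lam) * x      # gradient-style critic update
```

For this scalar case the trained weight converges to the value implied by the discrete Riccati solution, which is one way to sanity-check a SNAC implementation before moving to nonlinear plants.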
A Public Health Grid (PHGrid): Architecture and value proposition for 21st century public health.
Savel, T; Hall, K; Lee, B; McMullin, V; Miles, M; Stinn, J; White, P; Washington, D; Boyd, T; Lenert, L
2010-07-01
This manuscript describes the value of and proposal for a high-level architectural framework for a Public Health Grid (PHGrid), which the authors feel has the capability to afford the public health community a robust technology infrastructure for secure and timely data, information, and knowledge exchange, not only within the public health domain, but between public health and the overall health care system. The CDC facilitated multiple Proof-of-Concept (PoC) projects, leveraging an open-source-based software development methodology, to test four hypotheses with regard to this high-level framework. The outcomes of the four PoCs in combination with the use of the Federal Enterprise Architecture Framework (FEAF) and the newly emerging Federal Segment Architecture Methodology (FSAM) was used to develop and refine a high-level architectural framework for a Public Health Grid infrastructure. The authors were successful in documenting a robust high-level architectural framework for a PHGrid. The documentation generated provided a level of granularity needed to validate the proposal, and included examples of both information standards and services to be implemented. Both the results of the PoCs as well as feedback from selected public health partners were used to develop the granular documentation. A robust high-level cohesive architectural framework for a Public Health Grid (PHGrid) has been successfully articulated, with its feasibility demonstrated via multiple PoCs. In order to successfully implement this framework for a Public Health Grid, the authors recommend moving forward with a three-pronged approach focusing on interoperability and standards, streamlining the PHGrid infrastructure, and developing robust and high-impact public health services. Published by Elsevier Ireland Ltd.
Development of Design Expertise by Architecture Students
ERIC Educational Resources Information Center
Oluwatayo, Adedapo Adewunmi; Ezema, Isidore; Opoko, Akunnaya
2017-01-01
What constitutes design ability and design expertise in architecture? Which categories of design expertise can be identified amongst architecture students? And which input factors differentiate one level of expertise from another? These questions were addressed in a survey of architecture students in Nigeria. Based on the results, students were…
Bridgham, Jamie T.; Keay, June; Ortlund, Eric A.; Thornton, Joseph W.
2014-01-01
An important goal in molecular evolution is to understand the genetic and physical mechanisms by which protein functions evolve and, in turn, to characterize how a protein's physical architecture influences its evolution. Here we dissect the mechanisms for an evolutionary shift in function in the mollusk ortholog of the steroid hormone receptors (SRs), a family of biologically essential transcription factors. In vertebrates, the activity of SRs allosterically depends on binding a hormonal ligand; in mollusks, however, the SR ortholog (called ER, because of high sequence similarity to vertebrate estrogen receptors) activates transcription in the absence of ligand and does not respond to steroid hormones. To understand how this shift in regulation evolved, we combined evolutionary, structural, and functional analyses. We first determined the X-ray crystal structure of the ER of the Pacific oyster Crassostrea gigas (CgER), and found that its ligand pocket is filled with bulky residues that prevent ligand occupancy. To understand the genetic basis for the evolution of mollusk ERs' unique functions, we resurrected an ancient SR progenitor and characterized the effect of historical amino acid replacements on its functions. We found that reintroducing just two ancient replacements from the lineage leading to mollusk ERs recapitulates the evolution of full constitutive activity and the loss of ligand activation. These substitutions stabilize interactions among key helices, causing the allosteric switch to become “stuck” in the active conformation and making activation independent of ligand binding. Subsequent changes filled the ligand pocket without further affecting activity; by degrading the allosteric switch, these substitutions vestigialized elements of the protein's architecture required for ligand regulation and made reversal to the ancestral function more complex. 
These findings show how the physical architecture of allostery enabled a few large-effect mutations to trigger a profound evolutionary change in the protein's function and shaped the genetics of evolutionary reversibility. PMID:24415950
NASA Technical Reports Server (NTRS)
Weinberg, Aaron
1989-01-01
The Tracking and Data Relay Satellite System (TDRSS) is an integral part of the overall NASA Space Network (SN) that will continue to evolve into the 1990's. Projections for the first decade of the 21st century indicate the need for an SN evolution that must accommodate growth in the LEO user population and must further support the introduction of new/improved user services. A central ingredient of this evolution is an Advanced TDRSS (ATDRSS) follow-on to the current TDRSS that must initiate operations by the late 1990's in a manner that permits an orderly transition from the TDRSS to the ATDRSS era. An SN/ATDRSS architectural and operational concept that will satisfy the above goals is being developed. To date, an SN/ATDRSS baseline concept has been established that provides users with an end-to-end data transport (ENDAT) service. An expanded description of the baseline ENDAT concept, from the user perspective, is provided with special emphasis on the TDRSS/ATDRSS evolution. A high-level description of the end-to-end system that identifies the role of ATDRSS is presented; also included is a description of the baseline ATDRSS architecture and its relationship with the TDRSS 1996 baseline. Other key features of the ENDAT service are then expanded upon, including the multiple grades of service, and the RF telecommunications/tracking services to be available. The ATDRSS service options are described.
Long-range dismount activity classification: LODAC
NASA Astrophysics Data System (ADS)
Garagic, Denis; Peskoe, Jacob; Liu, Fang; Cuevas, Manuel; Freeman, Andrew M.; Rhodes, Bradley J.
2014-06-01
Continuous classification of dismount types (including gender, age, ethnicity) and their activities (such as walking, running) evolving over space and time is challenging. Limited sensor resolution (often exacerbated as a function of platform standoff distance) and clutter from shadows in dense target environments, unfavorable environmental conditions, and the normal properties of real data all contribute to the challenge. The unique and innovative aspect of our approach is a synthesis of multimodal signal processing with incremental non-parametric, hierarchical Bayesian machine learning methods to create a new kind of target classification architecture. This architecture is designed from the ground up to optimally exploit correlations among the multiple sensing modalities (multimodal data fusion) and rapidly and continuously learns (online self-tuning) patterns of distinct classes of dismounts given little a priori information. This increases classification performance in the presence of challenges posed by anti-access/area denial (A2/AD) sensing. To fuse multimodal features, Long-range Dismount Activity Classification (LODAC) develops a novel statistical information theoretic approach for multimodal data fusion that jointly models multimodal data (i.e., a probabilistic model for cross-modal signal generation) and discovers the critical cross-modal correlations by identifying components (features) with maximal mutual information (MI) which is efficiently estimated using non-parametric entropy models. LODAC develops a generic probabilistic pattern learning and classification framework based on a new class of hierarchical Bayesian learning algorithms for efficiently discovering recurring patterns (classes of dismounts) in multiple simultaneous time series (sensor modalities) at multiple levels of feature granularity.
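The fusion step described above selects components with maximal mutual information. As a hedged illustration only, the plug-in (histogram) estimator below computes MI between two discrete features from co-occurrence counts; the function name is mine, and LODAC itself uses non-parametric entropy models rather than this simple counting estimator.

```python
# Plug-in estimate of mutual information I(X;Y) in bits for pairs of
# discrete feature observations. A toy stand-in for the non-parametric
# entropy models mentioned in the abstract; names are illustrative.
from math import log2
from collections import Counter

def mutual_information(pairs):
    """pairs: list of (x, y) observations of two discrete features."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts for X
    py = Counter(y for _, y in pairs)     # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

# perfectly correlated binary features share exactly 1 bit
print(mutual_information([(0, 0), (1, 1)] * 50))  # → 1.0
```

Features whose pairing with another modality yields high MI are exactly the cross-modal correlations such a fusion architecture would retain.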
Locality Aware Concurrent Start for Stencil Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrestha, Sunil; Gao, Guang R.; Manzano Franco, Joseph B.
Stencil computations are at the heart of many physical simulations used in scientific codes. Thus, there exists a plethora of optimization efforts for this family of computations. Among these techniques, tiling techniques that allow concurrent start have proven to be very efficient in providing better performance for these critical kernels. Nevertheless, with many-core designs being the norm, these optimization techniques might not be able to fully exploit locality (both spatial and temporal) on multiple levels of the memory hierarchy without compromising parallelism. It is no longer true that the machine can be seen as a homogeneous collection of nodes with caches, main memory and an interconnect network. New architectural designs exhibit complex grouping of nodes, cores, threads, caches and memory connected by an ever-evolving network-on-chip design. These new designs may benefit greatly from carefully crafted schedules and groupings that encourage parallel actors (i.e. threads, cores or nodes) to be aware of the computational history of other actors in close proximity. In this paper, we provide an efficient tiling technique that allows hierarchical concurrent start for memory-hierarchy-aware tile groups. Each execution schedule and tile shape exploit the available parallelism, load balance and locality present in the given applications. We demonstrate our technique on the Intel Xeon Phi architecture with selected and representative stencil kernels. We show improvement ranging from 5.58% to 31.17% over existing state-of-the-art techniques.
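The basic tiling idea underlying such techniques can be shown on a 1-D three-point Jacobi stencil. This sketch illustrates only spatial tiling under invented parameters; the paper's actual contribution (time skewing, hierarchical tile groups, and concurrent start across cores on the Xeon Phi) is far beyond a few lines of Python, so treat this as a minimal mental model, not the technique itself.

```python
# Spatial tiling for a 1-D three-point Jacobi stencil: the interior is
# swept in fixed-size tiles so each tile's working set can stay in a
# fast level of the memory hierarchy. Grid sizes and tile widths below
# are illustrative assumptions.

def jacobi_reference(u, steps):
    """Untiled update: each interior point becomes the 3-point average."""
    for _ in range(steps):
        u = [u[0]] + [(u[i - 1] + u[i] + u[i + 1]) / 3
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

def jacobi_tiled(u, steps, tile):
    """Same computation, sweeping the interior tile by tile."""
    n = len(u)
    for _ in range(steps):
        new = u[:]                       # double buffer; boundaries fixed
        for start in range(1, n - 1, tile):
            for i in range(start, min(start + tile, n - 1)):
                new[i] = (u[i - 1] + u[i] + u[i + 1]) / 3
        u = new
    return u

grid = [0.0] * 16 + [100.0] + [0.0] * 15
assert jacobi_tiled(grid, 10, 4) == jacobi_reference(grid, 10)
```

Because the tiled sweep performs the identical arithmetic per point, it reproduces the untiled result exactly; the performance work in the paper is about choosing tile shapes and schedules so that neighboring actors reuse each other's recently computed data.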
Structural basis for Diels-Alder ribozyme-catalyzed carbon-carbon bond formation
Serganov, Alexander; Keiper, Sonja; Malinina, Lucy; Tereshko, Valentina; Skripkin, Eugene; Höbartner, Claudia; Polonskaia, Anna; Phan, Anh Tuân; Wombacher, Richard; Micura, Ronald; Dauter, Zbigniew; Jäschke, Andres; Patel, Dinshaw J
2015-01-01
The majority of structural efforts addressing RNA’s catalytic function have focused on natural ribozymes, which catalyze phosphodiester transfer reactions. By contrast, little is known about how RNA catalyzes other types of chemical reactions. We report here the crystal structures of a ribozyme that catalyzes enantioselective carbon-carbon bond formation by the Diels-Alder reaction in the unbound state and in complex with a reaction product. The RNA adopts a λ-shaped nested pseudoknot architecture whose preformed hydrophobic pocket is precisely complementary in shape to the reaction product. RNA folding and product binding are dictated by extensive stacking and hydrogen bonding, whereas stereoselection is governed by the shape of the catalytic pocket. Catalysis is apparently achieved by a combination of proximity, complementarity and electronic effects. We observe structural parallels in the independently evolved catalytic pocket architectures for ribozyme- and antibody-catalyzed Diels-Alder carbon-carbon bond-forming reactions. PMID:15723077
Hemispheric Asymmetries during Processing of Immoral Stimuli
Cope, Lora M.; Borg, Jana Schaich; Harenski, Carla L.; Sinnott-Armstrong, Walter; Lieberman, Debra; Nyalakanti, Prashanth K.; Calhoun, Vince D.; Kiehl, Kent A.
2010-01-01
Evolutionary approaches to dissecting our psychological architecture underscore the importance of both function and structure. Here we focus on both the function and structure of our neural circuitry and report a functional bilateral asymmetry associated with the processing of immoral stimuli. Many processes in the human brain are associated with functional specialization unique to one hemisphere. With respect to emotions, most research points to right-hemispheric lateralization. Here we provide evidence that not all emotional stimuli share right-hemispheric lateralization. Across three studies employing different paradigms, the processing of negative morally laden stimuli was found to be highly left-lateralized. Regions of engagement common to the three studies include the left medial prefrontal cortex, left temporoparietal junction, and left posterior cingulate. These data support the hypothesis that processing of immoral stimuli preferentially engages left-hemispheric processes and shed light on our evolved neural architecture. PMID:21344009
Entry, Descent, and Landing Performance for a Mid-Lift-to-Drag Ratio Vehicle at Mars
NASA Technical Reports Server (NTRS)
Johnson, Breanna J.; Braden, Ellen M.; Sostaric, Ronald R.; Cerimele, Christopher J.; Lu, Ping
2018-01-01
In an effort to mature the design of the Mid-Lift-to-Drag ratio Rigid Vehicle (MRV) candidate of the NASA Evolvable Mars Campaign (EMC) architecture study, end-to-end six-degree-of-freedom (6DOF) simulations are needed to ensure a successful entry, descent, and landing (EDL) design. The EMC study is assessing different vehicle and mission architectures to determine which candidate would be best to deliver a 20 metric ton payload to the surface of Mars. Due to the large mass payload and the relatively low atmospheric density of Mars, all candidates of the EMC study propose to use Supersonic Retro-Propulsion (SRP) throughout the descent and landing phase, as opposed to parachutes, in order to decelerate to a subsonic touchdown. This paper presents a 6DOF entry-to-landing performance and controllability study with sensitivities to dispersions, particularly in the powered descent and landing phases.
Distributed Multisensor Data Fusion under Unknown Correlation and Data Inconsistency
Abu Bakr, Muhammad; Lee, Sukhan
2017-01-01
The paradigm of multisensor data fusion has evolved from a centralized architecture to a decentralized or distributed architecture along with the advancement in sensor and communication technologies. These days, distributed state estimation and data fusion have been widely explored in diverse fields of engineering and control due to their superior performance over the centralized approach in terms of flexibility, robustness to failure and cost effectiveness in infrastructure and communication. However, distributed multisensor data fusion is not without technical challenges to overcome: namely, dealing with cross-correlation and inconsistency among state estimates and sensor data. In this paper, we review the key theories and methodologies of distributed multisensor data fusion available to date with a specific focus on handling unknown correlation and data inconsistency. We aim at providing readers with a unifying view out of individual theories and methodologies by presenting a formal analysis of their implications. Finally, several directions of future research are highlighted. PMID:29077035
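A standard technique in this literature for fusing estimates whose cross-correlation is unknown is covariance intersection. The scalar sketch below is a hedged illustration (the function names are mine, and the coarse grid search over the weight is a simplification of how the weight is usually optimized): the convex combination in information space remains consistent for any true correlation between the two sources.

```python
# Scalar covariance intersection (CI): fuse two estimates (mean,
# variance) without knowing their cross-correlation. Function names
# and the grid search are illustrative choices, not a library API.

def ci_fuse(x1, p1, x2, p2, omega):
    """Convex information-space combination with weight omega in [0, 1].
    The fused estimate is consistent for ANY unknown cross-correlation."""
    info = omega / p1 + (1.0 - omega) / p2
    p = 1.0 / info
    x = p * (omega * x1 / p1 + (1.0 - omega) * x2 / p2)
    return x, p

def ci_fuse_opt(x1, p1, x2, p2, n=100):
    """Pick the weight that minimizes the fused variance (coarse grid)."""
    return min((ci_fuse(x1, p1, x2, p2, k / n) for k in range(n + 1)),
               key=lambda xp: xp[1])

# source 1: mean 1.0, variance 2.0; source 2: mean 3.0, variance 4.0
x, p = ci_fuse_opt(1.0, 2.0, 3.0, 4.0)
```

In the scalar case the optimal weight simply selects the lower-variance source, which is CI's conservatism at work: absent correlation knowledge, it never claims more information than the best single input provides.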
Grounding Robot Autonomy in Emotion and Self-awareness
NASA Astrophysics Data System (ADS)
Sanz, Ricardo; Hernández, Carlos; Hernando, Adolfo; Gómez, Jaime; Bermejo, Julita
Much is being done in an attempt to transfer emotional mechanisms from reverse-engineered biology into social robots. There are two basic approaches: the imitative display of emotion, e.g. to make robots appear more human-like, and the provision of architectures with intrinsic emotion, in the hope of enhancing behavioral aspects. This paper focuses on the second approach, describing a core vision regarding the integration of cognitive, emotional and autonomic aspects in social robot systems. This vision has evolved from the effort to consolidate models extracted from rat emotion research and to implement them in technical use cases, based on a general systemic analysis in the framework of the ICEA and C3 projects. The approach aims for generality, seeking universal theories of integrated (autonomic, emotional, cognitive) behavior. The proposed conceptualizations and architectural principles are then captured in a theoretical framework: ASys, the Autonomous Systems Framework.
A Design for Composing and Extending Vehicle Models
NASA Technical Reports Server (NTRS)
Madden, Michael M.; Neuhaus, Jason R.
2003-01-01
The Systems Development Branch (SDB) at NASA Langley Research Center (LaRC) creates simulation software products for research. Each product consists of an aircraft model with experiment extensions. SDB treats its aircraft models as reusable components, upon which experiments can be built. SDB has evolved aircraft model design with the following goals: 1. Avoid polluting the aircraft model with experiment code. 2. Discourage the copy and tailor method of reuse. The current evolution of that architecture accomplishes these goals by reducing experiment creation to extend and compose. The architecture mechanizes the operational concerns of the model's subsystems and encapsulates them in an interface inherited by all subsystems. Generic operational code exercises the subsystems through the shared interface. An experiment is thus defined by the collection of subsystems that it creates ("compose"). Teams can modify the aircraft subsystems for the experiment using inheritance and polymorphism to create variants ("extend").
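The extend-and-compose pattern described above can be sketched in a few lines. The subsystem names, the shared interface, and the drag numbers below are purely illustrative, not SDB's actual code.

```python
class Subsystem:
    """Shared operational interface inherited by all subsystems."""
    def initialize(self):
        pass
    def update(self, dt):
        pass

class Aerodynamics(Subsystem):
    """A baseline aircraft-model subsystem (placeholder physics)."""
    def __init__(self):
        self.drag = 0.0
    def update(self, dt):
        self.drag = 0.02  # illustrative constant-drag model

class ExperimentAero(Aerodynamics):
    """'Extend': an experiment variant created through inheritance,
    keeping experiment code out of the baseline model."""
    def update(self, dt):
        super().update(dt)
        self.drag *= 1.1  # experiment-specific modification

class Simulation:
    """Generic operational code; it knows only the shared interface."""
    def __init__(self, subsystems):
        self.subsystems = subsystems  # 'compose': the experiment is
                                      # defined by what it collects here
    def step(self, dt):
        for s in self.subsystems:
            s.update(dt)

sim = Simulation([ExperimentAero()])
sim.step(0.01)
```

The generic driver never changes between experiments; only the composed list of subsystems does.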
Topical perspective on massive threading and parallelism.
Farber, Robert M
2011-09-01
Unquestionably, computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increased computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, such as the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts--be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world--is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism, with some insight into the future. Published by Elsevier Inc.
Genetic Architecture Promotes the Evolution and Maintenance of Cooperation
Frénoy, Antoine; Taddei, François; Misevic, Dusan
2013-01-01
When cooperation has a direct cost and an indirect benefit, a selfish behavior is more likely to be selected for than an altruistic one. Kin and group selection do provide evolutionary explanations for the stability of cooperation in nature, but we still lack a full understanding of the genomic mechanisms that can prevent cheater invasion. In our study we used Aevol, an agent-based, in silico genomic platform, to evolve populations of digital organisms that compete, reproduce, and cooperate by secreting a public good for tens of thousands of generations. We found that cooperating individuals may share a phenotype, defined as the amount of public good produced, but have very different abilities to resist cheater invasion. To understand the underlying genetic differences between cooperator types, we performed bio-inspired genomics analyses of our digital organisms by recording and comparing the locations of metabolic and secretion genes, as well as the relevant promoters and terminators. Association between metabolic and secretion genes (promoter sharing, overlap via frame shift or sense-antisense encoding) was characteristic of populations with robust cooperation and was more likely to evolve when secretion was costly. In mutational analysis experiments, we demonstrated the potential evolutionary consequences of the genetic association by performing a large number of mutations and measuring their phenotypic and fitness effects. The non-cooperating mutants arising from individuals with genetic association were more likely to carry deleterious metabolic mutations, so that selection eventually eliminated such mutants from the population due to the accompanying fitness decrease. Effectively, cooperation evolved to be protected and robust to mutations through entangled genetic architecture.
Our results confirm the importance of second-order selection on evolutionary outcomes, uncover an important genetic mechanism for the evolution and maintenance of cooperation, and suggest promising methods for preventing gene loss in synthetically engineered organisms. PMID:24278000
Origins of multicellular evolvability in snowflake yeast
Ratcliff, William C.; Fankhauser, Johnathon D.; Rogers, David W.; Greig, Duncan; Travisano, Michael
2015-01-01
Complex life has arisen through a series of ‘major transitions’ in which collectives of formerly autonomous individuals evolve into a single, integrated organism. A key step in this process is the origin of higher-level evolvability, but little is known about how higher-level entities originate and gain the capacity to evolve as an individual. Here we report a single mutation that not only creates a new level of biological organization, but also potentiates higher-level evolvability. Disrupting the transcription factor ACE2 in Saccharomyces cerevisiae prevents mother–daughter cell separation, generating multicellular ‘snowflake’ yeast. Snowflake yeast develop through deterministic rules that produce geometrically defined clusters that preclude genetic conflict and display a high broad-sense heritability for multicellular traits; as a result they are preadapted to multicellular adaptation. This work demonstrates that simple microevolutionary changes can have profound macroevolutionary consequences, and suggests that the formation of clonally developing clusters may often be the first step to multicellularity. PMID:25600558
ARIADNE: a Tracking System for Relationships in LHCb Metadata
NASA Astrophysics Data System (ADS)
Shapoval, I.; Clemencic, M.; Cattaneo, M.
2014-06-01
The data processing model of the LHCb experiment implies handling of an evolving set of heterogeneous metadata entities and relationships between them. The entities range from software and database states to architecture specifications and software/data deployment locations. For instance, there is an important relationship between the LHCb Conditions Database (CondDB), which provides versioned, time-dependent geometry and conditions data, and the LHCb software, i.e., the data processing applications used for simulation, high-level triggering, reconstruction and analysis of physics data. The evolution of the CondDB and of the LHCb applications is a weakly-homomorphic process: relationships between a CondDB state and an LHCb application state may not be preserved across different database and application generations. These issues may lead to various kinds of problems in LHCb production, varying from unexpected application crashes to incorrect data processing results. In this paper we present Ariadne, a generic metadata relationships tracking system based on the novel NoSQL Neo4j graph database. Its aim is to track and analyze many thousands of evolving relationships for cases such as the one described above, and several others, which would otherwise remain unmanaged and potentially harmful. The highlights of the paper include the system's implementation and management details, the infrastructure needed for running it, security issues, first experience of usage in LHCb production, and the potential of the system to be applied to a wider set of LHCb tasks.
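At its core, a relationship tracker of this kind is a labeled directed graph. The toy sketch below uses plain dictionaries rather than Neo4j, and the node and relation names are invented for illustration; it only shows the register-and-query pattern such a system supports.

```python
class MetadataGraph:
    """Minimal labeled-edge graph: node -> set of (relation, node)."""
    def __init__(self):
        self.edges = {}

    def relate(self, src, relation, dst):
        # record a directed, labeled relationship
        self.edges.setdefault(src, set()).add((relation, dst))

    def related(self, src, relation):
        # all targets reachable from src via this relation, sorted
        return sorted(d for r, d in self.edges.get(src, set()) if r == relation)

g = MetadataGraph()
# hypothetical database-state and application-version node names
g.relate("CondDB:v42", "COMPATIBLE_WITH", "Brunel:v44r0")
g.relate("CondDB:v42", "COMPATIBLE_WITH", "DaVinci:v33r1")
```

A query for the applications compatible with a given CondDB state then becomes a single edge lookup, which is exactly the kind of traversal a graph database optimizes.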
Eysenbach, G
2001-01-01
This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be. PMID:11772549
Understanding the Evolution and Stability of the G-Matrix
Arnold, Stevan J.; Bürger, Reinhard; Hohenlohe, Paul A.; Ajie, Beverley C.; Jones, Adam G.
2011-01-01
The G-matrix summarizes the inheritance of multiple, phenotypic traits. The stability and evolution of this matrix are important issues because they affect our ability to predict how the phenotypic traits evolve by selection and drift. Despite the centrality of these issues, comparative, experimental, and analytical approaches to understanding the stability and evolution of the G-matrix have met with limited success. Nevertheless, empirical studies often find that certain structural features of the matrix are remarkably constant, suggesting that persistent selection regimes or other factors promote stability. On the theoretical side, no one has been able to derive equations that would relate stability of the G-matrix to selection regimes, population size, migration, or to the details of genetic architecture. Recent simulation studies of evolving G-matrices offer solutions to some of these problems, as well as a deeper, synthetic understanding of both the G-matrix and adaptive radiations. PMID:18973631
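The predictive role of G mentioned in this abstract is standardly expressed through the multivariate breeder's equation, Δz̄ = Gβ: the change in the mean phenotype equals the G-matrix times the directional selection gradient. A minimal numeric illustration (the variances, covariance, and gradient values are arbitrary):

```python
def matvec(G, beta):
    """Matrix-vector product for the multivariate breeder's equation."""
    return [sum(g * b for g, b in zip(row, beta)) for row in G]

# G-matrix: additive genetic variances on the diagonal,
# additive genetic covariance off the diagonal
G = [[1.0, 0.5],
     [0.5, 2.0]]
beta = [0.2, 0.0]       # directional selection on trait 1 only
dz = matvec(G, beta)    # per-generation response: dz = G @ beta
```

Even though selection acts only on trait 1, the nonzero genetic covariance produces a correlated response in trait 2, which is why the stability of G matters for predicting trait evolution.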
The architecture of human kin detection
Lieberman, Debra; Tooby, John; Cosmides, Leda
2012-01-01
Evolved mechanisms for assessing genetic relatedness have been found in many species, but their existence in humans has been a matter of controversy. Here we report three converging lines of evidence, drawn from siblings, that support the hypothesis that kin detection mechanisms exist in humans. These operate by computing, for each familiar individual, a unitary regulatory variable (the kinship index) that corresponds to a pairwise estimate of genetic relatedness between self and other. The cues that the system uses were identified by quantitatively matching individual exposure to potential cues of relatedness to variation in three outputs relevant to the system’s evolved functions: sibling altruism, aversion to personally engaging in sibling incest, and moral opposition to third party sibling incest. As predicted, the kin detection system uses two distinct, ancestrally valid cues to compute relatedness: the familiar other’s perinatal association with the individual’s biological mother, and duration of sibling coresidence. PMID:17301784
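The regulatory logic described here, a unitary kinship index computed from two cues, can be caricatured in a few lines. The weights, cap, and functional form below are illustrative assumptions of ours, not the fitted model from the study.

```python
def kinship_index(mpa, coresidence_years, w_cores=0.1, cap=1.0):
    """Toy pairwise kinship index from the two cues the study reports.

    mpa: 1 if the familiar other was observed as a newborn in perinatal
         association with the individual's biological mother, else 0.
    coresidence_years: duration of sibling coresidence.
    Weights and cap are hypothetical, chosen only for illustration.
    """
    if mpa:
        # maternal perinatal association is treated as a near-certain cue
        return cap
    # otherwise relatedness is estimated from coresidence duration
    return min(cap, w_cores * coresidence_years)
```

In the study's framework, this single variable would then jointly regulate the three outputs measured: sibling altruism, incest aversion, and moral opposition to third-party incest.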
Evolving EO-1 Sensor Web Testbed Capabilities in Pursuit of GEOSS
NASA Technical Reports Server (NTRS)
Mandl, Dan; Ly, Vuong; Frye, Stuart; Younis, Mohamed
2006-01-01
A viewgraph presentation on evolving sensor web capabilities in support of the Global Earth Observing System of Systems (GEOSS) is shown. The topics include: 1) Vision to Enable Sensor Webs with "Hot Spots"; 2) Vision Extended for Communication/Control Architecture for Missions to Mars; 3) Key Capabilities Implemented to Enable EO-1 Sensor Webs; 4) One of Three Experiments Conducted by UMBC Undergraduate Class 12-14-05 (1 - 3); 5) Closer Look at our Mini-Rovers and Simulated Mars Landscape at GSFC; 6) Beginning to Implement Experiments with Standards-Vision for Integrated Sensor Web Environment; 7) Goddard Mission Services Evolution Center (GMSEC); 8) GMSEC Component Catalog; 9) Core Flight System (CFS) and Extension for GMSEC for Flight SW; 10) Sensor Modeling Language; 11) Seamless Ground to Space Integrated Message Bus Demonstration (completed December 2005); 12) Other Experiments in Queue; 13) Acknowledgements; and 14) References.
Tanikawa, Akio; Shinkai, Akira; Miyashita, Tadashi
2014-11-01
The evolutionary process of the unique web architectures of spiders of the sub-family Cyrtarachninae, which includes the triangular web weaver, bolas spider, and webless spider, is thought to be derived from reduction of orbicular 'spanning-thread webs' resembling ordinary orb webs. A molecular phylogenetic analysis was conducted to explore this hypothesis using orbicular web spiders Cyrtarachne, Paraplectana, Poecilopachys, triangular web spider Pasilobus, bolas spiders Ordgarius and Mastophora, and webless spider Celaenia. The phylogeny inferred from partial sequences of mt-COI, nuclear 18S-rRNA and 28S-rRNA showed that the common ancestor of these spiders diverged into two clades: a spanning-thread web clade and a bolas or webless clade. This finding suggests that the triangular web evolved by reduction of an orbicular spanning web, but that bolas spiders evolved in the early stage, which does not support the gradual web reduction hypothesis.
Coding principles of the canonical cortical microcircuit in the avian brain
Calabrese, Ana; Woolley, Sarah M. N.
2015-01-01
Mammalian neocortex is characterized by a layered architecture and a common or “canonical” microcircuit governing information flow among layers. This microcircuit is thought to underlie the computations required for complex behavior. Despite the absence of a six-layered cortex, birds are capable of complex cognition and behavior. In addition, the avian auditory pallium is composed of adjacent information-processing regions with genetically identified neuron types and projections among regions comparable with those found in the neocortex. Here, we show that the avian auditory pallium exhibits the same information-processing principles that define the canonical cortical microcircuit, long thought to have evolved only in mammals. These results suggest that the canonical cortical microcircuit evolved in a common ancestor of mammals and birds and provide a physiological explanation for the evolution of neural processes that give rise to complex behavior in the absence of cortical lamination. PMID:25691736
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present an evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. The Common Architecture was a cross-thread theme, to ensure that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different service types: Data Services (e.g. WMS), Application Services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). 
The latest testbed, OGC Web Service phase 9, completed in 2012 had 5 threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities including the Earth System Science Community.
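The Broker/Requestor/Provider pattern at the heart of the early common architecture can be sketched as a minimal service registry. The service-type acronyms are real OGC ones, but the endpoints and the register/discover API below are invented for illustration.

```python
class Registry:
    """Toy broker: providers register typed endpoints,
    requestors discover them by service type."""
    def __init__(self):
        self.services = {}

    def register(self, service_type, endpoint):
        # a provider advertises an endpoint under a service type
        self.services.setdefault(service_type, []).append(endpoint)

    def discover(self, service_type):
        # a requestor looks up all endpoints of a given type
        return list(self.services.get(service_type, []))

registry = Registry()
registry.register("WMS", "http://example.org/wms")  # data service
registry.register("WFS", "http://example.org/wfs")
endpoints = registry.discover("WMS")
```

Later testbeds moved away from a single registry of this kind toward brokering among communities' own information models, which is the shift the talk traces.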
Design and Development of a Methane Cryogenic Propulsion Stage for Human Mars Exploration
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Polsgrove, Tara; Turpin, Jason; Alexander, Leslie
2016-01-01
NASA is currently working on the Evolvable Mars Campaign (EMC) study to outline transportation and mission options for human exploration of Mars. One of the key aspects of the EMC is leveraging current and planned near-term technology investments to build an affordable and evolvable approach to Mars exploration. This leveraging of investments includes the use of high-power Solar Electric Propulsion (SEP) systems, evolved from those currently under development in support of the Asteroid Redirect Mission (ARM), to deliver payloads to Mars. The EMC is considering several transportation options that combine solar electric and chemical propulsion technologies to deliver crew and cargo to Mars. In one primary architecture option, the SEP propulsion system is used to pre-deploy mission elements to Mars while a high-thrust chemical propulsion system is used to send crew on faster ballistic transfers between Earth and Mars. This high-thrust chemical system uses liquid oxygen - liquid methane main propulsion and reaction control systems integrated into the Methane Cryogenic Propulsion Stage (MCPS). Over the past year, there have been several studies completed to provide critical design and development information related to the MCPS. This paper is intended to provide a summary of these efforts. A summary of the current point of departure design for the MCPS is provided as well as an overview of the mission architecture and concept of operations that the MCPS is intended to support. To leverage the capabilities of solar electric propulsion to the greatest extent possible, the EMC architecture pre-deploys to Mars orbit the stages required for returning crew from Mars. While this changes the risk posture of the architecture, it can provide some mass savings by using higher-efficiency systems for interplanetary transfer. However, this does introduce significantly longer flight times to Mars which, in turn, increases the overall lifetime of the stages to as long as 2500 days. 
This unique aspect of the concept of operations introduces several challenges, specifically related to propellant storage and engine reliability. These challenges and some potential solutions are discussed. Specific focus is provided on two key technology areas: propulsion and cryogenic fluid management. In the area of propulsion development, the development of an integrated methane propulsion system that combines both main propulsion and reaction control is discussed. This includes an overview of potential development paths, areas where development for Mars applications is complementary to development efforts underway in other parts of the aerospace industry, and commonality between the MCPS methane propulsion applications and other Mars elements, including the Mars lander systems. This commonality is a key affordability aspect of the Evolvable Mars Campaign. A similar discussion is provided for cryogenic fluid management technologies, including a discussion of how using cryo propulsion in the Mars transportation application not only provides performance benefits but also leverages decades of technology development investments made by NASA and its aerospace contractor community.
Design and Development of a Methane Cryogenic Propulsion Stage for Human Mars Exploration
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Polsgrove, Tara; Turpin, Jason; Alexander, Leslie
2016-01-01
NASA is currently working on the Evolvable Mars Campaign (EMC) study to outline transportation and mission options for human exploration of Mars. One of the key aspects of the EMC is leveraging current and planned near-term technology investments to build an affordable and evolvable approach to Mars exploration. This leveraging of investments includes the use of high-power Solar Electric Propulsion (SEP) systems evolved from those currently under development in support of the Asteroid Redirect Mission to deliver payloads to Mars. The EMC is considering several transportation options that combine solar electric and chemical propulsion technologies to deliver crew and cargo to Mars. In one primary architecture option, the SEP propulsion system is used to pre-deploy mission elements to Mars while a high-thrust chemical propulsion system is used to send crew on faster ballistic transfers between Earth and Mars. This high-thrust chemical system uses liquid oxygen - liquid methane main propulsion and reaction control systems integrated into the Methane Cryogenic Propulsion Stage (MCPS). Over the past year, there have been several studies completed to provide critical design and development information related to the MCPS. This paper is intended to provide a summary of these efforts. A summary of the current point of departure design for the MCPS is provided as well as an overview of the mission architecture and concept of operations that the MCPS is intended to support. To leverage the capabilities of solar electric propulsion to the greatest extent possible, the EMC architecture pre-deploys the required stages for returning crew from Mars. While this changes the risk posture of the architecture, it provides mass savings by using higher-efficiency systems for interplanetary transfer. However, this does introduce significantly longer flight times to Mars which, in turn, increases the overall lifetime of the stages to as long as 3000 days. 
This unique aspect of the concept of operations introduces several challenges, specifically related to propellant storage and engine reliability. These challenges and some potential solutions are discussed. Specific focus is provided on two key technology areas: propulsion and cryogenic fluid management. In the area of propulsion development, the development of an integrated methane propulsion system that combines both main propulsion and reaction control is discussed. This includes an overview of potential development paths, areas where development for Mars applications is complementary to development efforts underway in other parts of the aerospace industry, and commonality between the MCPS methane propulsion applications and other Mars elements, including the Mars lander systems. This commonality is a key affordability aspect of the Evolvable Mars Campaign. A similar discussion is provided for cryogenic fluid management technologies, including a discussion of how using cryo-propulsion in the Mars transportation application not only provides performance benefits but also leverages decades of technology development investments made by NASA and its aerospace contractor community.
Dual RING E3 Architectures Regulate Multiubiquitination and Ubiquitin Chain Elongation by APC/C.
Brown, Nicholas G; VanderLinden, Ryan; Watson, Edmond R; Weissmann, Florian; Ordureau, Alban; Wu, Kuen-Phon; Zhang, Wei; Yu, Shanshan; Mercredi, Peter Y; Harrison, Joseph S; Davidson, Iain F; Qiao, Renping; Lu, Ying; Dube, Prakash; Brunner, Michael R; Grace, Christy R R; Miller, Darcie J; Haselbach, David; Jarvis, Marc A; Yamaguchi, Masaya; Yanishevski, David; Petzold, Georg; Sidhu, Sachdev S; Kuhlman, Brian; Kirschner, Marc W; Harper, J Wade; Peters, Jan-Michael; Stark, Holger; Schulman, Brenda A
2016-06-02
Protein ubiquitination involves E1, E2, and E3 trienzyme cascades. E2 and RING E3 enzymes often collaborate to first prime a substrate with a single ubiquitin (UB) and then achieve different forms of polyubiquitination: multiubiquitination of several sites and elongation of linkage-specific UB chains. Here, cryo-EM and biochemistry show that the human E3 anaphase-promoting complex/cyclosome (APC/C) and its two partner E2s, UBE2C (aka UBCH10) and UBE2S, adopt specialized catalytic architectures for these two distinct forms of polyubiquitination. The APC/C RING constrains UBE2C proximal to a substrate and simultaneously binds a substrate-linked UB to drive processive multiubiquitination. Alternatively, during UB chain elongation, the RING does not bind UBE2S but rather lures an evolving substrate-linked UB to UBE2S positioned through a cullin interaction to generate a Lys11-linked chain. Our findings define mechanisms of APC/C regulation, and establish principles by which specialized E3-E2-substrate-UB architectures control different forms of polyubiquitination. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, the system typically has a high-performing network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. We propose a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
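A distributed data structure of the kind described can be caricatured as a key-value store whose keys are partitioned across memory domains by hashing. The sketch below simulates the domains inside one process and is not the actual library's interface; it only shows how one abstraction can hide where each item physically lives.

```python
class DistributedDict:
    """Toy distributed key-value structure: each key is owned by one
    of several simulated memory domains, chosen by hashing, so callers
    see a single uniform put/get interface."""
    def __init__(self, n_domains=4):
        self.domains = [{} for _ in range(n_domains)]

    def _home(self, key):
        # deterministic owner domain for this key (within one process)
        return hash(key) % len(self.domains)

    def put(self, key, value):
        self.domains[self._home(key)][key] = value

    def get(self, key):
        return self.domains[self._home(key)][key]
```

In a real implementation the domains would be node-local DRAM, high-bandwidth memory, or remote nodes reached over the network, but the calling code would be unchanged, which is the portability argument the abstract makes.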
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-10-28
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., Reo coordination language. With rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system assuring, by design scalability and the interoperability, correctness of component cooperation.
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-01-01
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems’ architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., Reo coordination language. With rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system assuring, by design scalability and the interoperability, correctness of component cooperation. PMID:27801829
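A cooperative demand-response policy of the kind such an architecture coordinates can be caricatured as capped admission of household load requests per time slot. The policy below is an illustrative stand-in of ours, not the rCPCS/Reo coordination semantics.

```python
def coordinate_demand(requests, cap):
    """Toy cooperative DR: grant household load requests in order
    until the neighbourhood cap (e.g. kW) is reached; defer the
    rest to the next slot."""
    granted, deferred, total = [], [], 0.0
    for household, load in requests:
        if total + load <= cap:
            granted.append(household)
            total += load
        else:
            deferred.append(household)
    return granted, deferred

# three hypothetical households competing for a 5.0 kW slot
granted, deferred = coordinate_demand(
    [("h1", 2.0), ("h2", 3.0), ("h3", 2.5)], cap=5.0)
```

The value of a component-based model like rCPCS is that the coordination rule (here a single function) is specified separately from the household components, so it can be verified and swapped without touching them.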
Improving a HMM-based off-line handwriting recognition system using MME-PSO optimization
NASA Astrophysics Data System (ADS)
Hamdani, Mahdi; El Abed, Haikal; Hamdani, Tarek M.; Märgner, Volker; Alimi, Adel M.
2011-01-01
A non-trivial step in the development of a classifier is the design of its architecture. This paper presents a new algorithm, Multi Models Evolvement (MME) using Particle Swarm Optimization (PSO). This algorithm is a modified version of basic PSO, used for the unsupervised design of Hidden Markov Model (HMM) based architectures. The proposed algorithm is applied to an Arabic handwriting recognizer based on discrete-probability HMMs. After the optimization of their architectures, the HMMs are trained with the Baum-Welch algorithm. The system is validated on the IfN/ENIT database. The performance of the developed approach is compared to the systems that participated in the 2005 competition on Arabic handwriting recognition at the International Conference on Document Analysis and Recognition (ICDAR). The final system is a combination of an optimized HMM with 6 other HMMs obtained by simple variation of the number of states. An absolute improvement of 6% in word recognition rate, to about 81%, is achieved relative to the baseline system (ARAB-IfN). The proposed recognizer also outperforms most of the known state-of-the-art systems.
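The abstract describes PSO-driven architecture search but gives no algorithmic detail. As an illustration only (this is a generic particle swarm optimizer, not the paper's MME-PSO, whose update rules are not given here), optimizing a one-dimensional design variable such as a notional HMM state count might be sketched as:

```python
import random

random.seed(0)  # deterministic for demonstration

def pso(objective, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best, and all particles are attracted toward the global best."""
    lo, hi = bounds
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i] + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)  # clamp to bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i], val
    return gbest, gbest_val

# Toy objective: pretend validation error is minimized at 6 HMM states.
best, err = pso(lambda s: (s - 6.0) ** 2, bounds=(1, 20))
```

In a real architecture search the objective would be a validation error obtained by training and scoring an HMM with the candidate structure, which is far more expensive than this toy quadratic.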
An Architecture for Autonomous Rovers on Future Planetary Missions
NASA Astrophysics Data System (ADS)
Ocon, J.; Avilés, M.; Graziano, M.
2018-04-01
This paper proposes an architecture for autonomous planetary rovers. The architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.
Space Communications Capability Roadmap Interim Review
NASA Technical Reports Server (NTRS)
Spearing, Robert; Regan, Michael
2005-01-01
Contents include the following: Identify the need for a robust communications and navigation architecture for the success of exploration and science missions. Describe an approach for specifying architecture alternatives and analyzing them. Establish a top level architecture based on a network of networks. Identify key enabling technologies. Synthesize capability, architecture and technology into an initial capability roadmap.
Design and Analysis of Architectures for Structural Health Monitoring Systems
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Sixto, S. L. (Technical Monitor)
2002-01-01
During the two-year project period, we have worked on several aspects of Health Usage and Monitoring Systems for structural health monitoring. In particular, we have made contributions in the following areas. 1. Reference HUMS architecture: We developed a high-level architecture for health monitoring and usage systems (HUMS). The proposed reference architecture is shown. It is compatible with the Generic Open Architecture (GOA) proposed as a standard for avionics systems. 2. HUMS kernel: One of the critical layers of the HUMS reference architecture is the HUMS kernel. We developed a detailed design of a kernel to implement the high-level architecture. 3. Prototype implementation of HUMS kernel: We have implemented a preliminary version of the HUMS kernel on a Unix platform. We have implemented both a centralized version and a distributed version. 4. SCRAMNet and HUMS: SCRAMNet (Shared Common Random Access Memory Network) is a system found to be suitable for implementing HUMS. For this reason, we have conducted a simulation study to determine its stability in handling the input data rates in HUMS. 5. Architectural specification.
The Aeronautical Data Link: Decision Framework for Architecture Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2003-01-01
A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.
Clinical Decision Support: a 25 Year Retrospective and a 25 Year Vision.
Middleton, B; Sittig, D F; Wright, A
2016-08-02
The objective of this review is to summarize the state of the art of clinical decision support (CDS) circa 1990, review progress in the 25 year interval from that time, and provide a vision of what CDS might look like 25 years hence, or circa 2040. Informal review of the medical literature with iterative review and discussion among the authors to arrive at six axes (data, knowledge, inference, architecture and technology, implementation and integration, and users) to frame the review and discussion of selected barriers and facilitators to the effective use of CDS. In each of the six axes, significant progress has been made. Key advances in structuring and encoding standardized data with an increased availability of data, development of knowledge bases for CDS, and improvement of capabilities to share knowledge artifacts, explosion of methods analyzing and inferring from clinical data, evolution of information technologies and architectures to facilitate the broad application of CDS, improvement of methods to implement CDS and integrate CDS into the clinical workflow, and increasing sophistication of the end-user, all have played a role in improving the effective use of CDS in healthcare delivery. CDS has evolved dramatically over the past 25 years and will likely evolve just as dramatically or more so over the next 25 years. Increasingly, the clinical encounter between a clinician and a patient will be supported by a wide variety of cognitive aids to support diagnosis, treatment, care-coordination, surveillance and prevention, and health maintenance or wellness.
An MBSE Approach to Space Suit Development
NASA Technical Reports Server (NTRS)
Cordova, Lauren; Kovich, Christine; Sargusingh, Miriam
2012-01-01
The EVA/Space Suit Development Office (ESSD) Systems Engineering and Integration (SE&I) team has utilized MBSE in multiple programs. After developing operational and architectural models, the MBSE framework was expanded to link the requirements space to the system models through functional analysis and interfaces definitions. By documenting all the connections within the technical baseline, ESSD experienced significant efficiency improvements in analysis and identification of change impacts. One of the biggest challenges presented to the MBSE structure was a program transition and restructuring effort, which was completed successfully in 4 months culminating in the approval of a new EVA Technical Baseline. During this time three requirements sets spanning multiple DRMs were streamlined into one NASA-owned Systems Requirement Document (SRD) that successfully identified requirements relevant to the current hardware development effort while remaining extensible to support future hardware developments. A capability-based hierarchy was established to provide a more flexible framework for future space suit development that can support multiple programs with minimal rework of basic EVA/Space Suit requirements. This MBSE approach was most recently applied for generation of an EMU Demonstrator technical baseline being developed for an ISS DTO. The relatively quick turnaround of operational concepts, architecture definition, and requirements for this new suit development has allowed us to test and evolve the MBSE process and framework in an extremely different setting while still offering extensibility and traceability throughout ESSD projects. The ESSD MBSE framework continues to be evolved in order to support integration of all products associated with the SE&I engine.
Berkowitz, Murray R
2013-01-01
Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.
Relation of Melatonin to Sleep Architecture in Children with Autism
ERIC Educational Resources Information Center
Leu, Roberta M.; Beyderman, Liya; Botzolakis, Emmanuel J.; Surdyka, Kyla; Wang, Lily; Malow, Beth A.
2011-01-01
Children with autism often suffer from sleep disturbances, and compared to age-matched controls, have decreased melatonin levels, as indicated by urine levels of the primary melatonin metabolite, 6-sulfatoxymelatonin (6-SM). We therefore investigated the relationship between 6-SM levels and sleep architecture in children with autism spectrum…
The evolutionary dynamics of haplodiploidy: Genome architecture and haploid viability.
Blackmon, Heath; Hardy, Nate B; Ross, Laura
2015-11-01
Haplodiploid reproduction, in which males are haploid and females are diploid, is widespread among animals, yet we understand little about the forces responsible for its evolution. The current theory is that haplodiploidy has evolved through genetic conflicts, as it provides a transmission advantage to mothers. Male viability is thought to be a major limiting factor; diploid individuals tend to harbor many recessive lethal mutations. This theory predicts that the evolution of haplodiploidy is more likely in male heterogametic lineages with few chromosomes, as genes on the X chromosome are often expressed in a haploid environment, and the fewer the chromosome number, the greater the proportion of the total genome that is X-linked. We test this prediction with comparative phylogenetic analyses of mites, among which haplodiploidy has evolved repeatedly. We recover a negative correlation between chromosome number and haplodiploidy, find evidence that low chromosome number evolved prior to haplodiploidy, and that it is unlikely that diplodiploidy has reevolved from haplodiploid lineages of mites. These results are consistent with the predicted importance of haploid male viability. © 2015 The Author(s). Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.
Towards a Framework for Evolvable Network Design
NASA Astrophysics Data System (ADS)
Hassan, Hoda; Eltarras, Ramy; Eltoweissy, Mohamed
The layered Internet architecture that has long guided network design and protocol engineering was an “interconnection architecture” defining a framework for interconnecting networks rather than a model for generic network structuring and engineering. We claim that the approach of abstracting the network in terms of an internetwork hinders a thorough understanding of the network's salient characteristics and emergent behavior, thereby impeding the design evolution required to address extreme scale, heterogeneity, and complexity. This paper reports on our work in progress that aims to: 1) investigate the problem space in terms of the factors and decisions that influenced the design and development of computer networks; 2) sketch the core principles for designing complex computer networks; and 3) propose a model and related framework for building evolvable, adaptable and self-organizing networks. We adopt a bottom-up strategy primarily focusing on the building unit of the network model, which we call the “network cell”. The model is inspired by natural complex systems. A network cell is intrinsically capable of specialization, adaptation and evolution. Subsequently, we propose CellNet, a framework for evolvable network design. We outline scenarios for using the CellNet framework to enhance the legacy Internet protocol stack.
Convergence in Thunniform Anatomy in Lamnid Sharks and Jurassic Ichthyosaurs.
Lingham-Soliar, Theagarten
2016-12-01
Among extinct ichthyosaurs, the Jurassic forms Ichthyosaurus and Stenopterygius share a number of anatomical specializations with lamnid sharks, characterized in the white shark, Carcharodon carcharias. These features allow their inclusion within the mode of high-speed thunniform swimming, to which only two other equally distinctive phylogenetic groups belong, tuna and dolphins: a striking testament to evolutionary convergence. Jurassic ichthyosaurs evolved from reptiles that had returned to the sea (secondarily adapted) about 250 million years ago (MYA), while lamnid sharks evolved about 50 MYA from early cartilaginous fishes (originating ca. 400 MYA). Their shared, independently evolved anatomical characteristics are discussed. These include a deep tear-drop body shape that helped initially define members as thunniform swimmers. Later, other critical structural characteristics were discovered, such as the crossed-fiber architecture of the skin, high-speed-adapted dorsal and caudal fins, a caudal peduncle, and a series of ligaments that enable transmission of power from the anteriorly located musculature to the caudal fin. Both groups also share a similar chemistry of the dermal fibers, i.e., the scleroprotein collagen. © The Author 2016. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Implementing the Freight Transportation Data Architecture : Data Element Dictionary
DOT National Transportation Integrated Search
2015-01-01
NCFRP Report 9: Guidance for Developing a Freight Data Architecture articulates the value of establishing architecture for linking data across modes, subjects, and levels of geography to obtain essential information for decision making. Central to th...
Fry, Bryan G; Scheib, Holger; van der Weerd, Louise; Young, Bruce; McNaughtan, Judith; Ramjan, S F Ryan; Vidal, Nicolas; Poelmann, Robert E; Norman, Janette A
2008-02-01
Venom is a key innovation underlying the evolution of advanced snakes (Caenophidia). Despite this, very little is known about venom system structural diversification, toxin recruitment event timings, or toxin molecular evolution. A multidisciplinary approach was used to examine the diversification of the venom system and associated toxins across the full range of the approximately 100 million-year-old advanced snake clade with a particular emphasis upon families that have not secondarily evolved a front-fanged venom system ( approximately 80% of the 2500 species). Analysis of cDNA libraries revealed complex venom transcriptomes containing multiple toxin types including three finger toxins, cobra venom factor, cysteine-rich secretory protein, hyaluronidase, kallikrein, kunitz, lectin, matrix metalloprotease, phospholipase A(2), snake venom metalloprotease/a disintegrin and metalloprotease, and waprin. High levels of sequence diversity were observed, including mutations in structural and functional residues, changes in cysteine spacing, and major deletions/truncations. Morphological analysis comprising gross dissection, histology, and magnetic resonance imaging also demonstrated extensive modification of the venom system architecture in non-front-fanged snakes in contrast to the conserved structure of the venom system within the independently evolved front-fanged elapid or viperid snakes. Further, a reduction in the size and complexity of the venom system was observed in species in which constriction has been secondarily evolved as the preferred method of prey capture or dietary preference has switched from live prey to eggs or to slugs/snails. Investigation of the timing of toxin recruitment events across the entire advanced snake radiation indicates that the evolution of advanced venom systems in three front-fanged lineages is associated with recruitment of new toxin types or explosive diversification of existing toxin types. 
These results support the role of venom as a key evolutionary innovation in the diversification of advanced snakes and identify a potential role for non-front-fanged venom toxins as a rich source for lead compounds for drug design and development.
Integrating hospital information systems in healthcare institutions: a mediation architecture.
El Azami, Ikram; Cherkaoui Malki, Mohammed Ouçamah; Tahon, Christian
2012-10-01
Many studies have examined the integration of information systems into healthcare institutions, leading to several standards in the healthcare domain (CORBAmed: Common Object Request Broker Architecture in Medicine; HL7: Health Level Seven International; DICOM: Digital Imaging and Communications in Medicine; and IHE: Integrating the Healthcare Enterprise). Due to the existence of a wide diversity of heterogeneous systems, three essential factors are necessary to fully integrate a system: data, functions and workflow. However, most of the previous studies have dealt with only one or two of these factors and this makes the system integration unsatisfactory. In this paper, we propose a flexible, scalable architecture for Hospital Information Systems (HIS). Our main purpose is to provide a practical solution to insure HIS interoperability so that healthcare institutions can communicate without being obliged to change their local information systems and without altering the tasks of the healthcare professionals. Our architecture is a mediation architecture with 3 levels: 1) a database level, 2) a middleware level and 3) a user interface level. The mediation is based on two central components: the Mediator and the Adapter. Using the XML format allows us to establish a structured, secured exchange of healthcare data. The notion of medical ontology is introduced to solve semantic conflicts and to unify the language used for the exchange. Our mediation architecture provides an effective, promising model that promotes the integration of hospital information systems that are autonomous, heterogeneous, semantically interoperable and platform-independent.
Rugged, Portable, Real-Time Optical Gaseous Analyzer for Hydrogen Fluoride
NASA Technical Reports Server (NTRS)
Pilgrim, Jeffrey; Gonzales, Paula
2012-01-01
Hydrogen fluoride (HF) is a primary evolved combustion product of fluorinated and perfluorinated hydrocarbons. HF is produced during combustion by the presence of impurities and hydrogen-containing polymers including polyimides. This effect is especially dangerous in closed occupied volumes like spacecraft and submarines. In these systems, combinations of perfluorinated hydrocarbons and polyimides are used for insulating wiring. HF is both highly toxic and short-lived in closed environments due to its reactivity. The high reactivity also makes HF sampling problematic. An infrared optical sensor can detect promptly evolving HF with minimal sampling requirements, while providing both high sensitivity and high specificity. A rugged optical path length enhancement architecture enables both high HF sensitivity and rapid environmental sampling with minimal gaseous contact with the low-reactivity sensor surfaces. The inert optical sample cell, combined with infrared semiconductor lasers, is joined with an analog and digital electronic control architecture that allows for ruggedness and compactness. The combination provides both portability and battery operation on a simple camcorder battery for up to eight hours. Optical detection of gaseous HF is confounded by the need for rapid sampling with minimal contact between the sensor and the environmental sample. A sensor is required that must simultaneously provide the required sub-parts-per-million detection limits, but with the high specificity and selectivity expected of optical absorption techniques. It should also be rugged and compact for compatibility with operation onboard spacecraft and submarines. A new optical cell has been developed for which environmental sampling is accomplished by simply traversing the few mm-thick cell walls into an open volume where the measurement is made.
A small, low-power fan or vacuum pump may be used to push or pull the gaseous sample into the sample volume for a response time of a few seconds. The optical cell simultaneously provides for an enhanced optical interaction path length between the environmental sample and the infrared laser. Further, the optical cell itself is comprised of inert materials that render it immune to attack by HF. In some cases, the sensor may be configured so that the optoelectronic devices themselves are protected and isolated from HF by the optical cell. The optical sample cell is combined with custom-developed analog and digital control electronics that provide rugged, compact operation on a platform that can run on a camcorder battery. The sensor is inert with respect to acidic gases like HF, while providing the required sensitivity, selectivity, and response time. Certain types of combustion events evolve copious amounts of HF, very little of other gases typically associated with combustion (e.g., carbon monoxide), and very low levels of aerosols and particulates (which confound traditional smoke detectors). The new sensor platform could warn occupants early enough to take the necessary countermeasures.
An information model for a virtual private optical network (OVPN) using virtual routers (VRs)
NASA Astrophysics Data System (ADS)
Vo, Viet Minh Nhat
2002-05-01
This paper describes a virtual private optical network architecture (Optical VPN, or OVPN) based on virtual routers (VRs). It improves on architectures suggested for virtual private networks by using virtual routers with optical networks. The novelty of this architecture lies in the changes necessary to adapt to the devices and protocols used in optical networks. This paper also presents information models for the OVPN, at the architecture level and at the service level. These are extensions of DEN (Directory Enabled Networks) and CIM (Common Information Model) for OVPNs using VRs. The goal is to propose a common management model using policies.
FTA Transit Intelligent Transportation System Architecture Consistency Review - 2010 Update
DOT National Transportation Integrated Search
2011-07-01
This report provides an assessment on the level of compliance among the FTA grantees with the National ITS Architecture Policy, specifically examining three items: 1. The use and maintenance of Regional ITS Architectures by transit agencies to plan, ...
Insider Threat Security Reference Architecture
2012-04-01
CMU/SEI-2012-TR-007. Figure 2 shows the four layers of the ITSRA, including the Business Security Architecture and Data Security Architecture layers, which help organizations improve their level of preparedness to address the insider threat.
DOT National Transportation Integrated Search
1999-04-22
The CVISN Operational and Architectural Compatibility Handbook (COACH) provides a comprehensive checklist of what is required to conform with the Commercial Vehicle Information Systems and Networks (CVISN) operational concepts and architecture. It is...
Research Update: Programmable tandem repeat proteins inspired by squid ring teeth
NASA Astrophysics Data System (ADS)
Pena-Francesch, Abdon; Domeradzka, Natalia E.; Jung, Huihun; Barbu, Benjamin; Vural, Mert; Kikuchi, Yusuke; Allen, Benjamin D.; Demirel, Melik C.
2018-01-01
Cephalopods have evolved many interesting features that can serve as inspiration. Repetitive squid ring teeth (SRT) proteins from cephalopods exhibit properties such as strength, self-healing, and biocompatibility. These proteins have been engineered to design novel adhesives, self-healing textiles, and the assembly of 2D layered materials. Compared to conventional polymers, repetitive proteins are easy to modify and can assemble in various morphologies and molecular architectures. This research update discusses the molecular biology and materials science of polypeptides inspired by SRT proteins, their properties, and perspectives for future applications.
The Technology of LiFi: A Brief Introduction
NASA Astrophysics Data System (ADS)
Ramadhani, E.; Mahardika, G. P.
2018-03-01
Light Fidelity (LiFi) is a Visible Light Communication (VLC)-based technology that uses light as the communication medium in place of wired communication. LiFi evolved to overcome the data-rate limits of WiFi; with LiFi, rates can reach up to 14 Gbps. This paper presents an introduction to LiFi technology, including its architecture, modulation, performance, and challenges. The paper can be used as a reference and source of background knowledge for developing LiFi technology.
Cyberinfrastructure for Aircraft Mission Support
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.
2010-01-01
For the last several years, NASA's Airborne Science Program has been developing and using infrastructure and applications that enable researchers to interact with each other and with airborne instruments via network communications. Use of these tools has increased near-realtime situational awareness during field operations, resulting in productivity improvements, improved decision making, and the collection of better data. Advances in pre-mission planning and post-mission access have also emerged. Integrating these capabilities with other tools to evolve a coherent service-oriented enterprise architecture for aircraft flight and test operations is the subject of ongoing efforts.
Human factors in spacecraft design
NASA Technical Reports Server (NTRS)
Harrison, Albert A.; Connors, Mary M.
1990-01-01
This paper describes some of the salient implications of evolving mission parameters for spacecraft design. Among the requirements for future spacecraft are new, higher standards of living, increased support of human productivity, and greater accommodation of physical and cultural variability. Design issues include volumetric allowances, architecture and layouts, closed life support systems, health maintenance systems, recreational facilities, automation, privacy, and decor. An understanding of behavioral responses to design elements is a precondition for critical design decisions. Human factors research results must be taken into account early in the course of the design process.
A failure management prototype: DR/Rx
NASA Technical Reports Server (NTRS)
Hammen, David G.; Baker, Carolyn G.; Kelly, Christine M.; Marsh, Christopher A.
1991-01-01
This failure management prototype performs failure diagnosis and recovery management of hierarchical, distributed systems. The prototype, which evolved from a series of previous prototypes following a spiral model for development, focuses on two functions: (1) the diagnostic reasoner (DR) performs integrated failure diagnosis in distributed systems; and (2) the recovery expert (Rx) develops plans to recover from the failure. Issues related to expert system prototype design and the previous history of this prototype are discussed. The architecture of the current prototype is described in terms of the knowledge representation and functionality of its components.
A highly efficient 3D level-set grain growth algorithm tailored for ccNUMA architecture
NASA Astrophysics Data System (ADS)
Mießen, C.; Velinov, N.; Gottstein, G.; Barrales-Mora, L. A.
2017-12-01
A highly efficient simulation model for 2D and 3D grain growth was developed based on the level-set method. The model introduces modern computational concepts to achieve excellent performance on parallel computer architectures. Strong scalability was measured on cache-coherent non-uniform memory access (ccNUMA) architectures. To achieve this, the proposed approach considers the application of local level-set functions at the grain level. Ideal and non-ideal grain growth was simulated in 3D with the objective to study the evolution of statistical representative volume elements in polycrystals. In addition, microstructure evolution in an anisotropic magnetic material affected by an external magnetic field was simulated.
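The core memory idea of the approach above, local level-set functions at the grain level rather than one global function, can be illustrated with a toy sketch (grid size, band width, and the circular grain are invented for illustration): each grain stores a signed-distance function only on its bounding box plus a narrow band, not over the whole domain.

```python
import numpy as np

def signed_distance_circle(X, Y, cx, cy, r):
    """Signed distance to a circular grain boundary: negative inside."""
    return np.hypot(X - cx, Y - cy) - r

# Full simulation domain: N x N grid on the unit square.
N = 256
xs = np.linspace(0.0, 1.0, N)
X, Y = np.meshgrid(xs, xs)

# A single grain: circle of radius 0.1 centered at (0.3, 0.3).
cx, cy, r = 0.3, 0.3, 0.1

# Local level-set: restrict phi to the grain's bounding box plus a band.
band = 0.05
mask_x = (xs >= cx - r - band) & (xs <= cx + r + band)
mask_y = mask_x  # grain is centered symmetrically, so same mask both axes
phi_local = signed_distance_circle(X[np.ix_(mask_y, mask_x)],
                                   Y[np.ix_(mask_y, mask_x)],
                                   cx, cy, r)

# The local array covers only a small fraction of the full domain.
fraction = phi_local.size / X.size
```

For many grains, each with its own small local array, this keeps data per grain compact and cache-friendly, which is what allows the memory placement to be matched to a ccNUMA machine; the evolution step itself (velocity extension, reinitialization) is omitted here.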
NASA Astrophysics Data System (ADS)
Gallant, Frederick M.
A novel method of fabricating functionally graded extruded composite materials is proposed for propellant applications using the technology of continuous processing with a Twin-Screw Extruder. The method is applied to the manufacturing of grains for solid rocket motors in an end-burning configuration with an axial gradient in ammonium perchlorate volume fraction and relative coarse/fine particle size distributions. The fabrication of functionally graded extruded polymer composites with either inert or energetic ingredients has yet to be investigated. The lack of knowledge concerning the processing of these novel materials has necessitated that a number of research issues be addressed. Of primary concern is characterizing and modeling the relationship between the extruder screw geometry, transient processing conditions, and the gradient architecture that evolves in the extruder. Recent interpretations of the Residence Time Distributions (RTDs) and Residence Volume Distributions (RVDs) for polymer composites in the TSE are used to develop new process models for predicting gradient architectures in the direction of extrusion. An approach is developed for characterizing the sections of the extrudate using optical, mechanical, and compositional analysis to determine the gradient architectures. The effects of processing on the burning rate properties of extruded energetic polymer composites are characterized for homogeneous formulations over a range of compositions to determine realistic gradient architectures for solid rocket motor applications. The new process models and burning rate properties that have been characterized in this research effort will be the basis for an inverse design procedure that is capable of determining gradient architectures for grains in solid rocket motors that possess tailored burning rate distributions that conform to user-defined performance specifications.
Something old, something new: data warehousing in the digital age
NASA Astrophysics Data System (ADS)
Maguire, Rob; Woolf, Andrew
2015-04-01
The implications of digital transformation for Earth science data managers are significant: big data, internet of things, new sources of third-party observations. This at a time when many are struggling to deal with half a century of legacy data infrastructure since the International Geophysical Year. While data management best practice has evolved over this time, large-scale migration activities are rare, with processes and applications instead built up around a plethora of different technologies and approaches. It is perhaps more important than ever, before embarking on major investments in new technologies, to consider the benefits first of 'catching up' with mature best-practice. Data warehousing, as an architectural formalism, was developed in the 1990s as a response to the growing challenges in corporate environments of assembling, integrating, and quality controlling large amounts of data from multiple sources and for multiple purposes. A layered architecture separates transactional data, integration and staging areas, the warehouse itself, and analytical 'data marts', with optimised ETL (Extract, Transform, Load) processes used to promote data through the layers. The data warehouse, together with associated techniques of 'master data management' and 'business intelligence', provides a classic foundation for 'enterprise information management' ("an integrative discipline for structuring, describing and governing information assets across organizational and technological boundaries to improve efficiency, promote transparency and enable business insight", Gartner). The Australian Bureau of Meteorology, like most Earth-science agencies, maintains a large amount of observation data in a variety of systems and architectures. These data assets evolve over decades, usually for operational, rather than information management, reasons. Consequently there can be inconsistency in architectures and technologies. 
We describe our experience with two major data assets: the Australian Water Resource Information System (AWRIS) and the Australian Data Archive for Meteorology (ADAM). These maintain the national archive of hydrological and climate data. We are undertaking a migration of AWRIS from a 'software-centric' system to a 'data-centric' warehouse, with significant benefits in performance, scalability, and maintainability. As well, the architecture supports the use of conventional BI tools for product development and visualisation. We have also experimented with a warehouse ETL replacement for custom tsunameter ingest code in ADAM, with considerable success. Our experience suggests that there is benefit to be gained through adoption by science agencies of professional IT best practice that is mature in industry but may have been overlooked by scientific information practitioners. In the case of data warehousing, the practice requires a change of perspective from a focus on code development to a focus on data. It will continue to be relevant in the 'digital age' as vendors increasingly support integrated warehousing and 'big data' platforms.
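The layered promotion of data through staging, integration, and warehouse layers described above can be sketched as a minimal ETL pipeline. The record fields and station codes below are hypothetical illustrations, not drawn from AWRIS or ADAM.

```python
# Minimal sketch of a layered extract-transform-load (ETL) pipeline,
# assuming hypothetical observation records; not the AWRIS/ADAM implementation.

def extract(raw_rows):
    """Staging layer: pull rows in from a transactional source as-is."""
    return list(raw_rows)

def transform(rows):
    """Integration layer: quality-control and normalise records."""
    clean = []
    for row in rows:
        if row.get("value") is None:          # reject incomplete observations
            continue
        clean.append({
            "station": row["station"].strip().upper(),
            "value_mm": float(row["value"]),  # normalise rainfall to millimetres
        })
    return clean

def load(warehouse, rows):
    """Warehouse layer: append into a subject-oriented store keyed by station."""
    for row in rows:
        warehouse.setdefault(row["station"], []).append(row["value_mm"])
    return warehouse

raw = [{"station": " syd ", "value": "3.2"}, {"station": "mel", "value": None}]
wh = load({}, transform(extract(raw)))
print(wh)  # {'SYD': [3.2]}
```

The point of the layering is that each stage can be replaced independently, which is the data-centric shift the abstract argues for.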
Specification, Design, and Analysis of Advanced HUMS Architectures
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi
2004-01-01
During the two-year project period, we have worked on several aspects of domain-specific architectures for HUMS. In particular, we looked at using scenario-based approach for the design and designed a language for describing such architectures. The language is now being used in all aspects of our HUMS design. In particular, we have made contributions in the following areas. 1) We have employed scenarios in the development of HUMS in three main areas. They are: (a) To improve reusability by using scenarios as a library indexing tool and as a domain analysis tool; (b) To improve maintainability by recording design rationales from two perspectives - problem domain and solution domain; (c) To evaluate the software architecture. 2) We have defined a new architectural language called HADL or HUMS Architectural Definition Language. It is a customized version of xArch/xADL. It is based on XML and, hence, is easily portable from domain to domain, application to application, and machine to machine. Specifications written in HADL can be easily read and parsed using the currently available XML parsers. Thus, there is no need to develop a plethora of software to support HADL. 3) We have developed an automated design process that involves two main techniques: (a) Selection of solutions from a large space of designs; (b) Synthesis of designs. However, the automation process is not an absolute Artificial Intelligence (AI) approach though it uses a knowledge-based system that epitomizes a specific HUMS domain. The process uses a database of solutions as an aid to solve the problems rather than creating a new design in the literal sense. 
Since searching is adopted as the main technique, the challenges involved are: (a) to minimize the effort in searching the database, where a very large number of possibilities exist; (b) to develop representations that conveniently depict design knowledge evolved over many years; (c) to capture the required information that aids the automation process.
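Because HADL is XML-based, an architecture description can be read with any standard XML parser, as the abstract notes. The element and attribute names below are hypothetical stand-ins, not the actual HADL schema.

```python
# Sketch of parsing an XML-based architecture description with the standard
# library; the element names here are illustrative, not the real HADL schema.
import xml.etree.ElementTree as ET

hadl_doc = """
<architecture name="hums-demo">
  <component id="sensor" type="accelerometer"/>
  <component id="monitor" type="analysis"/>
  <connector from="sensor" to="monitor"/>
</architecture>
"""

root = ET.fromstring(hadl_doc)
components = {c.get("id"): c.get("type") for c in root.iter("component")}
links = [(l.get("from"), l.get("to")) for l in root.iter("connector")]
print(components)  # {'sensor': 'accelerometer', 'monitor': 'analysis'}
print(links)       # [('sensor', 'monitor')]
```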
The evolvability of programmable hardware.
Raman, Karthik; Wagner, Andreas
2011-02-06
In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected 'neutral networks' in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits ('genotypes') and 10^19 logic functions ('phenotypes'). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry.
The evolvability of programmable hardware
Raman, Karthik; Wagner, Andreas
2011-01-01
In biological systems, individual phenotypes are typically adopted by multiple genotypes. Examples include protein structure phenotypes, where each structure can be adopted by a myriad individual amino acid sequence genotypes. These genotypes form vast connected ‘neutral networks’ in genotype space. The size of such neutral networks endows biological systems not only with robustness to genetic change, but also with the ability to evolve a vast number of novel phenotypes that occur near any one neutral network. Whether technological systems can be designed to have similar properties is poorly understood. Here we ask this question for a class of programmable electronic circuits that compute digital logic functions. The functional flexibility of such circuits is important in many applications, including applications of evolutionary principles to circuit design. The functions they compute are at the heart of all digital computation. We explore a vast space of 10^45 logic circuits (‘genotypes’) and 10^19 logic functions (‘phenotypes’). We demonstrate that circuits that compute the same logic function are connected in large neutral networks that span circuit space. Their robustness or fault-tolerance varies very widely. The vicinity of each neutral network contains circuits with a broad range of novel functions. Two circuits computing different functions can usually be converted into one another via few changes in their architecture. These observations show that properties important for the evolvability of biological systems exist in a commercially important class of electronic circuitry. They also point to generic ways to generate fault-tolerant, adaptable and evolvable electronic circuitry. PMID:20534598
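The genotype/phenotype mapping for logic circuits can be illustrated at toy scale: a 'genotype' is a gate wiring, its 'phenotype' the truth table it computes, and all genotypes sharing a phenotype form one neutral set. The two-NAND-gate encoding below is a deliberately tiny illustration, far from the paper's 10^45-circuit space.

```python
# Toy sketch of circuit 'genotypes' grouped into neutral sets by the logic
# function ('phenotype') they compute; illustrative scale only.
from collections import Counter
from itertools import product

def nand(x, y):
    return 1 - (x & y)

def phenotype(g):
    """g = (i, j, k, l): gate 1 = NAND(sig[i], sig[j]); output = NAND(sig[k], sig[l])."""
    table = []
    for a, b in product((0, 1), repeat=2):
        sig = [a, b]
        sig.append(nand(sig[g[0]], sig[g[1]]))    # gate 1 reads the two inputs
        table.append(nand(sig[g[2]], sig[g[3]]))  # output gate may also read gate 1
    return tuple(table)

# Enumerate the whole genotype space: gate 1 wires to inputs only,
# the output gate wires to the inputs or to gate 1.
space = [(i, j, k, l) for i in (0, 1) for j in (0, 1)
         for k in (0, 1, 2) for l in (0, 1, 2)]
networks = Counter(phenotype(g) for g in space)

print(len(space))              # 36 genotypes
print(len(networks))           # 9 distinct phenotypes
print(networks[(1, 1, 1, 0)])  # 8 genotypes compute NAND
```

Even in this tiny space, the genotype-to-phenotype map is highly many-to-one, which is the precondition for the neutral networks the paper studies.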
Model-Driven Architecture for Agent-Based Systems
NASA Technical Reports Server (NTRS)
Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.
2004-01-01
The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes the kinds of models to be used, how those models may be prepared, and the relationships among the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. Software artifact generation is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.
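The component-to-artifacts conversion can be sketched, in spirit, as a small template-driven generator. The component fields and templates below are hypothetical, not the Cougaar MDA metamodel.

```python
# Toy sketch of template-driven artifact generation from a component model;
# field names and templates are hypothetical, not the Cougaar metamodel.

COMPONENT = {"name": "SensorPlugin", "inputs": ["RawReading"], "outputs": ["Alert"]}

CODE_TMPL = "public class {name} {{ /* consumes {ins}; publishes {outs} */ }}"
DOC_TMPL = "{name}: transforms {ins} into {outs}."
TEST_TMPL = "assert that {name} emits {outs} when given {ins}"

def generate(component):
    """Render code, documentation, and test-case artifacts from one model."""
    ctx = {
        "name": component["name"],
        "ins": ", ".join(component["inputs"]),
        "outs": ", ".join(component["outputs"]),
    }
    return {
        "code": CODE_TMPL.format(**ctx),
        "doc": DOC_TMPL.format(**ctx),
        "test": TEST_TMPL.format(**ctx),
    }

artifacts = generate(COMPONENT)
print(artifacts["doc"])  # SensorPlugin: transforms RawReading into Alert.
```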
NASA Astrophysics Data System (ADS)
Armstrong, Michael James
Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrent to platform definition. With an increased complexity introduced during conceptual design, accurate predictions of unit-level sizing requirements must be made. Architecture-specific emergent requirements must be identified which arise due to the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing-critical requirements to the aircraft vehicle systems. These requirements are architecture specific and emergent. Standard heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts which vary significantly in terms of structure and composition require that unique failure mitigation strategies be defined for accurate estimations of unit-level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. Discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture. 
Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means of system safety assessment (SSA). The traditional two-state, discrete system reliability assessment proves insufficient. Reliability is, therefore, handled in an analog fashion: as a function of magnitude of failure and failure duration. A series of metrics is introduced which characterize system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit-level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied to two vehicle systems concepts (conventional and 'more-electric') in terms of loss/hazard relationships with varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of the system- and function-level risk and unit-level importance. It was also shown that off-nominal emergent requirements must be defined specific to each architecture concept. Quantitative comparisons of architecture-specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design. 
Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing-critical emergent requirements concurrent with architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means of eliciting these architecture-specific requirements during the performance of architecture trades.
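The idea of load shedding as minimization over a continuous hazard function can be sketched with a toy allocator: each function maps its fractional loss to a hazard through an assumed convex curve, and a search allocates the surviving capacity to minimize total hazard. The function names, weights, and quadratic curve are illustrative assumptions, not values from the thesis.

```python
# Hedged sketch of load shedding as continuous hazard minimisation; the
# hazard curves, weights, and capacities below are illustrative only.
from itertools import product

# hazard(loss) = weight * loss**2: convex, so low-criticality loads shed first
FUNCTIONS = {"flight_controls": 10.0, "avionics": 5.0, "galley": 0.5}

def total_hazard(alloc):
    """Sum of per-function hazards given the fraction of demand served."""
    return sum(w * (1.0 - alloc[name]) ** 2 for name, w in FUNCTIONS.items())

def best_shed(capacity, step=0.1):
    """Grid-search allocations (fraction of demand served) within capacity."""
    levels = [round(i * step, 1) for i in range(int(1 / step) + 1)]
    best = None
    for alloc in product(levels, repeat=len(FUNCTIONS)):
        named = dict(zip(FUNCTIONS, alloc))
        if sum(alloc) <= capacity + 1e-9 and (
                best is None or total_hazard(named) < total_hazard(best)):
            best = named
    return best

# After a failure, only 2.0 'units' of capacity remain for 3.0 units of demand.
plan = best_shed(2.0)
print(plan["galley"] <= plan["flight_controls"])  # low-criticality loads shed first
```

Replacing the discrete FHA constraint with this continuous objective is what lets the optimizer trade small losses on many functions against a total loss of one.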
NASA Technical Reports Server (NTRS)
Fouts, Douglas J.; Butner, Steven E.
1991-01-01
The design of the processing element of GASP, a GaAs supercomputer with a 500-MHz instruction issue rate and 1-GHz subsystem clocks, is presented. The novel, functionally modular, block data flow architecture of GASP is described. The architecture and design of a GASP processing element is then presented. The processing element (PE) is implemented in a hybrid semiconductor module with 152 custom GaAs ICs of eight different types. The effects of the implementation technology on both the system-level architecture and the PE design are discussed. SPICE simulations indicate that parts of the PE are capable of being clocked at 1 GHz, while the rest of the PE uses a 500-MHz clock. The architecture utilizes data flow techniques at a program block level, which allows efficient execution of parallel programs while maintaining reasonably good performance on sequential programs. A simulation study of the architecture indicates that an instruction execution rate of over 30,000 MIPS can be attained with 65 PEs.
Software architecture of INO340 telescope control system
NASA Astrophysics Data System (ADS)
Ravanmehr, Reza; Khosroshahi, Habib
2016-08-01
The software architecture plays an important role in the distributed control system of astronomical projects because many subsystems and components must work together in a consistent and reliable way. We have utilized a customized architecture design approach based on the "4+1 view model" in order to design the INOCS software architecture. In this paper, after reviewing the top-level INOCS architecture, we present the software architecture model of INOCS inspired by the "4+1 model". For this purpose we provide logical, process, development, physical, and scenario views of our architecture using different UML diagrams and other illustrative visual charts. Each view presents the INOCS software architecture from a different perspective. We finish the paper with the science data operation of INO340 and concluding remarks.
Quantum Computing Architectural Design
NASA Astrophysics Data System (ADS)
West, Jacob; Simms, Geoffrey; Gyure, Mark
2006-03-01
Large scale quantum computers will invariably require scalable architectures in addition to high fidelity gate operations. Quantum computing architectural design (QCAD) addresses the problems of actually implementing fault-tolerant algorithms given physical and architectural constraints beyond those of basic gate-level fidelity. Here we introduce a unified framework for QCAD that enables the scientist to study the impact of varying error correction schemes, architectural parameters including layout and scheduling, and physical operations native to a given architecture. Our software package, aptly named QCAD, provides compilation, manipulation/transformation, multi-paradigm simulation, and visualization tools. We demonstrate various features of the QCAD software package through several examples.
NASA Technical Reports Server (NTRS)
Laurini, Kathleen C.; Hufenbach, Bernhard; Junichiro, Kawaguchi; Piedboeuf, Jean-Claude; Schade, Britta; Lorenzoni, Andrea; Curtis, Jeremy; Hae-Dong, Kim
2010-01-01
The International Space Exploration Coordination Group (ISECG) was established in response to The Global Exploration Strategy: The Framework for Coordination developed by fourteen space agencies and released in May 2007. Several ISECG participating space agencies have been studying concepts for human exploration of the moon that allow individual and collective goals and objectives to be met. This 18 month study activity culminated with the development of the ISECG Reference Architecture for Human Lunar Exploration. The reference architecture is a series of elements delivered over time in a flexible and evolvable campaign. This paper will describe the reference architecture and how it will inform near-term and long-term programmatic planning within interested agencies. The reference architecture is intended to serve as a global point of departure conceptual architecture that enables individual agency investments in technology development and demonstration, International Space Station research and technology demonstration, terrestrial analog studies, and robotic precursor missions to contribute towards the eventual implementation of a human lunar exploration scenario which reflects the concepts and priorities established to date. It also serves to create opportunities for partnerships that will support evolution of this concept and its eventual realization. The ISECG Reference Architecture for Human Lunar Exploration (commonly referred to as the lunar gPoD) reflects the agency commitments to finding an effective balance between conducting important scientific investigations of and from the moon, as well as demonstrating and mastering the technologies and capabilities to send humans farther into the Solar System. The lunar gPoD begins with a robust robotic precursor phase that demonstrates technologies and capabilities considered important for the success of the campaign. Robotic missions will inform the human missions and buy down risks. 
Human exploration will start with a thorough scientific investigation of the polar region while allowing the ability to demonstrate and validate the systems needed to take humans on more ambitious lunar exploration excursions. The ISECG Reference Architecture for Human Lunar Exploration serves as a model for future cooperation and is documented in a summary report and a comprehensive document that also describes the collaborative international process that led to its development. ISECG plans to continue with architecture studies such as this to examine an open transportation architecture and other destinations, with expanded participation from ISECG agencies, as it works to inform international partnerships and advance the Global Exploration Strategy.
A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vineyard, Craig Michael; Verzi, Stephen Joseph
As high performance computing architectures pursue more computational power there is a need for increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types with different characteristics as varying levels of the same architecture. How to efficiently utilize this memory infrastructure is an unknown challenge, and in this research we sought to investigate whether neural inspired approaches can meaningfully help with memory management. In particular we explored neurogenesis inspired resource allocation, and were able to show a neural inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.
NASA Technical Reports Server (NTRS)
Ruiz, B. Ian; Burke, Gary R.; Lung, Gerald; Whitaker, William D.; Nowicki, Robert M.
2004-01-01
This viewgraph presentation reviews the architecture of the CIA-AlA chip-set, a set of mixed-signal ASICs that provides a flexible high-level interface between the spacecraft's command and data handling (C&DH) electronics and lower-level functions in other spacecraft subsystems. Because the chip-set has an open-systems architecture that includes an embedded micro-controller, a variety of applications are possible. The chip-set was developed for missions to the outer planets. The chips were developed to provide a single solution for both the switching and regulation of a spacecraft power bus. The open-systems architecture allows for other powerful applications.
Contention Modeling for Multithreaded Distributed Shared Memory Machines: The Cray XMT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Secchi, Simone; Tumeo, Antonino; Villa, Oreste
Distributed Shared Memory (DSM) machines are a wide class of multi-processor computing systems where a large virtually-shared address space is mapped on a network of physically distributed memories. High memory latency and network contention are two of the main factors that limit performance scaling of such architectures. Modern high-performance computing DSM systems have evolved toward exploitation of massive hardware multi-threading and fine-grained memory hashing to tolerate irregular latencies, avoid network hot-spots and enable high scaling. In order to model the performance of such large-scale machines, parallel simulation has been proved to be a promising approach to achieve good accuracy in reasonable times. One of the most critical factors in solving the simulation speed-accuracy trade-off is network modeling. The Cray XMT is a massively multi-threaded supercomputing architecture that belongs to the DSM class, since it implements a globally-shared address space abstraction on top of a physically distributed memory substrate. In this paper, we discuss the development of a contention-aware network model intended to be integrated in a full-system XMT simulator. We start by measuring the effects of network contention in a 128-processor XMT machine and then investigate the trade-off that exists between simulation accuracy and speed, by comparing three network models which operate at different levels of accuracy. The comparison and model validation is performed by executing a string-matching algorithm on the full-system simulator and on the XMT, using three datasets that generate noticeably different contention patterns.
Kim, Seongkyun; Kim, Hyoungkyu; Kralik, Jerald D.; Jeong, Jaeseung
2016-01-01
Determining the fundamental architectural design of complex nervous systems will lead to significant medical and technological advances. Yet it remains unclear how nervous systems evolved highly efficient networks with near-optimal sharing of pathways that yet produce multiple distinct behaviors to reach the organism’s goals. To determine this, the nematode roundworm Caenorhabditis elegans is an attractive model system. Progress has been made in delineating the behavioral circuits of C. elegans; however, many details are unclear, including the specific functions of every neuron and synapse, as well as the extent to which the behavioral circuits are separate and parallel versus integrative and serial. Network analysis provides a normative approach to help specify the network design. We investigated the vulnerability of the Caenorhabditis elegans connectome by performing computational experiments that (a) “attacked” 279 individual neurons and 2,990 weighted synaptic connections (composed of 6,393 chemical synapses and 890 electrical junctions) and (b) quantified the effects of each removal on global network properties that influence information processing. The analysis identified 12 critical neurons and 29 critical synapses for establishing fundamental network properties. These critical constituents were found to be control elements, i.e., those with the most influence over multiple underlying pathways. Additionally, the critical synapses formed into circuit-level pathways. These emergent pathways provide evidence for (a) the importance of backward locomotion, avoidance behavior, and social feeding behavior to the organism; (b) the potential roles of specific neurons whose functions have been unclear; and (c) both parallel and serial design elements in the connectome, i.e., specific evidence for a mixed architectural design. PMID:27540747
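The 'attack' methodology can be sketched at toy scale: remove each node from a small directed graph and measure the drop in global efficiency (mean inverse shortest-path length). The six-node graph below is purely illustrative, not the C. elegans connectome.

```python
# Toy sketch of node-removal 'attack' analysis on a small directed graph;
# the graph is illustrative, not the 279-neuron connectome.
from collections import deque

EDGES = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": ["F"], "F": []}

def shortest_paths(graph, src):
    """Breadth-first search: hop counts from src to every reachable node."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        u = queue.popleft()
        for v in graph.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def global_efficiency(graph):
    """Mean inverse shortest-path length over all ordered node pairs."""
    nodes, total = list(graph), 0.0
    for u in nodes:
        d = shortest_paths(graph, u)
        total += sum(1.0 / d[v] for v in nodes if v != u and v in d)
    n = len(nodes)
    return total / (n * (n - 1))

def attack(graph, node):
    """Efficiency of the graph with one node (and its edges) removed."""
    pruned = {u: [v for v in vs if v != node]
              for u, vs in graph.items() if u != node}
    return global_efficiency(pruned)

base = global_efficiency(EDGES)
impact = {n: base - attack(EDGES, n) for n in EDGES}
critical = max(impact, key=impact.get)
print(critical)  # 'D': the bottleneck node disconnects the most pathways
```

The node whose removal degrades efficiency most plays the role of the 'critical' control elements the study identifies.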
York, Larry M.; Galindo-Castañeda, Tania; Schussler, Jeffrey R.; Lynch, Jonathan P.
2015-01-01
Increasing the nitrogen use efficiency of maize is an important goal for food security and agricultural sustainability. In the past 100 years, maize breeding has focused on yield and above-ground phenes. Over this period, maize cultivation has changed from low fertilizer inputs and low population densities to intensive fertilization and dense populations. The authors hypothesized that through indirect selection the maize root system has evolved phenotypes suited to more intense competition for nitrogen. Sixteen maize varieties representing commercially successful lines over the past century were planted at two nitrogen levels and three planting densities. Root systems of the most recent material were 7° shallower, had one fewer nodal root per whorl, had double the distance from nodal root emergence to lateral branching, and had 14% more metaxylem vessels, but total metaxylem vessel area remained unchanged because individual metaxylem vessels had 12% less area. Plasticity was also observed in cortical phenes such as aerenchyma, which increased at greater population densities. Simulation modelling with SimRoot demonstrated that even these relatively small changes in root architecture and anatomy could increase maize shoot growth by 16% in a high-density and high-nitrogen environment. The authors concluded that evolution of maize root phenotypes over the past century is consistent with increasing nitrogen use efficiency. Introgression of more contrasting root phene states into the germplasm of elite maize and determination of the functional utility of these phene states in multiple agronomic conditions could contribute to future yield gains. PMID:25795737
Completing sparse and disconnected protein-protein network by deep learning.
Huang, Lei; Liao, Li; Wu, Cathy H
2018-03-22
Protein-protein interaction (PPI) prediction remains a central task in systems biology to achieve a better and holistic understanding of cellular and intracellular processes. Recently, an increasing number of computational methods have shifted from pair-wise prediction to network-level prediction. Many of the existing network-level methods predict PPIs under the assumption that the training network should be connected. However, this assumption greatly affects the prediction power and limits the application area, because the current gold-standard PPI networks are usually very sparse and disconnected. Therefore, how to effectively predict PPIs based on a training network that is sparse and disconnected remains a challenge. In this work, we developed a novel PPI prediction method based on a deep learning neural network and the regularized Laplacian kernel. We use a neural network with an autoencoder-like architecture to implicitly simulate the evolutionary processes of a PPI network. Neurons of the output layer correspond to proteins and are labeled with values (1 for interaction and 0 otherwise) from the adjacency matrix of a sparse disconnected training PPI network. Unlike an autoencoder, neurons at the input layer are given all-zero input, reflecting an assumption of no a priori knowledge about PPIs, and hidden layers of smaller sizes mimic the ancient interactome at different times during evolution. After the training step, an evolved PPI network whose rows are outputs of the neural network can be obtained. We then predict PPIs by applying the regularized Laplacian kernel to the transition matrix that is built upon the evolved PPI network. The results from cross-validation experiments show that the PPI prediction accuracies for yeast data and human data, measured as AUC, are increased by up to 8.4% and 14.9%, respectively, as compared to the baseline. 
Moreover, the evolved PPI network can also help us leverage complementary information from the disconnected training network and multiple heterogeneous data sources. Tested by the yeast data with six heterogeneous feature kernels, the results show our method can further improve the prediction performance by up to 2%, which is very close to an upper bound that is obtained by an Approximate Bayesian Computation based sampling method. The proposed evolution deep neural network, coupled with regularized Laplacian kernel, is an effective tool in completing sparse and disconnected PPI networks and in facilitating integration of heterogeneous data sources.
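The kernel half of the method can be sketched directly: the regularized Laplacian kernel is K = (I + alpha * L)^(-1) with L = D - A, and its entries score candidate interactions. The toy four-protein network and alpha value below are illustrative assumptions; the deep-network evolution step from the paper is omitted.

```python
# Minimal sketch of scoring candidate interactions with the regularised
# Laplacian kernel on a toy 4-protein network; not the paper's full pipeline.
import numpy as np

A = np.array([[0, 1, 1, 0],     # adjacency of a sparse training PPI network
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 0]], dtype=float)   # protein 3 is disconnected

L = np.diag(A.sum(axis=1)) - A              # graph Laplacian, L = D - A
alpha = 0.5
K = np.linalg.inv(np.eye(4) + alpha * L)    # regularised Laplacian kernel

# Proteins 1 and 2 share neighbour 0, so their kernel score should beat
# a pair involving the isolated protein 3 (which stays at zero).
print(K[1, 2] > K[1, 3])  # True
```

This also shows the limitation the paper addresses: a disconnected component (protein 3) receives no signal from the kernel alone, which is why the evolved network is needed first.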
NASA Astrophysics Data System (ADS)
Zhao, Yongli; Ji, Yuefeng; Zhang, Jie; Li, Hui; Xiong, Qianjin; Qiu, Shaofeng
2014-08-01
Ultrahigh throughput capacity requirements are challenging current optical switching nodes, given the fast development of data center networks. Pbit/s-level all-optical switching networks will need to be deployed soon, which will cause high complexity in node architecture. Controlling the future network and the node equipment together will become a new problem. An enhanced Software Defined Networking (eSDN) control architecture is proposed in the paper, which consists of Provider NOX (P-NOX) and Node NOX (N-NOX). With the cooperation of P-NOX and N-NOX, flexible control of the entire network can be achieved. An all-optical switching network testbed has been experimentally demonstrated with efficient eSDN control. Pbit/s-level all-optical switching nodes in the testbed are implemented based on a multi-dimensional switching architecture, i.e. multi-level and multi-planar. Due to space and cost limitations, each optical switching node is equipped with only four input line boxes and four output line boxes. Experimental results are given to verify the performance of our proposed control and switching architecture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, Nathaniel Ray; Waltz, Jacob I.
The level set method is commonly used to model dynamically evolving fronts and interfaces. In this work, we present new methods for evolving fronts with a specified velocity field or in the surface normal direction on 3D unstructured tetrahedral meshes with adaptive mesh refinement (AMR). The level set field is located at the nodes of the tetrahedral cells and is evolved using new upwind discretizations of Hamilton–Jacobi equations combined with a Runge–Kutta method for temporal integration. The level set field is periodically reinitialized to a signed distance function using an iterative approach with a new upwind gradient. We discuss the details of these level set and reinitialization methods. Results from a range of numerical test problems are presented.
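The core idea of evolving a level set field with an upwind discretization can be shown in one dimension: advect phi_t + v * phi_x = 0 with a backward difference when v > 0, and track the front as the zero crossing of phi. This is a first-order 1-D illustration; the paper's 3-D unstructured-mesh and AMR machinery is far beyond it.

```python
# 1-D sketch of level-set front advection with a first-order upwind scheme.
N, dx, v, dt = 100, 0.1, 1.0, 0.05        # CFL = v * dt / dx = 0.5
phi = [i * dx - 2.0 for i in range(N)]    # signed distance; front at x = 2.0

def step(phi):
    new = phi[:]                          # phi[0] fixed: inflow boundary
    for i in range(1, N):
        # v > 0, so take the backward (upwind) difference
        new[i] = phi[i] - v * dt * (phi[i] - phi[i - 1]) / dx
    return new

for _ in range(39):                       # advance to t = 1.95
    phi = step(phi)

# Locate the front: first node where phi becomes non-negative.
front = next(i for i in range(N) if phi[i] >= 0.0) * dx
print(round(front, 1))                    # 4.0: nearest node to x = 2.0 + v*t
```

Using the downwind difference instead would make this scheme unstable, which is why the upwind choice the abstract mentions matters.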
Using Multiple FPGA Architectures for Real-time Processing of Low-level Machine Vision Functions
Thomas H. Drayer; William E. King; Philip A. Araman; Joseph G. Tront; Richard W. Conners
1995-01-01
In this paper, we investigate the use of multiple Field Programmable Gate Array (FPGA) architectures for real-time machine vision processing. The use of FPGAs for low-level processing represents an excellent tradeoff between software and special purpose hardware implementations. A library of modules that implement common low-level machine vision operations is presented...
NASA Astrophysics Data System (ADS)
Mioulet, L.; Bideault, G.; Chatelain, C.; Paquet, T.; Brunessaux, S.
2015-01-01
The BLSTM-CTC is a novel recurrent neural network architecture that has outperformed previous state-of-the-art algorithms in tasks such as speech recognition and handwriting recognition. It has the ability to process long-term dependencies in temporal signals in order to label unsegmented data. This paper describes different ways of combining features using a BLSTM-CTC architecture. We explore not only low-level combination (feature-space combination), but also mid-level combination (internal system representation combination) and high-level combination (decoding combination). The results are compared on the RIMES word database. Our results show that the low-level combination works best, thanks to the powerful data modeling of the LSTM neurons.
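The low-level versus high-level distinction can be sketched schematically: low-level combination concatenates feature vectors before a single model, while high-level combination averages the decoded outputs of separate models. The stand-in recognisers below are plain functions with hypothetical word posteriors, not real BLSTM-CTC networks, and the mid-level (internal representation) variant is omitted.

```python
# Schematic sketch of feature-space vs decoding combination, with stand-in
# recognisers in place of trained BLSTM-CTC networks; scores are invented.

def recogniser_a(feats):   # stand-in for a network trained on feature set A
    return {"word": 0.6, "ward": 0.4}

def recogniser_b(feats):   # stand-in for a network trained on feature set B
    return {"word": 0.5, "cord": 0.5}

def low_level(f_a, f_b):
    """Feature-space combination: concatenate before a single network."""
    return f_a + f_b       # one model would then consume this longer vector

def high_level(f_a, f_b):
    """Decoding combination: average the per-word posteriors of two networks."""
    out_a, out_b = recogniser_a(f_a), recogniser_b(f_b)
    words = set(out_a) | set(out_b)
    return {w: (out_a.get(w, 0) + out_b.get(w, 0)) / 2 for w in words}

print(low_level([0.1, 0.2], [0.3]))   # [0.1, 0.2, 0.3]
scores = high_level([0.1, 0.2], [0.3])
print(max(scores, key=scores.get))    # 'word': both recognisers agree on it
```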
TET Explorers: Pushing back the frontiers of Science
NASA Astrophysics Data System (ADS)
Curtis, S. A.; Clark, P. E.; Garvin, J. B.; Rilee, M. L.; Dorband, J. E.; Cheung, C. Y.; Sams, J. E.
2005-12-01
We are in the process of developing Tetrahedral Explorer Technologies (TETs) for the extreme mobility needed to explore remote, rugged terrain. The TET architecture is based on the tetrahedron as a building block, acting singly or interconnected, where apices act as nodes from which struts reversibly deploy. Conformable tetrahedra are the simplest space-filling form, just as triangles are the simplest plane-filling facets. The tetrahedral framework acts as a simple skeletal muscular structure. Reconfigurable architecture is essential in exploration because reaching features of the greatest potential interest requires crossing a wide range of terrains. Thus, areas of interest are relatively inaccessible to permanently appendaged vehicles. For example, morphology and geochemistry of interior basins, walls, and ejecta blankets of impact structures must all be studied to understand the nature of an impact event. The crater floor might be relatively flat and navigable, while typical crater walls are variably sloping and dominated by unconsolidated debris. To be totally functional, structures must form pseudo-appendages varying in size, rate, and manner of deployment (gait). We have already prototyped a simple robotic walker from a single reconfigurable tetrahedron capable of tumbling, and are simulating and building a prototype of the more evolved 12-Tetrahedral Walker (Autonomous Lunar Investigator), which has interior nodes for payload, more continuous motion, and is commandable through a user-friendly interface. Our current applications consist of a more differentiated architecture to form detachable, reconfigurable, reshapable, linearly extendable bodies (Class W or Worm), ranging from arms terminating in opposable digits (Class S or Spider) to act as manual assistant subsystems on rovers, to autonomous pseudo-hominid clamberers (Class M or Mammal), with extensions terminating in a wider range of sensors. 
We are now simulating Class W and Class S gaits and will be building a prototype rover arm. Ultimately, complex continuous n-tetrahedral structures, more advanced versions of Class A, will have deployable outer skin and even higher degrees of freedom. Combined high- and low-level intelligence through an extended neural interface will allow 'shape shifting' for the required function, from surface-conformable lander to amorphous rover to concave surface formation for antenna function. Such architecture will consist of reusable, reconfigurable, mobile, and self-repairing structures capable of acting as a multi-functional infrastructure. TET systems will act as robotic adjuncts to human explorers, enabling access to otherwise inaccessible resources essential to sustaining human presence.
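The tetrahedral building block described in this abstract, apex nodes joined by struts that reversibly extend and retract, can be illustrated with a minimal geometric sketch. This is not the TET flight design; the coordinates, class, and method names are illustrative assumptions:

```python
import itertools
import math

class Tetrahedron:
    """Four apex nodes joined by six struts. Changing strut lengths
    deforms the tetrahedron and shifts its centre of mass (a 'gait')."""

    def __init__(self, nodes):
        self.nodes = list(nodes)  # four (x, y, z) apex positions

    def strut_lengths(self):
        """Current length of each of the six struts, keyed by apex pair."""
        return {
            (i, j): math.dist(self.nodes[i], self.nodes[j])
            for i, j in itertools.combinations(range(4), 2)
        }

    def deploy(self, apex, displacement):
        """Extend the three struts meeting at one apex by moving that node."""
        x, y, z = self.nodes[apex]
        dx, dy, dz = displacement
        self.nodes[apex] = (x + dx, y + dy, z + dz)

# Regular unit tetrahedron; pushing one apex outward begins a tumble step.
tet = Tetrahedron([(0, 0, 0), (1, 0, 0),
                   (0.5, math.sqrt(3) / 2, 0),
                   (0.5, math.sqrt(3) / 6, math.sqrt(2 / 3))])
tet.deploy(3, (0.5, 0, 0))
```

Interconnecting many such tetrahedra at shared apices, as in the 12-tetrahedral walker, would extend this sketch to the multi-node structures the abstract describes.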
Jenkins, Dafyd J; Stekel, Dov J
2010-02-01
Gene regulation is one important mechanism in producing observed phenotypes and heterogeneity. Consequently, the study of gene regulatory network (GRN) architecture, function and evolution now forms a major part of modern biology. However, it is impossible to experimentally observe the evolution of GRNs on the timescales on which living species evolve. In silico evolution provides an approach to studying the long-term evolution of GRNs, but many models have considered either network architectures arising from non-adaptive evolution, or evolution towards non-biological objectives. Here, we address a number of important modelling and biological questions about the evolution of GRNs towards the realistic goal of biomass production. Can different commonly used simulation paradigms, in particular deterministic and stochastic Boolean networks, with and without basal gene expression, be used to compare adaptive with non-adaptive evolution of GRNs? Are these paradigms, together with this goal, sufficient to generate a range of solutions? Will the interaction between a biological goal and evolutionary dynamics produce trade-offs between growth and mutational robustness? We show that stochastic basal gene expression forces shrinkage of genomes due to energetic constraints and is a prerequisite for some solutions. In systems that are able to evolve rates of basal expression, two optima, one with and one without basal expression, are observed. Simulation paradigms without basal expression generate bloated networks with non-functional elements. Further, a range of functional solutions was observed under identical conditions only in stochastic networks. Moreover, there are trade-offs between efficiency and yield, indicating an inherent intertwining of fitness and evolutionary dynamics.
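The stochastic Boolean paradigm with basal expression that this abstract contrasts with deterministic networks can be sketched minimally. This is not the authors' model; the toy three-gene circuit, update rule, and noise level are illustrative assumptions:

```python
import random

def step(state, rules, basal_p=0.05, rng=random):
    """One synchronous update of a stochastic Boolean GRN.

    state: dict gene -> 0/1; rules: dict gene -> function(state) -> 0/1.
    With probability basal_p a gene that would be silent fires anyway,
    modelling stochastic basal (leaky) expression; basal_p=0 recovers
    the deterministic Boolean paradigm."""
    new = {}
    for gene, rule in rules.items():
        value = rule(state)
        if value == 0 and rng.random() < basal_p:
            value = 1  # basal expression event
        new[gene] = value
    return new

# Toy circuit: C activates A, A activates B, B represses C.
rules = {
    "A": lambda s: s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: 1 - s["B"],
}

state = {"A": 1, "B": 0, "C": 0}
for _ in range(5):
    state = step(state, rules)
```

With `basal_p=0` this toy circuit settles into a period-2 oscillation; with basal noise, occasional spurious firings perturb the trajectory, the kind of variability the abstract credits with widening the range of evolved solutions.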
Dias, Raquel; Manny, Austin; Kolaczkowski, Oralia; Kolaczkowski, Bryan
2017-06-01
Reconstruction of ancestral protein sequences using phylogenetic methods is a powerful technique for directly examining the evolution of molecular function. Although ancestral sequence reconstruction (ASR) is itself very efficient, the downstream functional and structural studies necessary to characterize when and how changes in molecular function occurred are often costly and time-consuming, currently limiting ASR studies to examining a relatively small number of discrete functional shifts. As a result, we have very little direct information about how molecular function evolves across large protein families. Here we develop an approach combining ASR with structure and function prediction to efficiently examine the evolution of ligand affinity across a large family of double-stranded RNA binding proteins (DRBs) spanning animals and plants. We find that the characteristic domain architecture of DRBs, consisting of 2-3 tandem double-stranded RNA binding motifs (dsrms), arose independently in early animal and plant lineages. The affinity with which individual dsrms bind double-stranded RNA appears to have increased and decreased often across both animal and plant phylogenies, primarily through convergent structural mechanisms involving RNA-contact residues within the β1-β2 loop and a small region of α2. These studies provide some of the first direct information about how protein function evolves across large gene families and suggest that changes in molecular function may occur often and unassociated with major phylogenetic events, such as gene or domain duplications. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Clinical Decision Support: a 25 Year Retrospective and a 25 Year Vision
Sittig, D. F.; Wright, A.
2016-01-01
Objective: The objective of this review is to summarize the state of the art of clinical decision support (CDS) circa 1990, review progress in the 25-year interval from that time, and provide a vision of what CDS might look like 25 years hence, or circa 2040. Method: Informal review of the medical literature, with iterative review and discussion among the authors to arrive at six axes (data, knowledge, inference, architecture and technology, implementation and integration, and users) to frame the review and discussion of selected barriers and facilitators to the effective use of CDS. Result: In each of the six axes, significant progress has been made. Key advances in structuring and encoding standardized data with an increased availability of data, development of knowledge bases for CDS, improvement of capabilities to share knowledge artifacts, an explosion of methods for analyzing and inferring from clinical data, evolution of information technologies and architectures to facilitate the broad application of CDS, improvement of methods to implement CDS and integrate CDS into the clinical workflow, and increasing sophistication of the end-user have all played a role in improving the effective use of CDS in healthcare delivery. Conclusion: CDS has evolved dramatically over the past 25 years and will likely evolve just as dramatically or more so over the next 25 years. Increasingly, the clinical encounter between a clinician and a patient will be supported by a wide variety of cognitive aids to support diagnosis, treatment, care coordination, surveillance and prevention, and health maintenance or wellness. PMID:27488402
NASA Technical Reports Server (NTRS)
Zelkin, Natalie; Henriksen, Stephen
2010-01-01
This NASA Contractor Report summarizes and documents the work performed to develop concepts of use (ConUse) and high-level system requirements and architecture for the proposed L-band (960 to 1164 MHz) terrestrial en route communications system. This work was completed as a follow-on to the technology assessment conducted by NASA Glenn Research Center and ITT for the Future Communications Study (FCS). ITT assessed air-to-ground (A/G) communications concepts of use and operations presented in relevant NAS-level, international, and NAS-system-level documents to derive the appropriate ConUse relevant to potential A/G communications applications and services for domestic continental airspace. ITT also leveraged prior concepts of use developed during the earlier phases of the FCS. A middle-out functional architecture was adopted by merging the functional system requirements identified in the bottom-up assessment of existing requirements with those derived as a result of the top-down analysis of ConUse and higher level functional requirements. Initial end-to-end system performance requirements were derived to define system capabilities based on the functional requirements and on NAS-SR-1000 and the Operational Performance Assessment conducted as part of the COCR. A high-level notional architecture of the L-DACS supporting A/G communication was derived from the functional architecture and requirements.
Mars Surface Tunnel Element Concept
NASA Technical Reports Server (NTRS)
Rucker, Michelle A.; Mary, Natalie; Howe, A. Scott; Jeffries, Sharon
2016-01-01
How Mars surface crews get into their ascent vehicle has profound implications for Mars surface architecture. To meet planetary protection protocols, the architecture has to get an Intravehicular Activity (IVA)-suited crew into a Mars Ascent Vehicle (MAV) without having to step outside into the Mars environment. Pushing EVA suit don/doff and EVA operations to an element that remains on the surface also helps to minimize MAV cabin volume, which in turn can reduce MAV cabin mass. Because the MAV will require at least seven kilograms of propellant to ascend each kilogram of cabin mass, minimal MAV mass is desired. For architectures involving more than one surface element, such as an ascent vehicle and a pressurized rover or surface habitat, a retractable tunnel is an attractive solution. Beyond addressing the immediate MAV access issue, a reusable tunnel may be useful for other surface applications once its primary mission is complete. A National Aeronautics and Space Administration (NASA) team is studying the optimal balance between surface tunnel functionality, mass, and stowed volume as part of the Evolvable Mars Campaign (EMC). The "Minimum Functional Tunnel" is a conceptual design that performs a single function. Having established this baseline configuration, the next step is to trade design options, evaluate other applications, and explore alternative solutions.
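The mass sensitivity quoted above, at least seven kilograms of ascent propellant per kilogram of cabin mass, can be made concrete with a back-of-the-envelope sketch. The example cabin-mass saving is an illustrative figure, not a number from the study:

```python
# Lower bound quoted in the abstract: propellant mass per unit cabin mass.
PROPELLANT_PER_CABIN_KG = 7.0

def ascent_propellant(cabin_mass_kg, ratio=PROPELLANT_PER_CABIN_KG):
    """Minimum extra propellant needed to ascend a given cabin mass."""
    return ratio * cabin_mass_kg

# If moving EVA suit stowage and don/doff hardware to a surface element
# trimmed, say, 200 kg from the MAV cabin (illustrative figure), the
# propellant saving would be at least:
saved = ascent_propellant(200.0)  # 7.0 * 200.0 = 1400.0 kg
```

This multiplier is why the architecture pushes every deferrable function, including the tunnel itself, onto elements that stay on the surface.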
NASA Technical Reports Server (NTRS)
Simon, Matthew A.; Toups, Larry
2014-01-01
Increased public awareness of carbon footprints, crowding in urban areas, and rising housing costs have spawned a 'small house movement' in the housing industry. Members of this movement desire small yet highly functional residences which are both affordable and sensitive to consumer comfort standards. In order to create comfortable, minimum-volume interiors, recent advances have been made in furniture design and approaches to interior layout that improve space utilization and encourage multi-functional design for small homes, apartments, and naval and recreational vehicles. Design efforts in this evolving niche of terrestrial architecture can provide useful insights leading to innovation and efficiency in the design of space habitats for future human space exploration missions. This paper highlights many of the cross-cutting architectural solutions used in small space design which are applicable to the spacecraft interior design problem. Specific solutions discussed include reconfigurable, multi-purpose spaces; collapsible or transformable furniture; multi-purpose accommodations; efficient, space-saving appliances; stowable and mobile workstations; and the miniaturization of electronics and computing hardware. For each of these design features, descriptions of how they save interior volume or mitigate other small space issues, such as confinement stress or crowding, are discussed. Finally, recommendations are offered to guide future designs and to identify potential collaborations with the small spaces design community.
Evolving the NASA Near Earth Network for the Next Generation of Human Space Flight
NASA Technical Reports Server (NTRS)
Roberts, Christopher J.; Carter, David L.; Hudiburg, John J.; Tye, Robert N.; Celeste, Peter B.
2014-01-01
The purpose of this paper is to present the planned development and evolution of the NASA Near Earth Network (NEN) launch communications services in support of the next generation of human space flight programs. Following the final space shuttle mission in 2011, the two NEN launch communications stations were decommissioned. Today, NASA is developing the next generation of human space flight systems focused on exploration missions beyond low-earth orbit, and supporting the emerging market for commercial crew and cargo human space flight services. The NEN is leading a major initiative to develop a modern high data rate launch communications ground architecture with support from the Kennedy Space Center Ground Systems Development and Operations Program and in partnership with the U.S. Air Force (USAF) Eastern Range. This initiative, the NEN Launch Communications Stations (LCS) development project, successfully completed its System Requirements Review in November 2013. This paper provides an overview of the LCS project and a summary of its progress. The LCS ground architecture, concept of operations, and driving requirements to support the new heavy-lift Space Launch System and Orion Multi-Purpose Crew Vehicle for Exploration Mission-1 are presented. Finally, potential future extensions to the ground architecture beyond EM-1 are discussed.
Experimenting with an Evolving Ground/Space-based Software Architecture to Enable Sensor Webs
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Frye, Stuart
2005-01-01
A series of ongoing experiments are being conducted at the NASA Goddard Space Flight Center to explore integrated ground and space-based software architectures enabling sensor webs. A sensor web, as defined by Steve Talabac at NASA Goddard Space Flight Center (GSFC), is a coherent set of distributed nodes, interconnected by a communications fabric, that collectively behave as a single, dynamically adaptive, observing system. The nodes can be comprised of satellites, ground instruments, computing nodes, etc. Sensor web capability requires autonomous management of constellation resources. This becomes progressively more important as more and more satellites share resources, such as communication channels and ground stations, while automatically coordinating their activities. There have been five ongoing activities, which include an effort to standardize a set of middleware. This paper will describe one set of activities using the Earth Observing 1 satellite, which used a variety of ground and flight software along with other satellites and ground sensors to prototype a sensor web. This activity allowed us to explore the difficulties that occur in the assembly of sensor webs given today's technology. We will present an overview of the software system architecture, some key experiments, and lessons learned to facilitate better sensor webs in the future.
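Talabac's definition quoted in this abstract, distributed nodes on a shared communications fabric behaving as one adaptive observing system, can be sketched minimally. This is not the GSFC software; the node names, kinds, and retasking rule are illustrative assumptions:

```python
class Node:
    """A sensor-web node: a satellite, ground instrument, or computing node."""

    def __init__(self, name, kind):
        self.name, self.kind = name, kind
        self.tasks = []  # observation requests queued over the fabric

class SensorWeb:
    """Coordinates nodes so they behave as a single observing system."""

    def __init__(self, nodes):
        self.nodes = nodes

    def detect_and_retask(self, event_seen_by):
        """When one node reports an event, autonomously task every other
        node to follow it up -- the coordinated, adaptive behaviour the
        experiments prototyped."""
        tasked = []
        for node in self.nodes:
            if node.name != event_seen_by:
                node.tasks.append(f"follow up event from {event_seen_by}")
                tasked.append(node.name)
        return tasked

web = SensorWeb([Node("EO-1", "satellite"),
                 Node("ground-cam", "ground instrument"),
                 Node("sched", "computing node")])
tasked = web.detect_and_retask("ground-cam")  # EO-1 and sched are retasked
```

A real implementation would add the resource arbitration the abstract emphasizes, queuing tasks against shared downlink channels and ground-station passes rather than dispatching them unconditionally.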
Neuenfeldt, Anne; Lorber, Bernard; Ennifar, Eric; Gaudry, Agnès; Sauter, Claude; Sissler, Marie; Florentz, Catherine
2013-02-01
In the mammalian mitochondrial translation apparatus, the proteins and their partner RNAs are coded by two genomes. The proteins are nuclear-encoded and resemble their homologs, whereas the RNAs, coming from the rapidly evolving mitochondrial genome, have lost critical structural information. This raises the question of the molecular adaptation of these proteins to their peculiar partner RNAs. The crystal structure of the homodimeric bacterial-type human mitochondrial aspartyl-tRNA synthetase (DRS) confirmed a 3D architecture close to that of Escherichia coli DRS. However, the mitochondrial enzyme is distinguished by an enlarged catalytic groove, a more electropositive surface potential and an alternate interaction network at the subunit interface. It also exhibited a thermal stability reduced by as much as 12°C. Isothermal titration calorimetry analyses revealed that the affinity of the mitochondrial enzyme for cognate and non-cognate tRNAs is one order of magnitude higher, but with different enthalpy and entropy contributions. They further indicated that both enzymes bind an adenylate analog by a cooperative allosteric mechanism with different thermodynamic contributions. The larger flexibility of the mitochondrial synthetase with respect to the bacterial enzyme, in combination with a preserved architecture, may represent an evolutionary process allowing nuclear-encoded proteins to cooperate with degenerated organelle RNAs.