Sample records for reference model architecture

  1. A Reference Architecture for Space Information Management

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.

    2006-01-01

     We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.

  2. A reference architecture for the component factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Cantone, Giovanni

    1992-01-01

    Software reuse can be achieved through an organization that focuses on utilization of life cycle products from previous developments. The component factory is both an example of the more general concepts of experience and domain factory and an organizational unit worth being considered independently. The critical features of such an organization are flexibility and continuous improvement. In order to achieve these features we can represent the architecture of the factory at different levels of abstraction and define a reference architecture from which specific architectures can be derived by instantiation. A reference architecture is an implementation and organization independent representation of the component factory and its environment. The paper outlines this reference architecture, discusses the instantiation process, and presents some examples of specific architectures by comparing them in the framework of the reference model.

  3. Accelerating Cogent Confabulation: An Exploration in the Architecture Design Space

    DTIC Science & Technology

    2008-06-01

     Report documentation page excerpt (dates covered: 1-8 June 2008; title: Accelerating Cogent Confabulation: An Exploration in the Architecture Design Space; postprint). An architecture for accelerating spiking neural networks is proposed in reference [8]; reference [9] investigates the architecture design of a Brain-state-in-a-box model. Authors include Richard Linderman, Thomas Renz, and Qing Wu.

  4. Theory of electronically phased coherent beam combination without a reference beam

    NASA Astrophysics Data System (ADS)

    Shay, Thomas M.

    2006-12-01

     We present the first theory for two novel coherent beam combination architectures, the first electronic beam combination architectures to completely eliminate the need for a separate reference beam. Detailed theoretical models are developed and presented for the first time.

  5. Analyzing and designing object-oriented missile simulations with concurrency

    NASA Astrophysics Data System (ADS)

    Randorf, Jeffrey Allen

    2000-11-01

     A software object model for the six degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was started, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique (OMT). The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, as well as other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered: a reference structure, a recursive structure, and a reference structure with augmentation for flight vehicle modeling. The reference OCA design option was chosen because it maintains simplicity without compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the intent or structure of the OCA reference design. The implementation language was the 1995 release of Ada, and it was shown how OCA software components can be expressed as Ada child packages. While acceleration of several low-level and higher-level operations is possible on appropriate hardware, there was a 33% degradation in the performance of a 4th-order Runge-Kutta integrator solving two simultaneous ordinary differential equations when Ada tasking was used on a single-processor machine. The Defense Department's High Level Architecture (HLA) was introduced and explained in the context of the OCA. It was shown that the HLA and OCA are not mutually exclusive but complementary architectures: HLA is an interoperability solution, with the OCA as an architectural vehicle for software reuse. Further directions for implementing a 6-DOF missile modeling environment are discussed.

  6. Theory of electronic phase locking of an optical array without a reference beam

    NASA Astrophysics Data System (ADS)

    Shay, Thomas M.

    2006-08-01

     We present the first theory for two novel coherent beam combination architectures, the first electronic beam combination architectures to completely eliminate the need for a separate reference beam. Detailed theoretical models are developed and presented for the first time.

  7. Predictor-Based Model Reference Adaptive Control

    NASA Technical Reports Server (NTRS)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2009-01-01

     This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented; numerical evidence indicates that the proposed PMRAC tracking architecture has better transient characteristics than MRAC. We present a state-predictor-based direct adaptive tracking design methodology for multi-input dynamical systems with partially known dynamics. The efficiency of the design is demonstrated using the short-period dynamics of an aircraft. A formal proof of the reported PMRAC benefits constitutes future research and will be reported elsewhere.

  8. A Nonlinear Dynamic Inversion Predictor-Based Model Reference Adaptive Controller for a Generic Transport Model

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.

    2010-01-01

    Presented here is a Predictor-Based Model Reference Adaptive Control (PMRAC) architecture for a generic transport aircraft. At its core, this architecture features a three-axis, non-linear, dynamic-inversion controller. Command inputs for this baseline controller are provided by pilot roll-rate, pitch-rate, and sideslip commands. This paper will first thoroughly present the baseline controller followed by a description of the PMRAC adaptive augmentation to this control system. Results are presented via a full-scale, nonlinear simulation of NASA s Generic Transport Model (GTM).

  9. A reference architecture for integrated EHR in Colombia.

    PubMed

    de la Cruz, Edgar; Lopez, Diego M; Uribe, Gustavo; Gonzalez, Carolina; Blobel, Bernd

    2011-01-01

     The implementation of national EHR infrastructures has to start with a detailed definition of the overall structure and behavior of the EHR system (the system architecture). Architectures have to be open, scalable, flexible, user-accepted and user-friendly, trustworthy, and based on standards, including terminologies and ontologies. The GCM provides an architectural framework created for analyzing any kind of system, including EHR system architectures. The objective of this paper is to propose a reference architecture for the implementation of an integrated EHR in Colombia, based on the current state of systems architectural models and EHR standards. The proposed EHR architecture defines a set of services (elements) and their interfaces to support the exchange of clinical documents, offering an open, scalable, flexible and semantically interoperable infrastructure. The architecture was tested in a pilot tele-consultation project in Colombia, where dental EHRs are exchanged.

  10. A set-theoretic model reference adaptive control architecture for disturbance rejection and uncertainty suppression with strict performance guarantees

    NASA Astrophysics Data System (ADS)

    Arabi, Ehsan; Gruenwald, Benjamin C.; Yucelen, Tansel; Nguyen, Nhan T.

    2018-05-01

     Research in adaptive control algorithms for safety-critical applications is primarily motivated by the fact that these algorithms can suppress the effects of adverse conditions resulting from exogenous disturbances, imperfect dynamical system modelling, degraded modes of operation, and changes in system dynamics. Although government and industry agree on the potential of these algorithms for providing safety and reducing vehicle development costs, a major issue is the inability to achieve a-priori, user-defined performance guarantees with adaptive control algorithms. In this paper, a new model reference adaptive control architecture for uncertain dynamical systems is presented to address disturbance rejection and uncertainty suppression. The proposed framework is predicated on a set-theoretic adaptive controller construction using generalised restricted potential functions. The key feature of this framework is that it allows the system error between the state of an uncertain dynamical system and the state of a reference model, which captures a desired closed-loop system performance, to be kept less than an a-priori, user-defined worst-case performance bound; hence, it can enforce strict performance guarantees. Examples are provided to demonstrate the efficacy of the proposed set-theoretic model reference adaptive control architecture.
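The "restricted potential function" idea in this record can be illustrated with a barrier-style penalty: it stays small while the error norm is well inside the user-defined bound and grows without bound as the error approaches it, so adaptation effort rises sharply near the boundary. The quadratic-over-gap form below is an assumption for illustration only; the paper's exact generalised form may differ.

```python
# Sketch of a barrier ("restricted potential") function of the kind used
# in set-theoretic MRAC. The functional form here is an assumption, not
# the paper's definition: phi(e) = e^2 / (eps - e) for 0 <= e < eps.

def restricted_potential(e_norm: float, eps: float) -> float:
    """Penalty on the error norm e_norm relative to the bound eps."""
    if not 0.0 <= e_norm < eps:
        raise ValueError("error norm must lie strictly inside the bound")
    return e_norm ** 2 / (eps - e_norm)

# Far from the bound the penalty is mild...
inside = restricted_potential(0.1, 1.0)
# ...but near the bound it blows up, which is what lets the adaptive law
# keep the error strictly inside the a-priori bound ||e|| < eps.
near = restricted_potential(0.99, 1.0)
```

Driving the adaptive gain by the gradient of such a potential is the mechanism (under this sketch's assumptions) that converts a soft penalty into a strict, user-defined performance bound.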

  11. NASA/NBS (National Aeronautics and Space Administration/National Bureau of Standards) standard reference model for telerobot control system architecture (NASREM)

    NASA Technical Reports Server (NTRS)

    Albus, James S.; Mccain, Harry G.; Lumia, Ronald

    1989-01-01

     This document describes the NASA Standard Reference Model (NASREM) architecture for the Space Station Telerobot Control System. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station, serving as a functional-specification document and as a guideline for the development of the control system architecture of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.

  12. Communication architecture for AAL. Supporting patient care by health care providers in AAL-enhanced living quarters.

    PubMed

    Nitzsche, T; Thiele, S; Häber, A; Winter, A

    2014-01-01

     This article is part of the Focus Theme of Methods of Information in Medicine on "Using Data from Ambient Assisted Living and Smart Homes in Electronic Health Records". Concepts of Ambient Assisted Living (AAL) support long-term health monitoring and further medical and other services for multi-morbid patients with chronic diseases. In Germany many AAL and telemedical applications exist, but synergy effects from common agreements on essential application components and standards have not been achieved. It is therefore necessary to define a communication architecture based on common definitions of communication scenarios, application components and communication standards. Developing such a communication architecture requires several steps. To obtain a reference model for the problem area, different AAL and telemedicine projects were compared and the relevant data elements were generalized. The derived reference model defines standardized communication links. As a result, the authors present an approach towards a reference architecture for AAL communication. The focus of the architecture lies on the communication layer. The necessary application components are identified, and a communication based on standards and their extensions is highlighted. The exchange of patient-individual events, supported by an event classification model, and of raw and aggregated data from the personal home area via a telemedicine center to health care providers is thus possible.

  13. Predictor-Based Model Reference Adaptive Control

    NASA Technical Reports Server (NTRS)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2010-01-01

     This paper is devoted to the design and analysis of a predictor-based model reference adaptive controller. Stable adaptive laws are derived using a Lyapunov framework. The proposed architecture is compared with the now-classical model reference adaptive control. A simulation example is presented in which numerical evidence indicates that the proposed controller yields improved transient characteristics.

  14. Space Generic Open Avionics Architecture (SGOAA) reference model technical guide

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1993-01-01

    This report presents a full description of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing architecture, and a six class model of interfaces in a hardware/software system. The purpose of the SGOAA is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of specific avionics hardware/software systems. The SGOAA defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  15. An Adaptive Critic Approach to Reference Model Adaptation

    NASA Technical Reports Server (NTRS)

    Krishnakumar, K.; Limes, G.; Gundy-Burlet, K.; Bryant, D.

    2003-01-01

     Neural networks have been successfully used for implementing control architectures for different applications. In this work, we examine a neural-network-augmented adaptive critic as a Level 2 intelligent controller for a C-17 aircraft. This intelligent control architecture utilizes an adaptive critic to tune the parameters of a reference model, which is then used to define the angular rate command for a Level 1 intelligent controller. The present architecture is implemented on a high-fidelity non-linear model of a C-17 aircraft. The goal of this research is to improve the performance of the C-17 under degraded conditions such as control failures and battle damage. Pilot ratings using a motion-based simulation facility are included in this paper. The benefits of using an adaptive critic are documented using time response comparisons for severe damage situations.

  16. Performance Analysis of GFDL's GCM Line-By-Line Radiative Transfer Model on GPU and MIC Architectures

    NASA Astrophysics Data System (ADS)

    Menzel, R.; Paynter, D.; Jones, A. L.

    2017-12-01

    Due to their relatively low computational cost, radiative transfer models in global climate models (GCMs) run on traditional CPU architectures generally consist of shortwave and longwave parameterizations over a small number of wavelength bands. With the rise of newer GPU and MIC architectures, however, the performance of high resolution line-by-line radiative transfer models may soon approach those of the physical parameterizations currently employed in GCMs. Here we present an analysis of the current performance of a new line-by-line radiative transfer model currently under development at GFDL. Although originally designed to specifically exploit GPU architectures through the use of CUDA, the radiative transfer model has recently been extended to include OpenMP in an effort to also effectively target MIC architectures such as Intel's Xeon Phi. Using input data provided by the upcoming Radiative Forcing Model Intercomparison Project (RFMIP, as part of CMIP 6), we compare model results and performance data for various model configurations and spectral resolutions run on both GPU and Intel Knights Landing architectures to analogous runs of the standard Oxford Reference Forward Model on traditional CPUs.

  17. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IVV) Program, with Software Assurance Research Program support, extracted FM architectures across the IVV portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IVV projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management.

  18. Data Modeling Challenges of Advanced Interoperability.

    PubMed

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

     Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This creates special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  19. Strategic Mobility 21. Service Oriented Architecture (SOA) Reference Model - Global Transportation Management System Architecture

    DTIC Science & Technology

    2009-10-07

     Excerpts (Section A, Business Environment, Introduction): The Strategic Mobility 21 (SM21) program is currently in the process of developing the Joint ... Platform (BPP), which enables the ability to rapidly compose new business processes and expand the core TMS feature-set to adapt to the challenges ... Reference: Strategic Mobility 21 Contract N00014-06-C-0060. "Dear Paul, In accordance with the requirements of referenced contract, we are pleased to ..."

  20. An International Strategy for Human Exploration of the Moon: The International Space Exploration Coordination Group (ISECG) Reference Architecture for Human Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Laurini, Kathleen C.; Hufenbach, Bernhard; Junichiro, Kawaguchi; Piedboeuf, Jean-Claude; Schade, Britta; Lorenzoni, Andrea; Curtis, Jeremy; Hae-Dong, Kim

    2010-01-01

     The International Space Exploration Coordination Group (ISECG) was established in response to The Global Exploration Strategy: The Framework for Coordination, developed by fourteen space agencies and released in May 2007. Several ISECG participating space agencies have been studying concepts for human exploration of the moon that allow individual and collective goals and objectives to be met. This 18-month study activity culminated in the development of the ISECG Reference Architecture for Human Lunar Exploration. The reference architecture is a series of elements delivered over time in a flexible and evolvable campaign. This paper describes the reference architecture and how it will inform near-term and long-term programmatic planning within interested agencies. The reference architecture is intended to serve as a global point-of-departure conceptual architecture that enables individual agency investments in technology development and demonstration, International Space Station research and technology demonstration, terrestrial analog studies, and robotic precursor missions to contribute towards the eventual implementation of a human lunar exploration scenario reflecting the concepts and priorities established to date. It also serves to create opportunities for partnerships that will support the evolution of this concept and its eventual realization. The ISECG Reference Architecture for Human Lunar Exploration (commonly referred to as the lunar GPoD) reflects the agencies' commitment to finding an effective balance between conducting important scientific investigations of and from the moon and demonstrating and mastering the technologies and capabilities needed to send humans farther into the Solar System. The lunar GPoD begins with a robust robotic precursor phase that demonstrates technologies and capabilities considered important for the success of the campaign. Robotic missions will inform the human missions and buy down risks. Human exploration will start with a thorough scientific investigation of the polar region while providing the opportunity to demonstrate and validate the systems needed to take humans on more ambitious lunar exploration excursions. The ISECG Reference Architecture for Human Lunar Exploration serves as a model for future cooperation and is documented in a summary report and a comprehensive document that also describes the collaborative international process that led to its development. ISECG plans to continue with architecture studies such as this to examine an open transportation architecture and other destinations, with expanded participation from ISECG agencies, as it works to inform international partnerships and advance the Global Exploration Strategy.

  1. Information Interaction: Providing a Framework for Information Architecture.

    ERIC Educational Resources Information Center

    Toms, Elaine G.

    2002-01-01

    Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)

  2. UML Profiles for Design Decisions and Non-Functional Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Gorton, Ian

    2007-06-30

     A software architecture is composed of a collection of design decisions. Each design decision helps or hinders certain Non-Functional Requirements (NFR). Current software architecture views focus on expressing components and connectors in the system. Design decisions and their relationships with non-functional requirements are often captured in separate design documentation, not explicitly expressed in any views. This disassociation makes architecture comprehension and architecture evolution harder. In this paper, we propose a UML profile for modeling design decisions and an associated UML profile for modeling non-functional requirements in a generic way. The two UML profiles treat design decisions and non-functional requirements as first-class elements. Modeled design decisions always refer to existing architectural elements and thus maintain traceability between the two. We provide a mechanism for checking consistency over this traceability. An exemplar is given.

  3. A Neurobehavioral Model of Flexible Spatial Language Behaviors

    PubMed Central

    Lipinski, John; Schneegans, Sebastian; Sandamirskaya, Yulia; Spencer, John P.; Schöner, Gregor

    2012-01-01

     We propose a neural dynamic model that specifies how low-level visual processes can be integrated with higher level cognition to achieve flexible spatial language behaviors. This model uses real-world visual input that is linked to relational spatial descriptions through a neural mechanism for reference frame transformations. We demonstrate that the system can extract spatial relations from visual scenes, select items based on relational spatial descriptions, and perform reference object selection in a single unified architecture. We further show that the performance of the system is consistent with behavioral data in humans by simulating results from 2 independent empirical studies: 1 spatial term rating task and 1 study of reference object selection behavior. The architecture we present thereby achieves a high degree of task flexibility under realistic stimulus conditions. At the same time, it also provides a detailed neural grounding for complex behavioral and cognitive processes. PMID:21517224

  4. Comparing a Japanese and a German hospital information system.

    PubMed

    Jahn, F; Issler, L; Winter, A; Takabayashi, K

    2009-01-01

    To examine the architectural differences and similarities of a Japanese and German hospital information system (HIS) in a case study. This cross-cultural comparison, which focuses on structural quality characteristics, offers the chance to get new insights into different HIS architectures, which possibly cannot be obtained by inner-country comparisons. A reference model for the domain layer of hospital information systems containing the typical enterprise functions of a hospital provides the basis of comparison for the two different hospital information systems. 3LGM(2) models, which describe the two HISs and which are based on that reference model, are used to assess several structural quality criteria. Four of these criteria are introduced in detail. The two examined HISs are different in terms of the four structural quality criteria examined. Whereas the centralized architecture of the hospital information system at Chiba University Hospital causes only few functional redundancies and leads to a low implementation of communication standards, the hospital information system at the University Hospital of Leipzig, having a decentralized architecture, exhibits more functional redundancies and a higher use of communication standards. Using a model-based comparison, it was possible to detect remarkable differences between the observed hospital information systems of completely different cultural areas. However, the usability of 3LGM(2) models for comparisons has to be improved in order to apply key figures and to assess or benchmark the structural quality of health information systems architectures more thoroughly.

  5. Evaluating the Effectiveness of Reference Models in Federating Enterprise Architectures

    ERIC Educational Resources Information Center

    Wilson, Jeffery A.

    2012-01-01

    Agencies need to collaborate with each other to perform missions, improve mission performance, and find efficiencies. The ability of individual government agencies to collaborate with each other for mission and business success and efficiency is complicated by the different techniques used to describe their Enterprise Architectures (EAs).…

  6. Requirements for data integration platforms in biomedical research networks: a reference model.

    PubMed

    Ganzinger, Matthias; Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.

  7. The NIST Real-Time Control System (RCS): A Reference Model Architecture for Computational Intelligence

    NASA Technical Reports Server (NTRS)

    Albus, James S.

    1996-01-01

    The Real-time Control System (RCS) developed at NIST and elsewhere over the past two decades defines a reference model architecture for design and analysis of complex intelligent control systems. The RCS architecture consists of a hierarchically layered set of functional processing modules connected by a network of communication pathways. The primary distinguishing feature of the layers is the bandwidth of the control loops. The characteristic bandwidth of each level is determined by the spatial and temporal integration window of filters, the temporal frequency of signals and events, the spatial frequency of patterns, and the planning horizon and granularity of the planners that operate at each level. At each level, tasks are decomposed into sequential subtasks, to be performed by cooperating sets of subordinate agents. At each level, signals from sensors are filtered and correlated with spatial and temporal features that are relevant to the control function being implemented at that level.
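The defining property of the RCS hierarchy described above, that each level closes its control loop at a lower bandwidth and plans over a longer horizon than the level below it, can be sketched as a toy data structure. The level names and numeric rates below are invented for illustration and are not taken from the NIST RCS specification.

```python
# Toy sketch of the RCS idea that each layer of a hierarchical control
# system has a lower loop bandwidth and a longer planning horizon than
# the layer below. Names and numbers are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class RcsLevel:
    name: str
    loop_hz: float    # characteristic control-loop bandwidth
    horizon_s: float  # planning horizon at this level

hierarchy = [
    RcsLevel("servo",     loop_hz=1000.0, horizon_s=0.01),
    RcsLevel("primitive", loop_hz=100.0,  horizon_s=0.1),
    RcsLevel("subsystem", loop_hz=10.0,   horizon_s=1.0),
    RcsLevel("task",      loop_hz=1.0,    horizon_s=10.0),
]

# Sanity check the defining property: bandwidth falls and the planning
# horizon grows as we move up the hierarchy.
for lower, upper in zip(hierarchy, hierarchy[1:]):
    assert upper.loop_hz < lower.loop_hz
    assert upper.horizon_s > lower.horizon_s
```

In the actual RCS, each level also decomposes tasks into sequential subtasks for subordinate agents and filters sensor signals at that level's spatial and temporal resolution; the sketch captures only the bandwidth/horizon layering.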

  8. NASA Enterprise Architecture and Its Use in Transition of Research Results to Operations

    NASA Astrophysics Data System (ADS)

    Frisbie, T. E.; Hall, C. M.

    2006-12-01

    Enterprise architecture describes the design of the components of an enterprise, their relationships and how they support the objectives of that enterprise. NASA Stennis Space Center leads several projects involving enterprise architecture tools used to gather information on research assets within NASA's Earth Science Division. In the near future, enterprise architecture tools will link and display the relevant requirements, parameters, observatories, models, decision systems, and benefit/impact information relationships and map to the Federal Enterprise Architecture Reference Models. Components configured within the enterprise architecture serving the NASA Applied Sciences Program include the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool. The Earth Science Components Knowledge Base systematically catalogues NASA missions, sensors, models, data products, model products, and network partners appropriate for consideration in NASA Earth Science applications projects. The Systems Components database is a centralized information warehouse of NASA's Earth Science research assets and a critical first link in the implementation of enterprise architecture. The Earth Science Architecture Tool is used to analyze potential NASA candidate systems that may be beneficial to decision-making capabilities of other Federal agencies. Use of the current configuration of NASA enterprise architecture (the Earth Science Components Knowledge Base, the Systems Components database, and the Earth Science Architecture Tool) has far exceeded its original intent and has tremendous potential for the transition of research results to operational entities.

  9. Panel C report: Standards needed for the use of ISO Open Systems Interconnection - basic reference model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The use of an International Standards Organization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for ADS network planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.

  10. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential, even defining, part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model that allows the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  11. Unified web-based network management based on distributed object orientated software agents

    NASA Astrophysics Data System (ADS)

    Djalalian, Amir; Mukhtar, Rami; Zukerman, Moshe

    2002-09-01

    This paper presents an architecture that provides a unified web interface to managed network devices that support CORBA, OSI or Internet-based network management protocols. A client gains access to managed devices through a web browser, which is used to issue management operations and receive event notifications. The proposed architecture is compatible with both the OSI Management Reference Model and CORBA. The steps required for designing the building blocks of such an architecture are identified.

  12. Architectural and Functional Design and Evaluation of E-Learning VUIS Based on the Proposed IEEE LTSA Reference Model.

    ERIC Educational Resources Information Center

    O'Droma, Mairtin S.; Ganchev, Ivan; McDonnell, Fergal

    2003-01-01

    Presents a comparative analysis, based on the Institute of Electrical and Electronics Engineers (IEEE) Learning Technology Standards Committee's (LTSC) proposed Learning Technology Systems Architecture (LTSA) reference model, of the architectural and functional design of e-learning delivery platforms and applications, e-learning course authoring tools, and learning management systems (LMSs), with a view to assessing how…

  13. Requirements for data integration platforms in biomedical research networks: a reference model

    PubMed Central

    Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are related to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource-efficient acquisition of their project-specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper. PMID:25699205

  14. Space station group activities habitability module study

    NASA Technical Reports Server (NTRS)

    Nixon, David

    1986-01-01

    This study explores and analyzes architectural design approaches for the interior of the Space Station Habitability Module (originally defined as Habitability Module 1 in Space Station Reference Configuration Description, JSC-19989, August 1984). In the Research Phase, architectural program and habitability design guidelines are specified. In the Schematic Design Phase, a range of alternative concepts is described and illustrated with drawings, scale-model photographs and design analysis evaluations. Recommendations are presented on the internal architectural configuration of the Space Station Habitability Module for such functions as the wardroom, galley, exercise facility, library and station control work station. The models show full design configurations for on-orbit performance.

  15. The flight telerobotic servicer: From functional architecture to computer architecture

    NASA Technical Reports Server (NTRS)

    Lumia, Ronald; Fiala, John

    1989-01-01

    After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.

  16. A knowledge-based system for patient image pre-fetching in heterogeneous database environments--modeling, design, and evaluation.

    PubMed

    Wei, C P; Hu, P J; Sheng, O R

    2001-03-01

    When performing a primary reading of a newly taken radiological examination, a radiologist often needs to reference relevant prior images of the same patient for confirmation or comparison purposes. Support of such image references is of clinical importance and may have significant effects on radiologists' examination reading efficiency, service quality, and work satisfaction. To effectively support such image reference needs, we proposed and developed a knowledge-based patient image pre-fetching system, addressing several challenging requirements of the application, including representation and learning of image reference heuristics and management of data-intensive knowledge inferencing. Moreover, the system demands an extensible and maintainable architecture design capable of effectively adapting to a dynamic environment characterized by heterogeneous and autonomous data source systems. In this paper, we developed a synthesized object-oriented entity-relationship model, a conceptual model appropriate for representing radiologists' prior image reference knowledge, which is heuristic oriented and data intensive. We detailed the system architecture and design of the knowledge-based patient image pre-fetching system. Our architecture design is based on a client-mediator-server framework, capable of coping with a dynamic environment characterized by distributed, heterogeneous, and highly autonomous data source systems. To adapt to changes in radiologists' prior image reference heuristics, ID3-based multi-decision-tree induction and CN2-based multi-decision-rule induction learning techniques were developed and evaluated. Experimentally, we examined the effects of the pre-fetching system on radiologists' examination readings. Preliminary results show that the knowledge-based patient image pre-fetching system more accurately supports radiologists' prior image reference needs than the current practice adopted at the study site, and that radiologists may become more efficient, consultatively effective, and better satisfied when supported by the pre-fetching system than when relying on the study site's pre-fetching practice.
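The ID3-style learning mentioned above rests on choosing splits by information gain. A minimal stdlib sketch follows; the training cases, attribute meanings, and labels are invented for illustration and are not the study's data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """ID3 split criterion: entropy reduction from splitting on one attribute."""
    n = len(rows)
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(lab)
    remainder = sum(len(ls) / n * entropy(ls) for ls in by_value.values())
    return entropy(labels) - remainder

# Hypothetical cases: (modality of new exam, body-part match) -> prefetch priors?
rows = [("CT", "same"), ("CT", "same"), ("MR", "other"), ("MR", "same"), ("CR", "other")]
labels = ["yes", "yes", "no", "yes", "no"]

gains = [information_gain(rows, labels, i) for i in range(2)]
best_attr = max(range(2), key=lambda i: gains[i])  # attribute ID3 would split on first
```

In this toy data the body-part attribute separates the classes perfectly, so its gain equals the full label entropy and ID3 would split on it first.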

  17. Biological basis for space-variant sensor design I: parameters of monkey and human spatial vision

    NASA Astrophysics Data System (ADS)

    Rojer, Alan S.; Schwartz, Eric L.

    1991-02-01

    Biological sensor design has long provided inspiration for sensor design in machine vision. However, relatively little attention has been paid to the actual design parameters provided by biological systems, as opposed to the general nature of biological vision architectures. In the present paper we provide a review of current knowledge of primate spatial vision design parameters and present recent experimental and modeling work from our lab which demonstrates that a numerical conformal mapping, a refinement of our previous complex logarithmic model, provides the best current summary of this feature of the primate visual system. In particular, we review experimental and modeling studies which indicate that: (1) the global spatial architecture of primate visual cortex is well summarized by a numerical conformal mapping whose simplest analytic approximation is the complex logarithm function; and (2) the columnar sub-structure of primate visual cortex can be well summarized by a model based on band-pass filtered white noise. We also refer to ongoing work in our lab which demonstrates that the joint columnar/map structure of primate visual cortex can be modeled and summarized in terms of a new algorithm, the 'proto-column' algorithm. This work provides a reference-point for current engineering approaches to novel architectures for

  18. Digital Invasions: From Point Clouds to Historical Building Object Modeling (H-BOM) of a UNESCO WHL Site

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Lo Turco, M.; Santagati, C.

    2017-02-01

    The paper here presented shows the outcomes of a research/didactic activity carried out within a workshop titled "Digital Invasions. From point cloud to Heritage Building Information Modeling" held at Politecnico di Torino (29th September-5th October 2016). The term digital invasions refers to an Italian bottom-up project born in 2013 with the aim of promoting innovative digital ways for the enhancement of Cultural Heritage through the co-creation of cultural contents and their sharing through social media platforms. In this regard, we worked with students of the Architectural Master of Science degree, training them with a multidisciplinary teaching team (Architectural Representation, History of Architecture, Restoration, Digital Communication and Geomatics). The aim was also to test whether our students could be involved in a sort of niche crowdsourcing for the creation of a library of H-BOMs (Historical Building Object Models) of architectural elements.

  19. Theatre and Cinema Architecture: A Guide to Information Sources.

    ERIC Educational Resources Information Center

    Stoddard, Richard

    This annotated bibliography cites works related to theatres, movie houses, opera houses, and dance facilities. It is divided into three parts: general references, theatre architecture, and cinema architecture. The part on general references includes bibliographies and periodicals. The second and main part of the guide, on theatre architecture,…

  20. Suggestions for Documenting SOA-Based Systems

    DTIC Science & Technology

    2010-09-01

    Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and... understandability ... even across an enterprise. Technical reference models (e.g., Oracle database management ...) are general in nature, and they typically ... architectural pattern. CMU/SEI-2010-... Key aspects of the architecture ... communicate something that is important to the stakeholders ... maintaining the system

  1. Adaptive Control with Reference Model Modification

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmanje

    2012-01-01

    This paper presents a modification of the conventional model reference adaptive control (MRAC) architecture in order to improve transient performance of the input and output signals of uncertain systems. A simple modification of the reference model is proposed by feeding back the tracking error signal. It is shown that the proposed approach guarantees tracking of the given reference command and the reference control signal (the one that would be designed if the system were known) not only asymptotically but also in transient. Moreover, it prevents generation of high frequency oscillations, which are unavoidable in conventional MRAC systems for large adaptation rates. The provided design guideline makes it possible to track reference commands of any magnitude from any initial position without re-tuning. The benefits of the method are demonstrated with a simulation example.
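The core idea, feeding the tracking error back into the reference model, can be shown in a scalar simulation. All gains, plant values, and the error-feedback term below are illustrative choices for the sketch, not the paper's design.

```python
# Scalar sketch of a modified reference model: the tracking error e = x - x_m
# is fed back into the reference model through k_e (conventional MRAC omits it).
a_m, b_m = -2.0, 2.0   # stable reference dynamics
k_e = 10.0             # error feedback gain (illustrative assumption)
dt, r = 0.001, 1.0     # Euler step and constant reference command

x, x_m = 0.5, 0.0      # plant starts away from the reference state
a, b = -1.0, 1.0       # "uncertain" plant, shown here with known values
theta = 0.0            # adaptive feedforward gain
gamma = 50.0           # adaptation rate

for _ in range(5000):
    e = x - x_m
    u = theta * r                          # adaptive control law
    x += dt * (a * x + b * u)              # plant
    x_m += dt * (a_m * x_m + b_m * r + k_e * e)  # modified reference model
    theta += dt * (-gamma * e * r)         # gradient-type adaptation law
```

Because the reference model chases the plant through the `k_e * e` term, the error seen by the adaptation law stays small during the transient; after the run, `x`, `x_m`, and `theta` all settle near their ideal values (1, 1, and 1 for these constants).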

  2. OWL-based reasoning methods for validating archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role for the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been an increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in both repositories. This result reinforces the need for making serious efforts in improving archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.
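One of the error classes named above, an incorrect (widening) restriction over the reference model or parent archetype, reduces in its simplest form to an interval check on occurrence constraints. A minimal sketch with invented bounds, standing in for what a full OWL reasoner would derive:

```python
def conformant(parent_occ, child_occ):
    """A specialization must narrow, never widen, an occurrence constraint
    inherited from the reference model or parent archetype.
    Intervals are (lower, upper) pairs; upper may be float('inf')."""
    p_lo, p_hi = parent_occ
    c_lo, c_hi = child_occ
    return p_lo <= c_lo and c_hi <= p_hi

# Reference model allows 0..unbounded events; a parent archetype narrows to 0..5.
assert conformant((0, float("inf")), (0, 5))
# A specialization widening 0..5 to 0..10 is the kind of error a validator flags.
assert not conformant((0, 5), (0, 10))
```

The OWL-based method in the paper generalizes this to arbitrary class restrictions by checking subsumption with a description-logic reasoner rather than by hand-written interval logic.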

  3. AmapSim: a structural whole-plant simulator based on botanical knowledge and designed to host external functional models.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-05-01

    AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment.
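The bud-based growth concept can be caricatured in a few lines. The sketch below replaces the paper's semi-Markovian occupation law with geometric sojourn times and uses invented parameter values, so it illustrates only the oriented automaton over physiological ages, not AmapSim itself.

```python
import random

random.seed(7)

# Illustrative reference axis: ordered physiological ages with a branching
# probability per state and a geometric stand-in for the sojourn distribution.
reference_axis = [
    {"age": 1, "branch_p": 0.4, "stay_p": 0.7},
    {"age": 2, "branch_p": 0.2, "stay_p": 0.8},
    {"age": 3, "branch_p": 0.0, "stay_p": 1.0},  # terminal state
]

def grow(steps=40):
    """Grow one axis: each step the apical bud adds a segment, may branch,
    and may advance one state along the reference axis (ages only increase)."""
    state, segments, branches = 0, 0, 0
    for _ in range(steps):
        p = reference_axis[state]
        segments += 1
        if random.random() < p["branch_p"]:
            branches += 1
        if state < len(reference_axis) - 1 and random.random() > p["stay_p"]:
            state += 1  # oriented finite-state automaton: no return to younger ages
    return segments, branches, reference_axis[state]["age"]

segments, branches, final_age = grow()
```

The monotone state index is what makes the automaton "oriented": physiological age can only advance along the reference axis, mimicking architectural metamorphosis.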

  4. AmapSim: A Structural Whole-plant Simulator Based on Botanical Knowledge and Designed to Host External Functional Models

    PubMed Central

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-01-01

    Background and Aims AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. Methods The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Key Results Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. Conclusions The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment. PMID:17766310

  5. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    NASA Astrophysics Data System (ADS)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.
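The decomposition claim can be illustrated numerically. The sketch below factors a homography into a translation, a linear (affine-without-translation) part, and a pure projective part; this is a simplified three-factor variant of the four-submodel decomposition proved in the paper, and the example matrix is invented.

```python
def matmul(A, B):
    """3x3 matrix product, kept dependency-free for the sketch."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# An example homography, normalized so H[2][2] == 1.
H = [[1.2, 0.1, 5.0],
     [-0.05, 0.9, -3.0],
     [0.001, 0.002, 1.0]]

g, h = H[2][0], H[2][1]   # perspective terms
c, f = H[0][2], H[1][2]   # translation terms
# Pure projective factor carrying only the perspective terms.
P = [[1, 0, 0], [0, 1, 0], [g, h, 1]]
# Linear factor: the affine part with the translation removed.
L = [[H[0][0] - c * g, H[0][1] - c * h, 0],
     [H[1][0] - f * g, H[1][1] - f * h, 0],
     [0, 0, 1]]
# Translation factor.
T = [[1, 0, c], [0, 1, f], [0, 0, 1]]

H_rebuilt = matmul(T, matmul(L, P))  # recovers H exactly
```

Splitting the model this way is what makes the fixed-point hardware tractable: each factor has far fewer coefficients, each with a narrower dynamic range than the raw homography entries.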

  6. Business process architectures: overview, comparison and framework

    NASA Astrophysics Data System (ADS)

    Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.

    2016-02-01

    With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.

  7. Design of a Model Reference Adaptive Controller for an Unmanned Air Vehicle

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Matsutani, Megumi; Annaswamy, Anuradha M.

    2010-01-01

    This paper presents the "Adaptive Control Technology for Safe Flight (ACTS)" architecture, which consists of a non-adaptive controller that provides satisfactory performance under nominal flying conditions, and an adaptive controller that provides robustness under off-nominal ones. The design and implementation procedures of both controllers are presented. The aim of these procedures, which encompass both theoretical and practical considerations, is to develop a controller suitable for flight. The ACTS architecture is applied to the Generic Transport Model (GTM) developed by NASA Langley Research Center. The GTM is a dynamically scaled test model of a transport aircraft for which a flight-test article and a high-fidelity simulation are available. The nominal controller at the core of the ACTS architecture has a multivariable LQR-PI structure while the adaptive one has a direct, model reference structure. The main control surfaces as well as the throttles are used as control inputs. The inclusion of the latter alleviates the pilot's workload by eliminating the need for cancelling the pitch coupling generated by changes in thrust. Furthermore, the independent usage of the throttles by the adaptive controller enables their use for attitude control. Advantages and potential drawbacks of adaptation are demonstrated by performing high fidelity simulations of a flight-validated controller and of its adaptive augmentation.

  8. Comparing root architectural models

    NASA Astrophysics Data System (ADS)

    Schnepf, Andrea; Javaux, Mathieu; Vanderborght, Jan

    2017-04-01

    Plant roots play an important role in several soil processes (Gregory 2006). Root architecture development determines the sites in soil where roots provide input of carbon and energy and take up water and solutes. However, root architecture is difficult to determine experimentally when grown in opaque soil. Thus, root architectural models have been widely used and further developed into functional-structural models that are able to simulate the fate of water and solutes in the soil-root system (Dunbabin et al. 2013). Still, a systematic comparison of the different root architectural models is missing. In this work, we focus on discrete root architecture models where roots are described by connected line segments. These models differ (a) in their model concepts, such as describing the distance between branches by a prescribed distance (inter-nodal distance) or by a prescribed time interval, and (b) in the implementation of the same concept, such as the time step size, the spatial discretization along the root axes, or the way stochasticity of parameters such as root growth direction, growth rate, branch spacing, and branching angles is treated. Based on the example of two such different root models, the root growth module of R-SWMS and RootBox, we show the impact of these differences on simulated root architecture and on aggregated information computed from these detailed simulation results, taking into account the stochastic nature of those models. References Dunbabin, V.M., Postma, J.A., Schnepf, A., Pagès, L., Javaux, M., Wu, L., Leitner, D., Chen, Y.L., Rengel, Z., Diggle, A.J. Modelling root-soil interactions using three-dimensional models of root growth, architecture and function (2013) Plant and Soil, 372 (1-2), pp. 93-124. Gregory (2006) Roots, rhizosphere and soil: the route to a better understanding of soil science? European Journal of Soil Science 57: 2-12.
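The distance-based branching concept contrasted above can be sketched in a few lines. The 1-D simplification and all parameter values are illustrative and are not taken from R-SWMS or RootBox.

```python
import random

random.seed(1)

def grow_root(n_steps=30, dt=1.0, rate=0.5, internodal=2.0):
    """Grow a taproot as connected segments; emit a branch point whenever the
    length accumulated since the last branch exceeds the inter-nodal distance
    (the distance-based concept; a time-based model would count elapsed time)."""
    length, since_branch, branches, segments = 0.0, 0.0, [], []
    for _ in range(n_steps):
        dl = rate * dt * random.uniform(0.8, 1.2)  # stochastic growth rate
        segments.append((length, length + dl))      # 1-D segment along the axis
        length += dl
        since_branch += dl
        while since_branch >= internodal:
            since_branch -= internodal
            branches.append(length - since_branch)  # branch at the internodal mark
    return length, segments, branches

length, segments, branches = grow_root()
```

Under the distance-based rule the branch spacing comes out exactly equal to `internodal` regardless of the stochastic growth rate; a time-interval rule would instead make the spacing inherit the growth-rate variability, which is one of the concept differences the comparison targets.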

  9. On-Line Tracking Controller for Brushless DC Motor Drives Using Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Rubaai, Ahmed

    1996-01-01

    A real-time control architecture is developed for time-varying nonlinear brushless dc motors operating in a high performance drives environment. The developed control architecture possesses the capabilities of simultaneous on-line identification and control. The dynamics of the motor are modeled on-line and controlled using an artificial neural network, as the system runs. The control architecture combines the experience and dependability of adaptive tracking systems with the potential and promise of neural computing technology. The sensitivity of the real-time controller to parametric changes that occur during training is investigated. Such changes are usually manifested by rapid changes in the load of the brushless motor drives. This sudden change in the external load is simulated for the sigmoidal and sinusoidal reference tracks. The ability of the neuro-controller to maintain reasonable tracking accuracy in the presence of external noise is also verified for a number of desired reference trajectories.
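The simultaneous on-line identification loop can be reduced to its simplest case: a single linear-in-weights layer trained by an LMS (gradient) rule while the plant runs. The discrete motor model and all constants below are invented for illustration; the paper's full neural-network controller is far richer.

```python
import math

# A one-layer "network" (linear in its weights) learns the discrete motor
# model x[k+1] = a*x[k] + b*u[k] from streaming data as the system runs.
a_true, b_true = 0.9, 0.2
w = [0.0, 0.0]   # weights estimating (a, b)
eta = 0.1        # learning rate
x = 0.0

for k in range(2000):
    u = math.sin(0.05 * k)             # sinusoidal reference-like excitation
    x_next = a_true * x + b_true * u   # plant response (noise-free in this sketch)
    pred = w[0] * x + w[1] * u         # on-line model prediction
    err = x_next - pred
    w[0] += eta * err * x              # LMS gradient update, one sample at a time
    w[1] += eta * err * u
    x = x_next
```

With a persistently exciting input the weights converge to the true parameters; the identified model is then available to the controller on the very next sample, which is the essence of identification and control proceeding simultaneously.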

  10. Architecture for Survivable System Processing (ASSP)

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.

    1991-11-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor life times, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and is being developed to apply new technology into practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups (e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronic Engineers (IEEE)). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  11. Architecture for Survivable System Processing (ASSP)

    NASA Technical Reports Server (NTRS)

    Wood, Richard J.

    1991-01-01

    The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor life times, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and is being developed to apply new technology into practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups (e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronic Engineers (IEEE)). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.

  12. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information for configuring most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also supported, allowing effective management of the information elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model, and how an information model-driven architecture exhibits characteristics of agile curation, including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
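
    The model-driven configuration idea can be shown in miniature. The sketch below (hypothetical attribute names, not actual PDS4 classes) configures a generic validator entirely from a small information model, so product structure never gets hardcoded in pipeline code:

```python
# Miniature sketch of an information-model-driven component. The attribute
# names are hypothetical, not the actual PDS4 model; the point is that the
# validator is configured from the model, so the model can evolve
# independently of the code.

INFORMATION_MODEL = {
    "product_observational": {
        "logical_identifier": {"type": str, "required": True},
        "version_id":         {"type": str, "required": True},
        "target_name":        {"type": str, "required": False},
    }
}

def validate(product_class, label, model=INFORMATION_MODEL):
    """Return a list of violations of the model's requirements."""
    errors = []
    for attr, rules in model[product_class].items():
        if attr not in label:
            if rules["required"]:
                errors.append(f"missing required attribute: {attr}")
        elif not isinstance(label[attr], rules["type"]):
            errors.append(f"wrong type for {attr}")
    return errors

ok = validate("product_observational",
              {"logical_identifier": "urn:x:demo", "version_id": "1.0"})
bad = validate("product_observational", {"version_id": 2})
```

    Adding a discipline-level attribute then means editing the model, not the validator — which is the independence the abstract describes.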

  13. 78 FR 18415 - Connected Vehicle Reference Implementation Architecture Workshop; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... DEPARTMENT OF TRANSPORTATION Connected Vehicle Reference Implementation Architecture Workshop...) Intelligent Transportation System Joint Program Office (ITS JPO) will host a free Connected Vehicle Reference... manufacturing, developing, deploying, operating, or maintaining the connected [[Page 18416

  14. New ARCH: Future Generation Internet Architecture

    DTIC Science & Technology

    2004-08-01

a vocabulary to talk about a system. This provides a framework (a “reference model” ...layered model. Modularity and abstraction are central tenets of Computer Science thinking. Modularity breaks a system into parts, normally to permit ...this complexity is hidden. Abstraction suggests a structure for the system. A popular and simple structure is a layered model: lower layer

  15. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The non-linear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
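
    The predict/relinearize/update cycle of an EKF can be illustrated with a scalar toy system; the dynamics and sensor map below are invented stand-ins, not the C-MAPSS40k engine model:

```python
import math

# Scalar extended Kalman filter sketch. f and h are hypothetical nonlinear
# functions chosen only to show the relinearization step the abstract
# describes: Jacobians are re-evaluated at the current estimate each cycle.

def ekf_step(x, P, z, Q, R):
    """One EKF cycle for a scalar state: predict, then measurement update."""
    f = lambda s: s + 0.1 * math.sin(s)   # hypothetical nonlinear dynamics
    h = lambda s: s ** 2                  # hypothetical nonlinear sensor
    F = 1 + 0.1 * math.cos(x)             # Jacobian df/ds at current estimate
    x_pred = f(x)
    P_pred = F * P * F + Q                # propagate estimate covariance
    H = 2 * x_pred                        # Jacobian dh/ds at the prediction
    S = H * P_pred * H + R                # innovation covariance
    K = P_pred * H / S                    # Kalman gain
    x_new = x_pred + K * (z - h(x_pred))  # correct with the measurement
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, P = 1.0, 1.0                           # initial estimate and variance
z = 1.2 ** 2                              # noiseless reading from true state 1.2
x, P = ekf_step(x, P, z, Q=0.01, R=0.1)
```

    A single update pulls the estimate toward the true state and shrinks the variance; replacing f and h with an on-board nonlinear engine model is the step the paper takes.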

  16. Model-Based Engine Control Architecture with an Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The nonlinear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.

  17. Cognitive Modeling of Individual Variation in Reference Production and Comprehension

    PubMed Central

    Hendriks, Petra

    2016-01-01

    A challenge for most theoretical and computational accounts of linguistic reference is the observation that language users vary considerably in their referential choices. Part of the variation observed among and within language users and across tasks may be explained from variation in the cognitive resources available to speakers and listeners. This paper presents a computational model of reference production and comprehension developed within the cognitive architecture ACT-R. Through simulations with this ACT-R model, it is investigated how cognitive constraints interact with linguistic constraints and features of the linguistic discourse in speakers’ production and listeners’ comprehension of referring expressions in specific tasks, and how this interaction may give rise to variation in referential choice. The ACT-R model of reference explains and predicts variation among language users in their referential choices as a result of individual and task-related differences in processing speed and working memory capacity. Because of limitations in their cognitive capacities, speakers sometimes underspecify or overspecify their referring expressions, and listeners sometimes choose incorrect referents or are overly liberal in their interpretation of referring expressions. PMID:27092101

  18. Design and Analysis of Architectures for Structural Health Monitoring Systems

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Sixto, S. L. (Technical Monitor)

    2002-01-01

During the two-year project period, we have worked on several aspects of Health Usage and Monitoring Systems (HUMS) for structural health monitoring. In particular, we have made contributions in the following areas. 1. Reference HUMS architecture: We developed a high-level architecture for health usage and monitoring systems. The proposed reference architecture is shown. It is compatible with the Generic Open Architecture (GOA) proposed as a standard for avionics systems. 2. HUMS kernel: One of the critical layers of the HUMS reference architecture is the HUMS kernel. We developed a detailed design of a kernel to implement the high-level architecture. 3. Prototype implementation of HUMS kernel: We have implemented a preliminary version of the HUMS kernel on a Unix platform. We have implemented both a centralized version and a distributed version. 4. SCRAMNet and HUMS: SCRAMNet (Shared Common Random Access Memory Network) is a system that we found suitable for implementing HUMS. For this reason, we conducted a simulation study to determine its stability in handling the input data rates in HUMS. 5. Architectural specification.

  19. Tradeoffs in the design of a system for high level language interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osorio, F.C.C.; Patt, Y.N.

The problem of designing a system for high-level language interpretation (HLLI) is considered. First, a model of the design process is presented in which several styles of design, e.g., Turing machine interpretation, CISC architecture interpretation, and RISC architecture interpretation, are treated uniformly. Second, the most significant characteristics of HLLI are analysed in the context of different design styles, and some guidelines are presented on how to identify the most suitable design style for a given high-level language problem. 12 references.

  20. NASREN: Standard reference model for telerobot control

    NASA Technical Reports Server (NTRS)

    Albus, J. S.; Lumia, R.; Mccain, H.

    1987-01-01

A hierarchical architecture is described which supports space station telerobots in a variety of modes. The system is divided into three hierarchies: task decomposition, world model, and sensory processing. Goals at each level of the task decomposition hierarchy are divided both spatially and temporally into simpler commands for the next lower level. This decomposition is repeated until, at the lowest level, the drive signals to the robot actuators are generated. To accomplish its goals, task decomposition modules must often use information stored in the world model. The purpose of the sensory system is to update the world model as rapidly as possible to keep the model in registration with the physical world. The architecture of the entire control system hierarchy is described, along with how it can be applied to space telerobot applications.
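
    The decomposition idea can be sketched in a few lines: a goal is expanded level by level into primitive actuator commands, while sensory processing keeps a world model current. The task names below are invented for illustration:

```python
# Toy sketch of the NASREM three-hierarchy idea: task decomposition, a
# world model, and sensory processing that updates the model. Task names
# are invented; real NASREM levels are fixed and timed.

DECOMPOSITION = {  # each level splits a goal into simpler commands
    "retrieve_sample": ["move_arm_to_target", "grasp", "move_arm_to_bin", "release"],
    "move_arm_to_target": ["plan_path", "execute_path"],
    "move_arm_to_bin": ["plan_path", "execute_path"],
}

def decompose(goal):
    """Recursively expand a goal until only primitive drive commands remain."""
    if goal not in DECOMPOSITION:
        return [goal]                 # primitive: becomes an actuator signal
    out = []
    for sub in DECOMPOSITION[goal]:
        out.extend(decompose(sub))
    return out

class WorldModel:
    """Sensory processing keeps this registered with the physical world."""
    def __init__(self):
        self.state = {}
    def sensor_update(self, reading):  # called as rapidly as possible
        self.state.update(reading)

wm = WorldModel()
wm.sensor_update({"arm_pose": (0.1, 0.4, 0.2)})
plan = decompose("retrieve_sample")
```

    Task modules would consult `wm.state` (e.g. the current arm pose) when choosing among the expanded commands.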

  1. MRAC Revisited: Guaranteed Performance with Reference Model Modification

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmaje

    2010-01-01

This paper presents a modification of the conventional model reference adaptive control (MRAC) architecture in order to achieve guaranteed transient performance in both the output and input signals of an uncertain system. The proposed modification is based on feeding the tracking error back to the reference model. It is shown that the approach guarantees tracking of a given command and of the ideal control signal (the one that would be designed if the system were known), not only asymptotically but also in the transient, by a proper selection of the error feedback gain. The method prevents the generation of high-frequency oscillations that are unavoidable in conventional MRAC systems at large adaptation rates. The provided design guideline makes it possible to track a reference command of any magnitude from any initial position without re-tuning. The benefits of the method are demonstrated in simulations.
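
    The structure of the modification can be sketched for a scalar plant. The plant, gains, and command below are invented; only the shape of the scheme — an error-fed reference model plus a Lyapunov-based adaptive law — follows the idea described above:

```python
# Scalar MRAC sketch with tracking-error feedback into the reference model.
# Plant: dx/dt = a*x + u with unknown a; control u = -theta*x + r; the
# reference model receives k_e*(x - xm) in addition to the command.

def simulate(steps=5000, dt=0.01, k_e=5.0, gamma=10.0):
    a, am, r = 1.0, -2.0, 1.0     # unknown plant pole, reference pole, command
    x = xm = theta = 0.0          # plant state, reference state, adaptive gain
    for _ in range(steps):
        e = x - xm                # tracking error, fed back to the reference model
        u = -theta * x + r        # adaptive control law (ideal gain is a - am = 3)
        x += dt * (a * x + u)
        xm += dt * (am * xm + r + k_e * e)   # modified reference model
        theta += dt * gamma * e * x          # Lyapunov-based adaptation
    return x, xm, theta

x, xm, theta = simulate()
```

    With the error feedback term, the candidate Lyapunov function V = e²/2 + (θ* − θ)²/(2γ) gives V̇ = (aₘ − k_e)e² ≤ 0, so raising k_e speeds the transient without raising the adaptation rate γ.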

  2. Query Health: standards-based, cross-platform population health surveillance

    PubMed Central

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussion This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371
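
    The fan-out pattern behind such distributed queries can be sketched minimally. Real Query Health exchanges HQMF queries and QRDA results over a secure envelope; the criteria format and site data here are invented:

```python
# Minimal sketch of a distributed population-health query: a query
# definition is distributed to sites, each site evaluates it against its
# own data, and only aggregate counts leave the site. Purely illustrative.

SITES = {
    "nyc": [{"age": 34, "dx": "diabetes"}, {"age": 61, "dx": "asthma"}],
    "mass": [{"age": 45, "dx": "diabetes"}, {"age": 52, "dx": "diabetes"}],
}

def run_local(records, criteria):
    """Each site evaluates the query locally and returns only a count."""
    return sum(all(r.get(k) == v for k, v in criteria.items()) for r in records)

def distribute(query, sites=SITES):
    # Row-level data never crosses the site boundary, only aggregates.
    return {name: run_local(records, query) for name, records in sites.items()}

result = distribute({"dx": "diabetes"})
```

    The cross-platform normalization problem the pilots encountered shows up here as the assumption that every site uses the same `dx` coding.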

  3. Query Health: standards-based, cross-platform population health surveillance.

    PubMed

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites.

  4. Magnetic resonance dispersion imaging for localization of angiogenesis and cancer growth.

    PubMed

    Mischi, Massimo; Turco, Simona; Lavini, Cristina; Kompatsiari, Kyveli; de la Rosette, Jean J M C H; Breeuwer, Marcel; Wijkstra, Hessel

    2014-08-01

    Cancer angiogenesis can be imaged by using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Pharmacokinetic modeling can be used to assess vascular perfusion and permeability, but the assessment of angiogenic changes in the microvascular architecture remains challenging. This article presents 2 models enabling the characterization of the microvascular architecture by DCE-MRI. The microvascular architecture is reflected in the dispersion coefficient according to the convective dispersion equation. A solution of this equation, combined with the Tofts model, permits defining a dispersion model for magnetic resonance imaging. A reduced dispersion model is also presented. The proposed models were evaluated for prostate cancer diagnosis. Dynamic contrast-enhanced magnetic resonance imaging was performed, and concentration-time curves were calculated in each voxel. The simultaneous generation of parametric maps related to permeability and dispersion was obtained through model fitting. A preliminary validation was carried out through comparison with the histology in 15 patients referred for radical prostatectomy. Cancer localization was accurate with both dispersion models, with an area under the receiver operating characteristic curve greater than 0.8. None of the compared parameters, aimed at assessing vascular permeability and perfusion, showed better results. A new DCE-MRI method is proposed to characterize the microvascular architecture through the assessment of intravascular dispersion, without the need for separate arterial-input-function estimation. The results are promising and encourage further research.
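
    The pharmacokinetic-fitting step can be sketched with the standard Tofts model (the dispersion extension proposed in the article is more involved); the arterial input and parameter values below are invented:

```python
import math

# Standard Tofts model: Ct(t) = Ktrans * integral_0^t Cp(tau) exp(-kep (t - tau)) dtau.
# Discretized convolution, then a grid-search fit of Ktrans per voxel
# (kep held fixed for brevity). Toy data, not a dispersion model.

def tofts(Cp, dt, ktrans, kep):
    Ct = []
    for i in range(len(Cp)):
        integral = sum(Cp[j] * math.exp(-kep * (i - j) * dt) * dt
                       for j in range(i + 1))
        Ct.append(ktrans * integral)
    return Ct

dt = 0.1
Cp = [math.exp(-0.3 * i * dt) for i in range(100)]   # toy arterial input
measured = tofts(Cp, dt, ktrans=0.25, kep=0.5)       # synthetic voxel curve

candidates = [0.05 * k for k in range(1, 11)]        # Ktrans grid
best = min(candidates,
           key=lambda kt: sum((m - c) ** 2
                              for m, c in zip(measured, tofts(Cp, dt, kt, 0.5))))
```

    Repeating such a fit in every voxel yields the parametric maps the article describes; the dispersion models add a coefficient characterizing the microvascular architecture to this per-voxel fit.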

  5. A Model-Based Architecture Approach to Ship Design Linking Capability Needs to System Solutions

    DTIC Science & Technology

    2012-06-01

NSSM NATO Sea Sparrow Missile RAM Rolling Airframe Missile CIWS Close-In Weapon System 3D Three Dimensional Ps Probability of Survival PHit ...example effectiveness model. The primary MOP is the inverse of the probability of taking a hit (1 - PHit), which, in this study, will be referred to as

  6. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. 
Research results are being disseminated across NASA, other agencies, and the software community. This paper discusses the findings and TR suite informing the FM domain in best practices for FM architectural design, visibility observations, and methods employed for IV&V and mission assurance.

  7. TMN: Introduction and interpretation

    NASA Astrophysics Data System (ADS)

    Pras, Aiko

An overview of the status of the Telecommunications Management Network (TMN) is presented. Its relation to Open System Interconnection (OSI) systems management is given, and the commonalities and distinctions are identified. Those aspects that distinguish TMN from OSI management are introduced; TMN's functional and physical architectures and TMN's logical layered architecture are discussed. An analysis of the concepts used by these architectures (reference point, interface, function block, and building block) is given. The use of these concepts to express geographical distribution and functional layering is investigated. This aspect is interesting for understanding how OSI management protocols can be used in a TMN environment. Finally, a statement is given regarding the applicability of TMN as a model that helps the designers of (management) networks.

  8. Unified implementation of the reference architecture : concept of operations.

    DOT National Transportation Integrated Search

    2015-10-19

    This document describes the Concept of Operations (ConOps) for the Unified Implementation of the Reference Architecture, located in Southeast Michigan, which supports connected vehicle research and development. This ConOps describes the current state...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knirsch, Fabian; Engel, Dominik; Neureiter, Christian

In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications and therefore privacy impacts have to be taken into account. For an effective smart grid, well integrated solutions are crucial, and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of the qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  10. Human Exploration of Mars Design Reference Architecture 5.0

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.

    2010-01-01

This paper provides a summary of the Mars Design Reference Architecture 5.0 (DRA 5.0), which is the latest in a series of NASA Mars reference missions. It provides a vision of one potential approach to human Mars exploration. The reference architecture provides a common framework for future planning of systems concepts, technology development, and operational testing, as well as Mars robotic missions, research that is conducted on the International Space Station, and future lunar exploration missions. This summary of the Mars DRA 5.0 provides an overview of the overall mission approach, surface strategy and exploration goals, as well as the key systems and challenges for the first three human missions to Mars.

  11. Integrating MPI and deduplication engines: a software architecture roadmap.

    PubMed

    Baksi, Dibyendu

    2009-03-01

The objective of this paper is to clarify the major concepts related to architecture and design of patient identity management software systems so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from the Integrating the Healthcare Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model-driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of a platform-independent model. This makes the roadmap applicable to any specific solution of MPI, deduplication library, or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor-specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.
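
    One way to picture the integration point is a platform-independent sketch in which the MPI delegates matching to a pluggable deduplication engine; the naive demographic-similarity rule below is a stand-in for a real probabilistic matcher, and the PIX-style query returns all identifiers linked to one person:

```python
# Hypothetical sketch of an MPI backed by a pluggable deduplication engine.
# The matching rule (date-of-birth agreement plus name-token overlap) is a
# naive stand-in; real engines use probabilistic or ML matching.

class DedupEngine:
    THRESHOLD = 0.8
    def score(self, a, b):
        dob = 0.5 if a["dob"] == b["dob"] else 0.0
        ta = set(a["name"].lower().split())
        tb = set(b["name"].lower().split())
        return dob + 0.5 * len(ta & tb) / len(ta | tb)

class MPI:
    def __init__(self, engine):
        self.engine, self.records, self.next_eid = engine, [], 0
    def register(self, domain, local_id, demographics):
        for rec in self.records:          # link to an existing person if matched
            if self.engine.score(rec["demo"], demographics) >= self.engine.THRESHOLD:
                eid = rec["eid"]
                break
        else:                             # otherwise mint a new enterprise id
            eid, self.next_eid = self.next_eid, self.next_eid + 1
        self.records.append({"domain": domain, "id": local_id,
                             "demo": demographics, "eid": eid})
    def cross_reference(self, domain, local_id):
        """PIX-style query: all identifiers for the same person."""
        eid = next(r["eid"] for r in self.records
                   if r["domain"] == domain and r["id"] == local_id)
        return {(r["domain"], r["id"]) for r in self.records if r["eid"] == eid}

mpi = MPI(DedupEngine())
mpi.register("hospA", "a1", {"name": "John Smith", "dob": "1970-01-01"})
mpi.register("hospB", "b7", {"name": "John Q Smith", "dob": "1970-01-01"})
mpi.register("hospB", "b9", {"name": "Mary Jones", "dob": "1983-06-02"})
```

    Because the engine is an interface rather than a hardwired algorithm, a vendor matcher can replace `DedupEngine` without touching the MPI — the decoupling the roadmap argues for.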

  12. An Approach to Building a Learning Management System that Emphasizes on Incorporating Individualized Dissemination with Intelligent Tutoring

    NASA Astrophysics Data System (ADS)

    Ghosh, Sreya

    2017-02-01

This article proposes a new six-model architecture for an intelligent tutoring system to be incorporated in a learning management system with a domain-independence feature and individualized dissemination. The present six-model architecture aims to simulate a human tutor. Some recent extensions of intelligent tutoring systems (ITS) explore how learning management systems can behave like a real teacher during a teaching-learning process by taking care of, mainly, the dynamic response system. However, the present paper argues that to mimic a human teacher it needs not only the dynamic response but also the incorporation of the teacher's dynamic review of students' performance and the tracking of their current level of understanding. Here, the term individualization has been used to refer to the tailoring of contents and their dissemination to the individual needs and capabilities of learners who are taking a course online and are taught in absentia. This paper describes how the individual models of the proposed architecture achieve the features of an ITS.

  13. Framework for a clinical information system.

    PubMed

    Van de Velde, R

    2000-01-01

The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology are used, with centralised and departmental clinical information systems serving as the back-end store for all clinical data. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin client' workstations. The main characteristics are the focus on modelling and reuse of both data and business logic, as there is a shift away from data and functional modelling towards object modelling. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for that approach.

  14. Reference architecture of application services for personal wellbeing information management.

    PubMed

    Tuomainen, Mika; Mykkänen, Juha

    2011-01-01

    Personal information management has been proposed as an important enabler for individual empowerment concerning citizens' wellbeing and health information. In the MyWellbeing project in Finland, a strictly citizen-driven concept of "Coper" and related architectural and functional guidelines have been specified. We present a reference architecture and a set of identified application services to support personal wellbeing information management. In addition, the related standards and developments are discussed.

  15. Reference Avionics Architecture for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Somervill, Kevin M.; Lapin, Jonathan C.; Schmidt, Oron L.

    2010-01-01

Developing and delivering infrastructure capable of supporting long-term manned operations on the lunar surface has been a primary objective of the Constellation Program in the Exploration Systems Mission Directorate. Several concepts have been developed related to the development and deployment of lunar exploration vehicles and assets that provide critical functionality such as transportation, habitation, and communication, to name a few. Together, these systems perform complex safety-critical functions, largely dependent on avionics for control and behavior of system functions. These functions are implemented using interchangeable, modular avionics designed for lunar transit and lunar surface deployment. Systems are optimized towards reuse and commonality of form and interface and can be configured via software or component integration for special-purpose applications. There are two core concepts in the reference avionics architecture described in this report. The first concept uses distributed, smart systems to manage complexity, simplify integration, and facilitate commonality. The second core concept is to employ extensive commonality between elements and subsystems. These two concepts are used in the context of developing reference designs for many lunar surface exploration vehicles and elements. These concepts recur constantly as architectural patterns in a conceptual architectural framework. This report describes the use of these architectural patterns in a reference avionics architecture for lunar surface system elements.

  16. Experiences with Ada in an embedded system

    NASA Technical Reports Server (NTRS)

    Labaugh, Robert J.

    1988-01-01

    Recent experiences with using Ada in a real time environment are described. The application was the control system for an experimental robotic arm. The objectives of the effort were to experiment with developing embedded applications in Ada, evaluating the suitability of the language for the application, and determining the performance of the system. Additional objectives were to develop a control system based on the NASA/NBS Standard Reference Model for Telerobot Control System Architecture (NASREM) in Ada, and to experiment with the control laws and how to incorporate them into the NASREM architecture.

  17. Developing the Next Generation NATO Reference Mobility Model

    DTIC Science & Technology

    2016-06-27

acquisition • design Vehicle Dynamics Model ...and numerical resolution – for use in vehicle design, acquisition, and operational mobility planning. An open architecture was established ...the current empirical methods for simulating vehicle and suspension designs. – Industry-wide shortfall with tire dynamics and soft soil behavior

  18. A One-System Theory Which is Not Propositional.

    PubMed

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2009-04-01

    We argue that the propositional and link-based approaches to human contingency learning represent different levels of analysis because propositional reasoning requires a basis, which is plausibly provided by a link-based architecture. Moreover, in their attempt to compare two general classes of models (link-based and propositional), Mitchell et al. have referred to only two generic models and ignore the large variety of different models within each class.

  19. Human Exploration of Mars Design Reference Architecture 5.0

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.; Hoffman, Stephen J.; Beaty, David W.

    2009-01-01

    This paper provides a summary of the 2007 Mars Design Reference Architecture 5.0 (DRA 5.0), which is the latest in a series of NASA Mars reference missions. It provides a vision of one potential approach to human Mars exploration, including how Constellation systems can be used. The reference architecture provides a common framework for future planning of systems concepts, technology development, and operational testing, as well as Mars robotic missions, research conducted on the International Space Station, and future lunar exploration missions. This summary of the Mars DRA 5.0 provides an overview of the overall mission approach, surface strategy and exploration goals, as well as the key systems and challenges for the first three human missions to Mars.

  20. Understanding cellular architecture in cancer cells

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Tang, Chao

    2011-03-01

    Understanding the development of cancer is an important goal for today's science. The morphology of cellular organelles, such as the nucleus, the nucleoli and the mitochondria, which is referred to as cellular architecture or cytoarchitecture, is an important indicator of the state of the cell. In particular, there are striking differences between the cellular architecture of a healthy cell and that of a cancer cell. In this work we present a dynamical model for the evolution of organelle morphology in cancer cells. Using a dynamical systems approach, we describe the evolution of a cell on its way to cancer as a trajectory in a multidimensional morphology state space. The results provided by this work may increase our insight into the mechanism of tumorigenesis and help build new therapeutic strategies.

  1. The diabolo classifier

    PubMed

    Schwenk

    1998-11-15

    We present a new classification architecture based on autoassociative neural networks that are used to learn discriminant models of each class. The proposed architecture has several interesting properties with respect to other model-based classifiers like nearest-neighbors or radial basis functions: it has a low computational complexity and uses a compact distributed representation of the models. The classifier is also well suited for the incorporation of a priori knowledge by means of a problem-specific distance measure. In particular, we will show that tangent distance (Simard, Le Cun, & Denker, 1993) can be used to achieve transformation invariance during learning and recognition. We demonstrate the application of this classifier to optical character recognition, where it has achieved state-of-the-art results on several reference databases. Relations to other models, in particular those based on principal component analysis, are also discussed.
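The classify-by-reconstruction-error idea behind this architecture can be sketched in a few lines. This is a deliberately minimal stand-in, not the paper's method: per-class mean vectors replace the autoassociative networks, and all function names and data are invented; the shared principle is that each class has its own model and an input is assigned to the class whose model reconstructs it with the smallest error.

```python
def fit_class_models(samples, labels):
    """Fit one model per class; here the 'model' is just the class mean."""
    groups = {}
    for x, y in zip(samples, labels):
        groups.setdefault(y, []).append(x)
    return {y: [sum(col) / len(xs) for col in zip(*xs)] for y, xs in groups.items()}

def classify(models, x):
    """Assign x to the class whose model reconstructs it with smallest error."""
    def err(m):  # squared reconstruction error against one class model
        return sum((a - b) ** 2 for a, b in zip(x, m))
    return min(models, key=lambda y: err(models[y]))
```

A problem-specific distance (such as the tangent distance mentioned in the abstract) would simply replace the squared error inside `err`.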

  2. Architecture of dermatophyte cell Walls: Electron microscopic and biochemical analysis

    NASA Technical Reports Server (NTRS)

    Nozawa, Y.; Kitajima, Y.

    1984-01-01

    A review with 83 references on the cell wall structure of dermatophytes is presented. Topics discussed include separation and preparation of cell walls; microstructure of cell walls by electron microscopy; chemical composition of cell walls; structural model of cell walls; and morphological structure of cell walls.

  3. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.

  4. Application of the Life Cycle Analysis and the Building Information Modelling Software in the Architectural Climate Change-Oriented Design Process

    NASA Astrophysics Data System (ADS)

    Gradziński, Piotr

    2017-10-01

    As the World’s climate is changing (inter alia, under the influence of architectural activity), the author attempts to reorient design practice toward the use of, and adaptation to, climatic conditions. Using Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) tools in the early stages of the architectural design process defines the overriding requirements which the designer/architect should meet. The first part of the text characterizes the influence of architectural activity (through consumption, pollution, waste, etc.) and the use of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.), understood as direct negative environmental impacts. The second part presents a review of the methods and analytical techniques that counter these negative influences. Firstly, the study of the building using Life Cycle Analysis of the structure (e.g. materials) and functioning (e.g. energy consumption) of the architectural object (stages: before use, use, after use). Secondly, the use of digital analytical tools for determining the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building's form. In conclusion, the author's research results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the design process of buildings with respect to widely understood climatic change.

  5. GPS Block 2R Time Standard Assembly (TSA) architecture

    NASA Technical Reports Server (NTRS)

    Baker, Anthony P.

    1990-01-01

    The underlying philosophy of the Global Positioning System (GPS) 2R Time Standard Assembly (TSA) architecture is to utilize two frequency sources, one fixed frequency reference source and one system frequency source, and to couple the system frequency source to the reference frequency source via a sample data loop. The system source is used to provide the basic clock frequency and timing for the space vehicle (SV) and it uses a voltage controlled crystal oscillator (VCXO) with high short term stability. The reference source is an atomic frequency standard (AFS) with high long term stability. The architecture can support any type of frequency standard. In the system design rubidium, cesium, and H2 masers outputting a canonical frequency were accommodated. The architecture is software intensive. All VCXO adjustments are digital and are calculated by a processor. They are applied to the VCXO via a digital to analog converter.
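The sampled-data loop described in this abstract can be illustrated with a toy simulation. This is a hedged sketch, not the GPS 2R flight software: the gains, frequencies, and step count are invented, and the "DAC" is reduced to a direct frequency correction. It shows the architectural idea of steering the short-term-stable VCXO onto the long-term-stable atomic reference with digitally computed adjustments.

```python
def discipline(vcxo_freq, ref_freq, steps=200, kp=0.4, ki=0.05):
    """Steer a free-running oscillator onto a reference via a sampled PI loop."""
    integ = 0.0
    for _ in range(steps):
        err = ref_freq - vcxo_freq      # frequency error measured each sample
        integ += err                    # integral term absorbs drift and offset
        correction = kp * err + ki * integ  # digital word, applied via the DAC
        vcxo_freq += correction
    return vcxo_freq
```

With stable gains the loop drives the error to zero, so the VCXO inherits the long-term stability of whichever frequency standard (rubidium, cesium, H2 maser) supplies the reference.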

  6. A new class of compact high sensitive tiltmeter based on the UNISA folded pendulum mechanical architecture

    NASA Astrophysics Data System (ADS)

    Barone, Fabrizio; Giordano, Gerardo

    2018-02-01

    We present the Extended Folded Pendulum Model (EFPM), a model developed for a quantitative description of the dynamical behavior of a folded pendulum generically oriented in space. This model, based on the Tait-Bryan angular reference system, highlights the relationship between the folded pendulum's orientation in the gravitational field and its natural resonance frequency. The model, validated by tests performed with a monolithic UNISA Folded Pendulum, points to a new technique for implementing folded-pendulum-based tiltmeters.

  7. User’s guide and reference to Ash3d: a three-dimensional model for Eulerian atmospheric tephra transport and deposition

    USGS Publications Warehouse

    Mastin, Larry G.; Randall, Michael J.; Schwaiger, Hans F.; Denlinger, Roger P.

    2013-01-01

    Ash3d is a three-dimensional Eulerian atmospheric model for tephra transport, dispersal, and deposition, written by the authors to study and forecast hazards of volcanic ash clouds and tephra fall. In this report, we explain how to set up simulations using both a web interface and an ASCII input file, and how to view and interpret model output. We also summarize the architecture of the model and some of its properties.

  8. Image Reference Database in Teleradiology: Migrating to WWW

    NASA Astrophysics Data System (ADS)

    Pasqui, Valdo

    The paper presents a multimedia Image Reference Data Base (IRDB) used in Teleradiology. The application was developed at the University of Florence in the framework of the European Community TELEMED Project. TELEMED's overall goals and IRDB requirements are outlined and the resulting architecture is described. IRDB is a multisite database containing radiological images, selected because of their scientific interest, together with their related information. The architecture consists of a set of IRDB Installations which are accessed from Viewing Stations (VS) located at different medical sites. The interaction between VS and IRDB Installations follows the client-server paradigm and uses an OSI level-7 protocol, named Telemed Communication Language. After reviewing the Florence prototype implementation and experimentation, the migration of IRDB to the World Wide Web (WWW) is discussed. A possible scenario for implementing IRDB on the WWW model is depicted in order to exploit the capabilities of WWW servers and browsers. Finally, the advantages of this conversion are outlined.

  9. The Role of a Reference Synthetic Data Generator within the Field of Learning Analytics

    ERIC Educational Resources Information Center

    Berg, Alan M.; Mol, Stefan T.; Kismihók, Gábor; Sclater, Niall

    2016-01-01

    This paper details the anticipated impact of synthetic "big" data on learning analytics (LA) infrastructures, with a particular focus on data governance, the acceleration of service development, and the benchmarking of predictive models. By reviewing two cases, one at the sector-wide level (the Jisc learning analytics architecture) and…

  10. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  11. MDA-based EHR application security services.

    PubMed

    Blobel, Bernd; Pharow, Peter

    2004-01-01

    Component-oriented, distributed, virtual EHR systems have to meet enhanced security and privacy requirements. In the context of advanced architectural paradigms such as component orientation, model-driven development, and knowledge-based systems, the standardised security services needed have to be specified and implemented in an integrated way following the same paradigm. This concerns the deployment of formal models, meta-languages, reference models such as the ISO RM-ODP, and development as well as implementation tools. The results of the international projects presented follow this line of development.

  12. Deep RNNs for video denoising

    NASA Astrophysics Data System (ADS)

    Chen, Xinyuan; Song, Li; Yang, Xiaokang

    2016-09-01

    Video denoising can be described as the problem of mapping a specific length of noisy frames to a clean one. We propose a deep architecture based on Recurrent Neural Networks (RNNs) for video denoising. The model learns a patch-based end-to-end mapping between the clean and noisy video sequences: it takes the corrupted video sequences as input and outputs the clean one. Our deep network, which we refer to as deep Recurrent Neural Networks (deep RNNs or DRNNs), stacks RNN layers, where each layer receives the hidden state of the previous layer as input. Experiments show that (i) the recurrent architecture extracts motion information through the temporal domain and benefits video denoising; (ii) the deep architecture has enough capacity to express the mapping between corrupted videos as input and clean videos as output; and (iii) the model generalizes to learn different mappings from videos corrupted by different types of noise (e.g., Poisson-Gaussian noise). By training on large video databases, we are able to compete with some existing video denoising methods.
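The stacking scheme described here, where each layer receives the previous layer's hidden state as input, can be sketched as a forward pass. This is an illustrative toy with scalar weights, not the paper's trained model; the function name and weight values are invented.

```python
import math

def drnn_forward(sequence, num_layers=3, w_in=0.5, w_rec=0.3):
    """Forward pass of a stacked RNN: layer l consumes layer l-1's hidden state."""
    h = [0.0] * num_layers      # one hidden state per layer
    outputs = []
    for x in sequence:
        inp = x                 # the noisy sample enters the bottom layer
        for l in range(num_layers):
            h[l] = math.tanh(w_in * inp + w_rec * h[l])  # recurrent update
            inp = h[l]          # hidden state feeds the layer above
        outputs.append(h[-1])   # top layer's state is the denoised estimate
    return outputs
```

In the real model each scalar would be a weight matrix learned from clean/noisy patch pairs; the recurrence over time is what lets the network exploit motion information across frames.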

  13. Advanced and secure architectural EHR approaches.

    PubMed

    Blobel, Bernd

    2006-01-01

    Electronic Health Records (EHRs) provided as a lifelong patient record are advancing towards core applications of distributed and co-operating health information systems and health networks. To meet the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and be model driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model - Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives comprise the enterprise view (business processes, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. These views have to be established for components reflecting aspects of all domains involved in healthcare environments, including administrative, legal, medical, technical, etc. Thus, security-related component models reflecting all the views mentioned have to be established to enable both application and communication security services as an integral part of the system's architecture. Besides the decomposition and simplification of systems through the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on properties of basic components to form a more complex structure. The resulting models describe both the structure and the behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles.
    In that context, the Australian GEHR project, the openEHR initiative, and the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, but also the HL7 Version 3 activities, are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, HL7's Clinical Document Architecture (CDA), as well as the set of models from use cases, activity diagrams and sequence diagrams up to Domain Information Models (DMIMs) and their building blocks, Common Message Element Types (CMETs), constrained to their underlying concepts. The future-proof EHR architecture, as an open, user-centric, user-friendly, flexible, scalable, portable core application in health information systems and health networks, has to follow advanced architectural paradigms.

  14. Quantification of the effects of architectural traits on dry mass production and light interception of tomato canopy under different temperature regimes using a dynamic functional–structural plant model

    PubMed Central

    Chen, Tsu-Wei; Nguyen, Thi My Nguyet; Kahlen, Katrin; Stützel, Hartmut

    2014-01-01

    There is increasing interest in evaluating the environmental effects on crop architectural traits and yield improvement. However, crop models describing the dynamic changes in canopy structure with environmental conditions and the complex interactions between canopy structure, light interception, and dry mass production are only gradually emerging. Using tomato (Solanum lycopersicum L.) as a model crop, a dynamic functional–structural plant model (FSPM) was constructed, parameterized, and evaluated to analyse the effects of temperature on architectural traits, which strongly influence canopy light interception and shoot dry mass. The FSPM predicted the organ growth, organ size, and shoot dry mass over time with high accuracy (>85%). Analyses of this FSPM showed that, in comparison with the reference canopy, shoot dry mass may be affected by leaf angle by as much as 20%, leaf curvature by up to 7%, the leaf length:width ratio by up to 5%, internode length by up to 9%, and curvature ratios and leaf arrangement by up to 6%. Tomato canopies at low temperature had higher canopy density and were more clumped due to higher leaf area and shorter internodes. Interestingly, dry mass production and light interception of the clumped canopy were more sensitive to changes in architectural traits. The complex interactions between architectural traits, canopy light interception, dry mass production, and environmental conditions can be studied by the dynamic FSPM, which may serve as a tool for designing a canopy structure which is ‘ideal’ in a given environment. PMID:25183746

  15. Analysis and comparison of NoSQL databases with an introduction to consistent references in big data storage systems

    NASA Astrophysics Data System (ADS)

    Dziedzic, Adam; Mulawka, Jan

    2014-11-01

    NoSQL is a new approach to data storage and manipulation. The aim of this paper is to gain more insight into NoSQL databases, as we are still in the early stages of understanding when and how to use them appropriately. In this submission descriptions of selected NoSQL databases are presented. Each of the databases is analysed with primary focus on its data model, data access, architecture and practical usage in real applications. Furthermore, the NoSQL databases are compared with respect to data references. Relational databases offer foreign keys, whereas NoSQL databases provide us with only limited references. An intermediate model between graph theory and relational algebra which can address the problem should be created. Finally, a new approach to the problem of inconsistent references in Big Data storage systems is proposed.
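The contrast between enforced foreign keys and the limited, unenforced references typical of NoSQL stores can be made concrete with a small consistency check. This is a hypothetical illustration, not code from the paper; the document layout and field names (`_id`, `refs`) are invented.

```python
def dangling_references(collections):
    """Find references that point at documents which do not exist.

    `collections` maps a collection name to a list of documents; each
    document has an `_id` and an optional `refs` list of
    (target_collection, target_id) pairs. Without foreign-key enforcement,
    nothing stops a stored reference from going stale.
    """
    ids = {(name, doc["_id"]) for name, docs in collections.items() for doc in docs}
    broken = []
    for name, docs in collections.items():
        for doc in docs:
            for target_coll, target_id in doc.get("refs", []):
                if (target_coll, target_id) not in ids:
                    broken.append((name, doc["_id"], target_coll, target_id))
    return broken
```

A relational database would reject the inconsistent write up front; here the inconsistency can only be detected after the fact, which is the gap the proposed intermediate model aims to close.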

  16. Contrôle du vol longitudinal d'un avion civil avec satisfaction de qualités de manoeuvrabilité (Longitudinal flight control of a civil aircraft with handling qualities satisfaction)

    NASA Astrophysics Data System (ADS)

    Saussie, David Alexandre

    2010-03-01

    Fulfilling handling qualities still remains a challenging problem in flight control design. These criteria of different natures are derived from a wide experience based upon flight tests and data analysis, and they have to be considered if one expects good behaviour of the aircraft. The goal of this thesis is to develop synthesis methods able to satisfy these criteria with fixed classical architectures imposed by the manufacturer, or with a new flight control architecture. This is applied to the longitudinal flight model of a Bombardier Inc. business jet aircraft, namely the Challenger 604. A first step of our work consists in compiling the most commonly used handling qualities criteria in order to compare them. Special attention is devoted to the dropback criterion, for which theoretical analysis leads us to establish a practical formulation for synthesis purposes. Moreover, the comparison of the criteria through a reference model highlights dominant criteria that, once satisfied, ensure that the other ones are satisfied too. Consequently, we are able to consider the fulfillment of these criteria in the fixed control architecture framework. Guardian maps (Saydy et al., 1990) are then considered to handle the problem. Originally intended for robustness studies, they are integrated in various algorithms for controller synthesis. Incidentally, this fixed-architecture problem is similar to the static output feedback stabilization problem and reduced-order controller synthesis. Algorithms performing stabilization and pole assignment in a specific region of the complex plane are then proposed. Afterwards, they are extended to handle the gain-scheduling problem. The controller is then scheduled through the entire flight envelope with respect to scheduling parameters. Thereafter, the fixed architecture is put aside, conserving only the same output signals. 
    The main idea is to use H-infinity synthesis to obtain an initial controller that satisfies the handling qualities thanks to reference model pairing and is robust against mass and center of gravity variations. Using robust modal control (Magni, 2002), we are able to substantially reduce the controller order and to structure it so as to come close to a classical architecture. An auto-scheduling method finally allows us to schedule the controller with respect to the scheduling parameters. Two different paths are thus used to solve the same problem; each one exhibits its own advantages and disadvantages.

  17. Behavioral Reference Model for Pervasive Healthcare Systems.

    PubMed

    Tahmasbi, Arezoo; Adabi, Sahar; Rezaee, Ali

    2016-12-01

    The emergence of mobile healthcare systems is an important outcome of applying pervasive computing concepts to medical care. These systems provide the facilities and infrastructure required for automatic and ubiquitous sharing of medical information. Healthcare systems have a dynamic structure and configuration; therefore an architecture is essential for the future development of these systems. The need for increased response rates, the problem of limited storage, accelerated processing, and the tendency toward creating a new generation of healthcare system architectures highlight the need for further focus on cloud-based solutions to data transfer and data processing challenges. Integrity and reliability of healthcare systems are of critical importance, as even the slightest error may put patients' lives in danger; therefore acquiring a behavioral model for these systems and developing the tools required to model their behavior are of significant importance. High-level designs may contain flaws, so the system must be fully examined under different scenarios and conditions. This paper presents a software architecture for the development of healthcare systems based on pervasive computing concepts, and then models the behavior of the described system. A set of solutions is then proposed to improve the design's qualitative characteristics, including availability, interoperability and performance.

  18. Rosen's (M,R) system as an X-machine.

    PubMed

    Palmer, Michael L; Williams, Richard A; Gatherer, Derek

    2016-11-07

    Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly both irreducible to sub-models of its component states and non-computable on a Turing machine. (M,R) stands as an obstacle to both reductionist and mechanistic presentations of systems biology, principally due to its self-referential structure. If (M,R) has the properties claimed for it, computational systems biology will not be possible, or at best will be a science of approximate simulations rather than accurate models. Several attempts have been made, at both empirical and theoretical levels, to disprove this assertion by instantiating (M,R) in software architectures. So far, these efforts have been inconclusive. In this paper, we attempt to demonstrate why - by showing how both finite state machine and stream X-machine formal architectures fail to capture the self-referential requirements of (M,R). We then show that a solution may be found in communicating X-machines, which remove self-reference using parallel computation, and then synthesise such machine architectures with object-orientation to create a formal basis for future software instantiations of (M,R) systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
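The stream X-machine formalism discussed above, a finite state machine whose transitions are functions acting on a memory as well as on the input stream, can be sketched as follows. This is an illustrative toy, not the paper's (M,R) encoding; the class and the example transition table are invented.

```python
class StreamXMachine:
    """A finite control structure whose arcs are memory-updating functions."""

    def __init__(self, transitions, state, memory):
        # transitions: {(state, input_symbol): (next_state, fn)} where
        # fn(memory, input_symbol) -> (new_memory, output_symbol)
        self.transitions = transitions
        self.state, self.memory = state, memory

    def run(self, inputs):
        outputs = []
        for sym in inputs:
            next_state, fn = self.transitions[(self.state, sym)]
            self.memory, out = fn(self.memory, sym)  # arc reads and writes memory
            outputs.append(out)
            self.state = next_state
        return outputs
```

The memory is what distinguishes an X-machine from a plain finite state machine; communicating X-machines, as proposed in the paper, would wire the output stream of one such machine to the input stream of another.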

  19. Bio-inspired adaptive feedback error learning architecture for motor control.

    PubMed

    Tolu, Silvia; Vanegas, Mauricio; Luque, Niceto R; Garrido, Jesús A; Ros, Eduardo

    2012-10-01

    This study proposes an adaptive control architecture based on an accurate regression method called Locally Weighted Projection Regression (LWPR) and on a bio-inspired module, namely a cerebellar-like engine. This hybrid architecture takes full advantage of the machine learning module (LWPR kernel) to abstract an optimized representation of the sensorimotor space, while the cerebellar component integrates this to generate corrective terms in the framework of a control task. Furthermore, we illustrate how a simple adaptive error feedback term allows the proposed architecture to be used even in the absence of an accurate analytic reference model. The presented approach achieves accurate control with low-gain corrective terms (suitable for compliant control schemes). We evaluate the contribution of the different components of the proposed scheme by comparing the obtained performance with alternative approaches. We then show that the presented architecture can be used for accurate manipulation of different objects when their physical properties are not directly known by the controller, and we evaluate how the scheme scales for simulated plants with a high number of Degrees of Freedom (7 DOFs).
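The feedback error learning idea underlying this architecture, in which the feedback controller's output serves as the teaching signal for the feedforward (inverse) model, can be sketched with scalar gains. This is a hedged toy example, not the LWPR/cerebellar implementation of the paper; the plant, gains, and learning rate are all invented.

```python
def fel(plant_gain=2.0, kp=0.3, lr=0.2, target=1.0, steps=300):
    """Feedback error learning on a scalar plant y = plant_gain * u."""
    w, y = 0.0, 0.0                  # w: learned inverse-model gain
    for _ in range(steps):
        u_fb = kp * (target - y)     # feedback correction from tracking error
        u = w * target + u_fb        # feedforward + feedback command
        y = plant_gain * u           # plant response
        w += lr * u_fb * target      # the feedback term trains the inverse model
    return w
```

As learning progresses the feedforward term takes over and the feedback correction shrinks toward zero, which is why the scheme ends up with the low-gain corrective terms the abstract emphasizes: here `w` converges to the plant's true inverse gain, 1/plant_gain.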

  20. Insider Threat Security Reference Architecture

    DTIC Science & Technology

    2012-04-01

    this challenge. CMU/SEI-2012-TR-007: The Components of the ITSRA. Figure 2 shows the four layers of the ITSRA. The Business Security layer ... organizations improve their level of preparedness to address the insider threat. Business Security Architecture; Data Security Architecture

  1. Current state of the mass storage system reference model

    NASA Technical Reports Server (NTRS)

    Coyne, Robert

    1993-01-01

    IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces and, in the distributed case, the associated protocols, for each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees and chairs; IEEE Standards Activity Board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.

  2. Urban Modelling with Typological Approach. Case Study: Merida, Yucatan, Mexico

    NASA Astrophysics Data System (ADS)

    Rodriguez, A.

    2017-08-01

    In three-dimensional models of urban historical reconstruction, lost contextual architecture poses difficulties because, in contrast to the most important monuments, few written references to it exist. This is the case of Merida, Yucatan, Mexico during the Colonial Era (1542-1810), which has lost much of its heritage. An alternative for offering a hypothetical view of these elements is a typological-parametric definition that allows a 3D modeling approach to the most common features of this heritage evidence.

  3. A Neural Network Architecture For Rapid Model Indexing In Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Pawlicki, Ted

    1988-03-01

    Models of objects stored in memory have been shown to be useful for guiding the processing of computer vision systems. A major consideration in such systems, however, is how stored models are initially accessed and indexed by the system. As the number of stored models increases, the time required to search memory for the correct model becomes high. Parallel, distributed, connectionist neural networks have been shown to have appealing content-addressable memory properties. This paper discusses an architecture for efficient storage and reference of model memories stored as stable patterns of activity in a parallel, distributed, connectionist neural network. The emergent properties of content addressability and resistance to noise are exploited to perform indexing of the appropriate object-centered model from image-centered primitives. The system consists of three network modules, each of which represents information relative to a different frame of reference. The model memory network is a large state space vector where fields in the vector correspond to ordered component objects and relative, object-based spatial relationships between the component objects. The component assertion network represents evidence about the existence of object primitives in the input image. It establishes local frames of reference for object primitives relative to the image-based frame of reference. The spatial relationship constraint network is an intermediate representation which enables the association between the object-based and the image-based frames of reference. This intermediate level represents information about possible object orderings and establishes relative spatial relationships from the image-based information in the component assertion network below. It is also constrained by the lawful object orderings in the model memory network above. The system design is consistent with current psychological theories of recognition by components. 
    It also seems to support Marr's notions of hierarchical indexing (i.e., the specificity, adjunct, and parent indices). It supports the notion that multiple canonical views of an object may have to be stored in memory to enable its efficient identification. The use of variable fields in the state space vectors appears to keep the number of required nodes in the network down to a tractable number while imposing a semantic value on different areas of the state space. This semantic imposition supports an interface between the analogical aspects of neural networks and the propositional paradigms of symbolic processing.

  4. The Semantic Mapping of Archival Metadata to the CIDOC CRM Ontology

    ERIC Educational Resources Information Center

    Bountouri, Lina; Gergatsoulis, Manolis

    2011-01-01

    In this article we analyze the main semantics of archival description, expressed through Encoded Archival Description (EAD). Our main target is to map the semantics of EAD to the CIDOC Conceptual Reference Model (CIDOC CRM) ontology as part of a wider integration architecture of cultural heritage metadata. Through this analysis, it is concluded…

  5. Problems of Implementing SCORM in an Enterprise Distance Learning Architecture: SCORM Incompatibility across Multiple Web Domains.

    ERIC Educational Resources Information Center

    Engelbrecht, Jeffrey C.

    2003-01-01

    Delivering content to distant users located in dispersed networks, separated by firewalls and different web domains requires extensive customization and integration. This article outlines some of the problems of implementing the Sharable Content Object Reference Model (SCORM) in the Marine Corps' Distance Learning System (MarineNet) and extends…

  6. Improving TOGAF ADM 9.1 Migration Planning Phase by ITIL V3 Service Transition

    NASA Astrophysics Data System (ADS)

    Hanum Harani, Nisa; Akhmad Arman, Arry; Maulana Awangga, Rolly

    2018-04-01

    Business-transformation planning that involves new technology requires a transition and migration planning process. Planning the system migration activity is the most important part: the migration process includes complex elements such as business re-engineering, transition scheme mapping, data transformation, application development, human-computer involvement, and trial interaction. TOGAF ADM is a framework and method for enterprise architecture implementation, and it provides guidance for architecture and migration planning. That planning includes an implementation solution (in this case, an IT solution), but once the solution becomes IT operational planning, TOGAF cannot handle it. This paper presents a new framework model that details the transition process by integrating TOGAF and ITIL. We evaluated our model in a field study at a private university.

  7. Hospital enterprise Architecture Framework (Study of Iranian University Hospital Organization).

    PubMed

    Haghighathoseini, Atefehsadat; Bobarshad, Hossein; Saghafi, Fatehmeh; Rezaei, Mohammad Sadegh; Bagherzadeh, Nader

    2018-06-01

    Nowadays, developing smart and fast services for patients and transforming hospitals into modern hospitals is considered a necessity. In a world inundated with information systems, designing services based on information technology requires a suitable architecture framework. This paper presents a localized enterprise architecture framework for an Iranian university hospital. Using the two dimensions of implementability and possession of appropriate characteristics, the 17 best enterprise architecture frameworks were chosen. As part of this effort, five criteria were selected according to experts' input, and the five highest-ranking frameworks were identified against these criteria. After careful study, 44 general characteristics were extracted from the 17 frameworks, and a questionnaire was written to determine the necessity of each characteristic using expert opinion and the Delphi method; this yielded eight important criteria. Next, using the AHP method, TOGAF was chosen from among the reference frameworks for having the appropriate characteristics and for its implementability. An enterprise architecture framework was then designed with TOGAF as a conceptual model with its layers. To determine the parts of the architecture framework, a questionnaire with 145 questions was written based on a literature review and expert opinion. The results showed that, in localizing TOGAF for Iran, 111 of the 145 parts were chosen and certified for use in the hospital, indicating that TOGAF is suitable for hospital use. A localized Hospital Enterprise Architecture Model was thus developed by customizing TOGAF for an Iranian hospital at eight levels and 11 parts. This new model could also be applied in other Iranian hospitals. Copyright © 2018 Elsevier B.V. All rights reserved.
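    The AHP ranking step mentioned above can be sketched as follows; the pairwise comparison matrix and its values are hypothetical, not the study's data. Priorities are approximated by averaging the rows of the column-normalized comparison matrix:

```python
# Illustrative AHP priority computation for three hypothetical criteria
# (not the study's actual comparisons). Each matrix[i][j] states how much
# more important criterion i is than criterion j on Saaty's 1-9 scale.

matrix = [
    [1.0, 3.0, 5.0],   # criterion A vs A, B, C
    [1/3, 1.0, 3.0],   # criterion B
    [1/5, 1/3, 1.0],   # criterion C
]

n = len(matrix)
col_sums = [sum(row[j] for row in matrix) for j in range(n)]
normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
priorities = [sum(normalized[i]) / n for i in range(n)]
print([round(p, 3) for p in priorities])  # the (unrounded) weights sum to 1.0
```

The alternative (here, a framework) with the highest aggregate priority would be selected; consistency of the judgments is normally also checked via the consistency ratio.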

  8. LifeWatch - a Large-scale eScience Infrastructure to Assist in Understanding and Managing our Planet's Biodiversity

    NASA Astrophysics Data System (ADS)

    Hernández Ernst, Vera; Poigné, Axel; Los, Walter

    2010-05-01

    Understanding and managing the complexity of the biodiversity system in relation to global changes in land use and climate, with their social and economic implications, is crucial to mitigating species loss and biodiversity change in general. The sustainable development and exploitation of existing biodiversity resources require flexible and powerful infrastructures offering, on the one hand, access to large-scale databases of observations and measurements, to advanced analytical and modelling software, and to high-performance computing environments and, on the other hand, the interlinkage of European scientific communities with each other and with national policies. The European Strategy Forum on Research Infrastructures (ESFRI) selected the "LifeWatch e-science and technology infrastructure for biodiversity research" as a promising development for constructing facilities that help meet these challenges. LifeWatch collaborates with other selected initiatives (e.g. ICOS, ANAEE, NOHA, and LTER-Europa) to integrate the infrastructures at landscape and regional scales. This should result in a cooperating cluster of such infrastructures supporting an integrated approach to data capture and transmission, data management, and harmonisation. In addition, facilities for exploration, forecasting, and presentation using heterogeneous and distributed data and tools should enable interdisciplinary scientific research at any spatial and temporal scale. LifeWatch is an example of a new generation of interoperable research infrastructures based on standards and a service-oriented architecture that allows linkage with external resources and associated infrastructures.
    External data sources will be established data aggregators such as the Global Biodiversity Information Facility (GBIF) for species occurrences; other EU Networks of Excellence like the Long-Term Ecological Research Network (LTER), GMES, and GEOSS for terrestrial monitoring; the MARBEF network for marine data; and the Consortium of European Taxonomic Facilities (CETAF) and its European Distributed Institute of Taxonomy (EDIT) for taxonomic data. But "smaller" networks and volunteer scientists may also send data (e.g. GPS-supported species observations) to a LifeWatch repository, and autonomous wireless environmental sensors and other smart hand-held devices will further increase data capture. In this way LifeWatch will directly underpin the development of GEOBON, the biodiversity component of GEOSS, the Global Earth Observation System of Systems. To overcome the major technical difficulties imposed by the variety of current and future technologies, protocols, data formats, etc., LifeWatch will define and use common open interfaces. For this purpose, the LifeWatch Reference Model was developed during the preparatory phase, specifying the service-oriented architecture underlying the ICT infrastructure. The Reference Model identifies key requirements and key architectural concepts to support workflows for scientific in-silico experiments, tracking of provenance, and semantic enhancement, besides meeting the functional requirements mentioned before. It provides guidelines for the specification and implementation of services and information models, and itself defines a number of generic services and models. Another key issue addressed by the Reference Model is organizing the cooperation of the many developer teams residing in many European countries so that they obtain compatible results; conformance with the specifications and policies of the Reference Model will therefore be required.
    The LifeWatch Reference Model is based on the ORCHESTRA Reference Model for geospatial-oriented architectures and service networks, which provides a generic framework and has been endorsed as best practice by the Open Geospatial Consortium (OGC). The LifeWatch infrastructure will allow (interdisciplinary) scientific researchers to collaborate by creating e-Laboratories or by composing e-Services that can be shared and jointly developed. To this end, a long-term vision for the LifeWatch Biodiversity Workbench Portal has been developed as a one-stop application for the LifeWatch infrastructure based on existing and emerging technologies. There users can find all available resources such as data, workflows, and tools, and can access LifeWatch applications that integrate different resources and provide key capabilities such as resource discovery and visualisation, creation of workflows, creation and management of provenance, and support for collaborative activities. While LifeWatch developers will construct components for solving generic LifeWatch tasks, users may add their own facilities to fulfil individual needs. Examples of applying the LifeWatch Reference Model and the LifeWatch Biodiversity Workbench Portal will be given.

  9. New approaches to digital transformation of petrochemical production

    NASA Astrophysics Data System (ADS)

    Andieva, E. Y.; Kapelyuhovskaya, A. A.

    2017-08-01

    The newest concepts of reference architectures for digital industrial transformation are considered, and the problems of applying them to enterprises whose life cycle includes the processing and marketing of oil products are identified. A reference architecture concept is proposed that provides a systematic representation of the fundamental changes in production management approaches based on the automation of production process control.

  10. Internet Architecture: Lessons Learned and Looking Forward

    DTIC Science & Technology

    2006-12-01

    Internet Architecture: Lessons Learned and Looking Forward Geoffrey G. Xie Department of Computer Science Naval Postgraduate School April 2006... Internet architecture. ...readers are referred there for more information about a specific protocol or concept. 2. Origin of Internet Architecture The Internet is easily

  11. Using NASA's Reference Architecture: Comparing Polar and Geostationary Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Burnett, Michael

    2013-01-01

    The JPSS and GOES-R programs are housed at NASA GSFC and jointly implemented by NASA and NOAA to NOAA requirements. NASA's role in the JPSS Ground System is to develop and deploy the system according to NOAA requirements; NASA's role in the GOES-R ground segment is to provide systems engineering expertise and oversight for NOAA's development and deployment of the system. NASA's Earth Science Data Systems Reference Architecture is a document, developed by NASA's Earth Science Data Systems Standards Process Group, that describes a NASA Earth observing mission ground system as a generic abstraction. The authors work within the respective ground segment projects and are also, separately, contributors to the Reference Architecture document. Opinions expressed are the authors' own and are not NOAA, NASA, or the Ground Projects' official positions.

  12. Mathematical Modeling of RNA-Based Architectures for Closed Loop Control of Gene Expression.

    PubMed

    Agrawal, Deepak K; Tang, Xun; Westbrook, Alexandra; Marshall, Ryan; Maxwell, Colin S; Lucks, Julius; Noireaux, Vincent; Beisel, Chase L; Dunlop, Mary J; Franco, Elisa

    2018-05-08

    Feedback allows biological systems to control gene expression precisely and reliably, even in the presence of uncertainty, by sensing and processing environmental changes. Taking inspiration from natural architectures, synthetic biologists have engineered feedback loops to tune the dynamics and improve the robustness and predictability of gene expression. However, experimental implementations of biomolecular control systems are still far from satisfying performance specifications typically achieved by electrical or mechanical control systems. To address this gap, we present mathematical models of biomolecular controllers that enable reference tracking, disturbance rejection, and tuning of the temporal response of gene expression. These controllers employ RNA transcriptional regulators to achieve closed loop control where feedback is introduced via molecular sequestration. Sensitivity analysis of the models allows us to identify which parameters influence the transient and steady state response of a target gene expression process, as well as which biologically plausible parameter values enable perfect reference tracking. We quantify performance using typical control theory metrics to characterize response properties and provide clear selection guidelines for practical applications. Our results indicate that RNA regulators are well-suited for building robust and precise feedback controllers for gene expression. Additionally, our approach illustrates several quantitative methods useful for assessing the performance of biomolecular feedback control systems.
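    The sequestration-based feedback idea can be sketched with a minimal antithetic-style model: two controller species annihilate each other, so at steady state the output tracks the set point mu/theta regardless of plant parameters. The equations and parameter values below are illustrative assumptions, not the paper's RNA circuit:

```python
# Minimal sequestration (antithetic-style) feedback controller, integrated
# with forward Euler. Illustrative parameters only; not the paper's model.
# z1 is driven by the reference mu, z2 by the measured output theta*x, and
# the annihilation reaction z1 + z2 -> 0 closes the loop.

def simulate(mu=2.0, theta=1.0, eta=50.0, k=1.0, delta=0.5,
             dt=0.001, t_end=200.0):
    z1 = z2 = x = 0.0
    for _ in range(int(t_end / dt)):
        seq = eta * z1 * z2          # sequestration (annihilation) flux
        dz1 = mu - seq               # reference input drives z1
        dz2 = theta * x - seq        # output measurement drives z2
        dx = k * z1 - delta * x      # z1 actuates the target gene
        z1 += dz1 * dt
        z2 += dz2 * dt
        x += dx * dt
    return x

x_ss = simulate()
print(round(x_ss, 2))  # approaches the set point mu/theta = 2.0
```

Because the difference z1 - z2 integrates the tracking error mu - theta*x, the steady-state output is pinned at mu/theta, which is the "perfect reference tracking" behavior the abstract refers to.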

  13. 77 FR 35962 - Utilizing Rapidly Deployable Aerial Communications Architecture in Response to an Emergency

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ... Aerial Communications Architecture in Response to an Emergency AGENCY: Federal Communications Commission... deployable aerial communications architecture (DACA) in facilitating emergency response by rapidly restoring... copying during normal business hours in the FCC Reference Information Center, Portals II, 445 12th Street...

  14. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  15. Archetype-based semantic integration and standardization of clinical data.

    PubMed

    Moner, David; Maldonado, Jose A; Bosca, Diego; Fernandez, Jesualdo T; Angulo, Carlos; Crespo, Pere; Vivancos, Pedro J; Robles, Montserrat

    2006-01-01

    One of the basic needs of any healthcare professional is to be able to access patients' clinical information in an understandable and normalized way. The lifelong clinical information of a person, supported by electronic means, constitutes his or her Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. The Dual Model architecture has emerged as a proposal for maintaining a homogeneous representation of the EHR with a clear separation between information and knowledge. Information is represented by a Reference Model, which describes common data structures with minimal semantics. Knowledge is specified by archetypes, which are formal representations of clinical concepts built upon a particular Reference Model. This kind of architecture was originally conceived for the implementation of new clinical information systems, but archetypes can also be used to integrate data from existing, non-normalized systems, adding at the same time a semantic meaning to the integrated data. In this paper we explain the possible use of a Dual Model approach for semantic integration and standardization of heterogeneous clinical data sources and present LinkEHR-Ed, a tool for developing archetypes as elements for integration purposes. LinkEHR-Ed has been designed to be easily used by the two main participants in the process of creating archetypes for clinical data integration: the health domain expert and the information technologies domain expert.
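    The separation between Reference Model information and archetype knowledge can be illustrated with a toy sketch; the field names, ranges, and validation logic below are hypothetical, not actual openEHR or LinkEHR-Ed definitions:

```python
# Toy illustration of the Dual Model idea: information lives in a generic,
# minimally-semantic structure, while knowledge (the archetype) is a separate
# set of constraints validated against it. All names/ranges are hypothetical.

# Information: a generic reference-model instance
entry = {"systolic": 120, "diastolic": 80, "units": "mmHg"}

# Knowledge: an archetype expressed as formal constraints on that structure
blood_pressure_archetype = {
    "systolic":  {"type": int, "min": 0, "max": 300},
    "diastolic": {"type": int, "min": 0, "max": 200},
    "units":     {"type": str, "allowed": ["mmHg"]},
}

def validate(data, archetype):
    """Check a reference-model instance against archetype constraints."""
    for field, rules in archetype.items():
        value = data.get(field)
        if not isinstance(value, rules["type"]):
            return False
        if "min" in rules and not (rules["min"] <= value <= rules["max"]):
            return False
        if "allowed" in rules and value not in rules["allowed"]:
            return False
    return True

print(validate(entry, blood_pressure_archetype))  # True: the data conforms
```

The same generic structure can be validated against different archetypes, which is what lets archetypes evolve independently of the stored information.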

  16. A Structured Approach for Reviewing Architecture Documentation

    DTIC Science & Technology

    2009-12-01

    as those found in ISO 12207 [ISO/IEC 12207:2008] (for software engineering), ISO 15288 [ISO/IEC 15288:2008] (for systems engineering), the Rational... Open Distributed Processing - Reference Model: Foundations (ISO/IEC 10746-2). 1996. [ISO/IEC 12207:2008] International Organization for Standardization & International Electrotechnical Commission. Systems and software engineering – Software life cycle processes (ISO/IEC 12207). 2008. [ISO

  17. A STUDY OF CONTINUING EDUCATION NEEDS OF SELECTED PROFESSIONAL GROUPS AND UNIVERSITY EXTENSION CONTRACT PROGRAMS IN WYOMING.

    ERIC Educational Resources Information Center

    NICHOLAS, ROBERT A.

    This study aimed to develop principles for a model program of continuing education for the professions at the University of Wyoming. The author reviewed the literature on the growth of the professions and on continuing education in the professions generally, with special reference to architecture, dentistry, law, medicine, and pharmacy. From this…

  18. Fostering Enterprise Architecture Education and Training with the Enterprise Architecture Competence Framework

    ERIC Educational Resources Information Center

    Tambouris, Efthimios; Zotou, Maria; Kalampokis, Evangelos; Tarabanis, Konstantinos

    2012-01-01

    Enterprise architecture (EA) implementation refers to a set of activities ultimately aiming to align business objectives with the information technology infrastructure of an organization. EA implementation is a multidisciplinary, complicated, and ongoing process, and hence calls for adequate education and training programs that will build highly skilled…

  19. Validity of flowmeter data in heterogeneous alluvial aquifers

    NASA Astrophysics Data System (ADS)

    Bianchi, Marco

    2017-04-01

    Numerical simulations are performed to evaluate the impact of medium-scale sedimentary architecture and small-scale heterogeneity on the validity of the borehole flowmeter test, a widely used method for measuring hydraulic conductivity (K) at the scale required for detailed groundwater flow and solute transport simulations. Reference data from synthetic K fields, representing the range of structures and small-scale heterogeneity typically observed in alluvial systems, are compared with estimated values from numerical simulations of flowmeter tests. Systematic errors inherent in the flowmeter K estimates are significant when the reference K field structure deviates from the hypothetical, perfectly stratified conceptual model underlying the standard interpretation of flowmeter tests. Because of these errors, the true variability of the K field is underestimated, and the distributions of the reference K data and of the log-transformed spatial increments are also misconstrued. The numerical analysis presented here shows that the validity of flowmeter-based K data depends on measurable parameters defining the architecture of the hydrofacies, the conductivity contrasts between the hydrofacies, and the sub-facies-scale K variability. A preliminary geological characterization is therefore essential for choosing the optimal approach for accurate K field characterization.
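    The perfectly stratified interpretation that the simulations test can be sketched as follows: each layer's K is the depth-averaged K scaled by that layer's share of the measured flow per unit thickness. The numbers are hypothetical and this is one common form of the interpretation, not the paper's code:

```python
# Standard stratified-aquifer flowmeter interpretation (illustrative values).
# K_i / K_avg = (dQ_i / dz_i) / (Q_total / B), where dQ_i is the incremental
# discharge measured across layer i, dz_i its thickness, and B the total
# screened thickness.

def flowmeter_k(delta_q, delta_z, k_avg):
    """Layer conductivities from incremental flows between measurement depths."""
    q_total = sum(delta_q)
    b_total = sum(delta_z)
    return [k_avg * (dq / dz) / (q_total / b_total)
            for dq, dz in zip(delta_q, delta_z)]

delta_q = [4.0, 1.0, 5.0]      # hypothetical incremental discharges (m^3/d)
delta_z = [1.0, 1.0, 1.0]      # layer thicknesses (m)
k_est = flowmeter_k(delta_q, delta_z, k_avg=10.0)
print(k_est)  # layers carrying more inflow are assigned proportionally higher K
```

By construction the thickness-weighted mean of the estimates equals k_avg; the paper's point is that when the aquifer is not actually stratified, these layer values misrepresent the true K field.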

  20. Protocol Architecture Model Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high-priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. This report applies the methodology to three space Internet-based communications scenarios for future missions. CNS has conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. The scenarios are: Scenario 1: unicast communications between a Low-Earth-Orbit (LEO) spacecraft in-space Internet node and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer; Scenario 2: unicast communications between a Low-Earth-Orbit (LEO) International Space Station and a ground terminal Internet node via a TDRS transfer; Scenario 3: multicast communications ("multicasting"), one spacecraft to N ground receivers, and N ground transmitters to one ground receiver via a spacecraft.

  1. Trade-Off Analysis Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high-priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report that applied the methodology to three space Internet-based communications scenarios for future missions, conceptualizing, designing, and developing space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario involving unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis of the selected scenario, examining the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model of the relevant performance parameters.

  2. Space station needs, attributes and architectural options: Architectural options and selection

    NASA Technical Reports Server (NTRS)

    Nelson, W. G.

    1983-01-01

    The approach, study results, and recommendations for defining and selecting space station architectural options are described. Space station system architecture is defined by the arrangement of elements (manned and unmanned on-orbit facilities, shuttle vehicles, orbital transfer vehicles, etc.), the number of these elements, their location (orbital inclination and altitude), and their functional performance capability (power, volume, crew, etc.). Architectural options are evaluated based on the degree of mission capture versus cost and required funding rate, where mission capture refers to the number of missions accommodated by the particular architecture.

  3. Essential issues in multiprocessor systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.D.; Peir, J.K.

    1985-06-01

    During the past several years, a great number of proposals have been made with the objective to increase supercomputer performance by an order of magnitude on the basis of a utilization of new computer architectures. The present paper is concerned with a suitable classification scheme for comparing these architectures. It is pointed out that there are basically four schools of thought as to the most important factor for an enhancement of computer performance. According to one school, the development of faster circuits will make it possible to retain present architectures, except, possibly, for a mechanism providing synchronization of parallel processes. A second school assigns priority to the optimization and vectorization of compilers, which will detect parallelism and help users to write better parallel programs. A third school believes in the predominant importance of new parallel algorithms, while the fourth school supports new models of computation. The merits of the four approaches are critically evaluated. 50 references.

  4. Positioning navigation and timing service applications in cyber physical systems

    NASA Astrophysics Data System (ADS)

    Qu, Yi; Wu, Xiaojing; Zeng, Lingchuan

    2017-10-01

    The positioning, navigation, and timing (PNT) architecture is discussed in detail: its history, evolution, current status, and future plans are presented; its main technologies are listed; the advantages and limitations of most technologies are compared; novel approaches are introduced; and future capabilities are sketched. The concept of a cyber-physical system (CPS) is described and its primary features are interpreted, followed by an illustration of the three-layer CPS architecture. CPS requirements on PNT services are then analyzed, including requirements on position and time references, temporal-spatial error monitoring, and dynamic, real-time, autonomous, secure, and standardized services. Finally, the challenges faced by PNT applications in CPS are summarized. These conclusions are expected to facilitate PNT applications in CPS and, furthermore, to provide references for the design and implementation of both architectures.

  5. A method to evaluate utility for architectural comparisons for a campaign to explore the surface of Mars

    NASA Astrophysics Data System (ADS)

    Ward, Eric D.; Webb, Ryan R.; deWeck, Olivier L.

    2016-11-01

    There is a general consensus that Mars is the next high priority destination for human space exploration. There has been no lack of analysis and recommendations for human missions to Mars, including, for example, the NASA Design Reference Architectures and the Mars Direct proposal. These studies and others usually employ the traditional approach of selecting a baseline mission architecture and running individual trade studies. However, this can cause blind spots, as not all combinations are explored. An alternative approach is to holistically analyze the entire architectural trade-space such that all of the possible system interactions are identified and measured. In such a framework, an optimal design is sought by minimizing cost for maximal value. While cost is relatively easy to model for manned spaceflight, value is more difficult to define. In our efforts to develop a surface base architecture for the MIT Mars 2040 project, we explored several methods for quantifying value, including technology development benefits, challenge, and various metrics for measuring scientific return. We developed a science multi-score method that combines astrobiology and geologic research goals, which is weighted by the crew-member hours that can be used for scientific research rather than other activities.
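    A "science multi-score" of the kind described can be sketched as a weighted sum of discipline scores scaled by the fraction of crew time available for science. The weights, scores, and functional form here are hypothetical illustrations, not the MIT Mars 2040 formulation:

```python
# Hedged sketch of a science multi-score: a weighted combination of
# astrobiology and geology research-goal scores, scaled by the share of
# crew-member hours usable for science. All numbers are hypothetical.

def science_multi_score(astro, geo, w_astro, w_geo, science_hours, total_hours):
    base = w_astro * astro + w_geo * geo          # weighted discipline value
    return base * (science_hours / total_hours)   # penalize non-science time

score = science_multi_score(astro=8.0, geo=6.0, w_astro=0.6, w_geo=0.4,
                            science_hours=30.0, total_hours=60.0)
print(round(score, 2))  # 0.6*8 + 0.4*6 = 7.2, halved by 50% crew-time availability
```

Such a metric lets architectures be compared on a single value axis against cost, which is the trade-space framing the abstract describes.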

  6. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding.

    PubMed

    Testolin, Alberto; De Filippo De Grazia, Michele; Zorzi, Marco

    2017-01-01

    The recent "deep learning revolution" in artificial neural networks has had a strong impact and widespread deployment in engineering applications, but the use of deep learning for neurocomputational modeling has so far been limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, because it emphasizes the role of generative learning as opposed to discriminative (supervised) learning. As a case study, we present a series of simulations investigating the emergence of neural coding of visual space for sensorimotor transformations. We compare different network architectures commonly used as building blocks for unsupervised deep learning by systematically testing the type of receptive fields and gain modulation developed by the hidden neurons. In particular, we compare Restricted Boltzmann Machines (RBMs), which are stochastic, generative networks with bidirectional connections trained using contrastive divergence, with autoencoders, which are deterministic networks trained using error backpropagation. For both learning architectures we also explore the role of sparse coding, which has been identified as a fundamental principle of neural computation. The unsupervised models are then compared with supervised, feed-forward networks that learn an explicit mapping between different spatial reference frames. Our simulations show that both architectural and learning constraints strongly influenced the emergent coding of visual space in terms of the distribution of tuning functions at the level of single neurons. Unsupervised models, and particularly RBMs, were found to adhere more closely to neurophysiological data from single-cell recordings in the primate parietal cortex. These results provide new insights into how basic properties of artificial neural networks might be relevant for modeling neural information processing in biological systems.
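    The contrastive-divergence training mentioned for RBMs can be sketched with a single CD-1 weight update; mean-field activations replace stochastic sampling here so the example stays deterministic, and the one-hidden-unit network is an illustrative toy, not the study's models:

```python
# Minimal, deterministic sketch of a CD-1 weight update for a toy RBM with
# one hidden unit (mean-field activations instead of sampling; illustrative
# only). Weights move to raise the probability of the training vector.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cd1_update(w, v0, lr=0.1):
    """One CD-1 step; w holds one weight per visible unit."""
    h0 = sigmoid(sum(wi * vi for wi, vi in zip(w, v0)))   # positive phase
    v1 = [sigmoid(wi * h0) for wi in w]                   # reconstruction
    h1 = sigmoid(sum(wi * vi for wi, vi in zip(w, v1)))   # negative phase
    # dW ~ <h v>_data - <h v>_reconstruction
    return [wi + lr * (h0 * v0i - h1 * v1i)
            for wi, v0i, v1i in zip(w, v0, v1)]

w = [0.0, 0.0, 0.0]
for _ in range(100):
    w = cd1_update(w, [1.0, 1.0, 0.0])   # repeatedly present one pattern
print([round(wi, 2) for wi in w])        # weights grow toward the active inputs
```

Autoencoders, by contrast, would update the same kind of weights by backpropagating the reconstruction error, which is the architectural comparison the study carries out at scale.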

  8. Automated Discovery of Machine-Specific Code Improvements

    DTIC Science & Technology

    1984-12-01

    operation of the source language. Additional analysis may reveal special features of the target architecture that may be exploited to generate efficient code. Such analysis is optional...incorporate knowledge of the source language, but do not refer to features of the target machine. These early phases are sometimes referred to as the

  9. EDI system definition for a European medical device vigilance system.

    PubMed

    Doukidis, G; Pallikarakis, N; Pangalos, G; Vassilacopoulos, G; Pramataris, K

    1996-01-01

    EDI is expected to be the dominant form of business communication between organizations moving to the Electronic Commerce era of 2000. The healthcare sector is already using EDI in the hospital supply function as well as in the clinical area and the reimbursement process. In this paper, we examine the use of EDI in the healthcare administration sector and more specifically its application to the Medical Device Vigilance System. Firstly, the potential of this approach is examined, followed by the definition of the EDI System Reference Model and the specification of the required system architecture. Each of the architecture's components is then explained in more detail, followed by the most important implementation options relating to them.

  10. Ssip-a processor interconnection simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Navaux, P.; Weber, R.; Prezzi, J.

    1982-01-01

    Recent growing interest in multiple processor architectures has given rise to the study of processor-memory interconnections for the determination of better architectures. This paper concerns the development of the SSIP-sistema simulador de interconexao de processadores (processor interconnection simulating system), which allows the evaluation of different interconnection structures, comparing their performance in order to provide parameters that help the designer define an architecture. A wide spectrum of systems may be evaluated, and their behaviour observed, thanks to the features incorporated into the simulator program. The system modelling and the simulator program implementation are described. Some results that can be obtained are shown, along with a discussion of their usefulness. 12 references.

  11. What Are the Buildings Saying? A Study of First-Year Undergraduate Students' Attributions about College Campus Architecture.

    ERIC Educational Resources Information Center

    Bennett, Michael A.; Benton, Stephen L.

    2001-01-01

    Examines the attributions college students (N=301) make toward pictures of college campus buildings. Results reveal that students attributed greater likelihood of individual success to pictures depicting modern architecture than they did to those depicting traditional architecture. (Contains 28 references and 3 tables.) (Author/GCP)

  12. Implications of Multi-Core Architectures on the Development of Multiple Independent Levels of Security (MILS) Compliant Systems

    DTIC Science & Technology

    2012-10-01

    DATES COVERED (From - To): MAR 2010 – APR 2012. TITLE AND SUBTITLE: Implications of Multi-Core Architectures on the Development of... Framework for Multicore Information Flow Analysis... A Hypothetical Reference Architecture... Figure 2: Pentium II Block Diagram

  13. 77 FR 187 - Federal Acquisition Regulation; Transition to the System for Award Management (SAM)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-03

    ... architecture. Deletes reference to ``business partner network'' at 4.1100, Scope, which is no longer necessary...) architecture has begun. This effort will transition the Central Contractor Registration (CCR) database, the...) to the new architecture. This case provides the first step in updating the FAR for these changes, and...

  14. Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual

    DTIC Science & Technology

    1988-12-01

    The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIVD will reduce the amount of time spent on tedious and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.

  15. Counter Unmanned Aerial System Decision-Aid Logic Process (C-UAS DALP)

    DTIC Science & Technology

    decision-aid or logic process that bridges the middle elements of the kill... of use, location, general logic process, and reference mission. This is the framework for the IDEF0 functional architecture diagrams, decision-aid diagrams, logic process, and modeling and simulation. ...chain between detection to countermeasure response. This capstone project creates the logic for a decision process that transitions from the

  16. Requirements for plug and play information infrastructure frameworks and architectures to enable virtual enterprises

    NASA Astrophysics Data System (ADS)

    Bolton, Richard W.; Dewey, Allen; Horstmann, Paul W.; Laurentiev, John

    1997-01-01

    This paper examines the role virtual enterprises will have in supporting future business engagements and resulting technology requirements. Two representative end-user scenarios are proposed that define the requirements for 'plug-and-play' information infrastructure frameworks and architectures necessary to enable 'virtual enterprises' in US manufacturing industries. The scenarios provide a high-level 'needs analysis' for identifying key technologies, defining a reference architecture, and developing compliant reference implementations. Virtual enterprises are short-term consortia or alliances of companies formed to address fast-changing opportunities. Members of a virtual enterprise carry out their tasks as if they all worked for a single organization under 'one roof', using 'plug-and-play' information infrastructure frameworks and architectures to access and manage all information needed to support the product cycle. 'Plug-and-play' information infrastructure frameworks and architectures are required to enhance collaboration between companies working together on different aspects of a manufacturing process. This new form of collaborative computing will decrease cycle-time and increase responsiveness to change.

  17. Software architecture for a distributed real-time system in Ada, with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Olsen, Douglas R.; Messiora, Steve; Leake, Stephen

    1992-01-01

    The architecture structure and software design methodology presented here are described in the context of a telerobotic application in Ada, specifically the Engineering Test Bed (ETB), which was developed to support the Flight Telerobotic Servicer (FTS) Program at GSFC. However, the nature of the architecture is such that it has applications to any multiprocessor distributed real-time system. The ETB architecture, which is a derivation of the NASA/NBS Standard Reference Model (NASREM), defines a hierarchy for representing a telerobot system. Within this hierarchy, a module is a logical entity consisting of the software associated with a set of related hardware components in the robot system. A module is comprised of submodules, which are cyclically executing processes that each perform a specific set of functions. The submodules in a module can run on separate processors. The submodules in the system communicate via command/status (C/S) interface channels, which are used to send commands down and relay status back up the system hierarchy. Submodules also communicate via setpoint data links, which are used to transfer control data from one submodule to another. A submodule invokes submodule algorithms (SMA's) to perform algorithmic operations. Data that describe or model a physical component of the system are stored as objects in the World Model (WM). The WM is a system-wide distributed database that is accessible to submodules in all modules of the system for creating, reading, and writing objects.
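    The module/submodule pattern described above, cyclic processes linked by command/status channels plus a shared World Model, can be caricatured in a few lines of Python. All names and values here are hypothetical and not taken from the ETB implementation (which is in Ada).

```python
from queue import Queue

world_model = {"joint_angle": 0.0}  # system-wide store of objects describing hardware

class Submodule:
    def __init__(self, name):
        self.name = name
        self.cmd_in = Queue()      # commands flow down the hierarchy
        self.status_out = Queue()  # status is relayed back up

    def cycle(self):
        """One cyclic execution: consume a command, run an algorithm, report status."""
        if not self.cmd_in.empty():
            cmd = self.cmd_in.get()
            world_model["joint_angle"] = cmd["setpoint"]  # write to the World Model
            self.status_out.put({"from": self.name, "done": True})

servo = Submodule("servo")
servo.cmd_in.put({"setpoint": 1.57})
servo.cycle()
print(servo.status_out.get(), world_model["joint_angle"])
```

    In the real architecture each submodule would run on its own processor and the World Model would be a distributed database; the queues here stand in for the C/S interface channels.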

  18. Development of an unmanned maritime system reference architecture

    NASA Astrophysics Data System (ADS)

    Duarte, Christiane N.; Cramer, Megan A.; Stack, Jason R.

    2014-06-01

    The concept of operations (CONOPS) for unmanned maritime systems (UMS) continues to envision systems that are multi-mission, re-configurable and capable of acceptable performance over a wide range of environmental and contextual variability. Key enablers for these concepts of operation are an autonomy module which can execute different mission directives and a mission payload consisting of re-configurable sensor or effector suites. This level of modularity in mission payloads enables affordability, flexibility (i.e., more capability with future platforms) and scalability (i.e., force multiplication). The modularity in autonomy facilitates rapid technology integration, prototyping, testing and leveraging of state-of-the-art advances in autonomy research. Capability drivers imply a requirement to maintain an open architecture design for both research and acquisition programs. As the maritime platforms become more stable in their design (e.g. unmanned surface vehicles, unmanned underwater vehicles) future developments are able to focus on more capable sensors and more robust autonomy algorithms. To respond to Fleet needs, given an evolving threat, programs will want to interchange the latest sensor or a new and improved algorithm in a cost effective and efficient manner. In order to make this possible, the programs need a reference architecture that will define for technology providers where their piece fits and how to successfully integrate. With these concerns in mind, the US Navy established the Unmanned Maritime Systems Reference Architecture (UMS-RA) Working Group in August 2011. This group consists of Department of Defense and industry participants working the problem of defining a reference architecture for autonomous operations of maritime systems. This paper summarizes its efforts to date.

  19. Reference architecture and interoperability model for data mining and fusion in scientific cross-domain infrastructures

    NASA Astrophysics Data System (ADS)

    Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois

    2017-04-01

    Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blueprints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so-called service-oriented architecture (SOA) 2.0 paradigm, which combines the intelligence and proactiveness of event-driven with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both static and real-time data in order to find correlations among disparate information that are not at first intuitively obvious: Analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: Instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level by a flexible dependency on loosely coupled services.
The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts, for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: All information necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service endpoints) from which the information can be retrieved. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants is managed on the basis of standardised dictionaries, repositories, and registries. This approach also enables the development of Systems-of-Systems (SoS), which allow the collaboration of autonomous, large scale concurrent, and distributed systems, yet cooperatively interacting as a collective in a common environment.
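    The broker-based integration described above, sharing data via registered services rather than direct data access, might be sketched roughly as follows. The vocabulary term, endpoint, and returned values are invented purely for illustration.

```python
class Broker:
    """Toy registry mapping vocabulary terms to service endpoints (loose coupling)."""
    def __init__(self):
        self.registry = {}

    def register(self, term, service):
        self.registry[term] = service          # service endpoint keyed by vocabulary term

    def retrieve(self, term, **query):
        return self.registry[term](**query)    # callers never touch the data store directly

broker = Broker()
# A hypothetical provider registers an endpoint under a controlled-vocabulary term.
broker.register("borehole-measurement", lambda depth: {"depth": depth, "value": 42.0})
result = broker.retrieve("borehole-measurement", depth=120)
print(result)
```

    Swapping the provider behind a term changes nothing for consumers, which is the flexible dependency on loosely coupled services the abstract describes.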

  20. Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System

    DTIC Science & Technology

    2014-06-01

    in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The Reference Architecture for...ontology, terrorism, inferential reasoning, architecture I. INTRODUCTION A. Background Whether by nature or design, the personas of terrorists are

  1. An Architecture Combining IMS-LD and Web Services for Flexible Data-Transfer in CSCL

    ERIC Educational Resources Information Center

    Magnisalis, Ioannis; Demetriadis, Stavros

    2017-01-01

    This article presents evaluation data regarding the MAPIS3 architecture which is proposed as a solution for the data-transfer among various tools to promote flexible collaborative learning designs. We describe the problem that this architecture deals with as "tool orchestration" in collaborative learning settings. This term refers to a…

  2. Creating a New Architecture for the Learning College

    ERIC Educational Resources Information Center

    O'Banion, Terry

    2007-01-01

    The publication of "A Nation at Risk" in 1983 triggered a series of major reform efforts in education that are still evolving. As part of the reform efforts, leaders began to refer to a Learning Revolution that would "place learning first by overhauling the traditional architecture of education." The old architecture--time-bound, place-bound,…

  3. Coupling root architecture and pore network modeling - an attempt towards better understanding root-soil interactions

    NASA Astrophysics Data System (ADS)

    Leitner, Daniel; Bodner, Gernot; Raoof, Amir

    2013-04-01

    Understanding root-soil interactions is of high importance for environmental and agricultural management. Root uptake is an essential component in water and solute transport modeling. The amount of groundwater recharge and solute leaching significantly depends on the demand based plant extraction via its root system. Plant uptake however not only responds to the potential demand, but in most situations is limited by supply from the soil. The ability of the plant to access water and solutes in the soil is governed mainly by root distribution. Particularly under conditions of heterogeneous distribution of water and solutes in the soil, it is essential to capture the interaction between soil and roots. Root architecture models allow studying plant uptake from soil by describing growth and branching of root axes in the soil. Currently, root architecture models are able to respond dynamically to water and nutrient distribution in the soil by directed growth (tropism), modified branching and enhanced exudation. The porous soil medium as rooting environment in these models is generally described by classical macroscopic water retention and sorption models, averaged over the pore scale. In our opinion this simplified description of the root growth medium implies several shortcomings for better understanding root-soil interactions: (i) It is well known that roots grow preferentially in preexisting pores, particularly in more rigid/dry soil. Thus the pore network contributes to the architectural form of the root system; (ii) roots themselves can influence the pore network by creating preferential flow paths (biopores) which are an essential element of structural porosity with strong impact on transport processes; (iii) plant uptake depends on both the spatial location of water/solutes in the pore network and the spatial distribution of roots.
We therefore consider that for advancing our understanding of root-soil interactions, we need not only to extend our root models, but also to improve the description of the rooting environment. Until now there have been no attempts to couple root architecture and pore network models. In our work we present a first attempt to join both types of models using the root architecture model of Leitner et al. (2010) and a pore network model presented by Raoof et al. (2010). The two main objectives of coupling both models are: (i) Representing the effect of root induced biopores on flow and transport processes: For this purpose a fixed root architecture created by the root model is superimposed as a secondary root-induced pore network on the primary soil network, thus influencing the final pore topology in the network generation. (ii) Representing the influence of pre-existing pores on root branching: Using a given network of (rigid) pores, the root architecture model allocates its root axes into these preexisting pores as preferential growth paths, which thereby shape the final root architecture. The main objective of our study is to reveal the potential of using a pore scale description of the plant growth medium for an improved representation of interaction processes at the interface of root and soil. References: Raoof, A., Hassanizadeh, S.M. 2010. A New Method for Generating Pore-Network Models. Transp. Porous Med. 81, 391-407. Leitner, D., Klepsch, S., Bodner, G., Schnepf, S. 2010. A dynamic root system growth model based on L-Systems. Tropisms and coupling to nutrient uptake from soil. Plant Soil 332, 177-192.
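    The two coupling objectives stated above can be sketched schematically with a toy adjacency-set pore network; the actual models of Leitner et al. and Raoof et al. are far richer, and everything here (node labels, growth rule) is invented for illustration.

```python
# Soil pore network: node -> set of connected pore nodes.
pores = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}

def add_biopore(network, root_path):
    """Objective (i): superimpose root-induced channels onto the pore topology."""
    for a, b in zip(root_path, root_path[1:]):
        network.setdefault(a, set()).add(b)
        network.setdefault(b, set()).add(a)

def grow_root(network, start, steps):
    """Objective (ii): a root axis preferentially follows preexisting pore connections."""
    path, node = [start], start
    for _ in range(steps):
        nxt = min(network.get(node, set()) - set(path), default=None)
        if nxt is None:
            break  # no unvisited connected pore to grow into
        path.append(nxt)
        node = nxt
    return path

add_biopore(pores, [0, 4, 5])    # a root creates a new preferential flow path
print(sorted(pores[0]))          # biopore edge 0-4 now augments the soil network
print(grow_root(pores, 2, 3))    # a new root axis threads through existing pores
```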

  4. Achieving a Launch on Demand Capability

    NASA Astrophysics Data System (ADS)

    Greenberg, Joel S.

    2002-01-01

    The ability to place payloads [satellites] into orbit as and when required, often referred to as launch on demand, continues to be an elusive and largely unfulfilled goal. But what is the value of achieving launch on demand [LOD], and what metrics are appropriate? Achievement of a desired level of LOD capability must consider transportation system throughput, alternative transportation systems that comprise the transportation architecture, transportation demand, reliability and failure recovery characteristics of the alternatives, schedule guarantees, launch delays, payload integration schedules, procurement policies, and other factors. Measures of LOD capability should relate to the objective of the transportation architecture: the placement of payloads into orbit as and when required. Launch on demand capability must be defined in probabilistic terms such as the probability of not incurring a delay in excess of T when it is determined that it is necessary to place a payload into orbit. Three specific aspects of launch on demand are considered: [1] the ability to recover from adversity [i.e., a launch failure] and to keep up with the steady-state demand for placing satellites into orbit [this has been referred to as operability and resiliency], [2] the ability to respond to the requirement to launch a satellite when the need arises unexpectedly, either because of an unexpected [random] on-orbit satellite failure that requires replacement or because of the sudden recognition of an unanticipated requirement, and [3] the ability to recover from adversity [i.e., a launch failure] during the placement of a constellation into orbit. The objective of this paper is to outline a formal approach for analyzing alternative transportation architectures in terms of their ability to provide a LOD capability.
The economic aspect of LOD is developed by establishing a relationship between scheduling and the elimination of on-orbit spares while achieving the desired level of on-orbit availability. Results of an analysis are presented. The implications of launch on demand are addressed for each of the above three situations, and related architecture performance metrics and computer simulation models are described that may be used to evaluate the implications of architecture and policy changes in terms of LOD requirements. The models and metrics are aimed at providing answers to such questions as: How well does a specified space transportation architecture respond to satellite launch demand and changes thereto? How well does an apparently normally functioning architecture respond to unanticipated needs? What is the effect of a modification to the architecture on its ability to respond to satellite launch demand, including responding to unanticipated needs? What is the cost of the architecture [including facilities, operations, inventory, and satellites]? What is the sensitivity of overall architecture effectiveness and cost to various transportation system delays? What is the effect of adding [or eliminating] a launch vehicle or family of vehicles to [from] the architecture on its effectiveness and cost? What is the value of improving launch vehicle and satellite compatibility, and what are the effects on probability of delay statistics and cost of designing for multi-launch vehicle compatibility?
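    The probabilistic LOD metric defined above, the probability of incurring a delay in excess of T, lends itself to Monte Carlo estimation. The sketch below assumes invented delay distributions and an invented launch-failure rate purely for illustration, not values from the paper.

```python
import random

random.seed(1)

def simulate_delay():
    """One hypothetical launch attempt: integration time plus a possible standdown."""
    delay = random.uniform(10, 30)        # days of payload integration (assumed)
    if random.random() < 0.05:            # assumed probability of a launch failure
        delay += random.uniform(90, 180)  # failure-recovery standdown (assumed)
    return delay

T = 45.0
trials = 100_000
p_exceed = sum(simulate_delay() > T for _ in range(trials)) / trials
print(f"P(delay > {T} days) ~ {p_exceed:.3f}")
```

    With these assumed numbers, exceeding T is driven almost entirely by the failure-recovery branch, which is exactly the kind of sensitivity the paper's architecture-level questions probe.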

  5. Hierarchical Chunking of Sequential Memory on Neuromorphic Architecture with Reduced Synaptic Plasticity

    PubMed Central

    Li, Guoqi; Deng, Lei; Wang, Dong; Wang, Wei; Zeng, Fei; Zhang, Ziyang; Li, Huanglong; Song, Sen; Pei, Jing; Shi, Luping

    2016-01-01

    Chunking refers to a phenomenon whereby individuals group items together when performing a memory task to improve the performance of sequential memory. In this work, we build a bio-plausible hierarchical chunking of sequential memory (HCSM) model to explain why such improvement happens. We address this issue by linking hierarchical chunking with synaptic plasticity and neuromorphic engineering. We uncover that a chunking mechanism reduces the requirements of synaptic plasticity since it allows applying synapses with narrow dynamic range and low precision to perform a memory task. We validate a hardware version of the model through simulation, based on measured memristor behavior with narrow dynamic range in neuromorphic circuits, which reveals how chunking works and what role it plays in encoding sequential memory. Our work deepens the understanding of sequential memory and enables its incorporation into the investigation of brain-inspired computing on neuromorphic architectures. PMID:28066223
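    A toy illustration of the general idea (not the HCSM model itself): grouping a long sequence into chunks bounds how many items any one level must hold at a time, which is why storage elements with narrow dynamic range can still support faithful recall.

```python
def chunk(seq, size):
    """Group a flat sequence into fixed-size chunks (the higher level of the hierarchy)."""
    return [tuple(seq[i:i + size]) for i in range(0, len(seq), size)]

def recall(chunks):
    """Unpack the chunks back into the original flat sequence."""
    return [item for c in chunks for item in c]

digits = [3, 1, 4, 1, 5, 9, 2, 6, 5]
chunks = chunk(digits, 3)
print(chunks)                     # three 3-item chunks instead of one 9-item span
print(recall(chunks) == digits)   # lossless recall through the hierarchy
```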

  6. Development of a High Angular Resolution Diffusion Imaging Human Brain Template

    PubMed Central

    Varentsova, Anna; Zhang, Shengwei; Arfanakis, Konstantinos

    2014-01-01

    Brain diffusion templates contain rich information about the microstructure of the brain, and are used as references in spatial normalization or in the development of brain atlases. The accuracy of diffusion templates constructed based on the diffusion tensor (DT) model is limited in regions with complex neuronal micro-architecture. High angular resolution diffusion imaging (HARDI) overcomes limitations of the DT model and is capable of resolving intravoxel heterogeneity. However, when HARDI is combined with multiple-shot sequences to minimize image artifacts, the scan time becomes impractically long for human brain imaging. In this work, an artifact-free HARDI template of the human brain was developed from low angular resolution multiple-shot diffusion data. The resulting HARDI template was produced in ICBM-152 space based on Turboprop diffusion data, was shown to resolve complex neuronal micro-architecture in regions with intravoxel heterogeneity, and contained fiber orientation information consistent with known human brain anatomy. PMID:24440528

  7. Integration of implant planning workflows into the PACS infrastructure

    NASA Astrophysics Data System (ADS)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific selection and positioning of implants from a repository according to patient image data. The information models are based on clinical workflows from ENT, cardiac, and orthopedic surgery as well as technical requirements derived from different use cases and systems. We show an exemplary implementation of the model for application in ENT surgery: the selection and positioning of an ossicular implant in the middle ear. An implant repository is stored in the PACS. It makes use of an experimental implementation of the Surface Mesh Module that is currently being developed as extension to the DICOM standard.

  8. The relationship between reference canopy conductance and simplified hydraulic architecture

    NASA Astrophysics Data System (ADS)

    Novick, Kimberly; Oren, Ram; Stoy, Paul; Juang, Jehn-Yih; Siqueira, Mario; Katul, Gabriel

    2009-06-01

    Terrestrial ecosystems are dominated by vascular plants that form a mosaic of hydraulic conduits to water movement from the soil to the atmosphere. Together with canopy leaf area, canopy stomatal conductance regulates plant water use and thereby photosynthesis and growth. Although stomatal conductance is coordinated with plant hydraulic conductance, governing relationships across species have not yet been formulated at a practical level that can be employed in large-scale models. Here, combinations of published conductance measurements obtained with several methodologies across boreal to tropical climates were used to explore relationships between canopy conductance rates and hydraulic constraints. A parsimonious hydraulic model requiring sapwood-to-leaf area ratio and canopy height generated acceptable agreement with measurements across a range of biomes (r2=0.75). The results suggest that, at long time scales, the functional convergence among ecosystems in the relationship between water-use and hydraulic architecture eclipses inter-specific variation in physiology and anatomy of the transport system. Prognostic applicability of this model requires independent knowledge of sapwood-to-leaf area. In this study, we did not find a strong relationship between sapwood-to-leaf area and physical or climatic variables that are readily determinable at coarse scales, though the results suggest that climate may have a mediating influence on the relationship between sapwood-to-leaf area and height. Within temperate forests, canopy height alone explained a large amount of the variance in reference canopy conductance (r2=0.68) and this relationship may be more immediately applicable in terrestrial ecosystem models.

  9. Services, architectures, and protocols for space data systems

    NASA Technical Reports Server (NTRS)

    Helgert, Hermann J.

    1991-01-01

    The author presents a comprehensive discussion of three major aspects of the work of the Consultative Committee for Space Data Systems (CCSDS), a worldwide cooperative effort of national space agencies. The author examines the CCSDS space data communications network concept on which the data communications facilities of future advanced orbiting systems will be based. He derives the specifications of an open communications architecture as a reference model for the development of services and protocols that support the transfer of information over space data communications networks. Detailed specifications of the communication services and information transfer protocols that have reached a high degree of maturity and stability are offered. The author also includes a complete list of currently available CCSDS standards and supporting documentation.

  10. Nonlinear Dynamic Inversion Baseline Control Law: Architecture and Performance Predictions

    NASA Technical Reports Server (NTRS)

    Miller, Christopher J.

    2011-01-01

    A model reference dynamic inversion control law has been developed to provide a baseline control law for research into adaptive elements and other advanced flight control law components. This controller has been implemented and tested in a hardware-in-the-loop simulation; the simulation results show excellent handling qualities throughout the limited flight envelope. A simple angular momentum formulation was chosen because it can be included in the stability proofs for many basic adaptive theories, such as model reference adaptive control. Many design choices and implementation details reflect the requirements placed on the system by the nonlinear flight environment and the desire to keep the system as basic as possible to simplify the addition of the adaptive elements. Those design choices are explained, along with their predicted impact on the handling qualities.
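    The model-reference dynamic inversion idea summarized above can be sketched on a scalar toy plant: invert the known nonlinearity so that the closed loop follows simple reference dynamics. The plant, gain, and reference model below are invented for illustration and are not the flight controller described in the paper.

```python
def f(x):
    """Nonlinear plant drift (assumed known to the controller)."""
    return -0.5 * x**3

def g(x):
    """Control effectiveness (assumed known and invertible)."""
    return 2.0

def ndi_control(x, x_ref, k=4.0):
    v = k * (x_ref - x)        # desired dynamics from a simple reference model
    return (v - f(x)) / g(x)   # inversion cancels the known nonlinearity

# Simulate the closed loop: with exact inversion it behaves like xdot = k*(x_ref - x).
x, x_ref, dt = 2.0, 0.0, 0.01
for _ in range(1000):
    u = ndi_control(x, x_ref)
    x += dt * (f(x) + g(x) * u)
print(round(x, 4))  # the state has been driven to the reference
```

    Because the inversion leaves simple linear error dynamics, this baseline is a natural substrate for the adaptive elements (e.g., model reference adaptive control) the abstract mentions.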

  11. CisLunar Habitat Internal Architecture Design Criteria

    NASA Technical Reports Server (NTRS)

    Jones, R.; Kennedy, K.; Howard, R.; Whitmore, M.; Martin, C.; Garate, J.

    2017-01-01

    BACKGROUND: In preparation for human exploration to Mars, there is a need to define the development and test program that will validate deep space operations and systems. In that context, a Proving Grounds CisLunar habitat spacecraft is being defined as the next step towards this goal. This spacecraft will operate differently from the ISS or other spacecraft in human history. The performance envelope of this spacecraft (mass, volume, power, specifications, etc.) is being defined by the Future Capabilities Study Team. This team has recognized the need for a human-centered approach for the internal architecture of this spacecraft and has commissioned a CisLunar Phase-1 Habitat Internal Architecture Study Team to develop a NASA reference configuration, providing the Agency with a "smart buyer" approach for future acquisition. THE CISLUNAR HABITAT INTERNAL ARCHITECTURE STUDY: Overall, the CisLunar Habitat Internal Architecture study will address the most significant questions and risks in the current CisLunar architecture, habitation, and operations concept development. This effort is achieved through definition of design criteria, evaluation criteria and process, design of the CisLunar Habitat Phase-1 internal architecture, and the development and fabrication of internal architecture concepts combined with rigorous and methodical Human-in-the-Loop (HITL) evaluations and testing of the conceptual innovations in a controlled test environment. The vision of the CisLunar Habitat Internal Architecture Study is to design, build, and test a CisLunar Phase-1 Habitat Internal Architecture that will be used for habitation (e.g. habitability and human factors) evaluations. The evaluations will mature CisLunar habitat evaluation tools, guidelines, and standards, and will interface with other projects such as the Advanced Exploration Systems (AES) Program integrated Power, Avionics, Software (iPAS), and Logistics for integrated human-in-the-loop testing. 
The mission of the CisLunar Habitat Internal Architecture Study is to serve as a forcing function to establish a common understanding of CisLunar Phase-1 Habitation Internal Architecture design criteria, processes, and tools. The scope of the study is to design, develop, demonstrate, and evaluate a Phase-1 CisLunar Habitat common module internal architecture based on design criteria agreed to by NASA, the International Partners, and Commercial Exploration teams. This task is to define the CisLunar Phase-1 Internal Architecture Government Reference Design, assist NASA in becoming a "smart buyer" for Phase-1 Habitat concepts, and ultimately to derive standards and requirements from the internal architecture design process. The first step was to define Habitat Internal Architecture Design Criteria and create a structured philosophy to be used by design teams as a filter for identifying the critical considerations in organizing and utilizing interior spaces. With design criteria in place, the team will develop a series of iterative internal architecture concept designs, which will be assessed against defined evaluation criteria and an evaluation process. These assessments will successively drive and refine the design, leading to the combination and down-selection of design concepts. A single refined reference design configuration will be developed into a medium-to-high-fidelity mockup. A multi-day human-in-the-loop mission test will fully evaluate the reference design and validate its configuration. Lessons learned from the design and evaluation will enable the team to identify appropriate standards for the Phase-1 CisLunar Habitat Internal Architecture and will enable NASA to develop derived requirements in support of maturing CisLunar habitation capabilities. This paper describes the criteria definition process, the workshop event, and the resulting CisLunar Phase-1 Habitat Internal Architecture Design Criteria.

  12. Mathematical modeling and experimental testing of three bioreactor configurations based on windkessel models

    PubMed Central

    Ruel, Jean; Lachance, Geneviève

    2010-01-01

    This paper presents an experimental study of three bioreactor configurations. The bioreactor is intended to be used for the development of tissue-engineered heart valve substitutes; therefore it must be able to reproduce physiological flow and pressure waveforms accurately. A detailed analysis of three bioreactor arrangements is presented using mathematical models based on the windkessel (WK) approach. First, a review of the many applications of this approach in medical studies highlights its fundamental nature and its usefulness. Then the models are developed with reference to the actual components of the bioreactor. This study emphasizes the conflicting issues that arise in the design process of a bioreactor for biomedical purposes, where an optimization process is essential to reach a compromise satisfying all conditions. Two important aspects are the need for a simple system providing ease of use and long-term sterility, as opposed to the need for an advanced (and thus more complex) architecture capable of a more accurate reproduction of the physiological environment. Three classic WK architectures are analyzed, and experimental results highlight the advantages and limitations of each one. PMID:21977286
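The windkessel analogy above can be illustrated with a minimal sketch of the simplest two-element configuration, governed by C dP/dt = Q(t) − P/R. The parameter values and the forward-Euler integrator below are illustrative assumptions, not the paper's actual three bioreactor models:

```python
def simulate_wk2(q_of_t, R=1.0, C=1.0, p0=80.0, dt=1e-3, t_end=1.0):
    """Forward-Euler integration of the 2-element windkessel ODE
    C * dP/dt = Q(t) - P/R  (pressure P, inflow Q; units arbitrary here)."""
    p, t, trace = p0, 0.0, []
    while t < t_end:
        p += (q_of_t(t) - p / R) / C * dt
        t += dt
        trace.append(p)
    return trace

# Constant inflow: pressure relaxes toward the steady state P = Q * R = 100.
trace = simulate_wk2(lambda t: 100.0)
```

With a time constant RC = 1, one simulated second brings the pressure most of the way from its initial value toward the steady state; a physiological study would instead drive the model with a pulsatile Q(t).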

  13. Mesoscopic modelling and simulation of soft matter.

    PubMed

    Schiller, Ulf D; Krüger, Timm; Henrich, Oliver

    2017-12-20

    The deformability of soft condensed matter often requires modelling of hydrodynamical aspects to gain quantitative understanding. This, however, requires specialised methods that can resolve the multiscale nature of soft matter systems. We review a number of the most popular simulation methods that have emerged, such as Langevin dynamics, dissipative particle dynamics, multi-particle collision dynamics, sometimes also referred to as stochastic rotation dynamics, and the lattice-Boltzmann method. We conclude this review with a short glance at current compute architectures for high-performance computing and community codes for soft matter simulation.
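Of the mesoscopic methods listed above, Langevin dynamics is the simplest to sketch. The harmonic force, parameter values, and Euler-Maruyama discretization below are illustrative assumptions, not code from the review:

```python
import math
import random

def langevin_step(x, force, gamma=1.0, kT=1.0, dt=1e-3, rng=random):
    """One Euler-Maruyama step of overdamped Langevin dynamics:
    x' = x + F(x)/gamma * dt + sqrt(2*kT*dt/gamma) * N(0, 1)."""
    return (x + force(x) / gamma * dt
            + math.sqrt(2.0 * kT * dt / gamma) * rng.gauss(0.0, 1.0))

# Particle in a harmonic trap F(x) = -k*x with k = 5 (illustrative values).
rng = random.Random(0)
x, xs = 0.0, []
for _ in range(20000):
    x = langevin_step(x, lambda y: -5.0 * y, dt=1e-2, rng=rng)
    xs.append(x)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
# Equipartition check: stationary variance should approach kT/k = 0.2.
```

The thermostat behavior is the point: the sampled variance converges to kT/k, which is the kind of consistency check any of the mesoscopic methods above must pass.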

  14. JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning

    NASA Astrophysics Data System (ADS)

    Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro

    2015-12-01

    We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.

  15. LAN (Local Area Network) Interoperability Study of Protocols Needed for Distributed Command and Control

    DTIC Science & Technology

    1985-03-01

    model referred to by the study group ... operating systems, compared the DOD and ISO networking protocol architecture models, the protocols for LANs developed by the IEEE and ANSI, reviewed and ... be initiated, so as to provide the Air Force a roadmap to guide its technology developments.

  16. A Survey on Next-generation Power Grid Data Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Shutang; Zhu, Dr. Lin; Liu, Yong

    2015-01-01

    The operation and control of power grids will increasingly rely on data. A high-speed, reliable, flexible and secure data architecture is the prerequisite of the next-generation power grid. This paper summarizes the challenges in collecting and utilizing power grid data, and then provides a reference data architecture for future power grids. Based on the data architecture deployment, related research on data architecture is reviewed and summarized in several categories, including data measurement/actuation, data transmission, the data service layer, and data utilization, as well as two cross-cutting issues, interoperability and cyber security. Research gaps and future work are also presented.

  17. Genome-wide association analyses identify new risk variants and the genetic architecture of amyotrophic lateral sclerosis

    PubMed Central

    van Rheenen, Wouter; Shatunov, Aleksey; Dekker, Annelot M; McLaughlin, Russell L; Diekstra, Frank P; Pulit, Sara L; van der Spek, Rick A A; Võsa, Urmo; de Jong, Simone; Robinson, Matthew R; Yang, Jian; Fogh, Isabella; van Doormaal, Perry TC; Tazelaar, Gijs H P; Koppers, Max; Blokhuis, Anna M; Sproviero, William; Jones, Ashley R; Kenna, Kevin P; van Eijk, Kristel R; Harschnitz, Oliver; Schellevis, Raymond D; Brands, William J; Medic, Jelena; Menelaou, Androniki; Vajda, Alice; Ticozzi, Nicola; Lin, Kuang; Rogelj, Boris; Vrabec, Katarina; Ravnik-Glavač, Metka; Koritnik, Blaž; Zidar, Janez; Leonardis, Lea; Grošelj, Leja Dolenc; Millecamps, Stéphanie; Salachas, François; Meininger, Vincent; de Carvalho, Mamede; Pinto, Susana; Mora, Jesus S; Rojas-García, Ricardo; Polak, Meraida; Chandran, Siddharthan; Colville, Shuna; Swingler, Robert; Morrison, Karen E; Shaw, Pamela J; Hardy, John; Orrell, Richard W; Pittman, Alan; Sidle, Katie; Fratta, Pietro; Malaspina, Andrea; Topp, Simon; Petri, Susanne; Abdulla, Susanne; Drepper, Carsten; Sendtner, Michael; Meyer, Thomas; Ophoff, Roel A; Staats, Kim A; Wiedau-Pazos, Martina; Lomen-Hoerth, Catherine; Van Deerlin, Vivianna M; Trojanowski, John Q; Elman, Lauren; McCluskey, Leo; Basak, A Nazli; Tunca, Ceren; Hamzeiy, Hamid; Parman, Yesim; Meitinger, Thomas; Lichtner, Peter; Radivojkov-Blagojevic, Milena; Andres, Christian R; Maurel, Cindy; Bensimon, Gilbert; Landwehrmeyer, Bernhard; Brice, Alexis; Payan, Christine A M; Saker-Delye, Safaa; Dürr, Alexandra; Wood, Nicholas W; Tittmann, Lukas; Lieb, Wolfgang; Franke, Andre; Rietschel, Marcella; Cichon, Sven; Nöthen, Markus M; Amouyel, Philippe; Tzourio, Christophe; Dartigues, Jean-François; Uitterlinden, Andre G; Rivadeneira, Fernando; Estrada, Karol; Hofman, Albert; Curtis, Charles; Blauw, Hylke M; van der Kooi, Anneke J; de Visser, Marianne; Goris, An; Weber, Markus; Shaw, Christopher E; Smith, Bradley N; Pansarasa, Orietta; Cereda, Cristina; Bo, Roberto Del; Comi, Giacomo P; 
D’Alfonso, Sandra; Bertolin, Cinzia; Sorarù, Gianni; Mazzini, Letizia; Pensato, Viviana; Gellera, Cinzia; Tiloca, Cinzia; Ratti, Antonia; Calvo, Andrea; Moglia, Cristina; Brunetti, Maura; Arcuti, Simona; Capozzo, Rosa; Zecca, Chiara; Lunetta, Christian; Penco, Silvana; Riva, Nilo; Padovani, Alessandro; Filosto, Massimiliano; Muller, Bernard; Stuit, Robbert Jan; Blair, Ian; Zhang, Katharine; McCann, Emily P; Fifita, Jennifer A; Nicholson, Garth A; Rowe, Dominic B; Pamphlett, Roger; Kiernan, Matthew C; Grosskreutz, Julian; Witte, Otto W; Ringer, Thomas; Prell, Tino; Stubendorff, Beatrice; Kurth, Ingo; Hübner, Christian A; Leigh, P Nigel; Casale, Federico; Chio, Adriano; Beghi, Ettore; Pupillo, Elisabetta; Tortelli, Rosanna; Logroscino, Giancarlo; Powell, John; Ludolph, Albert C; Weishaupt, Jochen H; Robberecht, Wim; Van Damme, Philip; Franke, Lude; Pers, Tune H; Brown, Robert H; Glass, Jonathan D; Landers, John E; Hardiman, Orla; Andersen, Peter M; Corcia, Philippe; Vourc’h, Patrick; Silani, Vincenzo; Wray, Naomi R; Visscher, Peter M; de Bakker, Paul I W; van Es, Michael A; Pasterkamp, R Jeroen; Lewis, Cathryn M; Breen, Gerome; Al-Chalabi, Ammar; van den Berg, Leonard H; Veldink, Jan H

    2017-01-01

    To elucidate the genetic architecture of amyotrophic lateral sclerosis (ALS) and find associated loci, we assembled a custom imputation reference panel from whole-genome-sequenced patients with ALS and matched controls (n = 1,861). Through imputation and mixed-model association analysis in 12,577 cases and 23,475 controls, combined with 2,579 cases and 2,767 controls in an independent replication cohort, we fine-mapped a new risk locus on chromosome 21 and identified C21orf2 as a gene associated with ALS risk. In addition, we identified MOBP and SCFD1 as new associated risk loci. We established evidence of ALS being a complex genetic trait with a polygenic architecture. Furthermore, we estimated the SNP-based heritability at 8.5%, with a distinct and important role for low-frequency variants (frequency 1–10%). This study motivates the interrogation of larger samples with full genome coverage to identify rare causal variants that underpin ALS risk. PMID:27455348

  18. An ISRU Propellant Production System to Fully Fuel a Mars Ascent Vehicle

    NASA Technical Reports Server (NTRS)

    Kleinhenz, Julie; Paz, Aaron

    2017-01-01

    ISRU of Mars resources was baselined in the 2009 Design Reference Architecture (DRA) 5.0, but only for oxygen production using atmospheric CO2; the methane (LCH4) needed for ascent propulsion of the Mars Ascent Vehicle (MAV) would have to be brought from Earth. However, extracting water from the Martian regolith enables the production of both oxygen and methane from Mars resources, and water could also be used for other applications including life support, radiation shielding, and plant growth. Water extraction was not baselined in DRA 5.0 due to perceived difficulties and complexity in processing regolith. The NASA Evolvable Mars Campaign (EMC) requested studies to examine the quantitative benefits and trades of using Mars water ISRU. Phase 1 examined architecture scenarios for regolith water retrieval and was completed in October 2015. Phase 2 was a deep dive into one architecture concept to assess the end-to-end system size, mass, and power of an LCH4/LO2 ISRU production system, assuming the Evolvable Mars Campaign context: a pre-deployed Mars ascent vehicle (MAV), 4 crew members, and oxygen/methane propellants. The approach was to generate a system model to roll up the mass and power of a full ISRU system and enable parametric trade studies, leveraging models from previous studies and technology development programs and anchoring them with mass and power performance from existing hardware. Wherever possible, referenceable (published) numbers were used for traceability, and a modular approach allows subsystem trades and parametric studies. Propellant mass needs were taken from the most recently published MAV study: Polsgrove, T. et al. (2015), AIAA 2015-4416. MAV engines operate at mixture ratios (oxygen:methane) between 3:1 and 3.5:1, whereas the Sabatier reactor produces at a 4:1 ratio. Therefore, methane production is the driving requirement, and excess oxygen will be produced.
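The mixture-ratio arithmetic in the last sentence can be made concrete. The methane load figure below is hypothetical, not the value from the cited MAV study:

```python
def propellant_balance(ch4_needed_kg, engine_mr=3.5, production_mr=4.0):
    """Mass balance for an O2/CH4 propellant plant. Mixture ratios are
    O2:CH4 by mass. Methane need is the driving requirement, so oxygen
    co-produced at the 4:1 Sabatier/electrolysis ratio exceeds what the
    engines consume at 3:1 to 3.5:1."""
    o2_needed = ch4_needed_kg * engine_mr
    o2_produced = ch4_needed_kg * production_mr
    return o2_needed, o2_produced, o2_produced - o2_needed

# Hypothetical 7,000 kg methane load (not the figure from the MAV study):
needed, produced, excess = propellant_balance(7000.0)
```

At the worst-case engine ratio of 3.5:1, every kilogram of methane produced yields half a kilogram of surplus oxygen, which is why sizing the plant to the methane requirement guarantees the oxygen supply.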

  19. 75 FR 34004 - State Cemetery Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-16

    ... architectural design codes that apply to grant applicants, we decided to update those references in a separate.... 39.63 Architectural design standards. Subpart C--Operation and Maintenance Projects Grant... acquisition, design and planning, earth moving, landscaping, construction, and provision of initial operating...

  20. Communication Strategies for Shared-Bus Embedded Multiprocessors

    DTIC Science & Technology

    2005-09-01

    target architecture [10]. We utilize the task execution model in [11], where each task vi in the task graph G = (V,E) is associated with three possible ... predictability is therefore an interesting and important direction for further study. REFERENCES [1] T. Kogel, M. Doerper, A. Wieferink, R. Leupers, G. ... Proceedings of Real-Time Technology and Applications Symposium, 1995, pp. 164-173. [11] S. Hua, G. Qu, and S. Bhattacharyya, "Energy reduction technique

  1. Understanding Evolutionary Potential in Virtual CPU Instruction Set Architectures

    PubMed Central

    Bryson, David M.; Ofria, Charles

    2013-01-01

    We investigate fundamental decisions in the design of instruction set architectures for linear genetic programs that are used as both model systems in evolutionary biology and underlying solution representations in evolutionary computation. We subjected digital organisms with each tested architecture to seven different computational environments designed to present a range of evolutionary challenges. Our goal was to engineer a general purpose architecture that would be effective under a broad range of evolutionary conditions. We evaluated six different types of architectural features for the virtual CPUs: (1) genetic flexibility: we allowed digital organisms to more precisely modify the function of genetic instructions, (2) memory: we provided an increased number of registers in the virtual CPUs, (3) decoupled sensors and actuators: we separated input and output operations to enable greater control over data flow. We also tested a variety of methods to regulate expression: (4) explicit labels that allow programs to dynamically refer to specific genome positions, (5) position-relative search instructions, and (6) multiple new flow control instructions, including conditionals and jumps. Each of these features also adds complication to the instruction set and risks slowing evolution due to epistatic interactions. Two features (multiple argument specification and separated I/O) demonstrated substantial improvements in the majority of test environments, along with versions of each of the remaining architecture modifications that show significant improvements in multiple environments. However, some tested modifications were detrimental, though most exhibit no systematic effects on evolutionary potential, highlighting the robustness of digital evolution. 
Combined, these observations enhance our understanding of how instruction architecture impacts evolutionary potential, enabling the creation of architectures that support more rapid evolution of complex solutions to a broad range of challenges. PMID:24376669
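A toy register-based virtual CPU gives a feel for the kind of instruction set architecture being evolved. The instruction names and semantics below are invented for illustration and do not reproduce the actual ISA variants tested in the paper:

```python
def run_vcpu(program, steps=100):
    """Minimal register-based virtual CPU in the spirit of linear genetic
    programs: a few registers, a simple ALU, and position-relative flow
    control (one of the feature classes evaluated in the study)."""
    regs = {"AX": 0, "BX": 0, "CX": 0}
    ip = 0
    for _ in range(steps):
        if ip >= len(program):
            break
        op, *args = program[ip]
        if op == "inc":
            regs[args[0]] += 1
        elif op == "add":                # regs[dst] = regs[src1] + regs[src2]
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "jump-rel":           # position-relative jump
            ip += args[0]
            continue
        ip += 1
    return regs

prog = [("inc", "AX"), ("inc", "AX"), ("add", "BX", "AX", "AX")]
regs = run_vcpu(prog)
```

Design questions like those in the paper map directly onto this sketch: how many registers to expose, whether operands are implicit or explicitly specified, and whether flow control is absolute or position-relative.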

  2. Multi-Organization Multi-Discipline Effort Developing a Mitigation Concept for Planetary Defense

    NASA Technical Reports Server (NTRS)

    Leung, Ronald Y.; Barbee, Brent W.; Seery, Bernard D.; Bambacus, Myra; Finewood, Lee; Greenaugh, Kevin C.; Lewis, Anthony; Dearborn, David; Miller, Paul L.; Weaver, Robert P.; hide

    2017-01-01

    There have been significant recent efforts in addressing mitigation approaches to neutralize Potentially Hazardous Asteroids (PHAs). One such research effort was performed in 2015 by an integrated, inter-disciplinary team of asteroid scientists, energy deposition modeling scientists, payload engineers, orbital dynamics engineers, spacecraft discipline engineers, and a systems architecture engineer from NASA's Goddard Space Flight Center (GSFC) and the Department of Energy (DoE) National Nuclear Security Administration (NNSA) laboratories (Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories). The study team collaborated with GSFC's Integrated Design Center's Mission Design Lab (MDL), which engaged a team of GSFC flight hardware discipline engineers to work with GSFC, LANL, and LLNL NEA-related subject matter experts during a one-week intensive concept formulation study in an integrated concurrent engineering environment. This team has analyzed the first of several distinct study cases for a multi-year NASA research grant. This Case 1 study references the Near-Earth Asteroid (NEA) named Bennu as the notional target due to the availability of a very detailed Design Reference Asteroid (DRA) model for its orbit and physical characteristics (courtesy of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) mission team). The research involved the formulation and optimization of spacecraft trajectories to intercept Bennu, overall mission and architecture concepts, and high-fidelity modeling of both kinetic impact (spacecraft collision to change an NEA's momentum and orbit) and nuclear detonation effects on Bennu, for purposes of deflecting Bennu.

  3. Enabling Flexible and Continuous Capability Invocation in Mobile Prosumer Environments

    PubMed Central

    Alcarria, Ramon; Robles, Tomas; Morales, Augusto; López-de-Ipiña, Diego; Aguilera, Unai

    2012-01-01

    Mobile prosumer environments require the communication with heterogeneous devices during the execution of mobile services. These environments integrate sensors, actuators and smart devices, whose availability continuously changes. The aim of this paper is to design a reference architecture for implementing a model for continuous service execution and access to capabilities, i.e., the functionalities provided by these devices. The defined architecture follows a set of software engineering patterns and includes some communication paradigms to cope with the heterogeneity of sensors, actuators, controllers and other devices in the environment. In addition, we stress the importance of the flexibility in capability invocation by allowing the communication middleware to select the access technology and change the communication paradigm when dealing with smart devices, and by describing and evaluating two algorithms for resource access management. PMID:23012526

  4. Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Sdhuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.

    2011-01-01

    A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV) to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault-tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.

  5. Lunar COTS: An Economical and Sustainable Approach to Reaching Mars

    NASA Technical Reports Server (NTRS)

    Zuniga, Allison F.; Rasky, Daniel; Pittman, Robert B.; Zapata, Edgar; Lepsch, Roger

    2015-01-01

    The NASA COTS (Commercial Orbital Transportation Services) Program was a very successful program that demonstrated cost-effective development and acquisition of commercial cargo transportation services to the International Space Station (ISS). The COTS acquisition strategy utilized a newer model than normally accepted in traditional procurement practices. This new model used Space Act Agreements in which NASA entered into partnerships with industry to jointly share cost, development and operational risks to demonstrate new capabilities for mutual benefit. This model proved to be very beneficial to both NASA and its industry partners as NASA saved significantly in development and operational costs while industry partners successfully expanded their market share of the global launch transportation business. The authors, who contributed to the development of the COTS model, would like to extend this model to a lunar commercial services program that will push development of technologies and capabilities that will serve a Mars architecture and lead to an economical and sustainable pathway to transporting humans to Mars. Over the past few decades, several architectures for the Moon and Mars have been proposed and studied but ultimately halted or never started because the projected costs significantly exceeded NASA's budgets. Therefore, a new strategy is needed that will fit within NASA's projected budgets and takes advantage of the US commercial industry along with its creative and entrepreneurial attributes. The authors propose a new COTS-like program to enter into partnerships with industry to demonstrate cost-effective, cis-lunar commercial services, such as lunar transportation, lunar ISRU operations, and cis-lunar propellant depots that can enable an economical and sustainable Mars architecture.
Similar to the original COTS program, the goals of the proposed program, notionally referred to as the Lunar Commercial Orbital Transfer Services (LCOTS) program, will be to: 1) reduce development and operational costs by sharing costs with industry; 2) create new markets in cis-lunar space to further reduce operational costs; and 3) enable NASA to develop an affordable and economical Mars exploration architecture. The paper describes a plan for the proposed LCOTS program, its potential impact on an eventual Mars architecture, and its many benefits to NASA, the commercial space industry and the US economy.

  6. An integrated decision-making framework for transportation architectures: Application to aviation systems design

    NASA Astrophysics Data System (ADS)

    Lewe, Jung-Ho

    The National Transportation System (NTS) is undoubtedly a complex system-of-systems---a collection of diverse 'things' that evolve over time, organized at multiple levels, to achieve a range of possibly conflicting objectives, and never quite behaving as planned. The purpose of this research is to develop a virtual transportation architecture for the ultimate goal of formulating an integrated decision-making framework. The foundational endeavor begins with creating an abstraction of the NTS with the belief that a holistic frame of reference is required to properly study such a multi-disciplinary, trans-domain system. The culmination of the effort produces the Transportation Architecture Field (TAF) as a mental model of the NTS, in which the relationships between four basic entity groups are identified and articulated. This entity-centric abstraction framework underpins the construction of a virtual NTS couched in the form of an agent-based model. The transportation consumers and the service providers are identified as adaptive agents that apply a set of preprogrammed behavioral rules to achieve their respective goals. The transportation infrastructure and multitude of exogenous entities (disruptors and drivers) in the whole system can also be represented without resorting to an extremely complicated structure. The outcome is a flexible, scalable, computational model that allows for examination of numerous scenarios which involve the cascade of interrelated effects of aviation technology, infrastructure, and socioeconomic changes throughout the entire system.
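The entity-centric, agent-based framing can be sketched minimally. The two agent classes, their behavioral rules, and the mode names below are invented for illustration and are not the TAF model itself:

```python
class Traveler:
    """Consumer agent: a toy behavioral rule, choose the cheapest mode."""
    def choose(self, costs):
        return min(costs, key=costs.get)

class Provider:
    """Service-provider agent: a toy rule, raise price when over capacity."""
    def __init__(self, name, price):
        self.name, self.price = name, price
    def update(self, demand, capacity=50):
        self.price *= 1.05 if demand > capacity else 0.95

def step(travelers, providers):
    """One simulation tick: consumers choose modes, providers adapt prices."""
    costs = {p.name: p.price for p in providers}
    demand = {p.name: 0 for p in providers}
    for t in travelers:
        demand[t.choose(costs)] += 1
    for p in providers:
        p.update(demand[p.name])
    return demand

rail, air = Provider("rail", 10.0), Provider("air", 12.0)
demand = step([Traveler() for _ in range(100)], [rail, air])
```

Iterating `step` is what makes such models useful for scenario study: local rules (here, price-sensitive travelers and demand-sensitive providers) produce system-level dynamics without any global controller.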

  7. On-Board Software Reference Architecture for Payloads

    NASA Astrophysics Data System (ADS)

    Bos, Victor; Rugina, Ana; Trcka, Adam

    2016-08-01

    The goal of the On-board Software Reference Architecture for Payloads (OSRA-P) is to identify an architecture for payload software to harmonize the payload domain, to enable more reuse of common/generic payload software across different payloads and missions, and to ease the integration of the payloads with the platform. To investigate the payload domain, recent and current payload instruments of European space missions have been analyzed. This led to a Payload Catalogue describing 12 payload instruments as well as a Capability Matrix listing specific characteristics of each payload. In addition, a functional decomposition of payload software was prepared which contains functionalities typically found in payload systems. The definition of OSRA-P was evaluated by case studies and a dedicated OSRA-P workshop to gather feedback from the payload community.

  8. Mapping SOA Artefacts onto an Enterprise Reference Architecture Framework

    NASA Astrophysics Data System (ADS)

    Noran, Ovidiu

    Currently, there is still no common agreement on the service-oriented architecture (SOA) definition, or the types and meaning of the artefacts involved in the creation and maintenance of an SOA. Furthermore, the SOA image shift from an infrastructure solution to a business-wide change project may have promoted a perception that SOA is a parallel initiative, a competitor and perhaps a successor of enterprise architecture (EA). This chapter attempts to map several typical SOA artefacts onto an enterprise reference framework commonly used in EA. This is done in order to show that the EA framework can express and structure most of the SOA artefacts and that, therefore, a framework for SOA could in fact be derived from an EA framework with the ensuing SOA-EA integration benefits.

  9. The PDS4 Information Model and its Role in Agile Science Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D.

    2017-12-01

    PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving planetary science data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine-parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international planetary science community.

  10. SensoTube: A Scalable Hardware Design Architecture for Wireless Sensors and Actuators Networks Nodes in the Agricultural Domain.

    PubMed

    Piromalis, Dimitrios; Arvanitis, Konstantinos

    2016-08-04

    Wireless Sensor and Actuator Networks (WSANs) constitute one of the most challenging technologies with tremendous socio-economic impact for the next decade. Functionally and energy-optimized hardware systems and development tools may be the most critical facet of this technology for the achievement of such prospects. Especially in the area of agriculture, where the hostile operating environment adds to the general technological and technical issues, reliable and robust WSAN systems are mandatory. This paper focuses on the hardware design architectures of WSANs for real-world agricultural applications. It presents the available alternatives in hardware design and identifies their difficulties and problems for real-life implementations. The paper introduces SensoTube, a new WSAN hardware architecture, which is proposed as a solution to the various existing design constraints of WSANs. The establishment of the proposed architecture is based, firstly, on an abstraction approach in the functional requirements context, and secondly, on the standardization of the subsystems' connectivity, in order to allow for an open, expandable, flexible, reconfigurable, energy-optimized, reliable and robust hardware system. The SensoTube implementation reference model, together with its encapsulation design and installation, are analyzed and presented in detail. Furthermore, as a proof of concept, certain use cases have been studied in order to demonstrate the benefits of migrating existing designs based on the available open-source hardware platforms to the SensoTube architecture.

  11. 3-D Survey Applied to Industrial Archaeology by Tls Methodology

    NASA Astrophysics Data System (ADS)

    Monego, M.; Fabris, M.; Menin, A.; Achilli, V.

    2017-05-01

    This work describes the three-dimensional survey of the "Ex Stazione Frigorifera Specializzata": initially used for agricultural storage, over the years it was put to different uses until falling into complete neglect. The historical relevance and the architectural heritage that this building represents have prompted the start of a recent renovation project and functional restoration. In this regard, a global 3-D survey was necessary, based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The acquisition of point clouds was performed using different laser scanners, with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network. Moreover, 3 vertices were measured with the GNSS methodology in order to georeference the network. For the detailed survey of the machine room, 14 scans were executed with 23 targets. The global 3-D model of the building has less than one centimeter of alignment error (for the machine room the alignment error is no greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. A complete spatial knowledge of the building is obtained from the processed data, providing basic information for the restoration project, structural analysis, and industrial and architectural heritage valorization.

  12. A Bandwidth-Optimized Multi-Core Architecture for Irregular Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Secchi, Simone; Tumeo, Antonino; Villa, Oreste

    This paper presents an architecture template for next-generation high performance computing systems specifically targeted to irregular applications. We start our work by considering that future generation interconnection and memory bandwidth full-system numbers are expected to grow by a factor of 10. In order to keep up with such a communication capacity, while still resorting to fine-grained multithreading as the main way to tolerate unpredictable memory access latencies of irregular applications, we show how overall performance scaling can benefit from the multi-core paradigm. At the same time, we also show how such an architecture template must be coupled with specific techniques in order to optimize bandwidth utilization and achieve the maximum scalability. We propose a technique based on memory references aggregation, together with the related hardware implementation, as one of such optimization techniques. We explore the proposed architecture template by focusing on the Cray XMT architecture and, using a dedicated simulation infrastructure, validate the performance of our template with two typical irregular applications. Our experimental results prove the benefits provided by both the multi-core approach and the bandwidth optimization reference aggregation technique.
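    The core of the aggregation idea can be sketched in a few lines. The snippet below is a hypothetical software illustration, not the paper's actual hardware mechanism: fine-grained references that fall in the same memory block are coalesced, so one wide transaction serves many requests and scarce bandwidth is used more efficiently. The block size of 64 bytes is an assumption for illustration.

    ```python
    # Hypothetical illustration of memory-reference aggregation: requests
    # that fall in the same memory block are coalesced so that a single
    # wide transaction serves all of them.

    LINE_SIZE = 64  # assumed block granularity in bytes

    def aggregate(requests):
        """Map each block base address to the sorted offsets requested in it."""
        blocks = {}
        for addr in requests:
            base = addr - addr % LINE_SIZE
            blocks.setdefault(base, []).append(addr % LINE_SIZE)
        return {base: sorted(offs) for base, offs in blocks.items()}

    # An irregular access pattern: 8 scattered requests hit only 3 blocks.
    reqs = [0, 8, 70, 12, 68, 130, 140, 4]
    agg = aggregate(reqs)
    print(f"{len(reqs)} references -> {len(agg)} memory transactions")
    ```

    In this toy trace, eight fine-grained references collapse into three block-sized transactions; the real benefit in the paper's setting comes from doing this in hardware across many concurrent threads.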

  13. Multi-level Operational C2 Holonic Reference Architecture Modeling for MHQ with MOC

    DTIC Science & Technology

    2009-06-01

    x), x(k), uj(k)) is defined as the task success probability, based on the asset allocation and task execution activities at the tactical level...on outcomes of asset- task allocation at the tactical level. We employ semi-Markov decision process (SMDP) approach to decide on missions to be...AGA) graph for addressing the mission monitoring/ planning issues related to task sequencing and asset allocation at the OLC-TLC layer (coordination

  14. Calibration of the Software Architecture Sizing and Estimation Tool (SASET).

    DTIC Science & Technology

    1995-09-01

    model is of more value than the uncalibrated one. Also, as will be discussed in Chapters 3 and 4, there are quite a few manual (and undocumented) steps...complexity, normalized effective size, and normalized effort. One other field ("development phases included") was extracted manually since it was not listed...Bowden, R.G., Cheadle, W.G., & Ratliff, R.W. SASET 3.0 Technical Reference Manual. Publication S-3730-93-2. Denver: Martin Marietta Astronautics

  15. Development of a high angular resolution diffusion imaging human brain template.

    PubMed

    Varentsova, Anna; Zhang, Shengwei; Arfanakis, Konstantinos

    2014-05-01

    Brain diffusion templates contain rich information about the microstructure of the brain, and are used as references in spatial normalization or in the development of brain atlases. The accuracy of diffusion templates constructed based on the diffusion tensor (DT) model is limited in regions with complex neuronal micro-architecture. High angular resolution diffusion imaging (HARDI) overcomes limitations of the DT model and is capable of resolving intravoxel heterogeneity. However, when HARDI is combined with multiple-shot sequences to minimize image artifacts, the scan time becomes inappropriate for human brain imaging. In this work, an artifact-free HARDI template of the human brain was developed from low angular resolution multiple-shot diffusion data. The resulting HARDI template was produced in ICBM-152 space based on Turboprop diffusion data, was shown to resolve complex neuronal micro-architecture in regions with intravoxel heterogeneity, and contained fiber orientation information consistent with known human brain anatomy. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Selected Reference Books of 1998-99.

    ERIC Educational Resources Information Center

    McIlvaine, Eileen

    1999-01-01

    Presents an annotated bibliography of selected reference books published between 1998 and 1999 under subject headings for biography, journalism, mythology, languages and literature, architecture and city planning, political science, economics, history, and new editions and supplements. (LRW)

  17. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I-developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system level trades and analyses. It draws upon the legacy of previous architecture level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.

  18. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  19. A Distributed Simulation Software System for Multi-Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Burns, Richard; Davis, George; Cary, Everett

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  20. Blue guardian: an open architecture for rapid ISR demonstration

    NASA Astrophysics Data System (ADS)

    Barrett, Donald A.; Borntrager, Luke A.; Green, David M.

    2016-05-01

    Throughout the Department of Defense (DoD), acquisition, platform integration, and life cycle costs for weapons systems have continued to rise. Although Open Architecture (OA) interface standards are one of the primary methods being used to reduce these costs, the Air Force Rapid Capabilities Office (AFRCO) has extended the OA concept and chartered the Open Mission System (OMS) initiative with industry to develop and demonstrate a consensus-based, non-proprietary, OA standard for integrating subsystems and services into airborne platforms. The new OMS standard provides the capability to decouple vendor-specific sensors, payloads, and service implementations from platform-specific architectures and is still in the early stages of maturation and demonstration. The Air Force Research Laboratory (AFRL) - Sensors Directorate has developed the Blue Guardian program to demonstrate advanced sensing technology utilizing open architectures in operationally relevant environments. Over the past year, Blue Guardian has developed a platform architecture using the Air Force's OMS reference architecture and conducted a ground and flight test program of multiple payload combinations. Systems tested included a vendor-unique variety of Full Motion Video (FMV) systems, a Wide Area Motion Imagery (WAMI) system, a multi-mode radar system, processing and database functions, multiple decompression algorithms, multiple communications systems, and a suite of software tools. Initial results of the Blue Guardian program show the promise of OA to DoD acquisitions, especially for Intelligence, Surveillance and Reconnaissance (ISR) payload applications. Specifically, the OMS reference architecture was extremely useful in reducing the cost and time required for integrating new systems.

  1. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    NASA Technical Reports Server (NTRS)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

    The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing are described, for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.

  2. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.
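    The MRAC idea underlying all seven technologies can be illustrated with a textbook scalar example. The sketch below is generic (it does not reproduce any of the specific technologies or the dynamic-inversion architecture): a plant x' = a*x + u with unknown a is made to track a stable reference model xm' = am*xm + r via the Lyapunov-based adaptive law theta' = -gamma*e*x, where e = x - xm is the tracking error. All numeric values are assumptions for illustration.

    ```python
    # Minimal scalar MRAC sketch (illustrative assumptions, not a specific
    # technology from the study): the adaptive gain theta compensates for
    # the unknown plant parameter a; the ideal value is am - a.

    def simulate(a=1.0, am=-2.0, gamma=5.0, r=1.0, dt=1e-3, T=10.0):
        """Euler-integrate the plant, reference model, and adaptive gain."""
        x = xm = theta = 0.0
        for _ in range(int(T / dt)):
            e = x - xm                      # tracking error
            u = theta * x + r               # control law
            x += dt * (a * x + u)           # plant (open-loop unstable, a > 0)
            xm += dt * (am * xm + r)        # reference model (desired dynamics)
            theta += dt * (-gamma * e * x)  # Lyapunov-based adaptation law
        return x, xm, theta

    x, xm, theta = simulate()
    print(f"tracking error {abs(x - xm):.4f}, adaptive gain {theta:.3f}")
    ```

    With these assumed numbers the gain is driven toward the ideal value am - a = -3 and the tracking error decays, which is the behavior the cited sensitivity studies probe under time delay and uncertainty.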

  3. Data Retention and Anonymity Services

    NASA Astrophysics Data System (ADS)

    Berthold, Stefan; Böhme, Rainer; Köpsell, Stefan

    The recently introduced legislation on data retention to aid prosecuting cyber-related crime in Europe also affects the achievable security of systems for anonymous communication on the Internet. We argue that data retention requires a review of existing security evaluations against a new class of realistic adversary models. In particular, we present theoretical results and first empirical evidence for intersection attacks by law enforcement authorities. The reference architecture for our study is the anonymity service AN.ON, from which we also collect empirical data. Our adversary model reflects an interpretation of the current implementation of the EC Directive on Data Retention in Germany.

  4. Collaboration within Student Design Teams Participating in Architectural Design Competitions

    ERIC Educational Resources Information Center

    Erbil, Livanur; Dogan, Fehmi

    2012-01-01

    This paper investigates design collaboration with reference to convergent and divergent idea generation processes in architectural design teams entering a design competition. Study of design teams offer a unique opportunity to investigate how creativity is fostered through collaborative work. While views of creativity often relate creativity to…

  5. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  6. A Reference Stack for PHM Architectures

    DTIC Science & Technology

    2014-10-02

    components, fault modes and prognostics such as that described by MIMOSA (2009) and ISO 13374-3:2012 (2012). Section 2.6 described a semantic...architecture, and the use of a SOA is further discussed in Section 3.3.2. MIMOSA is a stack-oriented data architecture. Figure 11 shows its stack of...format (US Army PEWG, 2011). The tagging in ABCD format respects the data layers that are found in the MIMOSA standard (MIMOSA, 2009) and in ISO

  7. Architectures for Quantum Simulation Showing a Quantum Speedup

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Hangleiter, Dominik; Schwarz, Martin; Raussendorf, Robert; Eisert, Jens

    2018-04-01

    One of the main aims in the field of quantum simulation is to achieve a quantum speedup, often referred to as "quantum computational supremacy": the experimental realization of a quantum device that computationally outperforms classical computers. In this work, we show that one can devise versatile and feasible schemes of two-dimensional, dynamical, quantum simulators showing such a quantum speedup, building on intermediate problems involving nonadaptive, measurement-based, quantum computation. In each of the schemes, an initial product state is prepared, potentially involving an element of randomness as in disordered models, followed by a short-time evolution under a basic translationally invariant Hamiltonian with simple nearest-neighbor interactions and a mere sampling measurement in a fixed basis. The correctness of the final-state preparation in each scheme is fully efficiently certifiable. We discuss experimental necessities and possible physical architectures, inspired by platforms of cold atoms in optical lattices and a number of others, as well as specific assumptions that enter the complexity-theoretic arguments. This work shows that benchmark settings exhibiting a quantum speedup may require little control, in contrast to universal quantum computing. Thus, our proposal puts a convincing experimental demonstration of a quantum speedup within reach in the near term.

  8. Human Mars Missions: Cost Driven Architecture Assessments

    NASA Technical Reports Server (NTRS)

    Donahue, Benjamin

    1998-01-01

    This report investigates various methods of reducing the cost of space transportation systems for human Mars missions. The reference mission for this task is a mission currently under study at NASA, called the Mars Design Reference Mission, characterized by in-situ propellant production at Mars. This study mainly consists of comparative evaluations against the reference mission, with a view to selecting strategies that would reduce the cost of the Mars program as a whole. One of the objectives is to understand the implications of certain Mars architectures, mission modes, vehicle configurations, and potentials for vehicle reusability. The evaluations start with the 2011-2014 conjunction missions, which were characterized by their abort-to-the-surface mission abort philosophy. Variations within this mission architecture, as well as other architectures not predicated on an abort-to-surface philosophy, were evaluated. Specific emphasis has been placed on identifying and assessing overall mission risk. Impacts that Mars mission vehicles might place upon the Space Station, if it were to be used as an assembly or operations base, are also discussed. Because of the short duration of this study, only a few propulsion elements were addressed (nuclear thermal, cryogenic oxygen-hydrogen, cryogenic oxygen-methane, and aerocapture). Primary ground rules and assumptions were taken from NASA material used in Marshall Space Flight Center's own assessment done in 1997.

  9. Cooperating reduction machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kluge, W.E.

    1983-11-01

    This paper presents a concept and a system architecture for the concurrent execution of program expressions of a concrete reduction language based on lamda-expressions. If formulated appropriately, these expressions are well-suited for concurrent execution, following a demand-driven model of computation. In particular, recursive program expressions with nonlinear expansion may, at run time, recursively be partitioned into a hierarchy of independent subexpressions which can be reduced by a corresponding hierarchy of virtual reduction machines. This hierarchy unfolds and collapses dynamically, with virtual machines recursively assuming the role of masters that create and eventually terminate, or synchronize with, slaves. The paper also proposes a nonhierarchically organized system of reduction machines, each featuring a stack architecture, that effectively supports the allocation of virtual machines to the real machines of the system in compliance with their hierarchical order of creation and termination. 25 references.
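    The unfolding master/slave hierarchy described above can be sketched in miniature. The snippet below is a toy illustration of demand-driven concurrent reduction (it does not reproduce the paper's stack machines): a nested expression is recursively partitioned, one operand is delegated to a slave "virtual reduction machine" (here a thread-pool worker) while the master reduces the other, and the hierarchy collapses as results are synchronized.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Toy demand-driven reduction: expressions are (op, lhs, rhs) tuples;
    # anything else is an atom already in normal form. Each internal node
    # spawns one slave for one operand and reduces the other itself.

    OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

    def reduce_expr(expr, pool):
        if not isinstance(expr, tuple):               # atom: nothing to reduce
            return expr
        op, lhs, rhs = expr
        future = pool.submit(reduce_expr, lhs, pool)  # slave reduces one operand
        right = reduce_expr(rhs, pool)                # master keeps the other
        return OPS[op](future.result(), right)        # synchronize and apply op

    expr = ("+", ("*", 2, ("+", 3, 4)), ("*", ("+", 1, 1), 5))  # 2*7 + 2*5
    with ThreadPoolExecutor(max_workers=8) as pool:
        print(reduce_expr(expr, pool))  # 24
    ```

    Keeping one operand on the master bounds the number of simultaneously blocked workers by the expression depth, a pragmatic stand-in for the paper's allocation of virtual machines to real ones.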

  10. The high speed interconnect system architecture and operation

    NASA Astrophysics Data System (ADS)

    Anderson, Steven C.

    The design and operation of a fiber-optic high-speed interconnect system (HSIS) being developed to meet the requirements of future avionics and flight-control hardware with distributed-system architectures are discussed. The HSIS is intended for 100-Mb/s operation of a local-area network with up to 256 stations. It comprises a bus transmission system (passive star couplers and linear media linked by active elements) and network interface units (NIUs). Each NIU is designed to perform the physical, data link, network, and transport functions defined by the ISO OSI Basic Reference Model (1982 and 1983) and incorporates a fiber-optic transceiver, a high-speed protocol based on the SAE AE-9B linear token-passing data bus (1986), and a specialized application interface unit. The operating modes and capabilities of HSIS are described in detail and illustrated with diagrams.
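    The token-passing access scheme mentioned above can be sketched in a simplified round-robin form. This is inspired by, but does not implement, the SAE AE-9B linear token-passing bus: a station may transmit only while holding the circulating token, which bounds medium-access latency across all 256 stations.

    ```python
    # Simplified round-robin token passing: each station holds a queue of
    # pending frames and may transmit exactly one frame per token visit.

    def token_rounds(queues, rounds):
        """queues: one list of pending frames per station.
        Returns (station, frame) tuples in bus transmission order."""
        order, n, token = [], len(queues), 0
        for _ in range(rounds * n):
            if queues[token]:                      # token holder sends one frame
                order.append((token, queues[token].pop(0)))
            token = (token + 1) % n                # pass the token downstream
        return order

    print(token_rounds([["a1", "a2"], ["b1"], []], rounds=2))
    ```

    Because the token visits every station once per round, no station can monopolize the bus: a backlogged station drains one frame per round while idle stations forward the token immediately.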

  11. GTAG: architecture and design of miniature transmitter with position logging for radio telemetry

    NASA Astrophysics Data System (ADS)

    Řeřucha, Šimon; Bartonička, Tomáš; Jedlička, Petr

    2011-10-01

    Radio telemetry is a well-known technique used within zoological research to study the behaviour of animal species. The use of GPS for frequent and precise position recording offers an interesting possibility for further enhancement of this method. We present our proposed architecture and design concepts for a telemetry transmitter with a GPS module, called GTAG, suited to the study of the Egyptian fruit bat (Rousettus aegyptiacus). The model group we study sets particular constraints, especially the weight limit (9 g) and the exclusion of any recharging of the power resources. We discuss the aspects of physical realization and the energy-consumption issues. We have developed a reference implementation that has already been deployed during telemetry sessions; we evaluate the experience and compare the estimated performance of our device to real data.

  12. PMG: Numerical model of a fault tolerant permanent magnet generator for high rpm applications

    NASA Astrophysics Data System (ADS)

    Bertrand, Alexandre

    The aerospace industry is confronting an increasing number of challenges these days, environmental, economic, and social ones among them. These challenges have forced the industry to turn its design philosophy toward new ways of doing things. It is in this context that the More Electrical Aircraft (MEA) concept was born. This concept aims at giving a more prominent part to electrical power in the overall installed power balance aboard aircraft (in comparison to more traditional power sources such as mechanical and hydraulic). In order to support this increasing demand for electrical power, electric power generation aboard aircraft needed reengineering. This is one of the main reasons the More Electrical Engine (MEE) concept was born: to serve the needs of the MEA philosophy. It is precisely under the MEE concept that this project takes place. This project, realized in collaboration with Pratt & Whitney Canada (PWC), is a first attempt at the electrical modelling of this new type of electrical generator designed for aircraft. The main objectives of this project are to understand the principles of operation of the New Architecture Electromagnetic Machine (NAEM) and to build a simplified model in EMTP-RV for steady-state simulations. This document contains the results obtained during the electrical modelling project of the NAEM by the author using data from PWC. The model built by PWC using MagNet, a finite-element analysis software package, was used as the reference during the project. It was possible to develop an electrical model of the generator that replicates with good accuracy the behaviour of the reference model under steady-state operation. Some technical avenues are explored in the discussion in order to list the key improvements that will need to be made to the electrical model in future work.

  13. Design and Testing of Space Telemetry SCA Waveform

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Handler, Louis M.; Quinn, Todd M.

    2006-01-01

    A Software Communications Architecture (SCA) Waveform for space telemetry is being developed at the NASA Glenn Research Center (GRC). The space telemetry waveform is implemented in a laboratory testbed consisting of general purpose processors, field programmable gate arrays (FPGAs), analog-to-digital converters (ADCs), and digital-to-analog converters (DACs). The radio hardware is integrated with an SCA Core Framework and other software development tools. The waveform design is described from both the bottom-up signal processing and top-down software component perspectives. Simulations and model-based design techniques used for signal processing subsystems are presented. Testing with legacy hardware-based modems verifies proper design implementation and dynamic waveform operations. The waveform development is part of an effort by NASA to define an open architecture for space-based reconfigurable transceivers. Use of the SCA as a reference has increased understanding of software defined radio architectures. However, since space requirements put a premium on size, mass, and power, the SCA may be impractical for today's space-ready technology. Specific requirements for an SCA waveform and other lessons learned from this development are discussed.

  14. Towards Reconstructing a Doric Column in a Virtual Construction Site

    NASA Astrophysics Data System (ADS)

    Bartzis, D.

    2017-02-01

    This paper deals with the 3D reconstruction of ancient Greek architectural members, especially the element of the Doric column. The case study for this project is the Choragic monument of Nicias on the South Slope of the Athenian Acropolis, from which a column drum, two capitals and smaller fragments are preserved. The first goal of this paper is to present some benefits of using 3D reconstruction methods not only in the documentation but also in the understanding of ancient Greek architectural members. The second goal is to take advantage of the produced point clouds. Using the CloudCompare software, comparisons are made between the actual architectural members and an "ideal" point cloud of the whole column in its original form. Searching for probable overlaps between the two point clouds could assist in estimating the original position of each member/fragment on the column. This method is expanded with more comparisons between the reference column model and other members/fragments around the Acropolis, which may not yet have been ascribed to the monument of Nicias.
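    The cloud-to-cloud comparison at the heart of this workflow can be sketched as follows. This is a hedged illustration of the general idea, not the paper's exact CloudCompare procedure: for each point of a scanned fragment, compute the distance to the nearest point of the reference ("ideal") column model; consistently small distances suggest a plausible overlap, i.e. a candidate original position. The toy geometry below is an assumption for illustration; real tools accelerate the search with octrees or KD-trees rather than brute force.

    ```python
    import math

    # Brute-force cloud-to-cloud (C2C) comparison: mean nearest-neighbour
    # distance from a fragment cloud to a reference cloud.

    def cloud_to_cloud(fragment, reference):
        return sum(min(math.dist(p, q) for q in reference)
                   for p in fragment) / len(fragment)

    # Toy data: reference points on a unit circle, fragment offset by 1 cm in x.
    reference = [(math.cos(2 * math.pi * k / 20), math.sin(2 * math.pi * k / 20), 0.0)
                 for k in range(20)]
    fragment = [(x + 0.01, y, z) for (x, y, z) in reference[:5]]
    print(f"mean C2C distance = {cloud_to_cloud(fragment, reference):.4f}")
    ```

    A fragment tried in several candidate positions would simply be rigidly transformed before each comparison, with the position minimizing the mean distance taken as the most plausible.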

  15. Using JWST Heritage to Enable a Future Large Ultra-Violet Optical Infrared Telescope

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee

    2016-01-01

    To the extent it makes sense, leverage JWST knowledge, designs, architectures, GSE. Develop a scalable design reference mission (9.2 meter). Do just enough work to understand launch break points in aperture size. Demonstrate 10 pm stability is achievable on a design reference mission. Make design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so you can evolve the instruments. Keep it room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.

  16. Decentralized Estimation and Vision-based Guidance of Fast Autonomous Systems with Guaranteed Performance in Uncertain Environments

    DTIC Science & Technology

    2013-04-22

    Following for Unmanned Aerial Vehicles Using L1 Adaptive Augmentation of Commercial Autopilots, Journal of Guidance, Control, and Dynamics, (3 2010): 0...Naira Hovakimyan. L1 Adaptive Controller for MIMO system with Unmatched Uncertainties using Modified Piecewise Constant Adaptation Law, IEEE 51st...This L1 adaptive control architecture uses data from the reference model

  17. Strategic Mobility 21 Initial Capabilities Document (ICD)

    DTIC Science & Technology

    2006-07-28

    MANDATORY ARCHITECTURE FRAMWORK DOCUMENT .......................................A-1 APPENDIX B: REFERENCES...Document July 27, 2006 JPPSP ICD Version 1.0 A-1 APPENDIX A: MANDATORY ARCHITECTURE FRAMWORK DOCUMENT Legend next page. Initial Capabilities...SM21 will combine several end-to-end Force Projection Process enablers. Some of the enablers described below are at the conceptual stage while others

  18. Construction Morphology and the Parallel Architecture of Grammar

    ERIC Educational Resources Information Center

    Booij, Geert; Audring, Jenny

    2017-01-01

    This article presents a systematic exposition of how the basic ideas of Construction Grammar (CxG) (Goldberg, 2006) and the Parallel Architecture (PA) of grammar (Jackendoff, 2002) provide the framework for a proper account of morphological phenomena, in particular word formation. This framework is referred to as Construction Morphology (CxM). As…

  19. Recursive computer architecture for VLSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treleaven, P.C.; Hopkins, R.P.

    1982-01-01

    A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is being restricted to simple, identical microcomputers, each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth-generation computers by the Japanese. 30 references.

  20. Trades Between Opposition and Conjunction Class Trajectories for Early Human Missions to Mars

    NASA Technical Reports Server (NTRS)

    Mattfeld, Bryan; Stromgren, Chel; Shyface, Hilary; Komar, David R.; Cirillo, William; Goodliff, Kandyce

    2014-01-01

    Candidate human missions to Mars, including NASA's Design Reference Architecture 5.0, have focused on conjunction-class missions with long crewed durations and minimum energy trajectories to reduce total propellant requirements and total launch mass. However, in order to progressively reduce risk and gain experience in interplanetary mission operations, it may be desirable that initial human missions to Mars, whether to the surface or to Mars orbit, have shorter total crewed durations and minimal stay times at the destination. Opposition-class missions require larger total energy requirements relative to conjunction-class missions but offer the potential for much shorter mission durations, potentially reducing risk and overall systems performance requirements. This paper will present a detailed comparison of conjunction-class and opposition-class human missions to Mars vicinity with a focus on how such missions could be integrated into the initial phases of a Mars exploration campaign. The paper will present the results of a trade study that integrates trajectory/propellant analysis, element design, logistics and sparing analysis, and risk assessment to produce a comprehensive comparison of opposition and conjunction exploration mission constructs. Included in the trade study is an assessment of the risk to the crew and the trade offs between the mission duration and element, logistics, and spares mass. The analysis of the mission trade space was conducted using four simulation and analysis tools developed by NASA. Trajectory analyses for Mars destination missions were conducted using VISITOR (Versatile ImpulSive Interplanetary Trajectory OptimizeR), an in-house tool developed by NASA Langley Research Center. Architecture elements were evaluated using EXploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), a parametric modeling tool that generates exploration architectures through an integrated systems model. 
Logistics analysis was conducted using NASA's Human Exploration Logistics Model (HELM), and sparing allocation predictions were generated via the Exploration Maintainability Analysis Tool (EMAT), which is a probabilistic simulation engine that evaluates trades in spacecraft reliability and sparing requirements based on spacecraft system maintainability and reparability.

  1. Ares V Utilization in Support of a Human Mission to Mars

    NASA Technical Reports Server (NTRS)

    Holladay, J. B.; Jaap, J. P.; Pinson, R. M.; Creech, S. D.; Ryan, R. M.; Monk, T. S.; Baggett. K. E.; Runager, M. D.; Dux, I. J.; Hack, K. J.; hide

    2010-01-01

    During the analysis cycles of Phase A-Cycle 3 (PA-C3) and the follow-on 8-wk minicycle of PA-C3', the Ares V team assessed the Ares V PA-C3D configuration to the Mars Design Reference Mission as defined in the Constellation Architecture Requirements Document and further described in Mars Design Reference Architecture 5.0 (DRA 5.0) that was publicly released in July 2009. The ability to support the reference approach for the crewed Mars mission was confirmed through this analysis (7-launch nuclear thermal propulsion (NTP) architecture) and the reference chemical approach as defined in DRA 5.0 (11- or 12-launch chemical propulsion module approach). Additional chemical propulsion options were defined that utilized additional technology investments (primarily in-space cryogenic propellant transfer) that allowed for the same mission to be accomplished with 9 launches rather than the 11 or 12, as documented in DRA 5.0 and associated follow-on activities. This nine-launch chemical propulsion approach showed a unique ability to decouple the architecture from major technological developments (such as zero-boiloff technology or the development of NTP stages) and allowed for a relaxing of the infrastructure investments required to support a very rapid launch rate (30-day launch spacing as documented in DRA 5.0). As an enhancing capability, it also shows promise in allowing for and incorporating the development of a commercial market for cryogenic propellant delivery on orbit, without placing such development on the critical path of beyond low-Earth orbit exploration. The ability of Ares V to support all of the aforementioned options and discussion of key forward work that is required to fully understand the complexities and challenges presented by the Mars mission is further documented herein.

  2. WebGIS based community services architecture by griddization managements and crowdsourcing services

    NASA Astrophysics Data System (ADS)

    Wang, Haiyin; Wan, Jianhua; Zeng, Zhe; Zhou, Shengchuan

    2016-11-01

    Along with the fast economic development of cities, rapid urbanization, population surge, in China, the social community service mechanisms need to be rationalized and the policy standards need to be unified, which results in various types of conflicts and challenges for community services of government. Based on the WebGIS technology, the article provides a community service architecture by gridding management and crowdsourcing service. The WEBGIS service architecture includes two parts: the cloud part and the mobile part. The cloud part refers to community service centres, which can instantaneously response the emergency, visualize the scene of the emergency, and analyse the data from the emergency. The mobile part refers to the mobile terminal, which can call the centre, report the event, collect data and verify the feedback. This WebGIS based community service systems for Huangdao District of Qingdao, were awarded the “2015’ national innovation of social governance case of typical cases”.

  3. ATLAST and JWST Segmented Telescope Design Considerations

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee

    2016-01-01

    To the extent it makes sense, leverage JWST (James Webb Space Telescope) knowledge, designs, architectures. GSE (Ground Support Equipment) good starting point. Develop a full end-to-end architecture that closes. Try to avoid recreating the wheel except where needed. Optimize from there (mainly for stability and coronagraphy). Develop a scalable design reference mission (9.2 meters). Do just enough work to understand launch break points in aperture size Demonstrate 10 pm (phase modulation) stability is achievable on a design reference mission. A really key design driver is the most robust stability possible!!! Make design compatible with starshades. While segmented coronagraphs with high throughput and large bandpasses are important, make the system serviceable so you can evolve the instruments. Keep it room temperature to minimize the costs associated with cryo. Focus resources on the contrast problem. Start with the architecture and connect it to the technology needs.

  4. SensoTube: A Scalable Hardware Design Architecture for Wireless Sensors and Actuators Networks Nodes in the Agricultural Domain

    PubMed Central

    Piromalis, Dimitrios; Arvanitis, Konstantinos

    2016-01-01

    Wireless Sensor and Actuators Networks (WSANs) constitute one of the most challenging technologies with tremendous socio-economic impact for the next decade. Functionally and energy optimized hardware systems and development tools maybe is the most critical facet of this technology for the achievement of such prospects. Especially, in the area of agriculture, where the hostile operating environment comes to add to the general technological and technical issues, reliable and robust WSAN systems are mandatory. This paper focuses on the hardware design architectures of the WSANs for real-world agricultural applications. It presents the available alternatives in hardware design and identifies their difficulties and problems for real-life implementations. The paper introduces SensoTube, a new WSAN hardware architecture, which is proposed as a solution to the various existing design constraints of WSANs. The establishment of the proposed architecture is based, firstly on an abstraction approach in the functional requirements context, and secondly, on the standardization of the subsystems connectivity, in order to allow for an open, expandable, flexible, reconfigurable, energy optimized, reliable and robust hardware system. The SensoTube implementation reference model together with its encapsulation design and installation are analyzed and presented in details. Furthermore, as a proof of concept, certain use cases have been studied in order to demonstrate the benefits of migrating existing designs based on the available open-source hardware platforms to SensoTube architecture. PMID:27527180

  5. A discrete time-varying internal model-based approach for high precision tracking of a multi-axis servo gantry.

    PubMed

    Zhang, Zhen; Yan, Peng; Jiang, Huan; Ye, Peiqing

    2014-09-01

    In this paper, we consider the discrete time-varying internal model-based control design for high precision tracking of complicated reference trajectories generated by time-varying systems. Based on a novel parallel time-varying internal model structure, asymptotic tracking conditions for the design of internal model units are developed, and a low order robust time-varying stabilizer is further synthesized. In a discrete time setting, the high precision tracking control architecture is deployed on a Voice Coil Motor (VCM) actuated servo gantry system, where numerical simulations and real time experimental results are provided, achieving the tracking errors around 3.5‰ for frequency-varying signals. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  6. a Range Based Method for Complex Facade Modeling

    NASA Astrophysics Data System (ADS)

    Adami, A.; Fregonese, L.; Taffurelli, L.

    2011-09-01

    3d modelling of Architectural Heritage does not follow a very well-defined way, but it goes through different algorithms and digital form according to the shape complexity of the object, to the main goal of the representation and to the starting data. Even if the process starts from the same data, such as a pointcloud acquired by laser scanner, there are different possibilities to realize a digital model. In particular we can choose between two different attitudes: the mesh and the solid model. In the first case the complexity of architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the other -opposite- case the 3d digital model can be realized by the use of simple geometrical shapes, by the use of sweeping algorithm and the Boolean operations. Obviously these two models are not the same and each one is characterized by some peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus an approximate and more simple solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we want to suggest a semiautomatic process to build 3d digital models of the facades of complex architecture to be used for example in city models or in other large scale representations. This way of modelling guarantees also to obtain small files to be published on the web or to be transmitted. The modelling procedure starts from laser scanner data which can be processed in the well known way. Usually more than one scan is necessary to describe a complex architecture and to avoid some shadows on the facades. 
These have to be registered in a single reference system by the use of targets which are surveyed by topography and then to be filtered in order to obtain a well controlled and homogeneous point cloud of the complex architecture. From the point cloud we can extract a false colour map depending on the distance of each point from the average plane. In this way we can represent each point of the facades by a height map in grayscale. In this operation it is important to define the scale of the final result in order to set the correct pixel size in the map. The following step is concerning the use of a modifier which is well-known in computer graphics. In fact the modifier Displacement allows to simulate on a planar surface the original roughness of the object according to a grayscale map. The value of gray is read by the modifier as the distance from the reference plane and it represents the displacement of the corresponding element of the virtual plane. Similar to the bump map, the displacement modifier does not only simulate the effect, but it really deforms the planar surface. In this way the 3d model can be use not only in a static representation, but also in dynamic animation or interactive application. The setting of the plane to be deformed is the most important step in this process. In 3d Max the planar surface has to be characterized by the real dimension of the façade and also by a correct number of quadrangular faces which are the smallest part of the whole surface. In this way we can consider the modified surface as a 3d raster representation where each quadrangular face (corresponding to traditional pixel) is displaced according the value of gray (= distance from the plane). This method can be applied in different context, above all when the object to be represented can be considered as a 2,5 dimension such as facades of architecture in city model or large scale representation. 
But also it can be used to represent particular effect such as deformation of walls in a complete 3d way.

  7. Human Exploration of Phobos

    NASA Technical Reports Server (NTRS)

    Abercromby, Andrew F. J.; Chappell, Steven P.; Gernhardt, Michael L.; Lee, David E.; Howe, A. Scott

    2015-01-01

    This study developed, analyzed, and compared mission architectures for human exploration of Mars' Moons within the context of an Evolvable Mars Campaign. METHODS: All trades assumed conjunction class missions to Phobos (approximately 500 days in Mars system) as it was considered the driving case for the transportation architecture. All architectures assumed that the Mars Transit Habitat would remain in a High Mars Orbit with crewmembers transferring between HMO and Phobos in a small crew taxi vehicle. A reference science / exploration program was developed including performance of a standard set of tasks at 55 locations on the Phobos surface. Detailed EVA timelines were developed using realistic flight rules to accomplish the reference science tasks using exploration systems ranging from jetpacks to multi-person pressurized excursion vehicles combined with Phobos surface and orbital (L1, L4/L5, 20km Distant Retrograde Orbit) habitat options. Detailed models of propellant mass, crew time, science productivity, radiation exposure, systems and consumables masses, and other figures of merit were integrated to enable quantitative comparison of different architectural options. Options for pre-staging assets using solar electric propulsion (SEP) vs. delivering all systems with the crew were also evaluated. Seven discrete mission architectures were evaluated. RESULTS: The driving consideration for habitat location (Phobos surface vs. orbital) was radiation exposure, with an estimated reduction in cumulative mission radiation exposure of up to 34% (vs. Mars orbital mission) when the habitat is located on the Phobos surface, compared with only 3-6% reduction for a habitat in a 20km DRO. The exploration utility of lightweight unpressurized excursion vehicles was limited by the need to remain within 20 minutes of Solar Particle Event radiation protection combined with complex GN&C systems required by the non-intuitive and highly-variable gravitational environment. 
Two-person pressurized excursion vehicles as well as mobile surface habitats offer significant exploration capability and operational benefits compared with unpressurized EVA mobility systems at the cost of increased system and propellant mass. Mechanical surface translation modes (i.e. hopping) were modeled and offer potentially significant propellant savings and the possibility of extended exploration operations between crewed missions. Options for extending the utilization of the crew taxi vehicle were examined, including use as an exploration asset for Phobos surface exploration (when combined with an alternate mobility system) and as an EVA platform, both on Phobos and for contingency EVA on the Mars Transit Habitat. CONCLUSIONS: Human exploration of Phobos offers a scientifically meaningful first step towards human Mars surface missions that develops and validates transportation, habitation, and exploration systems and operations in advance of the Mars landing systems.

  8. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E- 70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robusteness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: the PUS services tailoring activity for a specific mission. the mission-specific PUS services configuration. the generation the UML model of the software building block implementing the mission-specific PUS services and the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. 
The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the characterization of the reference avionics architecture and of the reference on- board software high-level architecture.

  9. Selected Reference Books of 1992.

    ERIC Educational Resources Information Center

    McIlvaine, Eileen

    1993-01-01

    Presents an annotated bibliography of 40 recent scholarly and general works of interest to reference workers in university libraries. Topics areas covered include philosophy, religion, language, literature, architecture, economics, law, area studies, Russia and the Soviet Union, women's studies, and Christopher Columbus. New editions and…

  10. Selected Reference Books of 1993-1994.

    ERIC Educational Resources Information Center

    McIlvaine, Eileen

    1994-01-01

    Offers brief, critical reviews of recent scholarly and general works of interest to reference workers in university libraries. Titles covered include dictionaries, databases, religion, literature, music, dance, art and architecture, business, political science, social issues, and history. Brief descriptions of new editions and supplements for…

  11. The Use of Supporting Documentation for Information Architecture by Australian Libraries

    ERIC Educational Resources Information Center

    Hider, Philip; Burford, Sally; Ferguson, Stuart

    2009-01-01

    This article reports the results of an online survey that examined the development of information architecture of Australian library Web sites with reference to documented methods and guidelines. A broad sample of library Web managers responded from across the academic, public, and special sectors. A majority of libraries used either in-house or…

  12. Director, Platform and Audience.

    ERIC Educational Resources Information Center

    Meyer, Richard D.

    The open stage is discussed both as architecture and as part of a new theatrical style. In reference to use of the open stage, emphasis is given to specifics with which the director must deal, to special problems of the actor, to the approach to blocking a play, and to the open stage as "theatrical experience". The architectural advantage of the…

  13. MIT CSAIL and Lincoln Laboratory Task Force Report

    DTIC Science & Technology

    2016-08-01

    projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics , wireless communications...spanning several areas of CSAIL concentration, including robotics, big data analytics , wireless communications, computing architectures and...to machine learning systems and algorithms, such as recommender systems, and “Big Data ” analytics . Advanced computing architectures broadly refer to

  14. Human Exploration of Mars Design Reference Architecture 5.0, Addendum #2

    NASA Technical Reports Server (NTRS)

    Drake, Bret G. (Editor); Watts Kevin D. (Editor)

    2014-01-01

    This report serves as the second Addendum to NASA-SP-2009-566, "Human Exploration of Mars Design Reference Architecture 5.0." The data and descriptions contained within this Addendum capture some of the key assessments and studies produced since publication of the original document, predominately covering those conducted from 2009 through 2012. The assessments and studies described herein are for the most part independent stand-alone contributions. Effort has not been made to assimilate the findings to provide an updated integrated strategy. That is a recognized future effort. This report should not be viewed as constituting a formal plan for the human exploration of Mars.

  15. Parametric trade studies on a Shuttle 2 launch system architecture

    NASA Technical Reports Server (NTRS)

    Stanley, Douglas O.; Talay, Theodore A.; Lepsch, Roger A.; Morris, W. Douglas; Naftel, J. Christopher; Cruz, Christopher I.

    1991-01-01

    A series of trade studies are presented on a complementary architecture of launch vehicles as a part of a study often referred to as Shuttle-2. The results of the trade studies performed on the vehicles of a reference Shuttle-2 mixed fleet architecture have provided an increased understanding of the relative importance of each of the major vehicle parameters. As a result of trades on the reference booster-orbiter configuration with a methane booster, the study showed that 60 percent of the total liftoff thrust should be on the booster and 40 percent on the orbiter. It was also found that the liftoff thrust to weight ratio (T/W) on the booster-orbiter should be 1.3. This leads to a low dry weight and still provides enough thrust to allow the design of a heavy lift architecture. As a result of another trade study, the dry weight of the reference booster-orbiter was chosen for a variety of operational considerations. Other trade studies on the booster-orbiter demonstrate that the cross feeding of propellant during boost phase is desirable and that engine-out capability from launch to orbit is worth the performance penalty. Technology assumptions made during the Shuttle-2 design were shown to be approx. equivalent to a 25 percent across the board weight reduction over the Space Shuttle technology. The vehicles of the Shuttle-2 architecture were also sized for a wide variety of payloads and missions to different orbits. Many of these same parametric trades were also performed on completely liquid hydrogen fueled fully reusable concepts. If a booster-orbiter is designed using liquid hydrogen engines on both the booster and orbiter, the total vehicle dry weight is only 3.0 percent higher than the reference dual-fuel booster-orbiter, and the gross weight is 3.8 percent less. For this booster-orbiter vehicle, a liftoff T/W of 1.3, a thrust of about 60 percent on the booster, and a Mach staging number of 3 all proved to be desirable. 
This modest dry weight increase for a liquid hydrogen fueled Shuttle-2 system should be more than offset by the elimination of the entire hydrocarbon engine development program and the savings in operation cost realized by the elimination of an entire fuel type.

  16. Space Generic Open Avionics Architecture (SGOAA): Overview

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1992-01-01

    A space generic open avionics architecture created for NASA is described. It will serve as the basis for entities in spacecraft core avionics, capable of being tailored by NASA for future space program avionics ranging from small vehicles such as Moon ascent/descent vehicles to large ones such as Mars transfer vehicles or orbiting stations. The standard consists of: (1) a system architecture; (2) a generic processing hardware architecture; (3) a six class architecture interface model; (4) a system services functional subsystem architectural model; and (5) an operations control functional subsystem architectural model.

  17. Human Exploration of Mars Design Reference Architecture 5.0

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.

    2009-01-01

    This document reviews the Design Reference Architecture (DRA) for human exploration of Mars. The DRA represents the current best strategy for human missions. The DRA is not a formal plan, but provides a vision and context to tie current systems and technology developments to potential missions to Mars, and it also serves as a benchmark against which alternative architectures can be measured. The document also reviews the objectives and products of the 2007 study that was to update NASA's human Mars mission reference architecture, assess strategic linkages between lunar and Mars strategies, develop an understanding of methods for reducing cost/risk of human missions through investment in research, technology development and synergy with other exploration plans. There is also a review of the process by which the DRA will continue to be refined. The unique capacities of human exploration is reviewed. The possible goals and objectives of the first three human missions are presented, along with the recommendation that the mission involve a long stay visiting multiple sites.The deployment strategy is outlined and diagrammed including the pre-deployment of the many of the material requirements, and a six crew travel to Mars on a six month trajectory. The predeployment and the Orion crew vehicle are shown. The ground operations requirements are also explained. Also the use of resources found on the surface of Mars is postulated. The Mars surface exploration strategy is reviewed, including the planetary protection processes that are planned. Finally a listing of the key decisions and tenets is posed.

  18. Design and Development of an Equipotential Voltage Reference (Grounding) System for a Low-Cost Rapid-Development Modular Spacecraft Architecture

    NASA Technical Reports Server (NTRS)

    Lukash, James A.; Daley, Earl

    2011-01-01

    This work describes the design and development effort to adapt rapid-development space hardware by creating a ground system using solutions of low complexity, mass, & cost. The Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft is based on the modular common spacecraft bus architecture developed at NASA Ames Research Center. The challenge was building upon the existing modular common bus design and development work and improving the LADEE spacecraft design by adding an Equipotential Voltage Reference (EVeR) system, commonly referred to as a ground system. This would aid LADEE in meeting Electromagnetic Environmental Effects (E3) requirements, thereby making the spacecraft more compatible with itself and its space environment. The methods used to adapt existing hardware are presented, including provisions which may be used on future spacecraft.

  19. ESPC Common Model Architecture

    DTIC Science & Technology

    2014-09-30

    1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. ESPC Common Model Architecture Earth System Modeling...Operational Prediction Capability (NUOPC) was established between NOAA and Navy to develop common software architecture for easy and efficient...development under a common model architecture and other software-related standards in this project. OBJECTIVES NUOPC proposes to accelerate

  20. A Proposed Pattern of Enterprise Architecture

    DTIC Science & Technology

    2013-02-01

    consistent architecture descriptions. UPDM comprises extensions to both OMG’s Unified Modelling Language (UML) and Systems Modelling Language ( SysML ...those who use UML and SysML . These represent significant advancements that enable architecture trade-off analyses, architecture model execution...Language ( SysML ), and thus provides for architectural descriptions that contain a rich set of (formally) connected DoDAF/MoDAF viewpoints expressed

  1. The System of Systems Architecture Feasibility Assessment Model

    DTIC Science & Technology

    2016-06-01

    OF SYSTEMS ARCHITECTURE FEASIBILITY ASSESSMENT MODEL by Stephen E. Gillespie June 2016 Dissertation Supervisor Eugene Paulo THIS PAGE...Dissertation 4. TITLE AND SUBTITLE THE SYSTEM OF SYSTEMS ARCHITECTURE FEASIBILITY ASSESSMENT MODEL 5. FUNDING NUMBERS 6. AUTHOR(S) Stephen E...SoS architecture feasibility assessment model (SoS-AFAM). Together, these extend current model- based systems engineering (MBSE) and SoS engineering

  2. Semi-Markov adjunction to the Computer-Aided Markov Evaluator (CAME)

    NASA Technical Reports Server (NTRS)

    Rosch, Gene; Hutchins, Monica A.; Leong, Frank J.; Babcock, Philip S., IV

    1988-01-01

    The rule-based Computer-Aided Markov Evaluator (CAME) program was expanded in its ability to incorporate the effect of fault-handling processes into the construction of a reliability model. The fault-handling processes are modeled as semi-Markov events and CAME constructs and appropriate semi-Markov model. To solve the model, the program outputs it in a form which can be directly solved with the Semi-Markov Unreliability Range Evaluator (SURE) program. As a means of evaluating the alterations made to the CAME program, the program is used to model the reliability of portions of the Integrated Airframe/Propulsion Control System Architecture (IAPSA 2) reference configuration. The reliability predictions are compared with a previous analysis. The results bear out the feasibility of utilizing CAME to generate appropriate semi-Markov models to model fault-handling processes.

  3. New optical architecture for holographic data storage system compatible with Blu-ray Disc™ system

    NASA Astrophysics Data System (ADS)

    Shimada, Ken-ichi; Ide, Tatsuro; Shimano, Takeshi; Anderson, Ken; Curtis, Kevin

    2014-02-01

    A new optical architecture for holographic data storage system which is compatible with a Blu-ray Disc™ (BD) system is proposed. In the architecture, both signal and reference beams pass through a single objective lens with numerical aperture (NA) 0.85 for realizing angularly multiplexed recording. The geometry of the architecture brings a high affinity with an optical architecture in the BD system because the objective lens can be placed parallel to a holographic medium. Through the comparison of experimental results with theory, the validity of the optical architecture was verified and demonstrated that the conventional objective lens motion technique in the BD system is available for angularly multiplexed recording. The test-bed composed of a blue laser system and an objective lens of the NA 0.85 was designed. The feasibility of its compatibility with BD is examined through the designed test-bed.

  4. Can architecture be barbaric?

    PubMed

    Hürol, Yonca

    2009-06-01

    The title of this article is adapted from Theodor W. Adorno's famous dictum: 'To write poetry after Auschwitz is barbaric.' After the catastrophic earthquake in Kocaeli, Turkey on the 17th of August 1999, in which more than 40,000 people died or were lost, Necdet Teymur, who was then the dean of the Faculty of Architecture of the Middle East Technical University, referred to Adorno in one of his 'earthquake poems' and asked: 'Is architecture possible after 17th of August?' The main objective of this article is to interpret Teymur's question in respect of its connection to Adorno's philosophy, with a view to making a contribution to the politics and ethics of architecture in Turkey. Teymur's question helps in providing a new interpretation of a critical approach to architecture and architectural technology through Adorno's philosophy. The paper also presents a discussion of Adorno's dictum, which serves a better understanding of its universality/particularity.

  5. Deploying electromagnetic particle-in-cell (EM-PIC) codes on Xeon Phi accelerator boards

    NASA Astrophysics Data System (ADS)

    Fonseca, Ricardo

    2014-10-01

    The complexity of the phenomena involved in several relevant plasma physics scenarios, where highly nonlinear and kinetic processes dominate, makes purely theoretical descriptions impossible. Further understanding of these scenarios requires detailed numerical modeling, but fully relativistic particle-in-cell codes such as OSIRIS are computationally intensive. The quest towards Exaflop computer systems has led to the development of HPC systems based on add-on accelerator cards, such as GPGPUs and, more recently, the Xeon Phi accelerators that power the current number 1 system in the world. These cards, based on the Intel Many Integrated Core (MIC) architecture, offer peak theoretical performances of >1 TFlop/s for general purpose calculations on a single board, and are receiving significant attention as an attractive alternative to CPUs for plasma modeling. In this work we report on our efforts towards the deployment of an EM-PIC code on a Xeon Phi architecture system. We will focus on the parallelization and vectorization strategies followed, and present a detailed evaluation of code performance in comparison with the CPU code.
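As a minimal, hypothetical sketch of the kind of inner loop such codes must vectorize (not the OSIRIS implementation), a 1D particle push over structure-of-arrays data looks like:

```python
# Structure-of-arrays layout (separate x and v arrays, rather than an array of
# particle structs) is the standard change that lets wide SIMD units process
# many particles per instruction.
def push(x, v, e_field, qm, dt):
    """Push particles in a uniform electric field e_field (illustrative only)."""
    for i in range(len(x)):
        v[i] += qm * e_field * dt   # velocity update (kick)
        x[i] += v[i] * dt           # position update (drift)
    return x, v

x = [0.0, 1.0, 2.0]
v = [0.0, 0.0, 0.0]
x, v = push(x, v, e_field=1.0, qm=1.0, dt=0.1)
print(x, v)
```

In a real EM-PIC code this loop is fissioned, aligned, and interleaved with field interpolation and current deposition, which is where most of the vectorization effort reported here goes.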

  6. DAsHER CD: Developing a Data-Oriented Human-Centric Enterprise Architecture for EarthCube

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Yu, M.; Sun, M.; Qin, H.; Robinson, E.

    2015-12-01

    One of the biggest challenges facing Earth scientists is discovering, accessing, and sharing resources in the desired fashion. EarthCube aims to enable geoscientists to address these challenges by fostering community-governed efforts to develop a common cyberinfrastructure for collecting, accessing, analyzing, sharing and visualizing all forms of data and related resources, through the use of advanced technological and computational capabilities. Here we design an Enterprise Architecture (EA) for EarthCube to facilitate knowledge management, communication and human collaboration in pursuit of unprecedented data sharing across the geosciences. The design results will provide EarthCube a reference framework for developing geoscience cyberinfrastructure built collaboratively by different stakeholders, and for identifying topics of high interest to the community. The development of this EarthCube EA framework leverages popular frameworks such as Zachman, Gartner, DoDAF, and FEAF. The science driver of the design is the needs of the EarthCube community, including analyzed user requirements from EarthCube End User Workshop reports and EarthCube working group roadmaps, and feedback and comments from scientists gathered at organized workshops. The final product of this Enterprise Architecture is a four-volume reference document: 1) Volume one comprises an executive summary of the EarthCube architecture, serving as an overview in the initial phases of architecture development; 2) Volume two is the major body of the design product, outlining all the architectural design components and viewpoints; 3) Volume three provides a taxonomy of the EarthCube enterprise augmented with semantic relations; 4) Volume four describes an example of utilizing this architecture for a geoscience project.

  7. A Phobos-Deimos Mission as an Element of the NASA Mars Design Reference Architecture 5.0

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J.

    2011-01-01

    NASA has conducted a series of mission studies over the past 25 years examining the eventual exploration of the surface of Mars by humans. The latest version of this evolutionary series of design reference missions/architectures, Design Reference Architecture 5 or DRA-5, was completed in 2007. This paper examines the implications of including, as an element of this reference architecture, a human mission to explore the moons of Mars and teleoperate robots in various locations, but not to land the human crews on Mars. Such a mission has been proposed several times during the same 25-year evolution leading up to the completion of DRA-5, primarily as a mission for testing the in-space vehicles and operations while surface vehicles and landers are under development. But such a precursor or test mission has never been explicitly included as an element of this architecture. This paper will first summarize the key features of DRA-5 to provide context for the remainder of the assessment, including a description of the in-space vehicles that would be the subject of a shakedown test during the Mars orbital mission. A decision tree will be used to illustrate the factors that will be analyzed, and the sequence in which they will be addressed, for this assessment. The factors analyzed include the type of interplanetary transfer orbit (opposition class versus conjunction class), the type of parking orbit (circular versus elliptical), and the type of propulsion technology (high-thrust chemical versus nuclear thermal rocket). The manner in which each of these factors impacts an individual mission will be described. In addition to the direct impact of these factors, additional considerations impacting crew health and overall programmatic outcomes will be discussed. Numerical results for each of the factors in the decision tree will be grouped with derived qualitative impacts from crew health and programmatic considerations. These quantitative and qualitative results will be summarized in a pros/cons table as a summary of this analysis.

  8. Space and Architecture's Current Line of Research? A Lunar Architecture Workshop With An Architectural Agenda.

    NASA Astrophysics Data System (ADS)

    Solomon, D.; van Dijk, A.

    The "2002 ESA Lunar Architecture Workshop" (June 3-16, ESTEC, Noordwijk, NL and V2_Lab, Rotterdam, NL) is the first-of-its-kind workshop for exploring the design of extra-terrestrial (infra)structures for human exploration of the Moon and Earth-like planets, introducing 'architecture's current line of research' and adopting architectural criteria. The workshop intends to inspire, engage and challenge 30-40 European masters students from the fields of aerospace engineering, civil engineering, architecture, and art to design, validate and build models of (infra)structures for Lunar exploration. The workshop also aims to open up new physical and conceptual terrain for an architectural agenda within the field of space exploration. A sound introduction to the issues, conditions, resources, technologies, and architectural strategies will initiate the workshop participants into the context of lunar architecture scenarios. In my paper and presentation about the development of the ideology behind this workshop, I will comment on the following questions: * Can the contemporary architectural agenda offer solutions that affect the scope of space exploration? It certainly has had an impression on urbanization and colonization of previously sparsely populated parts of Earth. * Does the current line of research in architecture offer any useful strategies for combining scientific interests, commercial opportunity, and public space? What can be learned from 'state of the art' architecture that blends commercial and public programmes within one location? * Should commercial 'colonisation' projects in space be required to provide public space in a location where all humans present are likely to be there in a commercial context? Is the wave in Koolhaas' new Prada flagship store just a gesture to public space, or does this new concept in architecture and shopping evolve the public space?
* What can we learn about designing (infra-)structures on the Moon or any other space context that will be useful on Earth on a conceptual and practical level? * In what ways could architecture's field of reference offer building on the Moon (and other celestial bodies) a paradigm shift? In addition to their models and designs, workshop participants will begin authoring a design recommendation for the building of (infra-)structures and habitats on celestial bodies, in particular the Moon and Mars. The design recommendation, a substantiated aesthetic code of conduct (not legally binding), will address long-term planning and incorporate issues of sustainability, durability, bio-diversity, infrastructure, CHANGE, and techniques that lend themselves to Earth-bound applications. It will also address the cultural implications that architectural design might have within the context of space exploration. The design recommendation will ultimately be presented for peer review to both the space and architecture communities. What would the endorsement of such a document by the architectural community mean to the space community? The Lunar Architecture Workshop is conceptualised, produced and organised by (in alphabetical order): Alexander van Dijk, Art Race in Space; Barbara Imhof, ESCAPE*spHERE, Vienna University of Technology, Institute for Design and Building Construction, Vienna; Bernard Foing, ESA SMART1 Project Scientist; Susmita Mohanty, MoonFront, LLC; Hans Schartner, Vienna University of Technology, Institute for Design and Building Construction; Debra Solomon, Art Race in Space, Dutch Art Institute; Paul van Susante, Lunar Explorers Society. Workshop locations: ESTEC, Noordwijk, NL and V2_Lab, Rotterdam, NL. Workshop dates: June 3-16, 2002 (a Call for Participation will be made in March-April 2002).

  9. The CMIP5 archive architecture: A system for petabyte-scale distributed archival of climate model data

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Cinquini, Luca; Lawrence, Bryan

    2010-05-01

    The Phase 5 Coupled Model Intercomparison Project (CMIP5) will produce a petabyte-scale archive of climate data relevant to future international assessments of climate science (e.g., the IPCC's 5th Assessment Report scheduled for publication in 2013). The infrastructure for the CMIP5 archive must meet many challenges to support this ambitious international project. We describe here the distributed software architecture being deployed worldwide to meet these challenges. The CMIP5 architecture extends the Earth System Grid (ESG) distributed architecture of Datanodes, providing data access and visualisation services, and Gateways, providing the user interface including registration, search and browse services. Additional features developed for CMIP5 include a publication workflow incorporating quality control and metadata submission, data replication, version control, update notification and production of citable metadata records. Implementation of these features has been driven by the requirements of reliable global access to over 1 PB of data and consistent citability of data and metadata. Central to the implementation is the concept of Atomic Datasets that are identifiable through a Data Reference Syntax (DRS). Atomic Datasets are immutable to allow them to be replicated and tracked whilst maintaining data consistency. However, since occasional errors in data production and processing are inevitable, new versions can be published and users notified of these updates. As deprecated datasets may be the target of existing citations, they remain visible in the system. Replication of Atomic Datasets is designed to improve regional access and provide fault tolerance. Several datanodes in the system are designated replicating nodes and hold replicas of a portion of the archive expected to be of broad interest to the community. Gateways provide a system-wide interface to users, where they can track the version history and location of replicas to select the most appropriate location for download. In addition to meeting the immediate needs of CMIP5, this architecture provides a basis for the Earth System Modeling e-infrastructure being further developed within the EU FP7 IS-ENES project.
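Central to replication and version tracking is the structured dataset identifier; a hypothetical DRS parser (field order follows the published CMIP5 DRS, but the code and the example id are illustrative, not the project's software) might look like:

```python
# Field order follows the CMIP5 Data Reference Syntax document (illustrative;
# real tooling also validates each component against controlled vocabularies).
DRS_FIELDS = ["activity", "product", "institute", "model", "experiment",
              "frequency", "realm", "mip_table", "ensemble", "version"]

def parse_drs(dataset_id):
    parts = dataset_id.split(".")
    if len(parts) != len(DRS_FIELDS):
        raise ValueError("unexpected number of DRS components")
    return dict(zip(DRS_FIELDS, parts))

ds = parse_drs("cmip5.output1.MOHC.HadGEM2-ES.rcp45.mon.atmos.Amon.r1i1p1.v20110101")
print(ds["model"], ds["version"])
```

Because Atomic Datasets are immutable, publishing a correction means minting a new id that differs only in the version component, which is what lets replicas be tracked and users be notified of updates.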

  10. Designing from Cinema: Film as Trigger of the Creative Process in Architecture

    ERIC Educational Resources Information Center

    Bergera, Iñaki

    2018-01-01

    The present paper examines, in a case study format, the use of films, short films and audiovisual documentaries as reasoning and references for design assignments during the first years of an architectural degree course. The aim of this fruitful and comparable experience is not so much to study and verify the well-known synergies between film and…

  11. Architectural Contributions to Effective Programming for the Mentally Retarded. Conference Report of the Architectural Institute (Denver, Colorado, May 15-16, 1967).

    ERIC Educational Resources Information Center

    American Association on Mental Deficiency, Washington, DC.

    Conference participants consider the role of the architect and the programer in planning and constructing facilities for the mentally handicapped. David Rosen discusses the design problems of state institutions with particular reference to the Woodbridge State School in New Jersey; Gunnar Dybwad describes the need of the programer for the…

  12. The Acoustical Properties of the Polyurethane Concrete Made of Oyster Shell Waste Comparing Other Concretes as Architectural Design Components

    NASA Astrophysics Data System (ADS)

    Setyowati, Erni; Hardiman, Gagoek; Purwanto

    2018-02-01

    This research aims to determine the acoustical properties of a concrete material made of polyurethane and oyster shell waste as both fine and coarse aggregate, compared with other concrete mortars. Architecture needs aesthetic materials, so innovation in architectural materials should be driven by materials research for building design. The DOE method was used, mixing cement, oyster shell, sand, and polyurethane in proportions of 160 ml : 40 ml : 100 ml : 120 ml, respectively. Following the results of previous research, cement consumption was reduced by up to 20% to keep the concept of green material. This study compared three different mortar compositions, namely portland cement concrete with gravel (PCG), polyurethane concrete with oyster shell (PCO) and concrete with plastics aggregate (PCP). The acoustical tests were conducted according to the ASTM E413-04 standard. The results showed that polyurethane concrete with oyster shell waste aggregate has an absorption coefficient of 0.52 and an STL of 63 dB, and has a more beautiful appearance when pressed into moulding. It can be concluded that polyurethane concrete with oyster shell aggregate (PCO) is well suited as an architectural acoustics component.

  13. Selected Reference Books of 1998.

    ERIC Educational Resources Information Center

    McIlvaine, Eileen

    1999-01-01

    Reviews a selection of recent scholarly and general reference works under the categories of Periodicals and Newspapers, Philosophy, Literature, Film and Radio, Art and Architecture, Music, Political Science, Women's Studies, and History. A brief summary of new editions of standard works is provided at the end of the articles. (AEF)

  14. Collections Care: A Basic Reference Shelflist.

    ERIC Educational Resources Information Center

    de Torres, Amparo R., Ed.

    This is an extensive bibliography of reference sources--i.e., books and articles--that relate to the care and conservation of library, archival, and museum collections. Bibliographies are presented under the following headings: (1) General Information; (2) Basic Collections Care; (3) Architectural Conservation; (4) Collections Management: Law,…

  15. Cryogenic Pupil Alignment Test Architecture for Aberrated Pupil Images

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Kubalak, David A.; Antonille, Scott; Ohl, Raymond; Hagopian, John G.

    2009-01-01

    A document describes a cryogenic test architecture for the James Webb Space Telescope (JWST) integrated science instrument module (ISIM). The ISIM element primarily consists of a mechanical metering structure, three science instruments, and a fine guidance sensor. One of the critical optomechanical alignments is the co-registration of the optical telescope element (OTE) exit pupil with the entrance pupils of the ISIM instruments. The test architecture has been developed to verify that the ISIM element will be properly aligned with the nominal OTE exit pupil when the two elements come together. The architecture measures three of the most critical pupil degrees of freedom during optical testing of the ISIM element. The pupil measurement scheme makes use of specularly reflective pupil alignment references located inside the JWST instruments, ground support equipment that contains a pupil imaging module, an OTE simulator, and pupil viewing channels in two of the JWST flight instruments. Pupil alignment references (PARs) are introduced into the instrument, and their reflections are checked using the instrument's mirrors. After the pupil imaging module (PIM) captures a reflected PAR image, the image is analyzed to determine the relative alignment offset. The instrument pupil alignment references are specularly reflective mirrors with non-reflective fiducials, which makes the test architecture feasible. The instrument channels have fairly large fields of view, allowing PAR tip/tilt tolerances on the order of 0.5°.
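The offset measurement reduces to locating PAR features in the PIM image; a toy intensity-weighted centroid (hypothetical, not the flight analysis code) illustrates the principle:

```python
def centroid(image):
    """Intensity-weighted centroid of a 2D image given as nested lists."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            total += val
            sx += x * val
            sy += y * val
    return sx / total, sy / total

# A non-reflective fiducial on a bright PAR shifts the measured centroid;
# comparing it against the nominal pixel position yields the pupil offset.
img = [[0, 0, 0],
       [0, 4, 0],
       [0, 0, 0]]
print(centroid(img))  # single bright spot at pixel (1, 1)
```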

  16. A cross-functional service-oriented architecture to support real-time information exchange in emergency medical response.

    PubMed

    Hauenstein, Logan; Gao, Tia; Sze, Tsz Wo; Crawford, David; Alm, Alex; White, David

    2006-01-01

    Real-time information communication presents a persistent challenge to the emergency response community. During a medical emergency, various first response disciplines including Emergency Medical Service (EMS), Fire, and Police, and multiple health service facilities including hospitals, auxiliary care centers and public health departments using disparate information technology systems must coordinate their efforts by sharing real-time information. This paper describes a service-oriented architecture (SOA) that uses shared data models of emergency incidents to support the exchange of data between heterogeneous systems. This architecture is employed in the Advanced Health and Disaster Aid Network (AID-N) system, a testbed investigating information technologies to improve interoperation among multiple emergency response organizations in the Washington, DC metropolitan region. This architecture allows us to enable real-time data communication between three deployed systems: 1) a pre-hospital patient care reporting software system used on all ambulances in Arlington County, Virginia (MICHAELS), 2) a syndromic surveillance system used by public health departments in the Washington area (ESSENCE), and 3) a hazardous material reference software system (WISER) developed by the National Library of Medicine. Additionally, we have extended our system to communicate with three new data sources: 1) wireless automated vital sign sensors worn by patients, 2) web portals for admitting hospitals, and 3) PDAs used by first responders at emergency scenes to input data (SIRP).
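An SOA built on shared data models works because every participating system produces and consumes the same serialized schema; a minimal hypothetical version (field names invented for illustration, far simpler than the AID-N incident model) in JSON:

```python
import json

# Hypothetical shared incident schema; the real AID-N data model is richer.
def make_incident(incident_id, location, patient_vitals):
    return {"id": incident_id, "location": location, "vitals": patient_vitals}

# A producer (e.g. an ambulance reporting system) serializes the record...
msg = json.dumps(make_incident("I-001", "Arlington County",
                               {"hr": 82, "spo2": 98}))

# ...and any consumer (surveillance system, hospital portal) parses the
# same structure without knowing the producer's internal implementation.
received = json.loads(msg)
print(received["vitals"]["hr"])
```

The point of the shared model is exactly this decoupling: heterogeneous systems agree on the wire format, not on each other's internals.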

  17. Space Launch System Ascent Flight Control Design

    NASA Technical Reports Server (NTRS)

    VanZwieten, Tannen S.; Orr, Jeb S.; Wall, John H.; Hall, Charles E.

    2014-01-01

    A robust and flexible autopilot architecture for NASA's Space Launch System (SLS) family of launch vehicles is presented. As the SLS configurations represent a potentially significant increase in complexity and performance capability of the integrated flight vehicle, it was recognized early in the program that a new, generalized autopilot design should be formulated to fulfill the needs of this new space launch architecture. The present design concept is intended to leverage existing NASA and industry launch vehicle design experience and maintain the extensibility and modularity necessary to accommodate multiple vehicle configurations while relying on proven and flight-tested control design principles for large boost vehicles. The SLS flight control architecture combines a digital three-axis autopilot with traditional bending filters to support robust active or passive stabilization of the vehicle's bending and sloshing dynamics using optimally blended measurements from multiple rate gyros on the vehicle structure. The algorithm also relies on a pseudo-optimal control allocation scheme to maximize the performance capability of multiple vectored engines while accommodating throttling and engine failure contingencies in real time with negligible impact to stability characteristics. The architecture supports active in-flight load relief through the use of a nonlinear observer driven by acceleration measurements, and envelope expansion and robustness enhancement is obtained through the use of a multiplicative forward gain modulation law based upon a simple model reference adaptive control scheme.

  18. Space Launch System Ascent Flight Control Design

    NASA Technical Reports Server (NTRS)

    Orr, Jeb S.; Wall, John H.; VanZwieten, Tannen S.; Hall, Charles E.

    2014-01-01

    A robust and flexible autopilot architecture for NASA's Space Launch System (SLS) family of launch vehicles is presented. The SLS configurations represent a potentially significant increase in complexity and performance capability when compared with other manned launch vehicles. It was recognized early in the program that a new, generalized autopilot design should be formulated to fulfill the needs of this new space launch architecture. The present design concept is intended to leverage existing NASA and industry launch vehicle design experience and maintain the extensibility and modularity necessary to accommodate multiple vehicle configurations while relying on proven and flight-tested control design principles for large boost vehicles. The SLS flight control architecture combines a digital three-axis autopilot with traditional bending filters to support robust active or passive stabilization of the vehicle's bending and sloshing dynamics using optimally blended measurements from multiple rate gyros on the vehicle structure. The algorithm also relies on a pseudo-optimal control allocation scheme to maximize the performance capability of multiple vectored engines while accommodating throttling and engine failure contingencies in real time with negligible impact to stability characteristics. The architecture supports active in-flight disturbance compensation through the use of nonlinear observers driven by acceleration measurements. Envelope expansion and robustness enhancement is obtained through the use of a multiplicative forward gain modulation law based upon a simple model reference adaptive control scheme.
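The model reference adaptive forward-gain idea can be sketched with a scalar textbook MIT-rule example (illustrative only, not the SLS algorithm):

```python
# Plant: dy/dt = -a*y + b*theta*r ; reference model: dym/dt = -am*(ym - r).
# The MIT-rule update adjusts the forward gain theta so the plant output y
# tracks the reference model output ym.
def simulate(a=1.0, b=2.0, am=2.0, gamma=0.5, r=1.0, dt=0.01, T=20.0):
    y = ym = theta = 0.0
    errors = []
    for _ in range(int(T / dt)):
        e = y - ym                       # tracking error
        theta += -gamma * e * ym * dt    # adaptive gain update (MIT rule)
        y += (-a * y + b * theta * r) * dt
        ym += -am * (ym - r) * dt
        errors.append(abs(e))
    return theta, errors

theta, errors = simulate()
print(round(theta, 2))
```

For this plant the adaptive gain settles near a/b = 0.5, at which point the closed-loop output tracks the reference model; the SLS law applies the same principle as a multiplicative modulation of the forward loop gain.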

  19. MIMO Sliding Mode Control for a Tailless Fighter Aircraft, An Alternative to Reconfigurable Architectures

    NASA Technical Reports Server (NTRS)

    Wells, S. R.; Hess, R. A.

    2002-01-01

    A frequency-domain procedure for the design of sliding mode controllers for multi-input, multi-output (MIMO) systems is presented. The methodology accommodates the effects of parasitic dynamics such as those introduced by unmodeled actuators through the introduction of multiple asymptotic observers and model reference hedging. The design procedure includes a frequency domain approach to specify the sliding manifold, the observer eigenvalues, and the hedge model. The procedure is applied to the development of a flight control system for a linear model of the Innovative Control Effector (ICE) fighter aircraft. The stability and performance robustness of the resulting design is demonstrated through the introduction of significant degradation in the control effector actuators and variation in vehicle dynamics.

  20. Beyond assemblies: system convergence and multi-materiality.

    PubMed

    Wiscombe, Tom

    2012-03-01

    The architectural construction industry has become increasingly specialized over the past 50 years, creating a culture of layer thinking over part-to-whole thinking. Building systems and technologies are often cobbled together in conflicting and uncorrelated ways, even when referred to as 'integrated', such as by way of building information modeling. True integration of building systems requires rethinking how systems and architectural morphologies can push and pull on one another, creating innovation not only in technology but in aesthetics. The revolution in composite materials, with unprecedented plasticity and performance features, opens up a huge range of possibilities for achieving this kind of convergence. Composites by nature fuse envelope and structure, but through various types of inflections they can also be made to conduct air and fluids through cavities and de-laminations, as well as integrate lighting and energy systems. Assembly as we know it moves away from mineral materials and hardware and toward polymers and 'healing'. Further, when projected into the near-future realm of multi-materiality and 3D manufacturing, possibilities for embedding systems and creating gradients of rigidity and opacity open up, pointing to an entirely new realm of architectural thinking.

  1. Enhancing the Reuse of Digital Resources for Integrated Systems to Represent, Understand and Dynamize Complex Interactions in Architectural Cultural Heritage Environments

    NASA Astrophysics Data System (ADS)

    Delgado, F. J.; Martinez, R.; Finat, J.; Martinez, J.; Puche, J. C.; Finat, F. J.

    2013-07-01

    In this work we develop a multiply interconnected system involving objects, agents and the interactions between them, using ICT applied to open repositories, user communities and web services. Our approach is applied to Architectural Cultural Heritage Environments (ACHE). It includes components for digital accessibility (to augmented ACHE repositories), content management (ontologies for the semantic web), semiautomatic recognition (to ease the reuse of materials) and serious videogames (for interaction in urban environments). Their combination provides support for local real/remote virtual tourism (including some tools for low-level real-time display of rendering on portable devices), mobile-based smart interactions (with special regard to monitored environments) and CH-related games (as extended web services). The main contributions to AR models over the usual GIS applied to architectural environments concern interactive support performed directly on digital files, which allows access to CH contents referenced to the GIS of urban districts (involving facades, historical or preindustrial buildings) and/or CH repositories in a ludic and transversal way, so as to acquire cognitive, medial and social abilities in collaborative environments.

  2. MuSE: accounting for tumor heterogeneity using a sample-specific error model improves sensitivity and specificity in mutation calling from sequencing data.

    PubMed

    Fan, Yu; Xi, Liu; Hughes, Daniel S T; Zhang, Jianjun; Zhang, Jianhua; Futreal, P Andrew; Wheeler, David A; Wang, Wenyi

    2016-08-24

    Subclonal mutations reveal important features of the genetic architecture of tumors. However, accurate detection of mutations in genetically heterogeneous tumor cell populations using next-generation sequencing remains challenging. We develop MuSE (http://bioinformatics.mdanderson.org/main/MuSE), Mutation calling using a Markov Substitution model for Evolution, a novel approach for modeling the evolution of the allelic composition of the tumor and normal tissue at each reference base. MuSE adopts a sample-specific error model that reflects the underlying tumor heterogeneity to greatly improve the overall accuracy. We demonstrate the accuracy of MuSE in calling subclonal mutations in the context of large-scale tumor sequencing projects using whole exome and whole genome sequencing.
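The core statistical question in any such caller is whether the alternate-allele count at a reference base is explained by sequencing error alone. A hedged sketch using a plain binomial tail test (MuSE's actual Markov substitution model is considerably richer, and its error rate is estimated per sample rather than assumed):

```python
import math

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), summed directly."""
    total = 0.0
    for i in range(k, n + 1):
        total += math.comb(n, i) * p**i * (1 - p)**(n - i)
    return total

# A sample-specific error rate would be estimated from the data in MuSE;
# here we simply assume one for illustration.
error_rate = 0.01

def call_mutation(alt_reads, depth, alpha=1e-3):
    """Call a mutation if the alt count is implausible under error alone."""
    return binom_sf(alt_reads, depth, error_rate) < alpha

print(call_mutation(10, 100))  # 10% alt fraction vs 1% error: called
print(call_mutation(2, 100))   # consistent with sequencing error: not called
```

Tying the error rate to each sample, as MuSE does, is what preserves sensitivity for low-fraction subclonal variants without flooding the calls with error-driven false positives.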

  3. Development of enterprise architecture in university using TOGAF as framework

    NASA Astrophysics Data System (ADS)

    Amalia, Endang; Supriadi, Hari

    2017-06-01

    The University of XYZ is located in Bandung, West Java. It has an information technology (IT) infrastructure which is managed independently. Currently, IT at the University of XYZ employs a complex conventional management pattern that does not result in a fully integrated IT infrastructure. This is not adaptive in addressing solutions to changing business needs and applications. In addition, it impedes the innovative development of sustainable IT services and contributes to an unnecessarily high workload for managers. This research aims to establish the concept of IS/IT strategic planning, used in the development of IS/IT and in designing the information technology infrastructure based on The Open Group Architecture Framework (TOGAF) and its Architecture Development Method (ADM). A case study is conducted at the University of XYZ using qualitative research through a literature review and interviews. This study produces the following results: (1) a design using TOGAF and the ADM around nine functional business areas, proposing 12 candidate applications to be developed at XYZ University; (2) 11 principles for the development of the information technology architecture; (3) a portfolio of future applications (McFarlan grid), placing 6 applications in the strategic quadrant (SIAKAD-T, E-LIBRARY, SIPADU-T, DSS, SIPPM-T, KMS), 2 in the key operational quadrant (PMS-T, CRM), and 4 in the support quadrant (MNC-T, NOPEC-T, EMAIL-SYSTEM, SSO); and (4) an enterprise architecture model which could serve as a reference blueprint for the development of information systems and information technology at the University of XYZ.

  4. Electronic Reference Library: Silverplatter's Database Networking Solution.

    ERIC Educational Resources Information Center

    Millea, Megan

    Silverplatter's Electronic Reference Library (ERL) provides wide area network access to its databases using TCP/IP communications and client-server architecture. ERL has two main components: The ERL clients (retrieval interface) and the ERL server (search engines). ERL clients provide patrons with seamless access to multiple databases on multiple…

  5. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology's advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented, along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.
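Parametric trade-space sweeps of the kind EXAMINE performs come down to closing performance equations over element options; a hypothetical rocket-equation sweep (all numbers invented for illustration) shows the pattern:

```python
import math

def propellant_mass(dry_mass, dv, isp, g0=9.80665):
    """Propellant (kg) needed for a burn of delta-v dv (m/s), via the rocket equation."""
    mass_ratio = math.exp(dv / (isp * g0))
    return dry_mass * (mass_ratio - 1.0)

# Sweep two propulsion options over two mission delta-v cases (illustrative
# values only; a framework like EXAMINE closes many more coupled equations).
options = {"chemical": 450.0, "nuclear_thermal": 900.0}   # Isp in seconds
for name, isp in options.items():
    for dv in (3800.0, 5700.0):
        mp = propellant_mass(50000.0, dv, isp)
        print(f"{name:15s} dv={dv:6.0f} m/s  propellant={mp/1000.0:8.1f} t")
```

Centralizing such models lets element-to-system couplings (e.g. how a propulsion choice ripples into launch mass) be explored across the whole trade space rather than one point design at a time.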

  6. Astronomical and Cosmological Aspects of Maya Architecture and Urbanism

    NASA Astrophysics Data System (ADS)

    Šprajc, I.

    2009-08-01

    Archaeoastronomical studies carried out so far have shown that orientations in ancient Maya architecture were, as elsewhere in Mesoamerica, largely astronomical, mostly referring to sunrises and sunsets on particular dates and allowing the use of observational calendars that facilitated proper scheduling of agricultural activities. However, the astronomical alignments cannot be understood in purely utilitarian terms. Since the repeatedly occurring directions are most consistently incorporated in the monumental architecture of civic and ceremonial urban cores, they must have had an important place in religion and worldview. The characteristics of urban layouts, as well as architectural and other elements associated with important buildings, reveal that Maya architectural and urban planning was dictated by a complex set of rules, in which astronomical considerations related to practical needs were embedded in a broader framework of cosmological concepts substantiated by political ideology.

  7. Modeling Interoperable Information Systems with 3LGM² and IHE.

    PubMed

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. The objectives are: to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors, and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was built with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and the functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus, information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE master model and its usage as a reference model, some further refinements have to be done. Evaluating the use of the IHE master model by information managers and IHE developers is subject to further research.
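    The concept mapping in this record can be pictured as a small lookup from IHE concepts to 3LGM² layers. The specific layer assignments below are illustrative assumptions in the spirit of the approach, not the paper's published mapping:

```python
# Illustrative IHE-concept-to-3LGM²-layer mapping (assumed, not the paper's).
# 3LGM² has three layers: domain, logical tool, and physical tool.
IHE_TO_3LGM2 = {
    "domain": "domain_layer",             # e.g. ITI, QRPH
    "integration_profile": "logical_tool_layer",
    "actor": "logical_tool_layer",        # actors ~ application components
    "transaction": "logical_tool_layer",  # transactions ~ communication links
}

def layer_for(ihe_concept: str) -> str:
    """Return the 3LGM² layer an IHE concept is assigned to in this sketch."""
    return IHE_TO_3LGM2[ihe_concept]
```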

  8. On the Execution Control of HLA Federations using the SISO Space Reference FOM

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2017-01-01

    In the Space domain, the High Level Architecture (HLA) is one of the reference standards for distributed simulation. However, for the different organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA) and their industrial partners, it is difficult to implement HLA simulators (called Federates) able to interact and interoperate in the context of a distributed HLA simulation (called a Federation). The lack of a common FOM (Federation Object Model) for the Space domain is one of the main reasons that precludes a priori interoperability between heterogeneous federates. To fill this gap, a Product Development Group (PDG) has recently been activated in the Simulation Interoperability Standards Organization (SISO) with the aim of providing a Space Reference FOM (SRFOM) for international collaboration on Space systems simulations. Members of the PDG come from several countries and contribute experience from projects within NASA, ESA and other organizations. Participants represent government, academia and industry. The paper presents an overview of the ongoing Space Reference FOM standardization initiative by focusing on the solution provided for managing the execution of an SRFOM-based Federation.

  9. Software architecture of INO340 telescope control system

    NASA Astrophysics Data System (ADS)

    Ravanmehr, Reza; Khosroshahi, Habib

    2016-08-01

    The software architecture plays an important role in the distributed control system of astronomical projects because many subsystems and components must work together in a consistent and reliable way. We have utilized a customized architecture design approach based on the "4+1 view model" in order to design the INOCS software architecture. In this paper, after reviewing the top-level INOCS architecture, we present the software architecture model of INOCS inspired by the "4+1 model". For this purpose we provide logical, process, development, physical, and scenario views of our architecture using different UML diagrams and other illustrative visual charts; each view presents the INOCS software architecture from a different perspective. We finish the paper with the science data operation of INO340 and concluding remarks.

  10. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    PubMed

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.
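    The "divide-and-conquer" idea behind mixtures-of-experts can be sketched in a few lines: a gating function softly partitions the input space between experts, and model selection would prune an expert whose gate weight stays near zero. This toy example (all coefficients invented for illustration, and with a simple logistic gate rather than the paper's hierarchical Bayesian machinery) shows the blending:

```python
# Toy mixture-of-experts: two linear experts blended by a logistic gate.
# All numbers are illustrative; real HME fitting would use MCMC as in the paper.
import math

def gate(x, w=5.0):
    """Logistic gate: mixing weight of expert 1 (expert 0 gets the complement)."""
    return 1.0 / (1.0 + math.exp(-w * x))

def hme_predict(x):
    expert0 = -1.0 * x + 0.0   # specializes in the left half of the input space
    expert1 = 2.0 * x + 1.0    # specializes in the right half
    g = gate(x)
    return (1 - g) * expert0 + g * expert1
```

At the decision boundary x = 0 the gate weights both experts equally; far from it, one expert dominates, which is what makes an unused expert detectable and prunable.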

  11. Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Probst, David K.

    1993-01-01

    A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from degrading performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include: processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include: vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
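    The latency-tolerance idea of overlapping references with computation can be sketched with software prefetching: issue the next fetch before it is needed, so the wait is hidden behind useful work. This is an illustrative analogue using a background thread, not the hardware mechanism the paper discusses; `slow_fetch` is a stand-in for a high-latency remote-memory reference:

```python
# Latency tolerance via prefetching: overlap the fetch of block i+1
# with the computation on block i.
from concurrent.futures import ThreadPoolExecutor
import time

def slow_fetch(block_id):
    """Stand-in for a high-latency remote-memory reference."""
    time.sleep(0.05)
    return list(range(block_id * 4, block_id * 4 + 4))

def compute(block):
    return sum(block)

def sum_blocks(n_blocks):
    total = 0
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(slow_fetch, 0)            # prefetch first block
        for i in range(n_blocks):
            block = future.result()                    # waits only if prefetch lags
            if i + 1 < n_blocks:
                future = pool.submit(slow_fetch, i + 1)  # prefetch next block early
            total += compute(block)                    # overlapped with the fetch
    return total
```

With the prefetch in flight during each `compute`, total wall time approaches one fetch latency plus the compute time, instead of the fully serialized sum.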

  12. Functional Requirements for Information Resource Provenance on the Web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCusker, James P.; Lebo, Timothy; Graves, Alvaro

    We provide a means to formally explain the relationship between HTTP URLs and the representations returned when they are requested. According to existing World Wide Web architecture, the URL serves as an identifier for a semiotic referent while the document returned via HTTP serves as a representation of the same referent. This begins with two sides of a semiotic triangle; the third side is the relationship between the URL and the representation received. We complete this description by extending the library science resource model Functional Requirements for Bibliographic Resources (FRBR) with cryptographic message and content digests to create a Functional Requirements for Information Resources (FRIR). We show how applying the FRIR model to HTTP GET and POST transactions disambiguates the many relationships between a given URL and all representations received from its request, provides fine-grained explanations that are complementary to existing explanations of web resources, and integrates easily into the emerging W3C provenance standard.
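    The digest idea is straightforward to sketch: the same URL can return different byte streams over time, and a cryptographic digest pins down exactly which representation was received. Function names and the canonical header form below are illustrative assumptions, not the FRIR specification:

```python
# Distinguishing HTTP representations by cryptographic digest.
# A "content" digest covers the entity body alone; a "message" digest
# also covers the headers, so two responses with the same body but
# different headers are distinguishable.
import hashlib

def content_digest(body: bytes) -> str:
    """SHA-256 digest of the entity body alone."""
    return hashlib.sha256(body).hexdigest()

def message_digest(headers: dict, body: bytes) -> str:
    """SHA-256 digest over canonicalized headers plus body."""
    canonical = "".join(f"{k.lower()}:{v}\n" for k, v in sorted(headers.items()))
    return hashlib.sha256(canonical.encode() + body).hexdigest()
```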

  13. Mobile platform for treatment of stroke: A case study of tele-assistance

    PubMed Central

    Torres Zenteno, Arturo Henry; Fernández, Francisco; Palomino-García, Alfredo; Moniche, Francisco; Escudero, Irene; Jiménez-Hernández, M Dolores; Caballero, Auxiliadora; Escobar-Rodriguez, Germán; Parra, Carlos

    2015-01-01

    This article presents the technological solution of a tele-assistance process for stroke patients in the acute phase in the Seville metropolitan area. The main objective of this process is to reduce the time from symptom onset to treatment of acute-phase stroke patients by means of telemedicine, covering mobility between an intensive care unit ambulance and an expert center and activating the pre-hospital care phase. The technological platform covering the process has been defined following an interoperability model based on standards and with a service-oriented architecture focus. Messaging definition has been designed according to the reference model of CEN/ISO 13606, and message content follows the structure of archetypes. An XDS-b (Cross-Enterprise Document Sharing-b) transaction messaging has been designed according to the Integrating the Healthcare Enterprise profile for archetype notifications and update enquiries. This research has been performed by a multidisciplinary group. The Virgen del Rocío University Hospital acts as the reference hospital and the Public Company for Healthcare as the mobility environment. PMID:25975806

  14. Reference Architecture for MNE 5 Technical System

    DTIC Science & Technology

    2007-05-30

    [Fragmentary extract] Core Services: a core set of applications, including directories, web portal and collaboration applications, with the property of being available in most experiments. Message classifications (XML, JMS, content level); metadata filtering and control of who can initiate services; web browsing; collaboration and messaging; border protection; audit logging of persons and machines; data-level objects, web services and messages.

  15. Effects of Spatial Experiences & Cognitive Styles in the Solution Process of Space-Based Design Problems in the First Year of Architectural Design Education

    ERIC Educational Resources Information Center

    Erkan Yazici, Yasemin

    2013-01-01

    There are many factors that influence designers in the architectural design process. Cognitive style, which varies according to the cognitive structure of persons, and spatial experience, which is created with spatial data acquired during life are two of these factors. Designers usually refer to their spatial experiences in order to find solutions…

  16. Parallel Logic Programming and Parallel Systems Software and Hardware

    DTIC Science & Technology

    1989-07-29

    Conference, Dallas TX, January 1985. (55) [Rous75] Roussel, P., "PROLOG: Manuel de Référence et d'Utilisation", Groupe d'Intelligence Artificielle, Université d… Tools were provided for software development using artificial intelligence techniques, and AI software for massively parallel architectures was started. 1. Introduction. We describe research conducted

  17. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    NASA Astrophysics Data System (ADS)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices. It puts emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model which intends to coordinate the development of application-oriented services and protocols in a consistent and modular way. It enables the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are, moreover, implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  18. Role of System Architecture in Developing New Drafting Tools

    NASA Astrophysics Data System (ADS)

    Sorguç, Arzu Gönenç

    In this study, the impact of information technologies on the architectural design process is discussed. First, the differences and nuances between the concepts of software engineering and system architecture are clarified. Then, the design process in engineering and the design process in architecture are compared, considering 3-D models as the center of the design process over which the other disciplines engage with the design. It is pointed out that in many high-end engineering applications, 3-D solid models, and consequently the digital mock-up concept, have become common practice. Architecture, however, as one of the important customers of the CAD systems employing these tools, has not started to use these 3-D models. It is shown that the reason for this time lag between architecture and engineering lies in the tradition of design attitude. Therefore, a new design scheme, a meta-model, is proposed to develop an integrated design model centered on the 3-D model. A system architecture is also proposed to achieve the transformation of the architectural design process by replacing 2-D thinking with 3-D thinking. In the proposed system architecture, CAD systems are included and adapted for 3-D architectural design in order to provide interfaces for the integration of all relevant disciplines into the design process. Such a change will allow the intelligent or smart building concept to be elaborated in the future.

  19. A Standard-Based and Context-Aware Architecture for Personal Healthcare Smart Gateways.

    PubMed

    Santos, Danilo F S; Gorgônio, Kyller C; Perkusich, Angelo; Almeida, Hyggo O

    2016-10-01

    The rising availability of Personal Health Devices (PHDs) capable of Personal Area Network (PAN) communication and the desire to maintain a high quality of life are the ingredients of the Connected Health vision. In parallel, a growing number of personal and portable devices, like smartphones and tablet computers, are becoming capable of taking the role of health gateway, that is, a data collector for the sensor PHDs. However, as the number of PHDs increases, the number of other peripherals connected to the PAN also increases. Therefore, PHDs are now competing for medium access with other devices, decreasing the Quality of Service (QoS) of health applications in the PAN. In this article we present a reference architecture to prioritize PHD connections based on their state and requirements, creating a healthcare Smart Gateway. Healthcare context information is extracted by observing the traffic through the gateway. A standard-based approach was used to identify health traffic based on the ISO/IEEE 11073 family of standards. A reference implementation was developed, showing the relevance of the problem and how the proposed architecture can assist in the prioritization. The reference Smart Gateway solution was integrated with a Connected Health System for the Internet of Things, validating its use in a real case scenario.
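    The prioritization idea can be sketched as a simple ordering rule: traffic identified as health traffic (per ISO/IEEE 11073) outranks ordinary PAN peripherals, and a PHD in an alarm state outranks one doing routine reporting. The state names and priority values below are illustrative assumptions, not the article's scheme:

```python
# Smart Gateway connection prioritization sketch (lower value = higher priority).
# State names and priorities are illustrative assumptions.
PRIORITY = {"phd_alarm": 0, "phd_routine": 1, "other_peripheral": 2}

def schedule(connections):
    """Order PAN connections for medium access, highest priority first."""
    return sorted(connections, key=lambda c: PRIORITY[c["state"]])
```

For example, an ECG device raising an alarm would be served before a routine SpO2 report, which in turn would be served before a non-health peripheral.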

  20. Building the Core Architecture of a Multiagent System Product Line: With an example from a future NASA Mission

    NASA Technical Reports Server (NTRS)

    Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio

    2006-01-01

    The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.

  1. Reference Pricing Changes The 'Choice Architecture' Of Health Care For Consumers.

    PubMed

    Robinson, James C; Brown, Timothy T; Whaley, Christopher

    2017-03-01

    Reference pricing in health insurance creates incentives for patients to select, for nonemergency services, providers that charge relatively low prices while still offering high quality of care. It changes the "choice architecture" by offering standard coverage if the patient chooses cost-effective providers but requiring considerable consumer cost sharing if more expensive alternatives are selected. The short-term impact of reference pricing has been to shift patient volumes from hospital-based to freestanding surgical, diagnostic, imaging, and laboratory facilities. This article summarizes reference pricing's impacts to date on patient choice, provider prices, surgical complications, and employer spending, and estimates its potential impacts if expanded to more services and a broader population. Reference pricing induces consumers to select lower-price alternatives for all of the forms of care studied, leading to significant reductions in prices paid and spending incurred by insurers and employers. The impact on consumer cost sharing is mixed, with some studies finding higher copayments and some lower. We conclude with a discussion of the incentives created for providers to redesign their clinical processes and for efficient providers to expand into price-sensitive markets. Over time, reference pricing may increase pressures for price competition and lead to further cost-reducing innovations in health care products and processes. Project HOPE—The People-to-People Health Foundation, Inc.
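    The cost split under reference pricing reduces to a simple rule: the insurer pays up to the reference price, and the patient pays any excess. A minimal sketch of that rule (function and parameter names are illustrative; real benefit designs layer copays and deductibles on top):

```python
# Reference-pricing cost split: insurer covers up to the reference price,
# patient pays the difference when a more expensive provider is chosen.
def cost_shares(provider_price: float, reference_price: float):
    """Return (insurer_share, patient_share) for one service."""
    insurer = min(provider_price, reference_price)
    patient = provider_price - insurer
    return insurer, patient
```

A patient choosing a $1,500 facility against a $1,000 reference price pays the $500 excess; choosing an $800 facility incurs no excess cost sharing.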

  2. The Performance Analysis of a Real-Time Integrated INS/GPS Vehicle Navigation System with Abnormal GPS Measurement Elimination

    PubMed Central

    Chiang, Kai-Wei; Duong, Thanh Trung; Liao, Jhen-Kai

    2013-01-01

    The integration of an Inertial Navigation System (INS) and the Global Positioning System (GPS) is common in mobile mapping and navigation applications to seamlessly determine the position, velocity, and orientation of the mobile platform. In most INS/GPS integrated architectures, the GPS is considered to be an accurate reference with which to correct for the systematic errors of the inertial sensors, which are composed of biases, scale factors and drift. However, the GPS receiver may produce abnormal pseudo-range errors mainly caused by ionospheric delay, tropospheric delay and the multipath effect. These errors degrade the overall position accuracy of an integrated system that uses conventional INS/GPS integration strategies such as loosely coupled (LC) and tightly coupled (TC) schemes. Conventional tightly coupled INS/GPS integration schemes apply the Klobuchar model and the Hopfield model to reduce pseudo-range delays caused by ionospheric delay and tropospheric delay, respectively, but do not address the multipath problem. However, the multipath effect (from reflected GPS signals) affects the position error far more significantly in a consumer-grade GPS receiver than in an expensive, geodetic-grade GPS receiver. To avoid this problem, a new integrated INS/GPS architecture is proposed. The proposed method is described and applied in a real-time integrated system with two integration strategies, namely, loosely coupled and tightly coupled schemes, respectively. To verify the effectiveness of the proposed method, field tests with various scenarios are conducted and the results are compared with a reliable reference system. PMID:23955434
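    A common way to realize the "abnormal GPS measurement elimination" described above is an innovation gate: compare each GPS fix against the INS prediction and discard it if the discrepancy exceeds a statistical threshold before fusion. This sketch names the technique in general terms and is not the paper's exact algorithm; the gain and threshold values are illustrative:

```python
# Innovation gating sketch: reject GPS fixes that disagree too strongly
# with the INS prediction, then fuse accepted fixes with a simple gain.
def accept_measurement(predicted, measured, sigma, threshold=3.0):
    """Gate a GPS fix: reject it if the innovation exceeds `threshold` sigmas."""
    innovation = measured - predicted
    return abs(innovation) <= threshold * sigma

def fuse(predicted, measured, gain=0.5):
    """Toy complementary update applied once a measurement passes the gate."""
    return predicted + gain * (measured - predicted)
```

A multipath-corrupted fix far outside the gate is simply skipped, so the integrated solution coasts on the INS until a consistent GPS measurement returns.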

  3. Quality Attributes for Mission Flight Software: A Reference for Architects

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan; Fesq, Lorraine; Dvorak, Dan

    2016-01-01

    In the international standards for architecture descriptions in systems and software engineering (ISO/IEC/IEEE 42010), "concern" is a primary concept that often manifests itself in relation to the quality attributes or "ilities" that a system is expected to exhibit - qualities such as reliability, security and modifiability. One of the main uses of an architecture description is to serve as a basis for analyzing how well the architecture achieves its quality attributes, and that requires architects to be as precise as possible about what they mean in claiming, for example, that an architecture supports "modifiability." This paper describes a table, generated by NASA's Software Architecture Review Board, which lists fourteen key quality attributes, identifies different important aspects of each quality attribute and considers each aspect in terms of requirements, rationale, evidence, and tactics to achieve the aspect. This quality attribute table is intended to serve as a guide to software architects, software developers, and software architecture reviewers in the domain of mission-critical real-time embedded systems, such as space mission flight software.

  4. Using Third Party Data to Update a Reference Dataset in a Quality Evaluation Service

    NASA Astrophysics Data System (ADS)

    Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.

    2016-06-01

    Nowadays it is easy to find many data sources for various regions around the globe. In this 'data overload' scenario there is little, if any, information available about the quality of these data sources. In order to provide this data quality information easily, we presented the architecture of a web service for the automation of quality control of spatial datasets running over a Web Processing Service (WPS). For quality procedures that require an external reference dataset, such as positional accuracy or completeness, the architecture permits using a reference dataset. However, this reference dataset is not ageless, since it suffers the natural time degradation inherent to geospatial features. In order to mitigate this problem we propose the Time Degradation & Updating Module, which applies assessed data as a tool to keep the reference database updated. The main idea is to utilize datasets sent to the quality evaluation service as a source of 'candidate data elements' for updating the reference database. After the evaluation, if some elements of a candidate dataset reach a determined quality level, they can be used as input data to improve the current reference database. In this work we present the first design of the Time Degradation & Updating Module. We believe that the outcomes can be applied in the pursuit of a fully automatic online quality evaluation platform.

  5. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, we report here a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating a metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs and deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the proposed development process.

  6. Science across Cultures.

    ERIC Educational Resources Information Center

    Selin, Helaine

    1993-01-01

    Describes scientific and technical accomplishments of Africans in developing the calendar, surgery, and gynecology and of Native Americans in developing astronomy, architecture, and agriculture. (Contains 93 references) (PR)

  7. Model-Driven Architecture for Agent-Based Systems

    NASA Technical Reports Server (NTRS)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. The generation of software artifacts is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  8. MicroCT-Based Skeletal Models for Use in Tomographic Voxel Phantoms for Radiological Protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolch, Wesley

    The University of Florida (UF) proposes to develop two high-resolution image-based skeletal dosimetry models for direct use by ICRP Committee 2's Task Group on Dose Calculation in their forthcoming Reference Voxel Male (RVM) and Reference Voxel Female (RVF) whole-body dosimetry phantoms. These two phantoms are CT-based, and thus do not have the image resolution to delineate and perform radiation transport modeling of the individual marrow cavities and bone trabeculae throughout their skeletal structures. Furthermore, new and innovative 3D microimaging techniques will now be required for the skeletal tissues following Committee 2's revision of the target tissues of relevance for radiogenic bone cancer induction. This target tissue had been defined in ICRP Publication 30 as a 10-μm cell layer on all bone surfaces of trabecular and cortical bone. The revised target tissue is now a 50-μm layer within the marrow cavities of trabecular bone only and is exclusive of the marrow adipocytes. Clearly, this new definition requires the use of 3D microimages of the trabecular architecture not available from past 2D optical studies of the adult skeleton. With our recent acquisition of two relatively young cadavers (males of age 18 years and 40 years), we will develop a series of reference skeletal models that can be directly applied to (1) the new ICRP reference voxel male and female phantoms developed for the ICRP, and (2) pediatric phantoms developed to target the ICRP reference children. Dosimetry data to be developed will include absorbed fractions for internal beta and alpha-particle sources, as well as photon and neutron fluence-to-dose response functions for direct use in external dosimetry studies of the ICRP reference workers and members of the general public.

  9. Geometry and Mechanics of Chiral Pod Opening

    NASA Astrophysics Data System (ADS)

    Sharon, Eran; Armon, Shahaf; Efrati, Efi; Kupferman, Raz

    2012-02-01

    We study the geometry and mechanics that drive the opening of Bauhinia seed pods. The pod valve wall consists of two fibrous layers oriented at ±45° with respect to the pod axis. Upon drying, each of the layers shrinks uniaxially, perpendicular to the fiber orientation. This active deformation turns the valve into an incompatible sheet with a saddle-like reference curvature tensor and a flat (Euclidean) reference metric. These two intrinsic properties are incompatible; the shape is, therefore, selected by a stretching-bending competition. Strips cut from the valve tissue and from a synthetic model material adopt various helical configurations. We provide analytical expressions for these configurations in the bending- and stretching-dominated regimes. Surface measurements show the transition from minimal surfaces (narrow limit) to cylindrical ones (wide limit). Finally, we show how plants exploit these mechanical principles using different tissue architectures.

  10. A reference architecture for telemonitoring.

    PubMed

    Clarke, Malcolm

    2004-01-01

    The Telecare Interactive Continuous Monitoring System exploits GPRS to provide an ambulatory device that monitors selected vital signs on a continuous basis. Alarms are sent when parameters fall outside preset limits, and accompanying physiological data may also be transmitted. The always-connected property of GPRS allows continuous interactive control of the device and its sensors, permitting changes to monitoring parameters or even enabling continuous monitoring of a sensor in an emergency. A new personal area network (PAN) has been developed to support short-range wireless connection to sensors worn on the body, including ECG and finger-worn SpO2. Most notable is the use of an ultra-low radio frequency to reduce power to a minimum. The system has been designed to use a hierarchical architecture for sensors and "derived" signals, such as HR from ECG, so that each can be independently controlled and managed. Sensors are treated as objects, and functions are defined to control aspects of behaviour. These are refined in order to define a generic set of abstract functions to handle the majority of functions, leaving a minimum of sensor-specific commands. The intention is to define a reference architecture in order to research the functionality and system architecture of a telemonitoring system. The Telecare project is funded through a grant from the European Commission (IST programme).
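    The sensor-as-object idea above, generic abstract control functions plus alarms on preset limits, can be sketched in a few lines. Class and method names here are assumptions for illustration, not the Telecare project's API:

```python
# Generic sensor object sketch: shared abstract control functions and an
# alarm raised when a vital sign leaves its preset limits.
class Sensor:
    def __init__(self, name, low, high):
        self.name, self.low, self.high = name, low, high
        self.enabled = False

    def start(self):
        """Generic abstract control function: enable monitoring."""
        self.enabled = True

    def set_limits(self, low, high):
        """Interactive control: adjust preset alarm limits remotely."""
        self.low, self.high = low, high

    def check(self, value):
        """Return an alarm message if the value is outside preset limits."""
        if self.enabled and not (self.low <= value <= self.high):
            return f"ALARM {self.name}: {value}"
        return None
```

A "derived" signal such as HR-from-ECG would be another `Sensor` in the hierarchy, controlled through the same abstract functions.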

  11. An avionics scenario and command model description for Space Generic Open Avionics Architecture (SGOAA)

    NASA Technical Reports Server (NTRS)

    Stovall, John R.; Wray, Richard B.

    1994-01-01

    This paper presents a description of a model for a space vehicle operational scenario and the commands for avionics. This model will be used in developing a dynamic architecture simulation model using the Statemate CASE tool for validation of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA has been proposed as an avionics architecture standard to NASA through its Strategic Avionics Technology Working Group (SATWG) and has been accepted by the Society of Automotive Engineers (SAE) for conversion into an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division (FDSD) of the NASA Johnson Space Center (JSC) by the Lockheed Engineering and Sciences Company (LESC), Houston, Texas. The SGOAA includes a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, and a nine-class model of interfaces. The SGOAA is both scalable and recursive and can be applied to any hierarchical level of hardware/software processing systems.

  12. A Modeling Pattern for Layered System Interfaces

    NASA Technical Reports Server (NTRS)

    Shames, Peter M.; Sarrel, Marc A.

    2015-01-01

    Communications between systems is often initially represented at a single, high level of abstraction, a link between components. During design evolution it is usually necessary to elaborate the interface model, defining it from several different, related viewpoints and levels of abstraction. This paper presents a pattern to model such multi-layered interface architectures simply and efficiently, in a way that supports expression of technical complexity, interfaces and behavior, and analysis of complexity. Each viewpoint and layer of abstraction has its own properties and behaviors. System elements are logically connected both horizontally along the communication path, and vertically across the different layers of protocols. The performance of upper layers depends on the performance of lower layers, yet the implementation of lower layers is intentionally opaque to upper layers. Upper layers are hidden from lower layers except as sources and sinks of data. The system elements may not be linked directly at each horizontal layer but only via a communication path, and end-to-end communications may depend on intermediate components that are hidden from them, but may need to be shown in certain views and analyzed for certain purposes. This architectural model pattern uses methods described in ISO 42010, Recommended Practice for Architectural Description of Software-intensive Systems and CCSDS 311.0-M-1, Reference Architecture for Space Data Systems (RASDS). A set of useful viewpoints and views are presented, along with the associated modeling representations, stakeholders and concerns. These viewpoints, views, and concerns then inform the modeling pattern. This pattern permits viewing the system from several different perspectives and at different layers of abstraction. 
An external viewpoint treats the systems of interest as black boxes and focuses on the applications view; another view exposes the details of the connections and other components between the black boxes. An internal view focuses on the implementation within the systems of interest, either showing external interface bindings and specific standards that define the communication stack profile, or operating at the level of internal behavior. Orthogonally, a horizontal view isolates a single layer, while a vertical viewpoint shows all layers at a single interface point between the systems of interest. Each of these views can in turn be described from both behavioral and structural viewpoints.
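    The vertical dimension of the pattern — upper layers depending on lower layers while each layer's implementation stays opaque to the one above — can be sketched as a chain of wrapped protocol layers. This is a hedged, minimal illustration (the layer names and framing scheme are invented for the example, not part of the pattern's specification):

    ```python
    class Layer:
        """A protocol layer: wraps a lower layer, adding/stripping its own header."""
        def __init__(self, name, lower=None):
            self.name, self.lower = name, lower
        def send(self, payload):
            framed = f"[{self.name}]{payload}"           # add this layer's header
            return self.lower.send(framed) if self.lower else framed
        def receive(self, frame):
            if self.lower:
                frame = self.lower.receive(frame)        # lower layers go first
            prefix = f"[{self.name}]"
            assert frame.startswith(prefix)              # lower layers are opaque;
            return frame[len(prefix):]                   # upper layer sees payload only

    # Vertical composition: application over transport over link.
    link = Layer("link")
    transport = Layer("transport", lower=link)
    app = Layer("app", lower=transport)

    wire = app.send("hello")
    ```

    Horizontally, `app` on one system logically connects to `app` on its peer, even though the actual path runs down through `transport` and `link`, across the wire, and back up — exactly the two orthogonal directions of connection the pattern separates into views.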

  13. Comparison of different artificial neural network architectures in modeling of Chlorella sp. flocculation.

    PubMed

    Zenooz, Alireza Moosavi; Ashtiani, Farzin Zokaee; Ranjbar, Reza; Nikbakht, Fatemeh; Bolouri, Oberon

    2017-07-03

    Biodiesel production from microalgae feedstock should be performed after growth and harvesting of the cells, and the most feasible method for harvesting and dewatering of microalgae is flocculation. Flocculation modeling can be used to evaluate and predict performance under the different parameters that affect the process. However, modeling of microalgae flocculation is not simple and has not yet been performed under all experimental conditions, mostly because microalgae cells behave differently under different flocculation conditions. In the current study, the modeling of microalgae flocculation is studied with different neural network architectures. The microalgae species Chlorella sp. was flocculated with ferric chloride under different conditions, and the experimental data were then modeled using artificial neural networks. Multilayer perceptron (MLP) and radial basis function architectures failed to predict the targets successfully, whereas modeling was effective with an ensemble architecture of MLP networks. Comparison between the performance of the ensemble and that of each individual network demonstrates the ability of the ensemble architecture in microalgae flocculation modeling.
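    The ensemble idea the abstract credits — train several members and combine their predictions — can be sketched with stand-in learners. To keep the sketch dependency-free, simple least-squares fits replace the paper's MLP networks, and the "flocculation" data are synthetic; only the ensemble mechanics (bootstrap resampling, prediction averaging) are the point:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in data: flocculation efficiency as a noisy function of two inputs
    # (e.g. flocculant dose and pH; entirely synthetic, not the paper's data).
    X = rng.uniform(0, 1, size=(200, 2))
    y = 0.7 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.05, 200)

    def fit_member(Xb, yb):
        """Least-squares fit with a bias column (stand-in for one trained network)."""
        A = np.column_stack([Xb, np.ones(len(Xb))])
        w, *_ = np.linalg.lstsq(A, yb, rcond=None)
        return w

    def predict(w, X):
        return np.column_stack([X, np.ones(len(X))]) @ w

    # Ensemble: train each member on a bootstrap resample, then average predictions.
    members = []
    for _ in range(10):
        idx = rng.integers(0, len(X), len(X))
        members.append(fit_member(X[idx], y[idx]))

    ensemble_pred = np.mean([predict(w, X) for w in members], axis=0)
    ```

    Averaging reduces the variance contributed by any single member, which is the usual explanation for why an ensemble can succeed where individual networks fail.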

  14. Rear wheel torque vectoring model predictive control with velocity regulation for electric vehicles

    NASA Astrophysics Data System (ADS)

    Siampis, Efstathios; Velenis, Efstathios; Longo, Stefano

    2015-11-01

    In this paper we propose a constrained optimal control architecture for combined velocity, yaw and sideslip regulation for stabilisation of the vehicle near the limit of lateral acceleration, using the rear-axle electric torque vectoring configuration of an electric vehicle. A nonlinear vehicle and tyre model are used to find reference steady-state cornering conditions and to design two model predictive control (MPC) strategies of different levels of fidelity: one that uses a linearised version of the full vehicle model with the rear wheels' torques as the input, and another that neglects the wheel dynamics and uses the rear wheels' slips as the input instead. After analysing the relative trade-offs between performance and computational effort, we compare the two MPC strategies against each other and against an unconstrained optimal control strategy in a Simulink and CarSim environment.
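    The receding-horizon mechanics behind any such MPC scheme — predict over a horizon from a linearised model, optimise the input sequence, apply only the first input, repeat — can be illustrated on a toy system. The matrices below are invented two-state placeholders (not the paper's vehicle model), and the constraints are omitted so the optimisation reduces to one least-squares solve:

    ```python
    import numpy as np

    # Hypothetical linearised dynamics, x = [yaw-rate error, sideslip error].
    A = np.array([[0.9, 0.1],
                  [0.0, 0.95]])
    B = np.array([[0.5],
                  [0.1]])
    N = 10                       # prediction horizon

    def mpc_step(x0, x_ref, q=1.0, r=0.01):
        """Unconstrained MPC: minimise sum_k q||x_k - x_ref||^2 + r||u_k||^2
        by stacking the dynamics x_{k+1} = A x_k + B u_k into X = F x0 + G U."""
        n, m = A.shape[0], B.shape[1]
        F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
        G = np.zeros((N * n, N * m))
        for k in range(N):
            for j in range(k + 1):
                G[k*n:(k+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, k - j) @ B
        Xref = np.tile(x_ref, N)
        # min ||sqrt(q)(G U - (Xref - F x0))||^2 + ||sqrt(r) U||^2 as least squares
        H = np.vstack([np.sqrt(q) * G, np.sqrt(r) * np.eye(N * m)])
        b = np.concatenate([np.sqrt(q) * (Xref - F @ x0), np.zeros(N * m)])
        U, *_ = np.linalg.lstsq(H, b, rcond=None)
        return U[:m]             # receding horizon: apply only the first input

    x = np.array([0.2, -0.1])
    for _ in range(30):          # closed loop drives the state to the reference
        u = mpc_step(x, np.zeros(2))
        x = A @ x + B @ u
    ```

    A constrained MPC, as in the paper, replaces the least-squares solve with a quadratic program that also enforces actuator and state limits; the receding-horizon loop is unchanged.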

  15. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole‐cell models and linking such models in multi‐scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. 
To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  16. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    A viewgraph presentation on developing models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps

  17. Information Quality Evaluation of C2 Systems at Architecture Level

    DTIC Science & Technology

    2014-06-01

    ...based on architecture models of C2 systems, which can help to identify key factors impacting information quality and improve the system capability at the stage of architecture design of C2 systems. ...capability evaluation of C2 systems at architecture level becomes necessary and important for improving the system capability at the stage of architecture design. This paper proposes a method for information quality evaluation of C2 systems at architecture level. First, the information quality model is...

  18. Emergence of a Common Modeling Architecture for Earth System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Deluca, C.

    2010-12-01

    Common modeling architecture can be viewed as a natural outcome of common modeling infrastructure. The development of model utility and coupling packages (ESMF, MCT, OpenMI, etc.) over the last decade represents the realization of a community vision for common model infrastructure. The adoption of these packages has led to increased technical communication among modeling centers and newly coupled modeling systems. However, adoption has also exposed aspects of interoperability that must be addressed before easy exchange of model components among different groups can be achieved. These aspects include common physical architecture (how a model is divided into components) and model metadata and usage conventions. The National Unified Operational Prediction Capability (NUOPC), an operational weather prediction consortium, is collaborating with weather and climate researchers to define a common model architecture that encompasses these advanced aspects of interoperability and looks to future needs. The nature and structure of the emergent common modeling architecture will be discussed along with its implications for future model development.

  19. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework

    NASA Technical Reports Server (NTRS)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.

    2011-01-01

    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.

  20. Empowering open systems through cross-platform interoperability

    NASA Astrophysics Data System (ADS)

    Lyke, James C.

    2014-06-01

    Most of the motivations for open systems lie in the expectation of interoperability, sometimes referred to as "plug-and-play". Nothing in the notion of "open-ness", however, guarantees this outcome, which makes the increased interest in open architecture more perplexing. In this paper, we explore certain themes of open architecture. We introduce the concept of "windows of interoperability", which can be used to align disparate portions of architecture. Such "windows of interoperability", which concentrate on a reduced set of protocol and interface features, might achieve many of the broader purposes assigned as benefits in open architecture. Since it is possible to engineer proprietary systems that interoperate effectively, this nuanced definition of interoperability may in fact be a more important concept to understand and nurture for effective systems engineering and maintenance.

  1. Macromolecular 'size' and 'hardness' drives structure in solvent-swollen blends of linear, cyclic, and star polymers.

    PubMed

    Gartner, Thomas E; Jayaraman, Arthi

    2018-01-17

    In this paper, we apply molecular simulation and liquid state theory to uncover the structure and thermodynamics of homopolymer blends of the same chemistry and varying chain architecture in the presence of explicit solvent species. We use hybrid Monte Carlo (MC)/molecular dynamics (MD) simulations in the Gibbs ensemble to study the swelling of ~12,000 g mol-1 linear, cyclic, and 4-arm star polystyrene chains in toluene. Our simulations show that the macroscopic swelling response is indistinguishable between the various architectures and matches published experimental data for the solvent annealing of linear polystyrene by toluene vapor. We then use standard MD simulations in the NPT ensemble along with polymer reference interaction site model (PRISM) theory to calculate effective polymer-solvent and polymer-polymer Flory-Huggins interaction parameters (χeff) in these systems. As seen in the macroscopic swelling results, there are no significant differences in the polymer-solvent and polymer-polymer χeff between the various architectures. Despite similar macroscopic swelling and effective interaction parameters between various architectures, the pair correlation function between chain centers-of-mass indicates stronger correlations between cyclic or star chains in the linear-cyclic blends and linear-star blends, compared to linear chain-linear chain correlations. Furthermore, we note striking similarities in the chain-level correlations and the radius of gyration of cyclic and 4-arm star architectures of identical molecular weight. Our results indicate that the cyclic and star chains are 'smaller' and 'harder' than their linear counterparts, and through comparison with MD simulations of blends of soft spheres with varying hardness and size we suggest that these macromolecular characteristics are the source of the stronger cyclic-cyclic and star-star correlations.

  2. Ares V Overview and Status

    NASA Technical Reports Server (NTRS)

    Creech, Steve; Sumrall, Phil; Cockrell, Charles E., Jr.; Burris, Mike

    2009-01-01

    As part of NASA's Constellation Program to resume exploration beyond low Earth orbit (LEO), the Ares V heavy-lift cargo launch vehicle as currently conceived will be able to send more crew and cargo to more places on the Moon than the Apollo Program Saturn V. (Figure 1) It also has unprecedented cargo mass and volume capabilities that will be a national asset for science, commerce, and national defense applications. Compared to current systems, it will offer approximately five times the mass and volume to most orbits and locations. The Columbia space shuttle accident, the resulting investigation, the Vision for Space Exploration, and the Exploration Systems Architecture Study (ESAS) broadly shaped the Constellation architecture. Out of those events and initiatives emerged an architecture intended to replace the space shuttle, complete the International Space Station (ISS), and resume a much more ambitious plan to explore the Moon as a stepping stone to other destinations in the solar system. Ares I was NASA's main priority because of the goal to retire the Shuttle. Ares V remains in a concept development phase, evolving through hundreds of configurations. The current reference design was approved during the Lunar Capabilities Concept Review/Ares V Mission Concept Review (LCCR/MCR) in June 2008. This reference concept serves as a starting point for a renewed set of design trades and detailed analysis into its interaction with the other components of the Constellation architecture and existing launch infrastructure. In 2009, the Ares V team was heavily involved in supporting the Review of U.S. Human Space Flight Plans Committee. Several alternative designs for Ares V have been supplied to the committee. This paper will discuss the origins of the Ares V design, the evolution to the current reference configuration, and the options provided to the review committee.

  3. Research on key technologies of data processing in internet of things

    NASA Astrophysics Data System (ADS)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

    The data of the Internet of Things (IoT) are characterized by polymorphism, heterogeneity, large volume, and real-time processing requirements. Traditional structured, static batch processing methods no longer meet the data processing requirements of the IoT. This paper studied a middleware that can integrate heterogeneous IoT data, unifying different data formats into a single format. It designed an IoT data processing model based on the Storm stream computing architecture and integrated existing Internet security technology to build a security system for IoT data processing, providing a reference for the efficient transmission and processing of IoT data.
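    The format-unification step such a middleware performs can be sketched concretely. The three device formats and the unified record shape below are invented for illustration; the point is only the normalisation pattern, not any specific middleware's API:

    ```python
    import json

    # Hypothetical middleware step: normalise heterogeneous sensor payloads
    # (JSON, key=value, and CSV-style) into one unified record format.
    def normalise(raw):
        if raw.strip().startswith("{"):                  # JSON device
            d = json.loads(raw)
            return {"id": d["id"], "temp": float(d["temp"])}
        if "=" in raw:                                   # key=value device
            d = dict(kv.split("=") for kv in raw.split(";"))
            return {"id": d["dev"], "temp": float(d["t"])}
        dev, t = raw.split(",")                          # CSV device
        return {"id": dev, "temp": float(t)}

    records = [normalise(r) for r in
               ['{"id": "s1", "temp": "21.5"}', "dev=s2;t=19.0", "s3,22.25"]]
    ```

    Once every payload has the same shape, downstream stream-processing stages (Storm bolts, in the paper's architecture) can be written against a single schema.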

  4. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented experienced-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  5. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    Research directed at developing a graph theoretical model for describing data and control flow associated with the execution of large grained algorithms in a special distributed computer environment is presented. This model is identified by the acronym ATAMM which represents Algorithms To Architecture Mapping Model. The purpose of such a model is to provide a basis for establishing rules for relating an algorithm to its execution in a multiprocessor environment. Specifications derived from the model lead directly to the description of a data flow architecture which is a consequence of the inherent behavior of the data and control flow described by the model. The purpose of the ATAMM based architecture is to provide an analytical basis for performance evaluation. The ATAMM model and architecture specifications are demonstrated on a prototype system for concept validation.
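    The core idea of a data-flow execution model — a node fires as soon as all of its input tokens are available, with control flow emerging from data availability — can be sketched in a few lines. This is a generic illustration of data-driven firing, not the ATAMM formalism itself; the graph and node functions are invented:

    ```python
    # Minimal data-flow executor: each node fires once all input tokens exist.
    graph = {
        "a":   {"inputs": [], "fn": lambda: 2},
        "b":   {"inputs": [], "fn": lambda: 3},
        "sum": {"inputs": ["a", "b"], "fn": lambda a, b: a + b},
        "out": {"inputs": ["sum"], "fn": lambda s: s * 10},
    }

    def run(graph):
        tokens, pending = {}, set(graph)
        while pending:
            ready = [n for n in pending
                     if all(i in tokens for i in graph[n]["inputs"])]
            for n in ready:                       # fire every enabled node
                args = [tokens[i] for i in graph[n]["inputs"]]
                tokens[n] = graph[n]["fn"](*args)
                pending.remove(n)
        return tokens

    tokens = run(graph)   # tokens["out"] == 50
    ```

    In a multiprocessor setting, every node in `ready` on a given pass could fire concurrently, which is exactly the mapping from algorithm graph to architecture that a model like ATAMM is meant to make analyzable.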

  6. Development of a real-time clinical decision support system upon the web mvc-based architecture for prostate cancer treatment

    PubMed Central

    2011-01-01

    Background A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve their quality of clinical care. We propose a RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. Methods We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. Results The resulting system can improve quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Conclusions Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. 
The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases. PMID:21385459

  7. Development of a real-time clinical decision support system upon the Web MVC-based architecture for prostate cancer treatment.

    PubMed

    Lin, Hsueh-Chun; Wu, Hsi-Chin; Chang, Chih-Hung; Li, Tsai-Chung; Liang, Wen-Miin; Wang, Jong-Yi

    2011-03-08

    A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve their quality of clinical care. We propose a RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. The resulting system can improve quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases.
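    The MVC separation this framework builds on — a model holding the data, views rendering it, and a controller mediating updates — can be sketched minimally. All names here are hypothetical illustrations of the pattern, not the paper's actual components:

    ```python
    # Hypothetical minimal MVC round trip: the model holds clinical markers,
    # the view renders them, the controller mediates updates.
    class Model:
        def __init__(self):
            self.markers = {}
            self.observers = []        # views register render callbacks here
        def set(self, name, value):
            self.markers[name] = value
            for render in self.observers:
                render(self.markers)   # notify every registered view

    class View:
        def __init__(self):
            self.last = None
        def render(self, markers):
            self.last = ", ".join(f"{k}={v}" for k, v in sorted(markers.items()))

    class Controller:
        def __init__(self, model):
            self.model = model
        def update_marker(self, name, value):  # input validation would live here
            self.model.set(name, value)

    model, view = Model(), View()
    model.observers.append(view.render)
    Controller(model).update_marker("PSA", 4.2)
    ```

    Because views only observe the model and controllers only mutate it, the same model can be reused under different interfaces or diseases, which is the adaptability the paper claims for its Web MVC design.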

  8. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    PubMed

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it is already adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but the relation-oriented technology strongly limits the efficiency in the management of huge amounts of patients' clinical data, especially in emerging cloud-based systems, which are distributed. In this paper, we present an OAIS healthcare architecture able to manage a huge amount of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Database Management System deployed in the cloud, thus benefiting from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
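    The "big tables and wide rows" of a column-oriented store can be pictured as rows that gain columns freely, one per observation or document, rather than conforming to a fixed relational schema. A toy in-memory sketch (the row/column naming scheme is invented for illustration; real systems like HBase or Cassandra add distribution, persistence, and versioning on top of this data model):

    ```python
    from collections import defaultdict

    # Toy column-family store: wide rows keyed by patient, columns added freely,
    # mimicking how HL7 document fields map onto a column-oriented table.
    class ColumnStore:
        def __init__(self):
            self.rows = defaultdict(dict)      # row key -> {column: value}
        def put(self, row, column, value):
            self.rows[row][column] = value
        def get(self, row, column=None):
            return self.rows[row] if column is None else self.rows[row].get(column)

    store = ColumnStore()
    store.put("patient-42", "obs:2018-01-01:hr", 72)
    store.put("patient-42", "obs:2018-01-02:hr", 75)   # the row widens per observation
    store.put("patient-7", "doc:admission", "<HL7 CDA ...>")
    ```

    Each patient's row can hold an arbitrary, growing set of columns, which is why this model scales to document archives that would require expensive joins or sparse tables in a relational design.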

  9. Bioinspired, Graphene/Al2O3 Doubly Reinforced Aluminum Composites with High Strength and Toughness.

    PubMed

    Zhang, Yunya; Li, Xiaodong

    2017-11-08

    Nacre, commonly referred to as nature's armor, has served as a blueprint for engineering stronger and tougher bioinspired materials. Nature organizes a brick-and-mortar-like architecture in nacre, with hard bricks of aragonite sandwiched with soft biopolymer layers. However, cloning nacre's entire reinforcing mechanisms in engineered materials remains a challenge. In this study, we employed hybrid graphene/Al2O3 platelets with surface nanointerlocks as hard bricks for primary load bearing and mechanical interlocking, along with aluminum laminates as soft mortar for load distribution and energy dissipation, to replicate nacre's architecture and reinforcing effects in aluminum composites. Compared with aluminum, the bioinspired, graphene/Al2O3 doubly reinforced aluminum composite demonstrated an exceptional, joint improvement in hardness (210%), strength (223%), stiffness (78%), and toughness (30%), which are even superior over nacre. This design strategy and model material system should guide the synthesis of bioinspired materials to achieve exceptionally high strength and toughness.

  10. Nanoscale protein architecture of the kidney glomerular basement membrane

    PubMed Central

    Suleiman, Hani; Zhang, Lei; Roth, Robyn; Heuser, John E; Miner, Jeffrey H; Shaw, Andrey S; Dani, Adish

    2013-01-01

    In multicellular organisms, proteins of the extracellular matrix (ECM) play structural and functional roles in essentially all organs, so understanding ECM protein organization in health and disease remains an important goal. Here, we used sub-diffraction resolution stochastic optical reconstruction microscopy (STORM) to resolve the in situ molecular organization of proteins within the kidney glomerular basement membrane (GBM), an essential mediator of glomerular ultrafiltration. Using multichannel STORM and STORM-electron microscopy correlation, we constructed a molecular reference frame that revealed a laminar organization of ECM proteins within the GBM. Separate analyses of domains near the N- and C-termini of agrin, laminin, and collagen IV in mouse and human GBM revealed a highly oriented macromolecular organization. Our analysis also revealed disruptions in this GBM architecture in a mouse model of Alport syndrome. These results provide the first nanoscopic glimpse into the organization of a complex ECM. DOI: http://dx.doi.org/10.7554/eLife.01149.001 PMID:24137544

  11. Payload mass improvements of supersonic retropropulsive flight for human class missions to Mars

    NASA Astrophysics Data System (ADS)

    Fagin, Maxwell H.

    Supersonic retropropulsion (SRP) is the use of retrorockets to decelerate during atmospheric flight while the vehicle is still traveling in the supersonic/hypersonic flight regime. In the context of Mars exploration, subsonic retropropulsion has a robust flight heritage for terminal landing guidance and control, but all supersonic deceleration has, to date, been performed by non-propulsive (i.e. purely aerodynamic) methods, such as aeroshells and parachutes. Extending the use of retropropulsion from the subsonic to the supersonic regime has been identified as an enabling technology for high-mass humans-to-Mars architectures. However, supersonic retropropulsion still poses significant design and control challenges, stemming mainly from the complex interactions between the hypersonic engine plumes, the oncoming air flow, and the vehicle's exterior surface. These interactions lead to flow fields that are difficult to model and produce counterintuitive behaviors that are not present in purely propulsive or purely aerodynamic flight. This study will provide an overview of the work done in the design of SRP systems. Optimal throttle laws for certain trajectories will be derived that leverage aero/propulsive effects to decrease propellant requirements and increase total useful landing mass. A study of the mass savings will be made for a 10 mT reference vehicle based on a propulsive version of the Orion capsule, followed by the 100 mT ellipsoid vehicle assumed by NASA's Mars Design Reference Architecture.

  12. Optical RAM-enabled cache memory and optical routing for chip multiprocessors: technologies and architectures

    NASA Astrophysics Data System (ADS)

    Pleros, Nikos; Maniotis, Pavlos; Alexoudi, Theonitsa; Fitsios, Dimitris; Vagionas, Christos; Papaioannou, Sotiris; Vyrsokinos, K.; Kanellos, George T.

    2014-03-01

    The processor-memory performance gap, commonly referred to as "Memory Wall" problem, owes to the speed mismatch between processor and electronic RAM clock frequencies, forcing current Chip Multiprocessor (CMP) configurations to consume more than 50% of the chip real-estate for caching purposes. In this article, we present our recent work spanning from Si-based integrated optical RAM cell architectures up to complete optical cache memory architectures for Chip Multiprocessor configurations. Moreover, we discuss on e/o router subsystems with up to Tb/s routing capacity for cache interconnection purposes within CMP configurations, currently pursued within the FP7 PhoxTrot project.

  13. Patterns and Practices for Future Architectures

    DTIC Science & Technology

    2014-08-01

    Subject terms: computing architecture, graph algorithms, high-performance computing, big data, GPU (44 pages). The report's figures cover the Graph500 BFS reference implementation, including the data structures created by Kernel 1 of the single-CPU list implementation, Kernel 2 of the single-CPU list implementation, and the data structures for the sequential CSR algorithm.
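    The figures refer to the Graph500 breadth-first-search kernels and the compressed sparse row (CSR) graph layout. As a minimal sketch of the idea (not the report's code), a level-synchronous BFS over a CSR graph looks like:

```python
def bfs_csr(row_ptr, col_idx, source):
    """Breadth-first search over a graph stored in Compressed Sparse Row form.

    row_ptr[v]..row_ptr[v+1] indexes the slice of col_idx holding v's neighbors.
    Returns the BFS parent array (-1 for unreached vertices; source parents itself).
    """
    n = len(row_ptr) - 1
    parent = [-1] * n
    parent[source] = source
    frontier = [source]
    while frontier:
        next_frontier = []
        for u in frontier:
            for v in col_idx[row_ptr[u]:row_ptr[u + 1]]:
                if parent[v] == -1:
                    parent[v] = u
                    next_frontier.append(v)
        frontier = next_frontier
    return parent

# Small undirected graph: edges 0-1, 0-2, 1-3 (stored in both directions).
row_ptr = [0, 2, 4, 5, 6]
col_idx = [1, 2, 0, 3, 0, 1]
print(bfs_csr(row_ptr, col_idx, 0))
```

    The Graph500 benchmark validates exactly such a parent array; the CSR layout is what makes the neighbor scan a contiguous slice.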

  14. Archetype Model-Driven Development Framework for EHR Web System.

    PubMed

    Kobayashi, Shinji; Kimura, Eizen; Ishihara, Ken

    2013-12-01

    This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. The openEHR project has developed a clinical model-driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have also been developed as references. Although scripting languages have become more popular in recent years because of their higher efficiency and faster development cycles, none had been used for an openEHR implementation. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. We implemented almost all of the specifications, including an Archetype Definition Language (ADL) parser and an RoR scaffold generator driven by archetypes. Although some problems emerged, most have been resolved. We have thus provided an agile EHR Web framework that can build Web systems from archetype models using RoR. The feasibility of the archetype model for providing semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems.

  15. High resolution diagnosis of common nevi by multiphoton laser tomography and fluorescence lifetime imaging.

    PubMed

    Arginelli, Federica; Manfredini, Marco; Bassoli, Sara; Dunsby, Christopher; French, Paul; König, Karsten; Magnoni, Cristina; Ponti, Giovanni; Talbot, Clifford; Seidenari, Stefania

    2013-05-01

    Multiphoton Laser Tomography (MPT) has developed as a non-invasive tool that allows real-time observation of the skin with subcellular resolution. MPT is readily combined with time-resolved detectors to achieve fluorescence lifetime imaging (FLIM). The aim of our study was to identify morphologic MPT/FLIM descriptors of melanocytic nevi, referring to cellular and architectural features. In the preliminary study, MPT/FLIM images referring to 16 ex vivo nevi were simultaneously evaluated by 3 observers for the identification of morphologic descriptors characteristic of melanocytic nevi. Proposed descriptors were discussed, and the parameters referring to epidermal keratinocytes, epidermal melanocytes, the dermo-epidermal junction, the papillary dermis and overall architecture were selected. In the main study, the presence/absence of the specified criteria was blindly evaluated on a test set comprising 102 ex vivo samples (51 melanocytic nevi, 51 miscellaneous skin lesions) by 2 observers. Twelve descriptors were identified: "short-lifetime cells in the stratum corneum", "melanin-containing keratinocytes", "dendritic cells", "small short-lifetime cells" in the upper and lower layers, "edged papillae", "non-edged papillae", "junctional nests of short-lifetime cells", "dermal cell clusters", "short-lifetime cells in the papilla", "monomorphic and regular histoarchitecture", and "architectural disarray". The identified descriptors for benign melanocytic lesions proved sensitive and specific, enabling the differentiation between melanocytic nevi and non-melanocytic lesions. © 2012 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  16. Implementation of an Integrated On-Board Aircraft Engine Diagnostic Architecture

    NASA Technical Reports Server (NTRS)

    Armstrong, Jeffrey B.; Simon, Donald L.

    2012-01-01

    An on-board diagnostic architecture for aircraft turbofan engine performance trending, parameter estimation, and gas-path fault detection and isolation has been developed and evaluated in a simulation environment. The architecture incorporates two independent models: a real-time self-tuning performance model providing parameter estimates, and a performance baseline model reflecting long-term engine degradation trends for diagnostic purposes. This architecture was evaluated using flight profiles generated from a nonlinear model with realistic fleet engine health degradation distributions and sensor noise. The architecture was found to produce acceptable estimates of engine health and unmeasured parameters, and the integrated diagnostic algorithms were able to perform correct fault isolation in approximately 70 percent of the tested cases.
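    One ingredient of such an architecture, comparing current estimates against a slowly updated degradation baseline and flagging outliers, can be sketched as below. This is an illustration of the general idea only; the sensor names, baseline values, and 3-sigma threshold are hypothetical and not the paper's algorithm.

```python
def detect_fault(measured, baseline, sigma, k=3.0):
    """Flag sensors whose residual from the degradation baseline exceeds k-sigma."""
    flags = {}
    for name in measured:
        residual = measured[name] - baseline[name]
        flags[name] = abs(residual) > k * sigma[name]
    return flags

baseline = {"EGT": 900.0, "N2": 95.0}   # long-term trend model output (illustrative)
sigma = {"EGT": 4.0, "N2": 0.3}         # expected scatter per sensor (illustrative)
measured = {"EGT": 921.0, "N2": 95.2}   # current self-tuning model estimates

print(detect_fault(measured, baseline, sigma))
```

    Separating the fast self-tuning model from the slow baseline is what lets a sudden gas-path fault stand out against gradual fleet-typical degradation.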

  17. Compositional Specification of Software Architecture

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independent of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications and give an overview of insights gained from a case study used to validate the method.

  18. Jupiter Europa Orbiter Architecture Definition Process

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Shishko, Robert

    2011-01-01

    The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.

  19. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1987-01-01

    The results of ongoing research directed at developing a graph-theoretical model for describing data and control flow associated with the execution of large-grained algorithms in a spatially distributed computing environment are presented. This model is identified by the acronym ATAMM (Algorithm/Architecture Mapping Model). The purpose of such a model is to provide a basis for establishing rules for relating an algorithm to its execution in a multiprocessor environment. Specifications derived from the model lead directly to the description of a data flow architecture which is a consequence of the inherent behavior of the data and control flow described by the model. The purpose of the ATAMM-based architecture is to optimize computational concurrency in the multiprocessor environment and to provide an analytical basis for performance evaluation. The ATAMM model and architecture specifications are demonstrated on a prototype system for concept validation.
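    The core idea of mapping an algorithm graph onto concurrent execution can be sketched with a toy earliest-finish-time computation over a dataflow graph. This is a simplified illustration, not the ATAMM model itself; it assumes unlimited processors and ignores control tokens.

```python
def earliest_finish(tasks):
    """Earliest finish time of each node in a dataflow-style graph, assuming
    unlimited processors: a node fires as soon as all its inputs are ready.

    tasks: {name: (duration, [predecessor names])}
    """
    finish = {}
    def resolve(name):
        if name not in finish:
            dur, preds = tasks[name]
            start = max((resolve(p) for p in preds), default=0.0)
            finish[name] = start + dur
        return finish[name]
    for name in tasks:
        resolve(name)
    return finish

# Toy algorithm graph: two independent front-end nodes feeding a combiner,
# so A and B can execute concurrently on separate processors.
graph = {"A": (2.0, []), "B": (3.0, []), "C": (1.0, ["A", "B"])}
print(earliest_finish(graph))
```

    The gap between this idealized schedule and one computed under a processor limit is the kind of concurrency measure a dataflow architecture analysis provides.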

  20. Instruction-level performance modeling and characterization of multimedia applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Y.; Cameron, K.W.

    1999-06-01

    One of the challenges in characterizing and modeling realistic multimedia applications is the lack of access to source codes. On-chip performance counters effectively resolve this problem by monitoring run-time behaviors at the instruction level. This paper presents a novel technique for characterizing and modeling workloads at the instruction level for realistic multimedia applications using hardware performance counters. A variety of instruction counts are collected from multimedia applications such as RealPlayer, GSM Vocoder, MPEG encoder/decoder, and a speech synthesizer. These instruction counts can be used to form a set of abstract characteristic parameters directly related to a processor's architectural features. Based on microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvements for certain workloads. The biggest advantage of this new characterization technique is a better understanding of processor utilization efficiency and the architectural bottleneck for each application. This technique also provides predictive insight into future architectural enhancements and their effect on current codes. In this paper the authors also attempt to model architectural effects on processor utilization without memory influence. They derive formulas for calculating CPI₀, the CPI without memory effects, and they quantify the utilization of architectural parameters. These equations are architecturally diagnostic and predictive in nature. Results show promise in code characterization and empirical/analytical modeling.
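    The instruction-mix step can be illustrated with a toy CPI estimate from counter values: a weighted average of per-class issue costs. The instruction classes, counts, and costs below are invented for illustration; they are not the paper's derived formulas.

```python
def cpi_no_memory(counts, class_cpi):
    """CPI excluding memory-stall effects (a CPI_0-style estimate), computed
    from hardware-counter instruction counts and per-class issue costs."""
    total = sum(counts.values())
    return sum(counts[c] * class_cpi[c] for c in counts) / total

# Illustrative counter values for a media workload (not measured data).
counts = {"int": 6_000_000, "fp": 1_000_000, "branch": 1_500_000, "load_store": 1_500_000}
class_cpi = {"int": 1.0, "fp": 2.0, "branch": 1.5, "load_store": 1.0}
print(round(cpi_no_memory(counts, class_cpi), 3))
```

    Comparing such an estimate with the measured CPI isolates how much of the gap is attributable to the memory hierarchy rather than the core's issue constraints.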

  1. Computational Analyses of Synergism in Small Molecular Network Motifs

    PubMed Central

    Zhang, Yili; Smolen, Paul; Baxter, Douglas A.; Byrne, John H.

    2014-01-01

    Cellular functions and responses to stimuli are controlled by complex regulatory networks that comprise a large diversity of molecular components and their interactions. However, achieving an intuitive understanding of the dynamical properties and responses to stimuli of these networks is hampered by their large scale and complexity. To address this issue, analyses of regulatory networks often focus on reduced models that depict distinct, recurring connectivity patterns referred to as motifs. Previous modeling studies have begun to characterize the dynamics of small motifs, and to describe ways in which variations in parameters affect their responses to stimuli. The present study investigates how variations in pairs of parameters affect responses in a series of ten common network motifs, identifying concurrent variations that act synergistically (or antagonistically) to alter the responses of the motifs to stimuli. Synergism (or antagonism) was quantified using degrees of nonlinear blending and additive synergism. Simulations identified concurrent variations that maximized synergism, and examined the ways in which it was affected by stimulus protocols and the architecture of a motif. Only a subset of architectures exhibited synergism following paired changes in parameters. The approach was then applied to a model describing interlocked feedback loops governing the synthesis of the CREB1 and CREB2 transcription factors. The effects of motifs on synergism for this biologically realistic model were consistent with those for the abstract models of single motifs. These results have implications for the rational design of combination drug therapies with the potential for synergistic interactions. PMID:24651495
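    Additive synergism of paired parameter changes can be illustrated on the simplest possible motif, a linear two-step cascade at steady state. This toy (not one of the study's ten motifs, and not its blending metric) shows a positive synergy because the output depends multiplicatively on the two rate constants.

```python
def cascade_response(k1, k2, s=1.0, d1=1.0, d2=1.0):
    """Steady-state output of a linear two-step cascade motif:
    X* = s*k1/d1, then R* = X* * k2/d2."""
    return s * k1 / d1 * k2 / d2

def additive_synergism(base, delta):
    """Response change from varying both parameters together, minus the sum
    of the changes from varying each alone (positive => synergistic)."""
    k1, k2 = base
    dk1, dk2 = delta
    r0 = cascade_response(k1, k2)
    both = cascade_response(k1 + dk1, k2 + dk2) - r0
    single = (cascade_response(k1 + dk1, k2) - r0) + (cascade_response(k1, k2 + dk2) - r0)
    return both - single

print(additive_synergism(base=(1.0, 1.0), delta=(0.5, 0.5)))
```

    For this motif the cross-term k1*k2 guarantees nonzero synergy; motifs whose responses are additive in their parameters would score zero, matching the observation that only a subset of architectures exhibits synergism.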

  2. M3BA: A Mobile, Modular, Multimodal Biosignal Acquisition Architecture for Miniaturized EEG-NIRS-Based Hybrid BCI and Monitoring.

    PubMed

    von Luhmann, Alexander; Wabnitz, Heidrun; Sander, Tilmann; Muller, Klaus-Robert

    2017-06-01

    For the further development of the fields of telemedicine, neurotechnology, and brain-computer interfaces, advances in hybrid multimodal signal acquisition and processing technology are invaluable. Currently, there are no commonly available hybrid devices combining bioelectrical and biooptical neurophysiological measurements [here electroencephalography (EEG) and functional near-infrared spectroscopy (NIRS)]. Our objective was to design such an instrument in a miniaturized, customizable, and wireless form. We present here the design and evaluation of a mobile, modular, multimodal biosignal acquisition architecture (M3BA) based on a high-performance analog front-end optimized for biopotential acquisition, a microcontroller, and our openNIRS technology. The designed M3BA modules are very small, configurable, high-precision, and low-noise modules (EEG input-referred noise at 500 SPS of 1.39 μVpp; NIRS noise-equivalent powers NEP750nm = 5.92 pWpp and NEP850nm = 4.77 pWpp) with full input linearity, Bluetooth, a 3-D accelerometer, and low power consumption. They support flexible user-specified biopotential reference setups and wireless body area/sensor network scenarios. Performance characterization and in-vivo experiments confirmed the functionality and quality of the designed architecture. Telemedicine and assistive neurotechnology scenarios will increasingly include wearable multimodal sensors in the future. The M3BA architecture can significantly facilitate future designs for research in these and other fields that rely on customized mobile hybrid biosignal acquisition.

  3. Modeling Techniques for High Dependability Protocols and Architecture

    NASA Technical Reports Server (NTRS)

    LaValley, Brian; Ellis, Peter; Walter, Chris J.

    2012-01-01

    This report documents an investigation into modeling high-dependability protocols and some specific challenges identified as a result of the experiments. The need for an approach was established, and foundational concepts were proposed for modeling the different layers of a complex protocol and for capturing the compositional properties that provide high-dependability services for a system architecture. The approach centers on the definition of an architecture layer, its interfaces for composability with other layers, and its bindings to a platform-specific architecture model that implements the protocols required for the layer.

  4. Xyce parallel electronic simulator : reference guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.

    2011-05-01

    This document is a reference guide to the Xyce Parallel Electronic Simulator and is a companion document to the Xyce Users Guide. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial; users who are new to circuit simulation are better served by the Xyce Users Guide. The Xyce Parallel Electronic Simulator has been written to support, in a rigorous manner, the simulation needs of the Sandia National Laboratories electrical designers. It is targeted specifically to run on large-scale parallel computing platforms, but also runs well on a variety of architectures, including single-processor workstations. It also aims to support a variety of devices and models specific to Sandia needs. This document is intended to complement the Xyce Users Guide. It contains comprehensive, detailed information about a number of topics pertinent to the usage of Xyce. Included in this document are a netlist reference for the input-file commands and elements supported within Xyce; a command line reference, which describes the available command line arguments for Xyce; and quick-references for users of other circuit codes, such as Orcad's PSpice and Sandia's ChileSPICE.

  5. Effect of pore architecture on oxygen diffusion in 3D scaffolds for tissue engineering.

    PubMed

    Ahn, Geunseon; Park, Jeong Hun; Kang, Taeyun; Lee, Jin Woo; Kang, Hyun-Wook; Cho, Dong-Woo

    2010-10-01

    The aim of this study was to maximize oxygen diffusion within a three-dimensional scaffold in order to improve cell viability and proliferation. To evaluate the effect of pore architecture on oxygen diffusion, we designed a regular channel shape with uniform diameter, referred to as cylinder-shaped, and a new channel shape with a channel diameter gradient, referred to as cone-shaped. A numerical analysis predicted higher oxygen concentration in the cone-shaped channels than in the cylinder-shaped channels throughout the scaffold. To confirm these numerical results, we examined cell proliferation and viability in 2D constructs and 3D scaffolds. Cell culture experiments revealed that cell proliferation and viability were superior in the constructs and scaffolds with cone-shaped channels.
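    The intuition behind the numerical result can be sketched with a quasi-1D steady-state diffusion estimate: a channel that widens toward the oxygen supply has lower diffusive resistance, so the concentration at the bottom of the scaffold stays higher. All geometry and transport numbers below are illustrative assumptions, not the study's model.

```python
def bottom_concentration(areas, dx, c_top, flux, diff_coeff):
    """Quasi-1D steady diffusion down a channel that delivers a fixed oxygen
    flux to cells at depth: C(L) = C_top - (flux / D) * sum(dx / A(x))."""
    resistance = sum(dx / a for a in areas)
    return c_top - (flux / diff_coeff) * resistance

n_slices, depth = 100, 1.0e-3                  # 100 slices over a 1 mm deep channel
dx = depth / n_slices
area_bottom = 3.14e-8                          # ~200 um diameter channel (m^2)
cylinder = [area_bottom] * n_slices            # uniform cross-section
# Cone: same bottom area, widening linearly toward the top (medium) surface.
cone = [area_bottom * (3.0 - 2.0 * (i + 0.5) / n_slices) for i in range(n_slices)]

c_cyl = bottom_concentration(cylinder, dx, c_top=0.2, flux=1e-14, diff_coeff=3e-9)
c_cone = bottom_concentration(cone, dx, c_top=0.2, flux=1e-14, diff_coeff=3e-9)
print(c_cyl, c_cone)
```

    With identical supply concentration and oxygen demand, the cone's lower integrated resistance yields the higher deep-channel concentration, consistent with the study's prediction.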

  6. Medical Data GRIDs as approach towards secure cross enterprise document sharing (based on IHE XDS).

    PubMed

    Wozak, Florian; Ammenwerth, Elske; Breu, Micheal; Penz, Robert; Schabetsberger, Thomas; Vogl, Raimund; Wurz, Manfred

    2006-01-01

    The quality and efficiency of health care services are expected to be improved by the electronic processing and trans-institutional availability of medical data. A prototype architecture based on the IHE-XDS profile is currently being developed. Due to legal and organizational requirements, specific adaptations to the IHE-XDS profile have been made. In this work the services of the health@net reference architecture are described in detail; these have been developed with a focus on compliance with both the IHE-XDS profile and the legal situation in Austria. We expect to gain knowledge about the development of a shared electronic health record using Medical Data Grids as an Open Source reference implementation, and about how proprietary hospital information systems can be integrated into this environment.

  7. Assessing the effects of architectural variations on light partitioning within virtual wheat–pea mixtures

    PubMed Central

    Barillot, Romain; Escobar-Gutiérrez, Abraham J.; Fournier, Christian; Huynh, Pierre; Combes, Didier

    2014-01-01

    Background and Aims: Predicting light partitioning in crop mixtures is a critical step in improving the productivity of such complex systems, and light interception has been shown to be closely linked to plant architecture. The aim of the present work was to analyse the relationships between plant architecture and light partitioning within wheat–pea (Triticum aestivum–Pisum sativum) mixtures. An existing model for wheat was utilized and a new model for pea morphogenesis was developed. Both models were then used to assess the effects of architectural variations on light partitioning. Methods: First, a deterministic model (L-Pea) was developed in order to obtain dynamic reconstructions of pea architecture. The L-Pea model is based on L-systems formalism and consists of modules for ‘vegetative development’ and ‘organ extension’. A tripartite simulator was then built up from the pea and wheat models interfaced with a radiative transfer model. Architectural parameters from both plant models, selected on the basis of their contribution to leaf area index (LAI), height and leaf geometry, were then modified in order to generate contrasting architectures of wheat and pea. Key Results: By scaling down the analysis to the organ level, it could be shown that the number of branches/tillers and the length of internodes significantly determined the partitioning of light within mixtures. Temporal relationships between light partitioning and the LAI and height of the different species showed that light capture was mainly related to the architectural traits involved in plant LAI during the early stages of development, and in plant height during the onset of interspecific competition. Conclusions: In silico experiments enabled the study of the intrinsic effects of architectural parameters on the partitioning of light in crop mixtures of wheat and pea. The findings show that plant architecture is an important criterion for the identification/breeding of plant ideotypes, particularly with respect to light partitioning. PMID:24907314
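    The L-systems formalism underlying a model like L-Pea rewrites every module of a plant string in parallel at each step. A minimal deterministic example (with a made-up production rule, not the L-Pea rule set):

```python
def expand(axiom, rules, generations):
    """Rewrite every symbol in parallel each generation (deterministic L-system).
    Symbols with no production rule are copied unchanged."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Toy branching rule in the spirit of an architectural plant model:
# A = apex, I = internode, [ ] = lateral branch. Illustrative only.
rules = {"A": "I[A]A"}
print(expand("A", rules, 2))
```

    Interpreting the resulting string geometrically (internodes as segments, brackets as branch points) is what turns such productions into the 3-D architectures fed to a radiative transfer model.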

  8. RACE/A: An Architectural Account of the Interactions between Learning, Task Control, and Retrieval Dynamics

    ERIC Educational Resources Information Center

    van Maanen, Leendert; van Rijn, Hedderik; Taatgen, Niels

    2012-01-01

    This article discusses how sequential sampling models can be integrated in a cognitive architecture. The new theory Retrieval by Accumulating Evidence in an Architecture (RACE/A) combines the level of detail typically provided by sequential sampling models with the level of task complexity typically provided by cognitive architectures. We will use…
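    The sequential-sampling ingredient of a theory like RACE/A can be illustrated with a bare-bones race model: independent accumulators gather noisy evidence until one reaches a threshold. The parameters below are arbitrary illustrative values; this is not the RACE/A implementation.

```python
import random

def race(drifts, threshold=1.0, noise=0.1, dt=0.01, seed=0):
    """Evidence accumulators race to a threshold; returns the index of the
    winning accumulator and the number of steps taken (a proxy for RT)."""
    rng = random.Random(seed)
    acc = [0.0] * len(drifts)
    steps = 0
    while max(acc) < threshold:
        for i, v in enumerate(drifts):
            acc[i] += v * dt + rng.gauss(0.0, noise) * dt ** 0.5
        steps += 1
    return acc.index(max(acc)), steps

winner, steps = race([0.8, 0.2])   # strong vs weak memory trace (illustrative)
print(winner, steps)
```

    In an architectural account, the drift rates themselves would be set by learning and task control rather than fixed by hand, which is the integration the article develops.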

  9. Architecture Study for a Fuel Depot Supplied from Lunar Assets

    NASA Technical Reports Server (NTRS)

    Perrin, Thomas M.; Casler, James G.

    2016-01-01

    This architecture study sought to determine the optimum architecture for a fuel depot supplied from lunar assets. Four factors - the location of propellant processing (on the Moon or on the depot), the depot location (on the Moon, L1, GEO, or LEO), the propellant transfer location (L1, GEO, or LEO), and the propellant transfer method (bulk fuel or canister exchange) - were combined to identify 18 candidate architectures. Two design reference missions (DRMs) - a commercial satellite servicing mission and a Government cargo mission to Mars - created demand for propellants, while a propellant delivery DRM examined supply issues. The study concluded that Earth-Moon L1 is the best location for an orbiting depot. For all architectures, propellant boiloff was less than anticipated, and was far overshadowed by delta-v requirements and the resulting fuel consumption. Bulk transfer is the most flexible method for both the supplier and the customer. However, since canister exchange bypasses the transfer of bulk cryogens and the necessary chilldown losses, it shows promise and merits further investigation. Overall, this work indicates that propellant consumption and loss is an essential factor in assessing fuel depot architectures.
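    The dominance of delta-v over boiloff reported here can be reproduced qualitatively with the rocket equation plus a simple boiloff model. The delta-v figures and boiloff rate below are rough illustrative assumptions, not the study's numbers.

```python
import math

def propellant_needed(m_dry, delta_v, isp, g0=9.81):
    """Propellant to push a dry mass through delta_v (Tsiolkovsky)."""
    return m_dry * (math.exp(delta_v / (isp * g0)) - 1.0)

def boiloff_loss(m_prop, rate_per_day, days):
    """Cryogenic propellant lost to compounding daily boiloff."""
    return m_prop * (1.0 - (1.0 - rate_per_day) ** days)

# Illustrative figures only: 100 t of cargo toward trans-Mars injection,
# LOX/LH2 Isp of 450 s, and two depot staging points.
dv_from_leo = 3.6e3   # m/s, roughly LEO departure to TMI
dv_from_l1 = 1.0e3    # m/s, assumed cheaper departure from Earth-Moon L1

p_leo = propellant_needed(100e3, dv_from_leo, 450.0)
p_l1 = propellant_needed(100e3, dv_from_l1, 450.0)
loss = boiloff_loss(p_l1, rate_per_day=0.001, days=90)
print(f"{p_leo/1e3:.0f} t vs {p_l1/1e3:.0f} t; 90-day boiloff {loss/1e3:.1f} t")
```

    Even with these crude numbers, a 90-day boiloff loss is a small fraction of the propellant swing driven by the departure delta-v, mirroring the study's conclusion.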

  10. Parameterized Micro-benchmarking: An Auto-tuning Approach for Complex Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Wenjing; Krishnamoorthy, Sriram; Agrawal, Gagan

    2012-05-15

    Auto-tuning has emerged as an important practical method for creating highly optimized implementations of key computational kernels and applications. However, the growing complexity of architectures and applications is creating new challenges for auto-tuning. Complex applications can involve a prohibitively large search space that precludes empirical auto-tuning. Similarly, architectures are becoming increasingly complicated, making it hard to model performance. In this paper, we focus on the challenge to auto-tuning presented by applications with a large number of kernels and kernel instantiations. While these kernels may share a somewhat similar pattern, they differ considerably in problem sizes and the exact computation performed. We propose and evaluate a new approach to auto-tuning which we refer to as parameterized micro-benchmarking. It is an alternative to the two existing classes of approaches to auto-tuning: analytical model-based and empirical search-based. Particularly, we argue that the former may not be able to capture all the architectural features that impact performance, whereas the latter might be too expensive for an application that has several different kernels. In our approach, different expressions in the application, different possible implementations of each expression, and the key architectural features are used to derive a simple micro-benchmark and a small parameter space. This allows us to learn the most significant features of the architecture that can impact the choice of implementation for each kernel. We have evaluated our approach in the context of GPU implementations of tensor contraction expressions encountered in excited-state calculations in quantum chemistry. We have focused on two aspects of GPUs that affect tensor contraction execution: memory access patterns and kernel consolidation. Using our parameterized micro-benchmarking approach, we obtain a speedup of up to 2x over the version that used default optimizations but no auto-tuning. We demonstrate that observations made from micro-benchmarks match the behavior seen from real expressions. In the process, we make important observations about the memory hierarchy of two of the most recent NVIDIA GPUs, which can be used in other optimization frameworks as well.
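    The empirical half of such an approach, timing a small set of implementation variants across problem sizes and keeping the winner per size, can be sketched as follows. The two Python "variants" stand in for GPU kernel versions and are purely illustrative.

```python
import timeit

def autotune(variants, problem_sizes, repeats=3):
    """Time each implementation variant on each problem size and pick the
    fastest, mimicking an empirical sweep over a small parameter space."""
    best = {}
    for n in problem_sizes:
        timings = {
            name: min(timeit.repeat(lambda f=fn: f(n), number=5, repeat=repeats))
            for name, fn in variants.items()
        }
        best[n] = min(timings, key=timings.get)
    return best

# Two hypothetical kernel variants with different fixed and per-element costs.
def comprehension(n):
    return [i * i for i in range(n)]

def generator_sum(n):
    return sum(i * i for i in range(n))

variants = {"list": comprehension, "gen": generator_sum}
result = autotune(variants, [100, 100_000])
print(result)
```

    The parameterized micro-benchmarking idea replaces the full kernels in such a sweep with small derived benchmarks, so the per-size winner can be learned without exhaustively timing every real kernel instantiation.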

  11. Perceptual control models of pursuit manual tracking demonstrate individual specificity and parameter consistency.

    PubMed

    Parker, Maximilian G; Tyson, Sarah F; Weightman, Andrew P; Abbott, Bruce; Emsley, Richard; Mansell, Warren

    2017-11-01

    Computational models that simulate individuals' movements in pursuit-tracking tasks have been used to elucidate mechanisms of human motor control. Whilst there is evidence that individuals demonstrate idiosyncratic control-tracking strategies, it remains unclear whether models can be sensitive to these idiosyncrasies. Perceptual control theory (PCT) provides a unique model architecture with an internally set reference value parameter, and can be optimized to fit an individual's tracking behavior. The current study investigated whether PCT models could show temporal stability and individual specificity over time. Twenty adults completed three blocks of 15 1-min, pursuit-tracking trials. Two blocks (training and post-training) were completed in one session and the third was completed after 1 week (follow-up). The target moved in a one-dimensional, pseudorandom pattern. PCT models were optimized to the training data using a least-mean-squares algorithm, and validated with data from post-training and follow-up. We found significant inter-individual variability (partial η²: .464-.697) and intra-individual consistency (Cronbach's α: .880-.976) in parameter estimates. Polynomial regression revealed that all model parameters, including the reference value parameter, contribute to simulation accuracy. Participants' tracking performances were significantly more accurately simulated by models developed from their own tracking data than by models developed from other participants' data. We conclude that PCT models can be optimized to simulate the performance of an individual and that the test-retest reliability of individual models is a necessary criterion for evaluating computational models of human performance.
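    The PCT architecture described here closes a negative-feedback loop around a perceived variable: output acts to keep perception at an internally set reference value. A minimal sketch of such a loop for pursuit tracking follows; the gain, reference value, and target trajectory are illustrative, not fitted parameters from the study.

```python
def pct_track(target, gain=5.0, reference=0.0, dt=0.05):
    """Perceptual control loop for pursuit tracking: the output moves the
    cursor so the perceived cursor-target error is driven to the reference."""
    cursor, trace = 0.0, []
    for t in target:
        perception = cursor - t            # perceived relative position
        error = reference - perception     # discrepancy from internal reference
        cursor += gain * error * dt        # output acts to cancel the error
        trace.append(cursor)
    return trace

target = [0.1 * k for k in range(40)]      # target ramps away at constant speed
trace = pct_track(target)
print(trace[-1], target[-1])
```

    With a proportional output and a ramping target, the cursor settles into tracking with a constant lag set by the loop gain; fitting gain and reference to an individual's data is what gives the model its person-specific character.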

  12. Promoting A-Priori Interoperability of HLA-Based Simulations in the Space Domain: The SISO Space Reference FOM Initiative

    NASA Technical Reports Server (NTRS)

    Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.

    2016-01-01

    Distributed and real-time simulation plays a key role in the Space domain, where it is exploited for mission and systems analysis and engineering as well as for crew training and operational support. One of the most popular standards is the IEEE 1516-2010 Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g., NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.
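    The publish/subscribe discipline HLA imposes, where a federate may exchange only object classes declared in the shared FOM, can be sketched with a toy RTI stand-in. The class names and the API below are illustrative inventions, not the IEEE 1516 federate interface.

```python
class ToyRTI:
    """Minimal stand-in for an RTI: federates may exchange an object class
    only if it is declared in the shared FOM; subscribers receive updates."""
    def __init__(self, fom_classes):
        self.fom = set(fom_classes)
        self.subs = {}

    def subscribe(self, federate, object_class):
        if object_class not in self.fom:
            raise ValueError(f"{object_class} not declared in the FOM")
        self.subs.setdefault(object_class, []).append(federate)

    def update(self, object_class, attributes):
        if object_class not in self.fom:
            raise ValueError(f"{object_class} not declared in the FOM")
        return {f: attributes for f in self.subs.get(object_class, [])}

rti = ToyRTI(["Spacecraft", "ReferenceFrame"])   # hypothetical FOM classes
rti.subscribe("crew_trainer", "Spacecraft")
print(rti.update("Spacecraft", {"position": (7000.0, 0.0, 0.0)}))
```

    A shared reference FOM amounts to every collaborating organization agreeing on the declared class set up front, so federates built independently can still exchange updates.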

  13. NASA Human Spaceflight Architecture Team: Lunar Surface Exploration Strategies

    NASA Technical Reports Server (NTRS)

    Mueller, Rob P.

    2012-01-01

    NASA's agency-wide Human Spaceflight Architecture Team (HAT) has been developing Design Reference Missions (DRMs) to support the ongoing effort to characterize NASA's future human exploration strategy. The DRM design effort includes specific articulations of the transportation and surface elements, technologies, and operations required to enable future human exploration of various destinations, including the Moon, Near Earth Asteroids (NEAs), and Mars, as well as interim cis-lunar targets. In prior architecture studies, transportation concerns have dominated the analysis. As a result, an effort was made to study the human utilization strategy at each specific destination and the resultant impacts on the overall architecture design. In particular, this paper considers various lunar surface strategies as representative scenarios that could occur in a human lunar return, and demonstrates their alignment with the internationally developed Global Exploration Roadmap (GER).

  14. Empowering citizens with access control mechanisms to their personal health resources.

    PubMed

    Calvillo, J; Román, I; Roa, L M

    2013-01-01

    Advancements in information and communication technologies have allowed the development of new approaches to the management and use of healthcare resources. Nowadays it is possible to address complex issues such as meaningful access to distributed data or communication and understanding among heterogeneous systems. As a consequence, the discussion focuses on the administration of the whole set of resources providing knowledge about a single subject of care (SoC). New trends make the SoC the administrator of, and the party responsible for, all these elements (related to his/her demographic data, health, well-being, social conditions, etc.), and s/he is granted the ability to control access to them by third parties. The subject of care thus exchanges a passive role without any decision capacity for an active one that allows him/her to control who accesses what. We study the access control infrastructure necessary to support this approach and develop mechanisms based on semantic tools to assist the subject of care with the specification of access control policies. This infrastructure is a building block of a wider scenario, the Person-Oriented Virtual Organization (POVO), which aims at integrating all the resources related to each citizen's health-related data. The POVO covers the wide range and heterogeneity of available healthcare resources (e.g., information sources, monitoring devices, or software simulation tools) and grants each SoC access control over them. Several methodological issues are crucial for the design of the targeted infrastructure. The distributed system concept and focus are reviewed from the service-oriented architecture (SOA) perspective. The main frameworks for the formalization of distributed system architectures (Reference Model-Open Distributed Processing, RM-ODP; and Model Driven Architecture, MDA) are introduced, as well as how the use of the Unified Modelling Language (UML) is standardized.
    The specification of access control policies and decision-making mechanisms are essential keys to this approach, and they are accomplished by using semantic technologies (i.e., ontologies, rule languages, and inference engines). The results are mainly focused on the security and access control of the proposed scenario. An ontology has been designed and developed for the POVO, covering the terminology of the scenario and easing the automation of administration tasks. Over that ontology, an access control mechanism based on rule languages allows access control policies to be specified, and an inference engine performs the decision-making process automatically. The usability of solutions that ease administration tasks for the SoC is improved by the Me-As-An-Admin (M3A) application. This guides the SoC through the specification of personal access control policies for his/her distributed resources by using semantic technologies (e.g., metamodeling, model-to-text transformations, etc.). All results are developed as services and included in an architecture in accordance with standards and principles of openness and interoperability. Current technology can make health, social, and well-being care actually centered on citizens, granting each person the management of his/her health information. However, applying technology without adopting methodologies or normalized guidelines will reduce the interoperability of the solutions developed, failing to deliver advanced services and improved scenarios for health delivery. Standards and reference architectures can be cornerstones for future-proof and powerful developments. Finally, not only must technology follow citizen-centric approaches, but the gaps needing legislative efforts that support these new paradigms of healthcare delivery must also be identified and addressed. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
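    Stripped of ontologies and inference engines, the policy evaluation described above reduces to matching request attributes against SoC-authored rules with a default-deny effect. A minimal sketch follows; the roles, resources, and policies are hypothetical.

```python
def evaluate(policies, request):
    """First-match policy evaluation: each policy pairs a condition over the
    request attributes with an effect; the default effect is deny."""
    for condition, effect in policies:
        if all(request.get(k) == v for k, v in condition.items()):
            return effect
    return "deny"

# SoC-authored policies over distributed health resources (toy attributes).
policies = [
    ({"role": "gp", "resource": "medication_list"}, "permit"),
    ({"role": "insurer"}, "deny"),
]

print(evaluate(policies, {"role": "gp", "resource": "medication_list"}))
print(evaluate(policies, {"role": "insurer", "resource": "medication_list"}))
print(evaluate(policies, {"role": "lab", "resource": "genome"}))
```

    In the described architecture, the conditions would be expressed over ontology concepts rather than string attributes, letting the inference engine match a request against semantically related terms the SoC never wrote explicitly.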

  15. Network-driven design principles for neuromorphic systems.

    PubMed

    Partzsch, Johannes; Schüffny, Rene

    2015-01-01

    Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step-by-step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components for reducing total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. In total, our methods allow designers to implement more area-efficient neuromorphic systems and verify usability of the connectivity resources in these systems.

  16. Network-driven design principles for neuromorphic systems

    PubMed Central

    Partzsch, Johannes; Schüffny, Rene

    2015-01-01

    Synaptic connectivity is typically the most resource-demanding part of neuromorphic systems. Commonly, the architecture of these systems is chosen mainly on technical considerations. As a consequence, the potential for optimization arising from the inherent constraints of connectivity models is left unused. In this article, we develop an alternative, network-driven approach to neuromorphic architecture design. We describe methods to analyse the performance of existing neuromorphic architectures in emulating certain connectivity models. Furthermore, we show step-by-step how to derive a neuromorphic architecture from a given connectivity model. For this, we introduce a generalized description for architectures with a synapse matrix, which takes into account shared use of circuit components for reducing total silicon area. Architectures designed with this approach are fitted to a connectivity model, essentially adapting to its connection density. They guarantee faithful reproduction of the model on chip while requiring less total silicon area. In total, our methods allow designers to implement more area-efficient neuromorphic systems and verify usability of the connectivity resources in these systems. PMID:26539079
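
    The silicon-area argument can be sketched with back-of-the-envelope arithmetic: a full crossbar provisions N² synapses, while an architecture fitted to a model's connection density p provisions only the expected fan-in per neuron. The numbers and the margin factor below are illustrative assumptions, not figures from the paper.

```python
import math

def crossbar_synapses(n_neurons):
    # full N x N synapse matrix: every neuron can reach every neuron
    return n_neurons * n_neurons

def density_adapted_synapses(n_neurons, p, margin=1.25):
    # provision only the expected fan-in per neuron, with a safety margin
    fan_in = math.ceil(p * n_neurons * margin)
    return n_neurons * fan_in

full = crossbar_synapses(1024)
adapted = density_adapted_synapses(1024, p=0.1)
print(full, adapted)  # 1048576 131072
```

    Even with a generous margin, a 10%-dense model needs roughly an eighth of the crossbar's synapse circuits, which is the area saving the abstract refers to.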

  17. Demand Activated Manufacturing Architecture (DAMA) model for supply chain collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CHAPMAN,LEON D.; PETERSEN,MARJORIE B.

    The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise architecture and collaborative model for supply chains. This model will enable improved collaborative business across any supply chain. The DAMA Model for Supply Chain Collaboration is a high-level model for collaboration to achieve Demand Activated Manufacturing. The five major elements of the architecture to support collaboration are (1) activity or process, (2) information, (3) application, (4) data, and (5) infrastructure. These five elements are tied to the application of the DAMA architecture to three phases of collaboration - prepare, pilot, and scale. There are six collaborative activities that may be employed in this model: (1) Develop Business Planning Agreements, (2) Define Products, (3) Forecast and Plan Capacity Commitments, (4) Schedule Product and Product Delivery, (5) Expedite Production and Delivery Exceptions, and (6) Populate Supply Chain Utility. The Supply Chain Utility is a set of applications implemented to support collaborative product definition, forecast visibility, planning, scheduling, and execution. The DAMA architecture and model will be presented along with the process for implementing this DAMA model.

  18. [Utilitarian goals and artistic autonomy architectural forms and their functions].

    PubMed

    Thibault, Estelle

    2012-01-01

    In the late 19th century, authors writing on aesthetics often referred to architecture to justify establishing a new hierarchy between things beautiful and things useful, a change underwritten by the rising sociological and anthropological perspectives on art. Meanwhile, architects debated the origins and evolution of artistic styles from the earliest forms of art to the most advanced monumental art works, a debate that fundamentally transformed the relationship between artistic expression and material determinism.

  19. Avionics Architecture Standards as an Approach to Obsolescence Management

    DTIC Science & Technology

    2000-10-01

    ...and goals is one method of achieving the necessary critical mass of skilled... The term System Architecture refers to a consistent set of such... General Processing Module (GPM), Mass Memory Module (MMM) and Power Conversion Module (PCM). ...executed on the modules within an ASAAC system will be stored in a central location, the Mass Memory Module (MMM). Therefore, if modules are to be... MOS - Module Support Layer to Operating System. The purpose of the MOS

  20. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

    The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing and data processing services for a varied fleet of satellites to support weather prediction, modeling and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool, are described.

  1. Spatiotemporal Features of the Three-Dimensional Architectural Landscape in Qingdao, China.

    PubMed

    Zhang, Peifeng

    2015-01-01

    The evolution and development of the three-dimensional (3D) architectural landscape is the basis of proper urban planning, eco-environment construction and the improvement of environmental quality. This paper presents the spatiotemporal characteristics of the 3D architectural landscape of the Shinan and Shibei districts in Qingdao, China, based on buildings' 3D information extracted from Quickbird images from 2003 to 2012, supported by Barista, landscape metrics and GIS. The results demonstrated that: (1) Shinan and Shibei districts expanded vertically and urban land use intensity increased noticeably from year to year. (2) Significant differences in the 3D architectural landscape existed among the western, central and eastern regions, and among the 26 sub-districts over the study period. The differentiation was consistent with the diverse development history, function and planning of the two districts. Finally, we found that population correlates positively with the variation in the 3D architectural landscape. This research provides an important reference for related studies, urban planning and eco-city construction.

  2. Spatiotemporal Features of the Three-Dimensional Architectural Landscape in Qingdao, China

    PubMed Central

    Zhang, Peifeng

    2015-01-01

    The evolution and development of the three-dimensional (3D) architectural landscape is the basis of proper urban planning, eco-environment construction and the improvement of environmental quality. This paper presents the spatiotemporal characteristics of the 3D architectural landscape of the Shinan and Shibei districts in Qingdao, China, based on buildings’ 3D information extracted from Quickbird images from 2003 to 2012, supported by Barista, landscape metrics and GIS. The results demonstrated that: (1) Shinan and Shibei districts expanded vertically and urban land use intensity increased noticeably from year to year. (2) Significant differences in the 3D architectural landscape existed among the western, central and eastern regions, and among the 26 sub-districts over the study period. The differentiation was consistent with the diverse development history, function and planning of the two districts. Finally, we found that population correlates positively with the variation in the 3D architectural landscape. This research provides an important reference for related studies, urban planning and eco-city construction. PMID:26361016

  3. Lunar Outpost Life Support Architecture Study Based on a High-Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2010-01-01

    This paper presents results of a life support architecture study based on a 2009 NASA lunar surface exploration scenario known as Scenario 12. The study focuses on the assembly complete outpost configuration and includes pressurized rovers as part of a distributed outpost architecture in both stand-alone and integrated configurations. A range of life support architectures are examined reflecting different levels of closure and distributed functionality. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual Lander oxygen and hydrogen propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Surpluses or deficits of water and oxygen are reported for each architecture, along with fixed and 10-year total equivalent system mass estimates relative to a reference case. System robustness is discussed in terms of the probability of no water or oxygen resupply as determined from the Monte Carlo simulations.
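
    The Monte Carlo treatment described above can be illustrated with a toy water-balance simulation: each trial draws the uncertain mission variables (scavenged propellant, EVA hours, habitat leakage) and checks whether production covers consumption. Every distribution and constant below is invented for illustration and is not drawn from Scenario 12.

```python
import random

def simulate_water_balance(n_trials=10000, seed=1):
    """Toy Monte Carlo: probability of a water deficit over many trials.
    All distributions and constants are illustrative assumptions."""
    random.seed(seed)
    deficits = 0
    for _ in range(n_trials):
        scavenged = random.uniform(0, 500)     # kg water from residual lander propellants
        eva_hours = random.uniform(500, 1500)  # total EVA hours per year
        leakage = random.uniform(0, 50)        # kg/yr lost to habitat leakage
        produced = 2000 + scavenged            # nominal recovery plus scavenging
        consumed = 1800 + 0.2 * eva_hours + leakage
        if produced < consumed:
            deficits += 1
    return deficits / n_trials

print(simulate_water_balance())
```

    Sweeping the input ranges in such a model is how the sensitivity of surpluses or deficits to each mission variable, and the probability of needing no resupply, can be estimated.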

  4. The Swedish strategy and method for development of a national healthcare information architecture.

    PubMed

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted National e-Health - the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for the development of a National Healthcare Information Architecture is to achieve high-level semantic interoperability for clinical content and clinical contexts. High-level semantic interoperability requires consistently structured clinical data and other types of data, with coherent traceability, to be mapped to reference clinical models. Archetypes, which are formal definitions of clinical and demographic concepts and of some administrative data, were developed. Each archetype describes the information structure and content of an overarching core clinical concept. Information that is defined in archetypes should be used for different purposes. The generic clinical process model was made concrete and analyzed.
For each decision-making step in the process where information is processed, the amount and type of information and its structure were defined in terms of reference templates. Reference templates manage clinical, administrative and demographic types of information in a specific clinical context. Based on a survey of clinical processes at the reference level, specific clinical processes, such as diabetes and congestive heart failure in adults, were identified. Process-specific templates were defined by using reference templates and populated with information relevant to each health problem in a specific clinical context. Throughout this process, medical data for knowledge management were collected for each health problem. In parallel with the efforts to define archetypes and templates, terminology binding work is ongoing. Different strategies are used depending on the terminology binding level.

  5. Mobile platform for treatment of stroke: A case study of tele-assistance.

    PubMed

    Torres Zenteno, Arturo Henry; Fernández, Francisco; Palomino-García, Alfredo; Moniche, Francisco; Escudero, Irene; Jiménez-Hernández, M Dolores; Caballero, Auxiliadora; Escobar-Rodriguez, Germán; Parra, Carlos

    2016-09-01

    This article presents the technological solution of a tele-assistance process for stroke patients in the acute phase in the Seville metropolitan area. The main objective of this process is to reduce the time from symptom onset to treatment of acute-phase stroke patients by means of telemedicine, covering mobility between an intensive care unit ambulance and an expert center and activating the pre-hospital care phase. The technological platform covering the process has been defined following an interoperability model based on standards and with a service-oriented architecture focus. The messaging definition has been designed according to the reference model of CEN/ISO 13606; message content follows the structure of archetypes. An XDS-b (Cross-Enterprise Document Sharing-b) transaction messaging has been designed according to the Integrating the Healthcare Enterprise profile for archetype notifications and update enquiries. This research has been performed by a multidisciplinary group. The Virgen del Rocío University Hospital acts as the Reference Hospital and the Public Company for Healthcare as the mobility surroundings. © The Author(s) 2015.

  6. Data flow language and interpreter for a reconfigurable distributed data processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, A.D.; Heath, J.R.

    1982-01-01

    An analytic language and an interpreter whereby an application's data flow graph may serve as an input to a reconfigurable distributed data processor are proposed. The architecture considered consists of a number of loosely coupled computing elements (CEs) which may be linked to data and file memories through fully nonblocking interconnect networks. The real-time performance of such an architecture depends upon its ability to alter its topology in response to changes in application, asynchronous data rates and faults. Such a data flow language enhances the versatility of a reconfigurable architecture by allowing the user to specify the machine's topology at a very high level. 11 references.

  7. A task-based support architecture for developing point-of-care clinical decision support systems for the emergency department.

    PubMed

    Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B

    2013-01-01

    The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method that allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing the knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE, a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model and interacts with hospital information systems. The proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.

  8. Connected Vehicle Reference Implementation Architecture

    DOT National Transportation Integrated Search

    2016-11-16

    Over the past 10 years, the U.S. Department of Transportation (USDOT) has researched and developed connected vehicle technology, which allows vehicles to communicate with each other, roadway infrastructure, traffic management centers, and travelers' ...

  9. Efficient Graph Based Assembly of Short-Read Sequences on Hybrid Core Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sczyrba, Alex; Pratap, Abhishek; Canon, Shane

    2011-03-22

    Advanced architectures can deliver dramatically increased throughput for genomics and proteomics applications, reducing time-to-completion in some cases from days to minutes. One such architecture, hybrid-core computing, marries a traditional x86 environment with a reconfigurable coprocessor based on field programmable gate array (FPGA) technology. In addition to higher throughput, increased performance can fundamentally improve research quality by allowing more accurate, previously impractical approaches. We will discuss the approach used by Convey's de Bruijn graph constructor for short-read, de-novo assembly. Bioinformatics applications that have random access patterns to large memory spaces, such as graph-based algorithms, experience memory performance limitations on cache-based x86 servers. Convey's highly parallel memory subsystem allows application-specific logic to simultaneously access 8192 individual words in memory, significantly increasing effective memory bandwidth over cache-based memory systems. Many algorithms, such as Velvet and other de Bruijn graph based, short-read, de-novo assemblers, can greatly benefit from this type of memory architecture. Furthermore, small data type operations (four nucleotides can be represented in two bits) make more efficient use of logic gates than the data types dictated by conventional programming models. JGI is comparing the performance of Convey's graph constructor and Velvet on both synthetic and real data. We will present preliminary results on memory usage and run time metrics for various data sets of different sizes, from small microbial and fungal genomes to a very large cow rumen metagenome. For genomes with references we will also present assembly quality comparisons between the two assemblers.
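
    The core data structure named above is easy to sketch: a de Bruijn graph over k-mers, plus the two-bit nucleotide packing the abstract alludes to. This is a plain-Python illustration of the technique, not Convey's FPGA constructor.

```python
from collections import defaultdict

# Four nucleotides fit in two bits, as the abstract notes.
ENC = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(kmer):
    """Pack a k-mer into an integer, two bits per base."""
    value = 0
    for base in kmer:
        value = (value << 2) | ENC[base]
    return value

def de_bruijn(reads, k):
    """Map each k-mer to the set of k-mers that follow it in some read."""
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k):
            graph[read[i:i + k]].add(read[i + 1:i + k + 1])
    return graph

g = de_bruijn(["ACGTACG"], k=3)
print(sorted(g["ACG"]), pack("ACG"))  # ['CGT'] 6
```

    Assemblers like Velvet traverse such a graph to recover contigs; the random access into a huge k-mer table is exactly the memory pattern the hybrid-core memory subsystem is said to accelerate.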

  10. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
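
    The payoff/latency/up-front-investment trade-off described above can be caricatured in a few lines of discrete-time simulation: a distributed architecture pays a larger up-front cost but earns a higher per-step payoff, and the crossover step is the latency period before its advantage is manifest. All costs and payoffs below are invented for illustration.

```python
def cumulative_payoff(upfront, per_step, steps):
    """Cumulative payoff over time for a constant-rate architecture."""
    return [-upfront + per_step * t for t in range(steps + 1)]

def crossover_step(a, b):
    """First time step at which architecture a's cumulative payoff exceeds b's."""
    for t, (pa, pb) in enumerate(zip(a, b)):
        if pa > pb:
            return t
    return None

baseline = cumulative_payoff(upfront=10, per_step=3, steps=40)
distributed = cumulative_payoff(upfront=50, per_step=5, steps=40)
print(crossover_step(distributed, baseline))  # 21
```

    An agent-based simulation replaces the constant rates with payoffs that emerge from interacting assets, but the same crossover analysis applies to its outputs.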

  11. CLOCS (Computer with Low Context-Switching Time) Architecture Reference Documents

    DTIC Science & Technology

    1988-05-06

    Peculiarities: The only state inside the central processing unit (CPU) is a program status word. All data operations are memory to memory. One result of this... to the challenge "if I were to design RISC, this is how I would do it." The architecture was designed by Mark Davis and Bill Gallmeister. 1.2...are memory to memory. Any special devices added should be memory mapped. The program counter is even memory mapped. 1.3.1 Working storage There is no

  12. A Reference Software Architecture to Support Unmanned Aircraft Integration in the National Airspace System

    DTIC Science & Technology

    2012-07-01

    ...and Avoid (SAA) testbed that provides some of the core services. This paper describes the general architecture and a SAA testbed implementation that... provides data and software services to enable a set of Unmanned Aircraft (UA) platforms to operate in a wide range of air domains which may... implemented by MIT Lincoln Laboratory in the form of a Sense and Avoid (SAA) testbed that provides some of the core services.

  13. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS) most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enabling advanced ESS applications. 
Currently both geo-information and Grid technologies have reached a high level of maturity, allowing such an integration to be built on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely deployed in Europe and beyond, has proven highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, some obstacles must be overcome to achieve this harmonization. Firstly, a semantic mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and with no embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs. 
Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European and international projects investigated different aspects of this integration, developing demonstrators and proof-of-concepts. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, OGF). 
In the first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of the other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Some proof-of-concepts have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of the Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans and the means for the ESSI community to get involved. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010

  14. Electromagnetic Physics Models for Parallel Computing Architectures

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.

  15. Modelling of internal architecture of kinesin nanomotor as a machine language.

    PubMed

    Khataee, H R; Ibrahim, M Y

    2012-09-01

    Kinesin is a protein-based natural nanomotor that transports molecular cargoes within cells by walking along microtubules. The kinesin nanomotor is considered a bio-nanoagent that is able to sense the cell through its sensors (i.e. its heads and tail), make decisions internally and perform actions on the cell through its actuator (i.e. its motor domain). The study maps the agent-based architectural model of the internal decision-making process of the kinesin nanomotor to a machine language using an automata algorithm. The applied automata algorithm receives the internal agent-based architectural model of the kinesin nanomotor as a deterministic finite automaton (DFA) model and generates a regular machine language. The generated regular machine language was accepted by the architectural DFA model of the nanomotor and was also in good agreement with its natural behaviour. The internal agent-based architectural model of the kinesin nanomotor indicates the degree of autonomy and intelligence of the nanomotor's interactions with its cell. Thus, our regular machine language can model the degree of autonomy and intelligence of the kinesin nanomotor's interactions with its cell as a language. Modelling the internal architectures of autonomous and intelligent bio-nanosystems as machine languages can lay the foundation for the concept of bio-nanoswarms and the next phases of bio-nanorobotic systems development.
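The mapping described above rests on standard DFA acceptance: a string belongs to the generated regular language exactly when the automaton, run over it, ends in an accepting state. A minimal sketch, assuming a hypothetical two-state automaton (the states and alphabet below are illustrative, not the nanomotor's actual sensor/actuator states):

```python
def dfa_accepts(transitions, start, accepting, string):
    """Return True if the DFA accepts the input string."""
    state = start
    for symbol in string:
        if (state, symbol) not in transitions:
            return False  # undefined transition: reject
        state = transitions[(state, symbol)]
    return state in accepting

# Toy DFA over {'a', 'b'} accepting strings that end in 'b'
transitions = {
    ('q0', 'a'): 'q0', ('q0', 'b'): 'q1',
    ('q1', 'a'): 'q0', ('q1', 'b'): 'q1',
}
print(dfa_accepts(transitions, 'q0', {'q1'}, 'aab'))  # True
print(dfa_accepts(transitions, 'q0', {'q1'}, 'aba'))  # False
```

In the paper's terms, the transition table would encode the nanomotor's internal state changes and the accepted strings would form the regular machine language.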

  16. Detailed Primitive-Based 3D Modeling of Architectural Elements

    NASA Astrophysics Data System (ADS)

    Remondino, F.; Lo Buglio, D.; Nony, N.; De Luca, L.

    2012-07-01

    The article describes an image-based pipeline for the 3D reconstruction of building façades or architectural elements and their successive modeling using geometric primitives. The approach overcomes some existing problems in modeling architectural elements and delivers efficient-in-size, reality-based, textured 3D models useful for metric applications. For the 3D reconstruction, an open-source pipeline developed within the TAPENADE project is employed. In the successive modeling steps, the user manually selects an area containing an architectural element (capital, column, bas-relief, window tympanum, etc.) and the procedure then fits geometric primitives and computes disparity and displacement maps in order to tie visual and geometric information together in a light but detailed 3D model. Examples are reported and commented on.

  17. Shaping Gene Expression by Landscaping Chromatin Architecture: Lessons from a Master.

    PubMed

    Sartorelli, Vittorio; Puri, Pier Lorenzo

    2018-05-19

    Since its discovery as a skeletal muscle-specific transcription factor able to reprogram somatic cells into differentiated myofibers, MyoD has provided an instructive model to understand how transcription factors regulate gene expression. Reciprocally, studies of other transcriptional regulators have provided testable hypotheses to further understand how MyoD activates transcription. Using MyoD as a reference, in this review, we discuss the similarities and differences in the regulatory mechanisms employed by tissue-specific transcription factors to access DNA and regulate gene expression by cooperatively shaping the chromatin landscape within the context of cellular differentiation. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Towards Formal Verification of a Separation Microkernel

    NASA Astrophysics Data System (ADS)

    Butterfield, Andrew; Sanan, David; Hinchey, Mike

    2013-08-01

    The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.

  19. GRADUATE AND PROFESSIONAL EDUCATION, AN ANNOTATED BIBLIOGRAPHY.

    ERIC Educational Resources Information Center

    HEISS, ANN M.; AND OTHERS

    THIS ANNOTATED BIBLIOGRAPHY CONTAINS REFERENCES TO GENERAL GRADUATE EDUCATION AND TO EDUCATION FOR THE FOLLOWING PROFESSIONAL FIELDS--ARCHITECTURE, BUSINESS, CLINICAL PSYCHOLOGY, DENTISTRY, ENGINEERING, LAW, LIBRARY SCIENCE, MEDICINE, NURSING, SOCIAL WORK, TEACHING, AND THEOLOGY. (HW)

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yao; Balaprakash, Prasanna; Meng, Jiayuan

    We present Raexplore, a performance modeling framework for architecture exploration. Raexplore enables rapid, automated, and systematic search of the architecture design space by combining hardware-counter-based performance characterization and analytical performance modeling. We demonstrate Raexplore for two recent manycore processors, the IBM Blue Gene/Q compute chip and the Intel Xeon Phi, targeting a set of scientific applications. Our framework is able to capture complex interactions between architectural components including the instruction pipeline, cache, and memory, and to achieve a 3–22% error for same-architecture and cross-architecture performance predictions. Furthermore, we apply our framework to assess the two processors, and discover and evaluate a list of architectural scaling options for future processor designs.

  1. Little by Little Does the Trick: Design and Construction of a Discrete Event Agent-Based Simulation Framework

    DTIC Science & Technology

    2007-12-01

    ... and a Behavioral model. Finally, we build a small agent-based model using the component architecture to demonstrate the library's functionality. ... prototypes an architectural design which is generalizable, reusable, and extensible. We have created an initial set of model elements that demonstrate ...

  2. An independent review of the Multi-Path Redundant Avionics Suite (MPRAS) architecture assessment and characterization report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M.R.

    1991-02-01

    In recent years the NASA Langley Research Center has funded several contractors to conduct conceptual designs defining architectures for fault tolerant computer systems. Such a system is referred to as a Multi-Path Redundant Avionics Suite (MPRAS), and would form the basis for avionics systems that would be used in future families of space vehicles in a variety of missions. The principal contractors were General Dynamics, Boeing, and Draper Laboratories. These contractors participated in a series of review meetings, and submitted final reports defining their candidate architectures. NASA then commissioned the Research Triangle Institute (RTI) to perform an assessment of these architectures to identify strengths and weaknesses of each. This report is a separate, independent review of the RTI assessment, done primarily to assure that the assessment was comprehensive and objective. The report also includes general recommendations relative to further MPRAS development.

  3. Architectural programming for the workplace and the careplace.

    PubMed

    Easter, James G

    2002-01-01

    Sensitive planning and architectural design will impact long-term costs and daily operations. At the same time, the quality of the total environment has a direct impact on the patient, the family and the staff. These needs should be carefully balanced with the emotions of the patient, the care partner (parent, husband, wife or guardian) and those of the clinical team (physicians, nurses and staff). This article addresses the first step in the process, the master plan, and then focuses in detail on one aspect of the architectural work referred to as architectural programming. The key to the process is selecting the best team of consultants, following the steps carefully, involving the client at every appropriate milestone along the way and asking the right questions. With this experienced team on board, following the proper steps, listening carefully to the answers and observing the daily process, one can expect a successful product.

  4. Plant architecture, growth and radiative transfer for terrestrial and space environments

    NASA Technical Reports Server (NTRS)

    Norman, John M.; Goel, Narendra S.

    1993-01-01

    The overall objective of this research was to develop a hardware implemented model that would incorporate realistic and dynamic descriptions of canopy architecture in physiologically based models of plant growth and functioning, with an emphasis on radiative transfer while accommodating other environmental constraints. The general approach has five parts: a realistic mathematical treatment of canopy architecture, a methodology for combining this general canopy architectural description with a general radiative transfer model, the inclusion of physiological and environmental aspects of plant growth, inclusion of plant phenology, and integration.

  5. Roofline model toolkit: A practical tool for architectural and program analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo, Yu Jung; Williams, Samuel; Van Straalen, Brian

    We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented microbenchmarks implemented with the Message Passing Interface (MPI) and OpenMP to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
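The Roofline model underlying the toolkit bounds attainable performance by the minimum of peak compute throughput and the bandwidth-limited rate at a given arithmetic intensity. A minimal sketch of that bound (the peak and bandwidth figures below are illustrative, not measurements from the paper):

```python
def roofline_gflops(arith_intensity, peak_gflops, peak_bw_gbs):
    """Attainable GFLOP/s at a given arithmetic intensity (FLOPs/byte)."""
    return min(peak_gflops, arith_intensity * peak_bw_gbs)

# Illustrative machine balance: 200 GFLOP/s peak, 30 GB/s sustained bandwidth
for ai in (0.5, 5.0, 50.0):
    print(ai, roofline_gflops(ai, 200.0, 30.0))  # 15.0, 150.0, 200.0
```

Kernels with intensity below the machine balance point (here 200/30 ≈ 6.7 FLOPs/byte) sit on the bandwidth "slope" of the roofline; above it they hit the compute "roof".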

  6. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  7. Development Of A Three-Dimensional Circuit Integration Technology And Computer Architecture

    NASA Astrophysics Data System (ADS)

    Etchells, R. D.; Grinberg, J.; Nudd, G. R.

    1981-12-01

    This paper is the first of a series describing a range of efforts at Hughes Research Laboratories, which are collectively referred to as "Three-Dimensional Microelectronics." The technology being developed is a combination of a unique circuit fabrication/packaging technology and a novel processing architecture. The packaging technology greatly reduces the parasitic impedances associated with signal-routing in complex VLSI structures, while simultaneously allowing circuit densities orders of magnitude higher than the current state-of-the-art. When combined with the 3-D processor architecture, the resulting machine exhibits a one- to two-order-of-magnitude simultaneous improvement over current state-of-the-art machines in the three areas of processing speed, power consumption, and physical volume. The 3-D architecture is essentially that commonly referred to as a "cellular array", with the ultimate implementation having as many as 512 x 512 processors working in parallel. The three-dimensional nature of the assembled machine arises from the fact that the chips containing the active circuitry of the processor are stacked on top of each other. In this structure, electrical signals are passed vertically through the chips via thermomigrated aluminum feedthroughs. Signals are passed between adjacent chips by micro-interconnects. This discussion presents a broad view of the total effort, as well as a more detailed treatment of the fabrication and packaging technologies themselves. The results of performance simulations of the completed 3-D processor executing a variety of algorithms are also presented. Of particular pertinence to the interests of the focal-plane array community is the simulation of the UNICORNS nonuniformity correction algorithms as executed by the 3-D architecture.

  8. A Framework for Architectural Heritage HBIM Semantization and Development

    NASA Astrophysics Data System (ADS)

    Brusaporci, S.; Maiezza, P.; Tata, A.

    2018-05-01

    Despite the recognized advantages of the use of BIM in the fields of architecture and engineering, the extension of this procedure to the architectural heritage is neither immediate nor trivial. The uniqueness and irregularity of historical architecture, on the one hand, and the great quantity of information necessary for the knowledge of architectural heritage, on the other, require appropriate reflection. The aim of this paper is to define a general framework for the use of BIM procedures for architectural heritage. The proposed methodology consists of three different Levels of Development (LoD), depending on the characteristics of the building and the objectives of the study: a simplified model with low geometric accuracy and a minimum quantity of information (LoD 200); a model closer to reality but still with a high deviation between the virtual and real model (LoD 300); and a detailed BIM model that reproduces the geometric irregularities of the building as closely as possible and is enriched with the maximum quantity of information available (LoD 400).

  9. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach to HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF had not been demonstrated so far. Through an empirical experiment, the paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of a HIS architecture can be improved compared to architectures developed using the traditional RUP process. The semantic quality of the architecture was measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in the earlier stages of HIS development implies an increased quality of the final HIS, which suggests an indirect positive impact on patient care.

  10. Assessment of various parameters to improve MALDI-TOF MS reference spectra libraries constructed for the routine identification of filamentous fungi.

    PubMed

    Normand, Anne-Cécile; Cassagne, Carole; Ranque, Stéphane; L'ollivier, Coralie; Fourquet, Patrick; Roesems, Sam; Hendrickx, Marijke; Piarroux, Renaud

    2013-04-08

    The poor reproducibility of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) spectra limits the effectiveness of MALDI-TOF MS-based identification of filamentous fungi with highly heterogeneous phenotypes in routine clinical laboratories. This study aimed to enhance the MALDI-TOF MS-based identification of filamentous fungi by assessing several architectures of reference spectrum libraries. We established reference spectrum libraries that included 30 filamentous fungus species with various architectures characterized by distinct combinations of the following: i) technical replicates, i.e., the number of analyzed deposits for each culture used to build a reference meta-spectrum (RMS); ii) biological replicates, i.e., the number of RMS derived from distinct subcultures of each strain; and iii) the number of distinct strains of a given species. We then compared the effectiveness of each library in the identification of 200 prospectively collected clinical isolates, including 38 species in 28 genera. Identification effectiveness was improved by increasing the number of both RMS per strain (p<10^-4) and strains for a given species (p<10^-4) in a multivariate analysis. Addressing the heterogeneity of MALDI-TOF spectra derived from filamentous fungi by increasing the number of RMS obtained from distinct subcultures of the strains included in the reference spectrum library markedly improved the effectiveness of MALDI-TOF MS-based identification of clinical filamentous fungi.

  11. An e-consent-based shared EHR system architecture for integrated healthcare networks.

    PubMed

    Bergmann, Joachim; Bott, Oliver J; Pretschner, Dietrich P; Haux, Reinhold

    2007-01-01

    Virtual integration of distributed patient data promises advantages over a consolidated health record, but raises questions mainly about practicability and authorization concepts. Our work aims at the specification and development of a virtual shared health record architecture using a patient-centred integration and authorization model. A literature survey summarizes considerations of current architectural approaches. Complemented by a methodical analysis in two regional settings, a formal architecture model was specified and implemented. The results presented in this paper are a survey of architectural approaches for shared health records and an architecture model for a virtual shared EHR, which combines a patient-centred integration policy with provider-oriented document management. An electronic consent system ensures that access to the shared record remains under the control of the patient. A corresponding system prototype has been developed and is currently being introduced and evaluated in a regional setting. The proposed architecture is capable of partly replacing message-based communications. Operating highly available provider repositories for the virtual shared EHR requires advanced technology and probably means additional costs for care providers. Acceptance of the proposed architecture depends on transparently embedding document validation and digital signatures into the work processes. The paradigm shift from paper-based messaging to a "pull model" needs further evaluation.

  12. Fifth NASA Goddard Conference on Mass Storage Systems and Technologies, Volume 1

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1996-01-01

    This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

  13. MACCIS 2.0 - An Architecture Description Framework for Technical Infostructures and Their Enterprise Environment

    DTIC Science & Technology

    2004-06-01

    ... the security concern, when applied to the different viewpoints, addresses both stakeholders, and is described as a business security model or component ...

  14. A Model for Communications Satellite System Architecture Assessment

    DTIC Science & Technology

    2011-09-01

    This is shown in Equation 4. The total system cost includes all development, acquisition, fielding, operations, maintenance and upgrades, and system...protection. A mathematical model was implemented to enable the analysis of communications satellite system architectures based on multiple system attributes. Utilization of the model in ...

  15. Clinical engineering and risk management in healthcare technological process using architecture framework.

    PubMed

    Signori, Marcos R; Garcia, Renato

    2010-01-01

    This paper presents a model that aids Clinical Engineering in dealing with Risk Management in the Healthcare Technological Process. The healthcare technology setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework - MODAF (Ministry of Defence Architecture Framework) - was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve Clinical Engineering decision making for Risk Management in the Healthcare Technological Process.

  16. An information model for a virtual private optical network (OVPN) using virtual routers (VRs)

    NASA Astrophysics Data System (ADS)

    Vo, Viet Minh Nhat

    2002-05-01

    This paper describes a virtual private optical network (Optical VPN - OVPN) architecture based on virtual routers (VRs). It improves on architectures suggested for virtual private networks by using virtual routers with optical networks. What is new in this architecture are the changes necessary to adapt to the devices and protocols used in optical networks. This paper also presents information models for the OVPN at the architecture level and at the service level. These are extensions of the DEN (Directory Enabled Networks) and CIM (Common Information Model) standards for OVPNs using VRs. The goal is to propose a common management model using policies.

  17. Modeling the evolution of protein domain architectures using maximum parsimony.

    PubMed

    Fong, Jessica H; Geer, Lewis Y; Panchenko, Anna R; Bryant, Stephen H

    2007-02-09

    Domains are basic evolutionary units of proteins and most proteins have more than one domain. Advances in domain modeling and collection are making it possible to annotate a large fraction of known protein sequences by a linear ordering of their domains, yielding their architecture. Protein domain architectures link evolutionarily related proteins and underscore their shared functions. Here, we attempt to better understand this association by identifying the evolutionary pathways by which extant architectures may have evolved. We propose a model of evolution in which architectures arise through rearrangements of inferred precursor architectures and acquisition of new domains. These pathways are ranked using a parsimony principle, whereby scenarios requiring the fewest number of independent recombination events, namely fission and fusion operations, are assumed to be more likely. Using a data set of domain architectures present in 159 proteomes that represent all three major branches of the tree of life allows us to estimate the history of over 85% of all architectures in the sequence database. We find that the distribution of rearrangement classes is robust with respect to alternative parsimony rules for inferring the presence of precursor architectures in ancestral species. Analyzing the most parsimonious pathways, we find 87% of architectures to gain complexity over time through simple changes, among which fusion events account for 5.6 times as many architectures as fission. Our results may be used to compute domain architecture similarities, for example, based on the number of historical recombination events separating them. Domain architecture "neighbors" identified in this way may lead to new insights about the evolution of protein function.
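The fission and fusion operations described above can be made concrete by treating a domain architecture as an ordered tuple of domain identifiers. A minimal sketch (my own illustration of the operations, not the paper's inference algorithm, and the domain names are hypothetical):

```python
def is_fusion(child, left, right):
    """True if `child` could arise from a single fusion of `left` and `right`."""
    return child == left + right

def fission_products(parent):
    """All ways a single fission can split `parent` into two architectures."""
    return [(parent[:i], parent[i:]) for i in range(1, len(parent))]

abcd = ('A', 'B', 'C', 'D')
print(is_fusion(abcd, ('A', 'B'), ('C', 'D')))  # True
print(fission_products(('A', 'B', 'C')))
```

A parsimony search over such operations would rank candidate histories by the number of independent fusion/fission events needed to reach an extant architecture from inferred precursors.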

  18. Modeling the Evolution of Protein Domain Architectures Using Maximum Parsimony

    PubMed Central

    Fong, Jessica H.; Geer, Lewis Y.; Panchenko, Anna R.; Bryant, Stephen H.

    2007-01-01

    Domains are basic evolutionary units of proteins and most proteins have more than one domain. Advances in domain modeling and collection are making it possible to annotate a large fraction of known protein sequences by a linear ordering of their domains, yielding their architecture. Protein domain architectures link evolutionarily related proteins and underscore their shared functions. Here, we attempt to better understand this association by identifying the evolutionary pathways by which extant architectures may have evolved. We propose a model of evolution in which architectures arise through rearrangements of inferred precursor architectures and acquisition of new domains. These pathways are ranked using a parsimony principle, whereby scenarios requiring the fewest number of independent recombination events, namely fission and fusion operations, are assumed to be more likely. Using a data set of domain architectures present in 159 proteomes that represent all three major branches of the tree of life allows us to estimate the history of over 85% of all architectures in the sequence database. We find that the distribution of rearrangement classes is robust with respect to alternative parsimony rules for inferring the presence of precursor architectures in ancestral species. Analyzing the most parsimonious pathways, we find 87% of architectures to gain complexity over time through simple changes, among which fusion events account for 5.6 times as many architectures as fission. Our results may be used to compute domain architecture similarities, for example, based on the number of historical recombination events separating them. Domain architecture “neighbors” identified in this way may lead to new insights about the evolution of protein function. PMID:17166515

  19. Modeling driver behavior in a cognitive architecture.

    PubMed

    Salvucci, Dario D

    2006-01-01

    This paper explores the development of a rigorous computational model of driver behavior in a cognitive architecture--a computational framework with underlying psychological theories that incorporate basic properties and limitations of the human system. Computational modeling has emerged as a powerful tool for studying the complex task of driving, allowing researchers to simulate driver behavior and explore the parameters and constraints of this behavior. An integrated driver model developed in the ACT-R (Adaptive Control of Thought-Rational) cognitive architecture is described that focuses on the component processes of control, monitoring, and decision making in a multilane highway environment. This model accounts for the steering profiles, lateral position profiles, and gaze distributions of human drivers during lane keeping, curve negotiation, and lane changing. The model demonstrates how cognitive architectures facilitate understanding of driver behavior in the context of general human abilities and constraints and how the driving domain benefits cognitive architectures by pushing model development toward more complex, realistic tasks. The model can also serve as a core computational engine for practical applications that predict and recognize driver behavior and distraction.

  20. SANDS: a service-oriented architecture for clinical decision support in a National Health Information Network.

    PubMed

    Wright, Adam; Sittig, Dean F

    2008-12-01

    In this paper, we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. The SANDS architecture for decision support has several significant advantages over other architectures for clinical decision support. The most salient of these are:

  1. Updated Mars Mission Architectures Featuring Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Rodriguez, Mitchell A.; Percy, Thomas K.

    2017-01-01

    Nuclear thermal propulsion (NTP) can potentially enable routine human exploration of Mars and the solar system. By using nuclear fission instead of a chemical combustion process, and using hydrogen as the propellant, NTP systems promise rocket efficiencies roughly twice that of the best chemical rocket engines currently available. The most recent major Mars architecture study featuring NTP was the Design Reference Architecture 5.0 (DRA 5.0), performed in 2009. Currently, the predominant transportation options being considered are solar electric propulsion (SEP) and chemical propulsion; however, given NTP's capabilities, an updated architectural analysis is needed. This paper provides a top-level overview of several different architectures featuring updated NTP performance data. New architectures presented include a proposed update to the DRA 5.0 as well as an investigation of architectures based on the current Evolvable Mars Campaign, which is the focus of NASA's current analyses for the Journey to Mars. Architectures investigated leverage the latest information relating to NTP performance and design considerations and address new support elements not available at the time of DRA 5.0, most notably the Orion crew module and the Space Launch System (SLS). The paper provides a top level quantitative comparison of key performance metrics as well as a qualitative discussion of improvements and key challenges still to be addressed. Preliminary results indicate that the updated NTP architectures can significantly reduce the campaign mass and subsequently the costs for assembly and number of launches.
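The claim that doubling engine efficiency (specific impulse) reduces campaign mass follows directly from the ideal rocket equation. A minimal sketch, assuming illustrative values: the delta-v and the Isp figures of roughly 450 s (chemical) versus 900 s (NTP) are representative numbers, not figures from the architecture study:

```python
import math

def mass_ratio(delta_v_ms, isp_s, g0=9.80665):
    """Initial-to-final mass ratio for a given delta-v and specific impulse."""
    return math.exp(delta_v_ms / (isp_s * g0))

dv = 4000.0  # m/s, a representative trans-Mars injection burn (assumed)
chemical = mass_ratio(dv, 450.0)  # LOX/LH2-class engine
nuclear = mass_ratio(dv, 900.0)   # NTP-class engine
print(round(chemical, 2), round(nuclear, 2))  # chemical ≈ 2.48, nuclear ≈ 1.57
```

Because the mass ratio is exponential in delta-v divided by Isp, the halved exponent for NTP translates into a much smaller propellant load for the same mission, which is what drives the reductions in campaign mass and launch count noted above.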

  2. A wide-range programmable frequency synthesizer based on a finite state machine filter

    NASA Astrophysics Data System (ADS)

    Alser, Mohammed H.; Assaad, Maher M.; Hussin, Fawnizu A.

    2013-11-01

    In this article, an FPGA-based design and implementation of a fully digital, wide-range programmable frequency synthesizer based on a finite state machine filter is presented. The advantages of the proposed architecture are that it simultaneously generates a high-frequency signal from a low-frequency reference signal (synthesis) and synchronizes the two signals (the signals have the same phase, or a constant phase difference) without the jitter accumulation issue. The architecture is portable and can be easily implemented on various platforms, such as FPGAs and integrated circuits. The frequency synthesizer circuit can be used as part of SERDES devices for intra/inter-chip communication in a system-on-chip (SoC). The proposed circuit is designed in the Verilog language and synthesized for the Altera DE2-70 development board, with the Cyclone II (EP2C35F672C6) device on board. Simulation and experimental results are included; they confirm the synthesizing and tracking features of the proposed architecture. The generated clock signal, with a frequency range from 19.8 MHz to 440 MHz, is synchronized to the input reference clock with a frequency step of 0.12 MHz.
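The paper's FSM-filter design is not reproduced here, but the general idea of digitally locking a generated clock to a programmable multiple of a low-frequency reference can be sketched with a toy all-digital frequency-locked loop: count oscillator edges per reference period and nudge the oscillator until the count matches the programmed multiple. The time units, the bang-bang control step, and all names below are illustrative assumptions, not the paper's circuit.

```python
def lock_multiplier(target_n, dco_period_init=100, ref_period=1000, iters=500):
    """Lock a digitally controlled oscillator (DCO) so that exactly
    target_n DCO edges occur per reference period (all times are in
    arbitrary simulation steps)."""
    period = dco_period_init
    for _ in range(iters):
        edges = ref_period // period     # DCO edges counted in one ref period
        if edges == target_n:
            break                        # locked: f_out = target_n * f_ref
        # bang-bang control: nudge the DCO period one step per ref cycle
        period += 1 if edges > target_n else -1
        period = max(period, 1)          # guard against a zero-length period
    return period, ref_period // period

period, edges = lock_multiplier(8)       # lock the DCO to 8x the reference
```

Because the loop corrects against the reference every cycle rather than accumulating a free-running phase, errors do not build up over time, which is the jitter-accumulation property the abstract highlights.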

  3. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    NASA Technical Reports Server (NTRS)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary systems engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the systems engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  4. Spatial reference frames of visual, vestibular, and multimodal heading signals in the dorsal subdivision of the medial superior temporal area.

    PubMed

    Fetsch, Christopher R; Wang, Sentao; Gu, Yong; Deangelis, Gregory C; Angelaki, Dora E

    2007-01-17

    Heading perception is a complex task that generally requires the integration of visual and vestibular cues. This sensory integration is complicated by the fact that these two modalities encode motion in distinct spatial reference frames (visual, eye-centered; vestibular, head-centered). Visual and vestibular heading signals converge in the primate dorsal subdivision of the medial superior temporal area (MSTd), a region thought to contribute to heading perception, but the reference frames of these signals remain unknown. We measured the heading tuning of MSTd neurons by presenting optic flow (visual condition), inertial motion (vestibular condition), or a congruent combination of both cues (combined condition). Static eye position was varied from trial to trial to determine the reference frame of tuning (eye-centered, head-centered, or intermediate). We found that tuning for optic flow was predominantly eye-centered, whereas tuning for inertial motion was intermediate but closer to head-centered. Reference frames in the two unimodal conditions were rarely matched in single neurons and uncorrelated across the population. Notably, reference frames in the combined condition varied as a function of the relative strength and spatial congruency of visual and vestibular tuning. This represents the first investigation of spatial reference frames in a naturalistic, multimodal condition in which cues may be integrated to improve perceptual performance. Our results compare favorably with the predictions of a recent neural network model that uses a recurrent architecture to perform optimal cue integration, suggesting that the brain could use a similar computational strategy to integrate sensory signals expressed in distinct frames of reference.

  5. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web.
Conversely, systems using the same Web technologies and specifications according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant characteristics of the Web, then, in order to build a Geospatial Web, its architecture must satisfy all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all geospatial resources be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformations are candidate functionalities for a uniform interface. However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really rise to the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web have its own identifier (e.g. a URI). This makes it possible to cite and re-use resources simply by providing the URI.
OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for this. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. While the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured according to several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML), even though the underlying protocol is generic. Concerning point c) (self-descriptive messages), the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point d), hypermedia as the engine of application state, is actually where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the paths between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would make it possible to build an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. 
This would lower the barrier to accessing geospatial applications for non-specialists (witness the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated/replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it remains very simple, and therefore complex interactions would require several transfers. • In the geospatial domain, among the most valuable resources are processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use-cases, not covering all possible applications. The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural Styles and the Design of Network-Based Software Architectures. PhD dissertation, Dept. of Information and Computer Science, University of California, Irvine.
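The CRUD-style uniform interface and representation-level subsetting discussed in this entry can be sketched in a few lines. Everything below is an illustrative assumption: the class names, the in-memory store standing in for the Web, the URI strings, and the choice of a point-series representation with a bounding-box subset as the candidate generic operation.

```python
# Minimal sketch of a resource-oriented (REST-style) uniform interface for
# geospatial resources; the store and data model are invented for illustration.

class GeoResourceStore:
    """Uniform interface: every resource is addressed by a URI-like
    identifier and manipulated only through representations."""
    def __init__(self):
        self._resources = {}

    def put(self, uri, representation):    # Create / Update
        self._resources[uri] = representation

    def get(self, uri, bbox=None):         # Retrieve, with optional subsetting
        rep = self._resources[uri]
        if bbox is None:
            return rep
        xmin, ymin, xmax, ymax = bbox
        # subsetting as a generic operation on point-series representations
        return [p for p in rep if xmin <= p[0] <= xmax and ymin <= p[1] <= ymax]

    def delete(self, uri):                 # Delete
        del self._resources[uri]

store = GeoResourceStore()
store.put("/coverages/temperature/2010-05",
          [(0, 0, 12.1), (5, 5, 13.0), (20, 20, 9.4)])   # (x, y, value) points
subset = store.get("/coverages/temperature/2010-05", bbox=(0, 0, 10, 10))
```

Note that the client never calls a resource-specific operation: it only names a resource and asks for a (possibly subsetted) representation, which is the essence of the uniform-interface constraint.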

  6. Comparative height crown allometry and mechanical design in 22 tree species of Kuala Belalong rainforest, Brunei, Borneo.

    PubMed

    Osunkoya, Olusegun O; Omar-Ali, Kharunnisa; Amit, Norratna; Dayan, Juita; Daud, Dayanawati S; Sheng, Tan K

    2007-12-01

    In rainforests, the trunk size, strength, crown position, and geometry of a tree affect light interception and the likelihood of mechanical failure. Allometric relationships of tree diameter, wood density, and crown architecture vs. height are described for a diverse range of rainforest trees in Brunei, northern Borneo. The understory species follow a geometric model in their diameter-height relationship (slope, β = 1.08), while stress-elasticity models prevail (β = 1.27-1.61) for the midcanopy and canopy/emergent species. These relationships changed with ontogeny, especially for the understory species. Within species, the tree stability safety factor (SSF) and relative crown width decreased exponentially with increasing tree height. These trends failed to emerge in across-species comparisons and were reversed at a common (low) height. Across species, the relative crown depth decreased with maximum potential height and was indistinguishable at a common (low) height. Crown architectural traits influence SSF more than the structural property of wood density does. These findings emphasize the importance of applying a common reference size in comparative studies and suggest that forest trees (especially the understory group) may adapt to low light by having deeper rather than wider crowns, owing to an efficient distribution and geometry of their foliage.
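The allometric exponent β reported above is the slope of a log-log regression of diameter on height (D ∝ H^β; β = 1 for the geometric model, higher for stress-similarity models). A minimal sketch of how such a slope is estimated, using synthetic data that follows the geometric model exactly (the numbers are invented, not the paper's measurements):

```python
import math

def loglog_slope(diameters, heights):
    """Ordinary least-squares slope of log(diameter) on log(height):
    the allometric exponent beta in D ∝ H^beta."""
    xs = [math.log(h) for h in heights]
    ys = [math.log(d) for d in diameters]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic trees following the geometric model D ∝ H^1.0 exactly
heights = [2.0, 4.0, 8.0, 16.0]            # m
diameters = [0.05 * h for h in heights]    # m
beta = loglog_slope(diameters, heights)    # recovers beta = 1.0
```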

  7. An application of business process method to the clinical efficiency of hospital.

    PubMed

    Leu, Jun-Der; Huang, Yu-Tsung

    2011-06-01

    The concept of Total Quality Management (TQM) has come to be applied in healthcare over the last few years. The process management category in the Baldrige Health Care Criteria for Performance Excellence model is designed to evaluate the quality of medical services. However, a systematic approach to implementation support is necessary to achieve excellence in the healthcare business process. The Architecture of Integrated Information Systems (ARIS) is a business process architecture developed by IDS Scheer AG that has been applied in a variety of industrial applications. It starts with a business strategy to identify the core and support processes, and encompasses the whole life-cycle range, from business process design to information system deployment, which is compatible with the concept of the healthcare performance excellence criteria. In this research, we apply the basic ARIS framework to optimize the clinical processes of an emergency department in a mid-size hospital with 300 clinical beds, while considering the characteristics of the healthcare organization. Implementation of the case is described, and 16 months of clinical data are then collected and used to study the performance and feasibility of the method. The experience gleaned in this case study can be used as a reference for mid-size hospitals with similar business models.

  8. Parallel Architectures for Planetary Exploration Requirements (PAPER)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet; Sen, Ranjan K.

    1989-01-01

    The Parallel Architectures for Planetary Exploration Requirements (PAPER) project is essentially research oriented towards technology insertion issues for NASA's unmanned planetary probes. It was initiated to complement and augment the long-term efforts for space exploration, with particular reference to NASA/LaRC's (NASA Langley Research Center) research needs for planetary exploration missions of the mid and late 1990s. The requirements for space missions as given in the somewhat dated Advanced Information Processing Systems (AIPS) requirements document are contrasted with the new requirements from JPL/Caltech involving sensor data capture and scene analysis. It is shown that more stringent requirements have arisen as a result of technological advancements. Two possible architectures, the AIPS Proof of Concept (POC) configuration and the MAX fault-tolerant dataflow multiprocessor, were evaluated. The main observation was that the AIPS design is biased towards fault tolerance and may not be an ideal architecture for planetary and deep space probes due to high cost and complexity. The MAX concept appears to be a promising candidate, except that more detailed information is required. The feasibility of adding neural computation capability to this architecture needs to be studied. Key impact issues for the architectural design of computing systems meant for planetary missions were also identified.

  9. Towards a Framework for Modeling Space Systems Architectures

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Skipper, Joseph

    2006-01-01

    Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.

  10. Towards Complete, Geo-Referenced 3d Models from Crowd-Sourced Amateur Images

    NASA Astrophysics Data System (ADS)

    Hartmann, W.; Havlena, M.; Schindler, K.

    2016-06-01

    Despite a lot of recent research, photogrammetric reconstruction from crowd-sourced imagery is plagued by a number of recurrent problems. (i) The resulting models are chronically incomplete, because even touristic landmarks are photographed mostly from a few "canonical" viewpoints. (ii) Man-made constructions tend to exhibit repetitive structure and rotational symmetries, which lead to gross errors in the 3D reconstruction and aggravate the problem of incomplete reconstruction. (iii) The models are normally not geo-referenced. In this paper, we investigate the possibility of using sparse GNSS geo-tags from digital cameras to address these issues and push the boundaries of crowd-sourced photogrammetry. A small proportion of the images in Internet collections (≈ 10%) do possess geo-tags. While the individual geo-tags are very inaccurate, they can nevertheless help to address the problems above. By providing approximate geo-reference for partial reconstructions, they make it possible to fuse those pieces into more complete models; the capability to fuse partial reconstructions opens up the possibility of being more restrictive in the matching phase and avoiding errors due to repetitive structure; and collectively, the redundant set of low-quality geo-tags can provide reasonably accurate absolute geo-reference. We show that even a few noisy geo-tags can help to improve architectural models, compared to pure structure-from-motion based only on image correspondences.
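The claim that many individually inaccurate geo-tags can collectively give a reasonably accurate absolute geo-reference is, at heart, an averaging argument: independent errors partially cancel. The sketch below shows only this averaging step, with made-up per-image errors; the paper's actual fusion also involves aligning partial reconstructions, which is not shown.

```python
def fuse_geotags(tags):
    """Average redundant geo-tag positions (here given as east/north error
    offsets in metres from the true camera position) into one estimate."""
    n = len(tags)
    east = sum(t[0] for t in tags) / n
    north = sum(t[1] for t in tags) / n
    return (east, north)

# Illustrative geo-tag errors (metres): individual errors are tens of metres,
# but they scatter around the true position, so the mean error is small.
errors = [(30, -12), (-25, 18), (8, 40), (-20, -35), (12, -6)]
fused = fuse_geotags(errors)   # residual error of the averaged estimate
```

With these five invented offsets the averaged residual is about 1.4 m, versus 13 m or more for any single tag, which is the effect the abstract relies on.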

  11. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

    Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce the construction costs of EHR systems. Methods The openEHR project has developed a clinical model-driven architecture for future-proof, interoperable EHR systems. The project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have been developed as references. Although scripting languages have become more popular in recent years because of their efficiency and faster development cycles, they had not been used in the openEHR implementations. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformance with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and an RoR scaffold generator from archetypes. Although some problems emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  12. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single real-valued filters. This increases the computational burden but also introduces a higher level of parallelism that common computing platforms fail to exploit. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as vector inner product operators, that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation, and they map readily onto parallel digital architectures, which implies new architectures for optical processors. These filters exploit the circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields the feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulty implementing full complex filter structures. 
Typically, optical systems (like 4f correlators) are limited to phase-only implementations with lower detection performance than full complex electronic systems. Our study includes pseudo-random pixel encoding techniques for approximating full complex filtering. Optical filter banks can be implemented, and they have the advantage of time-averaging the entire filter bank at real-time rates. Time-averaged optical filtering is computationally comparable to billions of digital operations per second. For this reason, we believe future trends in high-speed pattern recognition will involve hybrid architectures of both optical and DSP elements.
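The circulant structure the abstract exploits can be illustrated in one dimension: taking inner products of an input against every cyclic shift of a template is exactly one matrix-vector product with a circulant matrix, and the index of the largest inner product recovers the shift, a toy analogue of reading rotational distortion from the dot pattern. The template and signals below are invented for illustration; this is not the paper's operator.

```python
def circular_correlation(template, signal):
    """Inner products of `signal` with every cyclic shift of `template`,
    i.e. the rows of the circulant matrix built from `template`. Treating
    all shifts jointly is what lets inner-product (or FFT) formulations
    replace explicit per-shift correlation."""
    n = len(template)
    return [sum(template[(k + i) % n] * signal[i] for i in range(n))
            for k in range(n)]

template = [1, 2, 0, -1]
signal = [1, 2, 0, -1]          # the template itself, unshifted
scores = circular_correlation(template, signal)
best_shift = scores.index(max(scores))   # 0: the input is unshifted
```

Feeding in a cyclically shifted copy of the template moves the peak score to the index of the shift, so the "orientation" of the response encodes the distortion.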

  13. Engineering Technology Education: Bibliography 1989.

    ERIC Educational Resources Information Center

    Dyrud, Marilyn A., Comp.

    1990-01-01

    Over 200 references divided into 24 different areas are presented. Topics include administration, aeronautics, architecture, biomedical technology, CAD/CAM, civil engineering, computers, curriculum, electrical/electronics engineering, industrial engineering, industry and employment, instructional technology, laboratories, lasers, liberal studies,…

  14. Reference Points: Engineering Technology Education Bibliography, 1987.

    ERIC Educational Resources Information Center

    Engineering Education, 1989

    1989-01-01

    Lists articles and books published in 1987. Selects the following headings: administration, aeronautical, architectural, CAD/CAM, civil, computers, curriculum, electrical/electronics, industrial, industry/government/employers, instructional technology, laboratories, liberal studies, manufacturing, mechanical, minorities, research, robotics,…

  15. 46 CFR 172.020 - Incorporation by reference.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., Naval Architecture Division, Office of Design and Engineering Standards, (CG-521), 2100 2nd St., SW...). For information on the availability of this material at NARA, call 202-741-6030, or go to: http://www...

  16. Development of brain-wide connectivity architecture in awake rats.

    PubMed

    Ma, Zilu; Ma, Yuncong; Zhang, Nanyin

    2018-08-01

    Childhood and adolescence are both critical developmental periods, evidenced by the complex neurophysiological changes the brain undergoes and the high occurrence rates of neuropsychiatric disorders during these periods. Despite substantial progress in elucidating the developmental trajectories of individual neural circuits, our knowledge of developmental changes in whole-brain connectivity architecture in animals is sparse. To fill this gap, we longitudinally acquired rsfMRI data in awake rats during five developmental stages from juvenile to adulthood. We found that the maturation timelines of brain circuits were heterogeneous and system specific. Functional connectivity (FC) tended to decrease in subcortical circuits but increase in cortical circuits during development. In addition, the developing brain exhibited hemispheric functional specialization, evidenced by reduced inter-hemispheric FC between homotopic regions and lower similarity of region-to-region FC patterns between the two hemispheres. Finally, we showed that whole-brain network development was characterized by reduced clustering (i.e. local communication) but increased integration (distant communication). Taken together, the present study has systematically characterized the development of brain-wide connectivity architecture from juvenile to adulthood in awake rats. It also serves as a critical reference point for understanding circuit- and network-level changes in animal models of brain development-related disorders. Furthermore, FC data during brain development in awake rodents carry high translational value and can shed light on comparative neuroanatomy. Copyright © 2018 Elsevier Inc. All rights reserved.
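The closing network measures, clustering (local communication) and integration (distant communication), have standard graph-theoretic counterparts: the local clustering coefficient and the shortest-path length. A self-contained sketch on a toy graph (the node names and edges are invented, not the study's connectivity data):

```python
from collections import deque

def clustering_coefficient(adj, node):
    """Fraction of a node's neighbour pairs that are themselves connected
    (local 'clustering' in the abstract's sense)."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

def shortest_path_length(adj, src, dst):
    """BFS hop count; shorter paths across the network mean greater
    'integration' (distant communication)."""
    seen = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            return seen[u]
        for v in adj[u]:
            if v not in seen:
                seen[v] = seen[u] + 1
                q.append(v)
    return None

# Toy graph: a triangle (clustered) plus a chain to a distant region
adj = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C", "E"},
    "E": {"D"},
}
cc_a = clustering_coefficient(adj, "A")      # B and C are connected -> 1.0
dist = shortest_path_length(adj, "A", "E")   # A-C-D-E -> 3 hops
```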

  17. A self-scaling, distributed information architecture for public health, research, and clinical care.

    PubMed

    McMurry, Andrew J; Gilbert, Clint A; Reis, Ben Y; Chueh, Henry C; Kohane, Isaac S; Mandl, Kenneth D

    2007-01-01

    This study sought to define a scalable architecture to support the National Health Information Network (NHIN). This architecture must concurrently support a wide range of public health, research, and clinical care activities. The architecture fulfils five desiderata: (1) adopt a distributed approach to data storage to protect privacy, (2) enable strong institutional autonomy to engender participation, (3) provide oversight and transparency to ensure patient trust, (4) allow variable levels of access according to investigator needs and institutional policies, (5) define a self-scaling architecture that encourages voluntary regional collaborations that coalesce to form a nationwide network. Our model has been validated by a large-scale, multi-institution study involving seven medical centers for cancer research. It is the basis of one of four open architectures developed under funding from the Office of the National Coordinator of Health Information Technology, fulfilling the biosurveillance use case defined by the American Health Information Community. The model supports broad applicability for regional and national clinical information exchanges. This model shows the feasibility of an architecture wherein the requirements of care providers, investigators, and public health authorities are served by a distributed model that grants autonomy, protects privacy, and promotes participation.

  18. Electromagnetic physics models for parallel computing architectures

    DOE PAGES

    Amadio, G.; Ananya, A.; Apostolakis, J.; ...

    2016-11-21

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Finally, the results of preliminary performance evaluation and physics validation are presented as well.

  19. Domain architecture conservation in orthologs

    PubMed Central

    2011-01-01

    Background As orthologous proteins are expected to retain function more often than other homologs, they are often used for functional annotation transfer between species. However, ortholog identification methods do not take into account changes in domain architecture, which are likely to modify a protein's function. By domain architecture we refer to the sequential arrangement of domains along a protein sequence. To assess the level of domain architecture conservation among orthologs, we carried out a large-scale study of domain architecture changes between human and 40 other species spanning the entire evolutionary range. We designed a score to measure domain architecture similarity and used it to analyze differences in domain architecture conservation between orthologs and paralogs relative to the conservation of primary sequence. We also statistically characterized the extents of different types of domain swapping events across pairs of orthologs and paralogs. Results The analysis shows that orthologs exhibit greater domain architecture conservation than paralogous homologs, even when differences in average sequence divergence are compensated for, at least for homologs that have diverged beyond a certain threshold. We interpret this as an indication of a stronger selective pressure on orthologs than on paralogs to retain the domain architecture required for the proteins to perform a specific function. In general, orthologs as well as the closest paralogous homologs have very similar domain architectures, even at large evolutionary separation. The most common domain architecture changes observed in both ortholog and paralog pairs involved the insertion/deletion of new domains, while domain shuffling and segment duplication/deletion were very infrequent. Conclusions On the whole, our results support the hypothesis that function conservation between orthologs demands higher domain architecture conservation than for other types of homologs, relative to primary sequence conservation. 
This supports the notion that orthologs are functionally more similar than other types of homologs at the same evolutionary distance. PMID:21819573
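The abstract describes a score measuring domain architecture similarity but does not reproduce its formula. As an illustration only, a minimal order-aware similarity over domain sequences can be sketched in Python; the domain labels below are hypothetical Pfam-style names, and the sequence-matching measure is an assumption, not the paper's actual score:

```python
from difflib import SequenceMatcher

def architecture_similarity(arch_a, arch_b):
    """Order-aware similarity between two domain architectures,
    each given as a list of domain identifiers (e.g. Pfam-style IDs).
    Returns a value in [0, 1]; 1.0 means identical architectures."""
    return SequenceMatcher(None, arch_a, arch_b).ratio()

# Hypothetical architectures: a Src-like kinase and a variant with a
# C-terminal domain insertion (the most common change the paper reports)
kinase   = ["SH3", "SH2", "Pkinase"]
inserted = ["SH3", "SH2", "Pkinase", "SAM"]

print(architecture_similarity(kinase, kinase))    # identical -> 1.0
print(architecture_similarity(kinase, inserted))  # insertion lowers the score
```

A set-based (Jaccard) score would ignore domain order; the sequence matcher above is one simple way to penalize shuffling as well as insertion/deletion.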

  20. Programming model for distributed intelligent systems

    NASA Technical Reports Server (NTRS)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  1. Executable Architecture Research at Old Dominion University

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

Executable Architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap and identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into a common context of state-of-the-art systems engineering methods supporting more agility.

  2. Space Generic Open Avionics Architecture (SGOAA) standard specification

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1994-01-01

This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  3. Resource utilization model for the algorithm to architecture mapping model

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Patel, Rakesh R.

    1993-01-01

    The analytical model for resource utilization and the variable node time and conditional node model for the enhanced ATAMM model for a real-time data flow architecture are presented in this research. The Algorithm To Architecture Mapping Model, ATAMM, is a Petri net based graph theoretic model developed at Old Dominion University, and is capable of modeling the execution of large-grained algorithms on a real-time data flow architecture. Using the resource utilization model, the resource envelope may be obtained directly from a given graph and, consequently, the maximum number of required resources may be evaluated. The node timing diagram for one iteration period may be obtained using the analytical resource envelope. The variable node time model, which describes the change in resource requirement for the execution of an algorithm under node time variation, is useful to expand the applicability of the ATAMM model to heterogeneous architectures. The model also describes a method of detecting the presence of resource limited mode and its subsequent prevention. Graphs with conditional nodes are shown to be reduced to equivalent graphs with time varying nodes and, subsequently, may be analyzed using the variable node time model to determine resource requirements. Case studies are performed on three graphs for the illustration of applicability of the analytical theories.

  4. Reconciliation of the cloud computing model with US federal electronic health record regulations

    PubMed Central

    2011-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204

  5. Clinical data interoperability based on archetype transformation.

    PubMed

    Costa, Catalina Martínez; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2011-10-01

    The semantic interoperability between health information systems is a major challenge to improve the quality of clinical practice and patient safety. In recent years many projects have faced this problem and provided solutions based on specific standards and technologies in order to satisfy the needs of a particular scenario. Most of such solutions cannot be easily adapted to new scenarios, thus more global solutions are needed. In this work, we have focused on the semantic interoperability of electronic healthcare records standards based on the dual model architecture and we have developed a solution that has been applied to ISO 13606 and openEHR. The technological infrastructure combines reference models, archetypes and ontologies, with the support of Model-driven Engineering techniques. For this purpose, the interoperability infrastructure developed in previous work by our group has been reused and extended to cover the requirements of data transformation. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. A cloud-based approach for interoperable electronic health records (EHRs).

    PubMed

    Bahga, Arshdeep; Madisetti, Vijay K

    2013-09-01

We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - the cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.

  7. Reconciliation of the cloud computing model with US federal electronic health record regulations.

    PubMed

    Schweitzer, Eugene J

    2012-01-01

    Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.

  8. The Laplacian spectrum of neural networks

    PubMed Central

    de Lange, Siemon C.; de Reus, Marcel A.; van den Heuvel, Martijn P.

    2014-01-01

    The brain is a complex network of neural interactions, both at the microscopic and macroscopic level. Graph theory is well suited to examine the global network architecture of these neural networks. Many popular graph metrics, however, encode average properties of individual network elements. Complementing these “conventional” graph metrics, the eigenvalue spectrum of the normalized Laplacian describes a network's structure directly at a systems level, without referring to individual nodes or connections. In this paper, the Laplacian spectra of the macroscopic anatomical neuronal networks of the macaque and cat, and the microscopic network of the Caenorhabditis elegans were examined. Consistent with conventional graph metrics, analysis of the Laplacian spectra revealed an integrative community structure in neural brain networks. Extending previous findings of overlap of network attributes across species, similarity of the Laplacian spectra across the cat, macaque and C. elegans neural networks suggests a certain level of consistency in the overall architecture of the anatomical neural networks of these species. Our results further suggest a specific network class for neural networks, distinct from conceptual small-world and scale-free models as well as several empirical networks. PMID:24454286
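The normalized Laplacian spectrum described in the abstract can be computed directly from an adjacency matrix. A minimal NumPy sketch follows; the triangle graph is a toy example for illustration, not one of the paper's neural networks:

```python
import numpy as np

def normalized_laplacian_spectrum(adjacency):
    """Sorted eigenvalues of the symmetric normalized Laplacian
    L = I - D^{-1/2} A D^{-1/2} of an undirected, unweighted graph."""
    A = np.asarray(adjacency, dtype=float)
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0                      # guard against isolated nodes
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return np.sort(np.linalg.eigvalsh(L))  # eigvalsh: symmetric solver

# Triangle graph K3: spectrum is {0, 1.5, 1.5}; all eigenvalues lie in [0, 2]
triangle = [[0, 1, 1],
            [1, 0, 1],
            [1, 1, 0]]
print(normalized_laplacian_spectrum(triangle))
```

The eigenvalues always lie in [0, 2], which is what makes spectra of networks of different sizes comparable across species, as the study exploits.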

  9. Systems Engineering Case Studies: Synopsis of the Learning Principles

    DTIC Science & Technology

    2010-05-17

Engineering Case Study HST refers to the Hubble Space Telescope Systems Engineering Case Study TBMCS refers to the Theater Battle Management Core System...going to orbit undetected in spite of substantial evidence that could have been used to prevent this occurrence. TBMCS/1 Requirements Definition...baseline was volatile up to system acceptance, which took place after TBMCS passed operational test and evaluation. TBMCS/2 System Architecture The

  10. Hybrid Architectures for Evolutionary Computing Algorithms

    DTIC Science & Technology

    2008-01-01

other EC algorithms to FPGA Core Burns P1026/MAPLD 200532 Genetic Algorithm Hardware References S. Scott, A. Samal, and S. Seth, “HGA: A Hardware Based...on Parallel and Distributed Processing (IPPS/SPDP '98), pp. 316-320, Proceedings. IEEE Computer Society 1998. [12] Scott, S. D., Samal, A., and...Algorithm Hardware References S. Scott, A. Samal, and S. Seth, “HGA: A Hardware Based Genetic Algorithm”, Proceedings of the 1995 ACM Third

  11. Supervisory Control System Architecture for Advanced Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Cole, Daniel L; Fugate, David L

    2013-08-01

This technical report was generated as a product of the Supervisory Control for Multi-Modular SMR Plants project within the Instrumentation, Control and Human-Machine Interface technology area under the Advanced Small Modular Reactor (SMR) Research and Development Program of the U.S. Department of Energy. The report documents the definition of strategies, functional elements, and the structural architecture of a supervisory control system for multi-modular advanced SMR (AdvSMR) plants. This research activity advances the state-of-the art by incorporating decision making into the supervisory control system architectural layers through the introduction of a tiered-plant system approach. The report provides a brief history of hierarchical functional architectures and the current state-of-the-art, describes a reference AdvSMR to show the dependencies between systems, presents a hierarchical structure for supervisory control, indicates the importance of understanding trip setpoints, applies a new theoretic approach for comparing architectures, identifies cyber security controls that should be addressed early in system design, and describes ongoing work to develop system requirements and hardware/software configurations.

  12. Architecture Study for a Fuel Depot Supplied from Lunar Resources

    NASA Technical Reports Server (NTRS)

    Perrin, Thomas M.

    2016-01-01

Heretofore, discussions of space fuel depots assumed the depots would be supplied from Earth. However, the confirmation of deposits of water ice at the lunar poles in 2009 suggests the possibility of supplying a space depot with liquid hydrogen/liquid oxygen produced from lunar ice. This architecture study sought to determine the optimum architecture for a fuel depot supplied from lunar resources. Four factors - the location of propellant processing (on the Moon or on the depot), the location of the depot (on the Moon or in cislunar space), and if in cislunar space, where (LEO, GEO, or Earth-Moon L1), and the method of propellant transfer (bulk fuel or canister exchange) - were combined to identify 18 potential architectures. Two design reference missions (DRMs) - a satellite servicing mission and a cargo mission to Mars - were used to create demand for propellants, while a third DRM - a propellant delivery mission - was used to examine supply issues. The architectures were depicted graphically in a network diagram with individual segments representing the movement of propellant from the Moon to the depot, and from the depot to the customer.

  13. Architecture Study for a Fuel Depot Supplied from Lunar Resources

    NASA Technical Reports Server (NTRS)

    Perrin, Thomas M.

    2016-01-01

    Heretofore, discussions of space fuel depots assumed the depots would be supplied from Earth. However, the confirmation of deposits of water ice at the lunar poles in 2009 suggests the possibility of supplying a space depot with liquid hydrogen/liquid oxygen produced from lunar ice. This architecture study sought to determine the optimum architecture for a fuel depot supplied from lunar resources. Four factors - the location of propellant processing (on the Moon or on the depot), the location of the depot (on the Moon, or at L1, GEO, or LEO), the location of propellant transfer (L1, GEO, or LEO), and the method of propellant transfer (bulk fuel or canister exchange) were combined to identify 18 potential architectures. Two design reference missions (DRMs) - a satellite servicing mission and a cargo mission to Mars - were used to create demand for propellants, while a third DRM - a propellant delivery mission - was used to examine supply issues. The architectures were depicted graphically in a network diagram with individual segments representing the movement of propellant from the Moon to the depot, and from the depot to the customer.

  14. A Distributed Intelligent E-Learning System

    ERIC Educational Resources Information Center

    Kristensen, Terje

    2016-01-01

    An E-learning system based on a multi-agent (MAS) architecture combined with the Dynamic Content Manager (DCM) model of E-learning, is presented. We discuss the benefits of using such a multi-agent architecture. Finally, the MAS architecture is compared with a pure service-oriented architecture (SOA). This MAS architecture may also be used within…

  15. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  16. Modeling Operations Costs for Human Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2013-01-01

    Operations and support (O&S) costs for human spaceflight have not received the same attention in the cost estimating community as have development costs. This is unfortunate as O&S costs typically comprise a majority of life-cycle costs (LCC) in such programs as the International Space Station (ISS) and the now-cancelled Constellation Program. Recognizing this, the Constellation Program and NASA HQs supported the development of an O&S cost model specifically for human spaceflight. This model, known as the Exploration Architectures Operations Cost Model (ExAOCM), provided the operations cost estimates for a variety of alternative human missions to the moon, Mars, and Near-Earth Objects (NEOs) in architectural studies. ExAOCM is philosophically based on the DoD Architecture Framework (DoDAF) concepts of operational nodes, systems, operational functions, and milestones. This paper presents some of the historical background surrounding the development of the model, and discusses the underlying structure, its unusual user interface, and lastly, previous examples of its use in the aforementioned architectural studies.

  17. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling BPMN Business Process Modeling Notation SoA Service-oriented Architecture UML Unified Modeling Language CSP...system developers. Supporting technologies include Business Process Modeling Notation ( BPMN ), Unified Modeling Language (UML), model-driven architecture

  18. Trees in the Landscape.

    ERIC Educational Resources Information Center

    Webb, Richard; Forbatha, Ann

    1982-01-01

Strategies for using trees in classroom instruction are provided. Includes: (1) activities (such as tree identification, mapping, measuring tree height/width); (2) list of aesthetic, architectural, engineering, climate, and wildlife functions of trees; (3) tree discussion questions; and (4) references. (JN)

  19. New Directions in Space: A Report on the Lunar and Mars Initiatives

    NASA Technical Reports Server (NTRS)

    Seitz, Frederick; Hawkins, Willis; Jastrow, Robert; Nierenberg, William A.

    1990-01-01

    This report focuses on one aspect of the current space program: The establishment of a manned base on the Moon and the manned exploration of Mars. These missions were announced by the President last year as a major U.S. space policy objective to be implemented under the leadership of the Vice President, acting as Chairman of the National Space Council. On March 8, 1990, the White House released Presidential guidelines for the execution of the lunar and Mars programs. The guidelines stressed the need for new approaches and the development of innovative technologies with a potential for major cost, schedule and performance improvements. They also called for a competitive environment, with several years allotted to the definition of at least two significantly different human space exploration "reference architectures." Selection of the final technical concepts for the mission is scheduled to occur only after the relative merits of the competing reference architectures have been evaluated.

  20. Mars Surface Habitability Options

    NASA Technical Reports Server (NTRS)

    Howe, A. Scott; Simon, Matthew; Smitherman, David; Howard, Robert; Toups, Larry; Hoffman, Stephen J.

    2015-01-01

    This paper reports on current habitability concepts for an Evolvable Mars Campaign (EMC) prepared by the NASA Human Spaceflight Architecture Team (HAT). For many years NASA has investigated alternative human Mars missions, examining different mission objectives, trajectories, vehicles, and technologies; the combinations of which have been referred to as reference missions or architectures. At the highest levels, decisions regarding the timing and objectives for a human mission to Mars continue to evolve while at the lowest levels, applicable technologies continue to advance. This results in an on-going need for assessments of alternative system designs such as the habitat, a significant element in any human Mars mission scenario, to provide meaningful design sensitivity characterizations to assist decision-makers regarding timing, objectives, and technologies. As a subset of the Evolvable Mars Campaign activities, the habitability team builds upon results from past studies and recommends options for Mars surface habitability compatible with updated technologies.

  1. Mars Design Reference Architecture 5.0 Study: Executive Summary

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.

    2008-01-01

The NASA Mars Design Reference Architecture 5.0 Study seeks to update its long-term goals and objectives for human exploration missions; flight and surface systems for human missions and supporting infrastructure; operational concept for human and robotic exploration of Mars; key challenges including risk and cost drivers; and its development schedule options. It additionally seeks to assess strategic linkages between lunar and Mars strategies and to develop an understanding of methods for reducing the cost/risk of human Mars missions through investment in research, technology development, and synergy with other exploration plans. Recommendations are made regarding conjunction class (long-stay) missions, which are seen as providing the best balance of cost, risk, and performance. Additionally, this study reviews entry, descent, and landing challenges; in-space transportation systems; launch vehicle and Orion assessments; risk and risk mitigation; key driving requirements and challenges; and lunar linkages.

  2. Sensitivity Analysis of a Cognitive Architecture for the Cultural Geography Model

    DTIC Science & Technology

    2011-12-01

developmental inquiry. American Psychologist, 34(10), 906–911. Gazzaniga, M. S. (2004) The cognitive neurosciences III. Cambridge: MIT Press. Greeno, J. G...129 ix LIST OF FIGURES Situation-Based Cognitive Architecture (From Alt et al., 2011) .....................13 Figure 1. Theory of Planned...Harold, CG Model developer at TRAC-MTRY, who spent countless hours explaining to me the implementation of the Cognitive Architecture and CG model

  3. Capability-Based Modeling Methodology: A Fleet-First Approach to Architecture

    DTIC Science & Technology

    2014-02-01

reconnaissance (ISR) aircraft, or unmanned systems. Accordingly, a mission architecture used to model SAG operations for a given Fleet unit should include all...would use an ISR aircraft to increase fidelity of a targeting solution; another mission thread to show how unmanned systems can augment targeting...unmanned systems. Therefore, an architect can generate, from a comprehensive SAG mission architecture, individual mission threads that model how a SAG

  4. Petri net model for analysis of concurrently processed complex algorithms

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1986-01-01

    This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
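A Petri net of the kind used to model data/control flow in concurrent processing can be sketched minimally in Python. The place and transition names below are illustrative only, not taken from the paper; tokens in places enable transitions, and firing a transition consumes input tokens and produces output tokens:

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) >= 1 for p in transition["inputs"])

def fire(marking, transition):
    """Fire an enabled transition: consume one token from each input
    place, deposit one token in each output place."""
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p in transition["inputs"]:
        m[p] -= 1
    for p in transition["outputs"]:
        m[p] = m.get(p, 0) + 1
    return m

# A fork/join pattern: two operations run concurrently after 'start',
# and 'join' fires only once both have completed
t_start = {"inputs": ["ready"], "outputs": ["op1", "op2"]}
t_join  = {"inputs": ["done1", "done2"], "outputs": ["finished"]}

m = fire({"ready": 1}, t_start)
print(m)  # {'ready': 0, 'op1': 1, 'op2': 1}
```

The fork/join structure is exactly what makes Petri nets natural for describing concurrency constraints of decomposed algorithm operations on a data-driven architecture.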

  5. Upper and Lower Limb Muscle Architecture of a 104 Year-Old Cadaver

    PubMed Central

    Infantolino, Benjamin

    2016-01-01

    Muscle architecture is an important component to typical musculoskeletal models. Previous studies of human muscle architecture have focused on a single joint, two adjacent joints, or an entire limb. To date, no study has presented muscle architecture for the upper and lower limbs of a single cadaver. Additionally, muscle architectural parameters from elderly cadavers are lacking, making it difficult to accurately model elderly populations. Therefore, the purpose of this study was to present muscle architecture of the upper and lower limbs of a 104 year old female cadaver. The major muscles of the upper and lower limbs were removed and the musculotendon mass, tendon mass, musculotendon length, tendon length, pennation angle, optimal fascicle length, physiological cross-sectional area, and tendon cross-sectional area were determined for each muscle. Data from this complete cadaver are presented in table format. The data from this study can be used to construct a musculoskeletal model of a specific individual who was ambulatory, something which has not been possible to date. This should increase the accuracy of the model output as the model will be representing a specific individual, not a synthesis of measurements from multiple individuals. Additionally, an elderly individual can be modeled which will provide insight into muscle function as we age. PMID:28033339

  6. Upper and Lower Limb Muscle Architecture of a 104 Year-Old Cadaver.

    PubMed

    Ruggiero, Marissa; Cless, Daniel; Infantolino, Benjamin

    2016-01-01

    Muscle architecture is an important component to typical musculoskeletal models. Previous studies of human muscle architecture have focused on a single joint, two adjacent joints, or an entire limb. To date, no study has presented muscle architecture for the upper and lower limbs of a single cadaver. Additionally, muscle architectural parameters from elderly cadavers are lacking, making it difficult to accurately model elderly populations. Therefore, the purpose of this study was to present muscle architecture of the upper and lower limbs of a 104 year old female cadaver. The major muscles of the upper and lower limbs were removed and the musculotendon mass, tendon mass, musculotendon length, tendon length, pennation angle, optimal fascicle length, physiological cross-sectional area, and tendon cross-sectional area were determined for each muscle. Data from this complete cadaver are presented in table format. The data from this study can be used to construct a musculoskeletal model of a specific individual who was ambulatory, something which has not been possible to date. This should increase the accuracy of the model output as the model will be representing a specific individual, not a synthesis of measurements from multiple individuals. Additionally, an elderly individual can be modeled which will provide insight into muscle function as we age.
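The abstract lists musculotendon mass, pennation angle, and optimal fascicle length among the measured parameters. PCSA is conventionally computed from these as PCSA = m·cos(θ) / (ρ·Lf), with muscle density ρ ≈ 1.0564 g/cm³; this is the widely used convention, assumed here since the abstract does not state the paper's exact formula, and the numbers below are illustrative rather than cadaver-specific:

```python
import math

MUSCLE_DENSITY = 1.0564  # g/cm^3, a commonly assumed mammalian muscle density

def pcsa(muscle_mass_g, pennation_deg, fascicle_length_cm,
         density=MUSCLE_DENSITY):
    """Physiological cross-sectional area (cm^2) from architectural
    measurements: PCSA = m * cos(theta) / (rho * Lf)."""
    return (muscle_mass_g * math.cos(math.radians(pennation_deg))
            / (density * fascicle_length_cm))

# Illustrative values: 150 g muscle, 10 deg pennation, 6 cm fascicles
print(round(pcsa(150.0, 10.0, 6.0), 2))
```

Greater pennation or longer fascicles reduce PCSA for the same mass, which is why force-generating capacity cannot be read off muscle mass alone.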

  7. Logs Analysis of Adapted Pedagogical Scenarios Generated by a Simulation Serious Game Architecture

    ERIC Educational Resources Information Center

    Callies, Sophie; Gravel, Mathieu; Beaudry, Eric; Basque, Josianne

    2017-01-01

    This paper presents an architecture designed for simulation serious games, which automatically generates game-based scenarios adapted to learner's learning progression. We present three central modules of the architecture: (1) the learner model, (2) the adaptation module and (3) the logs module. The learner model estimates the progression of the…

  8. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  9. Formalism Challenges of the Cougaar Model Driven Architecture

    NASA Technical Reports Server (NTRS)

    Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.

    2004-01-01

The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.

  10. Architectural Large Constructed Environment. Modeling and Interaction Using Dynamic Simulations

    NASA Astrophysics Data System (ADS)

    Fiamma, P.

    2011-09-01

How can a simulation derived from a large data model be used for architectural design? The topic concerns the phase that usually follows data acquisition: the construction of the model and, especially, the stage at which designers must interact with the simulation in order to develop and verify their ideas. In the case study, the concept of interaction includes the concept of real-time “flows”. The work develops contents and results that can be part of the larger debate about the current connection between “architecture” and “movement”. The focus of the work is to realize a collaborative and participative virtual environment in which the different specialist actors, the client, and the final users can share knowledge, targets, and constraints to better achieve the intended result. The goal is to use a dynamic micro-simulation digital resource that allows all the actors to explore the model in a powerful and realistic way and to have a new type of interaction in a complex architectural scenario. On the one hand, the work represents a base of knowledge that can be implemented further; on the other hand, it represents an attempt to understand large constructed architecture simulation as a way of life, a way of being in time and space. Architectural design first, and the architectural fact afterwards, both happen in a sort of “Spatial Analysis System”. The way is open to offer this “system” knowledge and theories that can support architectural design work at every application and scale. Architecture becomes a spatial configuration that can also be reconfigured through design.

  11. Hijazi Architectural Object Library (haol)

    NASA Astrophysics Data System (ADS)

    Baik, A.; Boehm, J.

    2017-02-01

    As with many historical buildings around the world, building façades are of special interest; the details of windows, stonework, and ornament give each historic building its individual character. Each of these building elements should be classified in an architectural object library. Recently, a number of studies in Europe and Canada have focused on this topic. From this standpoint, the Hijazi Architectural Objects Library (HAOL) has reproduced Hijazi elements as 3D computer models, built as Revit Families (RFA). The HAOL relies on image surveys and point cloud data. Hijazi objects such as the Roshan and Mashrabiyah have become part of the vocabulary of many Islamic cities in the Hijazi region, such as Jeddah in Saudi Arabia, and of a number of Islamic historic cities such as Istanbul and Cairo. These architectural vocabularies are a principal source of the beauty of this heritage. However, there is a large gap in both the Islamic and the Hijazi architectural libraries when it comes to providing these unique elements. Moreover, both Islamic and Hijazi architecture embody a huge amount of information that has not yet been digitally classified by period and style. This paper therefore focuses on developing Heritage BIM (HBIM) standards and the HAOL library to reduce the cost and delivery time of heritage and new projects that involve Hijazi architectural styles. The paper provides the fundamentals of Hijazi architectural informatics by developing a framework for HBIM models and standards. This framework will provide a schema and critical information, for example classifying the different shapes, models, and forms of structure, construction, and ornamentation of Hijazi architecture, in order to digitize parametric building identity.

  12. GAMETES: a fast, direct algorithm for generating pure, strict, epistatic models with random architectures.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Sinnott-Armstrong, Nicholas A; Heberling, Tamra; Fisher, Jonathan M; Moore, Jason H

    2012-10-01

    Geneticists who look beyond single locus disease associations require additional strategies for the detection of complex multi-locus effects. Epistasis, a multi-locus masking effect, presents a particular challenge, and has been the target of bioinformatic development. Thorough evaluation of new algorithms calls for simulation studies in which known disease models are sought. To date, the best methods for generating simulated multi-locus epistatic models rely on genetic algorithms. However, such methods are computationally expensive, difficult to adapt to multiple objectives, and unlikely to yield models with a precise form of epistasis which we refer to as pure and strict. Purely and strictly epistatic models constitute the worst-case in terms of detecting disease associations, since such associations may only be observed if all n-loci are included in the disease model. This makes them an attractive gold standard for simulation studies considering complex multi-locus effects. We introduce GAMETES, a user-friendly software package and algorithm which generates complex biallelic single nucleotide polymorphism (SNP) disease models for simulation studies. GAMETES rapidly and precisely generates random, pure, strict n-locus models with specified genetic constraints. These constraints include heritability, minor allele frequencies of the SNPs, and population prevalence. GAMETES also includes a simple dataset simulation strategy which may be utilized to rapidly generate an archive of simulated datasets for given genetic models. We highlight the utility and limitations of GAMETES with an example simulation study using MDR, an algorithm designed to detect epistasis. GAMETES is a fast, flexible, and precise tool for generating complex n-locus models with random architectures. 
While GAMETES has a limited ability to generate models with higher heritabilities, it is proficient at generating the lower heritability models typically used in simulation studies evaluating new algorithms. In addition, the GAMETES modeling strategy may be flexibly combined with any dataset simulation strategy. Beyond dataset simulation, GAMETES could be employed to pursue theoretical characterization of genetic models and epistasis.
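The idea of a purely epistatic penetrance table, in which disease risk depends only on the joint genotype, can be illustrated with a toy sketch. This is not the constraint-solving algorithm GAMETES itself uses; the XOR-style penetrance values and allele frequencies below are hypothetical, and a truly pure-strict table would require solving for penetrance values with zero marginal effects, as GAMETES does:

```python
import random

def simulate_epistatic_dataset(maf1=0.3, maf2=0.3, n=1000, seed=42):
    """Simulate a toy 2-locus epistatic SNP dataset.

    Genotypes (minor-allele counts 0/1/2) are drawn under Hardy-Weinberg
    equilibrium; case status follows an XOR-style penetrance table in
    which risk depends on the joint genotype of both loci.
    """
    rng = random.Random(seed)

    def genotype(maf):
        # Binomial(2, maf): number of minor alleles at one SNP.
        return sum(rng.random() < maf for _ in range(2))

    def penetrance(g1, g2):
        # Illustrative XOR-like table: heterozygous at exactly one locus
        # confers high risk. (Not a pure-strict table in the GAMETES sense.)
        return 0.8 if (g1 == 1) != (g2 == 1) else 0.1

    data = []
    for _ in range(n):
        g1, g2 = genotype(maf1), genotype(maf2)
        case = rng.random() < penetrance(g1, g2)
        data.append((g1, g2, int(case)))
    return data

data = simulate_epistatic_dataset()
cases = sum(d[2] for d in data)
print(f"{cases} cases out of {len(data)} samples")
```

A detection method that tests each SNP individually would see little signal here, since the risk pattern only emerges when both loci are considered jointly.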

  13. Influence of fiber architecture on the elastic and inelastic response of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Pindera, Marek-Jerzy; Wilt, Thomas E.

    1995-01-01

    This three part paper focuses on the effect of fiber architecture (i.e., shape and distribution) on the elastic and inelastic response of metal matrix composites. The first part provides an annotated survey of the literature, presented as a historical perspective, dealing with the effects of fiber shape and distribution on the response of advanced polymeric matrix and metal matrix composites. Previous investigations dealing with both continuously and discontinuously reinforced composites are included. A summary of the state of the art will assist in defining new directions in this quickly reviving area of research. The second part outlines a recently developed analytical micromechanics model that is particularly well suited for studying the influence of these effects on the response of metal matrix composites. This micromechanics model, referred to as the generalized method of cells (GMC), is capable of predicting the overall, inelastic behavior of unidirectional, multi-phased composites given the properties of the constituents. In particular, the model is sufficiently general to predict the response of unidirectional composites reinforced by either continuous or discontinuous fibers with different inclusion shapes and spatial arrangements in the presence of either perfect or imperfect interfaces and/or interfacial layers. Recent developments regarding this promising model, as well as directions for future enhancements of the model's predictive capability, are included. Finally, the third part provides qualitative results generated using GMC for a representative titanium matrix composite system, SCS-6/TIMETAL 21S. Results are presented that correctly demonstrate the relative effects of fiber arrangement and shape on the longitudinal and transverse stress-strain and creep response, with both strong and weak fiber/matrix interfacial bonds. The fiber arrangements include square, square diagonal, hexagonal and rectangular periodic arrays, as well as a random array. The fiber shapes include circular, square and cross-shaped cross sections. The effect of fiber volume fraction on the observed stress-strain response is also discussed, as is the thus-far poorly documented strain rate sensitivity effect. In addition to the well documented features of architecture-dependent response of continuously reinforced two-phase MMCs, new results involving continuous multi-phase internal architectures are presented. Specifically, the stress-strain and creep response of composites with different size fibers having different internal arrangements and bond strengths are investigated with the aim of determining the feasibility of using this approach to enhance the transverse toughness and creep resistance of TMCs.
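The influence of fiber volume fraction on effective stiffness can be sketched with the classical Voigt and Reuss rule-of-mixtures bounds, which are far simpler than GMC but illustrate the same dependence. The moduli below are representative round numbers for a SiC fiber in a titanium matrix, not values from the paper:

```python
def voigt_reuss_bounds(e_fiber, e_matrix, vf):
    """Upper (Voigt, isostrain) and lower (Reuss, isostress) bounds on
    the effective Young's modulus of a two-phase composite with fiber
    volume fraction vf."""
    e_voigt = vf * e_fiber + (1 - vf) * e_matrix
    e_reuss = 1.0 / (vf / e_fiber + (1 - vf) / e_matrix)
    return e_voigt, e_reuss

# Hypothetical moduli in GPa: stiff ceramic fiber, compliant metal matrix.
for vf in (0.25, 0.35, 0.45):
    up, lo = voigt_reuss_bounds(400.0, 110.0, vf)
    print(f"vf={vf:.2f}: Reuss {lo:.1f} GPa <= E_eff <= Voigt {up:.1f} GPa")
```

Micromechanics models such as GMC refine these bounds by resolving the actual fiber shape, arrangement, and interface behavior within a repeating unit cell.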

  14. Autonomic Computing for Spacecraft Ground Systems

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Jones, Lori

    2007-01-01

    Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message-oriented architecture, referred to as the GMSEC (Goddard Mission Services Evolution Center) architecture, and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA's missions, and provides a framework for developing solutions with higher autonomic maturity.
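The criteria-action idea can be sketched as a small rule table evaluated against incoming telemetry. This is only an illustration of the pattern; the rule names, telemetry fields, and thresholds are hypothetical, and the real CAT tool operates over the GMSEC message bus rather than in-process dictionaries:

```python
# A toy criteria-action table: each rule pairs a predicate over a
# telemetry sample with an action name; rules are evaluated in order
# and the first matching criterion fires.
RULES = [
    (lambda tm: tm["battery_v"] < 24.0, "send_alert:low_battery"),
    (lambda tm: tm["temp_c"] > 85.0,    "command:safe_mode"),
    (lambda tm: True,                   "log:nominal"),  # catch-all
]

def evaluate(telemetry):
    """Return the action for the first matching criterion."""
    for predicate, action in RULES:
        if predicate(telemetry):
            return action

print(evaluate({"battery_v": 23.1, "temp_c": 40.0}))  # low-battery rule fires
```

Encoding operator knowledge as such declarative rules is what lets the ground system react autonomically instead of waiting for manual intervention.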

  15. Soil hydraulic material properties and layered architecture from time-lapse GPR

    NASA Astrophysics Data System (ADS)

    Jaumann, Stefan; Roth, Kurt

    2018-04-01

    Quantitative knowledge of the subsurface material distribution and its effective soil hydraulic material properties is essential to predict soil water movement. Ground-penetrating radar (GPR) is a noninvasive and nondestructive geophysical measurement method that is suitable to monitor hydraulic processes. Previous studies showed that the GPR signal from a fluctuating groundwater table is sensitive to the soil water characteristic and the hydraulic conductivity function. In this work, we show that the GPR signal originating from both the subsurface architecture and the fluctuating groundwater table is suitable to estimate the position of layers within the subsurface architecture together with the associated effective soil hydraulic material properties with inversion methods. To that end, we parameterize the subsurface architecture, solve the Richards equation, convert the resulting water content to relative permittivity with the complex refractive index model (CRIM), and solve Maxwell's equations numerically. In order to analyze the GPR signal, we implemented a new heuristic algorithm that detects relevant signals in the radargram (events) and extracts the corresponding signal travel time and amplitude. This algorithm is applied to simulated as well as measured radargrams and the detected events are associated automatically. Using events instead of the full wave regularizes the inversion, focusing it on the relevant measurement signal. For optimization, we use a global-local approach with preconditioning. Starting from an ensemble of initial parameter sets drawn with a Latin hypercube algorithm, we sequentially couple a simulated annealing algorithm with a Levenberg-Marquardt algorithm. The method is applied to synthetic as well as measured data from the ASSESS test site.
We show that the method yields reasonable estimates for the position of the layers as well as for the soil hydraulic material properties by comparing the results to references derived from ground truth data as well as from time domain reflectometry (TDR).
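The global-local strategy described above (stratified starting points, a stochastic global stage, then a local polish) can be sketched in miniature. This is an illustrative stand-in only: a toy one-dimensional misfit replaces the travel-time/amplitude misfit, and a derivative-free polish replaces Levenberg-Marquardt:

```python
import math
import random

def objective(x):
    # Toy multimodal misfit standing in for the event travel-time misfit.
    return (x - 1.3) ** 2 + 0.3 * math.sin(8 * x) + 0.3

def anneal(x0, rng, steps=2000, temp0=1.0):
    """Global stage: simulated annealing, accepting worse moves with
    Boltzmann probability that shrinks as the temperature cools."""
    x, fx = x0, objective(x0)
    for k in range(steps):
        t = temp0 * (1 - k / steps) + 1e-9
        cand = x + rng.gauss(0, 0.5)
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
    return x

def local_refine(x, step=0.1, tol=1e-6):
    """Local stage: derivative-free polish (stand-in for Levenberg-Marquardt)."""
    while step > tol:
        for cand in (x - step, x + step):
            if objective(cand) < objective(x):
                x = cand
                break
        else:
            step *= 0.5
    return x

rng = random.Random(0)
# In 1-D, Latin hypercube sampling degenerates to stratified starts on [-3, 3].
starts = [(i + rng.random()) / 4 * 6 - 3 for i in range(4)]
best = min((local_refine(anneal(s, rng)) for s in starts), key=objective)
print(f"best x = {best:.3f}, misfit = {objective(best):.3f}")
```

The ensemble of starts guards against the annealing stage settling into a poor local minimum, while the local stage supplies the precision the stochastic search lacks.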

  16. A hybrid expectation maximisation and MCMC sampling algorithm to implement Bayesian mixture model based genomic prediction and QTL mapping.

    PubMed

    Wang, Tingting; Chen, Yi-Ping Phoebe; Bowman, Phil J; Goddard, Michael E; Hayes, Ben J

    2016-09-21

    Bayesian mixture models in which the effects of SNPs are assumed to come from normal distributions with different variances are attractive for simultaneous genomic prediction and QTL mapping. These models are usually implemented with Markov chain Monte Carlo (MCMC) sampling, which requires long compute times with large genomic data sets. Here, we present an efficient approach (termed HyB_BR), which is a hybrid of an Expectation-Maximisation algorithm followed by a limited number of MCMC iterations, without the requirement for burn-in. To test prediction accuracy from HyB_BR, dairy cattle and human disease trait data were used. In the dairy cattle data, there were four quantitative traits (milk volume, protein kg, fat% in milk and fertility) measured in 16,214 cattle from two breeds genotyped for 632,002 SNPs. Validation of genomic predictions was in a subset of cattle either from the reference set or in animals from a third breed that was not in the reference set. In all cases, HyB_BR gave almost identical accuracies to Bayesian mixture models implemented with full MCMC, while computational time was reduced to as little as 1/17 of that required by full MCMC. The SNPs with high posterior probability of a non-zero effect were also very similar between full MCMC and HyB_BR, with several known genes affecting milk production in this category, as well as some novel genes. HyB_BR was also applied to seven human diseases with 4890 individuals genotyped for around 300 K SNPs in a case/control design, from the Wellcome Trust Case Control Consortium (WTCCC). In this data set, the results demonstrated again that HyB_BR performed as well as Bayesian mixture models with full MCMC for genomic predictions and genetic architecture inference, while reducing the computational time from 45 h with full MCMC to 3 h with HyB_BR. The results for quantitative traits in cattle and disease in humans demonstrate that HyB_BR can perform as well as Bayesian mixture models implemented with full MCMC in terms of prediction accuracy, but up to 17 times faster than the full MCMC implementations. The HyB_BR algorithm makes simultaneous genomic prediction, QTL mapping and inference of genetic architecture feasible in large genomic data sets.
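The generic hybrid idea, running EM to a point estimate and then starting a short MCMC chain there so no burn-in is needed, can be sketched on a toy two-variance normal mixture. This is not the HyB_BR algorithm itself: the variances are held fixed, only the mixing proportion is estimated (with an implicit uniform prior), and all data are synthetic:

```python
import math
import random

rng = random.Random(1)
# Toy "SNP effects": most ~ N(0, 0.01), a minority ~ N(0, 1).
data = [rng.gauss(0, 0.1) for _ in range(180)] + [rng.gauss(0, 1.0) for _ in range(20)]

def norm_pdf(x, var):
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

# --- Stage 1: EM for the mixing proportion pi (variances held fixed). ---
v_small, v_large, pi = 0.01, 1.0, 0.5
for _ in range(50):
    resp = [pi * norm_pdf(x, v_large) /
            (pi * norm_pdf(x, v_large) + (1 - pi) * norm_pdf(x, v_small))
            for x in data]
    pi = sum(resp) / len(resp)

# --- Stage 2: short Metropolis chain started at the EM estimate. ---
# Because the chain starts near the posterior mode, no burn-in is discarded.
def log_lik(p):
    return sum(math.log(p * norm_pdf(x, v_large) + (1 - p) * norm_pdf(x, v_small))
               for x in data)

samples, cur, cur_ll = [], pi, log_lik(pi)
for _ in range(200):
    cand = min(0.999, max(0.001, cur + rng.gauss(0, 0.02)))
    cand_ll = log_lik(cand)
    if cand_ll > cur_ll or rng.random() < math.exp(cand_ll - cur_ll):
        cur, cur_ll = cand, cand_ll
    samples.append(cur)

print(f"EM estimate of pi: {pi:.3f}; posterior mean: {sum(samples)/len(samples):.3f}")
```

The deterministic EM stage does the expensive convergence work once; the short chain then supplies the posterior uncertainty that a point estimate alone cannot.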

  17. SANDS: A Service-Oriented Architecture for Clinical Decision Support in a National Health Information Network

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

    In this paper we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. PMID:18434256

  18. When Neuroscience 'Touches' Architecture: From Hapticity to a Supramodal Functioning of the Human Brain.

    PubMed

    Papale, Paolo; Chiesi, Leonardo; Rampinini, Alessandra C; Pietrini, Pietro; Ricciardi, Emiliano

    2016-01-01

    In the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Specifically, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations in regard to people-environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.

  19. Systematic Development of Intelligent Systems for Public Road Transport.

    PubMed

    García, Carmelo R; Quesada-Arencibia, Alexis; Cristóbal, Teresa; Padrón, Gabino; Alayón, Francisco

    2016-07-16

    This paper presents an architecture model for the development of intelligent systems for public passenger transport by road. The main objective of our proposal is to provide a framework for the systematic development and deployment of telematics systems to improve various aspects of this type of transport, such as efficiency, accessibility and safety. The architecture model presented herein is based on international standards on intelligent transport system architectures, ubiquitous computing and service-oriented architecture for distributed systems. To illustrate the utility of the model, we also present a use case of a monitoring system for stops on a public passenger road transport network.

  20. Systematic Development of Intelligent Systems for Public Road Transport

    PubMed Central

    García, Carmelo R.; Quesada-Arencibia, Alexis; Cristóbal, Teresa; Padrón, Gabino; Alayón, Francisco

    2016-01-01

    This paper presents an architecture model for the development of intelligent systems for public passenger transport by road. The main objective of our proposal is to provide a framework for the systematic development and deployment of telematics systems to improve various aspects of this type of transport, such as efficiency, accessibility and safety. The architecture model presented herein is based on international standards on intelligent transport system architectures, ubiquitous computing and service-oriented architecture for distributed systems. To illustrate the utility of the model, we also present a use case of a monitoring system for stops on a public passenger road transport network. PMID:27438836

  1. Nanoscale nuclear architecture for cancer diagnosis by spatial-domain low-coherence quantitative phase microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Pin; Bista, Rajan K.; Khalbuss, Walid E.; Qiu, Wei; Staton, Kevin D.; Zhang, Lin; Brentnall, Teresa A.; Brand, Randall E.; Liu, Yang

    2011-03-01

    Alterations in nuclear architecture are the hallmark diagnostic characteristic of cancer cells. In this work, we show that the nuclear architectural characteristics quantified by spatial-domain low-coherence quantitative phase microscopy (SL-QPM) are more sensitive for the identification of cancer cells than conventional cytopathology. We demonstrated the importance of nuclear architectural characteristics in both an animal model of intestinal carcinogenesis (the APC/Min mouse model) and human cytology specimens with colorectal cancer by identifying cancer in cytologically noncancerous-appearing cells. The determination of nanoscale nuclear architecture using this simple and practical optical instrument is a significant advance towards cancer diagnosis.

  2. The past, present, and future of cognitive architectures.

    PubMed

    Taatgen, Niels; Anderson, John R

    2010-10-01

    Cognitive architectures are theories of cognition that try to capture the essential representations and mechanisms that underlie cognition. Research in cognitive architectures has gradually moved from a focus on the functional capabilities of architectures to the ability to model the details of human behavior, and, more recently, brain activity. Although there are many different architectures, they share many identical or similar mechanisms, permitting possible future convergence. In judging the quality of a particular cognitive model, it is pertinent to not just judge its fit to the experimental data but also its simplicity and ability to make predictions. Copyright © 2009 Cognitive Science Society, Inc.

  3. The Software Architecture of Global Climate Models

    NASA Astrophysics Data System (ADS)

    Alexander, K. A.; Easterbrook, S. M.

    2011-12-01

    It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.
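The component-based architecture and coupler pattern described above can be sketched in miniature: each component encapsulates one subsystem and exposes only the fields it exports, while the coupler mediates all exchange. This is a toy scalar relaxation between two made-up components, not a representation of any real GCM; the field names and coefficients are hypothetical:

```python
# Each component encapsulates one Earth-system subsystem and exposes
# only the fields it exports; the coupler mediates all data exchange.
class Atmosphere:
    def __init__(self):
        self.air_temp = 15.0

    def step(self, sea_surface_temp):
        # Relax air temperature toward the ocean surface temperature.
        self.air_temp += 0.1 * (sea_surface_temp - self.air_temp)
        return {"air_temp": self.air_temp}

class Ocean:
    def __init__(self):
        self.sst = 18.0

    def step(self, air_temp):
        # The ocean responds more slowly than the atmosphere.
        self.sst += 0.02 * (air_temp - self.sst)
        return {"sst": self.sst}

class Coupler:
    """Passes exported fields between components each coupling interval."""
    def __init__(self, atmos, ocean):
        self.atmos, self.ocean = atmos, ocean

    def run(self, n_steps):
        for _ in range(n_steps):
            a = self.atmos.step(self.ocean.sst)
            o = self.ocean.step(a["air_temp"])
        return a["air_temp"], o["sst"]

air, sst = Coupler(Atmosphere(), Ocean()).run(100)
print(f"after coupling: air={air:.2f} C, sst={sst:.2f} C")
```

The encapsulation is what enables the mix-and-match sharing of components between institutions that the study describes: as long as the exported field names match, either component can be swapped without touching the other.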

  4. Lander Propulsion Overview and Technology Requirements Discussion

    NASA Technical Reports Server (NTRS)

    Brown, Thomas M.

    2007-01-01

    This viewgraph presentation reviews the lunar lander propulsion requirements. It includes discussion on: Lander Project Overview, Project Evolution/Design Cycles, Lunar Architecture & Lander Reference Missions, Lander Concept Configurations, Descent and Ascent propulsion reviews, and a review of the technology requirements.

  5. Assessment of various parameters to improve MALDI-TOF MS reference spectra libraries constructed for the routine identification of filamentous fungi

    PubMed Central

    2013-01-01

    Background The poor reproducibility of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) spectra limits the effectiveness of the MALDI-TOF MS-based identification of filamentous fungi with highly heterogeneous phenotypes in routine clinical laboratories. This study aimed to enhance the MALDI-TOF MS-based identification of filamentous fungi by assessing several architectures of reference spectrum libraries. Results We established reference spectrum libraries that included 30 filamentous fungus species with various architectures characterized by distinct combinations of the following: i) technical replicates, i.e., the number of analyzed deposits for each culture used to build a reference meta-spectrum (RMS); ii) biological replicates, i.e., the number of RMS derived from distinct subcultures of each strain; and iii) the number of distinct strains of a given species. We then compared the effectiveness of each library in the identification of 200 prospectively collected clinical isolates, including 38 species in 28 genera. In a multivariate analysis, identification effectiveness was improved by increasing the number of both RMS per strain (p < 10^-4) and strains for a given species (p < 10^-4). Conclusion Addressing the heterogeneity of MALDI-TOF spectra derived from filamentous fungi by increasing the number of RMS obtained from distinct subcultures of the strains included in the reference spectrum library markedly improved the effectiveness of the MALDI-TOF MS-based identification of clinical filamentous fungi. PMID:23565856
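The library architecture described above (several strains per species, several RMS per strain, each RMS averaged from technical-replicate deposits) can be sketched as a data structure with a best-match lookup. The species names, peak vectors, and cosine scoring below are hypothetical stand-ins for real spectra and commercial scoring schemes:

```python
import math
from collections import defaultdict

def meta_spectrum(replicates):
    """Average technical-replicate spectra (as aligned peak-intensity
    vectors) into one reference meta-spectrum (RMS)."""
    n = len(replicates)
    return [sum(col) / n for col in zip(*replicates)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hypothetical library: species -> list of RMS, one per strain subculture
# (the biological replicates that the study found most beneficial).
library = defaultdict(list)
library["A. fumigatus"].append(meta_spectrum([[9, 1, 0, 4], [11, 1, 0, 4]]))
library["A. fumigatus"].append(meta_spectrum([[10, 2, 0, 5], [8, 2, 0, 3]]))
library["F. solani"].append(meta_spectrum([[1, 8, 5, 0], [1, 10, 7, 0]]))

def identify(spectrum):
    """Assign the species whose best RMS scores highest against the query."""
    scores = {species: max(cosine(spectrum, rms) for rms in rms_list)
              for species, rms_list in library.items()}
    return max(scores, key=scores.get)

print(identify([10, 1, 0, 4]))
```

Adding more RMS per strain widens the phenotypic variation the library covers, which is why the max-over-RMS lookup benefits from biological replicates.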

  6. Modeling and Simulation of Phased Array Antennas to Support Next-Generation Satellite Design

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert; Manning, Robert; Fuentes, Michael

    2016-01-01

    Developing enhanced simulation capabilities has become a significant priority for the Space Communications and Navigation (SCaN) project at NASA as new space communications technologies are proposed to replace aging NASA communications assets, such as the Tracking and Data Relay Satellite System (TDRSS). When developing the architecture for these new space communications assets, it is important to develop updated modeling and simulation methodologies, such that competing architectures can be weighed against one another and the optimal path forward can be determined. There have been many simulation tools developed here at NASA for the simulation of single RF link budgets, or for the modeling and simulation of an entire network of spacecraft and their supporting SCaN network elements. However, the modeling capabilities are never fully complete, and as new technologies are proposed, gaps are identified. One such gap is the ability to rapidly develop high fidelity simulation models of electronically steerable phased array systems. As future relay satellite architectures are proposed that include optical communications links, electronically steerable antennas will become more desirable due to the reduction in platform vibration introduced by mechanically steerable devices. In this research, we investigate how modeling of these antennas can be introduced into our overall simulation and modeling structure. The ultimate goal of this research is two-fold. First, to enable NASA engineers to model various proposed simulation architectures and determine which proposed architecture meets the given architectural requirements. Second, given a set of communications link requirements for a proposed satellite architecture, determine the optimal configuration for a phased array antenna. There is a variety of tools available that can be used to model phased array antennas.
To meet our stated goals, the first objective of this research is to compare the subset of tools available to us, trading off the modeling fidelity of each tool against its simulation performance. When comparing several proposed architectures, higher-fidelity modeling may be desirable; however, when iterating a proposed set of communication link requirements across ranges of phased array configuration parameters, performance becomes a significant practical requirement. In either case, a minimum simulation fidelity must be met, regardless of performance considerations, which will be discussed in this research. Given a suitable set of phased array modeling tools, this research then focuses on integration with current SCaN modeling and simulation tools. While properly modeling the antenna elements of a system is vital, this is only a small part of the end-to-end communication path between a satellite and the supporting ground station and/or relay satellite assets. To properly model a proposed simulation architecture, this toolset must be integrated with other commercial and government development tools, such that the overall architecture can be examined in terms of communications, reliability, and cost. In this research, integration with previously developed communication tools is investigated.
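The electronic steering at the heart of a phased array can be sketched with the textbook array factor of a uniform linear array: a progressive phase shift across elements points the main beam without any mechanical motion. This is a minimal idealized model, ignoring element patterns, mutual coupling, and quantized phase shifters that real tools must capture:

```python
import cmath
import math

def array_factor_db(n_elements, d_wavelengths, steer_deg, theta_deg):
    """Normalized array factor (dB) of a uniform linear phased array.

    Each element k gets a progressive phase shift so the main beam
    points at steer_deg; d_wavelengths is element spacing in wavelengths.
    """
    steer = math.radians(steer_deg)
    theta = math.radians(theta_deg)
    af = sum(cmath.exp(1j * 2 * math.pi * d_wavelengths * k *
                       (math.sin(theta) - math.sin(steer)))
             for k in range(n_elements))
    return 20 * math.log10(abs(af) / n_elements + 1e-12)

# 16 elements at half-wavelength spacing, beam electronically steered to 20 deg.
for ang in (0, 20, 40):
    print(f"theta={ang:2d} deg: {array_factor_db(16, 0.5, 20, ang):6.1f} dB")
```

Sweeping configuration parameters such as element count and spacing through a function like this is the kind of iteration for which the paper notes simulation performance becomes a significant requirement.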

  7. Rethinking the architectural design concept in the digital culture (in architecture's practice perspective)

    NASA Astrophysics Data System (ADS)

    Prawata, Albertus Galih

    2017-11-01

    The architectural design stages in architectural practice or in the architectural design studio comprise many aspects. One of them occurs during the early phases of the design process, when architects or designers try to interpret the project brief into a design concept. This paper reports on the use of digital tools in the early design process at an architectural practice in Jakarta. It principally targets the use of BIM and digital modeling to generate information and transform it into conceptual forms, which is not very common in Indonesian architectural practice. Traditionally, the project brief is transformed into conceptual forms using sketches, drawings, and physical models. The new method using digital tools shows that it is possible to do the same during the initial stage of the design process to create early architectural design forms. Architects' traditional tools and methods are beginning to be replaced effectively by digital tools, which could open greater opportunities for innovation.

  8. Modeling and Verification of Dependable Electronic Power System Architecture

    NASA Astrophysics Data System (ADS)

    Yuan, Ling; Fan, Ping; Zhang, Xiao-fang

    The electronic power system can be viewed as a system composed of a set of concurrently interacting subsystems that generate, transmit, and distribute electric power. The complex interaction among subsystems makes the design of an electronic power system complicated. Furthermore, in order to guarantee the safe generation and distribution of electric power, fault tolerant mechanisms are incorporated in the system design to satisfy high reliability requirements. As a result, this incorporation makes the design of such systems even more complicated. We propose a dependable electronic power system architecture, which provides a generic framework to guide the development of electronic power systems and ease development complexity. In order to provide common idioms and patterns to system designers, we formally model the electronic power system architecture using the PVS formal language. Based on the PVS model of this system architecture, we formally verify the fault tolerant properties of the architecture using the PVS theorem prover, which guarantees that the system architecture can satisfy the high reliability requirements.

  9. Understanding Portability of a High-Level Programming Model on Contemporary Heterogeneous Architectures

    DOE PAGES

    Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...

    2015-07-13

    Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  10. NASA Capability Roadmaps Executive Summary

    NASA Technical Reports Server (NTRS)

    Willcoxon, Rita; Thronson, Harley; Varsi, Guilio; Mueller, Robert; Regenie, Victoria; Inman, Tom; Crooke, Julie; Coulter, Dan

    2005-01-01

    This document is the result of eight months of hard work and dedication from NASA, industry, other government agencies, and academic experts from across the nation. It provides a summary of the capabilities necessary to execute the Vision for Space Exploration and the key architecture decisions that drive the direction for those capabilities. This report is being provided to the Exploration Systems Architecture Study (ESAS) team for consideration in development of an architecture approach and investment strategy to support NASA future mission, programs and budget requests. In addition, it will be an excellent reference for NASA's strategic planning. A more detailed set of roadmaps at the technology and sub-capability levels are available on CD. These detailed products include key driving assumptions, capability maturation assessments, and technology and capability development roadmaps.

  11. Distributed and parallel approach for handle and perform huge datasets

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large and disparate volumes of data comes from many different sources (tools, machines, sensors, mobile devices) uncorrelated with each others. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. Proper architecture of the system that perform huge data sets is needed. In this paper, the comparison of distributed and parallel system architecture is presented on the example of MapReduce (MR) Hadoop platform and parallel database platform (DBMS). This paper also analyzes the problem of performing and handling valuable information from petabytes of data. The both paradigms: MapReduce and parallel DBMS are described and compared. The hybrid architecture approach is also proposed and could be used to solve the analyzed problem of storing and processing Big Data.

  12. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested, within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal /Fluid Team

    The SIERRA Low Mach Module: Fuego along with the SIERRA Participating Media Radiation Module: Syrinx, henceforth referred to as Fuego and Syrinx, respectively, are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the coremore » architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.« less

  14. On some methods for improving time of reachability sets computation for the dynamic system control problem

    NASA Astrophysics Data System (ADS)

    Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir

    2016-12-01

    The paper presents two different approaches to reduce the time of computer calculation of reachability sets. First of these two approaches use different data structures for storing the reachability sets in the computer memory for calculation in single-threaded mode. Second approach is based on using parallel algorithms with reference to the data structures from the first approach. Within the framework of this paper parallel algorithm of approximate reachability set calculation on computer with SMP-architecture is proposed. The results of numerical modelling are presented in the form of tables which demonstrate high efficiency of parallel computing technology and also show how computing time depends on the used data structure.

  15. Demographic management in a federated healthcare environment.

    PubMed

    Román, I; Roa, L M; Reina-Tosina, J; Madinabeitia, G

    2006-09-01

    The purpose of this paper is to provide a further step toward the decentralization of identification and demographic information about persons by solving issues related to the integration of demographic agents in a federated healthcare environment. The aim is to identify a particular person in every system of a federation and to obtain a unified view of his/her demographic information stored in different locations. This work is based on semantic models and techniques, and pursues the reconciliation of several current standardization works including ITU-T's Open Distributed Processing, CEN's prEN 12967, OpenEHR's dual and reference models, CEN's General Purpose Information Components and CORBAmed's PID service. We propose a new paradigm for the management of person identification and demographic data, based on the development of an open architecture of specialized distributed components together with the incorporation of techniques for the efficient management of domain ontologies, in order to have a federated demographic service. This new service enhances previous correlation solutions sharing ideas with different standards and domains like semantic techniques and database systems. The federation philosophy enforces us to devise solutions to the semantic, functional and instance incompatibilities in our approach. Although this work is based on several models and standards, we have improved them by combining their contributions and developing a federated architecture that does not require the centralization of demographic information. The solution is thus a good approach to face integration problems and the applied methodology can be easily extended to other tasks involved in the healthcare organization.

  16. Engineering fibrin-based tissue constructs from myofibroblasts and application of constraints and strain to induce cell and collagen reorganization.

    PubMed

    de Jonge, Nicky; Baaijens, Frank P T; Bouten, Carlijn V C

    2013-10-28

    Collagen content and organization in developing collagenous tissues can be influenced by local tissue strains and tissue constraint. Tissue engineers aim to use these principles to create tissues with predefined collagen architectures. A full understanding of the exact underlying processes of collagen remodeling to control the final tissue architecture, however, is lacking. In particular, little is known about the (re)orientation of collagen fibers in response to changes in tissue mechanical loading conditions. We developed an in vitro model system, consisting of biaxially-constrained myofibroblast-seeded fibrin constructs, to further elucidate collagen (re)orientation in response to i) reverting biaxial to uniaxial static loading conditions and ii) cyclic uniaxial loading of the biaxially-constrained constructs before and after a change in loading direction, with use of the Flexcell FX4000T loading device. Time-lapse confocal imaging is used to visualize collagen (re)orientation in a nondestructive manner. Cell and collagen organization in the constructs can be visualized in real-time, and an internal reference system allows us to relocate cells and collagen structures for time-lapse analysis. Various aspects of the model system can be adjusted, like cell source or use of healthy and diseased cells. Additives can be used to further elucidate mechanisms underlying collagen remodeling, by for example adding MMPs or blocking integrins. Shape and size of the construct can be easily adapted to specific needs, resulting in a highly tunable model system to study cell and collagen (re)organization.

  17. CCSDS Spacecraft Monitor and Control Service Framework

    NASA Technical Reports Server (NTRS)

    Merri, Mario; Schmidt, Michael; Ercolani, Alessandro; Dankiewicz, Ivan; Cooper, Sam; Thompson, Roger; Symonds, Martin; Oyake, Amalaye; Vaughs, Ashton; Shames, Peter

    2004-01-01

    This CCSDS paper presents a reference architecture and service framework for spacecraft monitoring and control. It has been prepared by the Spacecraft Monitoring and Control working group of the CCSDS Mission Operations and Information Management Systems (MOIMS) area. In this context, Spacecraft Monitoring and Control (SM&C) refers to end-to-end services between on- board or remote applications and ground-based functions responsible for mission operations. The scope of SM&C includes: 1) Operational Concept: definition of an operational concept that covers a set of standard operations activities related to the monitoring and control of both ground and space segments. 2) Core Set of Services: definition of an extensible set of services to support the operational concept together with its information model and behaviours. This includes (non exhaustively) ground systems such as Automatic Command and Control, Data Archiving and Retrieval, Flight Dynamics, Mission Planning and Performance Evaluation. 3) Application-layer information: definition of the standard information set to be exchanged for SM&C purposes.

  18. An Object-Oriented Network-Centric Software Architecture for Physical Computing

    NASA Astrophysics Data System (ADS)

    Palmer, Richard

    1997-08-01

    Recent developments in object-oriented computer languages and infrastructure such as the Internet, Web browsers, and the like provide an opportunity to define a more productive computational environment for scientific programming that is based more closely on the underlying mathematics describing physics than traditional programming languages such as FORTRAN or C++. In this talk I describe an object-oriented software architecture for representing physical problems that includes classes for such common mathematical objects as geometry, boundary conditions, partial differential and integral equations, discretization and numerical solution methods, etc. In practice, a scientific program written using this architecture looks remarkably like the mathematics used to understand the problem, is typically an order of magnitude smaller than traditional FORTRAN or C++ codes, and hence easier to understand, debug, describe, etc. All objects in this architecture are ``network-enabled,'' which means that components of a software solution to a physical problem can be transparently loaded from anywhere on the Internet or other global network. The architecture is expressed as an ``API,'' or application programmers interface specification, with reference embeddings in Java, Python, and C++. A C++ class library for an early version of this API has been implemented for machines ranging from PC's to the IBM SP2, meaning that phidentical codes run on all architectures.

  19. Re-Engineering Complex Legacy Systems at NASA

    NASA Technical Reports Server (NTRS)

    Ruszkowski, James; Meshkat, Leila

    2010-01-01

    The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.

  20. Designing Low-Income Housing Using Local Architectural Concepts

    NASA Astrophysics Data System (ADS)

    Trumansyahjaya, K.; Tatura, L. S.

    2018-02-01

    The provision of houses for low-income people who do not have a home worthy of being one of the major problems in the city of Gorontalo, because the community in establishing the house only pay attention to their wants and needs in creating a healthy environment, the beauty of the city and the planning of the home environment in accordance with the culture of the people of Gorontalo. In relation to the condition, the focus of this research is the design of housing based on local architecture as residential house so that it can be reached by a group of low income people with house and environment form determined based on family development, social and economic development of society and environment which take into account the local culture. Stages of this research includes five (5) stages, including the identification phase characteristics Gorontalo people of low income, the characteristics of the identification phase house inhabited by low-income people, the stage of identification preference low-income households, the phase formation house prototype and the environment, as well as the stage of formation model home for low-income people. Analysis of the model homes for low-income people using descriptive analysis, Hierarchical Cluster Analysis, and discrimination analysis to produce a prototype of the house and its surroundings. The prototype is then reanalyzed to obtain the model home for low-income people in the city of Gorontalo. The shape of a model home can be used as a reference for developers of housing intended for low-income people so that housing is provided to achieve the goals and the desired target group.

  1. A component-based problem list subsystem for the HOLON testbed. Health Object Library Online.

    PubMed Central

    Law, V.; Goldberg, H. S.; Jones, P.; Safran, C.

    1998-01-01

    One of the deliverables of the HOLON (Health Object Library Online) project is the specification of a reference architecture for clinical information systems that facilitates the development of a variety of discrete, reusable software components. One of the challenges facing the HOLON consortium is determining what kinds of components can be made available in a library for developers of clinical information systems. To further explore the use of component architectures in the development of reusable clinical subsystems, we have incorporated ongoing work in the development of enterprise terminology services into a Problem List subsystem for the HOLON testbed. We have successfully implemented a set of components using CORBA (Common Object Request Broker Architecture) and Java distributed object technologies that provide a functional problem list application and UMLS-based "Problem Picker." Through this development, we have overcome a variety of obstacles characteristic of rapidly emerging technologies, and have identified architectural issues necessary to scale these components for use and reuse within an enterprise clinical information system. PMID:9929252

  2. Communications System Architecture Development for Air Traffic Management and Aviation Weather Information Dissemination

    NASA Technical Reports Server (NTRS)

    Gallagher, Seana; Olson, Matt; Blythe, Doug; Heletz, Jacob; Hamilton, Griff; Kolb, Bill; Homans, Al; Zemrowski, Ken; Decker, Steve; Tegge, Cindy

    2000-01-01

    This document is the NASA AATT Task Order 24 Final Report. NASA Research Task Order 24 calls for the development of eleven distinct task reports. Each task was a necessary exercise in the development of comprehensive communications systems architecture (CSA) for air traffic management and aviation weather information dissemination for 2015, the definition of the interim architecture for 2007, and the transition plan to achieve the desired End State. The eleven tasks are summarized along with the associated Task Order reference. The output of each task was an individual task report. The task reports that make up the main body of this document include Task 5, Task 6, Task 7, Task 8, Task 10, and Task 11. The other tasks provide the supporting detail used in the development of the architecture. These reports are included in the appendices. The detailed user needs, functional communications requirements and engineering requirements associated with Tasks 1, 2, and 3 have been put into a relational database and are provided electronically.

  3. Parallel digital modem using multirate digital filter banks

    NASA Technical Reports Server (NTRS)

    Sadr, Ramin; Vaidyanathan, P. P.; Raphaeli, Dan; Hinedi, Sami

    1994-01-01

    A new class of architectures for an all-digital modem is presented in this report. This architecture, referred to as the parallel receiver (PRX), is based on employing multirate digital filter banks (DFB's) to demodulate, track, and detect the received symbol stream. The resulting architecture is derived, and specifications are outlined for designing the DFB for the PRX. The key feature of this approach is a lower processing rate then either the Nyquist rate or the symbol rate, without any degradation in the symbol error rate. Due to the freedom in choosing the processing rate, the designer is able to arbitrarily select and use digital components, independent of the speed of the integrated circuit technology. PRX architecture is particularly suited for high data rate applications, and due to the modular structure of the parallel signal path, expansion to even higher data rates is accommodated with each. Applications of the PRX would include gigabit satellite channels, multiple spacecraft, optical links, interactive cable-TV, telemedicine, code division multiple access (CDMA) communications, and others.

  4. A component-based problem list subsystem for the HOLON testbed. Health Object Library Online.

    PubMed

    Law, V; Goldberg, H S; Jones, P; Safran, C

    1998-01-01

    One of the deliverables of the HOLON (Health Object Library Online) project is the specification of a reference architecture for clinical information systems that facilitates the development of a variety of discrete, reusable software components. One of the challenges facing the HOLON consortium is determining what kinds of components can be made available in a library for developers of clinical information systems. To further explore the use of component architectures in the development of reusable clinical subsystems, we have incorporated ongoing work in the development of enterprise terminology services into a Problem List subsystem for the HOLON testbed. We have successfully implemented a set of components using CORBA (Common Object Request Broker Architecture) and Java distributed object technologies that provide a functional problem list application and UMLS-based "Problem Picker." Through this development, we have overcome a variety of obstacles characteristic of rapidly emerging technologies, and have identified architectural issues necessary to scale these components for use and reuse within an enterprise clinical information system.

  5. Architectural Design of a LMS with LTSA-Conformance

    ERIC Educational Resources Information Center

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper illustrates an approach for architectural design of a Learning Management System (LMS), which is verifiable against the Learning Technology System Architecture (LTSA) conformance rules. We introduce a new method for software architectural design that extends the Unified Modeling Language (UML) component diagram with the formal…

  6. A processing architecture for associative short-term memory in electronic noses

    NASA Astrophysics Data System (ADS)

    Pioggia, G.; Ferro, M.; Di Francesco, F.; DeRossi, D.

    2006-11-01

    Electronic nose (e-nose) architectures usually consist of several modules that process various tasks such as control, data acquisition, data filtering, feature selection and pattern analysis. Heterogeneous techniques derived from chemometrics, neural networks, and fuzzy rules used to implement such tasks may lead to issues concerning module interconnection and cooperation. Moreover, a new learning phase is mandatory once new measurements have been added to the dataset, thus causing changes in the previously derived model. Consequently, if a loss in the previous learning occurs (catastrophic interference), real-time applications of e-noses are limited. To overcome these problems this paper presents an architecture for dynamic and efficient management of multi-transducer data processing techniques and for saving an associative short-term memory of the previously learned model. The architecture implements an artificial model of a hippocampus-based working memory, enabling the system to be ready for real-time applications. Starting from the base models available in the architecture core, dedicated models for neurons, maps and connections were tailored to an artificial olfactory system devoted to analysing olive oil. In order to verify the ability of the processing architecture in associative and short-term memory, a paired-associate learning test was applied. The avoidance of catastrophic interference was observed.

  7. Reference Specifications for SAVOIR Avionics Elements

    NASA Astrophysics Data System (ADS)

    Hult, Torbjorn; Lindskog, Martin; Roques, Remi; Planche, Luc; Brunjes, Bernhard; Dellandrea, Brice; Terraillon, Jean-Loup

    2012-08-01

    Space industry and Agencies have been recognizing already for quite some time the need to raise the level of standardisation in the spacecraft avionics systems in order to increase efficiency and reduce development cost and schedule. This also includes the aspect of increasing competition in global space business, which is a challenge that European space companies are facing at all stages of involvement in the international markets.A number of initiatives towards this vision are driven both by the industry and ESA’s R&D programmes. However, today an intensified coordination of these activities is required in order to achieve the necessary synergy and to ensure they converge towards the shared vision. It has been proposed to federate these initiatives under the common Space Avionics Open Interface Architecture (SAVOIR) initiative. Within this initiative, the approach based on reference architectures and building blocks plays a key role.Following the principles outlined above, the overall goal of the SAVOIR is to establish a streamlined onboard architecture in order to standardize the development of avionics systems for space programmes. This reflects the need to increase efficiency and cost-effectiveness in the development process as well as account the trend towards more functionality implemented by the onboard building blocks, i.e. HW and SW components, and more complexity for the overall space mission objectives.

  8. Finding Services for an Open Architecture: A Review of Existing Applications and Programs in PEO C4I

    DTIC Science & Technology

    2011-01-01

    2004) Two key SOA success factors listed were as follows: 1. Shared Services Strategy: Existence of a strategy to identify overlapping business and...model Architectural pattern 22 Finding Services for an Open Architecture or eliminating redundancies and overlaps through use of shared services 2...Funding Model: Existence of an IT funding model aligned with and supportive of a shared services strategy. (Sun Micro- systems, 2004) Become Data

  9. Effect of genetic architecture on the prediction accuracy of quantitative traits in samples of unrelated individuals.

    PubMed

    Morgante, Fabio; Huang, Wen; Maltecca, Christian; Mackay, Trudy F C

    2018-06-01

    Predicting complex phenotypes from genomic data is a fundamental aim of animal and plant breeding, where we wish to predict genetic merits of selection candidates; and of human genetics, where we wish to predict disease risk. While genomic prediction models work well with populations of related individuals and high linkage disequilibrium (LD) (e.g., livestock), comparable models perform poorly for populations of unrelated individuals and low LD (e.g., humans). We hypothesized that low prediction accuracies in the latter situation may occur when the genetics architecture of the trait departs from the infinitesimal and additive architecture assumed by most prediction models. We used simulated data for 10,000 lines based on sequence data from a population of unrelated, inbred Drosophila melanogaster lines to evaluate this hypothesis. We show that, even in very simplified scenarios meant as a stress test of the commonly used Genomic Best Linear Unbiased Predictor (G-BLUP) method, using all common variants yields low prediction accuracy regardless of the trait genetic architecture. However, prediction accuracy increases when predictions are informed by the genetic architecture inferred from mapping the top variants affecting main effects and interactions in the training data, provided there is sufficient power for mapping. When the true genetic architecture is largely or partially due to epistatic interactions, the additive model may not perform well, while models that account explicitly for interactions generally increase prediction accuracy. Our results indicate that accounting for genetic architecture can improve prediction accuracy for quantitative traits.

  10. Joint Composable Object Model and LVC Methodology

    NASA Technical Reports Server (NTRS)

    Rheinsmith, Richard; Wallace, Jeffrey; Bizub, Warren; Ceranowicz, Andy; Cutts, Dannie; Powell, Edward T.; Gustavson, Paul; Lutz, Robert; McCloud, Terrell

    2010-01-01

    Within the Department of Defense, multiple architectures are created to serve and fulfill one or several specific service or mission related LVC training goals. Multiple Object Models exist across and within those architectures and it is there that those disparate object models are a major source of interoperability problems when developing and constructing the training scenarios. The two most commonly used architectures are; HLA and TENA, with DIS and CTIA following close behind in terms of the number of users. Although these multiple architectures can share and exchange data the underlying meta-models for runtime data exchange are quite different, requiring gateways/translators to bridge between the different object model representations; while the Department of Defense's use of gateways are generally effective in performing these functions, as the LVC environment increases so too does the cost and complexity of these gateways. Coupled with the wide range of different object models across the various user communities we increase the propensity for run time errors, increased programmer stop gap measures during coordinated exercises, or failure of the system as a whole due to unknown or unforeseen incompatibilities. The Joint Composable Object Model (JCOM) project was established under an M&S Steering Committee (MSSC)-sponsored effort with oversight and control placed under the Joint Forces Command J7 Advanced Concepts Program Directorate. The purpose of this paper is to address the initial and the current progress that has been made in the following areas; the Conceptual Model Development Format, the Common Object Model, the Architecture Neutral Data Exchange Model (ANDEM), and the association methodology to allow the re-use of multiple architecture object models and the development of the prototype persistent reusable library.

  11. New architecture for dynamic frame-skipping transcoder.

    PubMed

    Fung, Kai-Tat; Chan, Yui-Lam; Siu, Wan-Chi

    2002-01-01

    Transcoding is a key technique for reducing the bit rate of a previously compressed video signal. A high transcoding ratio may result in an unacceptable picture quality when the full frame rate of the incoming video bitstream is used. Frame skipping is often used as an efficient scheme to allocate more bits to the representative frames, so that an acceptable quality for each frame can be maintained. However, the skipped frame must be decompressed completely, which might act as a reference frame to nonskipped frames for reconstruction. The newly quantized discrete cosine transform (DCT) coefficients of the prediction errors need to be re-computed for the nonskipped frame with reference to the previous nonskipped frame; this can create undesirable complexity as well as introduce re-encoding errors. In this paper, we propose new algorithms and a novel architecture for frame-rate reduction to improve picture quality and to reduce complexity. The proposed architecture is mainly performed on the DCT domain to achieve a transcoder with low complexity. With the direct addition of DCT coefficients and an error compensation feedback loop, re-encoding errors are reduced significantly. Furthermore, we propose a frame-rate control scheme which can dynamically adjust the number of skipped frames according to the incoming motion vectors and re-encoding errors due to transcoding such that the decoded sequence can have a smooth motion as well as better transcoded pictures. Experimental results show that, as compared to the conventional transcoder, the new architecture for frame-skipping transcoder is more robust, produces fewer requantization errors, and has reduced computational complexity.

  12. Human Exploration of Mars: The Reference Mission of the NASA Mars Exploration Study Team

    NASA Astrophysics Data System (ADS)

    Hoffman, Stephen J.; Kaplan, David I.

    1997-07-01

    Personnel representing several NASA field centers have formulated a "Reference Mission" addressing human exploration of Mars. This report summarizes their work and describes a plan for the first human missions to Mars, using approaches that are technically feasible, have reasonable risks, and have relatively low costs. The architecture for the Mars Reference Mission builds on previous work of the Synthesis Group (1991) and Zubrin's (1991) concepts for the use of propellants derived from the Martian Atmosphere. In defining the Reference Mission, choices have been made. In this report, the rationale for each choice is documented; however, unanticipated technology advances or political decisions might change the choices in the future.

  13. Human Exploration of Mars: The Reference Mission of the NASA Mars Exploration Study Team

    NASA Technical Reports Server (NTRS)

    Hoffman, Stephen J. (Editor); Kaplan, David I. (Editor)

    1997-01-01

    Personnel representing several NASA field centers have formulated a "Reference Mission" addressing human exploration of Mars. This report summarizes their work and describes a plan for the first human missions to Mars, using approaches that are technically feasible, have reasonable risks, and have relatively low costs. The architecture for the Mars Reference Mission builds on previous work of the Synthesis Group (1991) and Zubrin's (1991) concepts for the use of propellants derived from the Martian Atmosphere. In defining the Reference Mission, choices have been made. In this report, the rationale for each choice is documented; however, unanticipated technology advances or political decisions might change the choices in the future.

  14. Enabling Future Science and Human Exploration with NASA's Next Generation Near Earth and Deep Space Communications and Navigation Architecture

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard; Schier, James; Israel, David; Tai, Wallace; Liebrecht, Philip; Townes, Stephen

    2017-01-01

    The National Aeronautics and Space Administration (NASA) is studying alternatives for the United States space communications architecture through the 2040 timeframe. This architecture provides communication and navigation services to both human exploration and science missions throughout the solar system. Several of NASA's key space assets are approaching their end of design life and major systems are in need of replacement. The changes envisioned in the relay satellite architecture and capabilities around both Earth and Mars are significant undertakings and occur only once or twice each generation, and are therefore referred to as NASA's next generation space communications architecture. NASA's next generation architecture will benefit from technology and services developed over recent years. These innovations will provide missions with new operations concepts, increased performance, and new business and operating models. Advancements in optical communications will enable high-speed data channels and the use of new and more complex science instruments. Modern multiple beam/multiple access technologies such as those employed on commercial high throughput satellites will enable enhanced capabilities for on-demand service, and with new protocols will help provide Internet-like connectivity for cooperative spacecraft to improve data return and coordinate joint mission objectives. On-board processing with autonomous and cognitive networking will play larger roles to help manage system complexity. Spacecraft and ground systems will coordinate among themselves to establish communications, negotiate link connectivity, and learn to share spectrum to optimize resource allocation. Spacecraft will autonomously navigate, plan trajectories, and handle off-nominal events. NASA intends to leverage the ever-expanding capabilities of the satellite communications industry and foster its continued growth.
NASA's technology development will complement and extend commercial capabilities to meet unique space environment requirements and to provide capabilities that are beyond the commercial marketplace. The progress of the communications industry, including the emerging global space internet segment and its planned constellations of hundreds of satellites, offers additional opportunities for new capabilities and mission concepts. The opportunities and challenges of a future space architecture require an optimal solution encompassing a global perspective. The concepts and technologies intentionally define an architecture that applies not only to NASA, but to other U.S. government agencies, international space and government agencies, and domestic and international industries to advance the openness, interoperability, and affordability of space communications. Cooperation among the world's space agencies, and the alignment of their capabilities, standards, operations, and interoperability, are key to advancing humankind's understanding of the universe and extending human presence into the solar system.

  15. Enabling Future Science and Human Exploration with NASA's Next Generation near Earth and Deep Space Communications and Navigation Architecture

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Schier, James S.; Israel, David J.; Tai, Wallace; Liebrecht, Philip E.; Townes, Stephen A.

    2017-01-01

    The National Aeronautics and Space Administration (NASA) is studying alternatives for the United States space communications architecture through the 2040 timeframe. This architecture provides communication and navigation services to both human exploration and science missions throughout the solar system. Several of NASA's key space assets are approaching their end of design life and major systems are in need of replacement. The changes envisioned in the relay satellite architecture and capabilities around both Earth and Mars are significant undertakings and occur only once or twice each generation, and are therefore referred to as NASA's next generation space communications architecture. NASA's next generation architecture will benefit from technology and services developed over recent years. These innovations will provide missions with new operations concepts, increased performance, and new business and operating models. Advancements in optical communications will enable high-speed data channels and the use of new and more complex science instruments. Modern multiple beam/multiple access technologies such as those employed on commercial high throughput satellites will enable enhanced capabilities for on-demand service, and with new protocols will help provide Internet-like connectivity for cooperative spacecraft to improve data return and coordinate joint mission objectives. On-board processing with autonomous and cognitive networking will play larger roles to help manage system complexity. Spacecraft and ground systems will coordinate among themselves to establish communications, negotiate link connectivity, and learn to share spectrum to optimize resource allocation. Spacecraft will autonomously navigate, plan trajectories, and handle off-nominal events. NASA intends to leverage the ever-expanding capabilities of the satellite communications industry and foster its continued growth.
NASA's technology development will complement and extend commercial capabilities to meet unique space environment requirements and to provide capabilities that are beyond the commercial marketplace. The progress of the communications industry, including the emerging global space internet segment and its planned constellations of hundreds of satellites, offers additional opportunities for new capabilities and mission concepts. The opportunities and challenges of a future space architecture require an optimal solution encompassing a global perspective. The concepts and technologies intentionally define an architecture that applies not only to NASA, but to other U.S. government agencies, international space and government agencies, and domestic and international industries to advance the openness, interoperability, and affordability of space communications. Cooperation among the world's space agencies, and the alignment of their capabilities, standards, operations, and interoperability, are key to advancing humankind's understanding of the universe and extending human presence into the solar system.

  16. 40 CFR 21.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., architectural, legal, fiscal, or economic investigations or studies; surveys, designs, plans, writings, drawings... one or more applicable standards. This can be determined with reference to design specifications..., alterations, or methods of operation the design specifications of which will provide a measure of treatment or...

  17. 40 CFR 21.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., architectural, legal, fiscal, or economic investigations or studies; surveys, designs, plans, writings, drawings... one or more applicable standards. This can be determined with reference to design specifications..., alterations, or methods of operation the design specifications of which will provide a measure of treatment or...

  18. 40 CFR 21.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., architectural, legal, fiscal, or economic investigations or studies; surveys, designs, plans, writings, drawings... one or more applicable standards. This can be determined with reference to design specifications..., alterations, or methods of operation the design specifications of which will provide a measure of treatment or...

  19. 40 CFR 21.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., architectural, legal, fiscal, or economic investigations or studies; surveys, designs, plans, writings, drawings... one or more applicable standards. This can be determined with reference to design specifications..., alterations, or methods of operation the design specifications of which will provide a measure of treatment or...

  20. 40 CFR 21.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., architectural, legal, fiscal, or economic investigations or studies; surveys, designs, plans, writings, drawings... one or more applicable standards. This can be determined with reference to design specifications..., alterations, or methods of operation the design specifications of which will provide a measure of treatment or...

  1. OSD CALS Architecture Master Plan Study. Concept Paper. Indexing. Volume 30

    DOT National Transportation Integrated Search

    1989-06-01

    An index identifies and references information that is exchanged between multiple users and systems. The increased automation that will take place as CALS evolves will dictate an increased use of indexes for the successful exchange of information. Th...

  2. Intelligent Transportation Systems (ITS) logical architecture : volume 3 : data dictionary

    DOT National Transportation Integrated Search

    1982-01-01

    A Guide to Reporting Highway Statistics is a principal part of Federal Highway Administration's comprehensive highway information collection effort. This Guide has two objectives: 1) To serve as a reference to the reporting system that the Federal Hi...

  3. 40 CFR 59.412 - Incorporations by reference.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) NATIONAL VOLATILE ORGANIC COMPOUND EMISSION STANDARDS FOR CONSUMER AND COMMERCIAL PRODUCTS National Volatile Organic Compound Emission Standards for Architectural Coatings § 59.412 Incorporations by... 19428-2959. (1) ASTM Method C 1315-95, Standard Specification for Liquid Membrane-Forming Compounds...

  4. 40 CFR 59.412 - Incorporations by reference.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) NATIONAL VOLATILE ORGANIC COMPOUND EMISSION STANDARDS FOR CONSUMER AND COMMERCIAL PRODUCTS National Volatile Organic Compound Emission Standards for Architectural Coatings § 59.412 Incorporations by... 19428-2959. (1) ASTM Method C 1315-95, Standard Specification for Liquid Membrane-Forming Compounds...

  5. An adaptable architecture for patient cohort identification from diverse data sources.

    PubMed

    Bache, Richard; Miles, Simon; Taweel, Adel

    2013-12-01

    We define and validate an architecture for systems that identify patient cohorts for clinical trials from multiple heterogeneous data sources. This architecture has an explicit query model capable of supporting temporal reasoning and expressing eligibility criteria independently of the representation of the data used to evaluate them. The architecture has the key feature that queries defined according to the query model are both pre- and post-processed, and this is used to address both structural and semantic heterogeneity. The process of extracting the relevant clinical facts is separated from the process of reasoning about them. A specific instance of the query model is then defined and implemented. We show that the specific instance of the query model has wide applicability. We then describe how it is used to access three diverse data warehouses to determine patient counts. Although the proposed architecture requires greater effort to implement the query model than would be the case for using just SQL and accessing a database management system directly, this effort is justified because it supports both temporal reasoning and heterogeneous data sources. The query model only needs to be implemented once no matter how many data sources are accessed. Each additional source requires only the implementation of a lightweight adaptor. The architecture has been used to implement a specific query model that can express complex eligibility criteria and access three diverse data warehouses, thus demonstrating the feasibility of this approach in dealing with temporal reasoning and data heterogeneity.
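
    The separation the abstract describes, in which each source implements only a lightweight adaptor for fact extraction while the eligibility reasoning stays source-independent, can be sketched as below. All class and method names are illustrative, not the paper's API.

```python
from abc import ABC, abstractmethod

class Criterion:
    """One eligibility criterion in an abstract query model."""
    def __init__(self, concept, op, value):
        self.concept, self.op, self.value = concept, op, value

class SourceAdaptor(ABC):
    """Lightweight adaptor: each data source implements only fact
    extraction; reasoning over the facts is source-independent."""
    @abstractmethod
    def extract_facts(self, concept):
        ...

class InMemorySource(SourceAdaptor):
    """Toy stand-in for one data warehouse."""
    def __init__(self, records):
        self.records = records
    def extract_facts(self, concept):
        return [(pid, r[concept]) for pid, r in self.records.items() if concept in r]

def count_cohort(source, criterion):
    """Post-processing step: evaluate the criterion over extracted facts."""
    ops = {">=": lambda a, b: a >= b, "<": lambda a, b: a < b}
    return sum(1 for _, v in source.extract_facts(criterion.concept)
               if ops[criterion.op](v, criterion.value))
```

    Adding another warehouse then means writing one more `SourceAdaptor` subclass; `count_cohort` and the criteria are untouched.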

  6. Cognitive Architectures and Human-Computer Interaction. Introduction to Special Issue.

    ERIC Educational Resources Information Center

    Gray, Wayne D.; Young, Richard M.; Kirschenbaum, Susan S.

    1997-01-01

    In this introduction to a special issue on cognitive architectures and human-computer interaction (HCI), editors and contributors provide a brief overview of cognitive architectures. The following four architectures represented by articles in this issue are: Soar; LICAI (linked model of comprehension-based action planning and instruction taking);…

  7. Privacy enhanced group communication in clinical environment

    NASA Astrophysics Data System (ADS)

    Li, Mingyan; Narayanan, Sreeram; Poovendran, Radha

    2005-04-01

    Privacy protection of medical records has always been an important issue and is mandated by the recent Health Insurance Portability and Accountability Act (HIPAA) standards. In this paper, we propose security architectures for a tele-referring system that allows electronic group communication among professionals for better quality treatments, while protecting patient privacy against unauthorized access. Although DICOM defines the much-needed guidelines for confidentiality of medical data during transmission, there is no provision in the existing medical security systems to guarantee patient privacy once the data has been received. In our design, we address this issue by enabling tracing back to the recipient whose received data is disclosed to outsiders, using watermarking technique. We present security architecture design of a tele-referring system using a distributed approach and a centralized web-based approach. The resulting tele-referring system (i) provides confidentiality during the transmission and ensures integrity and authenticity of the received data, (ii) allows tracing of the recipient who has either distributed the data to outsiders or whose system has been compromised, (iii) provides proof of receipt or origin, and (iv) can be easy to use and low-cost to employ in clinical environment.
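
    The tracing idea, embedding a recipient-specific mark in the data so a leaked copy identifies its recipient, can be illustrated with a toy least-significant-bit watermark. This is a deliberately simplified stand-in for the robust watermarking a real tele-referring system would use; the function names are illustrative.

```python
import numpy as np

def embed_recipient_id(image, recipient_id, n_bits=16):
    """Embed a recipient ID in the least-significant bits of the first
    n_bits pixels (a toy stand-in for a robust watermark)."""
    wm = image.copy().astype(np.uint8)
    flat = wm.reshape(-1)
    for i in range(n_bits):
        bit = (recipient_id >> i) & 1
        flat[i] = (flat[i] & 0xFE) | bit   # overwrite only the LSB
    return wm

def extract_recipient_id(image, n_bits=16):
    """Recover the embedded ID from a (possibly leaked) copy."""
    flat = image.reshape(-1)
    return sum((int(flat[i]) & 1) << i for i in range(n_bits))
```

    Each authorized recipient receives a copy with a distinct embedded ID; if a copy is later found outside the group, extracting the ID points to the recipient whose copy was disclosed.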

  8. Minimizing energy dissipation of matrix multiplication kernel on Virtex-II

    NASA Astrophysics Data System (ADS)

    Choi, Seonil; Prasanna, Viktor K.; Jang, Ju-wook

    2002-07-01

    In this paper, we develop energy-efficient designs for matrix multiplication on FPGAs. To analyze the energy dissipation, we develop a high-level model using domain-specific modeling techniques. In this model, we identify architecture parameters that significantly affect the total energy (system-wide energy) dissipation. Then, we explore design trade-offs by varying these parameters to minimize the system-wide energy. For matrix multiplication, we consider a uniprocessor architecture and a linear array architecture to develop energy-efficient designs. For the uniprocessor architecture, the cache size is a parameter that affects the I/O complexity and the system-wide energy. For the linear array architecture, the amount of storage per processing element is a parameter affecting the system-wide energy. By using the maximum amount of storage per processing element and the minimum number of multipliers, we obtain a design that minimizes the system-wide energy. We develop several energy-efficient designs for matrix multiplication. For example, for 6×6 matrix multiplication, energy savings of up to 52% for the uniprocessor architecture and 36% for the linear array architecture are achieved over an optimized library for Virtex-II FPGA from Xilinx.
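
    The kind of parameter sweep described, varying storage per processing element and picking the point of minimum system-wide energy, can be sketched with a toy energy model. The coefficients below are illustrative placeholders, not Virtex-II measurements; only the structure of the trade-off (more local storage means fewer off-chip transfers) follows the abstract.

```python
def system_energy(n, s, e_mult=1.0, e_sram=0.05, e_io=4.0):
    """Toy system-wide energy model for an n x n matrix multiply on a
    linear array. s = words of storage per processing element; the
    energy coefficients are illustrative, not measured values."""
    multiplies = n ** 3            # fixed arithmetic work
    io_words = n ** 3 / s          # more local storage -> fewer off-chip transfers
    sram_accesses = n ** 3         # operands read from local storage
    return multiplies * e_mult + io_words * e_io + sram_accesses * e_sram

# Sweep the storage parameter to find the energy-minimizing design point.
best_s = min(range(1, 7), key=lambda s: system_energy(6, s))
```

    Under these placeholder coefficients the sweep selects the maximum storage per processing element, mirroring the qualitative conclusion stated in the abstract.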

  9. Control Activity in Support of NASA Turbine Based Combined Cycle (TBCC) Research

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Vrnak, Daniel R.; Le, Dzu K.; Ouzts, Peter J.

    2010-01-01

    Control research for a Turbine Based Combined Cycle (TBCC) propulsion system is the current focus of the Hypersonic Guidance, Navigation, and Control (GN&C) discipline team. The ongoing work at the NASA Glenn Research Center (GRC) supports the Hypersonic GN&C effort in developing tools to aid the design of control algorithms to manage a TBCC airbreathing propulsion system during a critical operating period. The critical operating period being addressed in this paper is the span when the propulsion system transitions from one cycle to another, referred to as mode transition. One such tool, that is a basic need for control system design activities, is computational models (hereafter referred to as models) of the propulsion system. The models of interest for designing and testing controllers are Control Development Models (CDMs) and Control Validation Models (CVMs). CDMs and CVMs are needed for each of the following propulsion system elements: inlet, turbine engine, ram/scram dual-mode combustor, and nozzle. This paper presents an overall architecture for a TBCC propulsion system model that includes all of the propulsion system elements. Efforts are under way, focusing on one of the propulsion system elements, to develop CDMs and CVMs for a TBCC propulsion system inlet. The TBCC inlet aerodynamic design being modeled is that of the Combined-Cycle Engine (CCE) Testbed. The CCE Testbed is a large-scale model of an aerodynamic design that was verified in a small-scale screening experiment. The modeling approach includes employing existing state-of-the-art simulation codes, developing new dynamic simulations, and performing system identification experiments on the hardware in the NASA GRC 10- by 10-Foot Supersonic Wind Tunnel. The developed CDMs and CVMs will be available for control studies prior to hardware buildup. The system identification experiments on the CCE Testbed will characterize the necessary dynamics to be represented in CDMs for control design.
These system identification models will also be the reference models to validate the CDM and CVM models. Validated models will give value to the tools used to develop the models.
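
    A system identification experiment of the kind described typically yields a low-order input/output model fit to recorded data. As a minimal sketch (not the GRC tools), a first-order discrete model y[k+1] = a*y[k] + b*u[k] can be fit by least squares:

```python
import numpy as np

def fit_first_order(u, y):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k] from recorded
    input/output data -- the kind of low-order model a system
    identification experiment yields for control design."""
    X = np.column_stack([y[:-1], u[:-1]])   # regressors at step k
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta                            # estimated (a, b)
```

    Models identified this way can then serve as reference models against which the higher-fidelity CDMs and CVMs are validated, as the record describes.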

  10. GASP-PL/I Simulation of Integrated Avionic System Processor Architectures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brent, G. A.

    1978-01-01

    A NASA-sponsored development study completed in July 1977 proposed the complete integration of all aircraft instrumentation into a single modular system. Instead of using the current single-function aircraft instruments, computers compiled and displayed in-flight information for the pilot. A processor architecture called the Team Architecture was proposed. This is a hardware/software approach to high-reliability computer systems. A follow-up study of the proposed Team Architecture is reported. GASP-PL/I simulation models are used to evaluate the operating characteristics of the Team Architecture. The problem, model development, simulation programs, and results are presented at length. Also included are program input formats, outputs, and listings.

  11. Architectural design proposal for a Martian base to continue NASA Mars Design Reference Mission

    NASA Astrophysics Data System (ADS)

    Kozicki, Janek

    The issue of extraterrestrial bases has recently attracted considerable attention. Orbital stations already exist, and humans are expected to travel to Mars around 2030. They will need stations established there that will provide proper living conditions. At first, this might be a small module brought from Earth (e.g. the NASA Mars Design Reference Mission module (DRM)); in later stages, equivalents of Earth houses may be built from local resources. The goal of this paper is to propose an architectural design for an intermediate stage: a larger habitable unit transported from Earth. It is inspired by terrestrial portable architecture ideas. A pneumatic structure requires little volume during transportation, yet provides a large habitable space after deployment. It is designed for transport by the DRM transportation module, and its deployment is comparatively quick and easy. An architectural solution analogous to a terrestrial house with a studio and a workshop was assumed. Its form was a result of technical and environmental limitations, and the need for an ergonomic interior. The spatial placement of the following zones was carefully considered: residential, agricultural and science, as well as a garage with a workshop, transportation routes, and a control and communication center. The issues of the Life Support System, energy, food, water and waste recycling were also discussed. This Martian base was designed to be crewed by a team of eight people staying on Mars for at least 1.5 years. An Open Plan architectural solution was assumed in the pneumatic modules, with a high level of modularity. Walls of standardized sizes with zip-fasteners allow free rearrangement of the interior to adapt to a new situation (e.g. damage to one of the pneumatic modules or a psychological "need of a change"). The architectural design focuses on ergonomic and psychological aspects of a longer stay in the hostile Martian environment.
This solution provides the Martian crew with a comfortable habitable space larger than the DRM modules. It is proposed to send this base in a DRM transportation module after the first successful human mission. The author of this paper hopes that this, or other similar Martian base designs, will help establish a permanent presence of humans on Mars.

  12. Mission Architecture Comparison for Human Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Geffre, Jim; Robertson, Ed; Lenius, Jon

    2006-01-01

    The Vision for Space Exploration outlines a bold new national space exploration policy that holds as one of its primary objectives the extension of human presence outward into the Solar System, starting with a return to the Moon in preparation for the future exploration of Mars and beyond. The National Aeronautics and Space Administration is currently engaged in several preliminary analysis efforts in order to develop the requirements necessary for implementing this objective in a manner that is both sustainable and affordable. Such analyses investigate various operational concepts, or mission architectures, by which humans can best travel to the lunar surface, live and work there for increasing lengths of time, and then return to Earth. This paper reports on a trade study conducted in support of NASA's Exploration Systems Mission Directorate investigating the relative merits of three alternative lunar mission architecture strategies. The three architectures use as a reference a lunar exploration campaign consisting of multiple 90-day expeditions to the Moon's polar regions, a strategy which was selected for its high perceived scientific and operational value. The first architecture discussed incorporates the lunar orbit rendezvous approach employed by the Apollo lunar exploration program. This concept has been adapted from Apollo to meet the particular demands of a long-stay polar exploration campaign while assuring the safe return of crew to Earth. Lunar orbit rendezvous is also used as the baseline against which the other alternate concepts are measured. The first such alternative, libration point rendezvous, utilizes the unique characteristics of the cislunar libration point instead of a low altitude lunar parking orbit as a rendezvous and staging node. Finally, a mission strategy which does not incorporate rendezvous after the crew ascends from the Moon is also studied.
In this mission strategy, the crew returns directly to Earth from the lunar surface, and the strategy is thus referred to as direct return. Figures of merit in the areas of safety and mission success, mission effectiveness, extensibility, and affordability are used to evaluate and compare the lunar orbit rendezvous, libration point rendezvous, and direct return architectures, and this paper summarizes the results of those assessments.

  13. A RESTful interface to pseudonymization services in modern web applications.

    PubMed

    Lablans, Martin; Borg, Andreas; Ückert, Frank

    2015-02-07

    Medical research networks rely on record linkage and pseudonymization to determine which records from different sources relate to the same patient. To establish informational separation of powers, the required identifying data are redirected to a trusted third party that has, in turn, no access to medical data. This pseudonymization service receives identifying data, compares them with a list of already reported patient records and replies with a (new or existing) pseudonym. We found existing solutions to be technically outdated, complex to implement or not suitable for internet-based research infrastructures. In this article, we propose a new RESTful pseudonymization interface tailored for use in web applications accessed by modern web browsers. The interface is modelled as a resource-oriented architecture, which is based on the representational state transfer (REST) architectural style. We translated typical use-cases into resources to be manipulated with well-known HTTP verbs. Patients can be re-identified in real-time by authorized users' web browsers using temporary identifiers. We encourage the use of PID strings for pseudonyms and the EpiLink algorithm for record linkage. As a proof of concept, we developed a Java Servlet as reference implementation. The following resources have been identified: Sessions allow data associated with a client to be stored beyond a single request while still maintaining statelessness. Tokens authorize for a specified action and thus allow the delegation of authentication. Patients are identified by one or more pseudonyms and carry identifying fields. Relying on HTTP calls alone, the interface is firewall-friendly. The reference implementation has proven to be production stable. The RESTful pseudonymization interface fits the requirements of web-based scenarios and allows building applications that make pseudonymization transparent to the user using ordinary web technology. 
The open-source reference implementation provides the web interface as well as a scientifically grounded algorithm for generating non-identifying pseudonyms.
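
    The resource model the abstract identifies (sessions, tokens, patients) can be sketched as an in-memory service. This is illustrative only: a real deployment exposes these as HTTP resources manipulated with the usual verbs, and uses probabilistic record linkage rather than the exact-match hash used here; all names are assumptions.

```python
import hashlib
import uuid

class PseudonymizationService:
    """In-memory sketch of the resource-oriented interface:
    sessions hold client state, tokens delegate authorization,
    patients map identifying fields to pseudonyms."""

    def __init__(self):
        self.sessions, self.tokens, self.patients = set(), {}, {}

    def create_session(self):                          # POST /sessions
        sid = str(uuid.uuid4())
        self.sessions.add(sid)
        return sid

    def create_token(self, sid, action="addPatient"):  # POST /sessions/{sid}/tokens
        if sid not in self.sessions:
            raise KeyError("unknown session")
        tid = str(uuid.uuid4())
        self.tokens[tid] = action                      # token authorizes one action
        return tid

    def add_patient(self, token, fields):              # POST /patients?tokenId=...
        if self.tokens.pop(token, None) != "addPatient":
            raise PermissionError("invalid or already-used token")  # single-use
        # Exact-match linkage for brevity; a real service would use
        # a fuzzy record-linkage algorithm such as EpiLink.
        key = hashlib.sha256(repr(sorted(fields.items())).encode()).hexdigest()
        return self.patients.setdefault(key, "PID-" + key[:8])  # new or existing
```

    Reporting the same identifying fields twice returns the same pseudonym, while a spent token is rejected, which is the behavior the interface is designed to guarantee.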

  14. Energy efficient low-noise neural recording amplifier with enhanced noise efficiency factor.

    PubMed

    Majidzadeh, V; Schmid, A; Leblebici, Y

    2011-06-01

    This paper presents a neural recording amplifier array suitable for large-scale integration with multielectrode arrays in very low-power microelectronic cortical implants. The proposed amplifier is one of the most energy-efficient structures reported to date, which theoretically achieves an effective noise efficiency factor (NEF) smaller than the limit that can be achieved by any existing amplifier topology, which utilizes a differential pair input stage. The proposed architecture, which is referred to as a partial operational transconductance amplifier sharing architecture, results in a significant reduction of power dissipation as well as silicon area, in addition to the very low NEF. The effect of mismatch on crosstalk between channels and the tradeoff between noise and crosstalk are theoretically analyzed. Moreover, a mathematical model of the nonlinearity of the amplifier is derived, and its accuracy is confirmed by simulations and measurements. For an array of four neural amplifiers, measurement results show a midband gain of 39.4 dB and a -3-dB bandwidth ranging from 10 Hz to 7.2 kHz. The input-referred noise integrated from 10 Hz to 100 kHz is measured at 3.5 μVrms and the power consumption is 7.92 μW from a 1.8-V supply, which corresponds to NEF = 3.35. The worst-case crosstalk and common-mode rejection ratio within the desired bandwidth are -43.5 dB and 70.1 dB, respectively, and the active silicon area of each amplifier is 256 μm × 256 μm in 0.18-μm complementary metal-oxide semiconductor technology.
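
    The reported NEF can be cross-checked from the measured figures. Assuming the standard definition NEF = V_ni,rms * sqrt(2*I_tot / (pi * U_T * 4kT * BW)), with the total supply current taken from the reported 7.92 μW at 1.8 V and the 7.2 kHz bandwidth (these interpretation choices are mine, not stated in the abstract):

```python
import math

def nef(v_ni_rms, i_tot, bw, U_T=0.02585, four_kT=1.656e-20):
    """Noise efficiency factor (standard definition):
    NEF = Vni * sqrt(2*Itot / (pi * U_T * 4kT * BW))."""
    return v_ni_rms * math.sqrt(2 * i_tot / (math.pi * U_T * four_kT * bw))

i_tot = 7.92e-6 / 1.8              # total supply current implied by the power
value = nef(3.5e-6, i_tot, 7.2e3)  # comes out close to the reported 3.35
```

    With these assumptions the computed value lands within a few percent of the stated NEF of 3.35, consistent with the abstract's numbers.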

  15. Generic Safety Requirements for Developing Safe Insulin Pump Software

    PubMed Central

    Zhang, Yi; Jetley, Raoul; Jones, Paul L; Ray, Arnab

    2011-01-01

    Background The authors previously introduced a highly abstract generic insulin infusion pump (GIIP) model that identified common features and hazards shared by most insulin pumps on the market. The aim of this article is to extend our previous work on the GIIP model by articulating safety requirements that address the identified GIIP hazards. These safety requirements can be validated by manufacturers, and may ultimately serve as a safety reference for insulin pump software. Together, these two publications can serve as a basis for discussing insulin pump safety in the diabetes community. Methods In our previous work, we established a generic insulin pump architecture that abstracts functions common to many insulin pumps currently on the market and near-future pump designs. We then carried out a preliminary hazard analysis based on this architecture that included consultations with many domain experts. Further consultation with domain experts resulted in the safety requirements used in the modeling work presented in this article. Results Generic safety requirements for the GIIP model are presented, as appropriate, in parameterized format to accommodate clinical practices or specific insulin pump criteria important to safe device performance. Conclusions We believe that there is considerable value in having the diabetes, academic, and manufacturing communities consider and discuss these generic safety requirements. We hope that the communities will extend and revise them, make them more representative and comprehensive, experiment with them, and use them as a means for assessing the safety of insulin pump software designs. One potential use of these requirements is to integrate them into model-based engineering (MBE) software development methods. 
We believe, based on our experiences, that implementing safety requirements using MBE methods holds promise in reducing design/implementation flaws in insulin pump development and evolutionary processes, therefore improving overall safety of insulin pump software. PMID:22226258
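
    A parameterized safety requirement of the kind described can be made executable directly, which is one way to integrate it into model-based engineering workflows. The parameter names and limit values below are illustrative assumptions, not requirements from the GIIP model.

```python
from dataclasses import dataclass

@dataclass
class SafetyParams:
    """Clinically configured parameters for a generic requirement
    (names and default values are illustrative only)."""
    max_bolus_units: float = 10.0
    max_basal_rate_u_per_h: float = 3.0

def check_bolus_request(requested_units, params):
    """Generic requirement sketch: the pump shall not deliver a bolus
    larger than the configured maximum; clamp the dose and raise an
    alarm instead of delivering it."""
    if requested_units > params.max_bolus_units:
        return params.max_bolus_units, "ALARM: bolus limited"
    return requested_units, "OK"
```

    Keeping the limits in a parameter object mirrors the article's parameterized format: manufacturers or clinicians instantiate the values, while the requirement logic stays generic.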

  16. Noise performance limits of advanced x-ray imagers employing poly-Si-based active pixel architectures

    NASA Astrophysics Data System (ADS)

    Koniczek, Martin; El-Mohri, Youcef; Antonuk, Larry E.; Liang, Albert; Zhao, Qihua; Jiang, Hao

    2011-03-01

    A decade after the clinical introduction of active matrix, flat-panel imagers (AMFPIs), the performance of this technology continues to be limited by the relatively large additive electronic noise of these systems - resulting in significant loss of detective quantum efficiency (DQE) under conditions of low exposure or high spatial frequencies. An increasingly promising approach for overcoming such limitations involves the incorporation of in-pixel amplification circuits, referred to as active pixel architectures (AP) - based on low-temperature polycrystalline silicon (poly-Si) thin-film transistors (TFTs). In this study, a methodology for theoretically examining the limiting noise and DQE performance of circuits employing 1-stage in-pixel amplification is presented. This methodology involves sophisticated SPICE circuit simulations along with cascaded systems modeling. In these simulations, a device model based on the RPI poly-Si TFT model is used with additional controlled current sources corresponding to thermal and flicker (1/f) noise. From measurements of transfer and output characteristics (as well as current noise densities) performed upon individual, representative poly-Si TFT test devices, model parameters suitable for these simulations are extracted. The input stimuli and operating-point-dependent scaling of the current sources are derived from the measured current noise densities (for flicker noise), or from fundamental equations (for thermal noise). Noise parameters obtained from the simulations, along with other parametric information, are input to a cascaded systems model of an AP imager design to provide estimates of DQE performance. In this paper, this method of combining circuit simulations and cascaded systems analysis to predict the lower limits on additive noise (and upper limits on DQE) for large area AP imagers with signal levels representative of those generated at fluoroscopic exposures is described, and initial results are reported.
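
    The noise-source construction the abstract describes, a thermal term from fundamental equations plus a flicker term scaled from measured densities, can be sketched numerically. The coefficient values below are placeholders standing in for parameters extracted from measurements, not data from the study.

```python
import numpy as np

def noise_current_psd(f, gm, kf=1e-22, af=1.0, T=300.0):
    """Operating-point-dependent current-noise PSD (A^2/Hz): a thermal
    term 4kT*(2/3)*gm from fundamental equations plus a flicker term
    kf / f**af fit to measured noise densities (kf, af are placeholders)."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return 4 * k * T * (2.0 / 3.0) * gm + kf / f ** af

# Integrate the PSD over the band of interest to obtain the rms noise
# current that the simulated controlled current sources must reproduce.
f = np.logspace(0, 5, 501)                 # 1 Hz to 100 kHz
psd = noise_current_psd(f, gm=1e-4)
i_rms = np.sqrt(np.sum(0.5 * (psd[1:] + psd[:-1]) * np.diff(f)))
```

    At low frequencies the flicker term dominates and the PSD falls as 1/f; at high frequencies the flat thermal floor takes over, which is the qualitative shape such a current source is built to reproduce.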

  17. Systems Architecture for a Nationwide Healthcare System.

    PubMed

    Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio

    2015-01-01

    To provide information technology support at the national level, the Nationwide Integrated Healthcare System in Uruguay requires a model of Information Systems Architecture. The system has multiple healthcare providers (public and private) and a strong component of supplementary services; the data processing system should therefore have an architecture that reflects this fact, while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, and technical advice provided by AGESIC. It is the outcome of research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as research done by the Salud.uy team since 2013.

  18. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
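
    As a toy illustration of the patent's idea of a data definition language that creates a table schema for the data model's entities and relationships, the sketch below uses SQLite; the entity names (building, simulation) and columns are hypothetical, not taken from the patent:

```python
import sqlite3

# Hypothetical mini-schema: two data entities and one entity relationship
# (a simulation belongs to a building), expressed as DDL.
DDL = """
CREATE TABLE building (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE simulation (
    id INTEGER PRIMARY KEY,
    building_id INTEGER NOT NULL REFERENCES building(id),
    kind TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO building (id, name) VALUES (1, 'HQ')")
conn.execute("INSERT INTO simulation (building_id, kind) VALUES (1, 'energy')")
# Data management services / APIs would sit on top of queries like this one.
rows = conn.execute(
    "SELECT b.name, s.kind FROM simulation s JOIN building b ON s.building_id = b.id"
).fetchall()
print(rows)  # [('HQ', 'energy')]
```

    In the patented platform, data management services and Web services would wrap such schema-backed storage rather than expose SQL directly.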

  19. Motion camera based on a custom vision sensor and an FPGA architecture

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel

    1998-09-01

    A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC, which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation, and the FPGA architecture used in the motion camera system.
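
    The time-of-travel computation the FPGA performs can be illustrated in a few lines: given event-address records (pixel index, timestamp) for a moving edge, velocity follows from the pixel pitch divided by the inter-event interval. This is a simplified sketch of the idea, not the FPGA implementation, and the 10 µm pitch is an assumed value:

```python
def velocity_from_events(events, pitch_um=10.0):
    """Estimate edge velocity from event-address records.
    events: list of (pixel_index, timestamp_s) for an edge crossing adjacent pixels.
    Returns one velocity estimate (um/s, sign gives direction) per adjacent pair."""
    vels = []
    for (x0, t0), (x1, t1) in zip(events, events[1:]):
        if t1 != t0:  # time-of-travel between adjacent pixel events
            vels.append((x1 - x0) * pitch_um / (t1 - t0))
    return vels

# An edge crossing pixels 0, 1, 2 at 0.25 s intervals -> constant 40 um/s.
print(velocity_from_events([(0, 0.0), (1, 0.25), (2, 0.5)]))  # [40.0, 40.0]
```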

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna; Bly, Aaron

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses considerations for building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to provide the data in an easy, quick, and reliable manner. A common method is to create a "one stop shop" application that users can go to for all the data they need. This leads to the need for a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, it was identified that it would be very beneficial to the industry to support a research effort focused on data analytics. It was suggested that the effort develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics.

  1. A modular architecture for transparent computation in recurrent neural networks.

    PubMed

    Carmantini, Giovanni S; Beim Graben, Peter; Desroches, Mathieu; Rodrigues, Serafim

    2017-01-01

    Computation is classically studied in terms of automata, formal languages and algorithms; yet the relation between neural dynamics and symbolic representations and operations is still unclear in traditional eliminative connectionism. We therefore suggest a unique perspective on this central issue, which we refer to as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, showing that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vectorial space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, where data, symbolic operations and their control are not only distinguishable in activation space, but also spatially localizable in the network itself, while maintaining a distributed encoding of symbolic representations. The resulting networks simulate automata in real time and are programmed directly, without network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a central pattern generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences as investigated in psycholinguistics experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.
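
    The Gödelization step mentioned above maps symbol sequences onto points of a vector space. A minimal sketch of the general idea (a base-|alphabet| expansion of a one-sided sequence, with the shift acting as multiplication modulo 1), not the paper's exact construction:

```python
def goedelize(symbols, alphabet):
    """Map a one-sided symbol sequence to a point in [0, 1) via a
    base-|alphabet| expansion -- the Goedel encoding that turns symbolic
    dynamics into dynamics on a vectorial space."""
    b = len(alphabet)
    index = {a: i for i, a in enumerate(alphabet)}
    return sum(index[s] * b ** -(i + 1) for i, s in enumerate(symbols))

x = goedelize("ba", "ab")          # 'b' -> 1*2^-1, 'a' -> 0*2^-2  =>  0.5
shifted = (len("ab") * x) % 1.0    # the shift map acts as x -> b*x mod 1
```

    Here `shifted` equals `goedelize("a", "ab")`, i.e., dropping the first symbol of the sequence corresponds to one step of the piecewise-linear map on the encoded space.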

  2. Opening up Architectures of Software-Intensive Systems: A Functional Decomposition to Support System Comprehension

    DTIC Science & Technology

    2007-10-01

    [Extraction residue from the report's table of contents and list of figures; recoverable entries: Architecture, p. 14; Figure 2, Eclipse Java Model, p. 16; Figure 3, Eclipse Java Model at the Source Code Level, p. 24; Figure 9, Java Source Code.]

  3. Assessment of IT solutions used in the Hungarian income tax microsimulation system

    NASA Astrophysics Data System (ADS)

    Molnar, I.; Hardhienata, S.

    2017-01-01

    This paper focuses on the use of information technology (IT) in diverse microsimulation studies and presents state-of-the-art solutions in the traditional application field of personal income tax simulation. The aim of the paper is to promote solutions which can improve the efficiency and quality of microsimulation model implementation, to assess their applicability, and to help shift attention from microsimulation model implementation and data analysis towards experiment design and model use. First, the authors briefly discuss the relevant characteristics of the microsimulation application field and the managerial decision-making problem. After examination of the salient problems, advanced IT solutions, such as meta-databases and service-oriented architecture, are presented. The authors show how selected technologies can be applied to support data-driven, behavior-driven and even agent-based personal income tax microsimulation model development. Finally, examples are presented and references made to the Hungarian Income Tax Simulator (HITS) models and their results. The paper concludes with a summary of the IT assessment and application-related remarks dedicated to an Indonesian income tax microsimulation model.
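
    The core loop of a static tax microsimulation of this kind is simple: apply the rule set to each micro-record, then aggregate. A deliberately minimal sketch with a hypothetical two-bracket schedule (not the Hungarian rules):

```python
def tax(income, brackets):
    """Progressive tax on a single record. brackets is a list of
    (upper_threshold, marginal_rate) pairs in ascending order, ending
    with float('inf'). The schedule used below is purely illustrative."""
    due, prev = 0.0, 0.0
    for threshold, rate in brackets:
        due += max(0.0, min(income, threshold) - prev) * rate
        prev = threshold
    return due

def microsimulate(incomes, brackets):
    """Static microsimulation: apply the rule to every micro-record, aggregate."""
    return sum(tax(i, brackets) for i in incomes)

BRACKETS = [(10000.0, 0.1), (float("inf"), 0.3)]  # hypothetical schedule
```

    Behavior-driven or agent-based variants replace the fixed incomes with responses to the schedule itself, but the record-by-record structure stays the same.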

  4. System design for 3D wound imaging using low-cost mobile devices

    NASA Astrophysics Data System (ADS)

    Sirazitdinova, Ekaterina; Deserno, Thomas M.

    2017-03-01

    The state-of-the-art method of wound assessment is a manual, imprecise and time-consuming procedure. Performed by clinicians, it has limited reproducibility and accuracy, large time consumption and high costs. Novel technologies such as laser scanning microscopy, multi-photon microscopy, optical coherence tomography and hyperspectral imaging, as well as devices relying on structured-light sensors, make accurate wound assessment possible. However, such methods have limitations due to high costs and may lack portability and availability. In this paper, we present a low-cost wound assessment system and architecture for fast and accurate cutaneous wound assessment using inexpensive consumer smartphones. Computer vision techniques are applied either on the device or the server to reconstruct wounds in 3D as dense models, which are generated from images taken with the built-in single camera of a smartphone. The system architecture includes imaging (smartphone), processing (smartphone or PACS) and storage (PACS) devices. It supports tracking over time by alignment of 3D models, color correction using a reference color card placed in the scene, and automatic segmentation of wound regions. Using our system, we are able to detect and document quantitative characteristics of chronic wounds, including size, depth, volume and rate of healing, as well as qualitative characteristics such as color, presence of necrosis and type of involved tissue.
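
    The color-correction step against a reference color card can be sketched as a per-channel least-squares gain fit. The real system likely uses a fuller correction (e.g., a 3x3 matrix), so this is only an illustration of the principle:

```python
def channel_gains(measured, reference):
    """Per-channel least-squares gain g minimizing sum (g*m - r)^2 over the
    color-card patches; closed form g = sum(m*r) / sum(m*m). A simplified
    stand-in for a full 3x3 color-correction matrix."""
    gains = []
    for c in range(3):
        num = sum(m[c] * r[c] for m, r in zip(measured, reference))
        den = sum(m[c] * m[c] for m in measured)
        gains.append(num / den)
    return gains

def correct(pixel, gains):
    """Apply the fitted gains to one RGB pixel, clamped to 8-bit range."""
    return tuple(min(255, round(v * g)) for v, g in zip(pixel, gains))

# Two card patches as measured vs. their known reference colors:
g = channel_gains([(100, 100, 100), (200, 50, 40)],
                  [(110, 90, 120), (220, 45, 48)])
```

    With the gains fitted once per image, every wound pixel is corrected before segmentation, which makes color comparable across visits and lighting conditions.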

  5. Digital Architecture Planning Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Al Rashdan, Ahmad Yahya Mohammad; Bly, Aaron Douglas

    As part of the U.S. Department of Energy’s Light Water Reactor Sustainability Program, the Digital Architecture (DA) Project focuses on providing a model that nuclear utilities can refer to when planning deployment of advanced technologies. The digital architecture planning model (DAPM) is the methodology for mapping power plant operational and support activities into a DA that unifies all data sources needed by the utilities to operate their plants. The DA is defined as a collection of information technology capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for performance improvements of nuclear power plants. The DA can be thought of as integration of the separate instrumentation and control and information systems already in place in nuclear power plants, which are brought together for the purpose of creating new levels of automation in plant work activities. A major objective in DAPM development was to survey all key areas that needed to be reviewed in order for a utility to make knowledgeable decisions regarding needs and plans to implement a DA at the plant. The development was done in two steps. First, researchers surveyed the nuclear industry in order to learn their near-term plans for adopting new advanced capabilities and implementing a network (i.e., wireless and wired) infrastructure throughout the plant, including the power block. Second, a literature review covering regulatory documents, industry standards, and technical research reports and articles was conducted. The objective of the review was to identify key areas to be covered by the DAPM, which included the following: 1. The need for a DA and its benefits to the plant 2. Resources required to implement the DA 3. Challenges that need to be addressed and resolved to implement the DA 4. Roles and responsibilities of the DA implementation plan. The DAPM was developed based on results from the survey and the literature review. Model development, including the survey results and conclusions made about the key areas during the literature review, is described in this report.

  6. French Influence on Portuguese Architects in the Age of Enlightenment

    NASA Astrophysics Data System (ADS)

    Sampayo, Mafalda

    2017-10-01

    This investigation shows the European influence on the work of Portuguese architects of the Enlightenment period. Based on previous studies, we focus our attention on the design of the “Praça do Comércio” square and on the hypothesis that it was based on the French royal square. We demonstrate that the design of Lisbon from the second half of the eighteenth century was influenced by the theories and best practices of the time. We also confirm that the architect Eugénio dos Santos e Carvalho, a member of the reconstruction team for the Baixa, had in his personal library several reference books of French architectural practice that certainly influenced his architecture. The plans for the main square of Lisbon’s lower city, “Praça do Comércio”, can be compared to the “Place de Nos Conquêtes”, predecessor of the “Place Vendôme”, in its design, architecture and dimensions. This research analysed the cartography and iconography of Lisbon’s reconstruction. In particular, the drawings of “Praça do Comércio” and “Place de Nos Conquêtes” were exhaustively studied. The comparative study of the elements of both squares led to the conclusion that the Portuguese square presents many aspects of the French Age of Enlightenment, and in particular those featured in the “Place de Nos Conquêtes”. This paper concludes that the Portuguese urban design and architectural projects of the 18th century are the result of previous knowledge in which it was always possible to articulate the vernacular with academic design, and where many different influences left their mark on the culture of the period. The plans for the lower part of Lisbon display a mixture of references that relate to the architectural and urban planning traditions of Portuguese military engineering and contemporary French urban planning.

  7. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.

  8. Architectural assessment of rhesus macaque pelvic floor muscles: comparison for use as a human model.

    PubMed

    Stewart, Amanda M; Cook, Mark S; Esparza, Mary C; Slayden, Ov D; Alperin, Marianna

    2017-10-01

    Animal models are essential to further our understanding of the independent and combined function of human pelvic floor muscles (PFMs), as direct studies in women are limited. To assure suitability of the rhesus macaque (RM), we compared RM and human PFM architecture, the strongest predictor of muscle function. We hypothesized that relative to other models, RM best resembles human PFM. Major architectural parameters of cadaveric human coccygeus, iliococcygeus, and pubovisceralis (pubococcygeus + puborectalis) and corresponding RM coccygeus, iliocaudalis, and pubovisceralis (pubovaginalis + pubocaudalis) were compared using 1- and 2-way analysis of variance (ANOVA) with post hoc testing. Architectural difference index (ADI), a combined measure of functionally relevant structural parameters predictive of length-tension, force-generation, and excursional muscle properties was used to compare PFMs across RM, rabbit, rat, and mouse. RM and human PFMs were similar with respect to architecture. However, the magnitude of similarity varied between individual muscles, with the architecture of the most distinct RM PFM, iliocaudalis, being well suited for quadrupedal locomotion. Except for the pubovaginalis, RM PFMs inserted onto caudal vertebrae, analogous to all tailed animals. Comparison of the PFM complex architecture across species revealed the lowest, thus closest to human, ADI for RM (1.9), followed by rat (2.0), mouse (2.6), and rabbit (4.7). Overall, RM provides the closest architectural representation of human PFM complex among species examined; however, differences between individual PFMs should be taken into consideration. As RM is closely followed by rat with respect to PFM similarity with humans, this less-sentient and substantially cheaper model is a good alternative for PFM studies.
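
    A combined index of this kind can be sketched as a distance over human-normalized architectural parameters (e.g., fiber length, physiological cross-sectional area, sarcomere length). This illustrates how such an index ranks candidate models; it is not the paper's exact ADI formula:

```python
import math

def architectural_difference(model_params, human_params):
    """Euclidean distance of muscle architectural parameters after normalizing
    each by its human value; smaller means architecturally closer to human.
    An illustrative analog of a combined difference index, not the published ADI."""
    return math.sqrt(sum((m / h - 1.0) ** 2
                         for m, h in zip(model_params, human_params)))
```

    Ranking several species by this distance against the same human reference reproduces the comparison logic used to place rhesus macaque ahead of rat, mouse and rabbit.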

  9. A Methodology For Developing an Agent Systems Reference Architecture

    DTIC Science & Technology

    2010-05-01

    agent frameworks, we create an abstraction noting similarities and differences. The differences are documented as points of variation. The result...situated in the physical environment. Addressing how conceptual components of an agent system is beneficial to agent system architects, developers, and

  10. A bibliography on parallel and vector numerical algorithms

    NASA Technical Reports Server (NTRS)

    Ortega, James M.; Voigt, Robert G.; Romine, Charles H.

    1988-01-01

    This is a bibliography on numerical methods. It also includes a number of other references on machine architecture, programming language, and other topics of interest to scientific computing. Certain conference proceedings and anthologies which have been published in book form are also listed.

  11. A bibliography on parallel and vector numerical algorithms

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.; Voigt, R. G.

    1987-01-01

    This is a bibliography of numerical methods. It also includes a number of other references on machine architecture, programming language, and other topics of interest to scientific computing. Certain conference proceedings and anthologies which have been published in book form are listed also.

  12. A bibliography on parallel and vector numerical algorithms

    NASA Technical Reports Server (NTRS)

    Ortega, James M.; Voigt, Robert G.; Romine, Charles H.

    1990-01-01

    This is a bibliography on numerical methods. It also includes a number of other references on machine architecture, programming language, and other topics of interest to scientific computing. Certain conference proceedings and anthologies which have been published in book form are also listed.

  13. MultiLIS: A Description of the System Design and Operational Features.

    ERIC Educational Resources Information Center

    Kelly, Glen J.; And Others

    1988-01-01

    Describes development, hardware requirements, and features of the MultiLIS integrated library software package. A system profile provides pricing information, operational characteristics, and technical specifications. Sidebars discuss MultiLIS integration structure, incremental architecture, and NCR Tower Computers. (4 references) (MES)

  14. David Goldwasser | NREL

    Science.gov Websites

    Prior to joining NREL, David worked in architectural design, 3D modeling, and interactive media. He consulted for Google on 3D modeling tools and worked in Colorado on sustainable architecture projects.

  15. LTSA Conformance Testing to Architectural Design of LMS Using Ontology

    ERIC Educational Resources Information Center

    Sengupta, Souvik; Dasgupta, Ranjan

    2017-01-01

    This paper proposes a new methodology for checking conformance of the software architectural design of Learning Management System (LMS) to Learning Technology System Architecture (LTSA). In our approach, the architectural designing of LMS follows the formal modeling style of Acme. An ontology is built to represent the LTSA rules and the software…

  16. Forecast analysis of optical waveguide bus performance

    NASA Technical Reports Server (NTRS)

    Ledesma, R.; Rourke, M. D.

    1979-01-01

    Elements to be considered in the design of a data bus include: architecture; data rate; modulation, encoding and detection; power distribution requirements; protocol and word structure; bus reliability and maintainability; interterminal transmission medium; cost; and others specific to the application. Fiber-optic data bus considerations for a 32-port transmissive star architecture are discussed in a tutorial format. General optical-waveguide bus concepts are reviewed. The electrical and optical performance of a 32-port transmissive star bus, and the effects of temperature on the performance of optical-waveguide buses, are examined. A bibliography of pertinent references and the bus receiver test results are included.
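
    One element of such a forecast is the optical power budget: an N-port transmissive star divides input power N ways, so the splitting loss alone is 10*log10(N) dB. A sketch with assumed (illustrative) excess and connector losses:

```python
import math

def star_bus_loss_db(n_ports, excess_db=3.0, connector_db=1.0):
    """Port-to-port optical loss of an N-port transmissive star coupler:
    ideal 1/N splitting loss plus an assumed coupler excess loss and two
    connector losses (transmit and receive). The default loss values are
    illustrative assumptions, not figures from the report."""
    splitting_db = 10.0 * math.log10(n_ports)
    return splitting_db + excess_db + 2 * connector_db
```

    For the 32-port star considered here, the splitting loss alone is about 15.1 dB, which, together with excess and connector losses, sets the receiver sensitivity margin.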

  17. Semitransparent organic photovoltaic modules with Ag nanowire top electrodes

    NASA Astrophysics Data System (ADS)

    Guo, Fei; Kubis, Peter; Przybilla, Thomas; Spiecker, Erdmann; Forberich, Karen; Brabec, Christoph J.

    2014-10-01

    Semitransparent organic photovoltaic (OPV) cells are promising for applications in transparent architectures where their opaque counterparts are not suitable. Manufacturing large-area modules without performance losses compared to lab-scale devices is a key step towards practical applications of this PV technology. In this paper, we report the use of solution-processed silver nanowires as top electrodes and fabricate semitransparent OPV modules based on ultra-fast laser scribing. Through a rational choice of device architecture in combination with high-precision laser patterning, we demonstrate efficient semitransparent modules with performance comparable to that of the reference devices.

  18. Architecture Studies for Commercial Production of Propellants From the Lunar Poles

    NASA Astrophysics Data System (ADS)

    Duke, Michael B.; Diaz, Javier; Blair, Brad R.; Oderman, Mark; Vaucher, Marc

    2003-01-01

    Two architectures are developed that could be used to convert water held in regolith deposits within permanently shadowed lunar craters into propellant for use in near-Earth space. In particular, the model has been applied to an analysis of the commercial feasibility of using lunar-derived propellant to convey payloads from low Earth orbit to geosynchronous Earth orbit. Production and transportation system masses were estimated for each architecture, and a cost analysis was performed using the NAFCOM cost model. Data from the cost model were analyzed using a financial analysis tool reported in a companion paper (Lamassoure et al., 2002) to determine under what conditions the architectures might be commercially viable. Analysis of the architectural assumptions is used to identify the principal areas for further research, which include technological development of lunar mining and water extraction systems, power systems, reusable space transportation systems, and orbital propellant depots. The architectures and their commercial viability are sensitive to the assumed concentration of ice in the lunar deposits, suggesting that further lunar exploration to determine whether higher-grade deposits exist would be economically justified.
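
    A first-order check on such economics is the propellant required for the LEO-to-GEO transfer itself, via the Tsiolkovsky rocket equation. The Isp and delta-v values below are typical textbook assumptions, not figures from the study, and stage dry mass is ignored, so this is a lower bound:

```python
import math

def propellant_mass(payload_kg, delta_v, isp=450.0, g0=9.80665):
    """Propellant needed to push a payload through delta_v (m/s) with an
    LOX/LH2-class engine (Isp assumed 450 s), by the Tsiolkovsky rocket
    equation. Dry mass of the stage is ignored -- an illustrative lower bound."""
    mass_ratio = math.exp(delta_v / (isp * g0))
    return payload_kg * (mass_ratio - 1.0)
```

    With roughly 3,900 m/s from LEO to GEO, moving a 1,000 kg payload takes on the order of 1.4 t of propellant, which is the quantity a lunar-derived supply chain would have to beat on delivered cost.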

  19. DigR: a generic model and its open source simulation software to mimic three-dimensional root-system architecture diversity.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Griffon, Sébastien; Jourdan, Christophe

    2018-04-18

    Many studies exist in the literature dealing with mathematical representations of root systems, categorized, for example, as pure structure descriptions, partial differential equations or functional-structural plant models. However, in these studies, root architecture modelling has seldom been carried out at the organ level with the inclusion of environmental influences that can be integrated into a whole-plant characterization. We have conducted a multidisciplinary study on root systems including field observations, architectural analysis, and formal and mathematical modelling. This integrative and coherent approach leads to a generic model (DigR) and its software simulator. Architectural analysis applied to root systems supports root type classification and architectural unit design for each species. Roots belonging to a particular type share dynamic and morphological characteristics which consist of topological and geometric features. The DigR simulator is integrated into the Xplo environment, with a user interface to input parameter values and make output ready for dynamic 3-D visualization, statistical analysis and saving to standard formats. DigR is simulated in a quasi-parallel computing algorithm and may be used either as a standalone tool or integrated into other simulation platforms. The software is open-source and free to download at http://amapstudio.cirad.fr/soft/xplo/download. DigR is based on three key points: (1) a root-system architectural analysis, (2) root type classification and modelling and (3) a restricted set of 23 root type parameters with flexible values indexed in terms of root position. The genericity and botanical accuracy of the model is demonstrated for growth, branching, mortality and reiteration processes, and for different root architectures. Plugin examples demonstrate the model's versatility at simulating plastic responses to environmental constraints. Outputs of the model include diverse root system structures such as tap-root, fasciculate, tuberous, nodulated and clustered root systems. DigR is based on plant architecture analysis, which leads to specific root type classification and organization that are directly linked to field measurements. The open-source simulator of the model has been included within a user-friendly environment. DigR's accuracy and versatility are demonstrated through growth simulations of complex root systems for both annual and perennial plants.
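
    The type-based branching idea behind such simulators can be caricatured in a few lines: each root elongates and recursively bears shorter lateral roots. A toy generator under assumed parameters (branching number, length decay), not DigR's 23-parameter formulation:

```python
def grow(depth, n_children=2, length=1.0, decay=0.5):
    """Toy recursive root-architecture generator: returns the list of segment
    lengths of a root bearing `n_children` laterals per order, each `decay`
    times shorter, down to the given branching depth. Parameters are
    illustrative, not DigR's root type parameters."""
    segments = [length]
    if depth > 0:
        for _ in range(n_children):
            segments += grow(depth - 1, n_children, length * decay, decay)
    return segments

segments = grow(depth=3)  # a tap-root-like system, 3 branching orders deep
```

    Varying the branching number and decay per root type is what lets a generic model span tap-root, fasciculate and clustered architectures from one rule set.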

  20. Architectural Implications for Spatial Object Association Algorithms*

    PubMed Central

    Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste

    2013-01-01

    Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
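
    At its core, a positional crossmatch pairs objects lying within a matching radius of one another. A naive O(n*m) planar sketch of the operation the evaluated systems parallelize (real sky crossmatch uses angular separation and spatial indexing):

```python
def crossmatch(catalog_a, catalog_b, radius):
    """Pair each object in catalog_a with every object in catalog_b within
    `radius`, assuming a common planar coordinate system. Brute force for
    clarity; database implementations push this join into indexed storage."""
    r2 = radius * radius
    matches = []
    for ia, (xa, ya) in enumerate(catalog_a):
        for ib, (xb, yb) in enumerate(catalog_b):
            if (xa - xb) ** 2 + (ya - yb) ** 2 <= r2:
                matches.append((ia, ib))
    return matches
```

    How this all-pairs distance join is partitioned across disks or cluster nodes is precisely where the three database architectures compared in the paper differ.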
