The Action Execution Process Implemented in Different Cognitive Architectures: A Review
NASA Astrophysics Data System (ADS)
Dong, Daqi; Franklin, Stan
2014-12-01
An agent achieves its goals by interacting with its environment, cyclically choosing and executing suitable actions. The action execution process is a critical part of a complete cognitive architecture, because the generation of executable motor commands is driven not only by low-level environmental information but is also initiated and affected by the agent's high-level mental processes. This review focuses on cognitive models of action, or more specifically, of the action execution process, as implemented in a set of popular cognitive architectures. We examine the representations and procedures inside the action execution process, as well as the cooperation between action execution and other high-level cognitive modules. We finally conclude with some general observations regarding the nature of action execution.
On the Inevitable Intertwining of Requirements and Architecture
NASA Astrophysics Data System (ADS)
Sutcliffe, Alistair
The chapter investigates the relationship between architecture and requirements, arguing that architectural issues need to be addressed early in the RE process. Three trends are driving architectural implications for RE: the growth of intelligent, context-aware, and adaptable systems. First, the relationship between architecture and requirements is considered from a theoretical viewpoint of problem frames and abstract conceptual models. The relationships between architectural decisions and non-functional requirements are reviewed, and then the impact of architecture on the RE process is assessed using a case study of developing configurable, semi-intelligent software to support medical researchers in e-science domains.
Hadoop-based implementation of processing medical diagnostic records for visual patient system
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo
2018-03-01
We introduced the Visual Patient (VP) concept and a method to visually represent and index patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017), which can enable a doctor to review a large amount of a patient's IDR in a limited appointed time slot. Here we present a new approach to designing the data processing architecture of the VP system (VPS) to acquire, process, and store various kinds of IDR and build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. We designed this system architecture, called the Medical Information Processing System (MIPS), as a combination of the Hadoop batch processing architecture and the Storm stream processing architecture. The MIPS implements highly efficient parallel processing of various kinds of clinical data coming from disparate hospital information systems such as PACS, RIS, LIS, and HIS.
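The MIPS design described above combines a Hadoop-style batch layer with a Storm-style stream layer. As a rough illustration of that split (not the actual MIPS code; all class and field names are invented for this sketch), the following Python fragment routes diagnostic records into a periodically rebuilt batch view and an incrementally updated real-time view:

```python
# Illustrative sketch only: a toy split between batch and stream handling of
# diagnostic records, loosely mirroring the MIPS idea of combining Hadoop-style
# batch processing with Storm-style stream processing. All names are hypothetical.

from dataclasses import dataclass, field
from typing import List


@dataclass
class DiagnosticRecord:
    patient_id: str
    source: str      # e.g. "PACS", "RIS", "LIS", "HIS"
    payload: dict


@dataclass
class VisualPatientIndex:
    """Aggregated per-patient view built from many diagnostic records."""
    batch_view: dict = field(default_factory=dict)      # rebuilt periodically
    realtime_view: dict = field(default_factory=dict)   # updated per event


def batch_rebuild(records: List[DiagnosticRecord]) -> dict:
    """Hadoop-like pass: recompute the full per-patient index from all records."""
    view: dict = {}
    for rec in records:
        view.setdefault(rec.patient_id, []).append((rec.source, rec.payload))
    return view


def stream_update(index: VisualPatientIndex, rec: DiagnosticRecord) -> None:
    """Storm-like pass: fold a single new record into the real-time view."""
    index.realtime_view.setdefault(rec.patient_id, []).append((rec.source, rec.payload))


if __name__ == "__main__":
    history = [DiagnosticRecord("p001", "PACS", {"study": "CT chest"})]
    index = VisualPatientIndex(batch_view=batch_rebuild(history))
    stream_update(index, DiagnosticRecord("p001", "LIS", {"test": "CBC"}))
    print(index.batch_view, index.realtime_view)
```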
The development of a post-test diagnostic system for rocket engines
NASA Technical Reports Server (NTRS)
Zakrajsek, June F.
1991-01-01
An effort was undertaken by NASA to develop an automated post-test, post-flight diagnostic system for rocket engines. The automated system is designed to be generic and to automate the rocket engine data review process. A modular, distributed architecture with a generic software core was chosen to meet the design requirements. The diagnostic system is initially being applied to the Space Shuttle Main Engine data review process. The system modules currently under development are the session/message manager, and portions of the applications section, the component analysis section, and the intelligent knowledge server. An overview is presented of a rocket engine data review process, the design requirements and guidelines, the architecture and modules, and the projected benefits of the automated diagnostic system.
Loya, Salvador Rodriguez; Kawamoto, Kensaku; Chatwin, Chris; Huser, Vojtech
2014-12-01
The use of a service-oriented architecture (SOA) has been identified as a promising approach for improving health care by facilitating reliable clinical decision support (CDS). A review of the literature through October 2013 identified 44 articles on this topic. The review suggests that SOA-related technologies such as Business Process Model and Notation (BPMN) and Service Component Architecture (SCA) have not been generally adopted to impact health IT systems' performance for better care solutions. Additionally, technologies such as Enterprise Service Bus (ESB) and architectural approaches like Service Choreography have not been generally exploited among researchers and developers. Based on the experience of other industries and our observation of the evolution of SOA, we found that greater use of these approaches has the potential to significantly impact SOA implementations for CDS.
The dynamic relationship between plant architecture and competition
Ford, E. David
2014-01-01
In this review, structural and functional changes are described in single-species, even-aged, stands undergoing competition for light. Theories of the competition process as interactions between whole plants have been advanced but have not been successful in explaining these changes and how they vary between species or growing conditions. This task now falls to researchers in plant architecture. Research in plant architecture has defined three important functions of individual plants that determine the process of canopy development and competition: (i) resource acquisition plasticity; (ii) morphogenetic plasticity; (iii) architectural variation in efficiency of interception and utilization of light. In this review, this research is synthesized into a theory for competition based on five groups of postulates about the functioning of plants in stands. Group 1: competition for light takes place at the level of component foliage and branches. Group 2: the outcome of competition is determined by the dynamic interaction between processes that exert dominance and processes that react to suppression. Group 3: species differences may affect both exertion of dominance and reaction to suppression. Group 4: individual plants may simultaneously exhibit, in different component parts, resource acquisition and morphogenetic plasticity. Group 5: mortality is a time-delayed response to suppression. Development of architectural models when combined with field investigations is identifying research needed to develop a theory of architectural influences on the competition process. These include analyses of the integration of foliage and branch components into whole-plant growth and precise definitions of environmental control of morphogenetic plasticity and its interaction with acquisition of carbon for plant growth. PMID:24987396
Optical, analog and digital domain architectural considerations for visual communications
NASA Astrophysics Data System (ADS)
Metz, W. A.
2008-01-01
The end of the performance entitlement historically achieved by classic scaling of CMOS devices is within sight, driven ultimately by fundamental limits. Performance entitlements predicted by classic CMOS scaling have progressively failed to be realized in recent process generations due to excessive leakage, increasing interconnect delays and scaling of gate dielectrics. Prior to reaching fundamental limits, trends in technology, architecture and economics will pressure the industry to adopt new paradigms. A likely response is to repartition system functions away from digital implementations and into new architectures. Future architectures for visual communications will require extending the implementation into the optical and analog processing domains. The fundamental properties of these domains will in turn give rise to new architectural concepts. The limits of CMOS scaling and impact on architectures will be briefly reviewed. Alternative approaches in the optical, electronic and analog domains will then be examined for advantages, architectural impact and drawbacks.
Software architecture of INO340 telescope control system
NASA Astrophysics Data System (ADS)
Ravanmehr, Reza; Khosroshahi, Habib
2016-08-01
The software architecture plays an important role in the distributed control system of astronomical projects because many subsystems and components must work together in a consistent and reliable way. We have utilized a customized architecture design approach based on the "4+1 view model" in order to design the INOCS software architecture. In this paper, after reviewing the top-level INOCS architecture, we present the software architecture model of INOCS inspired by the "4+1 view model". For this purpose we provide logical, process, development, physical, and scenario views of our architecture using different UML diagrams and other illustrative visual charts. Each view presents the INOCS software architecture from a different perspective. We finish the paper with the science data operation of INO340 and concluding remarks.
Updates to the NASA Space Telecommunications Radio System (STRS) Architecture
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Handler, Louis M.; Briones, Janette; Hall, Charles S.
2008-01-01
This paper describes an update of the Space Telecommunications Radio System (STRS) open architecture for NASA space based radios. The STRS architecture has been defined as a framework for the design, development, operation and upgrade of space based software defined radios, where processing resources are constrained. The architecture has been updated based upon reviews by NASA missions, radio providers, and component vendors. The STRS Standard prescribes the architectural relationship between the software elements used in software execution and defines the Application Programmer Interface (API) between the operating environment and the waveform application. Modeling tools have been adopted to present the architecture. The paper will present a description of the updated API, configuration files, and constraints. Minimum compliance is discussed for early implementations. The paper then closes with a summary of the changes made and discussion of the relevant alignment with the Object Management Group (OMG) SWRadio specification, and enhancements to the specialized signal processing abstraction.
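To make the idea of an API boundary between an operating environment and a waveform application concrete, here is a hypothetical Python sketch of a platform that drives waveforms through a small lifecycle contract. The classes and method names (configure/start/stop) are illustrative assumptions only; they are not the API defined by the STRS Standard.

```python
# Hypothetical sketch of a platform/waveform API boundary. The method names
# below are illustrative only and are NOT the actual STRS API.

from abc import ABC, abstractmethod


class WaveformApplication(ABC):
    """A waveform implements a small lifecycle contract the radio platform calls."""

    @abstractmethod
    def configure(self, parameters: dict) -> None: ...

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def stop(self) -> None: ...


class OperatingEnvironment:
    """Platform side: loads waveforms and drives them through their lifecycle."""

    def __init__(self) -> None:
        self._waveforms: dict = {}

    def install(self, name: str, waveform: WaveformApplication, config: dict) -> None:
        waveform.configure(config)
        self._waveforms[name] = waveform

    def run(self, name: str) -> None:
        self._waveforms[name].start()


class BpskWaveform(WaveformApplication):
    def configure(self, parameters: dict) -> None:
        self.rate = parameters.get("symbol_rate_hz", 1000)

    def start(self) -> None:
        print(f"BPSK waveform running at {self.rate} sym/s")

    def stop(self) -> None:
        print("BPSK waveform stopped")


if __name__ == "__main__":
    env = OperatingEnvironment()
    env.install("bpsk", BpskWaveform(), {"symbol_rate_hz": 9600})
    env.run("bpsk")
```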
Human Exploration of Mars Design Reference Architecture 5.0
NASA Technical Reports Server (NTRS)
Drake, Bret G.
2009-01-01
This document reviews the Design Reference Architecture (DRA) for human exploration of Mars. The DRA represents the current best strategy for human missions. It is not a formal plan, but provides a vision and context to tie current systems and technology developments to potential missions to Mars, and it also serves as a benchmark against which alternative architectures can be measured. The document also reviews the objectives and products of the 2007 study that was to update NASA's human Mars mission reference architecture, assess strategic linkages between lunar and Mars strategies, and develop an understanding of methods for reducing the cost and risk of human missions through investment in research, technology development, and synergy with other exploration plans. There is also a review of the process by which the DRA will continue to be refined. The unique capacities of human exploration are reviewed. The possible goals and objectives of the first three human missions are presented, along with the recommendation that the missions involve long stays visiting multiple sites. The deployment strategy is outlined and diagrammed, including the pre-deployment of many of the required materials and a crew of six traveling to Mars on a six-month trajectory. The pre-deployment elements and the Orion crew vehicle are shown. The ground operations requirements are also explained, and the use of resources found on the surface of Mars is postulated. The Mars surface exploration strategy is reviewed, including the planned planetary protection processes. Finally, a listing of the key decisions and tenets is presented.
ARCHITECTURAL PROGRAMMING--STATE OF THE ART.
ERIC Educational Resources Information Center
EVANS, BENJAMIN H.
IN RESPONSE TO A NEED FOR A MORE THOROUGH AND RIGOROUS STUDY AND ANALYSIS PROCESS IN ENVIRONMENTAL FUNCTIONS PRIOR TO THE DESIGN OF NEW BUILDINGS, A STUDY WAS UNDERTAKEN TO IDENTIFY THE EMERGING TECHNIQUES OF ARCHITECTURAL PROGRAMING PRACTICE. THE STUDY INCLUDED CORRESPONDENCE AND REVIEW OF PERIODICALS, QUESTIONNAIRES AND VISITATIONS, AND A…
NASA Technical Reports Server (NTRS)
Rickard, D. A.; Bodenheimer, R. E.
1976-01-01
Digital computer components which perform two dimensional array logic operations (Tse logic) on binary data arrays are described. The properties of Golay transforms which make them useful in image processing are reviewed, and several architectures for Golay transform processors are presented with emphasis on the skeletonizing algorithm. Conventional logic control units developed for the Golay transform processors are described. One is a unique microprogrammable control unit that uses a microprocessor to control the Tse computer. The remaining control units are based on programmable logic arrays. Performance criteria are established and utilized to compare the various Golay transform machines developed. A critique of Tse logic is presented, and recommendations for additional research are included.
Protein Kinases in Shaping Plant Architecture.
Wu, Juan; Wang, Bo; Xin, Xiaoyun; Ren, Dongtao
2018-02-13
Plant architecture, the three-dimensional organization of the plant body, includes the branching pattern and the size, shape, and position of organs. Plant architecture is genetically controlled and is influenced by environmental conditions. Regulation occurs at most stages, from the first division of the fertilized egg to the final establishment of plant architecture. Among the various endogenous regulators, protein kinases and their associated signaling pathways have been shown to play important roles in regulating the process of plant architecture establishment. In this review, we summarize recent progress in the understanding of the mechanisms by which plant architecture formation is regulated by protein kinases, especially mitogen-activated protein kinase (MAPK).
SAW chirp filter technology for satellite on-board processing applications
NASA Astrophysics Data System (ADS)
Shaw, M. D.; Miller, N. D. J.; Malarky, A. P.; Warne, D. H.
1989-11-01
Market growth in the area of thin route satellite communications services has led to consideration of nontraditional system architectures requiring sophisticated on-board processing functions. Surface acoustic wave (SAW) technology exists today which can provide implementation of key on-board processing subsystems by using multicarrier demodulators. This paper presents a review of this signal processing technology, along with a brief review of dispersive SAW device technology as applied to the implementation of multicarrier demodulators for on-board signal processing.
NASA Astrophysics Data System (ADS)
Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.
2017-11-01
The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing activities, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi-/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.
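The data-level parallelism the authors point to can be illustrated without a GPU: the same element-wise kernel written as a scalar loop and as a vectorized NumPy expression shows the kind of workload that maps naturally onto SIMD units and GPUs. This is a stand-in sketch, not the paper's OpenMP/CUDA case studies.

```python
# Small illustration of data-level parallelism: one element-wise kernel written
# as a scalar loop and as a vectorized NumPy expression. The vectorized form is
# the kind of workload that maps naturally onto SIMD units and GPUs.

import time
import numpy as np

N = 1_000_000
x = np.random.rand(N).astype(np.float32)

# Scalar loop (one element at a time).
t0 = time.perf_counter()
y_loop = np.empty_like(x)
for i in range(N):
    y_loop[i] = 2.0 * x[i] + 1.0
t_loop = time.perf_counter() - t0

# Vectorized form: the whole array is processed as one data-parallel operation.
t0 = time.perf_counter()
y_vec = 2.0 * x + 1.0
t_vec = time.perf_counter() - t0

assert np.allclose(y_loop, y_vec)
print(f"loop: {t_loop:.3f}s, vectorized: {t_vec:.4f}s")
```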
ERIC Educational Resources Information Center
Bennett, David A.
This document comprises a report on the architectural elements of choice in the desegregation process, a review of the choice process based on Minnesota's experience, and a statement of implications for state policymakers. The following organizational principles of the choice process are discussed: (1) enrollment based on a "first come, first…
1991-06-01
Validation and Reconstruction, Phase 1: System Architecture Study. Phase I Final Report, Contract NAS3-25883, CR-187124. Recoverable front-matter headings: 1.0 Introduction; 2.0 Executive Summary; 3.0 Technical Discussion, including 3.1 Review of SSME Test Data and Validation Procedure. Recoverable figure titles: "Elements of the Sensor Data Validation and Signal Reconstruction System" and "Current NASA MSFC Data Review Process".
1998-04-01
...a revision phase would follow, after which a second review would be scheduled, and so forth, until the review succeeds. 2.3 Realization of the... normal rules; when Summit rules are inferred they are enqueued in a separate Summit queue and are scheduled for execution only after local forward... scheduling and activating activities according to the defined process; reactively triggering activities based on state changes; monitoring the process
49 CFR 236.913 - Filing and approval of PSPs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... architectural concepts; the PSP describes a product that uses design or safety assurance concepts considered... the end of the system design review phase of product development and 180 days prior to planned implementation, inviting FRA to participate in the design review process and receive periodic briefings and...
49 CFR 236.913 - Filing and approval of PSPs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... architectural concepts; the PSP describes a product that uses design or safety assurance concepts considered... the end of the system design review phase of product development and 180 days prior to planned implementation, inviting FRA to participate in the design review process and receive periodic briefings and...
A New Protocol for Texture Mapping Process and 2D Representation of Rupestrian Architecture
NASA Astrophysics Data System (ADS)
Carnevali, L.; Carpiceci, M.; Angelini, A.
2018-05-01
The development of survey techniques for architecture and archaeology requires a general review of the methods used for the representation of numerical data. The possibilities offered by data processing allow new paths to be found for studying issues connected to the drawing discipline. The research project aimed at experimenting with different approaches for the representation of rupestrian architecture and the texture mapping process. The nature of rupestrian architecture does not allow a traditional representation of sections and projections of edges and outlines. The paper presents a method, Equidistant Multiple Sections (EMS), inspired by cartography and based on the use of isohipses generated from different geometric planes. A specific paragraph is dedicated to the texture mapping process for unstructured surface models. One of the main difficulties in image projection is the recognition of homologous points between the image and the point cloud, above all in the areas with the most deformation. With the aid of the "virtual scan" tool, a different procedure was developed for improving the correspondences of the image. The results show a noticeable improvement of the entire process, above all for the architectural vaults. A detailed study concerned the unfolding of straight-line (ruled) surfaces; the barrel vault of the analyzed chapel was unfolded so that the paintings could be observed in their real shapes, outside of the morphological context.
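The EMS idea of sectioning an irregular surface with equally spaced planes can be sketched, in a greatly simplified form, by binning a point cloud into thin slabs around equidistant section levels. The synthetic half-cylinder "vault", the spacing, and the tolerance below are invented for illustration; this is not the authors' procedure.

```python
# Toy sketch, not the authors' EMS procedure: group points of a synthetic point
# cloud into thin slabs around equally spaced section planes, producing
# isohipse-like contour bands along a chosen axis.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic "vault-like" point cloud: half-cylinder surface with a little noise.
theta = rng.uniform(0, np.pi, 5000)
y = rng.uniform(0.0, 10.0, 5000)
points = np.column_stack([np.cos(theta), y, np.sin(theta)])  # x, y, z
points += rng.normal(scale=0.01, size=points.shape)

spacing = 0.2        # distance between section planes along z
tolerance = 0.02     # half-thickness of each slab

z = points[:, 2]
levels = np.arange(z.min(), z.max() + spacing, spacing)
sections = {round(float(level), 3): points[np.abs(z - level) < tolerance]
            for level in levels}

for level, pts in sections.items():
    print(f"z = {level:+.2f}: {len(pts)} points in section")
```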
Microwave intersatellite links for communications satellites
NASA Technical Reports Server (NTRS)
Welti, G. R.
1982-01-01
Applications and interface requirements for intersatellite links (ISLs) between commercial communications satellites are reviewed, ranging from ISLs between widely separated satellites to ISLs between clustered satellites. On-board processing architectures for ISLs employing a variety of modulation schemes are described. These schemes include FM remodulation and QPSK regeneration in combination with switching and buffering. The various architectures are compared in terms of complexity, required performance, antenna size, mass, and power.
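The QPSK-regeneration option mentioned above amounts to slicing noisy received symbols back to bits and re-transmitting clean constellation points. The NumPy sketch below illustrates that principle only; it is not a model of the on-board architectures compared in the paper, and the bit-to-symbol mapping and noise level are assumptions.

```python
# Minimal sketch of QPSK regeneration: noisy received symbols are sliced back
# to bits and re-modulated as clean constellation points, the essence of a
# regenerative (as opposed to bent-pipe) repeater.

import numpy as np

rng = np.random.default_rng(1)


def modulate(bits: np.ndarray) -> np.ndarray:
    """Map bit pairs to unit-energy QPSK symbols (first bit -> I sign, second -> Q sign)."""
    pairs = bits.reshape(-1, 2)
    return ((1 - 2 * pairs[:, 0]) + 1j * (1 - 2 * pairs[:, 1])) / np.sqrt(2)


def demodulate(symbols: np.ndarray) -> np.ndarray:
    """Hard-decision slicing: decide each bit from the sign of I and Q."""
    return np.column_stack([symbols.real < 0, symbols.imag < 0]).astype(int).ravel()


bits = rng.integers(0, 2, 1000)
tx = modulate(bits)
noisy = tx + 0.15 * (rng.normal(size=tx.size) + 1j * rng.normal(size=tx.size))

regen = modulate(demodulate(noisy))        # clean symbols re-transmitted downstream
assert np.array_equal(demodulate(regen), demodulate(noisy))
print("bit errors after slicing:", int(np.sum(demodulate(noisy) != bits)))
```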
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-13
... Architecture Proposal Review Meetings and Webinars; Notice of Public Meeting AGENCY: Research and Innovative... webinars to discuss the Vehicle to Infrastructure (V2I) Core System Requirements and Architecture Proposal... review of System Requirements Specification and Architecture Proposal. The second meeting will be a...
SOA: A Quality Attribute Perspective
2011-06-23
SEI webinar slides, June 2011, Carnegie Mellon University. Agenda: Service-Oriented Architecture and Software Architecture: Review; Service-Orientation and Quality Attributes; Summary and Future Challenges.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-23
... Architecture Proposal Review Meetings and Webinars; Notice of Public Meeting AGENCY: Research and Innovative... Requirements and Architecture Proposal. The first meeting, June 28-30, 2011, 9 a.m.-4:30 p.m. at the University..., will walk through the review of System Requirements Specification and Architecture Proposal. The second...
Interferometric architectures based All-Optical logic design methods and their implementations
NASA Astrophysics Data System (ADS)
Singh, Karamdeep; Kaur, Gurmeet
2015-06-01
All-Optical Signal Processing is an emerging technology which can avoid the costly optical-electronic-optical (O-E-O) conversions that are usually compulsory in traditional electronic signal processing systems, greatly enhancing the operating bit rate, with added advantages such as electromagnetic interference immunity and low power consumption. In order to implement complex signal processing tasks, All-Optical logic gates are required as backbone elements. This review describes advances in the field of All-Optical logic design methods based on interferometric architectures such as the Mach-Zehnder Interferometer (MZI), Sagnac interferometers, and the Ultrafast Non-Linear Interferometer (UNI). All-Optical logic implementations for the realization of arithmetic and signal processing applications based on each interferometric arrangement are also presented in a categorized manner.
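The logic behavior of an ideal two-arm interferometer can be sketched from its transfer function alone: with each control input imposing a π phase shift on one arm, the cross-port intensity varies as sin²(Δφ/2), which reproduces XOR. This is a textbook idealization under assumed π phase shifts, not a model of any particular MZI, Sagnac, or UNI gate discussed in the review.

```python
# Idealized sketch of how a two-arm interferometer realizes XOR: each control
# bit imposes a pi phase shift on one arm, and the cross-port intensity goes
# as sin^2(delta_phi / 2).

import numpy as np


def mzi_cross_output(a: int, b: int) -> float:
    """Normalized cross-port intensity for control bits a and b."""
    delta_phi = np.pi * a - np.pi * b      # each control shifts one arm by pi
    return float(np.sin(delta_phi / 2.0) ** 2)


for a in (0, 1):
    for b in (0, 1):
        intensity = mzi_cross_output(a, b)
        print(f"a={a} b={b} -> intensity {intensity:.1f} -> logic {int(round(intensity))}")
```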
A fast, programmable hardware architecture for the processing of spaceborne SAR data
NASA Technical Reports Server (NTRS)
Bennett, J. R.; Cumming, I. G.; Lim, J.; Wedding, R. M.
1984-01-01
The development of high-throughput SAR processors (HTSPs) for the spaceborne SARs being planned by NASA, ESA, DFVLR, NASDA, and the Canadian Radarsat Project is discussed. The basic parameters and data-processing requirements of the SARs are listed in tables, and the principal problems are identified as real operations rates in excess of 2 × 10^9 per second, I/O rates in excess of 8 × 10^6 samples per second, and control computation loads (as for range cell migration correction) as high as 1.4 × 10^6 instructions per second. A number of possible HTSP architectures are reviewed; host/array-processor (H/AP) and distributed-control/data-path (DCDP) architectures are examined in detail and illustrated with block diagrams; and a cost/speed comparison of these two architectures is presented. The H/AP approach is found to be adequate and economical for speeds below 1/200 of real time, while DCDP is more cost-effective above 1/50 of real time.
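The real-time figures quoted above can be scaled to the slower-than-real-time operating points discussed (1/200 and 1/50 of real time) with simple proportional arithmetic, as in this sketch; the requirement values are taken from the abstract, and the scaling itself is only illustrative.

```python
# Back-of-the-envelope scaling of the real-time SAR processing requirements
# quoted in the abstract to slower-than-real-time operating points.

REAL_TIME_REQUIREMENTS = {
    "arithmetic ops / s": 2e9,
    "I/O samples / s": 8e6,
    "control instructions / s": 1.4e6,
}

for label, fraction in [("1/200 of real time", 1 / 200),
                        ("1/50 of real time", 1 / 50),
                        ("real time", 1.0)]:
    print(label)
    for name, rate in REAL_TIME_REQUIREMENTS.items():
        print(f"  {name}: {rate * fraction:.3g}")
```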
Cognitive Architectures and Autonomy: A Comparative Review
NASA Astrophysics Data System (ADS)
Thórisson, Kristinn; Helgasson, Helgi
2012-05-01
One of the original goals of artificial intelligence (AI) research was to create machines with very general cognitive capabilities and a relatively high level of autonomy. It has taken the field longer than many had expected to achieve even a fraction of this goal; the community has focused on building specific, targeted cognitive processes in isolation, and as of yet no system exists that integrates a broad range of capabilities or presents a general solution to autonomous acquisition of a large set of skills. Among the reasons for this are the highly limited machine learning and adaptation techniques available, and the inherent complexity of integrating numerous cognitive and learning capabilities in a coherent architecture. In this paper we review selected systems and architectures built expressly to address integrated skills. We highlight principles and features of these systems that seem promising for creating generally intelligent systems with some level of autonomy, and discuss them in the context of the development of future cognitive architectures. Autonomy is a key property for any system to be considered generally intelligent, in our view; we use this concept as an organizing principle for comparing the reviewed systems. Features that remain largely unaddressed in present research, but seem nevertheless necessary for such efforts to succeed, are also discussed.
Principles and Benefits of Explicitly Designed Medical Device Safety Architecture.
Larson, Brian R; Jones, Paul; Zhang, Yi; Hatcliff, John
The complexity of medical devices and the processes by which they are developed pose considerable challenges to producing safe designs and regulatory submissions that are amenable to effective reviews. Designing an appropriate and clearly documented architecture can be an important step in addressing this complexity. Best practices in medical device design embrace the notion of a safety architecture organized around distinct operation and safety requirements. By explicitly separating many safety-related monitoring and mitigation functions from operational functionality, the aspects of a device most critical to safety can be localized into a smaller and simpler safety subsystem, thereby enabling easier verification and more effective reviews of claims that causes of hazardous situations are detected and handled properly. This article defines medical device safety architecture, describes its purpose and philosophy, and provides an example. Although many of the presented concepts may be familiar to those with experience in realization of safety-critical systems, this article aims to distill the essence of the approach and provide practical guidance that can potentially improve the quality of device designs and regulatory submissions.
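A toy sketch of the separation described above: a small, independent safety subsystem checks the operational controller's output and forces a safe state when a limit is violated. The device, the limit, and the mitigation are hypothetical and chosen only to show the structural idea, not any actual device design.

```python
# Toy illustration of separating safety from operational functionality: a small
# safety subsystem monitors the operational controller and forces a safe state
# when a limit is violated. Device, limit, and units are hypothetical.

class InfusionController:
    """Operational functionality: computes a commanded pump rate."""

    def commanded_rate_ml_per_h(self, prescribed: float, adjustment: float) -> float:
        return prescribed * adjustment


class SafetyMonitor:
    """Safety subsystem: independent check and mitigation, kept deliberately simple."""

    MAX_RATE_ML_PER_H = 500.0   # hypothetical hard limit

    def check(self, rate: float) -> float:
        if rate < 0 or rate > self.MAX_RATE_ML_PER_H:
            return 0.0          # mitigation: stop infusion (safe state)
        return rate


if __name__ == "__main__":
    controller = InfusionController()
    monitor = SafetyMonitor()
    for adj in (1.0, 10.0):     # second case simulates an operational fault
        raw = controller.commanded_rate_ml_per_h(prescribed=60.0, adjustment=adj)
        print(f"commanded {raw:.0f} ml/h -> delivered {monitor.check(raw):.0f} ml/h")
```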
State-of-the-art in Heterogeneous Computing
Brodtkorb, Andre R.; Dyken, Christopher; Hagen, Trond R.; ...
2010-01-01
Node level heterogeneous architectures have become attractive during the last decade for several reasons: compared to traditional symmetric CPUs, they offer high peak performance and are energy and/or cost efficient. With the increase of fine-grained parallelism in high-performance computing, as well as the introduction of parallelism in workstations, there is an acute need for a good overview and understanding of these architectures. We give an overview of the state-of-the-art in heterogeneous computing, focusing on three commonly found architectures: the Cell Broadband Engine Architecture, graphics processing units (GPUs), and field programmable gate arrays (FPGAs). We present a review of hardware, available software tools, and an overview of state-of-the-art techniques and algorithms. Furthermore, we present a qualitative and quantitative comparison of the architectures, and give our view on the future of heterogeneous computing.
Genes and networks regulating root anatomy and architecture.
Wachsman, Guy; Sparks, Erin E; Benfey, Philip N
2015-10-01
The root is an excellent model for studying developmental processes that underlie plant anatomy and architecture. Its modular structure, the lack of cell movement and relative accessibility to microscopic visualization facilitate research in a number of areas of plant biology. In this review, we describe several examples that demonstrate how cell type-specific developmental mechanisms determine cell fate and the formation of defined tissues with unique characteristics. In the last 10 yr, advances in genome-wide technologies have led to the sequencing of thousands of plant genomes, transcriptomes and proteomes. In parallel with the development of these high-throughput technologies, biologists have had to establish computational, statistical and bioinformatic tools that can deal with the wealth of data generated by them. These resources provide a foundation for posing more complex questions about molecular interactions, and have led to the discovery of new mechanisms that control phenotypic differences. Here we review several recent studies that shed new light on developmental processes, which are involved in establishing root anatomy and architecture. We highlight the power of combining large-scale experiments with classical techniques to uncover new pathways in root development.
Akbarzadeh, Rosa; Yousefi, Azizeh-Mitra
2014-08-01
Tissue engineering makes use of 3D scaffolds to sustain three-dimensional growth of cells and guide new tissue formation. To meet the multiple requirements for regeneration of biological tissues and organs, a wide range of scaffold fabrication techniques have been developed, aiming to produce porous constructs with the desired pore size range and pore morphology. Among the different scaffold fabrication techniques, the thermally induced phase separation (TIPS) method has been widely used in recent years because of its potential to produce highly porous scaffolds with interconnected pore morphology. The scaffold architecture can be closely controlled by adjusting the process parameters, including polymer type and concentration, solvent composition, quenching temperature and time, coarsening process, and incorporation of inorganic particles. The objective of this review is to provide information pertaining to the effect of these parameters on the architecture and properties of the scaffolds fabricated by the TIPS technique.
NASA Technical Reports Server (NTRS)
Lee, J.; Kim, K.
1991-01-01
A Very Large Scale Integration (VLSI) architecture for robot direct kinematic computation suitable for industrial robot manipulators was investigated. The Denavit-Hartenberg transformations are reviewed to identify a suitable processing element, namely an augmented CORDIC. Two distinct implementations, bit-serial and parallel, are elaborated on. The performance of each scheme is analyzed with respect to the time to compute one location of the end-effector of a 6-link manipulator and the number of transistors required.
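The underlying computation is the composition of Denavit-Hartenberg homogeneous transforms over six links. The NumPy sketch below shows that math with placeholder link parameters; it says nothing about the bit-serial or parallel CORDIC hardware organization analyzed in the paper.

```python
# Forward kinematics via classic Denavit-Hartenberg transforms, the computation
# the paper maps onto CORDIC processing elements. The six parameter rows below
# are placeholders, not a real manipulator.

import numpy as np


def dh_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Homogeneous transform for one link (classic DH convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])


# Placeholder (theta, d, a, alpha) parameters for a 6-link arm.
joints = [
    (0.1, 0.30, 0.00, np.pi / 2),
    (0.2, 0.00, 0.40, 0.0),
    (0.3, 0.00, 0.35, 0.0),
    (0.4, 0.25, 0.00, np.pi / 2),
    (0.5, 0.00, 0.00, -np.pi / 2),
    (0.6, 0.10, 0.00, 0.0),
]

T = np.eye(4)
for params in joints:
    T = T @ dh_transform(*params)   # chain the link transforms

print("end-effector position:", np.round(T[:3, 3], 4))
```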
Space Station Needs, Attributes and Architectural Options. Contractor orientation briefings
NASA Technical Reports Server (NTRS)
1983-01-01
Requirements are considered for user missions involving life sciences, astrophysics, environmental observation, Earth and planetary exploration, materials processing, Spacelab payloads, technology development, and communications. Plans to exchange data with potential cooperating nations and ESA are reviewed. The capability of the space shuttle to support space station activities is discussed. The status of the OAST space station technology study, conceptual architectures for a space station, elements of the space-based infrastructure, and the use of the shuttle external tank are also considered.
Cognitive architecture of perceptual organization: from neurons to gnosons.
van der Helm, Peter A
2012-02-01
What, if anything, is cognitive architecture and how is it implemented in neural architecture? Focusing on perceptual organization, this question is addressed by way of a pluralist approach which, supported by metatheoretical considerations, combines complementary insights from representational, connectionist, and dynamic systems approaches to cognition. This pluralist approach starts from a representationally inspired model which implements the intertwined but functionally distinguishable subprocesses of feedforward feature encoding, horizontal feature binding, and recurrent feature selection. As sustained by a review of neuroscientific evidence, these are the subprocesses that are believed to take place in the visual hierarchy in the brain. Furthermore, the model employs a special form of processing, called transparallel processing, whose neural signature is proposed to be gamma-band synchronization in transient horizontal neural assemblies. In neuroscience, such assemblies are believed to mediate binding of similar features. Their formal counterparts in the model are special input-dependent distributed representations, called hyperstrings, which allow many similar features to be processed in a transparallel fashion, that is, simultaneously as if only one feature were concerned. This form of processing does justice to both the high combinatorial capacity and the high speed of the perceptual organization process. A naturally following proposal is that those temporarily synchronized neural assemblies are "gnosons", that is, constituents of flexible self-organizing cognitive architecture in between the relatively rigid level of neurons and the still elusive level of consciousness.
Biomimetic Structural Materials: Inspiration from Design and Assembly.
Yaraghi, Nicholas A; Kisailus, David
2018-04-20
Nature assembles weak organic and inorganic constituents into sophisticated hierarchical structures, forming structural composites that demonstrate impressive combinations of strength and toughness. Two such composites are the nacre structure forming the inner layer of many mollusk shells, whose brick-and-mortar architecture has been the gold standard for biomimetic composites, and the cuticle forming the arthropod exoskeleton, whose helicoidal fiber-reinforced architecture has only recently attracted interest for structural biomimetics. In this review, we detail recent biomimetic efforts for the fabrication of strong and tough composite materials possessing the brick-and-mortar and helicoidal architectures. Techniques discussed for the fabrication of nacre- and cuticle-mimetic structures include freeze casting, layer-by-layer deposition, spray deposition, magnetically assisted slip casting, fiber-reinforced composite processing, additive manufacturing, and cholesteric self-assembly. Advantages and limitations to these processes are discussed, as well as the future outlook on the biomimetic landscape for structural composite materials.
Visual search, visual streams, and visual architectures.
Green, M
1991-10-01
Most psychological, physiological, and computational models of early vision suggest that retinal information is divided into a parallel set of feature modules. The dominant theories of visual search assume that these modules form a "blackboard" architecture: a set of independent representations that communicate only through a central processor. A review of research shows that blackboard-based theories, such as feature-integration theory, cannot easily explain the existing data. The experimental evidence is more consistent with a "network" architecture, which stresses that: (1) feature modules are directly connected to one another, (2) features and their locations are represented together, (3) feature detection and integration are not distinct processing stages, and (4) no executive control process, such as focal attention, is needed to integrate features. Attention is not a spotlight that synthesizes objects from raw features. Instead, it is better to conceptualize attention as an aperture which masks irrelevant visual information.
A RESTful Service Oriented Architecture for Science Data Processing
NASA Astrophysics Data System (ADS)
Duggan, B.; Tilmes, C.; Durbin, P.; Masuoka, E.
2012-12-01
The Atmospheric Composition Processing System is an implementation of a RESTful Service Oriented Architecture which handles incoming data from the Ozone Monitoring Instrument and the Ozone Mapping and Profiler Suite aboard the Aura and NPP spacecraft, respectively. The system has been built entirely from open source components, such as Postgres, Perl, and SQLite, and has leveraged the vast resources of the Comprehensive Perl Archive Network (CPAN). The modular design of the system also allows for many of the components to be easily released and integrated into the CPAN ecosystem and reused independently. At minimal expense, the CPAN infrastructure and community provide peer review, feedback and continuous testing in a wide variety of environments and architectures. A well defined set of conventions also facilitates dependency management, packaging, and distribution of code. Test driven development also provides a way to ensure stability despite a continuously changing base of dependencies.
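As a rough illustration of the RESTful style of interface described (the actual ACPS services are written in Perl and expose different resources), here is a minimal JSON-returning resource using only the Python standard library; the route and payload are invented for this sketch.

```python
# Minimal sketch of a RESTful, JSON-returning resource using only the Python
# standard library. The /granules route and its payload are invented; they are
# not the actual ACPS services.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

GRANULES = {"OMI-2012-001": {"instrument": "OMI", "status": "processed"}}


class GranuleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /granules/<id>
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "granules" and parts[1] in GRANULES:
            body = json.dumps(GRANULES[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    # Serve on localhost:8000; try: curl http://localhost:8000/granules/OMI-2012-001
    HTTPServer(("localhost", 8000), GranuleHandler).serve_forever()
```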
Technical architecture of ONC-approved plans for statewide health information exchange.
Barrows, Randolph C; Ezzard, John
2011-01-01
ONC-approved state plans for HIE were reviewed for descriptions and depictions of statewide HIE technical architecture. Review was complicated by non-standard organizational elements and technical terminology across state plans. Findings were mapped to industry standard, referenced, and defined HIE architecture descriptions and characteristics. Results are preliminary due to the initial subset of ONC-approved plans available, the rapid pace of new ONC-plan approvals, and continuing advancements in standards and technology of HIE, etc. Review of 28 state plans shows virtually all include a direct messaging component, but for participating entities at state-specific levels of granularity (RHIO, enterprise, organization/provider). About ½ of reviewed plans describe a federated architecture, and ¼ of plans utilize a single-vendor "hybrid-federated" architecture. About 1/3 of states plan to leverage new federal and open exchange technologies (DIRECT, CONNECT, etc.). Only one plan describes a centralized architecture for statewide HIE, but others combine central and federated architectural approaches.
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2010-01-01
This slide presentation reviews the evolution of risk management (RM) at NASA. The aim is to promote an RM approach that is heuristic, proactive, and coherent across all of NASA. Risk-Informed Decision Making (RIDM) is a decision-making process that uses a diverse set of performance measures, along with other considerations, within a deliberative process to inform decision making. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The RIDM process and how it relates to the Continuous Risk Management (CRM) process are reviewed.
Freeze-Casting of Porous Biomaterials: Structure, Properties and Opportunities
Deville, Sylvain
2010-01-01
The freeze-casting of porous materials has received a great deal of attention during the past few years. This simple process, where a material suspension is simply frozen and then sublimated, provides materials with unique porous architectures, where the porosity is almost a direct replica of the frozen solvent crystals. This review focuses on the recent results on the process and the derived porous structures with regards to the biomaterials applications. Of particular interest is the architecture of the materials and the versatility of the process, which can be readily controlled and applied to biomaterials applications. A careful control of the starting formulation and processing conditions is required to control the integrity of the structure and resulting properties. Further in vitro and in vivo investigations are required to validate the potential of this new class of porous materials.
Orion Integrated Guidance, Navigation, and Control [GN and C
NASA Technical Reports Server (NTRS)
Chevray, Kay
2009-01-01
This slide presentation reviews the integrated Guidance, Navigation, and Control (iGN&C) system in the design of the Orion spacecraft. Included in the review are the plans for the design and development of the external interfaces, the functional architecture, the iGN&C software, the development and validation process, and the key challenges involved in developing the iGN&C system.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
... Buildings Service; Submission for OMB Review; Art-in- Architecture Program National Artist Registry (GSA... Architecture Program National Artist Registry (GSA Form 7437). A notice was published in the Federal Register at 77 FR 58141, on September 19, 2012. No comments were received. The Art-in-Architecture Program is...
Fungal chitinases: diversity, mechanistic properties and biotechnological potential.
Hartl, Lukas; Zach, Simone; Seidl-Seiboth, Verena
2012-01-01
Chitin derivatives, chitosan and substituted chito-oligosaccharides have a wide spectrum of applications ranging from medicine to cosmetics and dietary supplements. With advancing knowledge about the substrate-binding properties of chitinases, enzyme-based production of these biotechnologically relevant sugars from biological resources is becoming increasingly interesting. Fungi have high numbers of glycoside hydrolase family 18 chitinases with different substrate-binding site architectures. As presented here, the large diversity of fungal chitinases is an interesting starting point for protein engineering. In this review, recent data about the architecture of the substrate-binding clefts of fungal chitinases, in connection with their hydrolytic and transglycolytic abilities, and the development of chitinase inhibitors are summarized. Furthermore, the biological functions of chitinases, chitin and chitosan utilization by fungi, and the effects of these aspects on biotechnological applications, including protein overexpression and autolysis during industrial processes, are discussed.
Kullgren, Jeffrey T.; Hafez, Dina; Fedewa, Allison; Heisler, Michele
2017-01-01
Purpose of review: To review studies of behavioral economic interventions (financial incentives, choice architecture modifications, or commitment devices) to prevent type 2 diabetes mellitus (T2DM) among at-risk patients or improve self-management among patients with T2DM. Recent findings: We found 15 studies that used varied study designs and outcomes to test behavioral economic interventions in clinical, workplace, or health plan settings. Of four studies that focused on prevention of T2DM, two found that financial incentives increased weight loss and completion of a fasting blood glucose test, and two choice architecture modifications had mixed effects in encouraging completion of tests to screen for T2DM. Of 11 studies that focused on improving self-management of T2DM, four of six tests of financial incentives demonstrated increased engagement in recommended care processes or improved biometric measures, and three of five tests of choice architecture modifications found improvements in self-management behaviors. Summary: Though few studies have tested behavioral economic interventions for prevention or treatment of T2DM, those that have suggest such approaches have potential to improve patient behaviors and should be tested more broadly. PMID:28755061
A Structured Approach for Reviewing Architecture Documentation
2009-12-01
as those found in ISO 12207 [ISO/IEC 12207:2008] (for software engineering), ISO 15288 [ISO/IEC 15288:2008] (for systems engineering), the Rational... Open Distributed Processing - Reference Model: Foundations (ISO/IEC 10746-2). 1996. [ISO/IEC 12207:2008] International Organization for Standardization & International Electrotechnical Commission. Systems and software engineering – Software life cycle processes (ISO/IEC 12207). 2008. [ISO...
Adaptive Distributed Intelligent Control Architecture for Future Propulsion Systems (Preprint)
2007-04-01
weight will be reduced by replacing heavy harness assemblies and FADECs with distributed processing elements interconnected. This paper reviews... Digital Electronic Controls (FADECs), with distributed processing elements interconnected through a serial bus. Efficient data flow throughout the... because intelligence is embedded in components while overall control is maintained in the FADEC. The need for Distributed Control Systems in...
The major architects of chromatin: architectural proteins in bacteria, archaea and eukaryotes.
Luijsterburg, Martijn S; White, Malcolm F; van Driel, Roel; Dame, Remus Th
2008-01-01
The genomic DNA of all organisms across the three kingdoms of life needs to be compacted and functionally organized. Key players in these processes are DNA supercoiling, macromolecular crowding and architectural proteins that shape DNA by binding to it. The architectural proteins in bacteria, archaea and eukaryotes generally do not exhibit sequence or structural conservation especially across kingdoms. Instead, we propose that they are functionally conserved. Most of these proteins can be classified according to their architectural mode of action: bending, wrapping or bridging DNA. In order for DNA transactions to occur within a compact chromatin context, genome organization cannot be static. Indeed chromosomes are subject to a whole range of remodeling mechanisms. In this review, we discuss the role of (i) DNA supercoiling, (ii) macromolecular crowding and (iii) architectural proteins in genome organization, as well as (iv) mechanisms used to remodel chromosome structure and to modulate genomic activity. We conclude that the underlying mechanisms that shape and remodel genomes are remarkably similar among bacteria, archaea and eukaryotes.
Fanwoua, Julienne; Bairam, Emna; Delaire, Mickael; Buck-Sorlin, Gerhard
2014-01-01
Understanding the role of branch architecture in carbon production and allocation is essential to gain more insight into the complex process of assimilate partitioning in fruit trees. This mini review reports on the current knowledge of the role of branch architecture in carbohydrate production and partitioning in apple. The first-order carrier branch of apple illustrates the complexity of branch structure emerging from bud activity events and encountered in many fruit trees. Branch architecture influences carbon production by determining leaf exposure to light and by affecting leaf internal characteristics related to leaf photosynthetic capacity. The dynamics of assimilate partitioning between branch organs depends on the stage of development of sources and sinks. The sink strength of various branch organs and their relative positioning on the branch also affect partitioning. Vascular connections between branch organs determine major pathways for branch assimilate transport. We propose directions for employing a modeling approach to further elucidate the role of branch architecture on assimilate partitioning. PMID:25071813
Freimuth, Robert R; Schauer, Michael W; Lodha, Preeti; Govindrao, Poornima; Nagarajan, Rakesh; Chute, Christopher G
2008-11-06
The caBIG Compatibility Review System (CRS) is a web-based application to support compatibility reviews, which certify that software applications that pass the review meet a specific set of criteria that allow them to interoperate. The CRS contains workflows that support both semantic and syntactic reviews, which are performed by the caBIG Vocabularies and Common Data Elements (VCDE) and Architecture workspaces, respectively. The CRS increases the efficiency of compatibility reviews by reducing administrative overhead and it improves uniformity by ensuring that each review is conducted according to a standard process. The CRS provides metrics that allow the review team to evaluate the level of data element reuse in an application, a first step towards quantifying the extent of harmonization between applications. Finally, functionality is being added that will provide automated validation of checklist criteria, which will further simplify the review process.
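The automated checklist validation mentioned at the end can be pictured as a table of named criteria, each a predicate over application metadata, plus a simple reuse metric. The criteria and metadata fields in this Python sketch are invented and are not the actual caBIG review checklist.

```python
# Toy sketch of automated checklist validation in the spirit of the abstract.
# The criteria and the application metadata fields are invented for illustration.

from typing import Callable, Dict

CRITERIA: Dict[str, Callable[[dict], bool]] = {
    "has_registered_data_elements": lambda app: len(app.get("data_elements", [])) > 0,
    "reuses_common_data_elements": lambda app: app.get("reused_elements", 0) > 0,
    "exposes_documented_api": lambda app: bool(app.get("api_doc_url")),
}


def review(application: dict) -> dict:
    """Return pass/fail per criterion plus a simple data-element reuse metric."""
    results = {name: check(application) for name, check in CRITERIA.items()}
    total = len(application.get("data_elements", [])) or 1
    results["reuse_ratio"] = application.get("reused_elements", 0) / total
    return results


if __name__ == "__main__":
    candidate = {"data_elements": ["a", "b", "c"], "reused_elements": 2,
                 "api_doc_url": "https://example.org/api"}
    print(review(candidate))
```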
A Biological-Plausable Architecture for Shape Recognition
2006-06-30
...between curves. Information Processing Letters, 64, 1997. [4] Irving Biederman. Recognition-by-components: A theory of human image understanding. Psychological Review, 94(2):115–147, 1987.
Lewinski, Peter
2015-01-01
This mini literature review analyzes research papers from many countries that directly or indirectly test how classrooms' architecture influences academic performance. These papers evaluate and explain specific characteristics of classrooms, with an emphasis on how they affect learning processes and learning outcomes. Factors such as acoustics, light, color, temperature, and seat arrangement are scrutinized to determine whether and by how much they improve or hinder students' academic performance in classrooms. Apter's (1982, 1984, 2014) reversal theory of telic versus paratelic motivation is presented and used to explain these findings. The results show preference for a learning environment that cues a telic motivation state in the students. Therefore, classroom features should not be distracting or arousing. Moreover, it appears the most influential factors affecting the learning process are noise, temperature and seat arrangement. In addition, there is no current agreement on how some particular physical characteristics of classrooms affect learning outcomes. More research is needed to establish stronger conclusions and recommendations.
Architectural design of the science complex at Elizabeth City State University
NASA Technical Reports Server (NTRS)
Jahromi, Soheila
1993-01-01
This paper gives an overall view of the architectural design process and the elements involved in taking an idea from conception to execution. The project presented is an example of this process. Once the need for a new structure is established, an architect studies the requirements, opinions, and limits in creating a structure that people will exist in, move through, and use. Elements in designing a building include factors such as volume and surface, light and form, changes of scale and view, and movement and stasis. Other factors are function and the physical conditions of construction. Based on experience, intuition, and boundaries, an architect utilizes all of these elements in creating a new building. In general, the design process begins with studying the spatial needs, which develop into an architectural program. A comprehensive and accurate architectural program is essential for a successful building: the most attractive building that does not meet the functional needs of its users has failed at the primary reason for its existence. To produce a good program, an architect must have a full understanding of the daily functions that will take place in the building. The architectural program, along with site characteristics, is among the important guidelines in studying the form, adjacencies, and circulation of the structure itself and also in relation to adjacent structures. Conceptual studies are part of the schematic design, which is the first milestone in the design process; the other reference points are design development and construction documents. At each milestone, review and coordination with all the consultants is established, and the user is essential in refining the project. In the design development phase, conceptual diagrams take shape, and architectural, structural, mechanical, and electrical systems are developed. The final phase, construction documents, conveys all the information required to construct the building. The design process and elements described were applied in the following project.
NASA Astrophysics Data System (ADS)
Kretschmer, E.; Bachner, M.; Blank, J.; Dapp, R.; Ebersoldt, A.; Friedl-Vallon, F.; Guggenmoser, T.; Gulde, T.; Hartmann, V.; Lutz, R.; Maucher, G.; Neubert, T.; Oelhaf, H.; Preusse, P.; Schardt, G.; Schmitt, C.; Schönfeld, A.; Tan, V.
2015-06-01
The Gimballed Limb Observer for Radiance Imaging of the Atmosphere (GLORIA), a Fourier-transform-spectrometer-based limb spectral imager, operates on high-altitude research aircraft to study the transition region between the troposphere and the stratosphere. It is one of the most sophisticated systems to be flown on research aircraft in Europe, requiring constant monitoring and human intervention in addition to an automation system. To ensure proper functionality and interoperability on multiple platforms, a flexible control and communication system was laid out. The architectures of the communication system as well as the protocols used are reviewed. The integration of this architecture into the automation process, as well as the scientific campaign flight application context, is discussed.
NASA Astrophysics Data System (ADS)
Farid, V. L.; Wonorahardjo, S.
2018-05-01
The implementation of Green Building criteria is relatively new in architectural practice, especially in Indonesia. Consequently, integrating these criteria into the design process has the potential to change the design process itself. This paper discusses the implementation of the Green Building criteria in the conventional design process. The concept of the project is to design a residential unit with a natural air-conditioning system. To achieve this purpose, the Green Building criteria were applied from the beginning of the design process through the detailing work at the end of the project. Several studies were performed throughout the design process: (1) a conceptual review, in which established theories of Tropical Architecture and passive design were used as references, and (2) computer simulations, such as Computational Fluid Dynamics (CFD) and wind tunnel simulation, used to represent the dynamic response of the surrounding environment toward the building. This paper is intended as a reference for designing a green residential building.
Pre-PDK block-level PPAC assessment of technology options for sub-7nm high-performance logic
NASA Astrophysics Data System (ADS)
Liebmann, L.; Northrop, G.; Facchini, M.; Riviere Cazaux, L.; Baum, Z.; Nakamoto, N.; Sun, K.; Chanemougame, D.; Han, G.; Gerousis, V.
2018-03-01
This paper describes a rigorous yet flexible standard cell place-and-route flow that is used to quantify block-level power, performance, and area trade-offs driven by two unique cell architectures and their associated design rule differences. The two architectures examined in this paper differ primarily in the power distribution networks they use to achieve the desired circuit performance for high-performance logic designs. The paper shows the importance of incorporating block-level routability experiments in the early phases of design-technology co-optimization by reviewing a series of routing trials that explore different aspects of the technology definition. Since the electrical and physical parameters leading to critical process assumptions and design rules are unique to specific integration schemes and design objectives, the goal of this work is not to promote one cell architecture over another, but rather to convey the importance of exploring critical trade-offs long before the process details of the technology node are finalized to the point where a process design kit can be published.
Thorpe, Stephen D; Lee, David A
2017-05-04
Nuclear architecture, a function of both chromatin and nucleoskeleton structure, is known to change with stem cell differentiation and differs between various somatic cell types. These changes in nuclear architecture are associated with the regulation of gene expression and genome function in a cell-type specific manner. Biophysical stimuli are known effectors of differentiation and also elicit stimuli-specific changes in nuclear architecture. This occurs via the process of mechanotransduction whereby extracellular mechanical forces activate several well characterized signaling cascades of cytoplasmic origin, and potentially some recently elucidated signaling cascades originating in the nucleus. Recent work has demonstrated changes in nuclear mechanics both with pluripotency state in embryonic stem cells, and with differentiation progression in adult mesenchymal stem cells. This review explores the interplay between cytoplasmic and nuclear mechanosensitivity, highlighting a role for the nucleus as a rheostat in tuning the cellular mechano-response.
Recent advancements in electrospinning design for tissue engineering applications: A review.
Kishan, Alysha P; Cosgriff-Hernandez, Elizabeth M
2017-10-01
Electrospinning, a technique used to fabricate fibrous scaffolds, has gained popularity in recent years as a method to produce tissue engineered grafts with architectural similarities to the extracellular matrix. Beyond its versatility in material selection, electrospinning also provides many tools to tune the fiber morphology and scaffold geometry. Recent efforts have focused on extending the capabilities of electrospinning to produce scaffolds that better recapitulate tissue properties and enhance regeneration. This review highlights these advancements by providing an overview of the processing variables and setups used to modulate scaffold architecture, discussing strategies to improve cellular infiltration and guide cell behavior, and providing a summary of electrospinning applications in tissue engineering. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 2892-2905, 2017.
Razin, S V
2018-04-01
This issue of Biochemistry (Moscow) is devoted to the cell nucleus and mechanisms of transcription regulation. Over the years, biochemical processes in the cell nucleus have been studied in isolation, outside the context of their spatial organization. Now it is clear that segregation of functional processes within a compartmentalized cell nucleus is very important for the implementation of basic genetic processes. The functional compartmentalization of the cell nucleus is closely related to the spatial organization of the genome, which in turn plays a key role in the operation of epigenetic mechanisms. In this issue of Biochemistry (Moscow), we present a selection of review articles covering the functional architecture of the eukaryotic cell nucleus, the mechanisms of genome folding, the role of stochastic processes in establishing 3D architecture of the genome, and the impact of genome spatial organization on transcription regulation.
Pastur-Romay, Lucas Antón; Cedrón, Francisco; Pazos, Alejandro; Porto-Pazos, Ana Belén
2016-08-11
Over the past decade, Deep Artificial Neural Networks (DNNs) have become the state-of-the-art algorithms in Machine Learning (ML), speech recognition, computer vision, natural language processing and many other tasks. This was made possible by the advancement in Big Data, Deep Learning (DL) and drastically increased chip processing abilities, especially general-purpose graphical processing units (GPGPUs). All this has created a growing interest in making the most of the potential offered by DNNs in almost every field. An overview of the main architectures of DNNs, and their usefulness in Pharmacology and Bioinformatics are presented in this work. The featured applications are: drug design, virtual screening (VS), Quantitative Structure-Activity Relationship (QSAR) research, protein structure prediction and genomics (and other omics) data mining. The future need of neuromorphic hardware for DNNs is also discussed, and the two most advanced chips are reviewed: IBM TrueNorth and SpiNNaker. In addition, this review points out the importance of considering not only neurons, as DNNs and neuromorphic chips should also include glial cells, given the proven importance of astrocytes, a type of glial cell which contributes to information processing in the brain. The Deep Artificial Neuron-Astrocyte Networks (DANAN) could overcome the difficulties in architecture design, learning process and scalability of the current ML methods.
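The review above surveys DNN architectures at a high level; as a concrete illustration (not taken from the paper), the following is a minimal sketch of a small feedforward network for a QSAR-style regression task, written in plain NumPy so that no particular deep-learning framework is assumed. All layer sizes, data, and the learning rate are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical QSAR-style data: 100 molecules, 16 descriptors each,
# with a continuous activity value to regress against.
X = rng.normal(size=(100, 16))
y = rng.normal(size=(100, 1))

# One hidden layer with a ReLU non-linearity; sizes are illustrative only.
W1 = rng.normal(scale=0.1, size=(16, 32)); b1 = np.zeros((1, 32))
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros((1, 1))
lr = 1e-2

for epoch in range(200):
    # Forward pass
    h_pre = X @ W1 + b1
    h = np.maximum(h_pre, 0.0)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass (mean-squared-error gradients)
    g_out = 2.0 * (y_hat - y) / len(X)
    gW2 = h.T @ g_out; gb2 = g_out.sum(axis=0, keepdims=True)
    g_h = (g_out @ W2.T) * (h_pre > 0)
    gW1 = X.T @ g_h;   gb1 = g_h.sum(axis=0, keepdims=True)

    # Gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

The same structure scales to the deeper architectures discussed in the review by stacking additional hidden layers and swapping in a framework with GPU support.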
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This viewgraph presentation reviews the architectural description of the Mission Data Processing and Control System (MPCS). MPCS is an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, with the Mars Science Laboratory (MSL) project as its first target mission. MPCS is designed around these factors: (1) it enables a plug-and-play architecture; (2) it has strong inheritance from GDS components developed for other flight projects (MER, MRO, DAWN, MSAP) that are currently being used in operations and ATLO; and (3) its components are Java-based, platform independent, and designed to consume and produce XML-formatted data.
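Since the MPCS components are described as consuming and producing XML-formatted data in an event-driven fashion, a minimal illustration of that general pattern is sketched below using Python's standard library; the element and attribute names, and the telemetry event itself, are invented for illustration and do not reflect the actual MPCS schema.

import xml.etree.ElementTree as ET

# Hypothetical downlink event; the tag and attribute names are assumptions.
event_xml = """
<telemetryEvent mission="MSL" type="downlink">
  <channel id="TEMP-01" value="23.4" units="degC"/>
  <channel id="VOLT-07" value="28.1" units="V"/>
</telemetryEvent>
"""

def handle_event(xml_text):
    # Parse one XML event and return the channel readings it carries.
    root = ET.fromstring(xml_text)
    readings = {ch.get("id"): float(ch.get("value")) for ch in root.findall("channel")}
    return root.get("type"), readings

event_type, readings = handle_event(event_xml)
print(event_type, readings)   # downlink {'TEMP-01': 23.4, 'VOLT-07': 28.1}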
Sebaa, Abderrazak; Chikh, Fatima; Nouicer, Amina; Tari, AbdelKamel
2018-02-19
The huge increase in medical devices and clinical applications that generate enormous amounts of data has raised a major issue in managing, processing, and mining this data. Indeed, traditional data warehousing frameworks cannot cope with the volume, variety, and velocity of current medical applications. As a result, many data warehouses face difficulties with medical data, and many challenges need to be addressed. New solutions have emerged, and Hadoop is one of the best examples; it can be used to process these streams of medical data. However, without an efficient system design and architecture, this performance will not be significant or valuable for medical managers. In this paper, we provide a short review of the literature on research issues of traditional data warehouses and present some important Hadoop-based data warehouses. In addition, a Hadoop-based architecture and a conceptual data model for designing a medical Big Data warehouse are given. In our case study, we provide implementation details of a big data warehouse based on the proposed architecture and data model on the Apache Hadoop platform to ensure an optimal allocation of health resources.
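As a hedged illustration of how medical records might be aggregated per patient in a Hadoop-style pipeline (the field layout and patient identifiers below are assumptions, not the schema proposed in the paper), the following sketch follows the Hadoop Streaming convention of a mapper and a reducer communicating through tab-separated lines on standard input and output.

# mapper: emits (patient_id, 1) for every diagnostic record seen.
# Assumes one CSV record per line with the patient id in the first column
# (a hypothetical layout used only for illustration).
import sys

def mapper(lines):
    for line in lines:
        patient_id = line.strip().split(",")[0]
        if patient_id:
            print(f"{patient_id}\t1")

# reducer: sums the counts for each patient id; Hadoop Streaming
# delivers lines with the same key contiguously after the shuffle.
def reducer(lines):
    current, total = None, 0
    for line in lines:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)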
Discussion from a Mathematics Education Perspective
ERIC Educational Resources Information Center
Clements, Douglas; Sarama, Julie
2015-01-01
In a review of the special issue, we conclude that the articles are research gems in the domain of preschool mathematics education. Most share several features, such as their perspective on research methodology and their view of mathematics thinking and learning. They address the cognitive architecture and processes and the developmental levels…
Architectures, Representations and Processes of Language Production
ERIC Educational Resources Information Center
Alario, F.-Xavier; Costa, Albert; Ferreira, Victor S.; Pickering, Martin J.
2006-01-01
The authors present an overview of recent research conducted in the field of language production based on papers presented at the first edition of the International Workshop on Language Production (Marseille, France, September 2004). This article comprises two main parts. In the first part, consisting of three sections, the authors review the…
SABRE--A Novel Software Tool for Bibliographic Post-Processing.
ERIC Educational Resources Information Center
Burge, Cecil D.
1989-01-01
Describes the software architecture and application of SABRE (Semi-Automated Bibliographic Environment), which is one of the first products to provide a semi-automatic environment for relevancy ranking of citations obtained from searches of bibliographic databases. Features designed to meet the review, categorization, culling, and reporting needs…
Identity and Access Management: Technological Implementation of Policy
ERIC Educational Resources Information Center
von Munkwitz-Smith, Jeff; West, Ann
2004-01-01
Navigating the multiple processes for accessing ever-multiplying campus information systems can be a daunting task for students, faculty, and staff. This article provides a brief overview of Identity and Access Management Services. The authors review key characteristics and components of this new information architecture and address the issue of…
The role of WOX genes in flower development
Costanzo, Enrico; Trehin, Christophe; Vandenbussche, Michiel
2014-01-01
Background: WOX (Wuschel-like homeobOX) genes form a family of plant-specific HOMEODOMAIN transcription factors, the members of which play important developmental roles in a diverse range of processes. WOX genes were first identified as determining cell fate during embryo development, as well as playing important roles in maintaining stem cell niches in the plant. In recent years, new roles have been identified in plant architecture and organ development, particularly at the flower level. Scope: In this review, the role of WOX genes in flower development and flower architecture is highlighted, as evidenced from data obtained in the last few years. The roles played by WOX genes in different species and different flower organs are compared, and differential functional recruitment of WOX genes during flower evolution is considered. Conclusions: This review compares available data concerning the role of WOX genes in flower and organ architecture among different species of angiosperms, including representatives of monocots and eudicots (rosids and asterids). These comparative data highlight the usefulness of the WOX gene family for evo–devo studies of floral development. PMID:24973416
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-14
... art from living American artists. One-half of one percent of the estimated construction cost of new or... for OMB Review; Art-in-Architecture Program National Artist Registry AGENCY: Public Buildings Service... extension of a previously approved information collection requirement regarding the Art-in-Architecture Program...
Wang, Xiaojian; Xu, Shanqing; Zhou, Shiwei; Xu, Wei; Leary, Martin; Choong, Peter; Qian, M; Brandt, Milan; Xie, Yi Min
2016-03-01
One of the critical issues in orthopaedic regenerative medicine is the design of bone scaffolds and implants that replicate the biomechanical properties of the host bones. Porous metals have proven to be suitable candidates for repairing or replacing damaged bones, since their stiffness and porosity can be adjusted on demand. Another advantage of porous metals lies in their open space for the in-growth of bone tissue, hence accelerating the osseointegration process. The fabrication of porous metals has been extensively explored over decades; however, only limited control over the internal architecture can be achieved by conventional processes. Recent advances in additive manufacturing have provided unprecedented opportunities for producing complex structures to meet the increasing demand for implants with customized mechanical performance. At the same time, topology optimization techniques have been developed to enable the internal architecture of porous metals to be designed to achieve specified mechanical properties at will. Thus implants designed via the topology optimization approach and produced by additive manufacturing are of great interest. This paper reviews the state of the art of topological design and manufacturing processes for various types of porous metals, in particular titanium alloys, biodegradable metals and shape memory alloys. This review also identifies the limitations of current techniques and addresses directions for future investigations. Copyright © 2016 Elsevier Ltd. All rights reserved.
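One widely used first-order relation for tuning the stiffness of such porous (cellular) metals is the Gibson–Ashby scaling law, in which the effective modulus scales roughly with the square of relative density. The short sketch below applies it with illustrative numbers for a titanium alloy; the prefactor C = 1 and the modulus value are assumptions for illustration, not results from the paper, and real lattices require calibration.

# Gibson-Ashby estimate: E_eff / E_solid ~= C * rho_rel**2, with rho_rel = 1 - porosity.
E_solid_GPa = 110.0        # order of magnitude for a Ti alloy (assumption)

def effective_modulus(porosity, C=1.0, E_solid=E_solid_GPa):
    rho_rel = 1.0 - porosity
    return C * rho_rel ** 2 * E_solid

for porosity in (0.5, 0.7, 0.8):
    print(f"porosity {porosity:.0%}: E_eff ~ {effective_modulus(porosity):.1f} GPa")

With these assumed numbers, raising porosity from 50% to 80% drops the estimated stiffness by roughly an order of magnitude, which is the lever that topology optimization exploits to match host bone properties.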
2013-09-01
processes used in space system acquisitions, simply implementing a data exchange specification would not fundamentally improve how information is... instruction, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information... and manage the configuration of all critical program models, processes, and tools used throughout the DoD. Second, mandate a data exchange
Gonzalez, Enrique; Peña, Raul; Avila, Alfonso; Vargas-Rosales, Cesar; Munoz-Rodriguez, David
2017-01-01
The continuous technological advances in favor of mHealth represent a key factor in the improvement of medical emergency services. This systematic review presents the identification, study, and classification of the most up-to-date approaches surrounding the deployment of architectures for mHealth. Our review includes 25 articles obtained from databases such as IEEE Xplore, Scopus, SpringerLink, ScienceDirect, and SAGE. This review focused on studies addressing mHealth systems for outdoor emergency situations. In 60% of the articles, the deployment architecture relied on connective infrastructure associated with emergent technologies such as cloud services, distributed services, Internet-of-things, machine-to-machine, vehicular ad hoc networks, and service-oriented architecture. In 40% of the reviewed literature, the deployment architecture for mHealth relied on traditional connective infrastructure. Only 20% of the studies implemented an energy consumption protocol to extend system lifetime. We concluded that there is a need for more integrated solutions specifically for outdoor scenarios. Energy consumption protocols need to be implemented and evaluated. Emergent connective technologies are redefining information management and overcoming traditional technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhenyu; Dagle, Jeffery E.
2008-07-31
The infrastructure of phasor measurements has evolved over the last two decades from isolated measurement units to networked measurement systems with footprints beyond individual utility companies. This is, to a great extent, a bottom-up, self-evolving process, except for some local systems built by design. Given that the number of phasor measurement units (PMUs) in the system is small (currently 70 each in the western and eastern interconnections), the current phasor network architecture works adequately. However, the architecture will become a bottleneck when large numbers of PMUs are installed (e.g., >1000-10000). The need for phasor architecture design has yet to be addressed. This paper reviews the current phasor networks and investigates future architectures, as related to the efforts undertaken by the North America SynchroPhasor Initiative (NASPI). It then presents staged system tests to evaluate the performance of phasor networks, which is a common practice in the Western Electricity Coordinating Council (WECC) system. This is followed by field measurement evaluation and the implications of phasor quality issues for phasor applications.
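To make the notion of a phasor measurement concrete, the sketch below estimates the magnitude and phase of a 60 Hz waveform from one cycle of samples using a single-bin discrete Fourier transform, which is the textbook core of PMU phasor estimation; the sampling rate, noise level, and signal parameters are illustrative assumptions, not values from any fielded PMU.

import numpy as np

f0 = 60.0                 # nominal grid frequency (Hz)
fs = 1440.0               # 24 samples per cycle (assumed sampling rate)
N = int(fs / f0)          # samples in one nominal cycle
t = np.arange(N) / fs

# Synthetic voltage samples: amplitude 1.0 pu, phase 30 degrees, small noise.
x = 1.0 * np.cos(2 * np.pi * f0 * t + np.deg2rad(30.0)) + 0.01 * np.random.randn(N)

# Single-bin DFT at the fundamental; the 2/N factor recovers the peak amplitude.
phasor = (2.0 / N) * np.sum(x * np.exp(-1j * 2 * np.pi * f0 * t))

print(f"magnitude ~ {abs(phasor):.3f} pu, angle ~ {np.degrees(np.angle(phasor)):.1f} deg")

A networked PMU repeats this estimate many times per second, time-stamps it against GPS, and streams the result to a phasor data concentrator, which is where the architectural questions raised in the paper arise.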
NASA Astrophysics Data System (ADS)
Kwon, Jang Yeon; Kyeong Jeong, Jae
2015-02-01
This review gives an overview of the recent progress in vacuum-based n-type transition metal oxide (TMO) thin film transistors (TFTs). Several excellent review papers regarding metal oxide TFTs in terms of fundamental electronic structure, device processing and reliability have been published. In particular, the required field-effect mobility of TMO TFTs has been increasing rapidly to meet the demands of ultra-high resolution, large panel sizes and three-dimensional visual effects, which are megatrends in flat panel displays such as liquid crystal displays, organic light emitting diodes and flexible displays. In this regard, the effects of the TMO composition on the performance of the resulting oxide TFTs are reviewed and classified into binary, ternary and quaternary composition systems. In addition, new strategic approaches, including zinc oxynitride materials, double channel structures, and composite structures, have been proposed recently and were not covered in detail in previous review papers. Special attention is given to the advanced device architectures of TMO TFTs, such as back-channel-etch and self-aligned coplanar structures, which are key technologies because of their advantages of low-cost fabrication, high driving speed and high-quality imaging free of unwanted visual artifacts. The integration processes and related issues, such as etching, post treatment, low ohmic contact and Cu interconnection, required to realize these advanced architectures are also discussed.
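For readers unfamiliar with how the field-effect mobility quoted throughout this literature is extracted, the following sketch applies the standard saturation-regime transistor relation I_D = (W/2L)·mu·C_ox·(V_GS − V_T)^2: the slope of sqrt(I_D) versus V_GS yields the mobility and its intercept the threshold voltage. All device dimensions, the gate capacitance, and the currents below are hypothetical values chosen only to make the arithmetic concrete.

import numpy as np

# Hypothetical transfer-curve samples in the saturation regime.
V_GS = np.array([6.0, 8.0, 10.0, 12.0])             # gate-source voltage (V)
I_D = np.array([2.0e-6, 8.0e-6, 18.0e-6, 32.0e-6])  # drain current (A)

# Assumed device geometry and gate dielectric (illustrative values only).
W, L = 100e-6, 10e-6                # channel width/length (m)
C_ox = 3.45e-4                      # gate capacitance per area (F/m^2), ~100 nm SiO2

# In saturation, sqrt(I_D) is linear in V_GS; its slope gives the mobility:
# mu_sat = 2L / (W * C_ox) * slope**2, and the x-intercept estimates V_T.
slope, intercept = np.polyfit(V_GS, np.sqrt(I_D), 1)
mu_sat = 2 * L / (W * C_ox) * slope ** 2
V_T = -intercept / slope

print(f"mu_sat ~ {mu_sat * 1e4:.1f} cm^2/Vs, V_T ~ {V_T:.1f} V")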
Satellite on-board processing for earth resources data
NASA Technical Reports Server (NTRS)
Bodenheimer, R. E.; Gonzalez, R. C.; Gupta, J. N.; Hwang, K.; Rochelle, R. W.; Wilson, J. B.; Wintz, P. A.
1975-01-01
Results of a survey of earth resources user applications and their data requirements, earth resources multispectral scanner sensor technology, and preprocessing algorithms for correcting the sensor outputs and for data bulk reduction are presented along with a candidate data format. The computational requirements needed to implement the data analysis algorithms are included, along with a review of computer architectures and organizations. Computer architectures capable of handling the algorithm computational requirements are suggested, and the environmental effects of an on-board processor are discussed. By relating performance parameters to the system requirements of each user, the feasibility of on-board processing is determined for each user application. A tradeoff analysis is performed to determine the sensitivity of the results to each of the system parameters. Significant results and conclusions are discussed, and recommendations are presented.
ERIC Educational Resources Information Center
McMinn, William G.
An evaluation and report was done on the status of programs in architecture and related fields in the Florida State University System as a follow-up to a 1983 evaluation. The evaluation involved self-studies prepared by each program and a series of site visits to each of seven campuses and two centers with programs under review. These institutions…
ERIC Educational Resources Information Center
McMinn, William G.
This report is an update of a report on the development and status of various programs in architecture and related fields in the State University System of Florida, a report that was submitted to the Board of Regents in May 1983. The objectives of this updated report, like those of the earlier one, are to review the anticipated needs of the…
Recent advances in nuclear magnetic resonance quantum information processing.
Criger, Ben; Passante, Gina; Park, Daniel; Laflamme, Raymond
2012-10-13
Quantum information processors have the potential to drastically change the way we communicate and process information. Nuclear magnetic resonance (NMR) has been one of the first experimental implementations of quantum information processing (QIP) and continues to be an excellent testbed to develop new QIP techniques. We review the recent progress made in NMR QIP, focusing on decoupling, pulse engineering and indirect nuclear control. These advances have enhanced the capabilities of NMR QIP, and have useful applications in both traditional NMR and other QIP architectures.
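As a small worked illustration of the pulse-engineering idea mentioned above (not taken from the review), the sketch below applies an ideal resonant pi/2 pulse about the x-axis to a spin starting in |0> and checks that the result is an equal superposition; it uses only NumPy and the standard single-qubit rotation matrix, ignoring the relaxation and inhomogeneity effects that real NMR pulse design must compensate for.

import numpy as np

def rx(theta):
    # Ideal rotation about the x-axis by angle theta (hard-pulse limit).
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s],
                     [-1j * s, c]])

ket0 = np.array([1.0, 0.0], dtype=complex)

# A pi/2 pulse takes |0> to an equal superposition (up to phase),
# the basic building block of NMR pulse sequences.
state = rx(np.pi / 2) @ ket0
print(np.round(np.abs(state) ** 2, 3))   # -> [0.5 0.5]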
Modeling and system design for the LOFAR station digital processing
NASA Astrophysics Data System (ADS)
Alliot, Sylvain; van Veelen, Martijn
2004-09-01
In the context of the LOFAR preliminary design phase, and in particular for the specification of the Station Digital Processing (SDP), a performance/cost model of the system was used. We present here the framework and the trajectory followed in this phase when going from requirements to specification. In the phased array antenna concepts for the next generation of radio telescopes (LOFAR, ATA, SKA), signal processing (multi-beaming and RFI mitigation) replaces the large antenna dishes. The embedded systems for these telescopes are major infrastructure cost items. Moreover, the flexibility and overall performance of the instrument depend greatly on them; therefore, alternative solutions need to be investigated. In particular, the technology and the various data transport selections play a fundamental role in the optimization of the architecture. We proposed a formal method [1] of exploring these alternatives, which was followed during the SDP developments. Different scenarios were compared for the specification of the application (selection of the algorithms as well as detailed signal processing techniques) and for the specification of the system architecture (selection of high-level topologies, platforms and components). This gave us insight into the possible trade-offs in the application and architecture domains and provided a firm basis for the design choices demanded by technical review committees.
NASA Technical Reports Server (NTRS)
Pettit, C. D.; Barkhoudarian, S.; Daumann, A. G., Jr.; Provan, G. M.; ElFattah, Y. M.; Glover, D. E.
1999-01-01
In this study, we proposed an Advanced Health Management System (AHMS) functional architecture and conducted a technology assessment for liquid propellant rocket engine lifecycle health management. The purpose of the AHMS is to improve reusable rocket engine safety and to reduce between-flight maintenance. During the study, past and current reusable rocket engine health management-related projects were reviewed, data structures and health management processes of current rocket engine programs were assessed, and in-depth interviews with rocket engine lifecycle and system experts were conducted. A generic AHMS functional architecture, with primary focus on real-time health monitoring, was developed. Fourteen categories of technology tasks and development needs for implementation of the AHMS were identified, based on the functional architecture and our assessment of current rocket engine programs. Five key technology areas were recommended for immediate development, which (1) would provide immediate benefits to current engine programs, and (2) could be implemented with minimal impact on the current Space Shuttle Main Engine (SSME) and Reusable Launch Vehicle (RLV) engine controllers.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-19
... approve meeting agenda Action item review Sub-Group (SG1--Wake OSED, SG3--Architecture, SG4--DO-252... Architecture(s) AAtS Implementation Guidance Document New Capabilities in Flight Services 1 p.m. Opening... Action item review Future meeting plans and dates Other business 1 p.m. Adjourn (no lunch break) Welcome...
NASA Technical Reports Server (NTRS)
Simon, Donald L.
2010-01-01
Aircraft engine performance trend monitoring and gas path fault diagnostics are closely related technologies that assist operators in managing the health of their gas turbine engine assets. Trend monitoring is the process of monitoring the gradual performance change that an aircraft engine will naturally incur over time due to turbomachinery deterioration, while gas path diagnostics is the process of detecting and isolating the occurrence of any faults impacting engine flow-path performance. Today, performance trend monitoring and gas path fault diagnostic functions are performed by a combination of on-board and off-board strategies. On-board engine control computers contain logic that monitors for anomalous engine operation in real-time. Off-board ground stations are used to conduct fleet-wide engine trend monitoring and fault diagnostics based on data collected from each engine each flight. Continuing advances in avionics are enabling the migration of portions of the ground-based functionality on-board, giving rise to more sophisticated on-board engine health management capabilities. This paper reviews the conventional engine performance trend monitoring and gas path fault diagnostic architecture commonly applied today, and presents a proposed enhanced on-board architecture for future applications. The enhanced architecture gains real-time access to an expanded quantity of engine parameters, and provides advanced on-board model-based estimation capabilities. The benefits of the enhanced architecture include the real-time continuous monitoring of engine health, the early diagnosis of fault conditions, and the estimation of unmeasured engine performance parameters. A future vision to advance the enhanced architecture is also presented and discussed.
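To illustrate the flavor of trend monitoring described above (a sketch under assumed parameters, not the architecture proposed in the paper), the snippet below smooths a per-flight performance delta with an exponentially weighted moving average and flags an abrupt shift that is more consistent with a gas path fault than with gradual deterioration.

# EWMA trend with a simple residual check.
# The deltas, smoothing factor, and threshold are illustrative assumptions.
deltas = [0.00, -0.02, -0.03, -0.05, -0.06, -0.08, -0.45, -0.47]  # per-flight margin change
alpha, threshold = 0.3, 0.2

trend = deltas[0]
for flight, d in enumerate(deltas[1:], start=1):
    residual = d - trend                       # deviation from the smoothed trend
    if abs(residual) > threshold:
        print(f"flight {flight}: possible gas path fault (residual {residual:+.2f})")
    trend = alpha * d + (1 - alpha) * trend    # update gradual-deterioration estimate

A real on-board implementation would replace the scalar EWMA with the model-based estimators the paper discusses, but the split between slow trend tracking and abrupt-shift detection is the same.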
Processing-Related Issues for the Design and Lifing of SiC/SiC Hot-Section Components
NASA Technical Reports Server (NTRS)
DiCarlo, J.; Bhatt, R.; Morscher, G.; Yun, H. M.
2006-01-01
For successful SiC/SiC engine components, numerous process steps related to the fiber, fiber architecture, interphase coating, and matrix need to be optimized. Under recent NASA-sponsored programs, it was determined that many of these steps in their initial approach were inadequate, resulting in less than optimum thermostructural and life properties for the as-fabricated components. This presentation will briefly review many of these process issues, the key composite properties they degrade, their underlying mechanisms, and current process remedies developed by NASA and others.
NASA Technical Reports Server (NTRS)
1983-01-01
The history of NASA's materials processing in space activities is reviewed. Market projections, support requirements, orbital operations issues, cost estimates and candidate systems (orbiter sortie flight, orbiter serviced free flyer, space station, space station serviced free flyer) for the space production of semiconductor crystals are examined. Mission requirements are identified for materials processing, communications missions, bioprocessing, and for transferring aviation maintenance training technology to spacecraft.
Tissue architecture and breast cancer: the role of extracellular matrix and steroid hormones
Hansen, R K; Bissell, M J
2010-01-01
The changes in tissue architecture that accompany the development of breast cancer have been the focus of investigations aimed at developing new cancer therapeutics. As we learn more about the normal mammary gland, we have begun to understand the complex signaling pathways underlying the dramatic shifts in the structure and function of breast tissue. Integrin-, growth factor-, and steroid hormone-signaling pathways all play an important part in maintaining tissue architecture; disruption of the delicate balance of signaling results in dramatic changes in the way cells interact with each other and with the extracellular matrix, leading to breast cancer. The extracellular matrix itself plays a central role in coordinating these signaling processes. In this review, we consider the interrelationships between the extracellular matrix, integrins, growth factors, and steroid hormones in mammary gland development and function. PMID:10903527
The neurobiology of syntax: beyond string sets.
Petersson, Karl Magnus; Hagoort, Peter
2012-07-19
The human capacity to acquire language is an outstanding scientific challenge to understand. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or more precisely, the parser/generator) in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this ability is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, it is necessary to combine and constrain the interpretation of ALL results by theoretical models and empirical studies on natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty.
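The contrast between finite-state and supra-regular structure that drives much of the AGL work discussed here can be made concrete with two toy string sets; the generator below is a hedged illustration of that contrast, not a grammar taken from the cited experiments.

def fsg_strings(n):
    # (AB)^n: generable by a finite-state grammar.
    return "".join("AB" for _ in range(n))

def cfg_strings(n):
    # A^n B^n: requires counting, i.e. at least context-free (supra-regular) power.
    return "A" * n + "B" * n

for n in range(1, 4):
    print(fsg_strings(n), cfg_strings(n))
# AB AB, ABAB AABB, ABABAB AAABBB: both sets alternate locally, but only the
# second enforces a long-distance dependency between the numbers of As and Bs.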
77 FR 39677 - Performance Review Board Membership
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE BOARD Performance Review Board Membership AGENCY: Architectural and Transportation Barriers Compliance Board. ACTION: Notice. SUMMARY: Notice is... Transportation Barriers Compliance Board (Access Board). FOR FURTHER INFORMATION CONTACT: David M. Capozzi...
Quantitative imaging methods in osteoporosis.
Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G
2016-12-01
Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
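As a worked example of the quantitative end point discussed here, the sketch below converts a DXA bone mineral density value into the familiar T-score, (BMD − mean of a young adult reference) / SD of that reference, and applies the conventional cut-off of −2.5 for osteoporosis; the reference mean and SD are illustrative numbers, not values from any specific reference database.

def t_score(bmd, young_ref_mean, young_ref_sd):
    # T-score: patient BMD expressed in SDs relative to a young adult reference.
    return (bmd - young_ref_mean) / young_ref_sd

# Illustrative lumbar-spine values in g/cm^2 (assumptions, not reference data).
bmd_patient, ref_mean, ref_sd = 0.780, 1.040, 0.110

t = t_score(bmd_patient, ref_mean, ref_sd)
print(f"T-score = {t:.1f}")            # about -2.4 with these numbers
print("osteoporosis range" if t <= -2.5 else "osteopenia/normal range")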
Mangan, Hazel; Gailín, Michael Ó; McStay, Brian
2017-12-01
Nucleoli are the sites of ribosome biogenesis and the largest membraneless subnuclear structures. They are intimately linked with growth and proliferation control and function as sensors of cellular stress. Nucleoli form around arrays of ribosomal gene (rDNA) repeats, also called nucleolar organizer regions (NORs). In humans, NORs are located on the short arms of all five human acrocentric chromosomes. Multiple NORs contribute to the formation of the large heterochromatin-surrounded nucleoli observed in most human cells. Here we review recent findings about their genomic architecture. The dynamic nature of nucleoli began to be appreciated with the advent of photodynamic experiments using fluorescent protein fusions. We review more recent data on nucleoli in Xenopus germinal vesicles (GVs), which have revealed a liquid droplet-like behavior that facilitates nucleolar fusion. Further analysis in both Xenopus GVs and Drosophila embryos indicates that the internal organization of nucleoli is generated by a combination of liquid-liquid phase separation and active processes involving rDNA. We attempt to integrate these recent findings with the genomic architecture of human NORs to advance our understanding of how nucleoli form and respond to stress in human cells. © 2017 Federation of European Biochemical Societies.
Kullgren, Jeffrey T; Hafez, Dina; Fedewa, Allison; Heisler, Michele
2017-09-01
The purpose of this paper was to review studies of behavioral economic interventions (financial incentives, choice architecture modifications, or commitment devices) to prevent type 2 diabetes mellitus (T2DM) among at-risk patients or to improve self-management among patients with T2DM. We found 15 studies that used varied study designs and outcomes to test behavioral economic interventions in clinical, workplace, or health plan settings. Of the four studies that focused on prevention of T2DM, two found that financial incentives increased weight loss and completion of a fasting blood glucose test, and two choice architecture modifications had mixed effects in encouraging completion of tests to screen for T2DM. Of the 11 studies that focused on improving self-management of T2DM, four of six tests of financial incentives demonstrated increased engagement in recommended care processes or improved biometric measures, and three of five tests of choice architecture modifications found improvements in self-management behaviors. Though few studies have tested behavioral economic interventions for prevention or treatment of T2DM, those that have suggest such approaches have the potential to improve patient behaviors, and such approaches should be tested more broadly.
The role of WOX genes in flower development.
Costanzo, Enrico; Trehin, Christophe; Vandenbussche, Michiel
2014-11-01
WOX (Wuschel-like homeobOX) genes form a family of plant-specific HOMEODOMAIN transcription factors, the members of which play important developmental roles in a diverse range of processes. WOX genes were first identified as determining cell fate during embryo development, as well as playing important roles in maintaining stem cell niches in the plant. In recent years, new roles have been identified in plant architecture and organ development, particularly at the flower level. In this review, the role of WOX genes in flower development and flower architecture is highlighted, as evidenced from data obtained in the last few years. The roles played by WOX genes in different species and different flower organs are compared, and differential functional recruitment of WOX genes during flower evolution is considered. This review compares available data concerning the role of WOX genes in flower and organ architecture among different species of angiosperms, including representatives of monocots and eudicots (rosids and asterids). These comparative data highlight the usefulness of the WOX gene family for evo-devo studies of floral development. © The Author 2014. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved.
The walk-ride-walk: getting to school safely program
DOT National Transportation Integrated Search
1997-03-27
The National ITS Architecture Team reviewed the ITS Focus Task Force on System Architecture Report, dated May 1997. The comments collected during this review are documented in this summary. Overall, the ITS Focus report reflects a clear underst...
Enhancing pseudocapacitive charge storage in polymer templated mesoporous materials.
Rauda, Iris E; Augustyn, Veronica; Dunn, Bruce; Tolbert, Sarah H
2013-05-21
Growing global energy demands coupled with environmental concerns have increased the need for renewable energy sources. For intermittent renewable sources like solar and wind to become available on demand will require the use of energy storage devices. Batteries and supercapacitors, also known as electrochemical capacitors (ECs), represent the most widely used energy storage devices. Supercapacitors are frequently overlooked as an energy storage technology, however, despite the fact that these devices provide greater power, much faster response times, and longer cycle life than batteries. Their limitation is that the energy density of ECs is significantly lower than that of batteries, and this has limited their potential applications. This Account reviews our recent work on improving pseudocapacitive energy storage performance by tailoring the electrode architecture. We report our studies of mesoporous transition metal oxide architectures that store charge through surface or near-surface redox reactions, a phenomenon termed pseudocapacitance. The faradaic nature of pseudocapacitance leads to significant increases in energy density and thus represents an exciting future direction for ECs. We show that both the choice of material and electrode architecture is important for producing the ideal pseudocapacitor device. Here we first briefly review the current state of electrode architectures for pseudocapacitors, from slurry electrodes to carbon/metal oxide composites. We then describe the synthesis of mesoporous films made with amphiphilic diblock copolymer templating agents, specifically those optimized for pseudocapacitive charge storage. These include films synthesized from nanoparticle building blocks and films made from traditional battery materials. In the case of more traditional battery materials, we focus on using flexible architectures to minimize the strain associated with lithium intercalation, that is, the accumulation of lithium ions or atoms between the layers of cathode or anode materials that occurs as batteries charge and discharge. Electrochemical analysis of these mesoporous films allows for a detailed understanding of the origin of charge storage by separating capacitive contributions from traditional diffusion-controlled intercalation processes. We also discuss methods to separate the two contributions to capacitance: double-layer capacitance and pseudocapacitance. Understanding these contributions should allow the selection of materials with an optimized architecture that maximize the contribution from pseudocapacitance. From our studies, we show that nanocrystal-based nanoporous materials offer an architecture optimized for high levels of redox or surface pseudocapacitance. Interestingly, in some cases, materials engineered to minimize the strain associated with lithium insertion can also show intercalation pseudocapacitance, which is a process where insertion processes become so kinetically facile that they appear capacitive. Finally, we conclude with a summary of simple design rules that should result in high-power, high-energy-density electrode architectures. These design rules include assembling small, nanosized building blocks to maximize electrode surface area; maintaining an interconnected, open mesoporosity to facilitate solvent diffusion; seeking flexibility in electrode structure to facilitate volume expansion during lithium insertion; optimizing crystalline domain size and orientation; and creating effective electron transport pathways.
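The separation of capacitive from diffusion-controlled contributions mentioned in this abstract is commonly performed by assuming that the current at a fixed potential obeys i(v) = k1·v + k2·v^(1/2), where v is the sweep rate, and fitting k1 and k2 across several sweep rates. The sketch below performs that fit on synthetic data; the sweep rates and currents are invented for illustration and are not measurements from the reviewed materials.

import numpy as np

# Sweep rates (mV/s) and currents (mA) at one fixed potential -- synthetic values.
v = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
i = np.array([0.14, 0.26, 0.59, 1.12, 2.14])

# Fit i = k1*v + k2*sqrt(v); k1*v is the capacitive part, k2*sqrt(v) the diffusion part.
A = np.column_stack([v, np.sqrt(v)])
(k1, k2), *_ = np.linalg.lstsq(A, i, rcond=None)

for rate, cur in zip(v, i):
    frac_cap = k1 * rate / cur
    print(f"{rate:5.1f} mV/s: capacitive fraction ~ {frac_cap:.0%}")

Repeating the fit at every potential along the voltammogram yields the capacitive-versus-diffusion maps that underpin the architecture comparisons in this body of work.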
Recent Developments in the Application of Biologically Inspired Computation to Chemical Sensing
NASA Astrophysics Data System (ADS)
Marco, S.; Gutierrez-Gálvez, A.
2009-05-01
Biological olfaction outperforms chemical instrumentation in specificity, response time, detection limit, coding capacity, time stability, robustness, size, power consumption, and portability. This biological function provides outstanding performance due, to a large extent, to the unique architecture of the olfactory pathway, which combines a high degree of redundancy, an efficient combinatorial coding along with unmatched chemical information processing mechanisms. The last decade has witnessed important advances in the understanding of the computational primitives underlying the functioning of the olfactory system. In this work, the state of the art concerning biologically inspired computation for chemical sensing will be reviewed. Instead of reviewing the whole body of computational neuroscience of olfaction, we restrict this review to the application of models to the processing of real chemical sensor data.
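A minimal example of the kind of multivariate processing applied to chemical sensor array data (a generic PCA projection via the SVD, not a model from the olfactory-pathway literature reviewed here) is sketched below; the sensor responses are synthetic.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic responses: 30 presentations of 2 analytes measured by an 8-sensor array.
analyte = rng.integers(0, 2, size=30)
patterns = np.array([[1.0, 0.8, 0.2, 0.1, 0.9, 0.3, 0.5, 0.2],
                     [0.2, 0.1, 0.9, 1.0, 0.3, 0.8, 0.4, 0.7]])
X = patterns[analyte] + 0.05 * rng.normal(size=(30, 8))

# PCA by SVD of the mean-centred data; the first two scores usually
# separate analytes with distinct response patterns.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T
print(scores[:5].round(2))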
Flexible devices: from materials, architectures to applications
NASA Astrophysics Data System (ADS)
Zou, Mingzhi; Ma, Yue; Yuan, Xin; Hu, Yi; Liu, Jie; Jin, Zhong
2018-01-01
Flexible devices, such as flexible electronic devices and flexible energy storage devices, have attracted a significant amount of attention in recent years for their potential applications in modern human lives. The development of flexible devices is moving forward rapidly, as the innovation of methods and manufacturing processes has greatly encouraged the research of flexible devices. This review focuses on advanced materials, architecture designs and abundant applications of flexible devices, and discusses the problems and challenges in current situations of flexible devices. We summarize the discovery of novel materials and the design of new architectures for improving the performance of flexible devices. Finally, we introduce the applications of flexible devices as key components in real life. Project supported by the National Key R&D Program of China (Nos. 2017YFA0208200, 2016YFB0700600, 2015CB659300), the National Natural Science Foundation of China (Nos. 21403105, 21573108), and the Fundamental Research Funds for the Central Universities (No. 020514380107).
Transforming Space Missions into Service Oriented Architectures
NASA Technical Reports Server (NTRS)
Mandl, Dan; Frye, Stuart; Cappelaere, Pat
2006-01-01
This viewgraph presentation reviews the vision of sensor web enablement via a Service Oriented Architecture (SOA). A generic example is given of a user finding a service through the Web and initiating a request for the desired observation. The parts that comprise this system and how they interact are reviewed, along with the advantages of using an SOA.
Residential Solar Design Review: A Manual on Community Architectural Controls and Solar Energy Use.
ERIC Educational Resources Information Center
Jaffe, Martin; Erley, Duncan
Presented are architectural design issues associated with solar energy use, and procedures for design review committees to consider in examining residential solar installation in light of existing aesthetic goals for their communities. Recommended design review criteria include the type of solar system being used and the ways in which the system…
['FLEXIBLE WALLS' IN HOSPITALS - ASSESSING THE 'VALUE' OF SOCIAL IMPACT ON ARCHITECTURE].
Tal, Orna; Tal, Shy-Lee
2018-05-01
The development of hospital architecture is influenced by social trends, with mutual influence. Architecture enables 'organic design' that leads to development, growth and adaptation of the structure to changing functions. A literature review reveals different perceptions of the flexibility of adapting hospital structures to changing needs, focusing on external pressures (expensive technologies, budgetary constraints limiting the implementation of innovation, and regulatory barriers) as well as patients' demands. The degree to which structural changes contribute to the measured or perceived benefit to patients and staff has not yet been fully assessed. Expressions of this benefit include infection control and increased operational efficiency through energy saving and sustainability. The aim was to examine workers' perceptions of value-based architecture in relation to the patient or staff in a hospital setting. A survey was conducted, using a structured questionnaire, among health care workers who had undergone management training. Sixty respondents ranked hospital leadership and the relevant professionals (engineers and architects) as key players in the decision to change architecture in a hospital; economists, doctors and nurses were ranked as less important, while patients and families were ranked lowest. Among the factors that contribute to the 'value' of the decision were the agility to adapt to emergencies and to changing morbidity trends in an efficient way. Factors ranked as being of medium importance were the contribution to hospital profitability and, to a lesser extent, the contribution to branding and improved service. 'Flexible walls' (shifting rooms between departments according to clinical need) can provide a response to morbidity changes. Hospital workers can play a role in the process of value-based architecture, thereby improving decisions concerning hospital construction and increasing their commitment to additional quality processes.
Enterprise Information Architecture for Mission Development
NASA Technical Reports Server (NTRS)
Dutra, Jayne
2007-01-01
This slide presentation reviews the concept of an information architecture to assist in mission development. The integrated information architecture will create a unified view of the information using metadata and values (i.e., a taxonomy).
Challenges and Approaches to Make Multidisciplinary Team Meetings Interoperable - The KIMBo Project.
Krauss, Oliver; Holzer, Karl; Schuler, Andreas; Egelkraut, Reinhard; Franz, Barbara
2017-01-01
Multidisciplinary team meetings (MDTMs) are already in use for certain areas in healthcare (e.g. treatment of cancer). Due to the lack of common standards and accessibility for the applied IT systems, their potential is not yet completely exploited. Common requirements for MDTMs shall be identified and aggregated into a process definition to be automated by an application architecture utilizing modern standards in electronic healthcare, e.g. HL7 FHIR. To identify requirements, an extensive literature review as well as semi-structured expert interviews were conducted. Results showed that interoperability and flexibility in terms of the process are key requirements to be addressed. An architecture blueprint as well as an aggregated process definition were derived from the insights gained. To evaluate the feasibility of identified requirements, methods of explorative prototyping in software engineering were used. MDTMs will become an important part of modern and future healthcare, but the need for standardization in terms of interoperability is imminent.
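As a rough illustration of the kind of standards-based exchange such an architecture relies on, the sketch below retrieves a Patient resource from an HL7 FHIR server over its REST interface. The server base URL, patient ID, and use of Python's requests library are illustrative assumptions, not details from the KIMBo project.

    # Minimal sketch: reading a Patient resource via the HL7 FHIR REST API.
    # The base URL and patient ID are hypothetical placeholders.
    import requests

    FHIR_BASE = "https://example-fhir-server.org/baseR4"  # assumed FHIR R4 endpoint

    def fetch_patient(patient_id: str) -> dict:
        """Fetch a single Patient resource as JSON (FHIR R4)."""
        response = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={"Accept": "application/fhir+json"},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        patient = fetch_patient("example-id")
        print(patient.get("resourceType"), patient.get("id"))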
Glaser, John P
2008-01-01
Partners Healthcare and its affiliated hospitals have a long track record of accomplishments in clinical information systems implementations and research. Seven ideas have shaped the information systems strategies and tactics at Partners: centrality of processes, organizational partnerships, progressive incrementalism, agility, architecture, embedded research, and engaging the field. This article reviews these ideas and discusses the rationale and steps taken to put them into practice.
Services for Graduate Students: A Review of Academic Library Web Sites
ERIC Educational Resources Information Center
Rempel, Hannah Gascho
2010-01-01
A library's Web site is well recognized as the gateway to the library for the vast majority of users. Choosing the most user-friendly Web architecture to reflect the many services libraries offer is a complex process, and librarians are still experimenting to find what works best for their users. As part of a redesign of the Oregon State…
Remote Blood Glucose Monitoring in mHealth Scenarios: A Review.
Lanzola, Giordano; Losiouk, Eleonora; Del Favero, Simone; Facchinetti, Andrea; Galderisi, Alfonso; Quaglini, Silvana; Magni, Lalo; Cobelli, Claudio
2016-11-24
Glucose concentration in the bloodstream is a critical vital parameter and effective monitoring of this quantity is crucial for diabetes treatment and intensive care management. Effective bio-sensing technology and advanced signal processing are therefore of unquestioned importance for blood glucose monitoring. Nevertheless, collecting measurements only represents part of the process, as another critical task involves delivering the collected measures to the treating specialists and caregivers. These include the clinical staff, the patient's significant other, his/her family members, and many other actors helping with the patient's treatment who may be located far away from him/her. In all of these cases, a remote monitoring system, in charge of delivering the relevant information to the right player, becomes an important part of the sensing architecture. In this paper, we review how remote monitoring architectures have evolved over time, paralleling the progress in Information and Communication Technologies, and describe our experiences with the design of telemedicine systems for blood glucose monitoring in three medical applications. The paper ends by summarizing the lessons learned through the experiences of the authors and discussing the challenges arising from a large-scale integration of sensors and actuators.
Disease Modeling Using 3D Organoids Derived from Human Induced Pluripotent Stem Cells.
Ho, Beatrice Xuan; Pek, Nicole Min Qian; Soh, Boon-Seng
2018-03-21
The rising interest in human induced pluripotent stem cell (hiPSC)-derived organoid culture has stemmed from the manipulation of various combinations of directed multi-lineage differentiation and morphogenetic processes that mimic organogenesis. Organoids are three-dimensional (3D) structures that are comprised of multiple cell types, self-organized to recapitulate embryonic and tissue development in vitro. This model has been shown to be superior to conventional two-dimensional (2D) cell culture methods in mirroring functionality, architecture, and geometric features of tissues seen in vivo. This review serves to highlight recent advances in the 3D organoid technology for use in modeling complex hereditary diseases, cancer, host-microbe interactions, and possible use in translational and personalized medicine where organoid cultures were used to uncover diagnostic biomarkers for early disease detection via high throughput pharmaceutical screening. In addition, this review also aims to discuss the advantages and shortcomings of utilizing organoids in disease modeling. In summary, studying human diseases using hiPSC-derived organoids may better illustrate the processes involved due to similarities in the architecture and microenvironment present in an organoid, which also allows drug responses to be properly recapitulated in vitro.
Space station needs, attributes and architectural options study. Final executive review
NASA Technical Reports Server (NTRS)
1983-01-01
Identification and validation of missions, the benefits of manned presence in space, attributes and architectures, space station requirements, orbit selection, space station architectural options, technology selection, and program planning are addressed.
A Service Oriented Architecture to Enable Sensor Webs
NASA Technical Reports Server (NTRS)
Sohlberg, Rob; Frye, Stu; Cappelaere, Pat; Ungar, Steve; Ames, Troy; Chien, Steve
2006-01-01
This viewgraph presentation reviews the development of a Service Oriented Architecture to assist in lowering the cost of new Earth Science products. This architecture will enable rapid and cost-effective reconfiguration of new sensors.
2017-01-01
The review is devoted to the physical, chemical, and technological aspects of the breath-figure self-assembly process. The main stages of the process and the impact of the polymer architecture and physical parameters of breath-figure self-assembly on the eventual pattern are covered. The review is focused on the hierarchy of spatial and temporal scales inherent to breath-figure self-assembly. Multi-scale patterns arising from the process are addressed. The characteristic spatial lateral scales of patterns vary from nanometers to dozens of micrometers. The temporal scale of the process spans from microseconds to seconds. The qualitative analysis performed in the paper demonstrates that the process is mainly governed by interfacial phenomena, whereas the impact of inertia and gravity is negligible. Characterization and applications of polymer films manufactured with breath-figure self-assembly are discussed. PMID:28813026
NASA Astrophysics Data System (ADS)
Zaharov, A. A.; Nissenbaum, O. V.; Ponomaryov, K. Y.; Nesgovorov, E. S.
2018-01-01
In this paper we study the application of the Internet of Things (IoT) concept and devices to securing automated process control systems. We review different approaches to IoT architecture and design and propose them for several applications in the security of automated process control systems. We consider attribute-based encryption in the context of implementing an access control mechanism and propose a secret key distribution scheme between attribute authorities and end devices.
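The sketch below is only a conceptual illustration of the attribute-based access control idea: a policy is expressed over device attributes and an action is allowed only if the attribute set satisfies the policy. It deliberately omits the actual attribute-based encryption cryptography and the key distribution between attribute authorities and end devices discussed in the paper; all names and attributes are hypothetical.

    # Conceptual sketch of attribute-based access control (no real cryptography).
    # A policy is a boolean predicate over attributes; access is granted only if
    # the requesting device's attribute set satisfies it.
    from typing import Callable, Set

    Policy = Callable[[Set[str]], bool]

    def policy_and(*terms: str) -> Policy:
        return lambda attrs: all(t in attrs for t in terms)

    def policy_or(*terms: str) -> Policy:
        return lambda attrs: any(t in attrs for t in terms)

    def grant_access(device_attributes: Set[str], policy: Policy) -> bool:
        """Return True if the device's attributes satisfy the access policy."""
        return policy(device_attributes)

    # Example: only PLCs in the "boiler" segment operated by engineers may write setpoints.
    write_setpoint_policy = policy_and("role:engineer", "segment:boiler", "type:plc")

    print(grant_access({"role:engineer", "segment:boiler", "type:plc"}, write_setpoint_policy))  # True
    print(grant_access({"role:operator", "segment:boiler", "type:hmi"}, write_setpoint_policy))  # False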
Regulation of traffic and organelle architecture of the ER-Golgi interface by signal transduction.
Tillmann, Kerstin D; Millarte, Valentina; Farhan, Hesso
2013-09-01
The components that control trafficking between organelles of the secretory pathway as well as their architecture were uncovered to a reasonable extent in the past decades. However, only recently did we begin to explore the regulation of the secretory pathway by cellular signaling. In the current review, we focus on trafficking between the endoplasmic reticulum and the Golgi apparatus. We highlight recent advances that have been made toward a better understanding of how the secretory pathway is regulated by signaling and discuss how this knowledge is important to obtain an integrative view of secretion in the context of other homeostatic processes such as growth and proliferation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okhravi, Hamed; Sheldon, Frederick T.; Haines, Joshua
Data diodes provide protection of critical cyber assets by means of physically enforcing traffic direction on the network. In order to deploy data diodes effectively, it is imperative to understand the protection they provide, the protection they do not provide, their limitations, and their place in the larger security infrastructure. In this work, we study data diodes, their functionalities and limitations. We then propose two critical infrastructure systems that can benefit from the additional protection offered by data diodes: process control networks and net-centric cyber decision support systems. We review the security requirements of these systems, describe the architectures, and study the trade-offs. Finally, the architectures are evaluated against different attack patterns.
INO340 telescope control system: middleware requirements, design, and evaluation
NASA Astrophysics Data System (ADS)
Shalchian, Hengameh; Ravanmehr, Reza
2016-07-01
The INO340 Control System (INOCS) is being designed in terms of a distributed real-time architecture. The real-time (soft and firm) nature of many processes inside INOCS causes the communication paradigm between its different components to be time-critical and sensitive. For this purpose, we have chosen the Data Distribution Service (DDS) standard as the communications middleware which is itself based on the publish-subscribe paradigm. In this paper, we review and compare the main middleware types, and then we illustrate the middleware architecture of INOCS and its specific requirements. Finally, we present the experimental results, performed to evaluate our middleware in order to ensure that it meets our requirements.
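As a paradigm-level illustration only, the sketch below shows the publish-subscribe pattern that DDS builds on: publishers write samples to a named topic and all registered subscribers receive them. It does not use the DDS API itself and is not part of INOCS; real DDS adds discovery, quality-of-service policies, and transport.

    # Minimal publish-subscribe sketch illustrating the paradigm behind DDS.
    from collections import defaultdict
    from typing import Any, Callable, DefaultDict, List

    class Broker:
        def __init__(self) -> None:
            self._subscribers: DefaultDict[str, List[Callable[[Any], None]]] = defaultdict(list)

        def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
            self._subscribers[topic].append(callback)

        def publish(self, topic: str, sample: Any) -> None:
            for callback in self._subscribers[topic]:
                callback(sample)

    broker = Broker()
    broker.subscribe("telescope/azimuth", lambda deg: print(f"axis controller got azimuth {deg:.3f} deg"))
    broker.subscribe("telescope/azimuth", lambda deg: print(f"logger recorded azimuth {deg:.3f} deg"))
    broker.publish("telescope/azimuth", 182.504)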
Learning and Tuning of Fuzzy Rules
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.
1997-01-01
In this chapter, we review some of the current techniques for learning and tuning fuzzy rules. For clarity, we refer to the process of generating rules from data as the learning problem and distinguish it from tuning an already existing set of fuzzy rules. For learning, we touch on unsupervised learning techniques such as fuzzy c-means, fuzzy decision tree systems, fuzzy genetic algorithms, and linear fuzzy rules generation methods. For tuning, we discuss Jang's ANFIS architecture and Berenji-Khedkar's GARIC architecture and its extensions in GARIC-Q. We show that hybrid techniques capable of learning and tuning fuzzy rules, such as CART-ANFIS, RNN-FLCS, and GARIC-RB, are desirable in the development of a number of future intelligent systems.
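Of the learning techniques listed, fuzzy c-means is the most compact to sketch. The code below is a minimal NumPy implementation of the standard algorithm (alternating membership and center updates with fuzzifier m); it is an illustrative sketch, not code from the chapter.

    # Minimal fuzzy c-means sketch (standard algorithm, illustrative only).
    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
        """Cluster rows of X into c fuzzy clusters; returns (centers, memberships)."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        U = rng.random((n, c))
        U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
        for _ in range(max_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            dist = np.fmax(dist, 1e-12)            # avoid division by zero
            inv = dist ** (-2.0 / (m - 1.0))
            U_new = inv / inv.sum(axis=1, keepdims=True)
            if np.abs(U_new - U).max() < tol:
                U = U_new
                break
            U = U_new
        return centers, U

    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + [4.0, 4.0]])
    centers, U = fuzzy_c_means(X, c=2)
    print(np.round(centers, 2))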
NASA Astrophysics Data System (ADS)
Kozicki, Janek; Kozicka, Joanna
Human missions to Mars are a special kind of space mission due to their long duration. The human aspect of such missions becomes as important as the technological one. The need for a human-friendly and comfortable habitat arises. Studies of human behavior in ICEs have shown that larger groups of people mean a lower occurrence of conflicts. However, for a larger crew a larger habitat has to be designed - a Martian base. The research deals with psychological, sociological and technological aspects influencing the architectural design of a Martian base. Extreme conditions present on Mars demand a particular approach to technological and architectural design. To reduce the cost of building a bigger habitat, low-cost solutions have been inquired into. A series of analyses has been performed to identify the best architectural solutions for a Martian base. A review of existing technologies and extreme condition habitats (both terrestrial and extraterrestrial) has revealed solutions that are the most reliable and efficient ones. Additionally, innovative technologies have been analyzed in search of the best candidates for actual base construction. Low-cost solutions have been prioritized in the process. An in-depth study of architectural problems inherent in the design of a Martian base has resulted in a number of guidelines for the architect. The main ones are introduced in this review. Based on them, several concepts have been drafted as examples of user-friendly and aesthetically pleasing habitats. They are discussed in the following order: habitats made of domes, those built around greenhouses and those situated in sloping terrain. One of them is presented in detail, including interior design.
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This paper presents an overview of the application of the Space Generic Open Avionics Architecture (SGOAA) to the Space Shuttle Data Processing System (DPS) architecture design. This application has been performed to validate the SGOAA and its potential use in flight critical systems. The paper summarizes key elements of the Space Shuttle avionics architecture, data processing system requirements and software architecture as currently implemented. It then summarizes the SGOAA architecture and describes a tailoring of the SGOAA to the Space Shuttle. The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, a six-class model of interfaces and functional subsystem architectures for data services and operations control capabilities. It has been proposed as an avionics architecture standard with the National Aeronautics and Space Administration (NASA), through its Strategic Avionics Technology Working Group, and is being considered by the Society of Automotive Engineers (SAE) as an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division of JSC by the Lockheed Engineering and Sciences Company, Houston, Texas.
Imran, Noreen; Seet, Boon-Chong; Fong, A C M
2015-01-01
Distributed video coding (DVC) is a relatively new video coding architecture originating from two fundamental theorems, namely Slepian-Wolf and Wyner-Ziv. Recent research developments have made DVC attractive for applications in the emerging domain of wireless video sensor networks (WVSNs). This paper reviews the state-of-the-art DVC architectures with a focus on understanding their opportunities and gaps in addressing the operational requirements and application needs of WVSNs.
Nørnberg, Trine Riebeling; Houlby, Louise; Skov, Laurits Rohden; Peréz-Cueto, Federico Jose Armando
2016-05-01
The primary objective of this review is to assess the prevalence and quality of published studies on the effect of choice architectural nudge interventions promoting vegetable consumption among adolescents. Additionally, this review aims to identify studies estimating adolescents' attitude towards choice architectural nudge interventions. Web of Science, Scopus and PubMed were searched systematically for experimental studies with a predefined search strategy in the period November-December 2013. Publications were included following predetermined inclusion criteria. Studies were evaluated as of high, moderate or weak quality. Finally, studies were grouped by the type of intervention and underwent a narrative synthesis. The search showed that only very few studies investigated the effects of choice architectural nudging interventions on vegetable consumption, and none of them had attitude towards behavioural interventions as an outcome measure. Twelve studies met the inclusion criteria. The results of the 12 studies were inconclusive, and the majority of studies were of weak or moderate quality. This review uncovers a gap in knowledge on the effect of choice architectural nudge interventions aiming to promote the intake of vegetables among adolescents in a school context. It also highlights that no previous studies have considered the attitudes towards choice architectural nudge interventions as a potential factor for their success - or lack thereof - in achieving the desired goal of increased vegetable consumption. © Royal Society for Public Health 2015.
What we have learned: the impact of quality from a clinical trials perspective
FitzGerald, T. J.
2011-01-01
In this review article we address radiation oncology process improvements in clinical trials and review how these changes improve quality for the next generation of trials. In recent years we have progressed from a time of limited data acquisition to the present, in which we have real-time influence on clinical trial quality. This enables immediate availability of the important elements, including staging, eligibility, response and outcome, for all trial investigators. Modern informatics platforms are well designed for future adaptive clinical trials. We review what will be needed in the informatics architecture of current and future clinical trials. PMID:22177875
Trinczek, B.; Köpcke, F.; Leusch, T.; Majeed, R.W.; Schreiweis, B.; Wenk, J.; Bergh, B.; Ohmann, C.; Röhrig, R.; Prokosch, H.U.; Dugas, M.
2014-01-01
Objective: (1) To define features and data items of a Patient Recruitment System (PRS); (2) to design a generic software architecture of such a system covering the requirements; (3) to identify implementation options available within different Hospital Information System (HIS) environments; (4) to implement five PRS following the architecture and utilizing the implementation options as proof of concept. Methods: Existing PRS were reviewed and interviews with users and developers conducted. All reported PRS features were collected and prioritized according to their published success and user's request. Common feature sets were combined into software modules of a generic software architecture. Data items to process and transfer were identified for each of the modules. Each site collected implementation options available within their respective HIS environment for each module, provided a prototypical implementation based on available implementation possibilities and supported the patient recruitment of a clinical trial as a proof of concept. Results: 24 commonly reported and requested features of a PRS were identified, 13 of them prioritized as being mandatory. A UML version 2 based software architecture containing 5 software modules covering these features was developed. 13 data item groups processed by the modules, thus required to be available electronically, have been identified. Several implementation options could be identified for each module, most of them being available at multiple sites. Utilizing available tools, a PRS could be implemented in each of the five participating German university hospitals. Conclusion: A set of required features and data items of a PRS has been described for the first time. The software architecture covers all features in a clear, well-defined way. The variety of implementation options and the prototypes show that it is possible to implement the given architecture in different HIS environments, thus enabling more sites to successfully support patient recruitment in clinical trials. PMID:24734138
Marceglia, S; Fontelo, P; Rossi, E; Ackerman, M J
2015-01-01
Mobile health Applications (mHealth Apps) are opening the way to patients' responsible and active involvement with their own healthcare management. However, apart from Apps allowing patients' access to their electronic health records (EHRs), mHealth Apps are currently developed as dedicated "island systems". Although much work has been done on patients' access to EHRs, transfer of information from mHealth Apps to EHR systems is still low. This study proposes a standards-based architecture that can be adopted by mHealth Apps to exchange information with EHRs to support better quality of care. Following the definition of requirements for the EHR/mHealth App information exchange recently proposed, and after reviewing current standards, we designed the architecture for EHR/mHealth App integration. Then, as a case study, we modeled a system based on the proposed architecture aimed to support home monitoring for congestive heart failure patients. We simulated this process using, on the EHR side, OpenMRS, an open source longitudinal EHR and, on the mHealth App side, the iOS platform. The integration architecture was based on the bi-directional exchange of standard documents (Clinical Document Architecture Release 2, CDA2). In the process, the clinician "prescribes" the home monitoring procedures by creating a CDA2 prescription in the EHR that is sent, encrypted and de-identified, to the mHealth App to create the monitoring calendar. At the scheduled time, the App alerts the patient to start the monitoring. After the measurements are done, the App generates a structured CDA2-compliant monitoring report and sends it to the EHR, thus avoiding local storage. The proposed architecture, even if validated only in a simulation environment, represents a step forward in the integration of personal mHealth Apps into the larger health-IT ecosystem, allowing bi-directional data exchange between patients and healthcare professionals, and supporting the patient's engagement in self-management and self-care.
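To make the bi-directional document flow concrete, here is a deliberately simplified sketch of the App-side step: turning a set of home measurements into a minimal CDA-style XML report that would be sent back to the EHR. Element names and the example payload are hypothetical placeholders; a real CDA R2 document carries far more required structure (templates, codes, identifiers) than shown here.

    # Simplified sketch of the mHealth-App side: wrap home monitoring measurements
    # into a minimal CDA-style XML report. Element names are placeholders only.
    import xml.etree.ElementTree as ET
    from datetime import datetime, timezone

    def build_monitoring_report(patient_ref: str, observations: list[dict]) -> bytes:
        doc = ET.Element("ClinicalDocument")
        ET.SubElement(doc, "effectiveTime").text = datetime.now(timezone.utc).isoformat()
        ET.SubElement(doc, "recordTarget").text = patient_ref  # de-identified reference, not a name
        body = ET.SubElement(doc, "structuredBody")
        for obs in observations:
            entry = ET.SubElement(body, "observation")
            ET.SubElement(entry, "code").text = obs["code"]     # e.g. body weight, heart rate
            ET.SubElement(entry, "value").text = str(obs["value"])
            ET.SubElement(entry, "unit").text = obs["unit"]
        return ET.tostring(doc, encoding="utf-8")

    report = build_monitoring_report(
        "patient-token-123",
        [{"code": "body-weight", "value": 82.4, "unit": "kg"},
         {"code": "heart-rate", "value": 71, "unit": "/min"}],
    )
    print(report.decode())  # in the architecture this payload would be encrypted and sent to the EHR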
Three Program Architecture for Design Optimization
NASA Technical Reports Server (NTRS)
Miura, Hirokazu; Olson, Lawrence E. (Technical Monitor)
1998-01-01
In this presentation, I would like to review the historical perspective on the program architectures used to build design optimization capabilities based on mathematical programming and other numerical search techniques. It is rather straightforward to classify the program architectures into the three categories shown above. However, the relative importance of each of the three approaches has not been static; it has changed dynamically as the capabilities of available computational resources have increased. For example, we once considered that the direct coupling architecture would never be used for practical problems, but the availability of computer systems such as multi-processor machines has changed this view. In this presentation, I review the roles of the three architectures from historical as well as current and future perspectives. There may also be some possibility for the emergence of hybrid architectures. I hope to provide some seeds for an active discussion of where we are heading in the very dynamic environment of high-speed computing and communication.
Space Communications Capability Roadmap Interim Review
NASA Technical Reports Server (NTRS)
Spearing, Robert; Regan, Michael
2005-01-01
Contents include the following: Identify the need for a robust communications and navigation architecture for the success of exploration and science missions. Describe an approach for specifying architecture alternatives and analyzing them. Establish a top level architecture based on a network of networks. Identify key enabling technologies. Synthesize capability, architecture and technology into an initial capability roadmap.
Jupiter Europa Orbiter Architecture Definition Process
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Shishko, Robert
2011-01-01
The proposed Jupiter Europa Orbiter mission, planned for launch in 2020, is using a new architectural process and framework tool to drive its model-based systems engineering effort. The process focuses on getting the architecture right before writing requirements and developing a point design. A new architecture framework tool provides for the structured entry and retrieval of architecture artifacts based on an emerging architecture meta-model. This paper describes the relationships among these artifacts and how they are used in the systems engineering effort. Some early lessons learned are discussed.
Nanomechanical architecture of semiconductor nanomembranes.
Huang, Minghuang; Cavallo, Francesca; Liu, Feng; Lagally, Max G
2011-01-01
Semiconductor nanomembranes are single-crystal sheets with thickness ranging from 5 to 500 nm. They are flexible, bondable, and mechanically ultra-compliant. They present a new platform to combine bottom-up and top-down semiconductor processing to fabricate various three-dimensional (3D) nanomechanical architectures, with an unprecedented level of control. The bottom-up part is the self-assembly, via folding, rolling, bending, curling, or other forms of shape change of the nanomembranes, with top-down patterning providing the starting point for these processes. The self-assembly to form 3D structures is driven by elastic strain relaxation. A variety of structures, including tubes, rings, coils, rolled-up "rugs", and periodic wrinkles, has been made by such self-assembly. Their geometry and unique properties suggest many potential applications. In this review, we describe the design of desired nanostructures based on continuum mechanics modelling, definition and fabrication of 2D strained nanomembranes according to the established design, and release of the 2D strained sheet into a 3D or quasi-3D object. We also describe several materials properties of nanomechanical architectures. We discuss potential applications of nanomembrane technology to implement simple and hybrid functionalities.
An Evolutionary Optimization Framework for Neural Networks and Neuromorphic Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuman, Catherine D; Plank, James; Disney, Adam
2016-01-01
As new neural network and neuromorphic architectures are being developed, new training methods that operate within the constraints of the new architectures are required. Evolutionary optimization (EO) is a convenient training method for new architectures. In this work, we review a spiking neural network architecture and a neuromorphic architecture, and we describe an EO training framework for these architectures. We present the results of this training framework on four classification data sets and compare those results to other neural network and neuromorphic implementations. We also discuss how this EO framework may be extended to other architectures.
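As a toy illustration of using evolutionary optimization to train network weights under a fixed architecture (not the framework described in the report), the sketch below evolves the weights of a single-hidden-layer network on a small binary classification task using Gaussian mutation and truncation selection.

    # Toy sketch: evolutionary optimization of fixed-architecture neural network weights.
    # Illustrates the general idea (mutate, evaluate, select), not the report's EO framework.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)            # simple XOR-like labels

    N_HIDDEN, POP, GENS, SIGMA = 8, 40, 200, 0.1
    n_params = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1    # W1, b1, w2, b2

    def forward(params, X):
        W1 = params[:2 * N_HIDDEN].reshape(2, N_HIDDEN)
        b1 = params[2 * N_HIDDEN:3 * N_HIDDEN]
        w2 = params[3 * N_HIDDEN:4 * N_HIDDEN]
        b2 = params[-1]
        h = np.tanh(X @ W1 + b1)
        return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

    def fitness(params):
        return -np.mean((forward(params, X) - y) ** 2)   # higher is better

    population = rng.normal(scale=0.5, size=(POP, n_params))
    for _ in range(GENS):
        scores = np.array([fitness(p) for p in population])
        parents = population[np.argsort(scores)[-POP // 4:]]       # truncation selection
        population = (parents[rng.integers(len(parents), size=POP)]
                      + rng.normal(scale=SIGMA, size=(POP, n_params)))  # Gaussian mutation

    best = max(population, key=fitness)
    accuracy = np.mean((forward(best, X) > 0.5) == y)
    print(f"training accuracy: {accuracy:.2f}")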
Role of System Architecture in Architecture in Developing New Drafting Tools
NASA Astrophysics Data System (ADS)
Sorguç, Arzu Gönenç
In this study, the impact of information technologies on the architectural design process is discussed. In this discussion, first the differences/nuances between the concepts of software engineering and system architecture are clarified. Then, the design process in engineering and the design process in architecture are compared by considering 3-D models as the center of the design process through which the other disciplines become involved in the design. It is pointed out that in many high-end engineering applications, 3-D solid models, and consequently the digital mock-up concept, have become common practice. But architecture, as one of the important customers of the CAD systems employing these tools, has not started to use these 3-D models. It is shown that the reason for this time lag between architecture and engineering lies in the tradition of design attitudes. Therefore, a new design scheme, a meta-model, is proposed to develop an integrated design model centered on the 3-D model. A system architecture is also proposed to achieve the transformation of the architectural design process by replacing 2-D thinking with 3-D thinking. In the proposed system architecture, CAD systems are included and adapted for 3-D architectural design in order to provide interfaces for the integration of all relevant disciplines into the design process. It is also shown that such a change will allow the intelligent or smart building concept to be elaborated in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, M.R.
1991-02-01
In recent years the NASA Langley Research Center has funded several contractors to conduct conceptual designs defining architectures for fault tolerant computer systems. Such a system is referred to as a Multi-Path Redundant Avionics Suite (MPRAS), and would form the basis for avionics systems that would be used in future families of space vehicles in a variety of missions. The principal contractors were General Dynamics, Boeing, and Draper Laboratories. These contractors participated in a series of review meetings, and submitted final reports defining their candidate architectures. NASA then commissioned the Research Triangle Institute (RTI) to perform an assessment of these architectures to identify strengths and weaknesses of each. This report is a separate, independent review of the RTI assessment, done primarily to assure that the assessment was comprehensive and objective. The report also includes general recommendations relative to further MPRAS development.
FTA Transit Intelligent Transportation System Architecture Consistency Review - 2010 Update
DOT National Transportation Integrated Search
2011-07-01
This report provides an assessment on the level of compliance among the FTA grantees with the National ITS Architecture Policy, specifically examining three items: 1. The use and maintenance of Regional ITS Architectures by transit agencies to plan, ...
NASA Technical Reports Server (NTRS)
Cohen, Marc M. (Editor); Eichold, Alice (Editor); Heers, Susan (Editor)
1987-01-01
Articles are presented on a space station architectural elements model study, space station group activities habitability module study, full-scale architectural simulation techniques for space stations, and social factors in space station interiors.
Air Traffic Control: Complete and Enforced Architecture Needed for FAA Systems Modernization
DOT National Transportation Integrated Search
1997-02-01
Because of the size, complexity, and importance of FAA's air traffic control : (ATC) modernization, the General Accounting Office (GAO) reviewed it to : determine (1) whether FAA has a target architecture(s), and associated : subarchitectures, to gui...
The role of neuroimaging in the discovery of processing stages. A review.
Mulder, G; Wijers, A A; Lange, J J; Buijink, B M; Mulder, L J; Willemsen, A T; Paans, A M
1995-11-01
In this contribution we show how neuroimaging methods can augment behavioural methods to discover processing stages. Event Related Brain Potentials (ERPs), Brain Electrical Source Analysis (BESA) and regional changes in cerebral blood flow (rCBF) do not necessarily require behavioural responses. With the aid of rCBF we are able to discover several cortical and subcortical brain systems (processors) active in selective attention and memory search tasks. BESA describes cortical activity with high temporal resolution in terms of a limited number of neural generators within these brain systems. The combination of behavioural methods and neuroimaging provides a picture of the functional architecture of the brain. The review is organized around three processors: the Visual, Cognitive and Manual Motor Processors.
Real-Time and Secure Wireless Health Monitoring
Dağtaş, S.; Pekhteryev, G.; Şahinoğlu, Z.; Çam, H.; Challa, N.
2008-01-01
We present a framework for a wireless health monitoring system using wireless networks such as ZigBee. Vital signals are collected and processed using a 3-tiered architecture. The first stage is the mobile device carried on the body that runs a number of wired and wireless probes. This device is also designed to perform some basic processing such as the heart rate and fatal failure detection. At the second stage, further processing is performed by a local server using the raw data transmitted by the mobile device continuously. The raw data is also stored at this server. The processed data as well as the analysis results are then transmitted to the service provider center for diagnostic reviews as well as storage. The main advantages of the proposed framework are (1) the ability to detect signals wirelessly within a body sensor network (BSN), (2) low-power and reliable data transmission through ZigBee network nodes, (3) secure transmission of medical data over BSN, (4) efficient channel allocation for medical data transmission over wireless networks, and (5) optimized analysis of data using an adaptive architecture that maximizes the utility of processing and computational capacity at each platform. PMID:18497866
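To illustrate the kind of first-tier, on-body processing mentioned above (heart rate estimation plus a simple alarm), here is a schematic sketch; the thresholds and data layout are invented for illustration and are not from the paper.

    # Schematic sketch of first-tier (on-body) processing: estimate heart rate from
    # beat timestamps and raise a flag on out-of-range values. Thresholds are illustrative only.
    from statistics import mean

    def heart_rate_bpm(beat_times_s: list[float]) -> float:
        """Estimate heart rate from consecutive beat timestamps (seconds)."""
        intervals = [t2 - t1 for t1, t2 in zip(beat_times_s, beat_times_s[1:])]
        return 60.0 / mean(intervals)

    def check_alarm(bpm: float, low: float = 40.0, high: float = 180.0) -> bool:
        """Return True if the rate should trigger an alert to the local server."""
        return bpm < low or bpm > high

    beats = [0.00, 0.82, 1.63, 2.47, 3.30]     # simulated beat times in seconds
    bpm = heart_rate_bpm(beats)
    print(f"estimated rate: {bpm:.0f} bpm, alarm: {check_alarm(bpm)}")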
Technology Review of Multi-Agent Systems and Tools
2005-06-01
over a network, including the Internet. A web services architecture is the logical evolution of object-oriented analysis and design coupled with...the logical evolution of components geared towards the architecture, design, implementation, and deployment of e-business solutions. As in object...querying. The Web Services architecture describes the principles behind the next generation of e-business architectures, presenting a logical evolution
ERIC Educational Resources Information Center
Uwakonye, Obioha; Alagbe, Oluwole; Oluwatayo, Adedapo; Alagbe, Taiye; Alalade, Gbenga
2015-01-01
As a result of globalization of digital technology, intellectual discourse on what constitutes the basic body of architectural knowledge to be imparted to future professionals has been on the increase. This digital revolution has brought to the fore the need to review the already overloaded architectural education curriculum of Nigerian schools of…
Advances in Digital Calibration Techniques Enabling Real-Time Beamforming SweepSAR Architectures
NASA Technical Reports Server (NTRS)
Hoffman, James P.; Perkovic, Dragana; Ghaemi, Hirad; Horst, Stephen; Shaffer, Scott; Veilleux, Louise
2013-01-01
Real-time digital beamforming, combined with lightweight, large-aperture reflectors, enables SweepSAR architectures, which promise significant increases in instrument capability for solid earth and biomass remote sensing. These new instrument concepts require new methods for calibrating the multiple channels, which are combined on-board, in real time. The benefit of this effort is that it enables a new class of lightweight radar architecture, Digital Beamforming with SweepSAR, providing significantly larger swath coverage than conventional SAR architectures for reduced mass and cost. This paper will review the on-going development of the digital calibration architecture for a digital beamforming radar instrument, such as the proposed Earth Radar Mission's DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice) instrument. This proposed instrument's baseline design employs SweepSAR digital beamforming and requires digital calibration. We will review the overall concepts and status of the system architecture, algorithm development, and the digital calibration testbed currently being developed. We will present results from a preliminary hardware demonstration. We will also discuss the challenges and opportunities specific to this novel architecture.
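As a numerical illustration of the digital beamforming operation itself (not of the DESDynI calibration architecture), the sketch below steers a uniform linear array by applying per-channel phase weights and summing, which is the core per-channel operation that on-board digital calibration must keep consistent across channels. The geometry and signal values are invented for the example.

    # Minimal narrowband phase-shift-and-sum beamformer for a uniform linear array.
    import numpy as np

    c = 3e8                      # speed of light, m/s
    f = 1.26e9                   # L-band carrier, Hz
    lam = c / f
    n_elems = 8
    d = lam / 2                  # half-wavelength element spacing
    steer_deg = 20.0             # desired look direction

    # Simulated snapshot: a unit plane wave arriving from the steering direction.
    elem_pos = np.arange(n_elems) * d
    arrival = np.exp(1j * 2 * np.pi * elem_pos * np.sin(np.deg2rad(steer_deg)) / lam)

    # Beamforming weights: conjugate phase ramp toward the steering direction.
    weights = np.exp(-1j * 2 * np.pi * elem_pos * np.sin(np.deg2rad(steer_deg)) / lam) / n_elems

    output = np.sum(weights * arrival)          # weighted coherent sum across channels
    print(f"beamformer gain toward {steer_deg} deg: {abs(output):.2f}")  # ~1.0 (coherent sum)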
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... to Support ATC Winds SC-214 Briefing TOR Changes Other business Sub-Groups meetings Sep 24-26... MET Delivery Architecture Recommendations review Sep 27, Friday, Closing Plenary Sub-Groups reports Approval for AIS and MET Delivery Architecture Recommendations document to enter FRAC Action item review...
75 FR 68806 - Statement of Organization, Functions and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-09
... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...
Quality Attributes for Mission Flight Software: A Reference for Architects
NASA Technical Reports Server (NTRS)
Wilmot, Jonathan; Fesq, Lorraine; Dvorak, Dan
2016-01-01
In the international standards for architecture descriptions in systems and software engineering (ISO/IEC/IEEE 42010), "concern" is a primary concept that often manifests itself in relation to the quality attributes or "ilities" that a system is expected to exhibit - qualities such as reliability, security and modifiability. One of the main uses of an architecture description is to serve as a basis for analyzing how well the architecture achieves its quality attributes, and that requires architects to be as precise as possible about what they mean in claiming, for example, that an architecture supports "modifiability." This paper describes a table, generated by NASA's Software Architecture Review Board, which lists fourteen key quality attributes, identifies different important aspects of each quality attribute and considers each aspect in terms of requirements, rationale, evidence, and tactics to achieve the aspect. This quality attribute table is intended to serve as a guide to software architects, software developers, and software architecture reviewers in the domain of mission-critical real-time embedded systems, such as space mission flight software.
Software synthesis using generic architectures
NASA Technical Reports Server (NTRS)
Bhansali, Sanjay
1993-01-01
A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture, as opposed to completely automating the design of systems. Our approach is illustrated using an implemented example of a generic tracking architecture that was customized in two different domains. We describe how the designs produced using KASE compare to the original designs of the two systems, as well as current work and plans for extending KASE to other application areas.
NASA Astrophysics Data System (ADS)
Ghosh, Sreya
2017-02-01
This article proposes a new six-model architecture for an intelligent tutoring system to be incorporated in a learning management system, featuring domain independence and individualized dissemination. The present six-model architecture aims to simulate a human tutor. Some recent extensions of intelligent tutoring systems (ITSs) explore learning management systems that behave like a real teacher during the teaching-learning process, mainly by providing a dynamic response system. However, the present paper argues that to mimic a human teacher a system needs not only dynamic responses but also the incorporation of the teacher's dynamic review of students' performance and tracking of their current level of understanding. Here, the term individualization refers to tailoring content and its dissemination to the individual needs and capabilities of learners who take a course online and are taught in absentia. This paper describes how the individual models of the proposed architecture achieve the features of an ITS.
Materials science and architecture
NASA Astrophysics Data System (ADS)
Bechthold, Martin; Weaver, James C.
2017-12-01
Materiality — the use of various materials in architecture — has been fundamental to the design and construction of buildings, and materials science has traditionally responded to needs formulated by design, engineering and construction professionals. Material properties and processes are shaping buildings and influencing how they perform. The advent of technologies such as digital fabrication, robotics and 3D printing have not only accelerated the development of new construction solutions, but have also led to a renewed interest in materials as a catalyst for novel architectural design. In parallel, materials science has transformed from a field that explains materials to one that designs materials from the bottom up. The conflation of these two trends is giving rise to materials-based design research in which architects, engineers and materials scientists work as partners in the conception of new materials systems and their applications. This Review surveys this development for different material classes (wood, ceramics, metals, concrete, glass, synthetic composites and polymers), with an emphasis on recent trends and innovations.
Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.
LaBerge, Jeanne M; Andriole, Katherine P
2003-12-01
This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.
Evaluation of an Atmosphere Revitalization Subsystem for Deep Space Exploration Missions
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Abney, Morgan B.; Conrad, Ruth E.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Knox, James C.; Newton, Robert L.; Parrish, Keith J.; Takada, Kevin C.;
2015-01-01
An Atmosphere Revitalization Subsystem (ARS) suitable for deployment aboard deep space exploration mission vehicles has been developed and functionally demonstrated. This modified ARS process design architecture was derived from the International Space Station's (ISS) basic ARS. Primary functions considered in the architecture include trace contaminant control, carbon dioxide removal, carbon dioxide reduction, and oxygen generation. Candidate environmental monitoring instruments were also evaluated. The process architecture rearranges unit operations and employs equipment operational changes to reduce mass, simplify, and improve the functional performance for trace contaminant control, carbon dioxide removal, and oxygen generation. Results from integrated functional demonstration are summarized and compared to the performance observed during previous testing conducted on an ISS-like subsystem architecture and a similarly evolved process architecture. Considerations for further subsystem architecture and process technology development are discussed.
Selecting a Benchmark Suite to Profile High-Performance Computing (HPC) Machines
2014-11-01
architectures. Machines now contain central processing units (CPUs), graphics processing units (GPUs), and many integrated core (MIC) architectures all...evaluate the feasibility and applicability of a new architecture just released to the market. Researchers are often unsure how available resources will...architectures. Having a suite of programs running on different architectures, such as GPUs, MICs, and CPUs, adds complexity and technical challenges
Mountain building processes during continent continent collision in the Uralides
NASA Astrophysics Data System (ADS)
Brown, D.; Juhlin, C.; Ayala, C.; Tryggvason, A.; Bea, F.; Alvarez-Marron, J.; Carbonell, R.; Seward, D.; Glasmacher, U.; Puchkov, V.; Perez-Estaun, A.
2008-08-01
Since the early 1990s the Paleozoic Uralide Orogen of Russia has been the target of a significant research initiative as part of EUROPROBE and GEODE, both European Science Foundation programmes. One of the main objectives of these research programmes was the determination of the tectonic processes that went into the formation of the orogen. In this review paper we focus on the Late Paleozoic continent-continent collision that took place between Laurussia and Kazakhstania. Research in the Uralides was concentrated around two deep seismic profiles crossing the orogen. These were accompanied by geological, geophysical, geochronological, geochemical, and low-temperature thermochronological studies. The seismic profiles demonstrate that the Uralides has an overall bivergent structural architecture, but with significantly different reflectivity characteristics from one tectonic zone to another. The integration of other types of data sets with the seismic data allows us to interpret what tectonic processes were responsible for the formation of the structural architecture, and when they were active. On the basis of these data, we suggest that the changes in the crustal-scale structural architecture indicate that there was significant partitioning of tectonothermal conditions and deformation from zone to zone across major fault systems, and between the lower and upper crust. Also, a number of the structural features revealed in the bivergent architecture of the orogen formed either in the Neoproterozoic or in the Paleozoic, prior to continent-continent collision. From the end of continent-continent collision to the present, low-temperature thermochronology suggests that the evolution of the Uralides has been dominated by erosion and slow exhumation. Despite some evidence for more recent topographic uplift, it has so far proven difficult to quantify.
Mark 4A antenna control system data handling architecture study
NASA Technical Reports Server (NTRS)
Briggs, H. C.; Eldred, D. B.
1991-01-01
A high-level review was conducted to provide an analysis of the existing architecture used to handle data and implement control algorithms for NASA's Deep Space Network (DSN) antennas and to make system-level recommendations for improving this architecture so that the DSN antennas can support the ever-tightening requirements of the next decade and beyond. It was found that the existing system is seriously overloaded, with processor utilization approaching 100 percent. A number of factors contribute to this overloading, including dated hardware, inefficient software, and a message-passing strategy that depends on serial connections between machines. At the same time, the system has shortcomings and idiosyncrasies that require extensive human intervention. A custom operating system kernel and an obscure programming language exacerbate the problems and should be modernized. A new architecture is presented that addresses these and other issues. Key features of the new architecture include a simplified message passing hierarchy that utilizes a high-speed local area network, redesign of particular processing function algorithms, consolidation of functions, and implementation of the architecture in modern hardware and software using mainstream computer languages and operating systems. The system would also allow incremental hardware improvements as better and faster hardware for such systems becomes available, and costs could potentially be low enough that redundancy would be provided economically. Such a system could support DSN requirements for the foreseeable future, though thorough consideration must be given to hard computational requirements, porting existing software functionality to the new system, and issues of fault tolerance and recovery.
Taking advantage of ground data systems attributes to achieve quality results in testing software
NASA Technical Reports Server (NTRS)
Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.
1994-01-01
During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.
Signori, Marcos R; Garcia, Renato
2010-01-01
This paper presents a model that aids Clinical Engineering in dealing with Risk Management in the Healthcare Technological Process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework, MODAF (Ministry of Defence Architecture Framework), was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve the Clinical Engineering decision-making process for Risk Management in the Healthcare Technological process.
Avionics System Architecture for NASA Orion Vehicle
NASA Technical Reports Server (NTRS)
Baggerman, Clint
2010-01-01
This viewgraph presentation reviews the Orion Crew Exploration Vehicle avionics architecture. The contents include: 1) What is Orion?; 2) Orion Concept of Operations; 3) Orion Subsystems; 4) Orion Avionics Architecture; 5) Orion Avionics-Network; 6) Orion Network Unification; 7) Orion Avionics-Integrity; 8) Orion Avionics-Partitioning; and 9) Orion Avionics-Redundancy.
Electrochemical Biosensors - Sensor Principles and Architectures
Grieshaber, Dorothee; MacKenzie, Robert; Vörös, Janos; Reimhult, Erik
2008-01-01
Quantification of biological or biochemical processes are of utmost importance for medical, biological and biotechnological applications. However, converting the biological information to an easily processed electronic signal is challenging due to the complexity of connecting an electronic device directly to a biological environment. Electrochemical biosensors provide an attractive means to analyze the content of a biological sample due to the direct conversion of a biological event to an electronic signal. Over the past decades several sensing concepts and related devices have been developed. In this review, the most common traditional techniques, such as cyclic voltammetry, chronoamperometry, chronopotentiometry, impedance spectroscopy, and various field-effect transistor based methods are presented along with selected promising novel approaches, such as nanowire or magnetic nanoparticle-based biosensing. Additional measurement techniques, which have been shown useful in combination with electrochemical detection, are also summarized, such as the electrochemical versions of surface plasmon resonance, optical waveguide lightmode spectroscopy, ellipsometry, quartz crystal microbalance, and scanning probe microscopy. The signal transduction and the general performance of electrochemical sensors are often determined by the surface architectures that connect the sensing element to the biological sample at the nanometer scale. The most common surface modification techniques, the various electrochemical transduction mechanisms, and the choice of the recognition receptor molecules all influence the ultimate sensitivity of the sensor. New nanotechnology-based approaches, such as the use of engineered ion-channels in lipid bilayers, the encapsulation of enzymes into vesicles, polymersomes, or polyelectrolyte capsules provide additional possibilities for signal amplification. In particular, this review highlights the importance of the precise control over the delicate interplay between surface nano-architectures, surface functionalization and the chosen sensor transducer principle, as well as the usefulness of complementary characterization tools to interpret and to optimize the sensor response. PMID:27879772
A convergent model for distributed processing of Big Sensor Data in urban engineering networks
NASA Astrophysics Data System (ADS)
Parygin, D. S.; Finogeev, A. G.; Kamaev, V. A.; Finogeev, A. A.; Gnedkova, E. P.; Tyukov, A. P.
2017-01-01
The development and study of a convergent model combining grid, cloud, fog and mobile computing for analytical Big Sensor Data processing are reviewed. The model is intended for building monitoring systems for spatially distributed objects and processes in urban engineering networks. The proposed approach is a convergence model for organizing distributed data processing. The fog computing model is used to process and aggregate sensor data at network nodes and/or industrial controllers, where program agents are deployed to perform the primary processing and aggregation tasks. The grid and cloud computing models are used to mine and accumulate integral indicators. The computing cluster has a three-tier architecture: a main server at the first level, a cluster of SCADA system servers at the second level, and a set of GPU cards supporting the Compute Unified Device Architecture at the third level. The mobile computing model is applied to visualize the results of the analysis using augmented reality and geo-information technologies. The integrated indicators are transferred to the data center and accumulated in a multidimensional store for data mining and knowledge discovery.
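The division of labour described above, with fog nodes aggregating raw sensor data and only integral indicators being forwarded to the grid/cloud tier, can be illustrated with a minimal sketch. The sensor names and indicator set below are hypothetical, and the snippet is only a schematic of the aggregation step, not the authors' implementation.

```python
# Minimal sketch of fog-style pre-aggregation: raw sensor readings are reduced
# to integral indicators at the network node, and only those are sent upstream.
from collections import defaultdict
from statistics import mean

def aggregate_window(readings):
    """Collapse a window of raw (sensor_id, value) readings into per-sensor indicators."""
    by_sensor = defaultdict(list)
    for sensor_id, value in readings:
        by_sensor[sensor_id].append(value)
    return {
        sensor_id: {"mean": mean(vals), "min": min(vals), "max": max(vals), "n": len(vals)}
        for sensor_id, vals in by_sensor.items()
    }

# Hypothetical window of raw readings from two pressure sensors in a heat network.
window = [("pressure-07", 5.1), ("pressure-07", 5.3), ("pressure-12", 4.8), ("pressure-07", 5.2)]
indicators = aggregate_window(window)
print(indicators)  # only this compact summary would be forwarded to the cloud/grid tier
```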
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are directed at applying new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Architecture for Survivable System Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1991-01-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP) and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are directed at applying new technology to practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, the program provides for regular interactions with standardization working groups, e.g., the International Standards Organization (ISO), American National Standards Institute (ANSI), Society of Automotive Engineers (SAE), and Institute of Electrical and Electronics Engineers (IEEE). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
DFT algorithms for bit-serial GaAs array processor architectures
NASA Technical Reports Server (NTRS)
Mcmillan, Gary B.
1988-01-01
Systems and Processes Engineering Corporation (SPEC) has developed an innovative array processor architecture for computing Fourier transforms and other commonly used signal processing algorithms. This architecture is designed to extract the highest possible array performance from state-of-the-art GaAs technology. SPEC's architectural design includes a high performance RISC processor implemented in GaAs, along with a Floating Point Coprocessor and a unique Array Communications Coprocessor, also implemented in GaAs technology. Together, these data processors represent the latest in technology, both from an architectural and implementation viewpoint. SPEC has examined numerous algorithms and parallel processing architectures to determine the optimum array processor architecture. SPEC has developed an array processor architecture with integral communications ability to provide maximum node connectivity. The Array Communications Coprocessor embeds communications operations directly in the core of the processor architecture. A Floating Point Coprocessor architecture has been defined that utilizes Bit-Serial arithmetic units, operating at very high frequency, to perform floating point operations. These Bit-Serial devices reduce the device integration level and complexity to a level compatible with state-of-the-art GaAs device technology.
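For context on the computation being mapped onto the array, the snippet below gives a compact reference DFT written as a matrix-vector product, the kind of ground truth one might compare a parallel or bit-serial implementation against. It is not SPEC's GaAs design.

```python
# Reference DFT as a matrix-vector product, X[k] = sum_n x[n] * exp(-2j*pi*k*n/N),
# useful as a ground truth when validating a parallel or bit-serial implementation.
import numpy as np

def dft(x):
    n = np.arange(len(x))
    W = np.exp(-2j * np.pi * np.outer(n, n) / len(x))
    return W @ x

x = np.random.default_rng(3).standard_normal(16)
print(np.allclose(dft(x), np.fft.fft(x)))   # True: matches NumPy's FFT
```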
System Architecture for Anti-Ship Ballistic Missile Defense (ASBMD)
2009-12-01
... this threat. This thesis documents the process that was used to select and integrate the proposed ASBMD architecture.
DOT National Transportation Integrated Search
2002-04-01
The Physical Architecture identifies the physical subsystems and architecture flows between subsystems that will implement the processes and support the data flows of the ITS Logical Architecture. The Physical Architecture further identifies the sys...
The Rapid Response Radiation Survey (R3S) Mission Using the HISat Conformal Satellite Architecture
NASA Technical Reports Server (NTRS)
Miller, Nathanael
2015-01-01
The Rapid Response Radiation Survey (R3S) experiment, designed as a quick turnaround mission to make radiation measurements in LEO, will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HiSat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaDX), will validate exposure prediction capabilities of NAIRAS. This paper discusses the development of the R3S experiment as made possible by use of the HiSat architecture. The system design and operational modes of the experiment are described, as well as the experiment interfaces to the HiSat satellite via the user defined adapter (UDA) provided by NovaWurks. This paper outlines the steps taken by the project to execute the R3S mission in the 4 months of design, build, and test. Finally, description of the engineering process is provided, including the use of facilitated rapid/concurrent engineering sessions, the associated documentation, and the review process employed.
Multimedia architectures: from desktop systems to portable appliances
NASA Astrophysics Data System (ADS)
Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.
1997-01-01
Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.
Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.
2017-01-01
Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one’s fist at arm’s length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the experience of architecture, which can be tested through future experimentation. (298 words) PMID:28360867
Rooney, Kevin K; Condia, Robert J; Loschky, Lester C
2017-01-01
Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the experience of architecture, which can be tested through future experimentation. (298 words).
NASA Astrophysics Data System (ADS)
Źróbek-Różańska, Alina; Zysk, Elżbieta; Źróbek, Sabina
2017-10-01
Poland has a turbulent and rich history. Partitions, wars, a centrally planned economy of the socialist era and the rapid transition to a market economy left visible marks on the Polish landscape. The changes that took place in the 20th century and the early 21st century have vastly influenced the country’s architecture. Residential buildings in rural suburbs bear witness to turbulent historical events and change processes. This study analyzed residential buildings in two villages situated in the historical district of Warmia (north-eastern Poland) which is now a part of the Region of Warmia and Mazury. The results of the observations were used to review the social, economic, legal and planning factors that influenced residential architecture between 1900 and 2017. The traditional layout of Warmian villages is well preserved in the analyzed locations where pre-war architectural design mingles with buildings erected in the socialist era when construction materials were scarce. Many buildings in the surveyed villages are reminiscent of collective farms, the prescribed architectural style of the 1970s as well as the stylistic diversity of the early transformation period when customized building plans and construction materials became available. The local landscape also features buildings erected in successive decades which brought a significant increase in the price of land and maintenance costs.
Hybrid battery/supercapacitor energy storage system for the electric vehicles
NASA Astrophysics Data System (ADS)
Kouchachvili, Lia; Yaïci, Wahiba; Entchev, Evgueniy
2018-01-01
Electric vehicles (EVs) have recently attracted considerable attention, as has the development of battery technologies. Although battery technology has advanced significantly, the available batteries do not entirely meet the energy demands of EV power consumption. One of the key issues is the non-monotonic consumption of energy, with frequent load changes during the battery discharging process, which is very harmful to the battery's electrochemical process. A practical solution is to couple the battery with a supercapacitor, which is basically an electrochemical cell with a similar architecture but a higher rate capability and better cyclability. In this design, the supercapacitor can provide the excess energy required when the battery cannot do so. Beyond the battery and supercapacitor as individual units, designing the architecture of the corresponding hybrid system from an electrical engineering point of view is of utmost importance. The present manuscript reviews recent work devoted to the application of various battery/supercapacitor hybrid systems in EVs.
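One common way to realize the battery/supercapacitor split described above is a low-pass power-split rule: the battery follows the slowly varying part of the demand while the supercapacitor absorbs the fast transients. The sketch below illustrates that idea on a made-up demand profile; it is not a controller from the reviewed work, and the filter constant and load numbers are arbitrary.

```python
# Toy power-split: the battery supplies the low-frequency component of the demand,
# the supercapacitor supplies the residual (fast) component.
import numpy as np

def split_power(demand, alpha=0.1):
    """Exponential low-pass filter; alpha in (0, 1] sets how much load the battery follows."""
    battery = np.zeros_like(demand)
    battery[0] = demand[0]
    for k in range(1, len(demand)):
        battery[k] = battery[k - 1] + alpha * (demand[k] - battery[k - 1])
    supercap = demand - battery
    return battery, supercap

t = np.arange(0.0, 60.0, 0.1)                    # 60 s drive segment, 0.1 s steps
demand = 20.0 + 15.0 * (np.sin(0.5 * t) > 0.9)   # base load plus abrupt acceleration pulses, in kW
battery, supercap = split_power(demand)
print(f"battery peak: {battery.max():.1f} kW, supercapacitor peak: {supercap.max():.1f} kW")
```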
Fontelo, P.; Rossi, E.; Ackerman, MJ
2015-01-01
Background: Mobile health applications (mHealth Apps) are opening the way to patients' responsible and active involvement with their own healthcare management. However, apart from Apps allowing patients access to their electronic health records (EHRs), mHealth Apps are currently developed as dedicated "island systems". Objective: Although much work has been done on patient access to EHRs, transfer of information from mHealth Apps to EHR systems is still low. This study proposes a standards-based architecture that can be adopted by mHealth Apps to exchange information with EHRs to support better quality of care. Methods: Following the recently proposed definition of requirements for the EHR/mHealth App information exchange, and after reviewing current standards, we designed the architecture for EHR/mHealth App integration. Then, as a case study, we modeled a system based on the proposed architecture aimed at supporting home monitoring for congestive heart failure patients. We simulated this process using, on the EHR side, OpenMRS, an open source longitudinal EHR, and, on the mHealth App side, the iOS platform. Results: The integration architecture was based on the bi-directional exchange of standard documents (clinical document architecture rel2 - CDA2). In the process, the clinician "prescribes" the home monitoring procedures by creating a CDA2 prescription in the EHR that is sent, encrypted and de-identified, to the mHealth App to create the monitoring calendar. At the scheduled time, the App alerts the patient to start the monitoring. After the measurements are done, the App generates a structured CDA2-compliant monitoring report and sends it to the EHR, thus avoiding local storage. Conclusions: The proposed architecture, even if validated only in a simulation environment, represents a step forward in the integration of personal mHealth Apps into the larger health-IT ecosystem, allowing bi-directional data exchange between patients and healthcare professionals and supporting the patient's engagement in self-management and self-care. PMID:26448794
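To make the report-generation step concrete, here is a deliberately simplified sketch of how an App-side component could assemble a CDA-like XML monitoring report and hand it to an EHR endpoint. The element names, measurement fields and endpoint URL are hypothetical placeholders, not the CDA2 templates or interfaces used in the study.

```python
# Simplified sketch: build a CDA-style XML monitoring report and POST it to an
# EHR endpoint. Element names and the endpoint are hypothetical placeholders.
import urllib.request
import xml.etree.ElementTree as ET

def build_monitoring_report(patient_id, measurements):
    doc = ET.Element("ClinicalDocument")                  # stand-in for a full CDA R2 header
    ET.SubElement(doc, "patient", id=patient_id)
    body = ET.SubElement(doc, "observations")
    for code, value, unit in measurements:
        ET.SubElement(body, "observation", code=code, value=str(value), unit=unit)
    return ET.tostring(doc, encoding="utf-8", xml_declaration=True)

def send_report(xml_bytes, endpoint="https://ehr.example.org/api/cda-inbox"):  # hypothetical URL
    req = urllib.request.Request(endpoint, data=xml_bytes,
                                 headers={"Content-Type": "application/xml"}, method="POST")
    with urllib.request.urlopen(req) as resp:             # requires a real, reachable endpoint
        return resp.status

report = build_monitoring_report("pt-042", [("body-weight", 81.4, "kg"), ("heart-rate", 72, "bpm")])
print(report.decode("utf-8"))
# send_report(report)  # left commented out: the endpoint above is only illustrative
```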
NOAO observing proposal processing system
NASA Astrophysics Data System (ADS)
Bell, David J.; Gasson, David; Hartman, Mia
2002-12-01
Since going electronic in 1994, NOAO has continued to refine and enhance its observing proposal handling system. Virtually all related processes are now handled electronically. Members of the astronomical community can submit proposals through email, web form or via Gemini's downloadable Phase-I Tool. NOAO staff can use online interfaces for administrative tasks, technical reviews, telescope scheduling, and compilation of various statistics. In addition, all information relevant to the TAC process is made available online. The system, now known as ANDES, is designed as a thin-client architecture (web pages are now used for almost all database functions) built using open source tools (FreeBSD, Apache, MySQL, Perl, PHP) to process descriptively-marked (LaTeX, XML) proposal documents.
Using Real and Simulated TNOs to Constrain the Outer Solar System
NASA Astrophysics Data System (ADS)
Kaib, Nathan
2018-04-01
Over the past 2-3 decades our understanding of the outer solar system’s history and current state has evolved dramatically. An explosion in the number of detected trans-Neptunian objects (TNOs) coupled with simultaneous advances in numerical models of orbital dynamics has driven this rapid evolution. However, successfully constraining the orbital architecture and evolution of the outer solar system requires accurately comparing simulation results with observational datasets. This process is challenging because observed datasets are influenced by orbital discovery biases as well as TNO size and albedo distributions. Meanwhile, such influences are generally absent from numerical results. Here I will review recent work I and others have undertaken using numerical simulations in concert with catalogs of observed TNOs to constrain the outer solar system’s current orbital architecture and past evolution.
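The bias problem described above is commonly handled by passing simulated objects through a survey detection model before comparing them with the observed catalog. The toy sketch below applies a hypothetical magnitude-dependent detection-efficiency curve to a synthetic TNO population; the functional form and numbers are purely illustrative.

```python
# Toy forward-biasing of a simulated TNO population: keep each synthetic object
# with a probability given by a hypothetical detection-efficiency curve, so the
# surviving sample can be compared like-for-like with an observed catalog.
import numpy as np

rng = np.random.default_rng(42)

def detection_efficiency(mag, mag_50=24.5, width=0.4):
    """Illustrative roll-over curve: ~1 for bright objects, 0.5 at mag_50."""
    return 1.0 / (1.0 + np.exp((mag - mag_50) / width))

# Hypothetical simulated population: semimajor axis (au) and apparent magnitude.
a_sim = rng.uniform(38.0, 48.0, size=10_000)
mag_sim = rng.uniform(22.0, 27.0, size=10_000)

detected = rng.random(a_sim.size) < detection_efficiency(mag_sim)
print(f"{detected.sum()} of {a_sim.size} synthetic objects survive the toy survey filter")
print(f"mean a: intrinsic {a_sim.mean():.2f} au vs 'observable' {a_sim[detected].mean():.2f} au")
```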
Overview of implementation of DARPA GPU program in SAIC
NASA Astrophysics Data System (ADS)
Braunreiter, Dennis; Furtek, Jeremy; Chen, Hai-Wen; Healy, Dennis
2008-04-01
This paper reviews the implementation of DARPA MTO STAP-BOY program for both Phase I and II conducted at Science Applications International Corporation (SAIC). The STAP-BOY program conducts fast covariance factorization and tuning techniques for space-time adaptive process (STAP) Algorithm Implementation on Graphics Processor unit (GPU) Architectures for Embedded Systems. The first part of our presentation on the DARPA STAP-BOY program will focus on GPU implementation and algorithm innovations for a prototype radar STAP algorithm. The STAP algorithm will be implemented on the GPU, using stream programming (from companies such as PeakStream, ATI Technologies' CTM, and NVIDIA) and traditional graphics APIs. This algorithm will include fast range adaptive STAP weight updates and beamforming applications, each of which has been modified to exploit the parallel nature of graphics architectures.
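As a rough illustration of the numerical core being accelerated, adaptive weight computation from an estimated space-time covariance, the sketch below forms sample-matrix-inversion STAP weights with a Cholesky solve in NumPy. The dimensions and data are synthetic, and this is not SAIC's GPU implementation.

```python
# Sample-matrix-inversion STAP weights: w = R^-1 s / (s^H R^-1 s), with R estimated
# from training snapshots and solved via Cholesky factorization.
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_train = 32, 128                      # space-time degrees of freedom, training snapshots

# Synthetic interference-plus-noise training data and an arbitrary steering vector.
x = (rng.standard_normal((n_dof, n_train)) + 1j * rng.standard_normal((n_dof, n_train))) / np.sqrt(2)
s = np.exp(1j * np.pi * np.arange(n_dof) * 0.3)

R = x @ x.conj().T / n_train + 1e-3 * np.eye(n_dof)   # diagonally loaded covariance estimate
L = np.linalg.cholesky(R)
Rinv_s = np.linalg.solve(L.conj().T, np.linalg.solve(L, s))
w = Rinv_s / (s.conj() @ Rinv_s)                      # unit-gain constraint in the look direction

print(f"weight vector norm: {np.linalg.norm(w):.3f}, gain toward s: {abs(w.conj() @ s):.3f}")
```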
NASA Technical Reports Server (NTRS)
Arnold, Ray; Naderi, F. Michael
1988-01-01
The hardware requirements for multibeam operation and onboard data processing and switching on future communication satellites are reviewed. Topics addressed include multiple-beam antennas, frequency-addressable beams, baseband vs IF switching, FDM/TDMA systems, and bulk demodulators. The proposed use of these technologies in the NASA ACTS, Italsat, and the Japanese ETS-VI is discussed in detail and illustrated with extensive diagrams, maps, drawings, and tables of projected performance data.
2004-09-16
Report excerpt. Publications in non-peer-reviewed journals include: 1. Gross, SM, Hamilton JL. "Polymer Gels for Use in Lithium Polymer Batteries", Nebraska Academy of Science... The work describes a process for the anionic polymerization of styrene and methyl methacrylate in the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate... Current polymer electrolyte composites used for these applications typically comprise polyethers with ethylene carbonate solvents containing lithium...
Influence of school architecture and design on healthy eating: a review of the evidence.
Frerichs, Leah; Brittin, Jeri; Sorensen, Dina; Trowbridge, Matthew J; Yaroch, Amy L; Siahpush, Mohammad; Tibbits, Melissa; Huang, Terry T-K
2015-04-01
We examined evidence regarding the influence of school physical environment on healthy-eating outcomes. We applied a systems perspective to examine multiple disciplines' theoretical frameworks and used a mixed-methods systematic narrative review method, considering both qualitative and quantitative sources (published through March 2014) for inclusion. We developed a causal loop diagram from 102 sources identified. We found evidence of the influence of many aspects of a school's physical environment on healthy-eating outcomes. The causal loop diagram highlights multilevel and interrelated factors and elucidates the specific roles of design and architecture in encouraging healthy eating within schools. Our review highlighted the gaps in current evidence and identified areas of research needed to refine and expand school architecture and design strategies for addressing healthy eating.
Influence of School Architecture and Design on Healthy Eating: A Review of the Evidence
Brittin, Jeri; Sorensen, Dina; Trowbridge, Matthew J.; Yaroch, Amy L.; Siahpush, Mohammad; Tibbits, Melissa; Huang, Terry T.-K.
2015-01-01
We examined evidence regarding the influence of school physical environment on healthy-eating outcomes. We applied a systems perspective to examine multiple disciplines’ theoretical frameworks and used a mixed-methods systematic narrative review method, considering both qualitative and quantitative sources (published through March 2014) for inclusion. We developed a causal loop diagram from 102 sources identified. We found evidence of the influence of many aspects of a school’s physical environment on healthy-eating outcomes. The causal loop diagram highlights multilevel and interrelated factors and elucidates the specific roles of design and architecture in encouraging healthy eating within schools. Our review highlighted the gaps in current evidence and identified areas of research needed to refine and expand school architecture and design strategies for addressing healthy eating. PMID:25713964
NASA Astrophysics Data System (ADS)
Prawata, Albertus Galih
2017-11-01
The architectural design stages in architectural practices and design studios involve many aspects. One of them occurs during the early phases of the design process, when architects or designers try to interpret the project brief into a design concept. This paper reports on the use of digital tools in the early design process at an architectural practice in Jakarta. It focuses principally on the use of BIM and digital modeling to generate information and transform it into conceptual forms, which is not very common in Indonesian architectural practice. Traditionally, the project brief is transformed into conceptual forms using sketches, drawings, and physical models. The new digitally supported method shows that the same can be done during the initial stage of the design process to create early architectural design forms. Architects' traditional tools and methods are beginning to be replaced effectively by digital tools, which opens greater opportunities for innovation.
Business process architectures: overview, comparison and framework
NASA Astrophysics Data System (ADS)
Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.
2016-02-01
With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.
Functional Detachment of Totalitarian Nazi Architecture
NASA Astrophysics Data System (ADS)
Antoszczyszyn, Marek
2017-10-01
The paper describes the systematization of architectural styles in use during the Nazi period in Germany between 1933 and 1945. The research reveals a regular and strict relationship between function and styling. Using comparison, case studies and analytical methods, characteristic features of the architectural appearance of more than 500 buildings were identified, which helped to specify their styling and group them into architectural trends. Ultimately, the paper shows that the identified stylistic trends can be organized by a key of functional detachment. This observation explains why Nazi German architecture remains easy to recognize even today. Given today's pluralism in architecture, the findings could serve as a helpful key for organizing the process of spatial architectural identification.
Deep learning in bioinformatics.
Min, Seonwoo; Lee, Byunghan; Yoon, Sungroh
2017-09-01
In the era of big data, transformation of biomedical big data into valuable knowledge has been one of the most important challenges in bioinformatics. Deep learning has advanced rapidly since the early 2000s and now demonstrates state-of-the-art performance in various fields. Accordingly, application of deep learning in bioinformatics to gain insight from data has been emphasized in both academia and industry. Here, we review deep learning in bioinformatics, presenting examples of current research. To provide a useful and comprehensive perspective, we categorize research both by the bioinformatics domain (i.e. omics, biomedical imaging, biomedical signal processing) and deep learning architecture (i.e. deep neural networks, convolutional neural networks, recurrent neural networks, emergent architectures) and present brief descriptions of each study. Additionally, we discuss theoretical and practical issues of deep learning in bioinformatics and suggest future research directions. We believe that this review will provide valuable insights and serve as a starting point for researchers to apply deep learning approaches in their bioinformatics studies. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
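As a minimal illustration of one of the architecture families the review categorizes, convolutional networks applied to omics-style data, the sketch below defines a tiny 1D CNN over one-hot-encoded DNA sequences in PyTorch. The layer sizes and the binary-label task are arbitrary choices for illustration, not drawn from the reviewed studies.

```python
# Tiny 1D CNN over one-hot DNA sequences (4 channels: A, C, G, T) for a binary label.
import torch
import torch.nn as nn

class TinySeqCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(4, 16, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.AdaptiveMaxPool1d(1),
        )
        self.classifier = nn.Linear(32, 1)

    def forward(self, x):                       # x: (batch, 4, seq_len)
        return self.classifier(self.features(x).squeeze(-1))

model = TinySeqCNN()
x = torch.randn(8, 4, 100)                      # stand-in for 8 one-hot-encoded sequences
logits = model(x)
print(logits.shape)                             # torch.Size([8, 1])
```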
A review of experimental techniques to produce a nacre-like structure.
Corni, I; Harvey, T J; Wharton, J A; Stokes, K R; Walsh, F C; Wood, R J K
2012-09-01
The performance of man-made materials can be improved by exploring new structures inspired by the architecture of biological materials. Natural materials, such as nacre (mother-of-pearl), can have outstanding mechanical properties due to their complicated architecture and hierarchical structure at the nano-, micro- and meso-levels which have evolved over millions of years. This review describes the numerous experimental methods explored to date to produce composites with structures and mechanical properties similar to those of natural nacre. The materials produced have sizes ranging from nanometres to centimetres, processing times varying from a few minutes to several months and a different range of mechanical properties that render them suitable for various applications. For the first time, these techniques have been divided into those producing bulk materials, coatings and free-standing films. This is due to the fact that the material's application strongly depends on its dimensions and different results have been reported by applying the same technique to produce materials with different sizes. The limitations and capabilities of these methodologies have been also described.
Grinvald, A
1992-01-01
Long standing questions related to brain mechanisms underlying perception can finally be resolved by direct visualization of the architecture and function of mammalian cortex. This advance has been accomplished with the aid of two optical imaging techniques with which one can literally see how the brain functions. The upbringing of this technology required a multi-disciplinary approach integrating brain research with organic chemistry, spectroscopy, biophysics, computer sciences, optics and image processing. Beyond the technological ramifications, recent research shed new light on cortical mechanisms underlying sensory perception. Clinical applications of this technology for precise mapping of the cortical surface of patients during neurosurgery have begun. Below is a brief summary of our own research and a description of the technical specifications of the two optical imaging techniques. Like every technique, optical imaging also suffers from severe limitations. Here we mostly emphasize some of its advantages relative to all alternative imaging techniques currently in use. The limitations are critically discussed in our recent reviews. For a series of other reviews, see Cohen (1989).
2010-06-01
Emerging Neuromorphic Computing Architectures and Enabling... (report excerpt; dates covered April 2009 - January 2010). The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored...
Formal Foundations for the Specification of Software Architecture.
1995-03-01
Abstract excerpt: ...between these architecture theories were investigated. A feasibility analysis on an image processing application demonstrated that architecture theories...
Software Architecture Evaluation in Global Software Development Projects
NASA Astrophysics Data System (ADS)
Salger, Frank
Due to ever increasing system complexity, comprehensive methods for software architecture evaluation become more and more important. This is further stressed in global software development (GSD), where the software architecture acts as a central knowledge and coordination mechanism. However, existing methods for architecture evaluation do not take characteristics of GSD into account. In this paper we discuss which aspects are specific to architecture evaluations in GSD. Our experiences from GSD projects at Capgemini sd&m indicate that architecture evaluations differ in how rigorously one has to assess modularization, architecturally relevant processes, knowledge transfer and process alignment. From our project experiences, we derive nine good practices, compliance with which should be checked in architecture evaluations in GSD. As an example, we discuss how far the standard architecture evaluation method used at Capgemini sd&m already considers the GSD-specific good practices, and outline what extensions are necessary to achieve a comprehensive architecture evaluation framework for GSD.
The Effects of Architecture and Process on the Hardness of Programmable Technologies
NASA Technical Reports Server (NTRS)
Katz, Richard; Wang, J. J.; Reed, R.; Kleyner, I.; DOrdine, M.; McCollum, J,; Cronquist, B.; Howard, J.
1999-01-01
Architecture and process, combined, significantly affect the hardness of programmable technologies. The effects of high energy ions, ferroelectric memory architectures, and shallow trench isolation are investigated. A detailed single event latchup (SEL) study has been performed.
Electrical Power System Architectures for In-House NASA/GSFC Missions
NASA Technical Reports Server (NTRS)
Yun, Diane D.
2006-01-01
This PowerPoint presentation reviews the electrical power system (EPS) architectures used for several NASA GSFC missions, both current and planned. Included in the presentation are reviews of the electric power systems for the Space Technology 5 (ST5) mission, the Solar Dynamics Observatory (SDO) mission, and the Lunar Reconnaissance Orbiter (LRO). A slide compares the three missions' electrical supply systems.
Linking Neural and Symbolic Representation and Processing of Conceptual Structures
van der Velde, Frank; Forth, Jamie; Nazareth, Deniece S.; Wiggins, Geraint A.
2017-01-01
We compare and discuss representations in two cognitive architectures aimed at representing and processing complex conceptual (sentence-like) structures. First is the Neural Blackboard Architecture (NBA), which aims to account for representation and processing of complex and combinatorial conceptual structures in the brain. Second is IDyOT (Information Dynamics of Thinking), which derives sentence-like structures by learning statistical sequential regularities over a suitable corpus. Although IDyOT is designed at a level more abstract than the neural, so it is a model of cognitive function, rather than neural processing, there are strong similarities between the composite structures developed in IDyOT and the NBA. We hypothesize that these similarities form the basis of a combined architecture in which the individual strengths of each architecture are integrated. We outline and discuss the characteristics of this combined architecture, emphasizing the representation and processing of conceptual structures. PMID:28848460
Particulate Matter Filtration Design Considerations for Crewed Spacecraft Life Support Systems
NASA Technical Reports Server (NTRS)
Agui, Juan H.; Vijayakumar, R.; Perry, Jay L.
2016-01-01
Particulate matter filtration is a key component of crewed spacecraft cabin ventilation and life support system (LSS) architectures. The basic particulate matter filtration functional requirements as they relate to an exploration vehicle LSS architecture are presented. Particulate matter filtration concepts are reviewed and design considerations are discussed. A concept for a particulate matter filtration architecture suitable for exploration missions is presented. The conceptual architecture considers the results from developmental work and incorporates best practice design considerations.
NASA Astrophysics Data System (ADS)
Selznick, S. H.
2017-06-01
Herein we describe an architecture developed for processing engineering and science data for the OSIRIS-REx mission. The architecture is soup-to-nuts, starting with raw telemetry and ending with submission to PDS.
NASA Technical Reports Server (NTRS)
Kolar, Mike; Estefan, Jeff; Giovannoni, Brian; Barkley, Erik
2011-01-01
Topics covered (1) Why Governance and Why Now? (2) Characteristics of Architecture Governance (3) Strategic Elements (3a) Architectural Principles (3b) Architecture Board (3c) Architecture Compliance (4) Architecture Governance Infusion Process. Governance is concerned with decision making (i.e., setting directions, establishing standards and principles, and prioritizing investments). Architecture governance is the practice and orientation by which enterprise architectures and other architectures are managed and controlled at an enterprise-wide level
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... information to evaluate applicants' familiarity with the national preparedness architecture and identify how elements of this architecture have been incorporated into regional/state/local planning, operations, and...
Advanced information processing system for advanced launch system: Avionics architecture synthesis
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1991-01-01
The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.
Real-Time Cognitive Computing Architecture for Data Fusion in a Dynamic Environment
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Duong, Vu A.
2012-01-01
A novel cognitive computing architecture is conceptualized for processing multiple channels of multi-modal sensory data streams simultaneously, and fusing the information in real time to generate intelligent reaction sequences. This unique architecture is capable of assimilating parallel data streams that could be analog, digital, synchronous/asynchronous, and could be programmed to act as a knowledge synthesizer and/or an "intelligent perception" processor. In this architecture, the bio-inspired models of visual pathway and olfactory receptor processing are combined as processing components, to achieve the composite function of "searching for a source of food while avoiding the predator." The architecture is particularly suited for scene analysis from visual data and odorant.
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1992-01-01
The Architecture for Survivable Systems Processing (ASSP) program is a two phase program whose objective is the derivation, specification, development and validation of an open system architecture capable of supporting advanced processing needs of space, ground, and launch vehicle operations. The output of the first phase is a set of hardware and software standards and specifications defining this architecture at three levels. The second phase will validate these standards and develop the technology necessary to achieve strategic hardness, packaging density, throughput requirements, and interoperability/interchangeability.
2011-12-01
...systems engineering technical and technical management processes. Technical Planning, Stakeholder Requirements Development, and Architecture Design were... Stakeholder Requirements Definition, Architecture Design and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers... emphasize one process over another; however, Architecture Design and Implementation scored higher among the Technical Processes. Decision Analysis, Technical...
Complex Processes from Dynamical Architectures with Time-Scale Hierarchy
Perdikis, Dionysios; Huys, Raoul; Jirsa, Viktor
2011-01-01
The idea that complex motor, perceptual, and cognitive behaviors are composed of smaller units, which are somehow brought into a meaningful relation, permeates the biological and life sciences. However, no principled framework defining the constituent elementary processes has been developed to this date. Consequently, functional configurations (or architectures) relating elementary processes and external influences are mostly piecemeal formulations suitable to particular instances only. Here, we develop a general dynamical framework for distinct functional architectures characterized by the time-scale separation of their constituents and evaluate their efficiency. Thereto, we build on the (phase) flow of a system, which prescribes the temporal evolution of its state variables. The phase flow topology allows for the unambiguous classification of qualitatively distinct processes, which we consider to represent the functional units or modes within the dynamical architecture. Using the example of a composite movement we illustrate how different architectures can be characterized by their degree of time scale separation between the internal elements of the architecture (i.e. the functional modes) and external interventions. We reveal a tradeoff of the interactions between internal and external influences, which offers a theoretical justification for the efficient composition of complex processes out of non-trivial elementary processes or functional modes. PMID:21347363
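A minimal way to make the time-scale-separation idea tangible is a fast-slow dynamical system in which one variable relaxes much faster than the other and is effectively slaved to it. The sketch below integrates such a toy system with SciPy; it only illustrates the separation-of-timescales concept, not the specific functional architectures analyzed in the paper.

```python
# Toy fast-slow system: x relaxes quickly onto a manifold set by the slow variable y.
import numpy as np
from scipy.integrate import solve_ivp

EPS = 0.01  # ratio of fast to slow timescale; smaller means sharper separation

def rhs(t, state):
    x, y = state
    dx = (-x + np.tanh(3.0 * y)) / EPS   # fast: x is slaved to y almost instantly
    dy = -0.5 * y + 0.3 * np.sin(t)      # slow: y drifts under a weak external drive
    return [dx, dy]

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, -0.5], max_step=0.01)
x, y = sol.y
print(f"final state: x = {x[-1]:.3f}, y = {y[-1]:.3f}")
print(f"mean |x - tanh(3y)| after the initial transient: "
      f"{np.mean(np.abs(x[200:] - np.tanh(3.0 * y[200:]))):.4f}")
```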
Information Architecture of Web-Based Interventions to Improve Health Outcomes: Systematic Review
Grenen, Emily; Surla, Stacy; Schwarz, Mary; Cole-Lewis, Heather
2018-01-01
Background The rise in usage of and access to new technologies in recent years has led to a growth in digital health behavior change interventions. As the shift to digital platforms continues to grow, it is increasingly important to consider how the field of information architecture (IA) can inform the development of digital health interventions. IA is the way in which digital content is organized and displayed, which strongly impacts users’ ability to find and use content. While many information architecture best practices exist, there is a lack of empirical evidence on the role it plays in influencing behavior change and health outcomes. Objective Our aim was to conduct a systematic review synthesizing the existing literature on website information architecture and its effect on health outcomes, behavioral outcomes, and website engagement. Methods To identify all existing information architecture and health behavior literature, we searched articles published in English in the following databases (no date restrictions imposed): ACM Digital Library, CINAHL, Cochrane Library, Google Scholar, Ebsco, and PubMed. The search terms used included information terms (eg, information architecture, interaction design, persuasive design), behavior terms (eg, health behavior, behavioral intervention, ehealth), and health terms (eg, smoking, physical activity, diabetes). The search results were reviewed to determine if they met the inclusion and exclusion criteria created to identify empirical research that studied the effect of IA on health outcomes, behavioral outcomes, or website engagement. Articles that met inclusion criteria were assessed for study quality. Then, data from the articles were extracted using a priori categories established by 3 reviewers. However, the limited health outcome data gathered from the studies precluded a meta-analysis. Results The initial literature search yielded 685 results, which was narrowed down to three publications that examined the effect of information architecture on health outcomes, behavioral outcomes, or website engagement. One publication studied the isolated impact of information architecture on outcomes of interest (ie, website use and engagement; health-related knowledge, attitudes, and beliefs; and health behaviors), while the other two publications studied the impact of information architecture, website features (eg, interactivity, email prompts, and forums), and tailored content on these outcomes. The paper that investigated IA exclusively found that a tunnel IA improved site engagement and behavior knowledge, but it decreased users’ perceived efficiency. The first study that did not isolate IA found that the enhanced site condition improved site usage but not the amount of content viewed. The second study that did not isolate IA found that a tailored site condition improved site usage, behavior knowledge, and some behavior outcomes. Conclusions No clear conclusion can be made about the relationship between IA and health outcomes, given limited evidence in the peer-reviewed literature connecting IA to behavioral outcomes and website engagement. Only one study reviewed solely manipulated IA, and we therefore recommend improving the scientific evidence base such that additional empirical studies investigate the impact of IA in isolation. 
Moreover, information from the gray literature and expert opinion might be identified and added to the evidence base, in order to lay the groundwork for hypothesis generation to improve empirical evidence on information architecture and health and behavior outcomes. PMID:29563076
Image-Processing Software For A Hypercube Computer
NASA Technical Reports Server (NTRS)
Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.
1992-01-01
Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.
NASA Technical Reports Server (NTRS)
Fouts, Douglas J.; Butner, Steven E.
1991-01-01
The design of the processing element of GASP, a GaAs supercomputer with a 500-MHz instruction issue rate and 1-GHz subsystem clocks, is presented. The novel, functionally modular, block data flow architecture of GASP is described. The architecture and design of a GASP processing element is then presented. The processing element (PE) is implemented in a hybrid semiconductor module with 152 custom GaAs ICs of eight different types. The effects of the implementation technology on both the system-level architecture and the PE design are discussed. SPICE simulations indicate that parts of the PE are capable of being clocked at 1 GHz, while the rest of the PE uses a 500-MHz clock. The architecture utilizes data flow techniques at a program block level, which allows efficient execution of parallel programs while maintaining reasonably good performance on sequential programs. A simulation study of the architecture indicates that an instruction execution rate of over 30,000 MIPS can be attained with 65 PEs.
CMOL: A New Concept for Nanoelectronics
NASA Astrophysics Data System (ADS)
Likharev, Konstantin
2005-03-01
I will review the recent work on devices and architectures for future hybrid semiconductor/molecular integrated circuits, in particular those of ``CMOL'' variety [1]. Such circuits would combine an advanced CMOS subsystem fabricated by the usual lithographic patterning, two layers of parallel metallic nanowires formed, e.g., by nanoimprint, and two-terminal molecular devices self-assembled on the nanowire crosspoints. Estimates show that this powerful combination may allow CMOL circuits to reach an unparalleled density (up to 10^12 functions per cm^2) and ultrahigh rate of information processing (up to 10^20 operations per second on a single chip), at acceptable power dissipation. The main challenges on the way toward practical CMOL technology are: (i) reliable chemically-directed self-assembly of mid-size organic molecules, and (ii) the development of efficient defect-tolerant architectures for CMOL circuits. Our recent work has shown that such architectures may be developed not only for terabit-scale memories and naturally defect-tolerant mixed-signal neuromorphic networks, but (rather unexpectedly) also for FPGA-style digital Boolean circuits. [1] For details, see http://rsfq1.physics.sunysb.edu/˜likharev/nano/Springer04.pdf
Comparison of Classifier Architectures for Online Neural Spike Sorting.
Saeed, Maryam; Khan, Amir Ali; Kamboh, Awais Mehmood
2017-04-01
High-density, intracranial recordings from micro-electrode arrays need to undergo Spike Sorting in order to associate the recorded neuronal spikes to particular neurons. This involves spike detection, feature extraction, and classification. To reduce the data transmission and power requirements, on-chip real-time processing is becoming very popular. However, high computational resources are required for classifiers in on-chip spike-sorters, making scalability a great challenge. In this review paper, we analyze several popular classifiers to propose five new hardware architectures using the off-chip training with on-chip classification approach. These include support vector classification, fuzzy C-means classification, self-organizing maps classification, moving-centroid K-means classification, and Cosine distance classification. The performance of these architectures is analyzed in terms of accuracy and resource requirement. We establish that the neural networks based Self-Organizing Maps classifier offers the most viable solution. A spike sorter based on the Self-Organizing Maps classifier, requires only 7.83% of computational resources of the best-reported spike sorter, hierarchical adaptive means, while offering a 3% better accuracy at 7 dB SNR.
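As a rough sketch of the off-chip-training / on-chip-classification split discussed above, the code below computes class centroids offline and then assigns new spike feature vectors to the nearest centroid by cosine similarity. The feature dimensions and data are synthetic, and this is not any of the reviewed hardware architectures.

```python
# Off-chip step: compute one centroid per putative neuron from labeled spike features.
# "On-chip"-style step: assign a new feature vector to the centroid with the largest
# cosine similarity (equivalently, the smallest cosine distance).
import numpy as np

rng = np.random.default_rng(1)

def train_centroids(features, labels):
    return {k: features[labels == k].mean(axis=0) for k in np.unique(labels)}

def classify_cosine(x, centroids):
    def cos_sim(a, b):
        return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(centroids, key=lambda k: cos_sim(x, centroids[k]))

# Synthetic 2-neuron training set in a 4-dimensional feature space (e.g. PCA scores).
true_means = {0: np.array([2.0, 0.5, -1.0, 0.0]), 1: np.array([-1.5, 1.0, 2.0, 0.5])}
labels = rng.integers(0, 2, size=200)
features = np.stack([true_means[k] + 0.3 * rng.standard_normal(4) for k in labels])

centroids = train_centroids(features, labels)
new_spike = true_means[1] + 0.3 * rng.standard_normal(4)
print("assigned unit:", classify_cosine(new_spike, centroids))
```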
1981-10-30
We directed our review to the preselection and selection processes for direct A/E awards, not awards made under the Small Business Administration's 8... policy of the United States that small business concerns and small business concerns owned and controlled by socially and economically disadvantaged... Contracts exceeding $10,000 must contain the above statement. The Small Business Administration has defined small business concerns for A/E services...
Decoding the function of nuclear long non-coding RNAs.
Chen, Ling-Ling; Carmichael, Gordon G
2010-06-01
Long non-coding RNAs (lncRNAs) are mRNA-like, non-protein-coding RNAs that are pervasively transcribed throughout eukaryotic genomes. Rather than silently accumulating in the nucleus, many of these are now known or suspected to play important roles in nuclear architecture or in the regulation of gene expression. In this review, we highlight some recent progress in how lncRNAs regulate these important nuclear processes at the molecular level. Copyright 2010 Elsevier Ltd. All rights reserved.
An integrated systems engineering approach to aircraft design
NASA Astrophysics Data System (ADS)
Price, M.; Raghunathan, S.; Curran, R.
2006-06-01
The challenge in Aerospace Engineering, in the next two decades as set by Vision 2020, is to meet the targets of reduction of nitric oxide emission by 80%, carbon monoxide and carbon dioxide both by 50%, reduce noise by 50% and of course with reduced cost and improved safety. All this must be achieved with expected increase in capacity and demand. Such a challenge has to be in a background where the understanding of physics of flight has changed very little over the years and where industrial growth is driven primarily by cost rather than new technology. The way forward to meet the challenges is to introduce innovative technologies and develop an integrated, effective and efficient process for the life cycle design of aircraft, known as systems engineering (SE). SE is a holistic approach to a product that comprises several components. Customer specifications, conceptual design, risk analysis, functional analysis and architecture, physical architecture, design analysis and synthesis, and trade studies and optimisation, manufacturing, testing validation and verification, delivery, life cycle cost and management. Further, it involves interaction between traditional disciplines such as Aerodynamics, Structures and Flight Mechanics with people- and process-oriented disciplines such as Management, Manufacturing, and Technology Transfer. SE has become the state-of-the-art methodology for organising and managing aerospace production. However, like many well founded methodologies, it is more difficult to embody the core principles into formalised models and tools. The key contribution of the paper will be to review this formalisation and to present the very latest knowledge and technology that facilitates SE theory. Typically, research into SE provides a deeper understanding of the core principles and interactions, and helps one to appreciate the required technical architecture for fully exploiting it as a process, rather than a series of events. There are major issues as regards to systems approach to aircraft design and these include lack of basic scientific/practical models and tools for interfacing and integrating the components of SE and within a given component, for example, life cycle cost, basic models for linking the key drivers. The paper will review the current state of art in SE approach to aircraft design and identify some of the major challenges, the current state of the art and visions for the future. The review moves from an initial basis in traditional engineering design processes to consideration of costs and manufacturing in this integrated environment. Issues related to the implementation of integration in design at the detailed physics level are discussed in the case studies.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Architecture is interpreted to mean the use of the National ITS Architecture to develop a regional ITS architecture, and the subsequent adherence of all ITS projects to that regional ITS architecture. Development of the regional ITS architecture should be consistent with the transportation planning process for...
ERIC Educational Resources Information Center
Kauppinen, Heta
1989-01-01
Explores the use of analogies in architectural design and the importance of Gestalt theory and aesthetic canons in understanding and being sensitive to architecture. Emphasizes the variation between public and professional appreciation of architecture. Notes that an understanding of the architectural process enables students to improve the aesthetic…
ERIC Educational Resources Information Center
Pihl, Ole
2015-01-01
How do architecture students experience the contradictions between the individual and the group at the Department of Architecture and Design of Aalborg University? The Problem-Based Learning model has been extensively applied to the department's degree programs in coherence with the Integrated Design Process, but is a group-based architecture and…
NASA Astrophysics Data System (ADS)
Martinez, Vera
2007-02-01
The paper discusses concepts about the role of architecture in the design of space habitats and the development of general evaluation criteria for the architectural design contribution. Beyond the existing feasibility studies, general requisites, development studies, and critical design reviews, which are mainly based on the experience of human space missions and the standards of the NASA-STD-3000 manual and which analyze and evaluate the relation between man and environment and between man and machine mainly in terms of functionality, there is very little material on designing for comfort and wellbeing in a space habitat. Architecture for a space habitat means the design of an artificial environment offering comfort in an "atmosphere" of wellbeing. These are mainly psychological effects of human factors, which are very important in the case of a long-duration space mission. How can the degree of comfort and the "wellbeing atmosphere" in an artificial environment be measured? How can the quality of the architectural contribution to space design be quantified? The paper proposes: the definition of a criteria catalogue to reach greater objectivity in architectural design evaluation; the definition of constant parameters, derived from project necessities, to quantify the quality of the design; architectural design analysis through the application and verification of these parameters, with the results then overlapped and evaluated; and interdisciplinary work between architects, astronautics engineers, psychologists, and all the other disciplines needed to plan a high quality habitat for humans in space, together with an analysis of the principles of a well designed artificial environment. Good quality design for space architecture is the result of the interaction and interrelation between many different project necessities (technological, environmental, human factors, transportation, costs, etc.). Each of these necessities is interrelated in the design project and cannot be evaluated on its own. Therefore, the design process needs constant checks to choose, at each step, the best solution in relation to the whole. As with the main disciplines around human factors, architectural design for space has to be extensively tested to produce scientific improvement.
A Survey on Next-generation Power Grid Data Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
You, Shutang; Zhu, Dr. Lin; Liu, Yong
2015-01-01
The operation and control of power grids will increasingly rely on data. A high-speed, reliable, flexible and secure data architecture is a prerequisite of the next-generation power grid. This paper summarizes the challenges in collecting and utilizing power grid data, and then provides a reference data architecture for future power grids. Based on the deployment of this data architecture, related research is reviewed and summarized in several categories, including data measurement/actuation, data transmission, the data service layer, data utilization, and two cross-cutting issues, interoperability and cyber security. Research gaps and future work are also presented.
2016-01-01
The integration of a DNA copy of the viral RNA genome into host chromatin is the defining step of retroviral replication. This enzymatic process is catalyzed by the virus-encoded integrase protein, which is conserved among retroviruses and LTR-retrotransposons. Retroviral integration proceeds via two integrase activities: 3′-processing of the viral DNA ends, followed by the strand transfer of the processed ends into host cell chromosomal DNA. Herein we review the molecular mechanism of retroviral DNA integration, with an emphasis on reaction chemistries and architectures of the nucleoprotein complexes involved. We additionally discuss the latest advances on anti-integrase drug development for the treatment of AIDS and the utility of integrating retroviral vectors in gene therapy applications. PMID:27198982
A Review of Enterprise Architecture Use in Defence
2014-09-01
... dictionary of terms; architecture description language; architectural information (pertaining both to specific projects and higher level...)
Evolution and genome architecture in fungal plant pathogens.
Möller, Mareike; Stukenbrock, Eva H
2017-12-01
The fungal kingdom comprises some of the most devastating plant pathogens. Sequencing the genomes of fungal pathogens has shown a remarkable variability in genome size and architecture. Population genomic data enable us to understand the mechanisms and the history of changes in genome size and adaptive evolution in plant pathogens. Although transposable elements predominantly have negative effects on their host, fungal pathogens provide prominent examples of advantageous associations between rapidly evolving transposable elements and virulence genes that cause variation in virulence phenotypes. By providing homogeneous environments at large regional scales, managed ecosystems, such as modern agriculture, can be conducive for the rapid evolution and dispersal of pathogens. In this Review, we summarize key examples from fungal plant pathogen genomics and discuss evolutionary processes in pathogenic fungi in the context of molecular evolution, population genomics and agriculture.
Information systems in healthcare - state and steps towards sustainability.
Lenz, R
2009-01-01
To identify core challenges and first steps on the way to sustainable information systems in healthcare. Recent articles on healthcare information technology and related articles from Medical Informatics and Computer Science were reviewed and analyzed. Core challenges that have not been solved over the years are identified. The two core problem areas are process integration, meaning effectively embedding IT systems into routine workflows, and systems integration, meaning reducing the effort required to interconnect independently developed IT components. Standards for systems integration have improved considerably, but their usefulness is limited where system evolution is needed. Sustainable healthcare information systems should be based on system architectures that support system evolution and avoid costly system replacements every five to ten years. Some basic principles for the design of such systems are separation of concerns, loose coupling, deferred systems design, and service-oriented architectures.
Generic worklist handler for workflow-enabled products
NASA Astrophysics Data System (ADS)
Schmidt, Joachim; Meetz, Kirsten; Wendler, Thomas
1999-07-01
Workflow management (WfM) is an emerging field of medical information technology. It appears as a promising key technology to model, optimize and automate processes, for the sake of improved efficiency, reduced costs and improved patient care. The application of WfM concepts requires the standardization of architectures and interfaces. A component of central interest proposed in this report is a generic work list handler: a standardized interface between a workflow enactment service and an application system. Application systems with embedded work list handlers will be called 'Workflow Enabled Application Systems'. In this paper we discuss the functional requirements of work list handlers, as well as their integration into workflow architectures and interfaces. To lay the foundation for this specification, basic workflow terminology, the fundamentals of workflow management and - later in the paper - the available standards as defined by the Workflow Management Coalition are briefly reviewed.
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps (a) identify architecturally significant deviations that eluded code reviews, (b) clarify the design rules to the team, and (c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-23
... architecture and identify how elements of this architecture have been incorporated into regional/State/local... international water borders. Affected Public: State, Local or Tribal Government. Estimated Number of Respondents...
Big data processing in the cloud - Challenges and platforms
NASA Astrophysics Data System (ADS)
Zhelev, Svetoslav; Rozeva, Anna
2017-12-01
Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed: the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and the most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
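As a rough illustration of the Lambda architecture mentioned in this abstract, the sketch below merges a periodically recomputed batch view with an incrementally updated real-time view at query time. This is a generic sketch of the pattern under our own assumptions, not code from the paper; all class and function names are illustrative.

```python
from collections import Counter

# Lambda architecture in miniature: a batch layer recomputes a complete view
# from an immutable event log, a speed layer keeps an incremental view of
# recent events, and the serving layer merges the two to answer queries.

class BatchLayer:
    def __init__(self):
        self.master_dataset = []        # append-only event log
        self.batch_view = Counter()

    def append(self, event):
        self.master_dataset.append(event)

    def recompute(self):
        # full recomputation: slow but simple and robust
        self.batch_view = Counter(self.master_dataset)

class SpeedLayer:
    def __init__(self):
        self.realtime_view = Counter()

    def update(self, event):
        # incremental update for events not yet covered by the last batch run
        self.realtime_view[event] += 1

def query(batch, speed, key):
    # serving layer: batch view plus the real-time delta
    return batch.batch_view[key] + speed.realtime_view[key]

batch, speed = BatchLayer(), SpeedLayer()
for e in ["click", "view", "click"]:
    batch.append(e)
batch.recompute()
speed.update("click")                   # arrives after the last batch run
print(query(batch, speed, "click"))     # 3
```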
Polymer architecture of magnetic gels: a review
NASA Astrophysics Data System (ADS)
Weeber, Rudolf; Hermes, Melissa; Schmidt, Annette M.; Holm, Christian
2018-02-01
In this review article, we provide an introduction to ferrogels, i.e. polymeric gels with embedded magnetic particles. Due to the interplay between magnetic and elastic properties of these materials, they are promising candidates for engineering and biomedical applications such as actuation and controlled drug release. Particular emphasis will be put on the polymer architecture of magnetic gels since it controls the degrees of freedom of the magnetic particles in the gel, and it is important for the particle-polymer coupling determining the mechanisms available for the gel deformation in magnetic fields. We report on the different polymer architectures that have been realized so far, and provide an overview of synthesis strategies and experimental techniques for the characterization of these materials. We further focus on theoretical and simulational studies carried out on magnetic gels, and highlight their contributions towards understanding the influence of the gels’ polymer architecture.
A role for chromatin topology in imprinted domain regulation.
MacDonald, William A; Sachani, Saqib S; White, Carlee R; Mann, Mellissa R W
2016-02-01
Recently, many advancements in genome-wide chromatin topology and nuclear architecture have unveiled the complex and hidden world of the nucleus, where chromatin is organized into discrete neighbourhoods with coordinated gene expression. This includes the active and inactive X chromosomes. Using X chromosome inactivation as a working model, we utilized publicly available datasets together with a literature review to gain insight into topologically associated domains, lamin-associated domains, nucleolar-associating domains, scaffold/matrix attachment regions, and nucleoporin-associated chromatin and their role in regulating monoallelic expression. Furthermore, we comprehensively review for the first time the role of chromatin topology and nuclear architecture in the regulation of genomic imprinting. We propose that chromatin topology and nuclear architecture are important regulatory mechanisms for directing gene expression within imprinted domains. Furthermore, we predict that dynamic changes in chromatin topology and nuclear architecture play roles in tissue-specific imprint domain regulation during early development and differentiation.
Distributed computing environments for future space control systems
NASA Technical Reports Server (NTRS)
Viallefont, Pierre
1993-01-01
The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.
Marquardt, Gesine; Cross, Emily S.; de Sousa, Alexandra A.; Edelstein, Eve; Farnè, Alessandro; Leszczynski, Marcin; Patterson, Miles; Quadflieg, Susanne
2015-01-01
Through advances in production and treatment technologies, transparent glass has become an increasingly versatile material and a global hallmark of modern architecture. In the shape of invisible barriers, it defines spaces while simultaneously shaping their lighting, noise, and climate conditions. Despite these unique architectural qualities, little is known regarding the human experience with glass barriers. Is a material that has been described as being simultaneously there and not there from an architectural perspective, actually there and/or not there from perceptual, behavioral, and social points of view? In this article, we review systematic observations and experimental studies that explore the impact of transparent barriers on human cognition and action. In doing so, the importance of empirical and multidisciplinary approaches to inform the use of glass in contemporary architecture is highlighted and key questions for future inquiry are identified. PMID:26441756
Microcomponent chemical process sheet architecture
Wegeng, Robert S.; Drost, M. Kevin; Call, Charles J.; Birmingham, Joseph G.; McDonald, Carolyn Evans; Kurath, Dean E.; Friedrich, Michele
1998-01-01
The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one chemical process unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation.
Microcomponent chemical process sheet architecture
Wegeng, R.S.; Drost, M.K.; Call, C.J.; Birmingham, J.G.; McDonald, C.E.; Kurath, D.E.; Friedrich, M.
1998-09-22
The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one chemical process unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation. 26 figs.
An architecture for real-time vision processing
NASA Technical Reports Server (NTRS)
Chien, Chiun-Hong
1994-01-01
To study the feasibility of developing an architecture for real time vision processing, a task queue server and parallel algorithms for two vision operations were designed and implemented on an i860-based Mercury Computing System 860VS array processor. The proposed architecture treats each vision function as a task or set of tasks which may be recursively divided into subtasks and processed by multiple processors coordinated by a task queue server accessible by all processors. Each idle processor subsequently fetches a task and associated data from the task queue server for processing and posts the result to shared memory for later use. Load balancing can be carried out within the processing system without the requirement for a centralized controller. The author concludes that real time vision processing cannot be achieved without both sequential and parallel vision algorithms and a good parallel vision architecture.
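A minimal software analogue of the task-queue scheme described above is sketched below: subtasks (image tiles here) are placed on a shared queue, idle workers fetch tasks and post results, and load balancing emerges without a central controller. The tile-based "vision operation" (per-tile mean intensity) and all names are illustrative assumptions, not details from the paper.

```python
import multiprocessing as mp
import numpy as np

def worker(task_queue, result_queue):
    # each idle worker repeatedly fetches a task and posts its result
    while True:
        task = task_queue.get()
        if task is None:                 # sentinel: no more work
            break
        tile_id, tile = task
        result_queue.put((tile_id, float(tile.mean())))

def run(image, tile_size=64, n_workers=4):
    task_queue, result_queue = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=worker, args=(task_queue, result_queue))
               for _ in range(n_workers)]
    for w in workers:
        w.start()

    n_tiles = 0
    for i in range(0, image.shape[0], tile_size):
        for j in range(0, image.shape[1], tile_size):
            task_queue.put(((i, j), image[i:i + tile_size, j:j + tile_size]))
            n_tiles += 1
    for _ in workers:                    # one sentinel per worker
        task_queue.put(None)

    results = dict(result_queue.get() for _ in range(n_tiles))
    for w in workers:
        w.join()
    return results

if __name__ == "__main__":
    print(run(np.random.rand(256, 256)))
```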
Kraak, V I; Englund, T; Misyak, S; Serrano, E L
2017-08-01
This review identified and adapted choice architecture frameworks to develop a novel framework that restaurant owners could use to promote healthy food environments for customers who currently overconsume products high in fat, sugar and sodium that increase their risk of obesity and diet-related non-communicable diseases. This review was conducted in three steps and presented as a narrative summary to demonstrate a proof of concept. Step 1 was a systematic review of nudge or choice architecture frameworks used to categorize strategies that cue healthy behaviours in microenvironments. We searched nine electronic databases between January 2000 and December 2016 and identified 1,244 records. Inclusion criteria led to the selection of five choice architecture frameworks, of which three were adapted and combined with marketing mix principles to highlight eight strategies (i.e. place, profile, portion, pricing, promotion, healthy default picks, prompting or priming and proximity). Step 2 involved conducting a comprehensive evidence review between January 2006 and December 2016 to identify U.S. recommendations for the restaurant sector organized by strategy. Step 3 entailed developing 12 performance metrics for the eight strategies. This framework should be tested to determine its value to assist restaurant owners to promote and socially normalize healthy food environments to reduce obesity and non-communicable diseases. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity Federation.
Real-time FPGA architectures for computer vision
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar
2000-03-01
This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The FPGA-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory and is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented as a dedicated VLSI circuit to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.
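To make the windowed mask-image convolution concrete, the sketch below mimics a line-buffer strategy in software: each image row is read once, the most recent K rows are retained, and the K x K window slides along them so the image memory is not re-read. This is our own illustrative sketch (strictly a cross-correlation, which equals convolution for the symmetric mask used here), not the paper's hardware design.

```python
import numpy as np

def convolve_line_buffered(image, mask):
    k = mask.shape[0]                       # assume a square K x K mask
    h, w = image.shape
    out = np.zeros((h - k + 1, w - k + 1))
    line_buffer = np.zeros((k, w))          # holds the most recent K rows
    for row in range(h):
        line_buffer = np.roll(line_buffer, -1, axis=0)  # shift rows up
        line_buffer[-1] = image[row]        # single read of the new row
        if row >= k - 1:
            for col in range(w - k + 1):
                window = line_buffer[:, col:col + k]
                out[row - k + 1, col] = np.sum(window * mask)
    return out

# quick check with a 3x3 mean filter on a small image
img = np.arange(36, dtype=float).reshape(6, 6)
mask = np.ones((3, 3)) / 9.0
print(convolve_line_buffered(img, mask))
```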
Howard, Réka; Carriquiry, Alicia L.; Beavis, William D.
2014-01-01
Parametric and nonparametric methods have been developed for purposes of predicting phenotypes. These methods are based on retrospective analyses of empirical data consisting of genotypic and phenotypic scores. Recent reports have indicated that parametric methods are unable to predict phenotypes of traits with known epistatic genetic architectures. Herein, we review parametric methods including least squares regression, ridge regression, Bayesian ridge regression, least absolute shrinkage and selection operator (LASSO), Bayesian LASSO, best linear unbiased prediction (BLUP), Bayes A, Bayes B, Bayes C, and Bayes Cπ. We also review nonparametric methods including Nadaraya-Watson estimator, reproducing kernel Hilbert space, support vector machine regression, and neural networks. We assess the relative merits of these 14 methods in terms of accuracy and mean squared error (MSE) using simulated genetic architectures consisting of completely additive or two-way epistatic interactions in an F2 population derived from crosses of inbred lines. Each simulated genetic architecture explained either 30% or 70% of the phenotypic variability. The greatest impact on estimates of accuracy and MSE was due to genetic architecture. Parametric methods were unable to predict phenotypic values when the underlying genetic architecture was based entirely on epistasis. Parametric methods were slightly better than nonparametric methods for additive genetic architectures. Distinctions among parametric methods for additive genetic architectures were incremental. Heritability, i.e., proportion of phenotypic variability, had the second greatest impact on estimates of accuracy and MSE. PMID:24727289
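A toy version of the parametric-versus-nonparametric comparison described above can be sketched as follows, contrasting ridge regression with RBF kernel ridge regression on simulated additive and purely epistatic architectures. The simulation, parameter choices, and use of scikit-learn are our own illustrative assumptions, not the authors' study design.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 400, 50
X = rng.choice([-1, 0, 1], size=(n, p)).astype(float)   # F2-like genotype codes

def simulate(additive=True, h2=0.7):
    if additive:
        g = X @ rng.normal(size=p)                       # additive effects only
    else:
        pairs = rng.choice(p, size=(20, 2), replace=False)
        g = sum(X[:, i] * X[:, j] * rng.normal() for i, j in pairs)  # two-way epistasis
    e = rng.normal(scale=np.sqrt(g.var() * (1 - h2) / h2), size=n)   # set heritability
    return g + e

for label, additive in [("additive", True), ("epistatic", False)]:
    y = simulate(additive)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    for name, model in [("ridge", Ridge(alpha=1.0)),
                        ("kernel", KernelRidge(alpha=1.0, kernel="rbf", gamma=0.01))]:
        model.fit(Xtr, ytr)
        pred = model.predict(Xte)
        acc = np.corrcoef(yte, pred)[0, 1]               # accuracy as correlation
        mse = np.mean((yte - pred) ** 2)
        print(f"{label:9s} {name:6s} accuracy={acc:.2f} MSE={mse:.1f}")
```

Running a sketch like this typically reproduces the qualitative pattern reported above: the linear (parametric) model tracks the additive architecture but not the purely epistatic one, while the kernel (nonparametric) model retains some predictive ability in both cases.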
A learnable parallel processing architecture towards unity of memory and computing
NASA Astrophysics Data System (ADS)
Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.
2015-08-01
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
A learnable parallel processing architecture towards unity of memory and computing.
Li, H; Gao, B; Chen, Z; Zhao, Y; Huang, P; Ye, H; Liu, L; Liu, X; Kang, J
2015-08-14
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named "iMemComp", where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped "iMemComp" with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on "iMemComp" can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
Data Strategies to Support Automated Multi-Sensor Data Fusion in a Service Oriented Architecture
2008-06-01
and employ vast quantities of content. This dissertation provides two software architectural patterns and an auto-fusion process that guide the... UDDI), Simple Object Access Protocol (SOAP), Java, Maritime Domain Awareness (MDA), Business Process Execution Language for Web Services (BPEL4WS)... content. This dissertation provides two software architectural patterns and an auto-fusion process that guide the development of a distributed
Applicability of different onboard routing and processing techniques to mobile satellite systems
NASA Technical Reports Server (NTRS)
Craig, A. D.; Marston, P. C.; Bakken, P. M.; Vernucci, A.; Benedicto, J.
1993-01-01
The paper summarizes a study contract recently undertaken for ESA. The study compared the effectiveness of several processing architectures applied to multiple beam, geostationary global and European regional missions. The paper discusses architectures based on transparent SS-FDMA analog, transparent DSP and regenerative processing. Quantitative comparisons are presented and general conclusions are given with respect to suitability of the architectures to different mission requirements.
Petri net model for analysis of concurrently processed complex algorithms
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1986-01-01
This paper presents a Petri-net model suitable for analyzing the concurrent processing of computationally complex algorithms. The decomposed operations are to be processed in a multiple processor, data driven architecture. Of particular interest is the application of the model to both the description of the data/control flow of a particular algorithm, and to the general specification of the data driven architecture. A candidate architecture is also presented.
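A minimal Petri-net data structure of the kind described above can be sketched as follows: places hold tokens, a transition is enabled when all of its input places are marked, and firing moves tokens from inputs to outputs, which is enough to express data-driven concurrency with a fork and a join. The example net and all names are illustrative, not taken from the paper.

```python
class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)            # place -> token count
        self.transitions = transitions          # name -> (input places, output places)

    def enabled(self, t):
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        inputs, outputs = self.transitions[t]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# two operations that may run concurrently once their inputs are ready,
# followed by a join before the final result (data-driven execution)
net = PetriNet(
    marking={"data_ready": 1},
    transitions={
        "split": (["data_ready"], ["op_a_ready", "op_b_ready"]),
        "op_a":  (["op_a_ready"], ["a_done"]),
        "op_b":  (["op_b_ready"], ["b_done"]),
        "join":  (["a_done", "b_done"], ["result"]),
    },
)
for t in ["split", "op_a", "op_b", "join"]:
    net.fire(t)
print(net.marking)      # ends with one token in 'result'
```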
NASA Technical Reports Server (NTRS)
Hsia, T. C.; Lu, G. Z.; Han, W. H.
1987-01-01
In advanced robot control problems, on-line computation of the inverse Jacobian solution is frequently required. A parallel processing architecture is an effective way to reduce computation time. A parallel processing architecture is developed for the inverse Jacobian (inverse differential kinematic equation) of the PUMA arm. The proposed pipeline/parallel algorithm can be implemented on an IC chip using systolic linear arrays. This implementation requires 27 processing cells and 25 time units. Computation time is thus significantly reduced.
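The underlying computation is the inverse differential kinematics relation J(q) dq = dx, solved for the joint rates dq. The sketch below illustrates it for a toy planar two-link arm; the PUMA-specific Jacobian and the systolic-array mapping are not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

def jacobian_2link(q, l1=1.0, l2=1.0):
    # Jacobian of the end-effector position of a planar 2-link arm
    q1, q2 = q
    return np.array([
        [-l1*np.sin(q1) - l2*np.sin(q1+q2), -l2*np.sin(q1+q2)],
        [ l1*np.cos(q1) + l2*np.cos(q1+q2),  l2*np.cos(q1+q2)],
    ])

q  = np.array([0.3, 0.8])          # current joint angles (rad)
dx = np.array([0.05, -0.02])       # desired end-effector velocity (m/s)
dq = np.linalg.solve(jacobian_2link(q), dx)   # joint rates (rad/s)
print(dq)
```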
Architecture Of High Speed Image Processing System
NASA Astrophysics Data System (ADS)
Konishi, Toshio; Hayashi, Hiroshi; Ohki, Tohru
1988-01-01
An architecture for a high-speed image processing system corresponding to a new algorithm for shape understanding is proposed, and a hardware system based on this architecture was developed. The main design considerations were that the processors used should match the processing sequence of the target image and that the developed system should be practical for industrial use. As a result, it was possible to perform each processing step at a speed of 80 nanoseconds per pixel.
Integration Process for the Habitat Demonstration Unit
NASA Technical Reports Server (NTRS)
Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott
2010-01-01
The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews has allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of presently unanticipated systems. Results of the HDU field tests will influence future designs of habitat systems.
Dual-processing accounts of reasoning, judgment, and social cognition.
Evans, Jonathan St B T
2008-01-01
This article reviews a diverse set of proposals for dual processing in higher cognition within largely disconnected literatures in cognitive and social psychology. All these theories have in common the distinction between cognitive processes that are fast, automatic, and unconscious and those that are slow, deliberative, and conscious. A number of authors have recently suggested that there may be two architecturally (and evolutionarily) distinct cognitive systems underlying these dual-process accounts. However, it emerges that (a) there are multiple kinds of implicit processes described by different theorists and (b) not all of the proposed attributes of the two kinds of processing can be sensibly mapped on to two systems as currently conceived. It is suggested that while some dual-process theories are concerned with parallel competing processes involving explicit and implicit knowledge systems, others are concerned with the influence of preconscious processes that contextualize and shape deliberative reasoning and decision-making.
Speed challenge: a case for hardware implementation in soft-computing
NASA Technical Reports Server (NTRS)
Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.
2000-01-01
For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the promise of a potentially enabling technology, has been the creation of a niche that imparts an orders-of-magnitude speed advantage through implementation in parallel processing hardware, with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware, with selected application examples requiring real-time response capabilities.
High hydrostatic pressure and the cell membrane: stress response of Saccharomyces cerevisiae.
Bravim, Fernanda; de Freitas, Jéssica M; Fernandes, A Alberto R; Fernandes, Patricia M B
2010-02-01
The brewing and baking yeast Saccharomyces cerevisiae is a useful eukaryotic model of stress response systems whose study could lead to the understanding of stress response mechanisms in other organisms. High hydrostatic pressure (HHP) exerts broad effects upon yeast cells, interfering with cell membranes, cellular architecture, and the processes of polymerization and denaturation of proteins. In this review, we focus on the effect of HHP on the S. cerevisiae cell membrane and describe the main signaling pathways involved in the pressure response.
The Information Science Experiment System - The computer for science experiments in space
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.; Husson, Charles
1989-01-01
The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.
NASA Technical Reports Server (NTRS)
Traversi, M.; Piccolo, R.
1980-01-01
Tradeoff study activities and the analysis process used are described, with emphasis on (1) review of the alternatives; (2) vehicle architecture; and (3) evaluation of the propulsion system alternatives. Interim results are presented for the basic hybrid vehicle characterization; vehicle scheme development; propulsion system power and transmission ratios; vehicle weight; energy consumption and emissions; performance; production costs; reliability, availability and maintainability; life cycle costs; and operational quality. The final vehicle conceptual design is examined.
Stochastic architecture for Hopfield neural nets
NASA Technical Reports Server (NTRS)
Pavel, Sandy
1992-01-01
An expandable stochastic digital architecture for recurrent (Hopfield-like) neural networks is proposed. The main features and basic principles of stochastic processing are presented. The stochastic digital architecture is based on a chip with n fully interconnected neurons and a pipelined bit-processing structure. For large applications, a flexible way to interconnect many such chips is provided.
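The core idea of stochastic processing referred to above is that a value in [0, 1] is encoded as the probability of a 1 in a random bit stream, so arithmetic reduces to simple gates; for example, multiplying two independent streams is a bitwise AND. The sketch below illustrates only this encoding principle with an arbitrary stream length; it is not the proposed chip design.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000                                   # bits per stream (accuracy vs. cost)

def encode(x):
    # value x in [0, 1] -> random bit stream with P(bit = 1) = x
    return rng.random(N) < x

def decode(stream):
    # estimate of the encoded value: fraction of 1s in the stream
    return stream.mean()

a, b = 0.6, 0.3
product_stream = encode(a) & encode(b)       # AND gate acts as a multiplier
print(decode(product_stream))                # ~0.18, i.e. approximately a * b
```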
Knowledge Production in an Architectural Practice and a University Architectural Department
ERIC Educational Resources Information Center
Winberg, Chris
2006-01-01
Processes of knowledge production by professional architects and architects-in-training were studied and compared. Both professionals and students were involved in the production of knowledge about the architectural heritage of historical buildings in Cape Town. In a study of the artefacts produced, observations of the processes by means of which…
Discrete-Time Demodulator Architectures for Free-Space Broadband Optical Pulse-Position Modulation
NASA Technical Reports Server (NTRS)
Gray, A. A.; Lee, C.
2004-01-01
The objective of this work is to develop discrete-time demodulator architectures for broadband optical pulse-position modulation (PPM) that are capable of processing Nyquist or near-Nyquist data rates. These architectures are motivated by the numerous advantages of realizing communications demodulators in digital very large scale integrated (VLSI) circuits. The architectures are developed within a framework that encompasses a large body of work in optical communications, synchronization, and multirate discrete-time signal processing and are constrained by the limitations of the state of the art in digital hardware. This work attempts to create a bridge between theoretical communication algorithms and analysis for deep-space optical PPM and modern digital VLSI. The primary focus of this work is on the synthesis of discrete-time processing architectures for accomplishing the most fundamental functions required in PPM demodulators, post-detection filtering, synchronization, and decision processing. The architectures derived are capable of closely approximating the theoretical performance of the continuous-time algorithms from which they are derived. The work concludes with an outline of the development path that leads to hardware.
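The decision processing at the heart of a PPM demodulator can be sketched as follows: the symbol interval is split into M slots, post-detection samples are accumulated per slot, and the slot with the largest count is chosen. Synchronization and the post-detection filtering stages are omitted, and all parameters (M, sample counts, photon rates) are illustrative assumptions rather than values from this work.

```python
import numpy as np

rng = np.random.default_rng(2)
M, samples_per_slot, n_symbols = 16, 8, 5
signal_mean, background_mean = 5.0, 0.2      # photons per sample (toy values)

# transmit: one pulsed slot per symbol; receive: Poisson counts per sample
tx_symbols = rng.integers(0, M, size=n_symbols)
rx = rng.poisson(background_mean, size=(n_symbols, M, samples_per_slot)).astype(float)
for k, s in enumerate(tx_symbols):
    rx[k, s] += rng.poisson(signal_mean, size=samples_per_slot)

slot_counts = rx.sum(axis=2)                 # integrate samples within each slot
decisions = slot_counts.argmax(axis=1)       # maximum-count (ML) slot decision
print(tx_symbols, decisions)
```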
All-in-One Gel-Based Electrochromic Devices: Strengths and Recent Developments
Viñuales, Ana; Rodriguez, Javier; Tena-Zaera, Ramón
2018-01-01
Electrochromic devices (ECDs) have aroused great interest because of their potential applicability in displays and smart systems, including windows, rearview mirrors, and helmet visors. In the last decades, different device structures and materials have been proposed to meet the requirements of commercial applications to boost market entry. To this end, employing simple device architectures and achieving a competitive electrolyte are crucial to accomplish easily implementable, high-performance ECDs. The present review outlines devices comprising gel electrolytes as a single electroactive layer (“all-in-one”) ECD architecture, highlighting some advantages and opportunities they offer over other electrochromic systems. In this context, gel electrolytes not only overcome the drawbacks of liquid and solid electrolytes, such as liquids' low chemical stability and risk of leaking and solids' slow switching and lack of transparency, but also exhibit further strengths. These include easier processability, suitability for flexible substrates, and improved stabilization of the chemical species involved in redox processes, leading to better cyclability and opening wide possibilities to extend the electrochromic color palette, as discussed herein. Finally, conclusions and outlook are provided. PMID:29534466
Improving Turbine Performance with Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
DiCarlo, James A.
2007-01-01
Under the new NASA Fundamental Aeronautics Program, efforts are on-going within the Supersonics Project aimed at the implementation of advanced SiC/SiC ceramic composites into hot section components of future gas turbine engines. Due to recent NASA advancements in SiC-based fibers and matrices, these composites are lighter and capable of much higher service temperatures than current metallic superalloys, which in turn will allow the engines to operate at higher efficiencies and reduced emissions. This presentation briefly reviews studies within Task 6.3.3 that are primarily aimed at developing physics-based concepts, tools, and process/property models for micro- and macro-structural design, fabrication, and lifing of SiC/SiC turbine components in general and airfoils in particular. Particular emphasis is currently being placed on understanding and modeling (1) creep effects on residual stress development within the component, (2) fiber architecture effects on key composite properties such as design strength, and (3) preform formation processes so that the optimum architectures can be implemented into complex-shaped components, such as turbine vanes and blades.
Ruel, Jean; Lachance, Geneviève
2010-01-01
This paper presents an experimental study of three bioreactor configurations. The bioreactor is intended to be used for the development of tissue-engineered heart valve substitutes. Therefore it must be able to reproduce physiological flow and pressure waveforms accurately. A detailed analysis of three bioreactor arrangements is presented using mathematical models based on the windkessel (WK) approach. First, a review of the many applications of this approach in medical studies enhances its fundamental nature and its usefulness. Then the models are developed with reference to the actual components of the bioreactor. This study emphasizes different conflicting issues arising in the design process of a bioreactor for biomedical purposes, where an optimization process is essential to reach a compromise satisfying all conditions. Two important aspects are the need for a simple system providing ease of use and long-term sterility, opposed to the need for an advanced (thus more complex) architecture capable of a more accurate reproduction of the physiological environment. Three classic WK architectures are analyzed, and experimental results enhance the advantages and limitations of each one. PMID:21977286
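For readers unfamiliar with the windkessel approach referred to above, the sketch below integrates the simplest two-element variant, C*dP/dt = Q(t) - P/R, under a pulsatile inflow. The bioreactor models in the paper use more elaborate WK configurations, and all parameter values here are illustrative.

```python
import numpy as np

R, C = 1.0, 1.5               # peripheral resistance (mmHg*s/mL) and compliance (mL/mmHg)
dt, T = 1e-3, 5.0             # time step and duration (s)
t = np.arange(0.0, T, dt)

def inflow(time, period=1.0, systole=0.3, peak=400.0):
    # half-sine ejection during systole, zero flow during diastole
    phase = time % period
    return np.where(phase < systole, peak * np.sin(np.pi * phase / systole), 0.0)

Q = inflow(t)
P = np.empty_like(t)
P[0] = 80.0                   # initial pressure (mmHg)
for k in range(len(t) - 1):   # forward Euler integration of C*dP/dt = Q - P/R
    dPdt = (Q[k] - P[k] / R) / C
    P[k + 1] = P[k] + dt * dPdt

print(P.min(), P.max())       # rough diastolic / systolic pressures
```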
Chapter 16: Lignin Visualization: Advanced Microscopy Techniques for Lignin Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Yining; Donohoe, Bryon S
Visualization of lignin in plant cell walls, with both spatial and chemical resolution, is emerging as an important tool to understand lignin's role in the plant cell wall's nanoscale architecture and to understand and design processes intended to modify the lignin. As such, this chapter reviews recent advances in advanced imaging methods with respect to lignin in plant cell walls. This review focuses on the importance of lignin detection and localization for studies in both plant biology and biotechnology. Challenges going forward to identify and delineate lignin from other plant cell wall components and to quantitatively analyze lignin in whole cell walls from native plant tissue and treated biomass are also discussed.
Information Architecture of Web-Based Interventions to Improve Health Outcomes: Systematic Review.
Pugatch, Jillian; Grenen, Emily; Surla, Stacy; Schwarz, Mary; Cole-Lewis, Heather
2018-03-21
The rise in usage of and access to new technologies in recent years has led to a growth in digital health behavior change interventions. As the shift to digital platforms continues to grow, it is increasingly important to consider how the field of information architecture (IA) can inform the development of digital health interventions. IA is the way in which digital content is organized and displayed, which strongly impacts users' ability to find and use content. While many information architecture best practices exist, there is a lack of empirical evidence on the role it plays in influencing behavior change and health outcomes. Our aim was to conduct a systematic review synthesizing the existing literature on website information architecture and its effect on health outcomes, behavioral outcomes, and website engagement. To identify all existing information architecture and health behavior literature, we searched articles published in English in the following databases (no date restrictions imposed): ACM Digital Library, CINAHL, Cochrane Library, Google Scholar, Ebsco, and PubMed. The search terms used included information terms (eg, information architecture, interaction design, persuasive design), behavior terms (eg, health behavior, behavioral intervention, ehealth), and health terms (eg, smoking, physical activity, diabetes). The search results were reviewed to determine if they met the inclusion and exclusion criteria created to identify empirical research that studied the effect of IA on health outcomes, behavioral outcomes, or website engagement. Articles that met inclusion criteria were assessed for study quality. Then, data from the articles were extracted using a priori categories established by 3 reviewers. However, the limited health outcome data gathered from the studies precluded a meta-analysis. The initial literature search yielded 685 results, which was narrowed down to three publications that examined the effect of information architecture on health outcomes, behavioral outcomes, or website engagement. One publication studied the isolated impact of information architecture on outcomes of interest (ie, website use and engagement; health-related knowledge, attitudes, and beliefs; and health behaviors), while the other two publications studied the impact of information architecture, website features (eg, interactivity, email prompts, and forums), and tailored content on these outcomes. The paper that investigated IA exclusively found that a tunnel IA improved site engagement and behavior knowledge, but it decreased users' perceived efficiency. The first study that did not isolate IA found that the enhanced site condition improved site usage but not the amount of content viewed. The second study that did not isolate IA found that a tailored site condition improved site usage, behavior knowledge, and some behavior outcomes. No clear conclusion can be made about the relationship between IA and health outcomes, given limited evidence in the peer-reviewed literature connecting IA to behavioral outcomes and website engagement. Only one study reviewed solely manipulated IA, and we therefore recommend improving the scientific evidence base such that additional empirical studies investigate the impact of IA in isolation. Moreover, information from the gray literature and expert opinion might be identified and added to the evidence base, in order to lay the groundwork for hypothesis generation to improve empirical evidence on information architecture and health and behavior outcomes. 
©Jillian Pugatch, Emily Grenen, Stacy Surla, Mary Schwarz, Heather Cole-Lewis. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.03.2018.
2009-12-01
Business Process Modeling; BPMN Business Process Modeling Notation; SoA Service-oriented Architecture; UML Unified Modeling Language; CSP... system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), model-driven architecture
An agent based architecture for high-risk neonate management at neonatal intensive care unit.
Malak, Jaleh Shoshtarian; Safdari, Reza; Zeraati, Hojjat; Nayeri, Fatemeh Sadat; Mohammadzadeh, Niloofar; Farajollah, Seide Sedighe Seied
2018-01-01
In recent years, the use of new tools and technologies has decreased the neonatal mortality rate. Despite the positive effect of using these technologies, the decisions are complex and uncertain in critical conditions when the neonate is preterm or has a low birth weight or malformations. There is a need to automate the high-risk neonate management process by creating real-time and more precise decision support tools. To create a collaborative and real-time environment to manage neonates with critical conditions at the NICU (Neonatal Intensive Care Unit) and to overcome high-risk neonate management weaknesses by applying a multi agent based analysis and design methodology as a new solution for NICU management. This study was basic research for medical informatics method development, carried out in 2017. The requirement analysis was done by reviewing articles on NICU Decision Support Systems. PubMed, Science Direct, and IEEE databases were searched. Only English articles published after 1990 were included; also, a needs assessment was done by reviewing the extracted features and current processes in the NICU environment where the research was conducted. We analyzed the requirements and identified the main system roles (agents) and interactions by a comparative study of existing NICU decision support systems. The Universal Multi Agent Platform (UMAP) was applied to implement a prototype of our multi agent based high-risk neonate management architecture. Local environment agents interacted inside a container and each container interacted with external resources, including other NICU systems and consultation centers. In the NICU container, the main identified agents were reception, monitoring, NICU registry, and outcome prediction, which interacted with human agents including nurses and physicians. Managing patients at the NICU units requires online data collection, real-time collaboration, and management of many components. Multi agent systems are applied as a well-known solution for management, coordination, modeling, and control of NICU processes. We are currently working on an outcome prediction module using artificial intelligence techniques for neonatal mortality risk prediction. The full implementation and evaluation of the proposed architecture are considered future work.
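The agent roles identified above (reception, monitoring, registry, outcome prediction) can be sketched as a toy message-passing system, shown below. The class names mirror the roles named in the abstract, but the container mechanics, thresholds, and behaviour are illustrative assumptions, not the UMAP-based prototype.

```python
class Container:
    # shared environment through which local agents exchange messages
    def __init__(self):
        self.agents = {}

    def register(self, name, agent):
        self.agents[name] = agent

    def send(self, to, message):
        return self.agents[to].receive(message)

class ReceptionAgent:
    def __init__(self, container):
        self.c = container

    def admit(self, neonate):
        self.c.send("registry", {"event": "admit", "neonate": neonate})
        return self.c.send("monitoring", neonate)

class MonitoringAgent:
    def __init__(self, container):
        self.c = container

    def receive(self, neonate):
        # toy high-risk rule; real criteria would come from clinical guidelines
        high_risk = neonate["weight_g"] < 1500 or neonate["gestation_wk"] < 32
        return self.c.send("prediction", {**neonate, "high_risk": high_risk})

class RegistryAgent:
    def __init__(self):
        self.records = []

    def receive(self, message):
        self.records.append(message)

class PredictionAgent:
    def receive(self, neonate):
        # placeholder for the mortality-risk model mentioned as future work
        return "alert NICU team" if neonate["high_risk"] else "routine care"

c = Container()
c.register("registry", RegistryAgent())
c.register("monitoring", MonitoringAgent(c))
c.register("prediction", PredictionAgent())
reception = ReceptionAgent(c)
print(reception.admit({"id": 1, "weight_g": 1200, "gestation_wk": 30}))
```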
2016-03-01
Using System Architecture, Review Entry Criteria, and Standard Work Package Data to Enable Rapid Development of Integrated Master Schedules, by Burton W. Porter Jr. Master's Thesis, March 2016.
Progress in a novel architecture for high performance processing
NASA Astrophysics Data System (ADS)
Zhang, Zhiwei; Liu, Meng; Liu, Zijun; Du, Xueliang; Xie, Shaolin; Ma, Hong; Ding, Guangxin; Ren, Weili; Zhou, Fabiao; Sun, Wenqin; Wang, Huijuan; Wang, Donglin
2018-04-01
High performance processing (HPP) is an innovative architecture that targets high performance computing with excellent power efficiency and computing performance. It is suitable for data-intensive applications such as supercomputing, machine learning and wireless communication. An example chip with four application-specific integrated circuit (ASIC) cores, the first generation of HPP cores, has been taped out successfully using the Taiwan Semiconductor Manufacturing Company (TSMC) 40 nm low power process. The innovative architecture shows great energy efficiency over the traditional central processing unit (CPU) and general-purpose computing on graphics processing units (GPGPU). Compared with MaPU, HPP has made great improvements in architecture. A chip with 32 HPP cores is being developed using the TSMC 16 nm field effect transistor (FFC) technology process and is planned for commercial use. The peak performance of this chip can reach 4.3 teraFLOPS (TFLOPS) and its power efficiency is up to 89.5 gigaFLOPS per watt (GFLOPS/W).
Natural language processing in pathology: a scoping review.
Burger, Gerard; Abu-Hanna, Ameen; de Keizer, Nicolette; Cornet, Ronald
2016-07-22
Encoded pathology data are key for medical registries and analyses, but pathology information is often expressed as free text. We reviewed and assessed the use of NLP (natural language processing) for encoding pathology documents. Papers addressing NLP in pathology were retrieved from PubMed, Association for Computing Machinery (ACM) Digital Library and Association for Computational Linguistics (ACL) Anthology. We reviewed and summarised the study objectives; NLP methods used and their validation; software implementations; the performance on the dataset used and any reported use in practice. The main objectives of the 38 included papers were encoding and extraction of clinically relevant information from pathology reports. Common approaches were word/phrase matching, probabilistic machine learning and rule-based systems. Five papers (13%) compared different methods on the same dataset. Four papers did not specify the method(s) used. 18 of the 26 studies that reported F-measure, recall or precision reported values of over 0.9. Proprietary software was the most frequently mentioned category (14 studies); General Architecture for Text Engineering (GATE) was the most applied architecture overall. Practical system use was reported in four papers. Most papers used expert annotation validation. Different methods are used in NLP research in pathology, and good performances, that is, high precision and recall, high retrieval/removal rates, are reported for all of these. Lack of validation and of shared datasets precludes performance comparison. More comparative analysis and validation are needed to provide better insight into the performance and merits of these methods. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
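The precision, recall and F-measure figures cited above combine as in the sketch below; the toy gold-standard and extracted term sets are invented for illustration.

```python
def precision_recall_f1(extracted, gold):
    extracted, gold = set(extracted), set(gold)
    tp = len(extracted & gold)                    # correctly extracted terms
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = {"adenocarcinoma", "margin negative", "grade 2", "pT2"}
extracted = {"adenocarcinoma", "grade 2", "pT2", "lymphovascular invasion"}
print(precision_recall_f1(extracted, gold))       # (0.75, 0.75, 0.75)
```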
40 CFR 52.1174 - Control strategy: Ozone.
Code of Federal Regulations, 2010 CFR
2010-07-01
... oxides of nitrogen requirements for conformity and new source review. These are required by sections 176... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program...
40 CFR 52.1174 - Control strategy: Ozone.
Code of Federal Regulations, 2012 CFR
2012-07-01
... oxides of nitrogen requirements for conformity and new source review. These are required by sections 176... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program...
40 CFR 52.1174 - Control strategy: Ozone.
Code of Federal Regulations, 2013 CFR
2013-07-01
... oxides of nitrogen requirements for conformity and new source review. These are required by sections 176... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program...
40 CFR 52.1174 - Control strategy: Ozone.
Code of Federal Regulations, 2014 CFR
2014-07-01
... oxides of nitrogen requirements for conformity and new source review. These are required by sections 176... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program...
40 CFR 52.1174 - Control strategy: Ozone.
Code of Federal Regulations, 2011 CFR
2011-07-01
... oxides of nitrogen requirements for conformity and new source review. These are required by sections 176... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program... architectural, industrial, and maintenance coatings rule; auto body refinisher self-certification audit program...
Networks: A Review of Their Technology, Architecture, and Implementation.
ERIC Educational Resources Information Center
Learn, Larry L.
1988-01-01
This overview of network-related technologies covers network elements, analog and digital signals, transmission media and their characteristics, equipment certification, multiplexing, network types, access technologies, network architectures, local-area network technologies and attributes, protocols, internetworking, fiber optics versus satellites,…
Molecular basis of angiosperm tree architecture
USDA-ARS?s Scientific Manuscript database
The shoot architecture of trees greatly impacts orchard and forest management methods. Amassing greater knowledge of the molecular genetics behind tree form can benefit these industries as well as contribute to basic knowledge of plant developmental biology. This review covers basic components of ...
Litzov, Ivan; Brabec, Christoph J.
2013-01-01
Solution-processed inverted bulk heterojunction (BHJ) solar cells have gained much more attention during the last decade, because of their significantly better environmental stability compared to the normal architecture BHJ solar cells. Transparent metal oxides (MeOx) play an important role as the dominant class for solution-processed interface materials in this development, due to their excellent optical transparency, their relatively high electrical conductivity and their tunable work function. This article reviews the advantages and disadvantages of the most common synthesis methods used for the wet chemical preparation of the most relevant n-type- and p-type-like MeOx interface materials consisting of binary compounds AxBy. Their performance for applications as electron transport/extraction layers (ETL/EEL) and as hole transport/extraction layers (HTL/HEL) in inverted BHJ solar cells will be reviewed and discussed. PMID:28788423
Big data - smart health strategies. Findings from the yearbook 2014 special theme.
Koutkias, V; Thiessard, F
2014-08-15
To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future.
A hybrid method for evaluating enterprise architecture implementation.
Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam
2017-02-01
Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. There are insufficient practices in existing EA evaluation models in terms of considering all EA functions and processes, using structured methods in developing EA implementation, employing matured practices, and using appropriate metrics to achieve proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step is to identify EA implementation evaluation practices. To this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed based on the foundation and information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation and Information Systems (ISs) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
Zhang, Xiang-Yu; Fang, Gang; Zhou, Jie
2017-01-01
Additive manufacturing (AM), nowadays commonly known as 3D printing, is a revolutionary materials processing technology, particularly suitable for the production of low-volume parts with high shape complexities and often with multiple functions. As such, it holds great promise for the fabrication of patient-specific implants. In recent years, remarkable progress has been made in implementing AM in the bio-fabrication field. This paper presents an overview of the state-of-the-art AM technology for bone tissue engineering (BTE) scaffolds, with a particular focus on the AM scaffolds made of metallic biomaterials. It starts with a brief description of architecture design strategies to meet the biological and mechanical property requirements of scaffolds. Then, it summarizes the working principles, advantages and limitations of each of the AM methods suitable for creating porous structures and manufacturing scaffolds from powdered materials. It elaborates on the finite-element (FE) analysis applied to predict the mechanical behavior of AM scaffolds, as well as the effect of the architectural design of the porous structure on its mechanical properties. The review ends with the authors’ view on the current challenges and further research directions. PMID:28772411
Analysis of Android Device-Based Solutions for Fall Detection
Casilari, Eduardo; Luque, Rafael; Morón, María-José
2015-01-01
Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928
Three-Dimensional Nanobiocomputing Architectures With Neuronal Hypercells
2007-06-01
... Neumann architectures, and CMOS fabrication. Novel solutions of massive parallel distributed computing and processing (pipelined due to systolic... and processing platforms utilizing molecular hardware within an enabling organization and architecture. The design technology is based on utilizing a... Microsystems and Nanotechnologies investigated a novel 3D3 (Hardware Software Nanotechnology) technology to design super-high performance computing...
Image Understanding Architecture
1991-09-01
... architecture to support real-time, knowledge-based image understanding, and develop the software support environment that will be needed to utilize... Image Understanding Architecture, Knowledge-Based Vision, AI, Real-Time Computer Vision, Software Simulator, Parallel Processor... information. In addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers...
TMT approach to observatory software development process
NASA Astrophysics Data System (ADS)
Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder
2016-07-01
The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.
Automated Synthesis of Architecture of Avionic Systems
NASA Technical Reports Server (NTRS)
Chau, Savio; Xu, Joseph; Dang, Van; Lu, James F.
2006-01-01
The Architecture Synthesis Tool (AST) is software that automatically synthesizes software and hardware architectures of avionic systems. The AST is expected to be most helpful during initial formulation of an avionic-system design, when system requirements change frequently and manual modification of architecture is time-consuming and susceptible to error. The AST comprises two parts: (1) an architecture generator, which utilizes a genetic algorithm to create a multitude of architectures; and (2) a functionality evaluator, which analyzes the architectures for viability, rejecting most of the non-viable ones. The functionality evaluator generates and uses a viability tree, a hierarchy representing functions and the components that perform them, such that the system as a whole performs the system-level functions representing the requirements specified by the user. Architectures that survive the functionality evaluator are further evaluated by the selection process of the genetic algorithm. Architectures found to be most promising to satisfy the user's requirements and to perform optimally are selected as parents of the next generation of architectures. The foregoing process is iterated as many times as the user desires. The final output is one or a few viable architectures that satisfy the user's requirements.
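Editorial illustration: the generate-and-evaluate loop described above can be sketched in a few lines of Python. The component names, viability check, and fitness function below are hypothetical stand-ins, not the actual AST implementation.

    # Minimal sketch of a generate-and-evaluate loop in the spirit of the AST:
    # a genetic algorithm proposes architectures and a viability check rejects
    # non-viable ones. Component names and scoring are hypothetical.
    import random

    COMPONENTS = ["imu_a", "imu_b", "gps", "flight_cpu", "backup_cpu", "data_bus"]

    def random_architecture():
        return {c for c in COMPONENTS if random.random() < 0.5}

    def viable(arch):
        # Stand-in for the viability tree: at least one sensor and one processor
        # must be present for the system-level functions to be performed.
        return bool(arch & {"imu_a", "imu_b", "gps"}) and bool(arch & {"flight_cpu", "backup_cpu"})

    def fitness(arch):
        # Toy objective: viable architectures score higher the fewer parts they use.
        return len(COMPONENTS) - len(arch) if viable(arch) else -1.0

    def evolve(generations=25, pop_size=32):
        population = [random_architecture() for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            parents = population[: pop_size // 4]           # most promising architectures
            children = []
            while len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = {c for c in COMPONENTS
                         if c in (a if random.random() < 0.5 else b)}  # crossover
                if random.random() < 0.1:                    # mutation
                    child ^= {random.choice(COMPONENTS)}
                children.append(child)
            population = children
        return max(population, key=fitness)

    print(sorted(evolve()))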
Actin dynamics, architecture, and mechanics in cell motility.
Blanchoin, Laurent; Boujemaa-Paterski, Rajaa; Sykes, Cécile; Plastino, Julie
2014-01-01
Tight coupling between biochemical and mechanical properties of the actin cytoskeleton drives a large range of cellular processes including polarity establishment, morphogenesis, and motility. This is possible because actin filaments are semi-flexible polymers that, in conjunction with the molecular motor myosin, can act as biological active springs or "dashpots" (in laymen's terms, shock absorbers or fluidizers) able to exert or resist against force in a cellular environment. To modulate their mechanical properties, actin filaments can organize into a variety of architectures generating a diversity of cellular organizations including branched or crosslinked networks in the lamellipodium, parallel bundles in filopodia, and antiparallel structures in contractile fibers. In this review we describe the feedback loop between biochemical and mechanical properties of actin organization at the molecular level in vitro, then we integrate this knowledge into our current understanding of cellular actin organization and its physiological roles.
Ion Conduction in Microphase-Separated Block Copolymer Electrolytes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kambe, Yu; Arges, Christopher G.; Patel, Shrayesh
2017-01-01
Microphase separation of block copolymers provides a promising route towards engineering a mechanically robust ion conducting film for electrochemical devices. The separation into two different nano-domains enables the film to simultaneously exhibit both high ion conductivity and mechanical robustness, material properties inversely related in most homopolymer and random copolymer electrolytes. To exhibit the maximum conductivity and mechanical robustness, both domains would span across macroscopic length scales enabling uninterrupted ion conduction. One way to achieve this architecture is through external alignment fields that are applied during the microphase separation process. In this review, we present the progress and challenges for aligning the ionic domains in block copolymer electrolytes. A survey of alignment and characterization is followed by a discussion of how the nanoscale architecture affects the bulk conductivity and how alignment may be improved to maximize the number of participating conduction domains.
Development of the brain's functional network architecture.
Vogel, Alecia C; Power, Jonathan D; Petersen, Steven E; Schlaggar, Bradley L
2010-12-01
A full understanding of the development of the brain's functional network architecture requires not only an understanding of developmental changes in neural processing in individual brain regions but also an understanding of changes in inter-regional interactions. Resting state functional connectivity MRI (rs-fcMRI) is increasingly being used to study functional interactions between brain regions in both adults and children. We briefly review methods used to study functional interactions and networks with rs-fcMRI and how these methods have been used to define developmental changes in network functional connectivity. The developmental rs-fcMRI studies to date have found two general properties. First, regional interactions change from being predominately anatomically local in children to interactions spanning longer cortical distances in young adults. Second, this developmental change in functional connectivity occurs, in general, via mechanisms of segregation of local regions and integration of distant regions into disparate subnetworks.
Chromatin Insulators and Topological Domains: Adding New Dimensions to 3D Genome Architecture
Matharu, Navneet K.; Ahanger, Sajad H.
2015-01-01
The spatial organization of metazoan genomes has a direct influence on fundamental nuclear processes that include transcription, replication, and DNA repair. It is imperative to understand the mechanisms that shape the 3D organization of the eukaryotic genomes. Chromatin insulators have emerged as one of the central components of the genome organization tool-kit across species. Recent advancements in chromatin conformation capture technologies have provided important insights into the architectural role of insulators in genomic structuring. Insulators are involved in 3D genome organization at multiple spatial scales and are important for dynamic reorganization of chromatin structure during reprogramming and differentiation. In this review, we will discuss the classical view and our renewed understanding of insulators as global genome organizers. We will also discuss the plasticity of chromatin structure and its re-organization during pluripotency and differentiation and in situations of cellular stress. PMID:26340639
Developing a taxonomy for mission architecture definition
NASA Technical Reports Server (NTRS)
Neubek, Deborah J.
1990-01-01
The Lunar and Mars Exploration Program Office (LMEPO) was tasked to define candidate architectures for the Space Exploration Initiative to submit to NASA senior management and an externally constituted Outreach Synthesis Group. A systematic, structured process was developed for generating, characterizing, and describing the alternate mission architectures, and for applying this process to future studies. The work was done in two phases: (1) national needs were identified and categorized into objectives achievable by the Space Exploration Initiative; and (2) a program development process was created which both hierarchically and iteratively describes the program planning process.
Singlet exciton fission photovoltaics.
Lee, Jiye; Jadhav, Priya; Reusswig, Philip D; Yost, Shane R; Thompson, Nicholas J; Congreve, Daniel N; Hontz, Eric; Van Voorhis, Troy; Baldo, Marc A
2013-06-18
Singlet exciton fission, a process that generates two excitons from a single photon, is perhaps the most efficient of the various multiexciton-generation processes studied to date, offering the potential to increase the efficiency of solar devices. But its unique characteristic, splitting a photogenerated singlet exciton into two dark triplet states, means that the empty absorption region between the singlet and triplet excitons must be filled by adding another material that captures low-energy photons. This has required the development of specialized device architectures. In this Account, we review work to develop devices that harness the theoretical benefits of singlet exciton fission. First, we discuss singlet fission in the archetypal material, pentacene. Pentacene-based photovoltaic devices typically show high external and internal quantum efficiencies. They have enabled researchers to characterize fission, including yield and the impact of competing loss processes, within functional devices. We review in situ probes of singlet fission that modulate the photocurrent using a magnetic field. We also summarize studies of the dissociation of triplet excitons into charge at the pentacene-buckyball (C60) donor-acceptor interface. Multiple independent measurements confirm that pentacene triplet excitons can dissociate at the C60 interface despite their relatively low energy. Because triplet excitons produced by singlet fission each have no more than half the energy of the original photoexcitation, they limit the potential open circuit voltage within a solar cell. Thus, if singlet fission is to increase the overall efficiency of a solar cell and not just double the photocurrent at the cost of halving the voltage, it is necessary to also harvest photons in the absorption gap between the singlet and triplet energies of the singlet fission material. We review two device architectures that attempt this using long-wavelength materials: a three-layer structure that uses long- and short-wavelength donors and an acceptor and a simpler, two-layer combination of a singlet-fission donor and a long-wavelength acceptor. An example of the trilayer structure is singlet fission in tetracene with copper phthalocyanine inserted at the C60 interface. The bilayer approach includes pentacene photovoltaic cells with an acceptor of infrared-absorbing lead sulfide or lead selenide nanocrystals. Lead selenide nanocrystals appear to be the most promising acceptors, exhibiting efficient triplet exciton dissociation and high power conversion efficiency. Finally, we review architectures that use singlet fission materials to sensitize other absorbers, thereby effectively converting conventional donor materials to singlet fission dyes. In these devices, photoexcitation occurs in a particular molecule and then energy is transferred to a singlet fission dye where the fission occurs. For example, rubrene inserted between a donor and an acceptor decouples the ability to perform singlet fission from other major photovoltaic properties such as light absorption.
A 48Cycles/MB H.264/AVC Deblocking Filter Architecture for Ultra High Definition Applications
NASA Astrophysics Data System (ADS)
Zhou, Dajiang; Zhou, Jinjia; Zhu, Jiayi; Goto, Satoshi
In this paper, a highly parallel deblocking filter architecture for H.264/AVC is proposed to process one macroblock in 48 clock cycles and give real-time support to QFHD@60fps sequences at less than 100 MHz. Four edge filters organized in two groups for simultaneously processing vertical and horizontal edges are applied in this architecture to enhance its throughput. While parallelism increases, pipeline hazards arise owing to the latency of the edge filters and the data dependencies of the deblocking algorithm. To solve this problem, a zig-zag processing schedule is proposed to eliminate the pipeline bubbles. The data path of the architecture is then derived according to the processing schedule and optimized through data flow merging, so as to minimize the cost of logic and internal buffering. Meanwhile, the architecture's data input rate is designed to be identical to its throughput, while the transmission order of input data can also match the zig-zag processing schedule. Therefore, no intercommunication buffer is required between the deblocking filter and its previous component for speed matching or data reordering. As a result, only one 24×64 two-port SRAM is required as an internal buffer in this design. When synthesized in a SMIC 130 nm process, the architecture has a gate count of 30.2k, which is competitive considering its high performance.
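Editorial note: the stated throughput can be sanity-checked with simple arithmetic, assuming QFHD means 3840x2160 pixels and standard 16x16-pixel macroblocks.

    # Back-of-the-envelope check of the 48 cycles/MB claim, assuming
    # QFHD = 3840x2160 pixels and 16x16-pixel macroblocks.
    width, height, fps = 3840, 2160, 60
    cycles_per_macroblock = 48

    macroblocks_per_frame = (width // 16) * (height // 16)       # 240 * 135 = 32,400
    macroblocks_per_second = macroblocks_per_frame * fps          # 1,944,000
    required_clock_hz = macroblocks_per_second * cycles_per_macroblock

    print(f"required clock: {required_clock_hz / 1e6:.1f} MHz")   # ~93.3 MHz, below 100 MHz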
Individual Differences in Language Acquisition and Processing.
Kidd, Evan; Donnelly, Seamus; Christiansen, Morten H
2018-02-01
Humans differ in innumerable ways, with considerable variation observable at every level of description, from the molecular to the social. Traditionally, linguistic and psycholinguistic theory has downplayed the possibility of meaningful differences in language across individuals. However, it is becoming increasingly evident that there is significant variation among speakers at any age as well as across the lifespan. Here, we review recent research in psycholinguistics, and argue that a focus on individual differences (IDs) provides a crucial source of evidence that bears strongly upon core issues in theories of the acquisition and processing of language; specifically, the role of experience in language acquisition, processing, and attainment, and the architecture of the language system. Copyright © 2017 Elsevier Ltd. All rights reserved.
Allen, Nicola; Pichler, Franz; Wang, Tina; Patel, Sundip; Salek, Sam
2013-12-01
European countries are increasingly utilising health technology assessment (HTA) to inform reimbursement decision-making. However, the current European HTA environment is very diverse, and projects are already underway to initiate a more efficient and aligned HTA practice within Europe. This study aims to identify a non-ranking method for classifying the diversity of European HTA agencies' processes and the organisational architecture of national regulatory review to reimbursement systems. Using a previously developed mapping methodology, this research created process maps to describe national processes for regulatory review to reimbursement for 33 European jurisdictions. These process maps enabled the creation of two HTA taxonomic sets. The confluence of the two taxonomic sets was subsequently cross-referenced to identify 10 HTA archetype groups. HTA is a young, rapidly evolving field and it can be argued that optimal practices for performing HTA are yet to emerge. Therefore, a non-ranking classification approach could objectively characterise and compare the diversity observed in the current European HTA environment. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
78 FR 33331 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-04
... network architecture. The Act also charges NTIA with establishing a grant program to assist state... and effective means to use and integrate the infrastructure, equipment, and other architecture... for the standard three years. Affected Public: Business or other for-profit organizations. Frequency...
HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains
NASA Astrophysics Data System (ADS)
Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro
The Service-Oriented Architecture (SOA) development paradigm has emerged to address the critical issues of creating, modifying and extending solutions for business process integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows SOA principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, even though such events are central to supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business process pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscribe and a reliable messaging service.
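Editorial illustration: the event publish/subscribe mechanism mentioned above can be sketched with a minimal in-memory broker. This is not the HYDRA middleware itself; topic names, handlers and payloads are hypothetical.

    # Minimal publish/subscribe sketch of the event-driven (EDA) side of a
    # hybrid SOA/EDA broker. Topics, handlers and payloads are hypothetical.
    from collections import defaultdict

    class EventBroker:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, event):
            for handler in self._subscribers[topic]:
                handler(event)

    broker = EventBroker()
    broker.subscribe("order.created", lambda e: print("notify supplier:", e))
    broker.subscribe("order.created", lambda e: print("start monitoring:", e))
    broker.publish("order.created", {"order_id": 42, "items": ["sensor", "cable"]})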
Farrell, Richard A; Petkov, Nikolay; Morris, Michael A; Holmes, Justin D
2010-09-15
Self-assembled nanoscale porous architectures, such as mesoporous silica (MPS) films, block copolymer (BCP) films and porous anodic aluminas (PAAs), are ideal hosts for templating one-dimensional (1D) nano-entities for a wide range of electronic, photonic, magnetic and environmental applications. All three of these templates can provide scalable and tunable pore diameters below 20 nm [1-3]. Recently, research has progressed towards controlling the pore direction, orientation and long-range order of these nanostructures through so-called directed self-assembly (DSA). Significantly, the introduction of a wide range of top-down chemically and physically pre-patterned substrates has facilitated the DSA of nanostructures into functional device arrays. The following review begins with an overview of the fundamental aspects of self-assembly and ordering processes during the formation of PAAs, BCPs and MPS films. Special attention is given to the different ways of directing self-assembly, concentrating on properties such as uni-directional alignment, precision placement and registry of the self-assembled structures to hierarchical or top-down architectures. Finally, to distinguish this review from other articles, we focus on research where nanostructures have been utilised in part to fabricate arrays of functioning devices below the 50 nm threshold, by subtractive transfer and additive methods. Where possible, we attempt to compare and contrast the different templating approaches and highlight the strengths and/or limitations that will be important for their potential integration into downstream processes. Copyright 2010 Elsevier Inc. All rights reserved.
Spacelab output processing system architectural study
NASA Technical Reports Server (NTRS)
1977-01-01
Two different system architectures are presented. The two architectures are derived from two different data flows within the Spacelab Output Processing System. The major differences between these system architectures are in the position of the decommutation function (the first architecture performs decommutation in the latter half of the system and the second architecture performs that function in the front end of the system). For examination, the system was divided into five stand-alone subsystems: Work Assembler, Mass Storage System, Output Processor, Peripheral Pool, and Resource Monitor. The workload of each subsystem was estimated independent of the specific devices to be used. Candidate devices were surveyed from a wide sampling of off-the-shelf devices. Analytical expressions were developed to quantify the projected workload in conjunction with typical devices which would adequately handle the subsystem tasks. All of the study efforts were then directed toward preparing performance and cost curves for each architecture subsystem.
Technology architecture guidelines for a health care system.
Jones, D T; Duncan, R; Langberg, M L; Shabot, M M
2000-01-01
Although the demand for use of information technology within the healthcare industry is intensifying, relatively little has been written about guidelines to optimize IT investments. A technology architecture is a set of guidelines for technology integration within an enterprise. The architecture is a critical tool in the effort to control information technology (IT) operating costs by constraining the number of technologies supported. A well-designed architecture is also an important aid to integrating disparate applications, data stores and networks. The authors led the development of a thorough, carefully designed technology architecture for a large and rapidly growing health care system. The purpose and design criteria are described, as well as the process for gaining consensus and disseminating the architecture. In addition, the processes for using, maintaining, and handling exceptions are described. The technology architecture is extremely valuable to health care organizations both in controlling costs and promoting integration.
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine-dependent, architecture-dependent, and architecture-independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
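Editorial illustration: a toy example of an architecture-independent, source-level optimization of the kind surveyed is constant folding over an expression tree. The snippet below is not taken from the paper; it is a minimal sketch only.

    # Toy sketch of one architecture-independent optimization: constant folding
    # over a Python expression tree (requires Python 3.9+ for ast.unparse).
    # Illustrative only; not taken from the survey.
    import ast
    import operator

    FOLDABLE = {ast.Add: operator.add, ast.Sub: operator.sub,
                ast.Mult: operator.mul, ast.FloorDiv: operator.floordiv}

    class ConstantFolder(ast.NodeTransformer):
        def visit_BinOp(self, node):
            self.generic_visit(node)                      # fold children first
            if (isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant)
                    and type(node.op) in FOLDABLE):
                value = FOLDABLE[type(node.op)](node.left.value, node.right.value)
                return ast.copy_location(ast.Constant(value), node)
            return node

    def fold_constants(source):
        return ast.unparse(ConstantFolder().visit(ast.parse(source, mode="eval")))

    print(fold_constants("x * (2 + 3) + 4 * 8"))          # -> x * 5 + 32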
Buildings, Beauty, and the Brain: A Neuroscience of Architectural Experience.
Coburn, Alex; Vartanian, Oshin; Chatterjee, Anjan
2017-09-01
A burgeoning interest in the intersection of neuroscience and architecture promises to offer biologically inspired insights into the design of spaces. The goal of such interdisciplinary approaches to architecture is to motivate construction of environments that would contribute to people's flourishing in behavior, health, and well-being. We suggest that this nascent field of neuroarchitecture is at a pivotal point in which neuroscience and architecture are poised to extend to a neuroscience of architecture. In such a research program, architectural experiences themselves are the target of neuroscientific inquiry. Here, we draw lessons from recent developments in neuroaesthetics to suggest how neuroarchitecture might mature into an experimental science. We review the extant literature and offer an initial framework from which to contextualize such research. Finally, we outline theoretical and technical challenges that lie ahead.
ERIC Educational Resources Information Center
Bin Hassan, Isham Shah; Ismail, Mohd Arif; Mustafa, Ramlee
2011-01-01
The purpose of this research is to examine the effect of integrating mobile and CAD technology on teaching the architectural design process to Malaysian polytechnic architectural students in producing a creative product. The website is set up based on Carroll's minimal theory, while mobile and CAD technology integration is based on Brown and…
Real-time field programmable gate array architecture for computer vision
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar
2001-01-01
This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The field programmable gate array (FPGA)-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory, and it is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented on dedicated very-large-scale-integrated devices to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.
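Editorial illustration: the operation being accelerated is a generic mask/image convolution. The plain NumPy reference below is not the pipelined FPGA design; the Sobel mask is only an example input.

    # Software reference for the generic mask/image convolution that the FPGA
    # architecture accelerates. Plain sequential NumPy code, not the pipelined
    # hardware design; the Sobel mask is only an example input.
    import numpy as np

    def convolve2d(image, mask):
        kh, kw = mask.shape
        out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
        # The FPGA design uses on-chip registers to minimize image-memory
        # accesses; here we simply index the array directly.
        for r in range(out.shape[0]):
            for c in range(out.shape[1]):
                out[r, c] = np.sum(image[r:r + kh, c:c + kw] * mask)
        return out

    image = np.arange(36, dtype=float).reshape(6, 6)
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    print(convolve2d(image, sobel_x))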
NASA Technical Reports Server (NTRS)
Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.
1989-01-01
In previous work (SDAG-86-01), we defined a general architecture model for autonomous systems that can easily be mapped to describe the functions of any automated system. In this note, we use the model to describe the problem of thermal management in space stations. First we briefly review the architecture, then we present the environment of our application, and finally we detail the specific function of each functional block of the architecture for that environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sturgeon, Matthew R.; Hu, Michael Z.
2017-07-01
This paper reviews the frontier field of “architectured membranes”, which contain anisotropically oriented porous nanostructures of inorganic materials. Three example types of architectured membranes are discussed, with some relevant results from our own research: (1) anodized thin-layer titania membranes on porous anodized aluminum oxide (AAO) substrates of different pore sizes, (2) porous glass membranes on an alumina substrate, and (3) guest-host membranes based on infiltration of yttrium-stabilized zirconia inside the pore channels of AAO matrices.
Production experience with the ATLAS Event Service
NASA Astrophysics Data System (ADS)
Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration
2017-10-01
The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources, such as spot market commercial clouds, supercomputers and volunteer computing. The Event Service architecture allows real time delivery of fine grained workloads to running payload applications which process dispatched events or event ranges and immediately stream the outputs to highly scalable Object Stores. Thanks to its agile and flexible architecture the AES is currently being used by grid sites for assigning low priority workloads to otherwise idle computing resources; similarly harvesting HPC resources in an efficient back-fill mode; and massively scaling out to the 50-100k concurrent core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC) and the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and the architecture of the Event Service, we will report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending ES application beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
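Editorial illustration: the fine-grained dispatch pattern described above, where event ranges are handed to workers and outputs are uploaded immediately, can be sketched as follows. All names are hypothetical; this is not the actual AES/PanDA implementation.

    # Sketch of fine-grained event-range dispatch with immediate output upload,
    # in the spirit of the Event Service. Names are hypothetical; this is not
    # the actual AES/PanDA code.
    from queue import Queue, Empty
    from threading import Thread

    def make_event_ranges(n_events, chunk):
        return [(start, min(start + chunk, n_events)) for start in range(0, n_events, chunk)]

    def worker(ranges, object_store):
        while True:
            try:
                start, stop = ranges.get_nowait()
            except Empty:
                return
            result = sum(range(start, stop))                 # stand-in for event processing
            object_store.append((start, stop, result))       # "stream" the output immediately
            ranges.task_done()

    ranges = Queue()
    for er in make_event_ranges(n_events=1000, chunk=50):
        ranges.put(er)

    object_store = []                                        # stand-in for a scalable object store
    threads = [Thread(target=worker, args=(ranges, object_store)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(len(object_store), "output objects uploaded")      # 20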
On-board processing architectures for satellite B-ISDN services
NASA Technical Reports Server (NTRS)
Inukai, Thomas; Shyy, Dong-Jye; Faris, Faris
1991-01-01
Onboard baseband processing architectures for future satellite broadband integrated services digital networks (B-ISDN's) are addressed. To assess the feasibility of implementing satellite B-ISDN services, critical design issues, such as B-ISDN traffic characteristics, transmission link design, and a trade-off between onboard circuit and fast packet switching, are analyzed. Examples of the two types of switching mechanisms and potential onboard network control functions are presented. A sample network architecture is also included to illustrate a potential onboard processing system.
On-board processing satellite network architectures for broadband ISDN
NASA Technical Reports Server (NTRS)
Inukai, Thomas; Faris, Faris; Shyy, Dong-Jye
1992-01-01
Onboard baseband processing architectures for future satellite broadband integrated services digital networks (B-ISDN's) are addressed. To assess the feasibility of implementing satellite B-ISDN services, critical design issues, such as B-ISDN traffic characteristics, transmission link design, and a trade-off between onboard circuit and fast packet switching, are analyzed. Examples of the two types of switching mechanisms and potential onboard network control functions are presented. A sample network architecture is also included to illustrate a potential onboard processing system.
The Evolution of Academic Library Architecture: A Summary.
ERIC Educational Resources Information Center
Toombs, Kenneth E.
1992-01-01
Reviews the history of architectural developments in academic libraries. Highlights include natural lighting and the invention of the incandescent bulb; compact shelving; open versus closed stacks; modular construction methods; central air conditioning and controlled environments; interior arrangements; access to handicapped users and staff; and…
77 FR 77015 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-31
... operation of a nationwide PSBN, based on a single, national network architecture. The Act also charges NTIA... infrastructure, equipment, and other architecture associated with the nationwide PSBN to satisfy the wireless...: Business or other for-profit organizations. Frequency: Annually and quarterly. Respondent's Obligation: OMB...
Organic Rankine cycle - review and research directions in engine applications
NASA Astrophysics Data System (ADS)
Panesar, Angad
2017-11-01
Waste heat to power conversion using Organic Rankine Cycles (ORC) is expected to play an important role in CO2 reductions from diesel engines. Firstly, a review of automotive ORCs is presented, focusing on pure working fluids, thermal architectures and expanders. The discussion includes, but is not limited to: R245fa, ethanol and water as fluids; series, parallel and cascade as architectures; dry saturated, superheated and supercritical as expansion conditions; and scroll, radial turbine and piston as expansion machines. Secondly, research directions in versatile expanders and holistic architectures (NOx + CO2) are proposed. Benefits of using the proposed unconventional approaches are quantified using Ricardo Wave and Aspen HYSYS for diesel engine and ORC modelling. Results indicate that the implementation of a versatile piston expander tolerant to two-phase expansion and using cyclopentane can potentially increase the highway drive cycle power by 8%. Furthermore, a holistic architecture offering complete utilisation of charge air and exhaust recirculation heat increased the performance noticeably, to 5% of engine power at the design point condition.
A run-time control architecture for the JPL telerobot
NASA Technical Reports Server (NTRS)
Balaram, J.; Lokshin, A.; Kreutz, K.; Beahan, J.
1987-01-01
An architecture for implementing the process-level decision making for a hierarchically structured telerobot currently being implemented at the Jet Propulsion Laboratory (JPL) is described. Constraints on the architecture design, architecture partitioning concepts, and a detailed description of the existing and proposed implementations are provided.
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited to evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify, and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of quantitative and semi-qualitative data, and of objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures in the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment for System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern-matching process in which capabilities are matched against requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
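Editorial illustration: the capability-to-requirement matching idea can be shown with a toy scoring function. Requirement names, weights and the scoring scheme below are hypothetical; the actual VASSAR methodology uses a rule-based expert system rather than this dictionary lookup.

    # Toy illustration of scoring candidate architectures by matching their
    # capabilities against weighted stakeholder requirements. Requirement names,
    # weights and the scoring scheme are hypothetical; VASSAR itself uses a
    # rule-based expert system, not this simple lookup.
    REQUIREMENTS = {          # requirement -> weight (1.0 = critical)
        "soil_moisture": 1.0,
        "ocean_color": 0.7,
        "cloud_top_height": 0.4,
    }

    def merit(capabilities, requirements):
        """Weighted fraction of stakeholder requirements satisfied (0..1)."""
        total = sum(requirements.values())
        met = sum(w for req, w in requirements.items() if req in capabilities)
        return met / total

    candidates = {
        "A": {"soil_moisture", "ocean_color"},
        "B": {"ocean_color", "cloud_top_height"},
    }
    for name, caps in candidates.items():
        print(name, round(merit(caps, REQUIREMENTS), 2))   # A: 0.81, B: 0.52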
Migrating EO/IR sensors to cloud-based infrastructure as service architectures
NASA Astrophysics Data System (ADS)
Berglie, Stephen T.; Webster, Steven; May, Christopher M.
2014-06-01
The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full-motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based IaaS architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications for higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.
A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises
NASA Astrophysics Data System (ADS)
Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.
2012-04-01
The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system, one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.
Digital Device Architecture and the Safe Use of Flash Devices in Munitions
NASA Technical Reports Server (NTRS)
Katz, Richard B.; Flowers, David; Bergevin, Keith
2017-01-01
Flash technology is being utilized in fuzed munition applications and, based on the development of digital logic devices in the commercial world, usage of flash technology will increase. Digital devices of interest to designers include flash-based microcontrollers and field programmable gate arrays (FPGAs). Almost a decade ago, a study was undertaken to determine if flash-based microcontrollers could be safely used in fuzes and, if so, how such devices should be applied. The results were documented in the Technical Manual for the Use of Logic Devices in Safety Features. This paper will first review the Technical Manual, discuss the rationale behind the suggested architectures for microcontrollers, and briefly review the concern about data retention in flash cells. An architectural feature of the microcontroller under study will be discussed, and its use will show how to screen for weak or failed cells during manufacture, storage, or immediately prior to use. As was done for microcontrollers a decade ago, architectures for a flash-based FPGA will be discussed, showing how it can be safely used in fuzes. Additionally, architectures for using non-volatile (including flash-based) storage will be discussed for SRAM-based FPGAs.
On the electrophysiology of aesthetic processing.
Jacobsen, Thomas
2013-01-01
One important method that can be applied for gaining an understanding of the underpinning of aesthetics in the brain is that of electrophysiology. Cognitive electrophysiology, in particular, allows the identification of components in a mental processing architecture. The present chapter reviews findings in the neurocognitive psychology of aesthetics, or neuroaesthetics, that have been obtained with the method of event-related brain potentials, as derived from the human electroencephalogram. The cognitive-perceptual bases as well as affective substages of aesthetic processing have been investigated and those are described here. The event-related potential method allows for the identification of mental processing modes in cognitive and aesthetic processing. It also provides an assessment of the mental chronometry of cognitive and affective stages in aesthetic appreciation. As the work described here shows, distinct processes in the brain are engaged in aesthetic judgments. © 2013 Elsevier B.V. All rights reserved.
Gestalten of today: early processing of visual contours and surfaces.
Kovács, I
1996-12-01
While much is known about the specialized, parallel processing streams of low-level vision that extract primary visual cues, there is only limited knowledge about the dynamic interactions between them. How are the fragments, caught by local analyzers, assembled together to provide us with a unified percept? How are local discontinuities in texture, motion or depth evaluated with respect to object boundaries and surface properties? These questions are presented within the framework of orientation-specific spatial interactions of early vision. Key observations of psychophysics, anatomy and neurophysiology on interactions of various spatial and temporal ranges are reviewed. Aspects of the functional architecture and possible neural substrates of local orientation-specific interactions are discussed, underlining their role in the integration of information across the visual field, and particularly in contour integration. Examples are provided demonstrating that global context, such as contour closure and figure-ground assignment, affects these local interactions. It is illustrated that figure-ground assignment is realized early in visual processing, and that the pattern of early interactions also brings about an effective and sparse coding of visual shape. Finally, it is concluded that the underlying functional architecture is not only dynamic and context dependent, but the pattern of connectivity depends as much on past experience as on actual stimulation.
Challenges of nickel silicidation in CMOS technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breil, Nicolas; Lavoie, Christian; Ozcan, Ahmet
2015-04-01
In our paper, we review some of the key challenges associated with the Ni silicidation process in the most recent CMOS technologies. The introduction of new materials (e.g. SiGe) and of non-planar architectures brings some important changes that require fundamental investigation from a material engineering perspective. Following a discussion of the device architecture and silicide evolution through the last CMOS generations, we focus our study on a very peculiar defect, termed NiSi-Fangs. We describe a mechanism for the defect formation, and present a detailed material analysis that supports this mechanism. We highlight some of the possible metal enrichment processes of the nickel monosilicide, such as oxidation or various RIE (Reactive Ion Etching) plasma processes, leading to a metal source available for defect formation. Furthermore, we investigate the NiSi formation and re-formation silicidation differences between Si and SiGe materials, and between (1 0 0) and (1 1 1) orientations. Finally, we show that the thermal budgets post silicidation can lead to the formation of NiSi-Fangs if the structure and the processes are not optimized. Beyond the understanding of the defect and the discussion on the engineering solutions used to prevent its formation, the interest of this investigation also lies in the fundamental learning within the Ni–Pt–Si–Ge system and some additional perspective on Ni-based contacts to advanced microelectronic devices.
Super and parallel computers and their impact on civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamat, M.P.
1986-01-01
This book presents the papers given at a conference on the use of supercomputers in civil engineering. Topics considered at the conference included solving nonlinear equations on a hypercube, a custom architectured parallel processing system, distributed data processing, algorithms, computer architecture, parallel processing, vector processing, computerized simulation, and cost benefit analysis.
A Computerized Architecture Slide Classification for a Small University Collection.
ERIC Educational Resources Information Center
Powell, Richard K.
This paper briefly outlines the process used to organize, classify, and make accessible a collection of architecture slides in the Architecture Resource Center at Andrews University in Michigan. The classification system includes the use of Art and Architecture Thesaurus subject headings, the ERIC (Educational Resources Information Center) concept…
The Perception of Human Resources Enterprise Architecture within the Department of Defense
ERIC Educational Resources Information Center
Delaquis, Richard Serge
2012-01-01
The Clinger Cohen Act of 1996 requires that all major Federal Government Information Technology (IT) systems prepare an Enterprise Architecture prior to IT acquisitions. Enterprise Architecture, like house blueprints, represents the system build, capabilities, processes, and data across the enterprise of IT systems. Enterprise Architecture is used…
Critical Review of NOAA's Observation Requirements Process
NASA Astrophysics Data System (ADS)
LaJoie, M.; Yapur, M.; Vo, T.; Templeton, A.; Bludis, D.
2017-12-01
NOAA's Observing Systems Council (NOSC) maintains a comprehensive database of user observation requirements. The requirements collection process engages NOAA subject matter experts to document and effectively communicate the specific environmental observation measurements (parameters and attributes) needed to produce operational products and pursue research objectives. Documenting user observation requirements in a structured and standardized manner and framework enables NOAA to assess its needs across organizational lines in an impartial, objective, and transparent manner. This structure provides the foundation for: selecting, designing, developing, and acquiring observing technologies, systems and architectures; budget and contract formulation and decision-making; and assessing in a repeatable fashion the productivity, efficiency and optimization of NOAA's observing system enterprise. User observation requirements are captured independently from observing technologies. Therefore, they can be addressed by a variety of current or expected observing capabilities and can flexibly be remapped to new and evolving technologies. NOAA's current inventory of user observation requirements was collected over a ten-year period, and there have been many changes in policies, mission priorities, and funding levels during this time. In light of these changes, the NOSC initiated a critical, in-depth review to examine all aspects of user observation requirements and associated processes during 2017. This presentation provides background on the NOAA requirements process, major milestones and outcomes of the critical review, and plans for evolving and connecting observing requirements processes in the next year.
Uncoupling File System Components for Bridging Legacy and Modern Storage Architectures
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Tilmes, C.; Prathapan, S.; Earp, D. N.; Ashkar, J. S.
2016-12-01
Long running Earth Science projects can span decades of architectural changes in both processing and storage environments. As storage architecture designs change over decades, such projects need to adjust their tools, systems, and expertise to properly integrate new technologies with their legacy systems. Traditional file systems lack the necessary support to accommodate such hybrid storage infrastructure, resulting in more complex tool development to encompass all possible storage architectures used for the project. The MODIS Adaptive Processing System (MODAPS) and the Level 1 and Atmospheres Archive and Distribution System (LAADS) is an example of a project spanning several decades which has evolved into a hybrid storage architecture. MODAPS/LAADS has developed the Lightweight Virtual File System (LVFS), which ensures a seamless integration of all the different storage architectures, including standard block-based POSIX-compliant storage disks, object-based architectures such as the S3-compliant HGST Active Archive System, and the Seagate Kinetic disks utilizing the Kinetic Protocol. With LVFS, all analysis and processing tools used for the project continue to function unmodified regardless of the underlying storage architecture, enabling MODAPS/LAADS to easily integrate any new storage architecture without the costly need to modify existing tools to utilize such new systems. Most file systems are designed as a single application responsible for using metadata to organize the data into a tree, for determining the location for data storage, and for providing a method of data retrieval. We will show how LVFS' unique approach of treating these components in a loosely coupled fashion enables it to merge different storage architectures into a single uniform storage system which bridges the underlying hybrid architecture.
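The loose coupling described above can be illustrated with a short Python sketch in which the metadata-driven namespace, the storage backend, and the retrieval path are separate, swappable components; the class and method names are hypothetical and do not reflect the actual LVFS implementation.

    # Sketch of decoupled file-system roles: the namespace (metadata tree),
    # the backend choice, and retrieval are separate components that can be
    # swapped per storage architecture. Names are illustrative only.
    class Namespace:
        """Organises files into a logical tree from metadata."""
        def __init__(self):
            self._tree = {}
        def register(self, logical_path, object_key):
            self._tree[logical_path] = object_key
        def resolve(self, logical_path):
            return self._tree[logical_path]

    class PosixBackend:
        """Block-based, POSIX-style retrieval."""
        def retrieve(self, key):
            with open(key, "rb") as f:
                return f.read()

    class ObjectStoreBackend:
        """Object-store retrieval; 'client' stands in for any S3-like client."""
        def __init__(self, client, bucket):
            self.client, self.bucket = client, bucket
        def retrieve(self, key):
            return self.client.get_object(self.bucket, key)

    class VirtualFS:
        """Uniform access layer: tools call read() regardless of the backend."""
        def __init__(self, namespace, backend):
            self.namespace, self.backend = namespace, backend
        def read(self, logical_path):
            return self.backend.retrieve(self.namespace.resolve(logical_path))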
Switching from Computer to Microcomputer Architecture Education
ERIC Educational Resources Information Center
Bolanakis, Dimosthenis E.; Kotsis, Konstantinos T.; Laopoulos, Theodore
2010-01-01
In the last decades, the technological and scientific evolution of the computing discipline has been widely affecting research in software engineering education, which nowadays advocates more enlightened and liberal ideas. This article reviews cross-disciplinary research on a computer architecture class in consideration of its switching to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-29
... evaluate applicants' familiarity with the national preparedness architecture and identify how elements of this architecture have been incorporated into regional/state/local planning, operations, and investments. Affected Public: State, Local or Tribal Government; Business or other for-profit. The affected...
Marshall Application Realignment System (MARS) Architecture
NASA Technical Reports Server (NTRS)
Belshe, Andrea; Sutton, Mandy
2010-01-01
The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. The MARS Architecture key stakeholders are most interested in Phase 3 because this is where the data analysis, scoring, and recommendation capability is realized. Stakeholders want to see the benefits derived from reducing the steady-state application base and identify opportunities for portfolio performance improvement and application realignment.
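As a rough illustration of the Phase 3 scoring and decision-aiding capability described above, the following Python sketch ranks applications against weighted value measures and flags low scorers as retirement candidates; the measures, weights, and threshold are invented placeholders, not the MARS scoring model.

    # Illustrative portfolio-scoring sketch (not the actual MARS implementation).
    # Each application is scored against weighted value measures; low scorers
    # are flagged as candidates for retirement or realignment.
    WEIGHTS = {"mission_alignment": 0.4, "usage": 0.3, "cost_efficiency": 0.3}

    def score(app):
        return sum(WEIGHTS[m] * app[m] for m in WEIGHTS)

    def recommend(portfolio, retire_below=0.4):
        ranked = sorted(portfolio, key=score)
        return [(a["name"], round(score(a), 2),
                 "retire" if score(a) < retire_below else "retain")
                for a in ranked]

    portfolio = [
        {"name": "LegacyTimecard", "mission_alignment": 0.2, "usage": 0.1, "cost_efficiency": 0.5},
        {"name": "EngDataVault",   "mission_alignment": 0.9, "usage": 0.8, "cost_efficiency": 0.6},
    ]
    print(recommend(portfolio))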
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kong Jing; Liu Wei, E-mail: jrliu@sdu.edu.cn; Wang Fenglong
Monodispersed Ni flower-like architectures with size of 1-2 μm were synthesized through a facile solvent-thermal process in 1,2-propanediol solution in the presence of polyethylene glycol (PEG) and sodium alkali for electromagnetic absorption application. The Ni architectures are composed of nanoflakes, which assemble to form a three-dimensional flower-like structure, and the thickness of the nanoflakes is about 10-40 nm. A possible formation mechanism for the Ni flower-like architectures was proposed and it was confirmed by the control experiments. The Ni architectures exhibited a saturation magnetization (M_s) of 47.7 emu/g and a large coercivity (H_cj) of 332.3 Oe. The epoxy resin composites with 20 vol% Ni sample provided good electromagnetic wave absorption performance (reflection loss < -20 dB) in the range of 2.8-6.3 GHz over absorber thicknesses of 2.6-5.0 mm. Graphical abstract: Monodispersed Ni flower-like architectures composed of nanoflakes were synthesized through a facile solvent-thermal process. The Ni architectures exhibited a large coercivity and enhanced electromagnetic wave absorption in the GHz range. Highlights: Flower-like architectures composed of nanoflakes. A possible formation mechanism for Ni flower-like architectures was proposed. Sodium alkali, PEG, and NaCl played important roles in the final morphology. Ni architectures exhibited a large coercivity (H_cj) of 332.3 Oe. Efficient electromagnetic absorption (RL < -20 dB) was provided in 2.8-6.3 GHz.
Architectural & engineering handbook
DOT National Transportation Integrated Search
2003-05-21
The Architectural and Engineering (A&E) Handbook provides an overview of the contracting process for A&E consultant services. Produced by the Division of Procurement and Contracts, this handbook provides guidance and a structured process for the plan...
Manyscale Computing for Sensor Processing in Support of Space Situational Awareness
NASA Astrophysics Data System (ADS)
Schmalz, M.; Chapman, W.; Hayden, E.; Sahni, S.; Ranka, S.
2014-09-01
Increasing image and signal data burden associated with sensor data processing in support of space situational awareness implies continuing computational throughput growth beyond the petascale regime. In addition to growing applications data burden and diversity, the breadth, diversity and scalability of high performance computing architectures and their various organizations challenge the development of a single, unifying, practicable model of parallel computation. Therefore, models for scalable parallel processing have exploited architectural and structural idiosyncrasies, yielding potential misapplications when legacy programs are ported among such architectures. In response to this challenge, we have developed a concise, efficient computational paradigm and software called Manyscale Computing to facilitate efficient mapping of annotated application codes to heterogeneous parallel architectures. Our theory, algorithms, software, and experimental results support partitioning and scheduling of application codes for envisioned parallel architectures, in terms of work atoms that are mapped (for example) to threads or thread blocks on computational hardware. Because of the rigor, completeness, conciseness, and layered design of our manyscale approach, application-to-architecture mapping is feasible and scalable for architectures at petascales, exascales, and above. Further, our methodology is simple, relying primarily on a small set of primitive mapping operations and support routines that are readily implemented on modern parallel processors such as graphics processing units (GPUs) and hybrid multi-processors (HMPs). In this paper, we overview the opportunities and challenges of manyscale computing for image and signal processing in support of space situational awareness applications. We discuss applications in terms of a layered hardware architecture (laboratory > supercomputer > rack > processor > component hierarchy). Demonstration applications include performance analysis and results in terms of execution time as well as storage, power, and energy consumption for bus-connected and/or networked architectures. The feasibility of the manyscale paradigm is demonstrated by addressing four principal challenges: (1) architectural/structural diversity, parallelism, and locality, (2) masking of I/O and memory latencies, (3) scalability of design as well as implementation, and (4) efficient representation/expression of parallel applications. Examples will demonstrate how manyscale computing helps solve these challenges efficiently on real-world computing systems.
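A minimal Python sketch of the work-atom idea is given below: an input array is partitioned into independent tiles that are mapped onto a worker pool (standing in for threads or thread blocks); the tile size and placeholder kernel are assumptions for illustration, not the authors' Manyscale software.

    # Partition an image-like workload into "work atoms" and map them onto a
    # worker pool. Purely illustrative; tile size and kernel are placeholders.
    from concurrent.futures import ThreadPoolExecutor

    def make_work_atoms(image, tile=256):
        """Partition a 2D array (list of rows) into independent tiles."""
        h, w = len(image), len(image[0])
        for r in range(0, h, tile):
            for c in range(0, w, tile):
                yield (r, c, [row[c:c + tile] for row in image[r:r + tile]])

    def process_atom(atom):
        r, c, block = atom
        # Placeholder kernel: e.g. a per-tile statistic used in detection.
        return (r, c, sum(sum(row) for row in block))

    def run(image, workers=8):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(process_atom, make_work_atoms(image)))

    image = [[1] * 1024 for _ in range(1024)]
    print(len(run(image)), "tiles processed")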
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
Utilization of Glycosaminoglycans/Proteoglycans as Carriers for Targeted Therapy Delivery
Misra, Suniti; Hascall, Vincent C.; Atanelishvili, Ilia; Moreno Rodriguez, Ricardo; Markwald, Roger R.; Ghatak, Shibnath
2015-01-01
The outcome of patients with cancer has improved significantly in the past decade with the incorporation of drugs targeting cell surface adhesive receptors, receptor tyrosine kinases, and modulation of several molecules of extracellular matrices (ECMs), the complex composite of collagens, glycoproteins, proteoglycans, and glycosaminoglycans that dictates tissue architecture. Cancer tissue invasive processes progress by various oncogenic strategies, including interfering with ECM molecules and their interactions with invasive cells. In this review, we describe how the ECM components, proteoglycans and glycosaminoglycans, influence tumor cell signaling. In particular this review describes how the glycosaminoglycan hyaluronan (HA) and its major receptor CD44 impact invasive behavior of tumor cells, and provides useful insight when designing new therapeutic strategies in the treatment of cancer. PMID:26448753
Adhikari, Kaustubh; Mendoza-Revilla, Javier; Chacón-Duque, Juan Camilo; Fuentes-Guajardo, Macarena; Ruiz-Linares, Andrés
2016-12-01
Latin Americans arguably represent the largest recently admixed populations in the world. This reflects a history of massive settlement by immigrants (mostly Europeans and Africans) and their variable admixture with Natives, starting in 1492. This process resulted in the population of Latin America showing an extensive genetic and phenotypic diversity. Here we review how genetic analyses are being applied to examine the demographic history of this population, including patterns of mating, population structure and ancestry. The admixture history of Latin America, and the resulting extensive diversity of the region, represents a natural experiment offering an advantageous setting for genetic association studies. We review how recent analyses in Latin Americans are contributing to elucidating the genetic architecture of human complex traits. Copyright © 2016 Elsevier Ltd. All rights reserved.
Review—Practical Challenges Hindering the Development of Solid State Li Ion Batteries
Kerman, Kian; Luntz, Alan; Viswanathan, Venkatasubramanian; ...
2017-06-09
Solid state electrolyte systems boasting Li+ conductivity of >10 mS cm^-1 at room temperature have opened the potential for developing a solid state battery with power and energy densities that are competitive with conventional liquid electrolyte systems. The primary focus of this review is twofold. First, differences in Li penetration resistance in solid state systems are discussed, and kinetic limitations of the solid state interface are highlighted. Second, technological challenges associated with processing such systems in relevant form factors are elucidated, and architectures needed for cell level devices in the context of product development are reviewed. Specific research vectors that provide high value to advancing solid state batteries are outlined and discussed.
Rapid ordering of block copolymer thin films
NASA Astrophysics Data System (ADS)
Majewski, Pawel W.; Yager, Kevin G.
2016-10-01
Block-copolymers self-assemble into diverse morphologies, where nanoscale order can be finely tuned via block architecture and processing conditions. However, the ultimate usage of these materials in real-world applications may be hampered by the extremely long thermal annealing times—hours or days—required to achieve good order. Here, we provide an overview of the fundamentals of block-copolymer self-assembly kinetics, and review the techniques that have been demonstrated to influence, and enhance, these ordering kinetics. We discuss the inherent tradeoffs between oven annealing, solvent annealing, microwave annealing, zone annealing, and other directed self-assembly methods; including an assessment of spatial and temporal characteristics. We also review both real-space and reciprocal-space analysis techniques for quantifying order in these systems.
Functional Interface Considerations within an Exploration Life Support System Architecture
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad
2016-01-01
As notional life support system (LSS) architectures are developed and evaluated, myriad options must be considered pertaining to process technologies, components, and equipment assemblies. Each option must be evaluated relative to its impact on key functional interfaces within the LSS architecture. A leading notional architecture has been developed to guide the path toward realizing future crewed space exploration goals. This architecture includes atmosphere revitalization, water recovery and management, and environmental monitoring subsystems. Guiding requirements for developing this architecture are summarized and important interfaces within the architecture are discussed. The role of environmental monitoring within the architecture is described.
NASA Astrophysics Data System (ADS)
Gradziński, Piotr
2017-10-01
As the world's climate changes (in part under the influence of architectural activity), the author attempts to reorient design practice toward using and adapting to climatic conditions. Applying Life Cycle Analysis (LCA) and digital analytical tools such as BIM (Building Information Modelling) in the early stages of the architectural design process defines the overriding requirements the designer/architect should meet. The first part of the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) as direct negative environmental impacts. The second part reviews methods and analytical techniques for preventing these negative influences: first, studying the building through Life Cycle Analysis of its structure (e.g. materials) and functioning (e.g. energy consumption) across the stages before use, during use, and after use; second, using digital analytical tools to run multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building's form. In conclusion, the author's results highlight that designing buildings with these elements (LCA, BIM) allows early design decisions shaping the architectural form to be corrected, minimizing the impact on nature and the environment. The work addresses the architectural-environmental dimension, orienting the building design process toward broadly understood climatic change.
Zhang, Wei; Tang, Xiaoxiang; He, Xianyou; Lai, Shuxian
2018-01-01
Substantial evidence suggests that beauty is associated with the survival and reproduction of organisms. Landscape architecture is composed of a series of natural elements that have significant evolutionary implications. The present study used one pilot material ratings and three experiments to examine the mechanisms of aesthetic appraisals of landscape architecture. The results confirmed that landscape architecture elicited a sense of beauty and captured visual attention more easily than other types of architecture during explicit aesthetic rating task (Experiment 1) and implicit aesthetic perception task (dot-probe paradigm, Experiment 2). Furthermore, the spatial cueing paradigm revealed that response latencies were significantly faster for landscape architecture than non-landscape architecture on valid trials, but there was no significant difference in this contrast on invalid trials at 150-ms stimulus onset asynchrony (SOA, Experiment 3a). At 500-ms SOA (Experiment 3b), participants responded significantly faster for landscape architecture on valid trials, but reacted significantly slower for landscape architecture on invalid trials. The findings indicated that the beauty of landscape architecture can be perceived implicitly, and only faster orienting of attention, but not delayed disengagement of attention was generated at early stages of the processing of landscape architecture. However, the attentional bias at later stages of attentional processes may be resulted from both faster orienting of attention and delayed disengagement of attention from landscape architecture photographs. PMID:29467696
Space Generic Open Avionics Architecture (SGOAA) reference model technical guide
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This report presents a full description of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing architecture, and a six class model of interfaces in a hardware/software system. The purpose of the SGOAA is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of specific avionics hardware/software systems. The SGOAA defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Finding Services for an Open Architecture: A Review of Existing Applications and Programs in PEO C4I
2011-01-01
Two key SOA success factors listed were as follows: 1. Shared Services Strategy: existence of a strategy to identify overlapping business ... and eliminating redundancies and overlaps through use of shared services. 2. Funding Model: existence of an IT funding model aligned with and supportive of a shared services strategy. (Sun Microsystems, 2004)
Lander Propulsion Overview and Technology Requirements Discussion
NASA Technical Reports Server (NTRS)
Brown, Thomas M.
2007-01-01
This viewgraph presentation reviews the lunar lander propulsion requirements. It includes discussion on: Lander Project Overview, Project Evolution/Design Cycles, Lunar Architecture & Lander Reference Missions, Lander Concept Configurations, Descent and Ascent propulsion reviews, and a review of the technology requirements.
Space Generic Open Avionics Architecture (SGOAA): Overview
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1992-01-01
A space generic open avionics architecture created for NASA is described. It will serve as the basis for entities in spacecraft core avionics, capable of being tailored by NASA for future space program avionics ranging from small vehicles such as Moon ascent/descent vehicles to large ones such as Mars transfer vehicles or orbiting stations. The standard consists of: (1) a system architecture; (2) a generic processing hardware architecture; (3) a six class architecture interface model; (4) a system services functional subsystem architectural model; and (5) an operations control functional subsystem architectural model.
Language, music, syntax and the brain.
Patel, Aniruddh D
2003-07-01
The comparative study of music and language is drawing an increasing amount of research interest. Like language, music is a human universal involving perceptually discrete elements organized into hierarchically structured sequences. Music and language can thus serve as foils for each other in the study of brain mechanisms underlying complex sound processing, and comparative research can provide novel insights into the functional and neural architecture of both domains. This review focuses on syntax, using recent neuroimaging data and cognitive theory to propose a specific point of convergence between syntactic processing in language and music. This leads to testable predictions, including the prediction that syntactic comprehension problems in Broca's aphasia are not selective to language but influence music perception as well.
Neural Cross-Frequency Coupling: Connecting Architectures, Mechanisms, and Functions.
Hyafil, Alexandre; Giraud, Anne-Lise; Fontolan, Lorenzo; Gutkin, Boris
2015-11-01
Neural oscillations are ubiquitously observed in the mammalian brain, but it has proven difficult to tie oscillatory patterns to specific cognitive operations. Notably, the coupling between neural oscillations at different timescales has recently received much attention, both from experimentalists and theoreticians. We review the mechanisms underlying various forms of this cross-frequency coupling. We show that different types of neural oscillators and cross-frequency interactions yield distinct signatures in neural dynamics. Finally, we associate these mechanisms with several putative functions of cross-frequency coupling, including neural representations of multiple environmental items, communication over distant areas, internal clocking of neural processes, and modulation of neural processing based on temporal predictions. Copyright © 2015 Elsevier Ltd. All rights reserved.
ARCHITECTURAL FLOOR PLAN OF PROCESS AND ACCESS AREAS HOT PILOT ...
ARCHITECTURAL FLOOR PLAN OF PROCESS AND ACCESS AREAS HOT PILOT PLANT (CPP-640). INL DRAWING NUMBER 200-0640-00-279-111679. ALTERNATE ID NUMBER 8952-CPP-640-A-2. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
One approach to architectural acoustics in education
NASA Astrophysics Data System (ADS)
Jaffe, J. Christopher
2003-04-01
In the fall of 1997, Dean Alan Balfour of the School of Architecture at the Rensselaer Polytechnic Institute asked me to introduce an undergraduate 14 credit certificate course entitled "Sonics in Architecture." Subsequently, the program was expanded to include a Master's Degree in Building Science. This paper discusses the trials and tribulations of building a scientific program in a liberal arts school. In addition, the problem of acquiring the research funds needed to provide tuition assistance for graduate students in Architectural Acoustics is reviewed. Information on the curriculum developed for both the lecture and laboratory courses is provided. I will also share my concerns regarding the teaching methods currently prevalent in many schools of architecture today, and how building science professionals might assist in addressing these issues.
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.
2013-09-01
Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation to handle different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing in a Service Oriented Architecture (SOA) platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fires and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are represented using web-based processing of remote sensing imagery utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client with open-source software was developed to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
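The service chaining described above can be sketched schematically in Python as three composed steps for acquisition, fire detection, and notification; the function names, the brightness-temperature threshold, and the sample data are hypothetical placeholders rather than the actual WPS process interfaces.

    # Schematic of the chained workflow: acquisition -> fire detection -> notification.
    # All names, thresholds, and values are illustrative placeholders.
    def acquire_modis_scene(region):
        # Placeholder for data acquisition; returns band-21 brightness
        # temperatures (K) for the requested region.
        return {"region": region, "band21_K": [295.2, 312.7, 341.5, 298.0, 336.9]}

    def detect_fire(scene, threshold_kelvin=330.0):
        # Stand-in for the WPS process that extracts fire pixels from MODIS data.
        hotspots = [t for t in scene["band21_K"] if t >= threshold_kelvin]
        return {"region": scene["region"], "hotspots": hotspots}

    def notify_authorities(result):
        if result["hotspots"]:
            print(f"ALERT {result['region']}: {len(result['hotspots'])} hotspot pixels")

    def fire_workflow(region):
        # The chaining itself: each service's output feeds the next one.
        notify_authorities(detect_fire(acquire_modis_scene(region)))

    fire_workflow("42.1N, 45.3E")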
Non-terrestrial resources of economic importance to earth
NASA Technical Reports Server (NTRS)
Lewis, John S.
1991-01-01
The status of research on the importation of energy and nonterrestrial materials is reviewed, and certain specific directions for new research are proposed. New technologies which are to be developed include aerobraking, in situ propellant production, mining and beneficiation of extraterrestrial minerals, nuclear power systems, electromagnetic launch, and solar thermal propulsion. Topics discussed include the system architecture for solar power satellite constellations, the return of nonterrestrial He-3 to earth for use as a clean fusion fuel, and the return to earth of platinum-group metal byproducts from processing of nonterrestrial native ferrous metals.
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
1987-10-01
Work Accomplished: OPTICAL INTERCONNECTIONS - the powerful interconnect abilities of optical beams have led to much optimism about the possible roles for optics in solving interconnect problems at various levels of computer architecture. The power requirements of optical interconnects at the gate-to-gate and chip-to-chip levels were examined. OPTICAL NEURAL NETWORKS - basic studies of the convergence properties of the Hopfield model, based on a mathematical approach (graph theory). OPTICS AND ARTIFICIAL INTELLIGENCE - reviewed the field of optical processing and artificial intelligence, with the aim of finding areas that might be particularly attractive for future investigation(s).
The upgrade of the ATLAS first-level calorimeter trigger
NASA Astrophysics Data System (ADS)
Yamamoto, Shimpei; Atlas Collaboration
2016-07-01
The first-level calorimeter trigger (L1Calo) had operated successfully through the first data taking phase of the ATLAS experiment at the CERN Large Hadron Collider. Towards forthcoming LHC runs, a series of upgrades is planned for L1Calo to face new challenges posed by the upcoming increases of the beam energy and the luminosity. This paper reviews the ATLAS L1Calo trigger upgrade project that introduces new architectures for the liquid-argon calorimeter trigger readout and the L1Calo trigger processing system.
Artful Writing: Well-Crafted Words Complement Well-Drafted Images
ERIC Educational Resources Information Center
Weinstein, Norman
2008-01-01
Speaking plainly, says the writer: too many architecture students can't write. After hearing graduate architecture students defend their designs at a midterm studio review, the writer observed that, under questioning, several students became inarticulate and left participles or sentences dangling. While this may be understandable, the writer also…
Design Patterns for American Schools: Responding to the Reform Movement.
ERIC Educational Resources Information Center
Moore, Gary T.; Lackney, Jeffery A.
This paper explores the often elusive yet very important relationship between architectural design and educational reform. A review of the major findings from the educational and architectural research literatures on the impacts of school design on educational program effectiveness is presented. Commonalities among the disciplines were identified…
Development of a Conceptual Structure for Architectural Solar Energy Systems.
ERIC Educational Resources Information Center
Ringel, Robert F.
Solar subsystems and components were identified and conceptual structure was developed for architectural solar energy heating and cooling systems. Recent literature related to solar energy systems was reviewed and analyzed. Solar heating and cooling system, subsystem, and component data were compared for agreement and completeness. Significant…
The Emerging Chinese Institutional Architecture in Higher Education
ERIC Educational Resources Information Center
Lo, William Yat Wai
2014-01-01
The article reviews the global landscape of higher education with the anticipation of an emerging Chinese institutional architecture in Asia-Pacific higher education. It starts with a theoretical framework for analyzing the functionalities of values and institutions in international higher education by adopting Joseph Nye's concept of soft power.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... applicants' familiarity with national preparedness architecture and identify how elements of this architecture have been incorporated into regional/ State/local planning, operations, and investments. The TSP.... Affected Public: Business or other for-profit. Estimated Number of Respondents: 25. Frequency of Response...
Advanced resin systems and 3D textile preforms for low cost composite structures
NASA Technical Reports Server (NTRS)
Shukla, J. G.; Bayha, T. D.
1993-01-01
Advanced resin systems and 3D textile preforms are being evaluated at Lockheed Aeronautical Systems Company (LASC) under NASA's Advanced Composites Technology (ACT) Program. This work is aimed towards the development of low-cost, damage-tolerant composite fuselage structures. Resin systems for resin transfer molding and powder epoxy towpreg materials are being evaluated for processability, performance and cost. Three developmental epoxy resin systems for resin transfer molding (RTM) and three resin systems for powder towpregging are being investigated. Various 3D textile preform architectures using advanced weaving and braiding processes are also being evaluated. Trials are being conducted with powdered towpreg, in 2D weaving and 3D braiding processes for their textile processability and their potential for fabrication in 'net shape' fuselage structures. The progress in advanced resin screening and textile preform development is reviewed here.
An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency
NASA Astrophysics Data System (ADS)
Phillips, Dewanne Marie
Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in the life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, system engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation will identify knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrates that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time".
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
... Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the ... This thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of ...
Combining Architecture-Centric Engineering with the Team Software Process
2010-12-01
Colleagues from Quarksoft and CIMAT have recently reported on their experiences in "Introducing Software Architecture Development Methods into a TSP..." Postmortem: lessons, new goals, new requirements, new risks, etc.; business and technical goals; estimates, plans, process, commitment; work products ... architecture to mitigate the risks uncovered by the ATAM. At the end of the iteration, version 1.0 of the architecture is available. Implement a second ...
Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases
2012-03-01
... a well validated and consistent process for evaluating power system architectures and technologies and, therefore, can become a valuable tool for the implementation of the described plan for Navy ... A process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new ...
A processing architecture for associative short-term memory in electronic noses
NASA Astrophysics Data System (ADS)
Pioggia, G.; Ferro, M.; Di Francesco, F.; DeRossi, D.
2006-11-01
Electronic nose (e-nose) architectures usually consist of several modules that process various tasks such as control, data acquisition, data filtering, feature selection and pattern analysis. Heterogeneous techniques derived from chemometrics, neural networks, and fuzzy rules used to implement such tasks may lead to issues concerning module interconnection and cooperation. Moreover, a new learning phase is mandatory once new measurements have been added to the dataset, thus causing changes in the previously derived model. Consequently, if a loss in the previous learning occurs (catastrophic interference), real-time applications of e-noses are limited. To overcome these problems this paper presents an architecture for dynamic and efficient management of multi-transducer data processing techniques and for saving an associative short-term memory of the previously learned model. The architecture implements an artificial model of a hippocampus-based working memory, enabling the system to be ready for real-time applications. Starting from the base models available in the architecture core, dedicated models for neurons, maps and connections were tailored to an artificial olfactory system devoted to analysing olive oil. In order to verify the ability of the processing architecture in associative and short-term memory, a paired-associate learning test was applied. The avoidance of catastrophic interference was observed.
NASA Technical Reports Server (NTRS)
Hayden, Jeffrey L.; Jeffries, Alan
2012-01-01
The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing and data processing services for a varied fleet of satellites to support weather prediction, modeling and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool, are described.
Enterprise application architecture development based on DoDAF and TOGAF
NASA Astrophysics Data System (ADS)
Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng
2017-05-01
For the purpose of supporting the design and analysis of enterprise application architecture, here we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.
Efficient fuzzy C-means architecture for image segmentation.
Li, Hui-Ya; Hwang, Wen-Jyi; Chang, Chia-Yen
2011-01-01
This paper presents a novel VLSI architecture for image segmentation. The architecture is based on the fuzzy c-means algorithm with a spatial constraint for reducing the misclassification rate. In the architecture, the usual iterative operations for updating the membership matrix and cluster centroids are merged into one single updating process to evade the large storage requirement. In addition, an efficient pipelined circuit is used for the updating process to accelerate the computational speed. Experimental results show that the proposed circuit is an effective alternative for real-time image segmentation with low area cost and low misclassification rate.
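For reference, a plain software version of the two fuzzy c-means updates that the circuit merges into a single pass is sketched below in Python/NumPy; the spatial constraint and the merged hardware update are not reproduced, so this is only the textbook form of the algorithm, with invented sample data.

    # Textbook fuzzy c-means updates (software reference only; the paper's
    # spatial constraint and single merged hardware pass are not reproduced).
    import numpy as np

    def fcm_step(X, centroids, m=2.0, eps=1e-9):
        # X: (n_samples, n_features); centroids: (n_clusters, n_features)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + eps
        # Membership update: u[i, j] = 1 / sum_k (d[i, j] / d[i, k])**(2 / (m - 1))
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        # Centroid update: c[j] = sum_i u[i, j]**m * x[i] / sum_i u[i, j]**m
        um = u ** m
        centroids = (um.T @ X) / um.sum(axis=0)[:, None]
        return u, centroids

    # Usage: a few iterations on random 2-D data with 3 clusters.
    rng = np.random.default_rng(0)
    X = rng.random((500, 2))
    c = X[rng.choice(len(X), 3, replace=False)]
    for _ in range(20):
        u, c = fcm_step(X, c)
    labels = u.argmax(axis=1)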
Information Architecture without Internal Theory: An Inductive Design Process.
ERIC Educational Resources Information Center
Haverty, Marsha
2002-01-01
Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…
NASA Technical Reports Server (NTRS)
Stovall, John R.; Wray, Richard B.
1994-01-01
This paper presents a description of a model for a space vehicle operational scenario and the commands for avionics. This model will be used in developing a dynamic architecture simulation model using the Statemate CASE tool for validation of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA has been proposed as an avionics architecture standard to NASA through its Strategic Avionics Technology Working Group (SATWG) and has been accepted by the Society of Automotive Engineers (SAE) for conversion into an SAE Avionics Standard. This architecture was developed for the Flight Data Systems Division (FDSD) of the NASA Johnson Space Center (JSC) by the Lockheed Engineering and Sciences Company (LESC), Houston, Texas. This SGOAA includes a generic system architecture for the entities in spacecraft avionics, a generic processing external and internal hardware architecture, and a nine class model of interfaces. The SGOAA is both scalable and recursive and can be applied to any hierarchical level of hardware/software processing systems.
Stress, Cognition, and Human Performance: A Literature Review and Conceptual Framework
NASA Technical Reports Server (NTRS)
Staal, Mark A.
2004-01-01
The following literature review addresses the effects of various stressors on cognition. While attempting to be as inclusive as possible, the review focuses its examination on the relationships between cognitive appraisal, attention, memory, and stress as they relate to information processing and human performance. The review begins with an overview of constructs and theoretical perspectives followed by an examination of effects across attention, memory, perceptual-motor functions, judgment and decision making, putative stressors such as workload, thermals, noise, and fatigue, and closes with a discussion of moderating variables and related topics. In summation of the review, a conceptual framework for cognitive process under stress has been assembled. As one might imagine, the research literature that addresses stress, theories governing its effects on human performance, and experimental evidence that supports these notions is large and diverse. In attempting to organize and synthesize this body of work, I was guided by several earlier efforts (Bourne & Yaroush, 2003; Driskell, Mullen, Johnson, Hughes, & Batchelor, 1992; Driskell & Salas, 1996; Hancock & Desmond, 2001; Stokes & Kite, 1994). These authors should be credited with accomplishing the monumental task of providing focused reviews in this area and their collective efforts laid the foundation for this present review. Similarly, the format of this review has been designed in accordance with these previous exemplars. However, each of these previous efforts either simply reported general findings, without sufficient experimental illustration, or narrowed their scope of investigation to the extent that the breadth of such findings remained hidden from the reader. Moreover, none of these examinations yielded an architecture that adequately describes or explains the inter-relations between information processing elements under stress conditions.
Architectures, representations and processes of language production
Alario, F.-Xavier; Costa, Albert; Ferreira, Victor S.; Pickering, Martin J.
2007-01-01
We present an overview of recent research conducted in the field of language production based on papers presented at the first edition of the International Workshop on Language Production (Marseille, France, September 2004). This article comprises two main parts. In the first part, consisting of three sections, we review the articles that are included in this Special Issue. These three sections deal with three different topics of general interest for models of language production: (A) the general organisational principles of the language production system, (B) several aspects of the lexical selection process and (C) the representations and processes used during syntactic encoding. In the second part, we discuss future directions for research in the field of language production, given the considerable developments that have occurred in recent years. PMID:17710209
Tissue Engineering the Cornea: The Evolution of RAFT
Levis, Hannah J.; Kureshi, Alvena K.; Massie, Isobel; Morgan, Louise; Vernon, Amanda J.; Daniels, Julie T.
2015-01-01
Corneal blindness affects over 10 million people worldwide and current treatment strategies often involve replacement of the defective layer with healthy tissue. Due to a worldwide donor cornea shortage and the absence of suitable biological scaffolds, recent research has focused on the development of tissue engineering techniques to create alternative therapies. This review will detail how we have refined the simple engineering technique of plastic compression of collagen to a process we now call Real Architecture for 3D Tissues (RAFT). The RAFT production process has been standardised, and steps have been taken to consider Good Manufacturing Practice compliance. The evolution of this process has allowed us to create biomimetic epithelial and endothelial tissue equivalents suitable for transplantation and ideal for studying cell-cell interactions in vitro. PMID:25809689
Grammatical Constructions as Relational Categories.
Goldwater, Micah B
2017-07-01
This paper argues that grammatical constructions, specifically argument structure constructions that determine the "who did what to whom" part of sentence meaning and how this meaning is expressed syntactically, can be considered a kind of relational category. That is, grammatical constructions are represented as the abstraction of the syntactic and semantic relations of the exemplar utterances that are expressed in that construction, and it enables the generation of novel exemplars. To support this argument, I review evidence that there are parallel behavioral patterns between how children learn relational categories generally and how they learn grammatical constructions specifically. Then, I discuss computational simulations of how grammatical constructions are abstracted from exemplar sentences using a domain-general relational cognitive architecture. Last, I review evidence from adult language processing that shows parallel behavioral patterns with expert behavior from other cognitive domains. After reviewing the evidence, I consider how to integrate this account with other theories of language development. Copyright © 2017 Cognitive Science Society, Inc.
Physiology of ageing of the musculoskeletal system.
Boros, Katalin; Freemont, Tony
2017-04-01
This review aims to provide a summary of current concepts of ageing in relation to the musculoskeletal system, highlighting recent advances in the understanding of the mechanisms involved in the development of age-related changes in bone, skeletal muscle, chondroid and fibrous tissues. The key components of the musculoskeletal system and their functions are introduced together with a general overview of the molecular hallmarks of ageing. A brief description of the normal architecture of each of these tissue types is followed by a summary of established and developing concepts of mechanisms contributing to the age-related alterations in each. Extensive detailed description of these changes is beyond the scope of this review; instead, we aim to highlight some of the most significant processes and, where possible, the molecular changes underlying these and refer the reader to in-depth, subspecialist reviews of the individual components for further details. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
The Rapid Response Radiation Survey (R3S) Mission Using the HiSat Conformal Satellite Architecture
NASA Technical Reports Server (NTRS)
Miller, Nathanael A.; Norman, Ryan B.; Soto, Hector L.; Stewart, Victor A.; Jones, Mark L.; Kowalski, Matthew C.; Ben Shabat, Adam; Gough, Kerry M.; Stavely, Rebecca L.; Shim, Alex C.;
2015-01-01
The Rapid Response Radiation Survey (R3S) experiment, designed as a quick turnaround mission to make radiation measurements in Low Earth Orbit (LEO), will fly as a hosted payload in partnership with NovaWurks using their Hyper-integrated Satlet (HISat) architecture. The need for the mission arises as the Nowcast of Atmospheric Ionization Radiation for Aviation Safety (NAIRAS) model moves from a research effort into an operational radiation assessment tool. Currently, airline professionals are the second largest demographic of radiation workers, and to date their radiation exposure is undocumented in the USA. The NAIRAS model seeks to fill this information gap. The data collected by R3S, in addition to the complementary data from a NASA Langley Research Center (LaRC) atmospheric balloon mission entitled Radiation Dosimetry Experiment (RaD-X), will validate the exposure prediction capabilities of NAIRAS. The R3S mission collects total dose and radiation spectrum measurements using a Teledyne µDosimeter and a Liulin-6SA2 LED spectrometer. These two radiation sensors provide a cross-correlated radiometric measurement in combination with the Honeywell HMR2300 Smart Digital Magnetometer. The magnetometer assesses the Earth's magnetic field in the LEO environment and allows radiation dose to be mapped as a function of the Earth's magnetic shielding. R3S is also unique in that the radiation sensors will be exposed on the outer surface of the spacecraft, possibly making these the first measurements of the LEO radiation environment with bare sensors. Viability of R3S as an extremely fast turnaround mission is due, in part, to the nature of the robust, well-defined interfaces of the conformal satellite HiSat architecture. The HiSat architecture, which was developed with the support of the Defense Advanced Research Projects Agency's (DARPA's) Phoenix Program, enabled the R3S system to advance from first concept to delivery of preliminary design review (PDR) level documents in 29 calendar days. The architecture allows interface complexities between the specific devices and the satellite bus to be resolved in a standardized interface control document (ICD). The ICD provided a ready-made framework to interface to the modular satellite bus. This modularity allowed approximately 90% of the R3S system to be designed and fabricated in two months without constraint of the hosting satellite's development cycle. This paper discusses the development of the R3S experiment as made possible by use of the HiSat architecture. The system design and operational modes of the experiment are described, as well as the experiment interfaces to the HiSat satellite via the user defined adapter (UDA) provided by NovaWurks. This paper outlines the steps taken by the project to execute the R3S mission in 4 months of design, build, and test. Additionally portrayed is the groundwork done at LaRC to posture the organization for a fast response and the process by which the opportunity was identified as aligning with key strategic goals. Finally, a description of the engineering process is provided, including the use of facilitated rapid/concurrent engineering sessions, the associated documentation, and the review process employed.
A New FPGA Architecture of FAST and BRIEF Algorithm for On-Board Corner Detection and Matching.
Huang, Jingjin; Zhou, Guoqing; Zhou, Xiang; Zhang, Rongting
2018-03-28
Although some researchers have proposed Field Programmable Gate Array (FPGA) architectures for the Features from Accelerated Segment Test (FAST) and Binary Robust Independent Elementary Features (BRIEF) algorithms, these traditional architectures do not consider image data storage, so no image data can be reused by follow-up algorithms. This paper proposes a new FPGA architecture that considers the reuse of sub-image data. In the proposed architecture, a remainder-based method is first designed for reading the sub-image, and a FAST detector and a BRIEF descriptor are combined for corner detection and matching. Six pairs of satellite images with different textures, located in the Mentougou district, Beijing, China, are used to evaluate the performance of the proposed architecture. The Modelsim simulation results show that: (i) the proposed architecture is effective for reading sub-images from DDR3 at minimum cost; and (ii) the FPGA implementation is correct and efficient for corner detection and matching; for example, the average matching rates for natural and artificial areas are approximately 67% and 83%, respectively, which are close to the PC results, and the FPGA processing speed is approximately 31 and 2.5 times faster than PC processing and GPU processing, respectively.
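For readers unfamiliar with the FAST-plus-BRIEF pipeline that the above architecture accelerates in hardware, the following minimal Python sketch runs the equivalent detect-describe-match steps on a CPU with OpenCV. It is only a software reference for the algorithmic steps, not the FPGA design; the image paths, detector threshold, and the use of the opencv-contrib BRIEF extractor are assumptions made for illustration.

```python
# CPU reference of the FAST + BRIEF detect-and-match pipeline (not the FPGA design).
# Requires opencv-contrib-python for the BRIEF extractor; image paths are placeholders.
import cv2

img1 = cv2.imread("tile_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("tile_b.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create(threshold=20, nonmaxSuppression=True)
brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()

kp1 = fast.detect(img1, None)
kp2 = fast.detect(img2, None)
kp1, des1 = brief.compute(img1, kp1)
kp2, des2 = brief.compute(img2, kp2)

# BRIEF descriptors are binary strings, so match with Hamming distance and cross-check.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} putative correspondences")
```

Hamming-distance matching with cross-checking is the natural pairing for binary descriptors such as BRIEF, which is presumably why the paper combines the two stages in a single hardware pipeline.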
NASA Technical Reports Server (NTRS)
Bonanne, Kevin H.
2011-01-01
Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.
NASA Astrophysics Data System (ADS)
Moran, Steven E.; Austin, William L.; Murray, James T.; Roddier, Nicolas A.; Bridges, Robert; Vercillo, Richard; Stettner, Roger; Phillips, Dave; Bisbee, Al; Witherspoon, Ned H.
2003-09-01
Under the Office of Naval Research's Organic Mine Countermeasures Future Naval Capabilities (OMCM FNC) program, Lite Cycles, Inc. is developing an innovative and highly compact airborne active sensor system for mine and obstacle detection in very shallow water (VSW), through the surf zone (SZ) and onto the beach. The system uses an innovative LCI-proprietary integrated scanner, detector, and telescope (ISDT) receiver architecture. The ISDT tightly couples all receiver components and LIDAR electronics to achieve the system compaction required for tactical UAV integration while providing a large aperture. It also includes an advanced compact multifunction laser transmitter; an industry-first high-resolution, compact 3-D camera; a scanning function for wide-area search; and temporally displaced multiple looks on the fly over the ocean surface for clutter reduction. Additionally, the laser will provide time-multiplexed multi-color output to perform day/night multispectral imaging for beach surveillance. New processing algorithms for mine detection in the very challenging surf-zone clutter environment are under development, which offer the potential for significant processing gains in comparison to the legacy approaches. This paper reviews the legacy system approaches, describes the mission challenges, and provides an overview of the ROAR system architecture.
Smith, Owen K.; Aladjem, Mirit I.
2014-01-01
The DNA replication program is, in part, determined by the epigenetic landscape that governs local chromosome architecture and directs chromosome duplication. Replication must coordinate with other biochemical processes occurring concomitantly on chromatin, such as transcription and remodeling, to ensure accurate duplication of both genetic and epigenetic features and to preserve genomic stability. The importance of genome architecture and chromatin looping in coordinating cellular processes on chromatin is illustrated by two recent sets of discoveries. First, chromatin-associated proteins that are not part of the core replication machinery were shown to affect the timing of DNA replication. These chromatin-associated proteins could be working in concert, or perhaps in competition, with the transcriptional machinery and with chromatin modifiers to determine the spatial and temporal organization of replication initiation events. Second, epigenetic interactions are mediated by DNA sequences that determine chromosomal replication. In this review we summarize recent findings and current models linking spatial and temporal regulation of the replication program with epigenetic signaling. We discuss these issues in the context of the genome's three-dimensional structure with an emphasis on events occurring during the initiation of DNA replication. PMID:24905010
Switching from computer to microcomputer architecture education
NASA Astrophysics Data System (ADS)
Bolanakis, Dimosthenis E.; Kotsis, Konstantinos T.; Laopoulos, Theodore
2010-03-01
In recent decades, the technological and scientific evolution of the computing discipline has widely affected research in software engineering education, which nowadays advocates more enlightened and liberal ideas. This article reviews cross-disciplinary research on a computer architecture class in consideration of its switch to microcomputer architecture. The authors present their strategies towards a successful crossing of boundaries between engineering disciplines. This communication aims to provide a different perspective on professional courses that are nowadays addressed at the expense of traditional courses.
NASA Technical Reports Server (NTRS)
Shyy, Dong-Jye; Redman, Wayne
1993-01-01
For the next-generation packet switched communications satellite system with onboard processing and spot-beam operation, a reliable onboard fast packet switch is essential to route packets from different uplink beams to different downlink beams. The rapid emergence of point-to-point services such as video distribution, and the large demand for video conferencing, distributed data processing, and network management, make the multicast function essential to a fast packet switch (FPS). The satellite's inherent broadcast features give the satellite network an advantage over the terrestrial network in providing multicast services. This report evaluates alternate multicast FPS architectures for onboard baseband switching applications and selects a candidate for subsequent breadboard development. Architecture evaluation and selection will be based on the study performed in phase 1, 'Onboard B-ISDN Fast Packet Switching Architectures', and other switch architectures which have become commercially available as large scale integration (LSI) devices.
Analysis of Accuracy and Epoch on Back-propagation BFGS Quasi-Newton
NASA Astrophysics Data System (ADS)
Silaban, Herlan; Zarlis, Muhammad; Sawaluddin
2017-12-01
Back-propagation is one of the learning algorithms for artificial neural networks that has been widely used to solve various problems, such as pattern recognition, prediction and classification. The back-propagation architecture affects the outcome of the learning process. BFGS Quasi-Newton is one of the functions that can be used to update the weights in back-propagation. This research tested several back-propagation architectures using classical back-propagation and back-propagation with BFGS. Seven architectures were tested on the glass dataset with various numbers of neurons: six architectures with one hidden layer and one architecture with two hidden layers. BP with BFGS improves the convergence of the learning process; the average convergence improvement is 98.34%. BP with BFGS is most effective on architectures with a smaller number of neurons, reducing the number of epochs by 94.37% while increasing accuracy by about 0.5%.
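As a rough point of comparison for the quasi-Newton-versus-classical result above, the sketch below trains a small MLP with a plain SGD solver and with L-BFGS (a limited-memory relative of BFGS) using scikit-learn. This is not the paper's setup: the solver variant, the stand-in wine dataset (used here in place of the glass dataset), and the network size are assumptions chosen only for illustration.

```python
# Minimal sketch: comparing a quasi-Newton solver against plain SGD for a small MLP.
# scikit-learn exposes L-BFGS rather than the exact BFGS update used in the paper,
# and the wine dataset stands in for the glass dataset.
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for solver in ("sgd", "lbfgs"):
    clf = MLPClassifier(hidden_layer_sizes=(10,), solver=solver,
                        max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    print(f"{solver}: iterations={clf.n_iter_}, test accuracy={clf.score(X_te, y_te):.3f}")
```

On small networks the second-order solver typically converges in far fewer iterations, which mirrors the epoch reduction reported in the abstract, though the exact figures depend on the data and architecture.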
Developing Dynamic Field Theory Architectures for Embodied Cognitive Systems with cedar
Lomp, Oliver; Richter, Mathis; Zibner, Stephan K. U.; Schöner, Gregor
2016-01-01
Embodied artificial cognitive systems, such as autonomous robots or intelligent observers, connect cognitive processes to sensory and effector systems in real time. Prime candidates for such embodied intelligence are neurally inspired architectures. While components such as forward neural networks are well established, designing pervasively autonomous neural architectures remains a challenge. This includes the problem of tuning the parameters of such architectures so that they deliver specified functionality under variable environmental conditions and retain these functions as the architectures are expanded. The scaling and autonomy problems are solved, in part, by dynamic field theory (DFT), a theoretical framework for the neural grounding of sensorimotor and cognitive processes. In this paper, we address how to efficiently build DFT architectures that control embodied agents and how to tune their parameters so that the desired cognitive functions emerge while such agents are situated in real environments. In DFT architectures, dynamic neural fields or nodes are assigned dynamic regimes, that is, attractor states and their instabilities, from which cognitive function emerges. Tuning thus amounts to determining values of the dynamic parameters for which the components of a DFT architecture are in the specified dynamic regime under the appropriate environmental conditions. The process of tuning is facilitated by the software framework cedar, which provides a graphical interface to build and execute DFT architectures. It enables users to change dynamic parameters online and visualize the activation states of any component while the agent is receiving sensory inputs in real time. Using a simple example, we take the reader through the workflow of conceiving of DFT architectures, implementing them on embodied agents, tuning their parameters, and assessing performance while the system is coupled to real sensory inputs. PMID:27853431
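The attractor dynamics that cedar lets a designer tune interactively can be illustrated with a minimal one-dimensional dynamic neural field of the Amari type: with a local-excitation/global-inhibition kernel and a low resting level, a localized input can drive the field through an instability into a self-sustained peak. The sketch below is a generic NumPy illustration, not the cedar API, and all parameter values are assumptions chosen only to show the qualitative regime.

```python
# Minimal 1D dynamic neural field (Amari-style), illustrating the kind of attractor
# dynamics that cedar lets one tune interactively. Generic sketch, not the cedar API;
# all parameter values below are illustrative.
import numpy as np

n, dt, tau, h = 100, 1.0, 10.0, -5.0      # field size, time step, time constant, resting level
x = np.arange(n)
sig = lambda u: 1.0 / (1.0 + np.exp(-u))  # sigmoidal output nonlinearity

# Interaction kernel: local excitation (Gaussian) plus weak global inhibition.
d = np.minimum(np.abs(x[:, None] - x[None, :]), n - np.abs(x[:, None] - x[None, :]))
w = 1.5 * np.exp(-d**2 / (2 * 3.0**2)) - 0.1

u = np.full(n, h)                                  # field activation starts at resting level
stim = 7.0 * np.exp(-(x - 50)**2 / (2 * 5.0**2))   # localized external input

def step(u, s):
    return u + dt / tau * (-u + h + s + w @ sig(u))

for _ in range(400):          # input present: a localized activation peak forms
    u = step(u, stim)
for _ in range(400):          # input removed: with these settings the peak self-sustains
    u = step(u, 0.0)
print("max activation after input removal:", round(float(u.max()), 2))
```

Weakening the excitatory kernel amplitude in this sketch would instead let the peak decay back to the resting level, which is exactly the kind of regime boundary that parameter tuning in a DFT architecture has to respect.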
A Web Centric Architecture for Deploying Multi-Disciplinary Engineering Design Processes
NASA Technical Reports Server (NTRS)
Woyak, Scott; Kim, Hongman; Mullins, James; Sobieszczanski-Sobieski, Jaroslaw
2004-01-01
Engineering organizations have a continuous need to improve their design processes. Current state-of-the-art techniques use computational simulations to predict design performance and optimize it through advanced design methods. These tools have been used mostly by individual engineers. This paper presents an architecture for achieving results at the organization level, beyond the individual level. The next set of gains in process improvement will come from improving the effective use of computers and software within a whole organization, not just by an individual. The architecture takes advantage of state-of-the-art capabilities to produce a Web-based system to carry engineering design into the future. To illustrate deployment of the architecture, a case study implementing advanced multidisciplinary design optimization processes such as Bi-Level Integrated System Synthesis is discussed. An example of rolling out a design process for Design for Six Sigma is also described. Each example explains how an organization can effectively infuse engineering practice with new design methods and retain the knowledge over time.
Reactive Distillation and Air Stripping Processes for Water Recycling and Trace Contaminant Control
NASA Technical Reports Server (NTRS)
Boul, Peter J.; Lange, Kevin E.; Conger, Bruce; Anderson, Molly
2009-01-01
Reactive distillation designs are considered to reduce the presence of volatile organic compounds in purified water. Reactive distillation integrates a reactor with a distillation column. A review of the literature in this field has revealed a variety of functional reactive columns in industry. Wastewater may be purified by a combination of a reactor and a distiller (e.g., the EWRS or VPCAR concepts) or, in principle, through a design which integrates the reactor with the distiller. A review of the literature on reactive distillation has identified several different designs for such combinations of reactor and distiller. An evaluation of reactive distillation and reactive air stripping is presented with regard to the reduction of volatile organic compounds in contaminated water and air. Among the methods considered, an architecture is presented for the evaluation of the simultaneous oxidation of organics in air and water. These and other designs are presented in light of potential improvements in power consumption and air and water purity for architectures that integrate catalytic activity into the water processor. In particular, catalytic oxidation of organics may be useful as a tool to remove contaminants that more traditional distillation and/or air stripping columns may not remove. A review of the current leading edge in catalytically active materials, at both the commercial level and the research frontier, is presented. Themes and directions from engineering developments in catalyst design are presented conceptually in light of developments in the nanoscale chemistry of a variety of catalyst materials.
Graphics processing units in bioinformatics, computational biology and systems biology.
Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela
2017-09-01
Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), thereby limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.
Hierarchical micro-architectures of electrodes for energy storage
NASA Astrophysics Data System (ADS)
Yue, Yuan; Liang, Hong
2015-06-01
The design of electrodes for electrochemical energy storage devices, particularly lithium-ion batteries (LIBs) and supercapacitors (SCs), is of extraordinary importance in optimizing electrochemical performance. Regardless of the materials used, the architecture of electrodes is crucial for charge transport efficiency and electrochemical interactions. This report provides a critical review of prototype architectural designs and the micro- and nano-scale material properties of LIB and SC electrodes. An alternative classification criterion is proposed that divides reported hierarchical architectures into two categories: aligned and unaligned structures. The structures were evaluated, and it was found that the aligned architectures are superior to the unaligned in the following characteristics: 1) highly organized charge pathways, 2) tunable interspaces between architecture units, and 3) good electrical contact with current collectors prepared along with the electrodes. Based on these findings, challenges and potential routes to resolve them are provided for future development.
Dahamna, Badisse; Guillemin-Lanne, Sylvie; Darmoni, Stefan J; Faviez, Carole; Huot, Charles; Katsahian, Sandrine; Leroux, Vincent; Pereira, Suzanne; Richard, Christophe; Schück, Stéphane; Souvignet, Julien; Lillo-Le Louët, Agnès; Texier, Nathalie
2017-01-01
Background: Adverse drug reactions (ADRs) are an important cause of morbidity and mortality. The classical pharmacovigilance process is limited by underreporting, which justifies the current interest in new knowledge sources such as social media. The Adverse Drug Reactions from Patient Reports in Social Media (ADR-PRISM) project aims to extract ADRs reported by patients in these media. We identified 5 major challenges to overcome to operationalize the analysis of patient posts: (1) variable quality of information on social media, (2) guarantee of data privacy, (3) response to pharmacovigilance expert expectations, (4) identification of relevant information within Web pages, and (5) robust and evolutive architecture. Objective: This article aims to describe the current state of advancement of the ADR-PRISM project by focusing on the solutions we have chosen to address these 5 major challenges. Methods: In this article, we propose methods and describe the advancement of this project on several aspects: (1) a quality-driven approach for selecting relevant social media for the extraction of knowledge on potential ADRs, (2) an assessment of ethical issues and French regulation for the analysis of data on social media, (3) an analysis of pharmacovigilance expert requirements when reviewing patient posts on the Internet, (4) an extraction method based on natural language processing, pattern-based matching, and selection of relevant medical concepts in reference terminologies, and (5) specifications of a component-based architecture for the monitoring system. Results: Considering the 5 major challenges, we (1) selected a set of 21 validated criteria for selecting social media to support the extraction of potential ADRs, (2) proposed solutions to guarantee the data privacy of patients posting on the Internet, (3) took into account pharmacovigilance expert requirements with use case diagrams and scenarios, (4) built domain-specific knowledge resources embedding a lexicon, morphological rules, context rules, semantic rules, syntactic rules, and post-analysis processing, and (5) proposed a component-based architecture that allows storage of big data and accessibility to third-party applications through Web services. Conclusions: We demonstrated the feasibility of implementing a component-based architecture that allows collection of patient posts on the Internet, near real-time processing of those posts including annotation, and storage in big data structures. In the next steps, we will evaluate the posts identified by the system in social media to clarify the interest and relevance of such an approach to improve conventional pharmacovigilance processes based on spontaneous reporting. PMID:28935617
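To give a concrete, if drastically simplified, flavor of the lexicon- and pattern-based extraction step described above, the toy sketch below pairs drug and reaction mentions found in a single post. The lexicons, the example post, and the naive co-occurrence rule are invented for illustration and bear no relation to the curated terminologies and rule sets used by ADR-PRISM.

```python
# Toy sketch of lexicon + pattern based extraction of (drug, reaction) pairs from a post.
# The lexicons and the example post are invented for illustration; the real system uses
# curated terminologies, morphological/context/semantic rules, and post-processing.
import re

DRUGS = {"ibuprofen", "amoxicillin", "paracetamol"}
REACTIONS = {"nausea", "headache", "rash", "dizziness"}

post = "Started amoxicillin last week and now I have a rash and some dizziness."

tokens = re.findall(r"[a-z]+", post.lower())
drugs_found = [t for t in tokens if t in DRUGS]
reactions_found = [t for t in tokens if t in REACTIONS]

# Naive co-occurrence rule: pair every drug mention with every reaction in the same post.
candidates = [(d, r) for d in drugs_found for r in reactions_found]
print(candidates)   # [('amoxicillin', 'rash'), ('amoxicillin', 'dizziness')]
```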
ERIC Educational Resources Information Center
Chamberlin, Jennifer L.
2010-01-01
Compared to other professions in recent years, architecture has lagged woefully behind in attracting and retaining a diverse population, as typically defined by class, race and gender. This dissertation investigates the extent to which architecture culturally reproduces itself, specifically examining the socialization process of students into the…
Integrating Software-Architecture-Centric Methods into the Rational Unified Process
2004-07-01
Excerpts from the report: the Quality Attribute Workshop (QAW) is placed in a life-cycle context; one issue that needs to be addressed is how scenarios produced in a QAW can be used by a software architecture design method; and the Architecture Design section describes the Attribute-Driven Design (ADD) method.
Agricultural Urbanism in the Context of Landscape Ecological Architecture
NASA Astrophysics Data System (ADS)
Maltseva, I. N.; Kaganovich, N. N.; Mindiyrova, T. N.
2017-11-01
The article analyzes some fundamental aspects of the sustainable development of cities, connected in many respects with the concept of ecological architecture. One of the main concepts of sustainability is considered in detail: the city as an eco-sustainable and balanced system, with architectural objects as a full-fledged part of this system, which will most likely be shaped by one of the directions of this development - the development of landscape architecture as a tool for the integration of nature into the urban environment. At the same time, the variety of its functional forms and architectural methods in the organization of internal and external space is outlined, as well as its interrelation with energy-saving architecture, defining these as the two most important components of eco-sustainable development. The development forms of landscape architecture are considered in a review of analogs; as an example of an agricultural urbanism object, a thesis on the topic “Vertical Farm Agroindustrial Complex” is presented.
DOT National Transportation Integrated Search
1993-05-01
The MlTRE Corporation is supporting the Federal Highway Administration (FHWA) in : the development of a national architecture for Intelligent Vehicle Highway Systems (IVHS). : This report examines the communication, processing, and storage load requi...
DOT National Transportation Integrated Search
2006-01-01
This pamphlet gives a brief introduction to the National Intelligent Transportation Systems (ITS) architecture and regional ITS architectures. It gives an overview of architecture, project, and standards requirements, and describes the availability o...
Polymer architectures via mass spectrometry and hyphenated techniques: A review.
Crotty, Sarah; Gerişlioğlu, Selim; Endres, Kevin J; Wesdemiotis, Chrys; Schubert, Ulrich S
2016-08-17
This review covers the application of mass spectrometry (MS) and its hyphenated techniques to synthetic polymers of varying architectural complexity. The synthetic polymers are discussed according to their architectural complexity, from linear homopolymers and copolymers to stars, dendrimers, cyclic copolymers and other polymers. MS and tandem MS (MS/MS) have been extensively used for the analysis of synthetic polymers. However, the increase in structural or architectural complexity can result in analytical challenges that MS or MS/MS cannot overcome alone. Hyphenation of MS with different chromatographic techniques (2D × LC, SEC, HPLC etc.), utilization of other ionization methods (APCI, DESI etc.) and various mass analyzers (FT-ICR, quadrupole, time-of-flight, ion trap etc.) are applied to overcome these challenges and achieve more detailed structural characterization of complex polymeric systems. In addition, computational methods (software: MassChrom2D, COCONUT, 2D maps etc.) have also reached polymer science to facilitate and accelerate data interpretation. Developments in technology and in the comprehension of different polymer classes with diverse architectures have significantly improved the field, allowing smart polymer designs to be examined and advanced. We present specific examples covering diverse analytical aspects as well as forthcoming prospects in polymer science. Copyright © 2016 Elsevier B.V. All rights reserved.
A systematic review and meta-analysis of sleep architecture and chronic traumatic brain injury.
Mantua, Janna; Grillakis, Antigone; Mahfouz, Sanaa H; Taylor, Maura R; Brager, Allison J; Yarnell, Angela M; Balkin, Thomas J; Capaldi, Vincent F; Simonelli, Guido
2018-02-02
Sleep quality appears to be altered by traumatic brain injury (TBI). However, whether persistent post-injury changes in sleep architecture are present is unknown and relatively unexplored. We conducted a systematic review and meta-analysis to assess the extent to which chronic TBI (>6 months since injury) is characterized by changes to sleep architecture. We also explored the relationship between sleep architecture and TBI severity. In the fourteen included studies, sleep was assessed with at least one night of polysomnography in both chronic TBI participants and controls. Statistical analyses, performed using Comprehensive Meta-Analysis software, revealed that chronic TBI is characterized by relatively increased slow wave sleep (SWS). A meta-regression showed moderate-severe TBI is associated with elevated SWS, reduced stage 2, and reduced sleep efficiency. In contrast, mild TBI was not associated with any significant alteration of sleep architecture. The present findings are consistent with the hypothesis that increased SWS after moderate-severe TBI reflects post-injury cortical reorganization and restructuring. Suggestions for future research are discussed, including adoption of common data elements in future studies to facilitate cross-study comparability, reliability, and replicability, thereby increasing the likelihood that meaningful sleep (and other) biomarkers of TBI will be identified. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Architecture, Dynamics, and Development of Mental Processing: Greek, Chinese, or Universal?
ERIC Educational Resources Information Center
Demetriou, A.; Kui, Z.X.; Spanoudis, G.; Christou, C.; Kyriakides, L.; Platsidou, M.
2005-01-01
This study compared Greeks with Chinese, from 8 to 14 years of age, on measures of processing efficiency, working memory, and reasoning. All processes were addressed through three domains of relations: verbal/propositional, quantitative, and visuo/spatial. Structural equations modelling and rating scale analysis showed that the architecture and…
Scalable software architecture for on-line multi-camera video processing
NASA Astrophysics Data System (ADS)
Camplani, Massimo; Salgado, Luis
2011-03-01
In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. In this paper, as a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions, such as the number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and can easily work with different image formats while respecting real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
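The supervisor pattern described above (a Central Unit overseeing per-camera Processing Units that each handle acquisition and processing) can be sketched with Python's multiprocessing module as follows. Frame acquisition and the 2D detection step are stubbed out with random data, and all names and parameters are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of the supervisor pattern described above: one Processing Unit (PU)
# per camera handles acquisition + processing, and a Central Unit supervises them.
# Camera acquisition is stubbed with random frames; names and sizes are illustrative.
import multiprocessing as mp
import numpy as np

def processing_unit(camera_id, results, n_frames=5):
    for i in range(n_frames):
        frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in for acquisition
        detections = int((frame > 250).sum())                          # stand-in for 2D detection
        results.put((camera_id, i, detections))

def central_unit(n_cameras=3):
    results = mp.Queue()
    pus = [mp.Process(target=processing_unit, args=(cid, results)) for cid in range(n_cameras)]
    for p in pus:
        p.start()
    for _ in range(n_cameras * 5):          # supervise: collect results from all PUs
        cam, frame_idx, det = results.get()
        print(f"camera {cam}, frame {frame_idx}: {det} candidate pixels")
    for p in pus:
        p.join()

if __name__ == "__main__":
    central_unit()
```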
Help Seeking Attitudes and Behaviors of International Students at Architectural Schools
ERIC Educational Resources Information Center
Meyer, Cary J.
2009-01-01
The purpose of this study was to investigate the help-seeking attitudes and behaviors of international students at architectural schools of higher education. A review of the academic literature revealed no earlier research on this specific population. However, there was a moderate body of literature regarding help seeking attitudes and behavior…
Software system architecture for corporate user support
NASA Astrophysics Data System (ADS)
Sukhopluyeva, V. S.; Kuznetsov, D. Y.
2017-01-01
In this article, several existing ready-to-use HelpDesk solutions are reviewed, and their advantages and disadvantages are identified. The architecture of a software solution for a corporate user support system is presented in the form of use case, state, and component diagrams described using the Unified Modeling Language (UML).
System architecture for the Canadian interim mobile satellite system
NASA Technical Reports Server (NTRS)
Shariatmadar, M.; Gordon, K.; Skerry, B.; Eldamhougy, H.; Bossler, D.
1988-01-01
The system architecture for the Canadian Interim Mobile Satellite Service (IMSS) which is planned for commencement of commercial service in late 1989 is reviewed. The results of an associated field trial program which was carried out to determine the limits of coverage and the preliminary performance characteristics of the system are discussed.
Contrasting the Use of Tools for Presentation and Critique: Some Cases from Architectural Education
ERIC Educational Resources Information Center
Lymer, Gustav; Ivarsson, Jonas; Lindwall, Oskar
2009-01-01
This study investigates video recordings of design reviews in architectural education, focusing on how presentations and discussions of designs are contingent on the specific tools employed. In the analyzed recordings, three different setups are utilized: traditional posters, digital slide-show technologies, and combinations of the two. This range…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-03
... Other Business Sub-Groups meetings Break out, as necessary daily EDR Turbulence Standards Project Briefing from FAA SE2020 Team SG6 WG1 Architecture and MASPS presentations SG3 AIS and MET Services Delivery Architecture Recommendations Document Review (FRAC release approval). Closing Plenary--Sub-Groups...
Using an Architectural Metaphor for Information Design in Hypertext.
ERIC Educational Resources Information Center
Deboard, Donn R.; Lee, Doris
2001-01-01
Uses Frank Lloyd Wright's (1867-1959) organic architecture as a metaphor to define the relationship between a part and a whole, whether the focus is on a building and its surroundings or information delivered via hypertext. Reviews effective strategies for designing text information via hypertext and incorporates three levels of information…
NASA Technical Reports Server (NTRS)
1983-01-01
Various parameters of the orbital space station are discussed. The space station environment, data management system, communication and tracking, environmental control, and life support system are considered. Specific topics reviewed include crew work stations, restraint systems, stowage, computer hardware, and expert systems.
A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing
NASA Astrophysics Data System (ADS)
Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.
1987-05-01
It is often claimed that conventional computers are not well suited for human-like tasks: vision (image processing), intelligence (symbolic processing), and so on. In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists of a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design an instruction set well suited for symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach to studying new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.
ASAC Executive Assistant Architecture Description Summary
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.
1997-01-01
In this technical document, we describe the system architecture developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA). We describe the genesis and role of the ASAC system; discuss its objectives and provide an overview of the components and models within it; discuss our choice of architecture methodology, the Domain Specific Software Architecture (DSSA), and the DSSA approach to developing a system architecture; and describe the development process and the results of the ASAC EA system architecture. The document has six appendices.
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
A fast, programmable hardware architecture for spaceborne SAR processing
NASA Technical Reports Server (NTRS)
Bennett, J. R.; Cumming, I. G.; Lim, J.; Wedding, R. M.
1983-01-01
The launch of spaceborne SARs during the 1980s is discussed. These satellite SARs require high-quality and high-throughput ground processors. Compression ratios in range and azimuth of greater than 500 and 150, respectively, lead to frequency-domain processing and data computation rates in excess of 2000 million real operations per second for the C-band SARs under consideration. Various hardware architectures are examined, two promising candidates are selected, and a fast, programmable hardware architecture for spaceborne SAR processing is recommended. Modularity and programmability are introduced as desirable attributes for the purpose of HTSP hardware selection.
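The frequency-domain processing that such compression ratios favor can be illustrated with a small NumPy sketch of range (pulse) compression: the received line is matched-filtered against the transmitted linear-FM replica by multiplying spectra. The chirp parameters, target positions, and noise level below are arbitrary illustrative values, far smaller than any real C-band SAR, and this is not the hardware architecture discussed in the paper.

```python
# Illustrative frequency-domain range compression of a linear-FM chirp (matched filtering),
# the kind of operation that motivates frequency-domain SAR processors. Parameters are
# arbitrary and much smaller than a real C-band system.
import numpy as np

fs, T, B = 10e6, 20e-6, 4e6                 # sample rate, pulse length, chirp bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * (t - T / 2) ** 2)   # transmitted replica

# Simulated received range line: two point targets at different delays plus noise.
line = np.zeros(2048, dtype=complex)
for delay in (300, 900):
    line[delay:delay + chirp.size] += chirp
line += 0.1 * (np.random.randn(line.size) + 1j * np.random.randn(line.size))

# Matched filter in the frequency domain: multiply by the conjugate of the replica spectrum.
n = line.size
compressed = np.fft.ifft(np.fft.fft(line, n) * np.conj(np.fft.fft(chirp, n)))
print("detected range bins:", np.argsort(np.abs(compressed))[-2:])
```

Performing the correlation as a spectral multiply turns an O(N·M) time-domain convolution into two FFTs and an inverse FFT, which is why high compression ratios push designs toward frequency-domain processors.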
An object-oriented software approach for a distributed human tracking motion system
NASA Astrophysics Data System (ADS)
Micucci, Daniela L.
2003-06-01
Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. R.; Hornung, R.; Black, A.
This document serves as a detailed companion to the powerpoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782 titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials, and a letter of completion signed by the review committee will act as proof of completion for this milestone.
Evaluation of hardware costs of implementing PSK signal detection circuit based on "system on chip"
NASA Astrophysics Data System (ADS)
Sokolovskiy, A. V.; Dmitriev, D. D.; Veisov, E. A.; Gladyshev, A. B.
2018-05-01
The article deals with the choice of architecture for the digital signal processing units that implement a PSK signal detection scheme. As an assessment of the effectiveness of the candidate architectures, the number of shift registers and computational processes required when implementing the scheme as a "system on a chip" is used. A statistical estimation of the normalized code sequence offset in the signal synchronization scheme is performed for the various hardware block architectures.
Parallel Signal Processing and System Simulation using aCe
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2003-01-01
Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The new aCe language is ideally suited for parallel signal processing applications and system simulation since it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this new C-based parallel language (aCe C) for architecture-adaptive programming allows programmers to implement algorithms and system simulation applications on parallel architectures by providing them with the assurance that future parallel architectures will be able to run their applications with a minimum of modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).
Tan, Huan; Liang, Chen
2011-01-01
This paper proposes a conceptual hybrid cognitive architecture for cognitive robots to learn behaviors from demonstrations in robotic aid situations. Unlike current cognitive architectures, this architecture concentrates on the requirements of safety, interaction, and non-centralized processing in robotic aid situations. Imitation learning technologies for cognitive robots have been integrated into this architecture for rapidly transferring knowledge and skills between human teachers and robots.
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Can we manipulate root system architecture to control soil erosion?
NASA Astrophysics Data System (ADS)
Ola, A.; Dodd, I. C.; Quinton, J. N.
2015-09-01
Soil erosion is a major threat to soil functioning. The use of vegetation to control erosion has long been a topic for research. Much of this research has focused on the above-ground properties of plants, demonstrating the important role that canopy structure and cover plays in the reduction of water erosion processes. Less attention has been paid to plant roots. Plant roots are a crucial yet under-researched factor for reducing water erosion through their ability to alter soil properties, such as aggregate stability, hydraulic function and shear strength. However, there have been few attempts to specifically manipulate plant root system properties to reduce soil erosion. Therefore, this review aims to explore the effects that plant roots have on soil erosion and hydrological processes, and how plant root architecture might be manipulated to enhance its erosion control properties. We demonstrate the importance of root system architecture for the control of soil erosion. We also show that some plant species respond to nutrient-enriched patches by increasing lateral root proliferation. The erosional response to root proliferation will depend upon its location: at the soil surface dense mats of roots may reduce soil erodibility but block soil pores thereby limiting infiltration, enhancing runoff. Additionally, in nutrient-deprived regions, root hair development may be stimulated and larger amounts of root exudates released, thereby improving aggregate stability and decreasing erodibility. Utilizing nutrient placement at specific depths may represent a potentially new, easily implemented, management strategy on nutrient-poor agricultural land or constructed slopes to control erosion, and further research in this area is needed.
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events are generated by the system components during execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
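A toy version of the subscription-based event filtering idea described above is sketched below: management tools register predicates, and an event is forwarded only if some subscription matches it, so traffic is cut before dissemination. The event fields, predicates, and class names are invented for illustration and are not the architecture's actual interfaces.

```python
# Toy sketch of subscription-based event filtering: predicates are evaluated at the
# event source so only matching events are forwarded to management tools. Event fields
# and filter predicates are invented for illustration.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

Event = Dict[str, Any]

@dataclass
class EventFilter:
    subscriptions: List[Callable[[Event], bool]] = field(default_factory=list)
    forwarded: List[Event] = field(default_factory=list)

    def subscribe(self, predicate: Callable[[Event], bool]) -> None:
        self.subscriptions.append(predicate)

    def publish(self, event: Event) -> None:
        # Forward the event only if some subscriber's predicate matches it.
        if any(pred(event) for pred in self.subscriptions):
            self.forwarded.append(event)

flt = EventFilter()
flt.subscribe(lambda e: e["type"] == "error")                      # debugging tool
flt.subscribe(lambda e: e["type"] == "latency" and e["ms"] > 100)  # tuning tool

for ev in ({"type": "heartbeat"}, {"type": "error", "node": 3},
           {"type": "latency", "ms": 250}, {"type": "latency", "ms": 20}):
    flt.publish(ev)

print(f"forwarded {len(flt.forwarded)} of 4 events")   # traffic reduced before dissemination
```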
SpaceCubeX: A Framework for Evaluating Hybrid Multi-Core CPU FPGA DSP Architectures
NASA Technical Reports Server (NTRS)
Schmidt, Andrew G.; Weisz, Gabriel; French, Matthew; Flatley, Thomas; Villalpando, Carlos Y.
2017-01-01
The SpaceCubeX project is motivated by the need for high performance, modular, and scalable on-board processing to help scientists answer critical 21st century questions about global climate change, air quality, ocean health, and ecosystem dynamics, while adding new capabilities such as low-latency data products for extreme event warnings. These goals translate into on-board processing throughput requirements that are on the order of 100-1,000 times greater than those of previous Earth Science missions for standard processing, compression, storage, and downlink operations. To study possible future architectures to achieve these performance requirements, the SpaceCubeX project provides an evolvable testbed and framework that enables a focused design space exploration of candidate hybrid CPU/FPGA/DSP processing architectures. The framework includes ArchGen, an architecture generator tool populated with candidate architecture components, performance models, and IP cores, that allows an end user to specify the type, number, and connectivity of a hybrid architecture. The framework requires minimal extensions to integrate new processors, such as the anticipated High Performance Spaceflight Computer (HPSC), reducing time to initiate benchmarking by months. To evaluate the framework, we leverage a wide suite of high performance embedded computing benchmarks and Earth science scenarios to ensure robust architecture characterization. We report on our project's Year 1 efforts and demonstrate the capabilities across four simulation testbed models: a baseline SpaceCube 2.0 system, a dual ARM A9 processor system, a hybrid quad ARM A53 and FPGA system, and a hybrid quad ARM A53 and DSP system.
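To make the "specify the type, number, and connectivity of a hybrid architecture" step more tangible, here is a purely hypothetical sketch of the kind of declarative description a generator tool such as ArchGen might consume; the abstract does not show the real input format, so all keys and values below are assumptions.

```python
# Hypothetical sketch of a declarative hybrid-architecture description
# (the real ArchGen input format is not shown in the abstract):
# the user states component types, counts, and links between them.
candidate_architecture = {
    "name": "quad_a53_plus_fpga",
    "components": [
        {"id": "cpu0", "type": "ARM_A53", "count": 4},
        {"id": "fpga0", "type": "FPGA", "count": 1},
    ],
    "links": [
        {"from": "cpu0", "to": "fpga0", "bus": "AXI", "bandwidth_mbps": 1200},
    ],
}


def validate(arch: dict) -> None:
    # A generator would at minimum check that every link references known components.
    ids = {c["id"] for c in arch["components"]}
    for link in arch["links"]:
        assert link["from"] in ids and link["to"] in ids, "link references unknown component"


validate(candidate_architecture)
print("architecture description accepted:", candidate_architecture["name"])
```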
Bioprinting for Neural Tissue Engineering.
Knowlton, Stephanie; Anand, Shivesh; Shah, Twisha; Tasoglu, Savas
2018-01-01
Bioprinting is a method by which a cell-encapsulating bioink is patterned to create complex tissue architectures. Given the potential impact of this technology on neural research, we review the current state-of-the-art approaches for bioprinting neural tissues. While 2D neural cultures are ubiquitous for studying neural cells, 3D cultures can more accurately replicate the microenvironment of neural tissues. By bioprinting neuronal constructs, one can precisely control the microenvironment by specifically formulating the bioink for neural tissues, and by spatially patterning cell types and scaffold properties in three dimensions. We review a range of bioprinted neural tissue models and discuss how they can be used to observe how neurons behave, understand disease processes, develop new therapies and, ultimately, design replacement tissues. Copyright © 2017 Elsevier Ltd. All rights reserved.
Rapid ordering of block copolymer thin films
Majewski, Pawel W.; Yager, Kevin G.
2016-08-18
Block-copolymers self-assemble into diverse morphologies, where nanoscale order can be finely tuned via block architecture and processing conditions. However, the ultimate usage of these materials in real-world applications may be hampered by the extremely long thermal annealing times—hours or days—required to achieve good order. Here, we provide an overview of the fundamentals of block-copolymer self-assembly kinetics, and review the techniques that have been demonstrated to influence, and enhance, these ordering kinetics. We discuss the inherent tradeoffs between oven annealing, solvent annealing, microwave annealing, zone annealing, and other directed self-assembly methods, including an assessment of spatial and temporal characteristics. Here, we also review both real-space and reciprocal-space analysis techniques for quantifying order in these systems.
A new physical unclonable function architecture
NASA Astrophysics Data System (ADS)
Chuang, Bai; Xuecheng, Zou; Kui, Dai
2015-03-01
This paper describes a new silicon physical unclonable function (PUF) architecture that can be fabricated on a standard CMOS process. Our proposed architecture is built from process sensors, a difference amplifier, a comparator, a voting mechanism, and a diffusion algorithm circuit. Multiple identical process sensors are fabricated on the same chip. Due to manufacturing process variations, each sensor produces slightly different physical characteristic values that can be compared in order to create a digital identification for the chip. The diffusion algorithm circuit further ensures that a PUF based on the proposed architecture is able to effectively identify a population of ICs. We also improve the stability of the PUF design with respect to temporary environmental variations, such as temperature and supply voltage, through the introduction of the difference amplifier and the voting mechanism. A PUF built on the proposed architecture was fabricated in 0.18 μm CMOS technology. Experimental results show that the PUF has good output statistical characteristics with a uniform distribution and a high stability of 98.1% with respect to temperature variation from -40 to 100 °C, and supply voltage variation from 1.7 to 1.9 V. Project supported by the National Natural Science Foundation of China (No. 61376031).
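The comparison-plus-voting idea can be illustrated with a small software model, a minimal sketch assuming Gaussian sensor noise; it is not the fabricated circuit, and all values are invented for illustration.

```python
# Illustrative software model (not the fabricated circuit): pairwise comparison
# of process-sensor readings yields raw bits, and repeated majority voting
# suppresses bit flips caused by temperature/voltage noise.
import random


def raw_bits(sensor_values):
    # Compare adjacent sensors; process variation makes each comparison a bit.
    return [int(a > b) for a, b in zip(sensor_values, sensor_values[1:])]


def noisy_reading(nominal, noise=0.02):
    # Temporary environmental variation modeled as additive Gaussian noise.
    return [v + random.gauss(0.0, noise) for v in nominal]


def vote(bit_samples):
    # Majority vote across repeated evaluations of the same bit position.
    return [int(sum(col) * 2 > len(col)) for col in zip(*bit_samples)]


nominal = [1.00, 1.03, 0.97, 1.05, 0.99]            # per-sensor characteristic values
samples = [raw_bits(noisy_reading(nominal)) for _ in range(9)]
print(vote(samples))                                 # stable chip identifier bits
```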
Remote hardware-reconfigurable robotic camera
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.
2001-10-01
In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.
CisLunar Habitat Internal Architecture Design Criteria
NASA Technical Reports Server (NTRS)
Jones, R.; Kennedy, K.; Howard, R.; Whitmore, M.; Martin, C.; Garate, J.
2017-01-01
BACKGROUND: In preparation for human exploration of Mars, there is a need to define the development and test program that will validate deep space operations and systems. In that context, a Proving Grounds CisLunar habitat spacecraft is being defined as the next step towards this goal. This spacecraft will operate differently from the ISS or other spacecraft in human history. The performance envelope of this spacecraft (mass, volume, power, specifications, etc.) is being defined by the Future Capabilities Study Team. This team has recognized the need for a human-centered approach for the internal architecture of this spacecraft and has commissioned a CisLunar Phase-1 Habitat Internal Architecture Study Team to develop a NASA reference configuration, providing the Agency with a "smart buyer" approach for future acquisition. THE CISLUNAR HABITAT INTERNAL ARCHITECTURE STUDY: Overall, the CisLunar Habitat Internal Architecture study will address the most significant questions and risks in the current CisLunar architecture, habitation, and operations concept development. This effort is achieved through definition of design criteria, evaluation criteria and process, design of the CisLunar Habitat Phase-1 internal architecture, and the development and fabrication of internal architecture concepts combined with rigorous and methodical Human-in-the-Loop (HITL) evaluations and testing of the conceptual innovations in a controlled test environment. The vision of the CisLunar Habitat Internal Architecture Study is to design, build, and test a CisLunar Phase-1 Habitat Internal Architecture that will be used for habitation (e.g. habitability and human factors) evaluations. The evaluations will mature CisLunar habitat evaluation tools, guidelines, and standards, and will interface with other projects such as the Advanced Exploration Systems (AES) Program integrated Power, Avionics, Software (iPAS), and Logistics for integrated human-in-the-loop testing. The mission of the CisLunar Habitat Internal Architecture Study is to become a forcing function to establish a common understanding of CisLunar Phase-1 Habitation Internal Architecture design criteria, processes, and tools. The scope of the CisLunar Habitat Internal Architecture study is to design, develop, demonstrate, and evaluate a Phase-1 CisLunar Habitat common module internal architecture based on design criteria agreed to by NASA, the International Partners, and Commercial Exploration teams. This task is to define the CisLunar Phase-1 Internal Architecture Government Reference Design, assist NASA in becoming a "smart buyer" for Phase-1 Habitat Concepts, and ultimately to derive standards and requirements from the Internal Architecture Design Process. The first step was to define a Habitat Internal Architecture Design Criteria and create a structured philosophy to be used by design teams as a filter by which critical aspects of consideration would be identified for the purpose of organizing and utilizing interior spaces. With design criteria in place, the team will develop a series of iterative internal architecture concept designs which will be assessed by means of an evaluation criteria and process. These assessments will successively drive and refine the design, leading to the combination and down-selection of design concepts. A single refined reference design configuration will be developed into a medium-to-high fidelity mockup. A multi-day human-in-the-loop mission test will fully evaluate the reference design and validate its configuration.
Lessons learned from the design and evaluation will enable the team to identify appropriate standards for Phase-1 CisLunar Habitat Internal Architecture and will enable NASA to develop derived requirements in support of maturing CisLunar Habitation capabilities. This paper will describe the criteria definition process, workshop event, and resulting CisLunar Phase-1 Habitat Internal Architecture Design Criteria.
Augmenting cognitive architectures to support diagrammatic imagination.
Chandrasekaran, Balakrishnan; Banerjee, Bonny; Kurup, Unmesh; Lele, Omkar
2011-10-01
Diagrams are a form of spatial representation that supports reasoning and problem solving. Even when diagrams are external, not to mention when there are no external representations, problem solving often calls for internal representations, that is, representations in cognition, of diagrammatic elements and internal perceptions on them. General cognitive architectures--Soar and ACT-R, to name the most prominent--do not have representations and operations to support diagrammatic reasoning. In this article, we examine some requirements for such internal representations and processes in cognitive architectures. We discuss the degree to which DRS, our earlier proposal for such an internal representation for diagrams, meets these requirements. In DRS, the diagrams are not raw images, but a composition of objects that can be individuated and thus symbolized, while, unlike traditional symbols, the referent of the symbol is an object that retains its perceptual essence, namely, its spatiality. This duality provides a way to resolve what anti-imagists thought was a contradiction in mental imagery: the compositionality of mental images that seemed to be unique to symbol systems, and their support of a perceptual experience of images and some types of perception on them. We briefly review the use of DRS to augment Soar and ACT-R with a diagrammatic representation component. We identify issues for further research. Copyright © 2011 Cognitive Science Society, Inc.
Conceptual Modeling in the Time of the Revolution: Part II
NASA Astrophysics Data System (ADS)
Mylopoulos, John
Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.
McDermott, Gerry; Le Gros, Mark A.; Larabell, Carolyn A.
2012-01-01
Living cells are structured to create a range of microenvironments that support specific chemical reactions and processes. Understanding how cells function therefore requires detailed knowledge of both the subcellular architecture and the location of specific molecules within this framework. Here we review the development of two correlated cellular imaging techniques that fulfill this need. Cells are first imaged using cryogenic fluorescence microscopy to determine the location of molecules of interest that have been labeled with fluorescent tags. The same specimen is then imaged using soft X-ray tomography to generate a high-contrast, 3D reconstruction of the cells. Data from the two modalities are then combined to produce a composite, information-rich view of the cell. This correlated imaging approach can be applied across the spectrum of problems encountered in cell biology, from basic research to biotechnological and biomedical applications such as the optimization of biofuels and the development of new pharmaceuticals. PMID:22242730
The traveling salesman problem: a hierarchical model.
Graham, S M; Joshi, A; Pizlo, Z
2000-10-01
Our review of prior literature on spatial information processing in perception, attention, and memory indicates that these cognitive functions involve similar mechanisms based on a hierarchical architecture. The present study extends the application of hierarchical models to the area of problem solving. First, we report results of an experiment in which human subjects were tested on a Euclidean traveling salesman problem (TSP) with 6 to 30 cities. The subjects' solutions were either optimal or near-optimal in length and were produced in a time that was, on average, a linear function of the number of cities. Next, the performance of the subjects is compared with that of five representative artificial intelligence and operations research algorithms that produce approximate solutions for Euclidean problems. None of these algorithms was found to be an adequate psychological model. Finally, we present a new algorithm for solving the TSP, which is based on a hierarchical pyramid architecture. The performance of this new algorithm is quite similar to the performance of the subjects.
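The flavor of a coarse-to-fine, roughly linear-time tour construction can be conveyed with the small sketch below. It is only a rough analogue of a hierarchical/pyramid approach, not the authors' model: cities are bucketed into coarse groups, the groups are toured in order, and each group is refined locally with a nearest-neighbor pass.

```python
# Rough coarse-to-fine analogue of a pyramid TSP heuristic (not the authors'
# exact model): group cities, tour the groups coarsely, refine each group locally.
import math
import random


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def nearest_neighbor_order(points, start):
    remaining, tour, cur = list(points), [], start
    while remaining:
        nxt = min(remaining, key=lambda p: dist(cur, p))
        remaining.remove(nxt)
        tour.append(nxt)
        cur = nxt
    return tour


def coarse_to_fine_tour(cities, n_groups=4):
    # Coarse level: bucket cities by angle around the overall centroid.
    cx = sum(x for x, _ in cities) / len(cities)
    cy = sum(y for _, y in cities) / len(cities)
    groups = [[] for _ in range(n_groups)]
    for c in cities:
        ang = math.atan2(c[1] - cy, c[0] - cx) % (2 * math.pi)
        groups[int(ang / (2 * math.pi) * n_groups) % n_groups].append(c)
    # Fine level: traverse groups in angular order, refining each locally.
    tour, cur = [], (cx, cy)
    for g in groups:
        if g:
            ordered = nearest_neighbor_order(g, cur)
            tour.extend(ordered)
            cur = ordered[-1]
    return tour


cities = [(random.random(), random.random()) for _ in range(30)]
tour = coarse_to_fine_tour(cities)
print("tour length:", sum(dist(a, b) for a, b in zip(tour, tour[1:] + tour[:1])))
```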
Salvo, Grazia; Doyle-Baker, Patricia K.; McCormack, Gavin R.
2018-01-01
Qualitative studies can provide important information about how and why the built environment impacts physical activity decision-making: information that is important for informing local urban policies. We undertook a systematized literature review to synthesize findings from qualitative studies exploring how the built environment influences physical activity in adults. Our review included 36 peer-reviewed qualitative studies published from 1998 onwards. Our findings complemented existing quantitative evidence and provided additional insight into how functional, aesthetic, destination, and safety built characteristics influence physical activity decision-making. Sociodemographic characteristics (age, sex, ethnicity, and socioeconomic status) also impacted the built environment's influence on physical activity. Our review findings reinforce the need for transportation planning, urban design, landscape architecture, road engineering, parks and recreation, bylaw enforcement, and public health to work in synergy in creating neighbourhood environments that support physical activity. Our findings also support the need for local neighbourhood citizens and associations, with representation from individuals and groups of different sociodemographic backgrounds, to have input into the neighbourhood environment planning process. PMID:29724048
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Fiacco, P. A.; Rice, W. H.
1991-01-01
Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732
Relating Business Goals to Architecturally Significant Requirements for Software Systems
2010-05-01
Excerpt: "… must respond within five seconds" [EPF 2010]. A major source of architecturally significant requirements is the set of business goals that led to the …
Reference fragments: … Projects for Competitive Advantage, Center for Business Practices, 1999. [EPF 2010] Eclipse Process Framework Project. Concept: Architecturally …
NASA Technical Reports Server (NTRS)
Ruiz, B. Ian; Burke, Gary R.; Lung, Gerald; Whitaker, William D.; Nowicki, Robert M.
2004-01-01
This viewgraph presentation reviews the architecture of the CIA-AlA chip-set, a set of mixed-signal ASICs that provide a flexible, high-level interface between the spacecraft's command and data handling (C&DH) electronics and lower-level functions in other spacecraft subsystems. Due to the open-systems architecture of the chip-set, including an embedded micro-controller, a variety of applications are possible. The chip-set was developed for the missions to the outer planets. The chips were developed to provide a single solution for both the switching and regulation of a spacecraft power bus. The open-systems architecture allows for other powerful applications.
Principles of Temporal Processing Across the Cortical Hierarchy.
Himberger, Kevin D; Chien, Hsiang-Yun; Honey, Christopher J
2018-05-02
The world is richly structured on multiple spatiotemporal scales. In order to represent spatial structure, many machine-learning models repeat a set of basic operations at each layer of a hierarchical architecture. These iterated spatial operations - including pooling, normalization and pattern completion - enable these systems to recognize and predict spatial structure, while robust to changes in the spatial scale, contrast and noisiness of the input signal. Because our brains also process temporal information that is rich and occurs across multiple time scales, might the brain employ an analogous set of operations for temporal information processing? Here we define a candidate set of temporal operations, and we review evidence that they are implemented in the mammalian cerebral cortex in a hierarchical manner. We conclude that multiple consecutive stages of cortical processing can be understood to perform temporal pooling, temporal normalization and temporal pattern completion. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Imaging Cellular Architecture with X-rays
Larabell, Carolyn A.; Nugent, Keith A.
2012-01-01
X-ray imaging of biological samples is progressing rapidly. In this paper we review the progress to date in high resolution imaging of cellular architecture. In particular we survey the progress in soft X-ray tomography and argue that the field is coming of age and that important biological insights are starting to emerge. We then review the new ideas based on coherent diffraction. These methods are at a much earlier stage of development but, as they eliminate the need for X-ray optics, have the capacity to provide substantially better spatial resolution than zone plate based methods. PMID:20869868
A Survey of Memristive Threshold Logic Circuits.
Maan, Akshay Kumar; Jayadevi, Deepthi Anirudhan; James, Alex Pappachen
2017-08-01
In this paper, we review different memristive threshold logic (MTL) circuits that are inspired by the synaptic action of the flow of neurotransmitters in the biological brain. The brainlike generalization ability and the area minimization of these threshold logic circuits aim toward crossing Moore's law boundaries at the device, circuit, and system levels. Fast switching memory, signal processing, control systems, programmable logic, image processing, reconfigurable computing, and pattern recognition are identified as some of the potential applications of MTL systems. The physical realization of nanoscale devices with memristive behavior from materials such as TiO2, ferroelectrics, silicon, and polymers has accelerated research effort in these application areas, inspiring the scientific community to pursue the design of high-speed, low-cost, low-power, and high-density neuromorphic architectures.
Macrophage and Innate Lymphoid Cell Interplay in the Genesis of Fibrosis
Hams, Emily; Bermingham, Rachel; Fallon, Padraic G.
2015-01-01
Fibrosis is a characteristic pathological feature of an array of chronic diseases, where development of fibrosis in tissue can lead to marked alterations in the architecture of the affected organs. As a result of this process of sustained attrition to organs, many diseases that involve fibrosis are often progressive conditions and have a poor long-term prognosis. Inflammation is often a prelude to fibrosis, with innate and adaptive immunity involved in both the initiation and regulation of the fibrotic process. In this review, we will focus on the emerging roles of the newly described innate lymphoid cells (ILCs) in the generation of fibrotic disease with an examination of the potential interplay between ILC and macrophages and the adaptive immune system. PMID:26635811
NASA Technical Reports Server (NTRS)
Stebbins, Robin T.
2009-01-01
The LISA science requirements and conceptual design have been fairly stable for over a decade. In the interest of reducing costs, the LISA Project at NASA has looked for simplifications of the architecture, at downsizing of subsystems, and at descopes of the entire mission. This is a natural activity of the formulation phase, and one that is particularly timely in the current NASA budgetary context. There is, and will continue to be, enormous pressure for cost reduction from both ESA and NASA, reviewers and the broader research community. Here, the rationale for the baseline architecture is reviewed, and recent efforts to find simplifications and other reductions that might lead to savings are reported. A few possible simplifications have been found in the LISA baseline architecture. In the interest of exploring cost sensitivity, one moderate and one aggressive descope have been evaluated; the cost savings are modest and the loss of science is not.
Xu, Weinan; Ledin, Petr A; Shevchenko, Valery V; Tsukruk, Vladimir V
2015-06-17
Branched polyelectrolytes with cylindrical brush, dendritic, hyperbranched, grafted, and star architectures bearing ionizable functional groups possess complex and unique assembly behavior in solution and at surfaces and interfaces, as compared to their linear counterparts. This review summarizes the recent developments in the introduction of various architectures and the understanding of the assembly behavior of branched polyelectrolytes, with a focus on functional polyelectrolytes and poly(ionic liquid)s with responsive properties. The branched polyelectrolytes and poly(ionic liquid)s interact electrostatically with small molecules, linear polyelectrolytes, or other branched polyelectrolytes to form assemblies of hybrid nanoparticles, multilayer thin films, responsive microcapsules, and ion-conductive membranes. The branched structures lead to unconventional assemblies and complex hierarchical structures with responsive properties, as summarized in this review. Finally, we discuss prospects for emerging applications of branched polyelectrolytes and poly(ionic liquid)s in energy harvesting and storage, controlled delivery, chemical microreactors, adaptive surfaces, and ion-exchange membranes.
3D Architectured Graphene/Metal Oxide Hybrids for Gas Sensors: A Review
Xia, Yi; Li, Ran; Chen, Ruosong; Wang, Jing; Xiang, Lan
2018-01-01
Graphene/metal oxide-based materials have been demonstrated as promising candidates for gas sensing applications due to the enhanced sensing performance and synergetic effects of the two components. Many metal oxides, such as SnO2, ZnO, and WO3, have been hybridized with graphene to improve the gas sensing properties. However, graphene/metal oxide nanohybrid-based gas sensors still have several limitations in practical application, such as insufficient sensitivity and response rate, and long recovery time in some cases. To achieve higher sensing performance of graphene/metal oxide nanocomposites, many recent efforts have been devoted to the controllable synthesis of 3D graphene/metal oxide architectures owing to their large surface area and well-organized structure for enhanced gas adsorption/diffusion on sensing films. This review summarizes recent advances in the synthesis, assembly, and applications of 3D architectured graphene/metal oxide hybrids for gas sensing. PMID:29735951
Mozaffari, Brian
2014-01-01
Based on the notion that the brain is equipped with a hierarchical organization, which embodies environmental contingencies across many time scales, this paper suggests that the medial temporal lobe (MTL), located deep in the hierarchy, serves as a bridge connecting supra- to infra-MTL levels. Bridging the upper and lower regions of the hierarchy provides a parallel architecture that optimizes information flow between upper and lower regions to aid attention, encoding, and the processing of quick, complex visual phenomena. Bypassing intermediate hierarchy levels, information conveyed through the MTL "bridge" allows upper levels to make educated predictions about the prevailing context and accordingly select lower representations to increase the efficiency of predictive coding throughout the hierarchy. This selection or activation/deactivation is associated with endogenous attention. In the event that these "bridge" predictions are inaccurate, this architecture enables the rapid encoding of novel contingencies. A review of hierarchical models in relation to memory is provided along with a new theory, Medial-temporal-lobe Conduit for Parallel Connectivity (MCPC). In this scheme, consolidation is considered as a secondary process, occurring after an MTL-bridged connection, which eventually allows upper and lower levels to access each other directly. With repeated reactivations, as contingencies become consolidated, less MTL activity is predicted. Finally, MTL bridging may aid the processing of transient but structured perceptual events, by allowing communication between upper and lower levels without calling on intermediate levels of representation.
The MGS Avionics System Architecture: Exploring the Limits of Inheritance
NASA Technical Reports Server (NTRS)
Bunker, R.
1994-01-01
Mars Global Surveyor (MGS) avionics system architecture comprises much of the electronics on board the spacecraft: electrical power, attitude and articulation control, command and data handling, telecommunications, and flight software. Schedule and cost constraints dictated a mix of new and inherited designs, especially hardware upgrades based on findings of the Mars Observer failure review boards.
ERIC Educational Resources Information Center
Tucker, Richard; Choy, Darryl Low; Heyes, Scott; Revell, Grant; Jones, David
2018-01-01
This paper reviews the current status and focus of Australian Architecture programs with respect to Indigenous Knowledge and the extent to which these tertiary programs currently address reconciliation and respect to Indigenous Australians in relation to their professional institutions and accreditation policies. The paper draws upon the findings…
Advanced information processing system: Input/output network management software
NASA Technical Reports Server (NTRS)
Nagle, Gail; Alger, Linda; Kemp, Alexander
1988-01-01
The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.
An open architecture for medical image workstation
NASA Astrophysics Data System (ADS)
Liang, Liang; Hu, Zhiqiang; Wang, Xiangyun
2005-04-01
Dealing with the difficulties of integrating various medical image viewing and processing technologies with a variety of clinical and departmental information systems and, at the same time, overcoming the performance constraints in transferring and processing large-scale and ever-increasing image data in the healthcare enterprise, we design and implement a flexible, usable and high-performance architecture for medical image workstations. This architecture is not developed for radiology only, but for any workstations in any application environments that may need medical image retrieving, viewing, and post-processing. This architecture contains an infrastructure named Memory PACS and different kinds of image applications built on it. The Memory PACS is in charge of image data caching, pre-fetching and management. It provides image applications with high-speed image data access and very reliable DICOM network I/O. In dealing with the image applications, we use dynamic component technology to separate the performance-constrained modules from the flexibility-constrained modules so that different image viewing or processing technologies can be developed and maintained independently. We also develop a weakly coupled collaboration service, through which these image applications can communicate with each other or with third-party applications. We applied this architecture in developing our product line and it works well. In our clinical sites, this architecture is applied not only in the Radiology Department, but also in Ultrasonic, Surgery, Clinics, and the Consultation Center. Given that each department concerned has its particular requirements and business routines, along with the fact that they all have different image processing technologies and image display devices, our workstations are still able to maintain high performance and high usability.
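The caching/pre-fetching role of an infrastructure like Memory PACS can be sketched as follows; this is a minimal illustration with assumed names, not the product's code: a request for one image is served from an LRU cache while the remaining images of the same study are queued for background prefetch.

```python
# Minimal sketch (assumed names, not the product's code) of the caching idea:
# serve the requested image immediately and queue the rest of the study
# for background prefetch.
from collections import OrderedDict


class ImageCache:
    def __init__(self, capacity, fetch_fn):
        self.capacity = capacity
        self.fetch_fn = fetch_fn          # e.g. a DICOM retrieve call
        self.store = OrderedDict()        # LRU: oldest entries evicted first
        self.prefetch_queue = []

    def get(self, image_id, study_images=()):
        if image_id not in self.store:
            self.store[image_id] = self.fetch_fn(image_id)
        self.store.move_to_end(image_id)
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)
        # Queue sibling images of the same study for background prefetch.
        self.prefetch_queue.extend(i for i in study_images
                                   if i != image_id and i not in self.store)
        return self.store[image_id]


cache = ImageCache(capacity=100, fetch_fn=lambda i: f"<pixels of {i}>")
print(cache.get("CT1/slice_001", study_images=[f"CT1/slice_{n:03d}" for n in range(1, 6)]))
print(cache.prefetch_queue[:3])
```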
(abstract) A High Throughput 3-D Inner Product Processor
NASA Technical Reports Server (NTRS)
Daud, Tuan
1996-01-01
A particularly challenging image processing application is real-time scene acquisition and object discrimination. It requires spatio-temporal recognition of point and resolved objects at high speeds with parallel processing algorithms. Neural network paradigms provide fine-grain parallelism and, when implemented in hardware, offer orders-of-magnitude speedup. However, neural networks implemented on a VLSI chip are planar architectures capable of efficient processing of linear vector signals rather than 2-D images. Therefore, for processing of images, a 3-D stack of neural-net ICs receiving planar inputs and consuming minimal power is required. Details of the circuits and chip architectures will be described, along with the need to develop ultralow-power electronics. Further, use of the architecture in a system for high-speed processing will be illustrated.
van Maanen, Leendert; van Rijn, Hedderik; Taatgen, Niels
2012-01-01
This article discusses how sequential sampling models can be integrated in a cognitive architecture. The new theory Retrieval by Accumulating Evidence in an Architecture (RACE/A) combines the level of detail typically provided by sequential sampling models with the level of task complexity typically provided by cognitive architectures. We will use RACE/A to model data from two variants of a picture-word interference task in a psychological refractory period design. These models will demonstrate how RACE/A enables interactions between sequential sampling and long-term declarative learning, and between sequential sampling and task control. In a traditional sequential sampling model, the onset of the process within the task is unclear, as is the number of sampling processes. RACE/A provides a theoretical basis for estimating the onset of sequential sampling processes during task execution and allows for easy modeling of multiple sequential sampling processes within a task. Copyright © 2011 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Baroroh, D. K.; Alfiah, D.
2018-05-01
The electric vehicle is one innovation for reducing vehicle pollution. Nevertheless, it still poses a problem, especially at the disposal stage. To support a product design and development strategy based on sustainable design and to address the disposal stage, an assessment of the modularity architecture of an electric vehicle in the recovery process needs to be performed. This research used a Design Structure Matrix (DSM) approach to determine the interactions between components and assessed the modularity architecture by calculating three variables, namely Module Independence (MI), Module Similarity (MS), and Modularity for End of Life Stage (MEOL). The results show that the existing electric vehicle design has an architecture with high modularity for the recovery process at the disposal stage. Accordingly, components or modules can be reused and recycled without a disassembly process, supporting an environmentally friendly product (sustainable design) and reducing disassembly cost.
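A binary Design Structure Matrix and a simple module-independence ratio can be sketched as below. The abstract does not give the exact MI/MS/MEOL formulas, so the metric shown here is only an assumed simplification (share of a module's interactions that stay inside the module), with made-up components.

```python
# Sketch of a binary Design Structure Matrix (DSM) and a simplified
# module-independence ratio; the paper's exact MI/MS/MEOL formulas are not
# given in the abstract, so this metric is an assumed illustration.
import numpy as np

components = ["battery", "motor", "controller", "frame", "charger"]
modules = {"powertrain": [0, 1, 2], "structure": [3], "charging": [4]}

# dsm[i, j] = 1 if component i interacts with component j (made-up values).
dsm = np.array([
    [0, 1, 1, 0, 1],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
])


def independence(dsm, module_idx):
    inside = dsm[np.ix_(module_idx, module_idx)].sum()   # interactions kept inside the module
    total = dsm[module_idx, :].sum()                     # all interactions of the module
    return inside / total if total else 1.0


for name, idx in modules.items():
    print(name, round(independence(dsm, idx), 2))
```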
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
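The blackboard-and-plugin communication pattern that the CSP model formalises can be rendered as a toy Python sketch, with thread-safe queues standing in for CSP channels; this is neither the actual CSP specification nor Cougaar code, and all names are illustrative.

```python
# Toy rendering (not the actual CSP model or Cougaar code) of the pattern the
# framework formalises: each plugin talks to the blackboard over its own
# channel, approximated here by thread-safe queues.
import queue
import threading


class Blackboard:
    def __init__(self):
        self.channels = {}                 # plugin name -> queue (one "channel" each)
        self.facts = []

    def register(self, name):
        self.channels[name] = queue.Queue()
        return self.channels[name]

    def publish(self, fact):
        self.facts.append(fact)
        for ch in self.channels.values():  # broadcast to every registered plugin
            ch.put(fact)


def plugin(name, channel, seen):
    fact = channel.get(timeout=1)          # blocks like a CSP receive
    seen.append((name, fact))


bb, seen = Blackboard(), []
chans = {n: bb.register(n) for n in ("planner", "logger")}
threads = [threading.Thread(target=plugin, args=(n, c, seen)) for n, c in chans.items()]
for t in threads:
    t.start()
bb.publish("task: move cargo")
for t in threads:
    t.join()
print(seen)
```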
Information Technology Architectures. New Opportunities for Partnering, CAUSE94. Track VI.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Eight papers are presented from the 1994 CAUSE conference track on information technology architectures as applied to higher education institutions. The papers include: (1) "Reshaping the Enterprise: Building the Next Generation of Information Systems Through Information Architecture and Processing Reengineering," which notes…
DOT National Transportation Integrated Search
1999-09-01
This is one of seven studies exploring processes for developing Intelligent Transportation Systems (ITS) architectures for regional, statewide, or commercial vehicle applications. This study was prepared for a broad-based, non-technical audience. In ...
Sequoia: A fault-tolerant tightly coupled multiprocessor for transaction processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernstein, P.A.
1988-02-01
The Sequoia computer is a tightly coupled multiprocessor, and thus attains the performance advantages of this style of architecture. It avoids most of the fault-tolerance disadvantages of tight coupling by using a new fault-tolerance design. The Sequoia architecture is similar to other multimicroprocessor architectures, such as those of Encore and Sequent, in that it gives dozens of microprocessors shared access to a large main memory. It resembles the Stratus architecture in its extensive use of hardware fault-detection techniques. It resembles Stratus and Auragen in its ability to quickly recover all processes after a single point failure, transparently to the user. However, Sequoia is unique in its combination of a large-scale tightly coupled architecture with a hardware approach to fault tolerance. This article gives an overview of how the hardware architecture and operating systems (OS) work together to provide a high degree of fault tolerance with good system performance.
CDC WONDER: a cooperative processing architecture for public health.
Friede, A; Rosen, D H; Reid, J A
1994-01-01
CDC WONDER is an information management architecture designed for public health. It provides access to information and communications without the user's needing to know the location of data or communication pathways and mechanisms. CDC WONDER users have access to extractions from some 40 databases; electronic mail (e-mail); and surveillance data processing. System components include the Remote Client, the Communications Server, the Queue Managers, and Data Servers and Process Servers. The Remote Client software resides in the user's machine; other components are at the Centers for Disease Control and Prevention (CDC). The Remote Client, the Communications Server, and the Applications Server provide access to the information and functions in the Data Servers and Process Servers. The system architecture is based on cooperative processing, and components are coupled via pure message passing, using several protocols. This architecture allows flexibility in the choice of hardware and software. One system limitation is that final results from some subsystems are obtained slowly. Although designed for public health, CDC WONDER could be useful for other disciplines that need flexible, integrated information exchange. PMID:7719813
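The cooperative-processing style described above, where a remote client only composes a message and a communications server routes it to the appropriate data or process server, can be illustrated with the minimal sketch below; the service names and message fields are assumptions, not the real CDC WONDER protocol.

```python
# Illustrative sketch of cooperative processing via pure message passing:
# the client never knows where the data lives, only how to compose a message.
# Service names and fields are assumptions, not the real CDC WONDER protocol.
def mortality_data_server(msg):
    return {"rows": [("1990", 12345)], "query": msg["query"]}


def mail_server(msg):
    return {"status": "queued", "to": msg["to"]}


ROUTES = {"data.mortality": mortality_data_server, "mail.send": mail_server}


def communications_server(message):
    # Routes the message to the server that can handle it.
    handler = ROUTES.get(message["service"])
    if handler is None:
        return {"error": f"unknown service {message['service']}"}
    return handler(message)


# Remote client side: compose a message and hand it off.
reply = communications_server({"service": "data.mortality",
                               "query": "deaths by year, 1990"})
print(reply)
```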
Exploiting multicore compute resources in the CMS experiment
NASA Astrophysics Data System (ADS)
Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration
2016-10-01
CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.
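A simplified view of how a multicore pilot packs single-core and multicore payloads into the core slot it owns is sketched below; this is an illustration of the scheduling idea only, not the experiment's actual provisioning code, and the payload names are invented.

```python
# Simplified sketch (not the experiment's provisioning code) of a pilot that
# owns an 8-core slot and packs single-core and multicore payloads into it.
def schedule(payloads, total_cores=8):
    """payloads: list of (name, cores_needed); returns (running, still_queued)."""
    free, running, queued = total_cores, [], []
    for name, cores in sorted(payloads, key=lambda p: -p[1]):  # place big jobs first
        if cores <= free:
            running.append((name, cores))
            free -= cores
        else:
            queued.append((name, cores))
    return running, queued


running, queued = schedule([("reco_multithread", 4), ("gen_sim", 1),
                            ("gen_sim", 1), ("reco_multithread", 4)])
print("running:", running)   # the two 4-core payloads fill the 8-core slot
print("queued: ", queued)    # single-core jobs wait until cores free up
```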
Yang, Zhiwei; Gou, Lu; Chen, Shuyu; Li, Na; Zhang, Shengli; Zhang, Lei
2017-01-01
Membrane fusion is one of the most fundamental physiological processes in eukaryotes, triggering lipid and content mixing as well as neurotransmission. However, the architectural features of the neurotransmitter release machinery and the interdependent mechanisms of synaptic membrane fusion have not been extensively studied. This review article expounds the neuronal membrane fusion processes, discusses the fundamental steps in all fusion reactions (membrane aggregation, membrane association, lipid rearrangement, and lipid and content mixing) and the probable mechanisms coupled to the delivery of neurotransmitters. Subsequently, this work summarizes the research on the fusion process in synaptic transmission, using electron microscopy (EM) and molecular simulation approaches. Finally, we propose the future outlook for more exciting applications of membrane fusion involved in synaptic transmission, with the aid of stochastic optical reconstruction microscopy (STORM), cryo-electron microscopy (cryo-EM), and molecular simulations. PMID:28638320
Advanced computer architecture specification for automated weld systems
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
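The general shape of such an analysis, a feasibility filter on availability/integrity followed by a weighted ranking on power, weight and cost, can be sketched as below. All numbers and weights are made up for illustration; this is not the report's actual data or scoring method.

```python
# Minimal weighted-sum sketch of the trade described above (all numbers and
# weights are made up): filter out infeasible architectures, then rank the
# rest on normalised power/weight/cost criteria.
alternatives = {
    #            availability, integrity_ok, power_W, weight_kg, cost_k$
    "dual_lane":  (0.99995, True,  120, 14, 900),
    "triplex":    (0.99999, True,  180, 20, 1300),
    "single_box": (0.9990,  True,   60,  8, 400),
}
weights = {"power": 0.3, "weight": 0.3, "cost": 0.4}   # relative importance


def score(alts, min_availability=0.9999):
    feasible = {k: v for k, v in alts.items() if v[0] >= min_availability and v[1]}
    lo_p, hi_p = min(v[2] for v in feasible.values()), max(v[2] for v in feasible.values())
    lo_w, hi_w = min(v[3] for v in feasible.values()), max(v[3] for v in feasible.values())
    lo_c, hi_c = min(v[4] for v in feasible.values()), max(v[4] for v in feasible.values())

    def norm(x, lo, hi):                  # 1.0 = best (lowest), 0.0 = worst
        return 1.0 if hi == lo else (hi - x) / (hi - lo)

    return {k: weights["power"] * norm(v[2], lo_p, hi_p)
               + weights["weight"] * norm(v[3], lo_w, hi_w)
               + weights["cost"] * norm(v[4], lo_c, hi_c)
            for k, v in feasible.items()}


print(score(alternatives))   # "single_box" is excluded by the availability floor
```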
Architectural design of heterogeneous metallic nanocrystals--principles and processes.
Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang
2014-12-16
CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design variations, we introduce the fabrication processes for each ADE, which enable shape, size, and location control of the unit NCs in a particular HMNC design. The principles of these processes are discussed and illustrated with examples. We then discuss how these processes may be integrated into a common reaction system while retaining the independence of individual processes. The principles for the independent control of each ADE are discussed in detail to lay the foundation for the selection of the chemical reaction system and its operating space.
Structural analysis of hierarchically organized zeolites
Mitchell, Sharon; Pinar, Ana B.; Kenvin, Jeffrey; Crivelli, Paolo; Kärger, Jörg; Pérez-Ramírez, Javier
2015-01-01
Advances in materials synthesis bring about many opportunities for technological applications, but are often accompanied by unprecedented complexity. This is clearly illustrated by the case of hierarchically organized zeolite catalysts, a class of crystalline microporous solids that has been revolutionized by the engineering of multilevel pore architectures, which combine unique chemical functionality with efficient molecular transport. Three key attributes, the crystal, the pore and the active site structure, can be expected to dominate the design process. This review examines the adequacy of the palette of techniques applied to characterize these distinguishing features and their catalytic impact. PMID:26482337
Dunlop, R; Arbona, A; Rajasekaran, H; Lo Iacono, L; Fingberg, J; Summers, P; Benkner, S; Engelbrecht, G; Chiarini, A; Friedrich, C M; Moore, B; Bijlenga, P; Iavindrasana, J; Hose, R D; Frangi, A F
2008-01-01
This paper presents an overview of computerised decision support for clinical practice. The concept of computer-interpretable guidelines is introduced in the context of the @neurIST project, which aims at supporting the research and treatment of asymptomatic unruptured cerebral aneurysms by bringing together heterogeneous data, computing and complex processing services. The architecture is generic enough to adapt it to the treatment of other diseases beyond cerebral aneurysms. The paper reviews the generic requirements of the @neurIST system and presents the innovative work in distributing executable clinical guidelines.
Electrospun Fibrous Scaffolds for Tissue Engineering: Viewpoints on Architecture and Fabrication.
Jun, Indong; Han, Hyung-Seop; Edwards, James R; Jeon, Hojeong
2018-03-06
Electrospinning has been used for the fabrication of extracellular matrix (ECM)-mimicking fibrous scaffolds for several decades. Electrospun fibrous scaffolds provide nanoscale/microscale fibrous structures with interconnecting pores, resembling natural ECM in tissues, and showing a high potential to facilitate the formation of artificial functional tissues. In this review, we summarize the fundamental principles of electrospinning processes for generating complex fibrous scaffold geometries that are similar in structural complexity to the ECM of living tissues. Moreover, several approaches for the formation of three-dimensional fibrous scaffolds arranged in hierarchical structures for tissue engineering are also presented.
A review of the promises and challenges of micro-concentrator photovoltaics
NASA Astrophysics Data System (ADS)
Domínguez, César; Jost, Norman; Askins, Steve; Victoria, Marta; Antón, Ignacio
2017-09-01
Micro-concentrator photovoltaics (micro-CPV) is an unconventional approach for developing high-efficiency, low-cost PV systems. Miniaturizing the cells and optics brings about an increase in efficiency with respect to classical CPV, at the expense of some fundamental challenges in mass production. The large costs linked to miniaturization under conventional serial-assembly processes raise the need for the development of parallel manufacturing technologies. In return, the tiny sizes involved allow the exploration of unconventional optical architectures, or the revisiting of conventional concepts that were typically discarded because of large material consumption or high bulk absorption at classical CPV sizes.
NASA Astrophysics Data System (ADS)
Belov, G. V.; Dyachkov, S. A.; Levashov, P. R.; Lomonosov, I. V.; Minakov, D. V.; Morozov, I. V.; Sineva, M. A.; Smirnov, V. N.
2018-01-01
The database structure, main features and user interface of the IVTANTHERMO-Online system are reviewed. This system continues the series of IVTANTHERMO packages developed at JIHT RAS. It includes the database for thermodynamic properties of individual substances and related software for the analysis of experimental results, data fitting, and the calculation and estimation of thermodynamic functions and thermochemical quantities. In contrast to the previous IVTANTHERMO versions, it has a new extensible database design, a client-server architecture, and a user-friendly web interface with a number of new features for online and offline data processing.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
The FASTK family of proteins: emerging regulators of mitochondrial RNA biology
Jourdain, Alexis A.; Popow, Johannes; de la Fuente, Miguel A.; Martinou, Jean-Claude
2017-01-01
Abstract The FASTK family proteins have recently emerged as key post-transcriptional regulators of mitochondrial gene expression. FASTK, the founding member and its homologs FASTKD1–5 are architecturally related RNA-binding proteins, each having a different function in the regulation of mitochondrial RNA biology, from mRNA processing and maturation to ribosome assembly and translation. In this review, we outline the structure, evolution and function of these FASTK proteins and discuss the individual role that each has in mitochondrial RNA biology. In addition, we highlight the aspects of FASTK research that still require more attention. PMID:29036396
Multichannel Baseband Processor for Wideband CDMA
NASA Astrophysics Data System (ADS)
Jalloul, Louay M. A.; Lin, Jim
2005-12-01
The system architecture of the cellular base station modem engine (CBME) is described. The CBME is a single-chip multichannel transceiver capable of processing and demodulating signals from multiple users simultaneously. It is optimized to process different classes of code-division multiple-access (CDMA) signals. The paper will show that through key functional system partitioning, tightly coupled small digital signal processing cores, and time-sliced reuse architecture, CBME is able to achieve a high degree of algorithmic flexibility while maintaining efficiency. The paper will also highlight the implementation and verification aspects of the CBME chip design. In this paper, wideband CDMA is used as an example to demonstrate the architecture concept.
Use of Daylight and Aesthetic Image of Glass Facades in Contemporary Buildings
NASA Astrophysics Data System (ADS)
Roginska-Niesluchowska, Malgorzata
2017-10-01
The paper deals with the architecture of contemporary buildings with respect to their aesthetic image created by the use of natural light. Sustainability is regarded as a governing principle of contemporary architecture, where daylighting is an important factor as it affects energy consumption and the environmental quality of the space inside a building. Environmental awareness of architecture, however, involves a much wider and more holistic view of design. The quality of sustainable architecture can be considered in its aesthetic and cultural context with regard to landscape, local tradition, and connection to the surrounding world. This approach is associated with the social mission of architecture, i.e. providing appropriate space for living, facilitating social relations and having a positive impact on people. The purpose of the research is to study the use of daylight in creating an aesthetic image of contemporary buildings. The author focuses mainly on public buildings largely dedicated to art and culture which satisfy high functional and aesthetic requirements. The paper examines the genesis and current trends in the aesthetic image of modern buildings which use daylight as the main design strategy, focusing on the issues of glass facades. The main attention is given to the shaping of representative public areas which feature glass facades. The research has been based on a case study, a critical literature review, observation and synthesis. The study identifies and classifies different approaches to using daylight in these areas and highlights changes in the aesthetics of architecture made of glass, which uses daylight as the main design strategy. These changes are primarily caused by the development and spread of new glazing materials and the use of digital design methods. The influence of light and its mode depends on the glass materials but also on the local conditions of the site, and has a significant impact on the relationship between architecture and its natural and cultural environment. The subordination of the architectural concept to the idea of natural lighting builds the relationship between form, function and the context of architecture, and is expressed in its structural, material and spatial properties, and in the resulting aesthetic order. The search for new architectural solutions is defined by local topographical, climatic, biological and cultural conditions. Architecture subordinated to the concept of daylight contribution corresponds to the aesthetic aspirations of sustainability.
Design and Analysis of a Neuromemristive Reservoir Computing Architecture for Biosignal Processing
Kudithipudi, Dhireesha; Saleh, Qutaiba; Merkel, Cory; Thesing, James; Wysocki, Bryant
2016-01-01
Reservoir computing (RC) is gaining traction in several signal processing domains, owing to its non-linear stateful computation, spatiotemporal encoding, and reduced training complexity over recurrent neural networks (RNNs). Previous studies have shown the effectiveness of software-based RCs for a wide spectrum of applications. A parallel body of work indicates that realizing RNN architectures using custom integrated circuits and reconfigurable hardware platforms yields significant improvements in power and latency. In this research, we propose a neuromemristive RC architecture, with a doubly twisted toroidal structure, that is validated for biosignal processing applications. We exploit device mismatch to implement the random weight distributions within the reservoir and propose mixed-signal subthreshold circuits for energy efficiency. A comprehensive analysis is performed to compare the efficiency of the neuromemristive RC architecture in both digital (reconfigurable) and subthreshold mixed-signal realizations. Both electroencephalogram (EEG) and electromyogram (EMG) biosignal benchmarks are used for validating the RC designs. The proposed RC architecture demonstrated accuracies of 90% and 84% for epileptic seizure detection and EMG prosthetic finger control, respectively. PMID:26869876
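The reservoir computing scheme summarized above can be sketched in software as an echo-state-style network with a fixed random reservoir and a trained linear readout; the sizes, toy signal, and ridge-regression readout below are illustrative assumptions, not the neuromemristive circuit described in the paper:

    # Minimal echo-state-style reservoir: fixed random recurrent weights,
    # only the linear readout is trained (here by ridge regression).
    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_res, T = 1, 100, 500

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # scale spectral radius below 1

    u = np.sin(0.1 * np.arange(T)).reshape(-1, 1)    # toy input signal
    y_target = np.roll(u, -1, axis=0)                # predict the next sample

    # Drive the reservoir and collect its states.
    x = np.zeros(n_res)
    states = np.zeros((T, n_res))
    for t in range(T):
        x = np.tanh(W_in @ u[t] + W @ x)
        states[t] = x

    # Train only the readout (ridge regression), as in reservoir computing.
    lam = 1e-4
    W_out = np.linalg.solve(states.T @ states + lam * np.eye(n_res),
                            states.T @ y_target)
    y_pred = states @ W_out
    print("readout MSE:", float(np.mean((y_pred - y_target) ** 2)))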
Using Multiple FPGA Architectures for Real-time Processing of Low-level Machine Vision Functions
Thomas H. Drayer; William E. King; Philip A. Araman; Joseph G. Tront; Richard W. Conners
1995-01-01
In this paper, we investigate the use of multiple Field Programmable Gate Array (FPGA) architectures for real-time machine vision processing. The use of FPGAs for low-level processing represents an excellent tradeoff between software and special purpose hardware implementations. A library of modules that implement common low-level machine vision operations is presented...
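For context, the kind of low-level operation such a module library targets is a small-kernel neighborhood filter; the following software sketch (illustrative kernel and image, not the authors' FPGA modules) shows the computation each hardware module would pipeline:

    # Software reference for a typical low-level vision module: 3x3 convolution.
    # An FPGA module would stream pixels through this arithmetic in a pipeline.
    import numpy as np

    def convolve3x3(image, kernel):
        h, w = image.shape
        out = np.zeros_like(image, dtype=float)
        for r in range(1, h - 1):
            for c in range(1, w - 1):
                out[r, c] = np.sum(image[r-1:r+2, c-1:c+2] * kernel)
        return out

    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    img = np.random.default_rng(2).integers(0, 256, (64, 64)).astype(float)
    edges = convolve3x3(img, sobel_x)
    print(edges.shape)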
On the Design of a Comprehensive Authorisation Framework for Service Oriented Architecture (SOA)
2013-07-01
Glossary excerpt: …, Authentication Server; AZM, Authorisation Manager; AZS, Authorisation Server; BP, Business Process; BPAA, Business Process Authorisation Architecture; BPAD, Business …; …, Internet Protocol Security; JAAS, Java Authentication and Authorisation Service; MAC, Mandatory Access Control; RBAC, Role Based Access Control; RCA, Regional … Abstract fragment: "… the authentication process, make authorisation decisions using application-specific access control functions that results in the practice of …"
Advanced digital SAR processing study
NASA Technical Reports Server (NTRS)
Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.
1982-01-01
A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated circuit, digital architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs and the results of a supporting survey of integrated circuit and digital architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
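The FFT convolver used for range and azimuth compression can be illustrated in a few lines: matched filtering is performed as multiplication in the frequency domain. The chirp parameters and target position below are placeholders, not the ADSP design values:

    # Sketch of FFT-based pulse compression (matched filtering via FFT),
    # the same fast-convolution principle the ADSP applies in range and azimuth.
    import numpy as np

    N = 1024
    t = np.arange(N) / N
    chirp = np.exp(1j * np.pi * 200 * t**2)          # illustrative linear FM pulse

    echo = np.zeros(N, dtype=complex)
    echo[100:100 + N // 4] = chirp[:N // 4]          # toy target return at sample 100

    # Matched filter: multiply spectra, then inverse transform.
    ref = np.conj(np.fft.fft(chirp[:N // 4], N))
    compressed = np.fft.ifft(np.fft.fft(echo, N) * ref)
    print("peak at sample", int(np.argmax(np.abs(compressed))))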
Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics
Sadler, Brian M; Hoyos, Sebastian
2014-01-01
The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we look at trends in the combination of analog and digital (mixed-signal) processing, and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control. PMID:26601042
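A software analogue of the parallel analog basis expansion described above is to project a frame of the input onto a small bank of basis functions in parallel branches and reconstruct from the coefficients; the Fourier-like basis and frame length here are assumptions for illustration, not the authors' sampling hardware:

    # Sketch: frame-based basis expansion with parallel branches (one inner
    # product per basis function), then reconstruction from the coefficients.
    import numpy as np

    N, K = 128, 16                      # frame length, number of parallel branches
    n = np.arange(N)
    # Illustrative orthonormal basis: first K real Fourier-like vectors.
    basis = np.array([np.cos(2 * np.pi * k * n / N) for k in range(K)])
    basis /= np.linalg.norm(basis, axis=1, keepdims=True)

    x = np.cos(2 * np.pi * 3 * n / N) + 0.5 * np.cos(2 * np.pi * 7 * n / N)

    coeffs = basis @ x                  # K parallel correlators, one per branch
    x_hat = basis.T @ coeffs            # reconstruction from branch outputs
    print("reconstruction error:", float(np.linalg.norm(x - x_hat)))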
Energy Programming for Buildings.
ERIC Educational Resources Information Center
Parshall, Steven; Diserens, Steven
1982-01-01
Programming is described as a process leading to an explicit statement of an architectural problem. The programming phase is seen as the most critical period in the delivery process in which energy analysis can have an impact on design. A programming method appropriate for standard architectural practice is provided. (MLW)
Using the CoRE Requirements Method with ADARTS. Version 01.00.05
1994-03-01
… requirements; combining ADARTS processes and objects derived from CoRE requirements into an ADARTS software architecture design; and taking advantage of … CoRE's precision in the ADARTS process structuring, class structuring, and software architecture design activities. Object-oriented requirements and …
Active semi-supervised learning method with hybrid deep belief networks.
Zhou, Shusen; Chen, Qingcai; Wang, Xiaolong
2014-01-01
In this paper, we develop a novel semi-supervised learning algorithm called active hybrid deep belief networks (AHD), to address the semi-supervised sentiment classification problem with deep learning. First, we construct the first several hidden layers using restricted Boltzmann machines (RBM), which can quickly reduce the dimension of the reviews and abstract their information. Second, we construct the subsequent hidden layers using convolutional restricted Boltzmann machines (CRBM), which can abstract the information of reviews effectively. Third, the constructed deep architecture is fine-tuned by gradient-descent-based supervised learning with an exponential loss function. Finally, an active learning method is combined with the proposed deep architecture. We ran several experiments on five sentiment classification datasets, and show that AHD is competitive with previous semi-supervised learning algorithms. Experiments are also conducted to verify the effectiveness of the proposed method with different numbers of labeled and unlabeled reviews.
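To make the building block concrete, here is a minimal restricted Boltzmann machine trained with one step of contrastive divergence (CD-1); the layer sizes, binary toy data, and learning rate are illustrative, and the convolutional RBM and exponential-loss fine-tuning stages of AHD are not reproduced:

    # Minimal binary RBM with CD-1 updates, the kind of layer stacked in a DBN.
    import numpy as np

    rng = np.random.default_rng(3)
    n_visible, n_hidden, lr = 20, 8, 0.1

    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)
    b_h = np.zeros(n_hidden)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    data = (rng.random((100, n_visible)) > 0.5).astype(float)   # toy binary data

    for epoch in range(10):
        for v0 in data:
            # Positive phase: sample hidden units given the data.
            p_h0 = sigmoid(v0 @ W + b_h)
            h0 = (rng.random(n_hidden) < p_h0).astype(float)
            # Negative phase: one Gibbs step down to the visible layer and back up.
            p_v1 = sigmoid(h0 @ W.T + b_v)
            p_h1 = sigmoid(p_v1 @ W + b_h)
            # CD-1 gradient approximation.
            W += lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
            b_v += lr * (v0 - p_v1)
            b_h += lr * (p_h0 - p_h1)

    print("learned weight matrix shape:", W.shape)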
NASA Technical Reports Server (NTRS)
Ticker, Ronald L.; Azzolini, John D.
2000-01-01
The study investigates NASA's Earth Science Enterprise needs for Distributed Spacecraft Technologies in the 2010-2025 timeframe. In particular, the study focused on the Earth Science Vision Initiative and extrapolation of the measurement architecture from the 2002-2010 time period. Earth Science Enterprise documents were reviewed. Interviews were conducted with a number of Earth scientists and technologists. Fundamental principles of formation flying were also explored. The results led to the development of four notional distributed spacecraft architectures. These four notional architectures (global constellations, virtual platforms, precision formation flying, and sensorwebs) are presented. They broadly and generically cover the distributed spacecraft architectures needed by Earth Science in the post-2010 era. These notional architectures are used to identify technology needs and drivers. Technology needs are subsequently grouped into five categories: systems and architecture development tools; miniaturization, production, manufacture, test and calibration; data networks and information management; orbit control, planning and operations; and launch and deployment. The current state of the art and expected developments are explored. High-value technology areas are identified for possible future funding emphasis.
NASA Technical Reports Server (NTRS)
Creech, Steve; Sumrall, Phil; Cockrell, Charles E., Jr.; Burris, Mike
2009-01-01
As part of NASA's Constellation Program to resume exploration beyond low Earth orbit (LEO), the Ares V heavy-lift cargo launch vehicle as currently conceived will be able to send more crew and cargo to more places on the Moon than the Apollo Program Saturn V. (Figure 1) It also has unprecedented cargo mass and volume capabilities that will be a national asset for science, commerce, and national defense applications. Compared to current systems, it will offer approximately five times the mass and volume to most orbits and locations. The Columbia space shuttle accident, the resulting investigation, the Vision for Space Exploration, and the Exploration Systems Architecture Study (ESAS) broadly shaped the Constellation architecture. Out of those events and initiatives emerged an architecture intended to replace the space shuttle, complete the International Space Station (ISS), and resume a much more ambitious plan to explore the Moon as a stepping stone to other destinations in the solar system. The Ares I was NASA's main priority because of the goal to retire the Shuttle. Ares V remains in a concept development phase, evolving through hundreds of configurations. The current reference design was approved during the Lunar Capabilities Concept Review/Ares V Mission Concept Review (LCCR/MCR) in June 2008. This reference concept serves as a starting point for a renewed set of design trades and detailed analysis into its interaction with the other components of the Constellation architecture and existing launch infrastructure. In 2009, the Ares V team was heavily involved in supporting the Review of U.S. Human Space Flight Plans Committee. Several alternative designs for Ares V have been supplied to the committee. This paper will discuss the origins of the Ares V design, the evolution to the current reference configuration, and the options provided to the review committee.
Towards Implementation of a Generalized Architecture for High-Level Quantum Programming Language
NASA Astrophysics Data System (ADS)
Ameen, El-Mahdy M.; Ali, Hesham A.; Salem, Mofreh M.; Badawy, Mahmoud
2017-08-01
This paper investigates a novel architecture for the problem of quantum computer programming. A generalized architecture for a high-level quantum programming language has been proposed. Therefore, the programming evolution from complicated quantum-based programming to high-level, quantum-independent programming will be achieved. The proposed architecture receives the high-level source code and automatically transforms it into the equivalent quantum representation. This architecture involves two layers, the programmer layer and the compilation layer. These layers have been implemented in three main stages: pre-classification, classification, and post-classification. The basic building block of each stage has been divided into subsequent phases, and each phase has been implemented to perform the required transformations from one representation to another. A verification process was carried out using a case study to investigate the ability of the compiler to perform all transformation processes. Experimental results showed that the proposed compiler achieves a correspondence correlation coefficient of approximately R ≈ 1 between outputs and targets. A clear improvement was also obtained in the time consumed by the optimization process compared to other techniques. In the online optimization process, the consumed time increases exponentially with the amount of accuracy required, whereas in the proposed offline optimization process it increases only gradually.
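As a purely schematic illustration of the staged, layer-by-layer transformation described above (the stage names are taken from the abstract, but the toy rules below are invented and bear no relation to the authors' compiler), a source string can be passed through pre-classification, classification, and post-classification functions composed into a pipeline:

    # Schematic three-stage source-to-target pipeline; the transformation rules
    # are placeholders, only the staged structure mirrors the abstract.
    def pre_classification(source: str) -> list[str]:
        # Tokenize and normalize the high-level source.
        return source.lower().split()

    def classification(tokens: list[str]) -> list[tuple[str, str]]:
        # Tag each token with a coarse category (toy rule set).
        keywords = {"if", "while", "return"}
        return [(t, "keyword" if t in keywords else "symbol") for t in tokens]

    def post_classification(tagged: list[tuple[str, str]]) -> str:
        # Emit a target-side representation from the tagged tokens.
        return "; ".join(f"{cat}:{tok}" for tok, cat in tagged)

    def compile_pipeline(source: str) -> str:
        return post_classification(classification(pre_classification(source)))

    print(compile_pipeline("if x return y"))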
Architecture for distributed design and fabrication
NASA Astrophysics Data System (ADS)
McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.
1997-01-01
We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
The new sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, there is still much research to be done to properly gear all the systems for working together. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
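One way to picture the collaborative model is a per-task offloading decision between the device tier and the cloud tier; the cost estimates, speeds, and thresholds below are hypothetical, intended only to show the shape of such a policy:

    # Sketch of a device/cloud offloading decision in an MCC-style architecture.
    # Cost figures are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        cycles: float        # estimated compute demand
        input_bytes: float   # data to upload if offloaded

    DEVICE_SPEED = 1e8       # cycles/s available locally (assumed)
    CLOUD_SPEED = 1e10       # cycles/s available remotely (assumed)
    UPLINK_BPS = 1e6         # uplink bandwidth in bits/s (assumed)

    def run_locally_is_cheaper(task: Task) -> bool:
        local_time = task.cycles / DEVICE_SPEED
        remote_time = task.input_bytes * 8 / UPLINK_BPS + task.cycles / CLOUD_SPEED
        return local_time <= remote_time

    tasks = [Task("filter", 1e6, 2e3), Task("train_model", 5e9, 1e5)]
    for t in tasks:
        place = "device" if run_locally_is_cheaper(t) else "cloud"
        print(f"{t.name}: run on {place}")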
A GaAs vector processor based on parallel RISC microprocessors
NASA Astrophysics Data System (ADS)
Misko, Tim A.; Rasset, Terry L.
A vector processor architecture based on the development of a 32-bit microprocessor using gallium arsenide (GaAs) technology has been developed. The McDonnell Douglas vector processor (MVP) will be fabricated completely from GaAs digital integrated circuits. The MVP architecture includes a vector memory of 1 megabyte, a parallel bus architecture with eight processing elements connected in parallel, and a control processor. The processing elements consist of a reduced instruction set CPU (RISC) with four floating-point coprocessor units and necessary memory interface functions. This architecture has been simulated for several benchmark programs including complex fast Fourier transform (FFT), complex inner product, trigonometric functions, and sort-merge routine. The results of this study indicate that the MVP can process a 1024-point complex FFT at a speed of 112 microsec (389 megaflops) while consuming approximately 618 W of power in a volume of approximately 0.1 ft-cubed.
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.
1988-01-01
The purpose is to document research to develop strategies for concurrent processing of complex algorithms in data driven architectures. The problem domain consists of decision-free algorithms having large-grained, computationally complex primitive operations. Such are often found in signal processing and control applications. The anticipated multiprocessor environment is a data flow architecture containing between two and twenty computing elements. Each computing element is a processor having local program memory, and which communicates with a common global data memory. A new graph theoretic model called ATAMM which establishes rules for relating a decomposed algorithm to its execution in a data flow architecture is presented. The ATAMM model is used to determine strategies to achieve optimum time performance and to develop a system diagnostic software tool. In addition, preliminary work on a new multiprocessor operating system based on the ATAMM specifications is described.
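The data-flow execution rule that ATAMM formalizes (a node fires when all of its inputs are available) can be sketched with a small graph executor; the graph, node operations, and firing loop below are a generic illustration, not the ATAMM model itself:

    # Generic data-flow firing rule: a node executes once all its inputs exist.
    # The example graph is illustrative, not an ATAMM specification.
    graph = {            # node -> list of predecessor nodes
        "a": [], "b": [],
        "c": ["a", "b"],
        "d": ["c"],
    }
    ops = {              # large-grained primitive operation per node
        "a": lambda: 2.0,
        "b": lambda: 3.0,
        "c": lambda x, y: x * y,
        "d": lambda x: x + 1.0,
    }

    tokens = {}          # node -> produced data token
    fired = set()
    while len(fired) < len(graph):
        for node, preds in graph.items():
            if node not in fired and all(p in tokens for p in preds):
                tokens[node] = ops[node](*(tokens[p] for p in preds))
                fired.add(node)
    print(tokens)        # {'a': 2.0, 'b': 3.0, 'c': 6.0, 'd': 7.0}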
Architectural programming for the workplace and the careplace.
Easter, James G
2002-01-01
Sensitive planning and architectural design will impact long-term costs and daily operations. At the same time, the quality of the total environment has a direct impact on the patient, the family and the staff. These needs should be carefully balanced with the emotions of the patient, the care partner (parent, husband, wife or guardian) and those of the clinical team (physicians, nurses and staff). This article addresses the first step in the process, the master plan, and then focuses in detail on one aspect of the architectural work referred to as architectural programming. The key to the process is selecting the best team of consultants, following the steps carefully, involving the client at every appropriate milestone along the way and asking the right questions. With this experienced team on board, following the proper steps, listening carefully to the answers and observing the daily process, one can expect a successful product.
Reversible Self-Assembly of 3D Architectures Actuated by Responsive Polymers.
Zhang, Cheng; Su, Jheng-Wun; Deng, Heng; Xie, Yunchao; Yan, Zheng; Lin, Jian
2017-11-29
An assembly of three-dimensional (3D) architectures with defined configurations has important applications in broad areas. Among various approaches of constructing 3D structures, a stress-driven assembly provides the capabilities of creating 3D architectures in a broad range of functional materials with unique merits. However, 3D architectures built via previous methods are simple, irreversible, or not free-standing. Furthermore, the substrates employed for the assembly remain flat, thus not involved as parts of the final 3D architectures. Herein, we report a reversible self-assembly of various free-standing 3D architectures actuated by the self-folding of smart polymer substrates with programmed geometries. The strategically designed polymer substrates can respond to external stimuli, such as organic solvents, to initiate the 3D assembly process and subsequently become the parts of the final 3D architectures. The self-assembly process is highly controllable via origami and kirigami designs patterned by direct laser writing. Self-assembled geometries include 3D architectures such as "flower", "rainbow", "sunglasses", "box", "pyramid", "grating", and "armchair". The reported self-assembly also shows wide applicability to various materials including epoxy, polyimide, laser-induced graphene, and metal films. The device examples include 3D architectures integrated with a micro light-emitting diode and a flex sensor, indicating the potential applications in soft robotics, bioelectronics, microelectromechanical systems, and others.
Lee, S Seirin; Tashiro, S; Awazu, A; Kobayashi, R
2017-01-01
Specific features of nuclear architecture are important for the functional organization of the nucleus, and chromatin consists of two forms, heterochromatin and euchromatin. Conventional nuclear architecture is observed when heterochromatin is enriched at nuclear periphery, and it represents the primary structure in the majority of eukaryotic cells, including the rod cells of diurnal mammals. In contrast to this, inverted nuclear architecture is observed when the heterochromatin is distributed at the center of the nucleus, which occurs in the rod cells of nocturnal mammals. The inverted architecture found in the rod cells of the adult mouse is formed through the reorganization of conventional architecture during terminal differentiation. Although a previous experimental approach has demonstrated the relationship between these two nuclear architecture types at the molecular level, the mechanisms underlying long-range reorganization processes remain unknown. The details of nuclear structures and their spatial and temporal dynamics remain to be elucidated. Therefore, a comprehensive approach, using mathematical modeling, is required, in order to address these questions. Here, we propose a new mathematical approach to the understanding of nuclear architecture dynamics using the phase-field method. We successfully recreated the process of nuclear architecture reorganization, and showed that it is robustly induced by physical features, independent of a specific genotype. Our study demonstrates the potential of phase-field method application in the life science fields.
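To indicate what a phase-field treatment looks like numerically (a generic one-dimensional Allen-Cahn relaxation, not the authors' nuclear-architecture model), an order parameter is relaxed toward one of two stable phases while an interface term keeps boundaries smooth:

    # Generic 1-D Allen-Cahn phase-field step: du/dt = eps^2 * u_xx - W'(u),
    # with the double-well W(u) = (u^2 - 1)^2 / 4, so W'(u) = u^3 - u.
    import numpy as np

    n, dx, dt, eps = 200, 0.1, 0.001, 0.3
    x = np.arange(n) * dx
    u = np.where(x < n * dx / 2, -1.0, 1.0) + 0.1 * np.random.default_rng(4).standard_normal(n)

    for step in range(2000):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2   # periodic Laplacian
        u = u + dt * (eps**2 * lap - (u**3 - u))

    # u relaxes to two plateaus near -1 and +1 separated by smooth interfaces.
    print(round(float(u.min()), 2), round(float(u.max()), 2))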
Exogenous (automatic) attention to emotional stimuli: a review.
Carretié, Luis
2014-12-01
Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the individual) would seem crucial to a comprehensive understanding of this process. This review, focusing on the visual modality, describes 55 experiments in which both emotional and neutral irrelevant distractors are presented at the same time as ongoing task targets. Qualitative and, when possible, meta-analytic descriptions of results are provided. The most conspicuous result is that, as confirmed by behavioral and/or neural indices, emotional distractors capture exogenous attention to a significantly greater extent than do neutral distractors. The modulatory effects of the nature of distractors capturing attention, of the ongoing task characteristics, and of individual differences, previously proposed as mediating factors, are also described. Additionally, the studies reviewed here provide temporal and spatial information (partially absent in traditional cognitive models) on the neural basis of preattention/evaluation, reorienting, and sensory amplification, the main subprocesses involved in exogenous attention. A model integrating these different levels of information is proposed. The present review, which reveals that there are several key issues for which experimental data are surprisingly scarce, confirms the relevance of including emotional distractors in studies on exogenous attention.
A flexible software architecture for scalable real-time image and video processing applications
NASA Astrophysics Data System (ADS)
Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.
2012-06-01
Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of messages. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
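The topic-based publish/subscribe routing described for the messaging layer can be sketched in a few lines; the topic names and handlers are invented for illustration and do not reproduce the authors' implementation:

    # Minimal topic-based publish/subscribe broker: messages published to a
    # topic are routed only to the callbacks subscribed to that topic.
    from collections import defaultdict
    from typing import Any, Callable

    class Broker:
        def __init__(self) -> None:
            self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
            self._subs[topic].append(handler)

        def publish(self, topic: str, message: Any) -> None:
            for handler in self._subs[topic]:
                handler(message)

    broker = Broker()
    broker.subscribe("frames/raw", lambda m: print("processing module got", m))
    broker.subscribe("frames/raw", lambda m: print("visualization module got", m))
    broker.subscribe("stats", lambda m: print("logger got", m))

    broker.publish("frames/raw", {"frame_id": 1})   # both frame subscribers fire
    broker.publish("stats", {"fps": 30})            # only the logger fires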
The NASA Integrated Information Technology Architecture
NASA Technical Reports Server (NTRS)
Baldridge, Tim
1997-01-01
This document defines an Information Technology Architecture for the National Aeronautics and Space Administration (NASA), where Information Technology (IT) refers to the hardware, software, standards, protocols and processes that enable the creation, manipulation, storage, organization and sharing of information. An architecture provides an itemization and definition of these IT structures, a view of the relationship of the structures to each other and, most importantly, an accessible view of the whole. It is a fundamental assumption of this document that a useful, interoperable and affordable IT environment is key to the execution of the core NASA scientific and project competencies and business practices. This Architecture represents the highest level system design and guideline for NASA IT related activities and has been created on the authority of the NASA Chief Information Officer (CIO) and will be maintained under the auspices of that office. It addresses all aspects of general purpose, research, administrative and scientific computing and networking throughout the NASA Agency and is applicable to all NASA administrative offices, projects, field centers and remote sites. Through the establishment of five Objectives and six Principles this Architecture provides a blueprint for all NASA IT service providers: civil service, contractor and outsourcer. The most significant of the Objectives and Principles are the commitment to customer-driven IT implementations and the commitment to a simpler, cost-efficient, standards-based, modular IT infrastructure. In order to ensure that the Architecture is presented and defined in the context of the mission, project and business goals of NASA, this Architecture consists of four layers in which each subsequent layer builds on the previous layer. They are: 1) the Business Architecture: the operational functions of the business, or Enterprise, 2) the Systems Architecture: the specific Enterprise activities within the context of IT systems, 3) the Technical Architecture: a common, vendor-independent framework for design, integration and implementation of IT systems and 4) the Product Architecture: vendor-specific IT solutions. The Systems Architecture is effectively a description of the end-user "requirements". Generalized end-user requirements are discussed and subsequently organized into specific mission and project functions. The Technical Architecture depicts the framework, and relationship, of the specific IT components that enable the end-user functionality as described in the Systems Architecture. The primary components as described in the Technical Architecture are: 1) Applications: Basic Client Component, Object Creation Applications, Collaborative Applications, Object Analysis Applications, 2) Services: Messaging, Information Broker, Collaboration, Distributed Processing, and 3) Infrastructure: Network, Security, Directory, Certificate Management, Enterprise Management and File System. This Architecture also provides specific Implementation Recommendations, the most significant of which is the recognition of IT as core to NASA activities and defines a plan, which is aligned with the NASA strategic planning processes, for keeping the Architecture alive and useful.
Saastamoinen, Marjo; Bocedi, Greta; Cote, Julien; Legrand, Delphine; Guillaume, Frédéric; Wheat, Christopher W; Fronhofer, Emanuel A; Garcia, Cristina; Henry, Roslyn; Husby, Arild; Baguette, Michel; Bonte, Dries; Coulon, Aurélie; Kokko, Hanna; Matthysen, Erik; Niitepõld, Kristjan; Nonaka, Etsuko; Stevens, Virginie M; Travis, Justin M J; Donohue, Kathleen; Bullock, James M; Del Mar Delgado, Maria
2018-02-01
Dispersal is a process of central importance for the ecological and evolutionary dynamics of populations and communities, because of its diverse consequences for gene flow and demography. It is subject to evolutionary change, which begs the question, what is the genetic basis of this potentially complex trait? To address this question, we (i) review the empirical literature on the genetic basis of dispersal, (ii) explore how theoretical investigations of the evolution of dispersal have represented the genetics of dispersal, and (iii) discuss how the genetic basis of dispersal influences theoretical predictions of the evolution of dispersal and potential consequences. Dispersal has a detectable genetic basis in many organisms, from bacteria to plants and animals. Generally, there is evidence for significant genetic variation for dispersal or dispersal-related phenotypes or evidence for the micro-evolution of dispersal in natural populations. Dispersal is typically the outcome of several interacting traits, and this complexity is reflected in its genetic architecture: while some genes of moderate to large effect can influence certain aspects of dispersal, dispersal traits are typically polygenic. Correlations among dispersal traits as well as between dispersal traits and other traits under selection are common, and the genetic basis of dispersal can be highly environment-dependent. By contrast, models have historically considered a highly simplified genetic architecture of dispersal. It is only recently that models have started to consider multiple loci influencing dispersal, as well as non-additive effects such as dominance and epistasis, showing that the genetic basis of dispersal can influence evolutionary rates and outcomes, especially under non-equilibrium conditions. For example, the number of loci controlling dispersal can influence projected rates of dispersal evolution during range shifts and corresponding demographic impacts. Incorporating more realism in the genetic architecture of dispersal is thus necessary to enable models to move beyond the purely theoretical towards making more useful predictions of evolutionary and ecological dynamics under current and future environmental conditions. To inform these advances, empirical studies need to answer outstanding questions concerning whether specific genes underlie dispersal variation, the genetic architecture of context-dependent dispersal phenotypes and behaviours, and correlations among dispersal and other traits. © 2017 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
Developing a New Thesaurus for Art and Architecture.
ERIC Educational Resources Information Center
Petersen, Toni
1990-01-01
This description of the development of the Art and Architecture Thesaurus from 1979 to the present explains the processes and policies that were used to construct a language designed to represent knowledge in art and architecture, as well as to be a surrogate for the image and object being described. (EAM)
Reflective Subjects in Kant and Architectural Design Education
ERIC Educational Resources Information Center
Rawes, Peg
2007-01-01
In architectural design education, students develop drawing, conceptual, and critical skills which are informed by their ability to reflect upon the production of ideas in design processes and in the urban, environmental, social, historical, and cultural context that define architecture and the built environment. Reflective actions and thinking…
ERIC Educational Resources Information Center
Kerkiri, Tania
2010-01-01
A comprehensive presentation is here made on the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…
ERIC Educational Resources Information Center
Tambouris, Efthimios; Zotou, Maria; Kalampokis, Evangelos; Tarabanis, Konstantinos
2012-01-01
Enterprise architecture (EA) implementation refers to a set of activities ultimately aiming to align business objectives with information technology infrastructure in an organization. EA implementation is a multidisciplinary, complicated and endless process, hence, calls for adequate education and training programs that will build highly skilled…
ERIC Educational Resources Information Center
Henderson, Rebecca M.; Clark, Kim B.
1990-01-01
Using an empirical study of the semiconductor photolithographic alignment equipment industry, this paper shows that architectural innovations destroy the usefulness of established firms' architectural knowledge. Because this knowledge is embedded in the firms' structure and information-processing procedures, the destruction is hard to detect.…
Rezaeibagha, Fatemeh; Win, Khin Than; Susilo, Willy
Even though many safeguards and policies for electronic health record (EHR) security have been implemented, barriers to the privacy and security protection of EHR systems persist. This article presents the results of a systematic literature review regarding frequently adopted security and privacy technical features of EHR systems. Our inclusion criteria were full articles that dealt with the security and privacy of technical implementations of EHR systems published in English in peer-reviewed journals and conference proceedings between 1998 and 2013; 55 selected studies were reviewed in detail. We analysed the review results using two International Organization for Standardization (ISO) standards (29100 and 27002) in order to consolidate the study findings. Using this process, we identified 13 features that are essential to security and privacy in EHRs. These included system and application access control, compliance with security requirements, interoperability, integration and sharing, consent and choice mechanism, policies and regulation, applicability and scalability and cryptography techniques. This review highlights the importance of technical features, including mandated access control policies and consent mechanisms, to provide patients' consent, scalability through proper architecture and frameworks, and interoperability of health information systems, to EHR security and privacy requirements.
VASP-4096: a very high performance programmable device for digital media processing applications
NASA Astrophysics Data System (ADS)
Krikelis, Argy
2001-03-01
Over the past few years, technology drivers for microprocessors have changed significantly. Media data delivery and processing--such as telecommunications, networking, video processing, speech recognition and 3D graphics--is increasing in importance and will soon dominate the processing cycles consumed in computer-based systems. This paper presents the architecture of the VASP-4096 processor. VASP-4096 provides high media performance with low energy consumption by integrating associative SIMD parallel processing with embedded microprocessor technology. The major innovation in VASP-4096 is the integration of thousands of processing units on a single chip, capable of supporting software-programmable high-performance mathematical functions as well as abstract data processing. In addition to 4096 processing units, VASP-4096 integrates on a single chip a RISC controller that is an implementation of the SPARC architecture, 128 Kbytes of data memory, and I/O interfaces. The SIMD processing in VASP-4096 implements the ASProCore architecture, a proprietary implementation of SIMD processing, and operates at 266 MHz with program instructions issued by the RISC controller. The device also integrates a 64-bit synchronous main memory interface operating at 133 MHz (double-data rate), and a 64-bit 66 MHz PCI interface. Compared with other processor architectures that support media processing, VASP-4096 offers true performance scalability, support for deterministic and non-deterministic data processing on a single device, and software programmability that can be re-used in future chip generations.
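The SIMD idea behind the 4096 processing units (one instruction applied across many data lanes at once) can be imitated in software with array-wide operations; the lane count and arithmetic below are illustrative, not the ASProCore instruction set:

    # Software imitation of SIMD execution: one operation applied to all lanes.
    import numpy as np

    LANES = 4096                                   # one value per processing unit
    rng = np.random.default_rng(5)
    a = rng.standard_normal(LANES)
    b = rng.standard_normal(LANES)

    # A scalar CPU would loop; a SIMD array issues this once across every lane.
    result = a * b + 1.0                           # multiply-accumulate per lane

    # Associative-style predicated update: only lanes matching a condition respond.
    mask = result > 1.0
    result[mask] = 0.0
    print(result.shape, int(mask.sum()), "lanes matched")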
Systems Architecture for a Nationwide Healthcare System.
Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio
2015-01-01
To provide Internet technology support at a national level, the Nationwide Integrated Healthcare System in Uruguay requires a model of Information Systems Architecture. This system has multiple healthcare providers (public and private), and a strong component of supplementary services. Thus, the data processing system should have an architecture that considers this fact, while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, as well as technical advice provided by AGESIC. It is the outcome of the research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as the research done by the Salud.uy team since 2013.
LC Data QUEST: A Technical Architecture for Community Federated Clinical Data Sharing
Stephens, Kari A.; Lin, Ching-Ping; Baldwin, Laura-Mae; Echo-Hawk, Abigail; Keppel, Gina A.; Buchwald, Dedra; Whitener, Ron J.; Korngiebel, Diane M.; Berg, Alfred O.; Black, Robert A.; Tarczy-Hornoch, Peter
2012-01-01
The University of Washington Institute of Translational Health Sciences is engaged in a project, LC Data QUEST, building data sharing capacity in primary care practices serving rural and tribal populations in the Washington, Wyoming, Alaska, Montana, Idaho region to build research infrastructure. We report on the iterative process of developing the technical architecture for semantically aligning electronic health data in primary care settings across our pilot sites and tools that will facilitate linkages between the research and practice communities. Our architecture emphasizes sustainable technical solutions for addressing data extraction, alignment, quality, and metadata management. The architecture provides immediate benefits to participating partners via a clinical decision support tool and data querying functionality to support local quality improvement efforts. The FInDiT tool catalogues type, quantity, and quality of the data that are available across the LC Data QUEST data sharing architecture. These tools facilitate the bi-directional process of translational research. PMID:22779052
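The federated query pattern underlying this architecture can be sketched as a fan-out of one question to each site, with only aggregates returned to the requester; the site data and query are hypothetical, not the LC Data QUEST or FInDiT interfaces:

    # Sketch of federated querying: each site answers locally and returns only
    # an aggregate, which the coordinator combines. Data are invented.
    site_records = {
        "clinic_a": [{"age": 64, "dx": "diabetes"}, {"age": 41, "dx": "asthma"}],
        "clinic_b": [{"age": 70, "dx": "diabetes"}, {"age": 58, "dx": "diabetes"}],
    }

    def local_count(records, dx):
        # Runs at the site; raw records never leave it.
        return sum(1 for r in records if r["dx"] == dx)

    def federated_count(dx):
        return {site: local_count(recs, dx) for site, recs in site_records.items()}

    print(federated_count("diabetes"))   # {'clinic_a': 1, 'clinic_b': 2}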
Review of Adaptive Programmable Materials and Their Bioapplications.
Fan, Xiaoshan; Chung, Jing Yang; Lim, Yong Xiang; Li, Zibiao; Loh, Xian Jun
2016-12-14
Adaptive programmable materials have attracted increasing attention due to their high functionality, autonomous behavior, encapsulation, and site-specific confinement capabilities in various applications. Compared to conventional materials, adaptive programmable materials possess unique single-material architectures that can maintain, respond, and change their shapes and dimensions when they are subjected to surrounding environment changes, such as alterations in temperature, pH, and ionic strength. In this review, the most recent advances in the design strategies of adaptive programmable materials are presented with respect to different types of architectural polymers, including stimuli-responsive polymers and shape-memory polymers. The diverse functions of these sophisticated materials and their significance in therapeutic agent delivery systems are also summarized in this review. Finally, the challenges for facile fabrication of these materials and future prospects are also discussed.
1998-01-24
… the Apparel Manufacturing Architecture (AMA), a generic architecture for an apparel enterprise. ARN-AIMS consists of three modules: Order Processing, Order Tracking, and Shipping & Invoicing. The Order Processing Module is designed to facilitate the entry of customer orders for stock and special …
Reshaping the Enterprise through an Information Architecture and Process Reengineering.
ERIC Educational Resources Information Center
Laudato, Nicholas C.; DeSantis, Dennis J.
1995-01-01
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
Biological basis for space-variant sensor design I: parameters of monkey and human spatial vision
NASA Astrophysics Data System (ADS)
Rojer, Alan S.; Schwartz, Eric L.
1991-02-01
Biological sensor design has long provided inspiration for sensor design in machine vision. However, relatively little attention has been paid to the actual design parameters provided by biological systems, as opposed to the general nature of biological vision architectures. In the present paper we provide a review of current knowledge of primate spatial vision design parameters and present recent experimental and modeling work from our lab which demonstrates that a numerical conformal mapping, a refinement of our previous complex logarithmic model, provides the best current summary of this feature of the primate visual system. In this paper we review recent work from our laboratory which has characterized some of the spatial architectures of the primate visual system. In particular, we review experimental and modeling studies which indicate that: (1) the global spatial architecture of primate visual cortex is well summarized by a numerical conformal mapping whose simplest analytic approximation is the complex logarithm function; and (2) the columnar sub-structure of primate visual cortex can be well summarized by a model based on band-pass filtered white noise. We also refer to ongoing work in our lab which demonstrates that the joint columnar/map structure of primate visual cortex can be modeled and summarized in terms of a new algorithm, the "proto-column" algorithm. This work provides a reference point for current engineering approaches to novel architectures for …
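The complex-logarithm approximation mentioned above maps a retinal position z = x + iy to a cortical-like position w = log(z + a); the parameter value and sample points below are placeholders used only to show the form of the mapping:

    # Log-polar (complex logarithm) mapping of visual-field points to a
    # cortical-like coordinate frame: w = log(z + a). Parameter a is illustrative.
    import numpy as np

    a = 0.5                                    # foveal shift parameter (assumed)
    ecc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])  # eccentricities in degrees
    ang = np.deg2rad([0.0, 30.0, 60.0])        # polar angles

    # Build complex retinal coordinates z = r * exp(i*theta) on a small grid.
    r, th = np.meshgrid(ecc, ang)
    z = r * np.exp(1j * th)

    w = np.log(z + a)                          # cortical-like coordinates
    # Real part ~ log eccentricity (compresses the periphery),
    # imaginary part ~ polar angle (roughly preserved).
    print(np.round(w.real, 2))
    print(np.round(w.imag, 2))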
NASA Technical Reports Server (NTRS)
Weber, Gary A.
1991-01-01
During the 90-day study, support was provided to NASA in defining a point-of-departure space transfer vehicle (STV). The resulting STV concept was performance optimized with a two-stage LTV/LEV configuration. Appendix A reports on the effort during this period of the study. From the end of the 90-day study until the March Interim Review, effort was placed on optimizing the two-stage vehicle approach identified in the 90-day effort. After the March Interim Review, the effort was expanded to perform a full architectural trade study with the intent of developing a decision database to support STV system decisions in response to changing SEI infrastructure concepts. Several of the architecture trade studies were combined in a System Architecture Trade Study. In addition to this trade, system optimization/definition trades and analyses were completed and some special topics were addressed. Program- and system-level trade study and analyses methodologies and results are presented in this section. Trades and analyses covered in this section are: (1) a system architecture trade study; (2) evolution; (3) safety and abort considerations; (4) STV as a launch vehicle upper stage; and (5) optimum crew and cargo split.