Science.gov

Sample records for architecture supporting self-organisation

  1. Introducing Live ePortfolios to Support Self Organised Learning

    ERIC Educational Resources Information Center

    Kirkham, Thomas; Winfield, Sandra; Smallwood, Angela; Coolin, Kirstie; Wood, Stuart; Searchwell, Louis

    2009-01-01

    This paper presents a platform on which a new generation of applications targeted to aid the self-organised learner can be presented. The new application is enabled by innovations in trust-based security of data built upon emerging infrastructures to aid federated data access in the UK education sector. Within the proposed architecture, users and…

  2. Effects of the ISIS Recommender System for Navigation Support in Self-Organised Learning Networks

    ERIC Educational Resources Information Center

    Drachsler, Hendrik; Hummel, Hans; van den Berg, Bert; Eshuis, Jannes; Waterink, Wim; Nadolski, Rob; Berlanga, Adriana; Boers, Nanda; Koper, Rob

    2009-01-01

    The need to support users of the Internet with the selection of information is becoming more important. Learners in complex, self-organising Learning Networks have similar problems and need guidance to find and select the most suitable learning activities, in order to attain their lifelong learning goals in the most efficient way. Several research…

  3. Support-vector-based emergent self-organising approach for emotional understanding

    NASA Astrophysics Data System (ADS)

    Nguwi, Yok-Yen; Cho, Siu-Yeung

    2010-12-01

    This study discusses the computational analysis of general emotion understanding based on a questionnaire methodology. The questionnaire method approaches the subject by investigating the real experiences that accompanied the emotions, whereas other, laboratory-based approaches are generally associated with exaggerated elements. We adopted a connectionist model called the support-vector-based emergent self-organising map (SVESOM) to analyse emotion profiling from the questionnaire method. The SVESOM first identifies the important variables by ranking discriminative features highly. The classifier then performs the classification based on the selected features. Experimental results show that the top-ranked features are in line with the work of Scherer and Wallbott [(1994), 'Evidence for Universality and Cultural Variation of Differential Emotion Response Patterning', Journal of Personality and Social Psychology, 66, 310-328], which approached the emotions physiologically. The performance measures show that using the full feature set for classification can degrade performance, whereas the selected features provide superior results in terms of accuracy and generalisation.

  4. Self-Organisation and Capacity Building: Sustaining the Change

    ERIC Educational Resources Information Center

    Bain, Alan; Walker, Allan; Chan, Anissa

    2011-01-01

    Purpose: The paper aims to describe the application of theoretical principles derived from a study of self-organisation and complex systems theory and their application to school-based capacity building to support planned change. Design/methodology/approach: The paper employs a case example in a Hong Kong School to illustrate the application of…

  5. Self-organising structures of lecithin

    NASA Astrophysics Data System (ADS)

    Shchipunov, Yurii A.

    1997-04-01

    Modern concepts of the self-assembly of amphiphiles are considered using the example of the self-organising structures of natural lecithin. Binary, ternary and multicomponent systems are discussed. A considerable part of the review is devoted to the peculiarities of self-organisation of this phospholipid in non-aqueous media and to the role of polar inorganic solvents. Virtually all of the structures formed by lecithin are examined: micelles, swollen micelles, microemulsions, emulsions, organogels, vesicles (liposomes), and lyotropic liquid crystals. In each specific case, attention is drawn to the dependence of self-assembly at the macroscopic level on interactions at the molecular level, the shape of the molecules, and their solvation and packing at the interface. The self-organising lecithin structures formed in the interfacial area of immiscible liquids in the course of unrestricted adsorption from the bulk of a non-aqueous solution are considered. The bibliography includes 282 references.

  6. Ising, Schelling and self-organising segregation

    NASA Astrophysics Data System (ADS)

    Stauffer, D.; Solomon, S.

    2007-06-01

    The similarities between phase separation in physics and residential segregation by preference in the Schelling model of 1971 are reviewed. Also, new computer simulations of asymmetric interactions different from the usual Ising model are presented, showing spontaneous magnetisation (=self-organising segregation) and in one case a sharp phase transition.
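
    For readers who want to reproduce the qualitative behaviour, the sketch below is a minimal Schelling-style simulation (not the asymmetric-interaction variant studied in the record): two agent types live on a periodic grid, and any agent whose fraction of like neighbours falls below a tolerance threshold moves to a random vacant cell. The grid size, vacancy fraction, threshold and step count are illustrative assumptions.

        # Minimal Schelling-style segregation sketch (not the authors' code); the grid
        # size, vacancy fraction and tolerance threshold below are illustrative choices.
        import random

        SIZE, VACANCY, THRESHOLD, STEPS = 30, 0.1, 0.5, 100_000

        def neighbours(i, j):
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    if (di, dj) != (0, 0):
                        yield (i + di) % SIZE, (j + dj) % SIZE  # periodic boundary

        def like_fraction(grid, i, j):
            agent = grid[i][j]
            occupied = [grid[x][y] for x, y in neighbours(i, j) if grid[x][y] != 0]
            return 1.0 if not occupied else sum(a == agent for a in occupied) / len(occupied)

        # 0 = vacant, +1 / -1 = the two agent types (cf. Ising spins)
        cells = [0] * int(SIZE * SIZE * VACANCY) + \
                [1, -1] * ((SIZE * SIZE - int(SIZE * SIZE * VACANCY)) // 2)
        cells += [1] * (SIZE * SIZE - len(cells))
        random.shuffle(cells)
        grid = [cells[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

        for _ in range(STEPS):
            i, j = random.randrange(SIZE), random.randrange(SIZE)
            if grid[i][j] == 0 or like_fraction(grid, i, j) >= THRESHOLD:
                continue  # vacant cell, or an agent that is already satisfied
            vi, vj = random.randrange(SIZE), random.randrange(SIZE)
            if grid[vi][vj] == 0:  # move the unhappy agent to a vacant cell
                grid[vi][vj], grid[i][j] = grid[i][j], 0

        occupied = [(i, j) for i in range(SIZE) for j in range(SIZE) if grid[i][j] != 0]
        print("mean like-neighbour fraction:",
              sum(like_fraction(grid, i, j) for i, j in occupied) / len(occupied))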

  7. Self-organisation and motion in plants

    NASA Astrophysics Data System (ADS)

    Lenau, T. A.; Hesselberg, T.

    2014-03-01

    Self-organisation appeals to humans because difficult and repeated actions can be avoided through automation via bottom-up nonhierarchical processes. This is in contrast to the top-level controlled action strategy normally applied in automated products and in manufacturing. There are many situations where it is required that objects perform an action dependent on external stimuli. An example is automatic window blinds that open or close in response to the sunlight level. However, simpler and more robust designs could be made using the self-organising principles for movement found in many plants. Plants move to adapt to external conditions, e.g. sunflower buds tracking the sun, and the touch-me-not Mimosa and the Venus flytrap responding to mechanical stimuli by closing their leaves to protect themselves and to capture insects, respectively. This paper describes three basic biomimetic principles used by plants to track the sun: i) light causing an inhibiting effect on the illuminated side, causing it to bend; ii) light inducing a signal from the illuminated side that causes an action on the darker side; and iii) light illuminating a number of sensing plates pointing upwards at an angle, activating an expansion on the same side. A concept for mimicking the second principle is presented. It is a very simple and possibly reliable self-organising structure that aligns a plate perpendicular to the source of illumination.

  8. On the equivalence between kernel self-organising maps and self-organising mixture density networks.

    PubMed

    Yin, Hujun

    2006-01-01

    The kernel method has become a useful trick and has been widely applied to various learning models to extend their nonlinear approximation and classification capabilities. Such extensions have also recently occurred to the Self-Organising Map (SOM). In this paper, two recently proposed kernel SOMs are reviewed, together with their link to an energy function. The Self-Organising Mixture Network is an extension of the SOM for mixture density modelling. This paper shows that with an isotropic, density-type kernel function, the kernel SOM is equivalent to a homoscedastic Self-Organising Mixture Network, an entropy-based density estimator. This revelation on the one hand explains that kernelising SOM can improve classification performance by acquiring better probability models of the data; but on the other hand it also explains that the SOM already naturally approximates the kernel method.
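
    As background to the kernelised variants discussed above, the following is a minimal sketch of the plain online SOM algorithm that they extend: each input is assigned to its best-matching unit, and that unit and its grid neighbours are pulled towards the input. The map size, learning-rate and neighbourhood schedules are illustrative assumptions, not values from the paper.

        # Minimal online SOM sketch illustrating the base algorithm the kernel variants
        # extend; all hyperparameters are illustrative assumptions.
        import numpy as np

        def train_som(data, rows=10, cols=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
            rng = np.random.default_rng(seed)
            dim = data.shape[1]
            weights = rng.normal(size=(rows * cols, dim))
            # (row, col) coordinates of every node, used by the neighbourhood kernel
            grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
            t, t_max = 0, epochs * len(data)
            for _ in range(epochs):
                for x in rng.permutation(data):
                    frac = t / t_max
                    lr = lr0 * (1 - frac)                 # linearly decaying learning rate
                    sigma = sigma0 * (1 - frac) + 0.5     # shrinking neighbourhood radius
                    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
                    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
                    h = np.exp(-d2 / (2 * sigma ** 2))    # Gaussian neighbourhood function
                    weights += lr * h[:, None] * (x - weights)
                    t += 1
            return weights

        if __name__ == "__main__":
            data = np.random.default_rng(1).normal(size=(500, 3))
            print("codebook shape:", train_som(data).shape)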

  9. A self-organising network that grows when required.

    PubMed

    Marsland, Stephen; Shapiro, Jonathan; Nehmzow, Ulrich

    2002-01-01

    The ability to grow extra nodes is a potentially useful facility for a self-organising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the Self-Organising Map. In addition, a growing network can deal with dynamic input distributions. Most of the growing networks that have been proposed in the literature add new nodes to support the node that has accumulated the highest error during previous iterations or to support topological structures. This usually means that new nodes are added only when the number of iterations is an integer multiple of some pre-defined constant, A. This paper suggests a way in which the learning algorithm can add nodes whenever the network in its current state does not sufficiently match the input. In this way the network grows very quickly when new data is presented, but stops growing once the network has matched the data. This is particularly important when we consider dynamic data sets, where the distribution of inputs can change to a new regime after some time. We also demonstrate the preservation of neighbourhood relations in the data by the network. The new network is compared to an existing growing network, the Growing Neural Gas (GNG), on an artificial dataset, showing how the network deals with a change in input distribution after some time. Finally, the new network is applied to several novelty detection tasks and is compared with both the GNG and an unsupervised form of the Reduced Coulomb Energy network on a robotic inspection task and with a Support Vector Machine on two benchmark novelty detection tasks. PMID:12416693
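
    A simplified sketch of the grow-when-required idea follows: instead of inserting nodes at fixed intervals, a new node is created whenever the best-matching node is both well trained (habituated) and still a poor match for the current input. The activity and habituation thresholds are illustrative assumptions, and the edge bookkeeping of the published algorithm is omitted.

        # Simplified grow-when-required sketch; thresholds and rates are assumptions,
        # and the published algorithm's edge structure and second-winner updates are omitted.
        import numpy as np

        class GrowWhenRequired:
            def __init__(self, dim, activity_threshold=0.8, habituation_threshold=0.3,
                         eps_b=0.1, tau=3.0, seed=0):
                rng = np.random.default_rng(seed)
                self.nodes = [rng.normal(size=dim), rng.normal(size=dim)]  # start with two nodes
                self.habituation = [1.0, 1.0]        # 1 = novel node, decays with use
                self.a_T, self.h_T = activity_threshold, habituation_threshold
                self.eps_b, self.tau = eps_b, tau

            def present(self, x):
                dists = [np.linalg.norm(x - w) for w in self.nodes]
                best = int(np.argmin(dists))
                activity = np.exp(-dists[best])      # 1 when the input matches a node exactly
                if activity < self.a_T and self.habituation[best] < self.h_T:
                    # the winner is well trained yet a poor match: grow a new node
                    self.nodes.append((self.nodes[best] + x) / 2.0)
                    self.habituation.append(1.0)
                else:
                    # otherwise adapt the winner towards the input and habituate it
                    self.nodes[best] += self.eps_b * self.habituation[best] * (x - self.nodes[best])
                    self.habituation[best] -= self.habituation[best] / self.tau * 0.1
                return best

        net = GrowWhenRequired(dim=2)
        for x in np.random.default_rng(1).normal(size=(1000, 2)):
            net.present(x)
        print("nodes grown:", len(net.nodes))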

  10. Self organising maps for visualising and modelling

    PubMed Central

    2012-01-01

    The paper describes the motivation of SOMs (Self Organising Maps) and how they have become more accessible due to the wider availability of modern, more powerful, cost-effective computers. Their advantages compared to Principal Components Analysis and Partial Least Squares are discussed: SOMs can be applied to non-linear data, are less dependent on least-squares solutions and on normality of errors, and are less influenced by outliers. In addition, there is a wide variety of intuitive methods for visualisation that allow full use of the map space. Modern problems in analytical chemistry, including applications to cultural heritage studies and to environmental, metabolomic and biological questions, result in complex datasets. Methods for visualising maps are described, including best matching units, hit histograms, unified distance matrices and component planes. Supervised SOMs for classification, including multifactor data and variable selection, are discussed, as is their use in Quality Control. The paper is illustrated using four case studies, namely the near infrared analysis of food, the thermal analysis of polymers, the metabolomic analysis of saliva using NMR, and on-line HPLC for pharmaceutical process monitoring. PMID:22594434
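
    Two of the visualisation tools listed above, the unified distance matrix and the hit histogram, reduce to short computations over a trained codebook, as in the sketch below. The 10x10 map and the random stand-in codebook are assumptions purely for illustration.

        # U-matrix and hit-histogram sketch over an already-trained rectangular map;
        # the map shape and the random stand-in codebook are illustrative assumptions.
        import numpy as np

        def u_matrix(weights, rows, cols):
            """Mean distance of each node's codebook vector to its grid neighbours."""
            w = weights.reshape(rows, cols, -1)
            um = np.zeros((rows, cols))
            for r in range(rows):
                for c in range(cols):
                    neigh = [w[r + dr, c + dc]
                             for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                             if 0 <= r + dr < rows and 0 <= c + dc < cols]
                    um[r, c] = np.mean([np.linalg.norm(w[r, c] - n) for n in neigh])
            return um

        def hit_histogram(weights, data):
            """Number of samples mapped to each node as its best-matching unit."""
            hits = np.zeros(len(weights), dtype=int)
            for x in data:
                hits[np.argmin(np.linalg.norm(weights - x, axis=1))] += 1
            return hits

        rows = cols = 10
        data = np.random.default_rng(0).normal(size=(300, 4))
        weights = np.random.default_rng(1).normal(size=(rows * cols, 4))  # stand-in codebook
        print(u_matrix(weights, rows, cols).shape, hit_histogram(weights, data).sum())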

  11. Self-organising sensory maps in odour classification mimicking.

    PubMed

    Davide, F A; Di Natale, C; D'Amico, A

    1995-01-01

    A system for artificial olfaction is introduced, which is composed of a sensor array for gas sensing and a self-organising artificial neural network. A detailed reformulation of the most effective Self-Organising sensory Map (SOM)-based algorithms for odour classification and other applications is provided. An opto-electronic micromachined implementation of the neural network is introduced, which employs a novel hybrid mechanism for activating neural groups, avoiding fabricated cloning templates hardware.

  12. Learning robot actions based on self-organising language memory.

    PubMed

    Wermter, Stefan; Elshaw, Mark

    2003-01-01

    In the MirrorBot project we examine perceptual processes using models of cortical assemblies and mirror neurons to explore the emergence of semantic representations of actions, percepts and concepts in a neural robot. The hypothesis under investigation is whether a neural model will produce a life-like perception system for actions. In this context we focus in this paper on how instructions for actions can be modeled in a self-organising memory. Current approaches for robot control often do not use language and ignore neural learning. However, our approach uses language instruction and draws from the concepts of regional distributed modularity, self-organisation and neural assemblies. We describe a self-organising model that clusters actions into different locations depending on the body part they are associated with. In particular, we use actual sensor readings from the MIRA robot to represent semantic features of the action verbs. Furthermore, we outline a hierarchical computational model for a self-organising robot action control system using language for instruction.

  13. The Self-Organising Seismic Early Warning Information Network

    NASA Astrophysics Data System (ADS)

    Kühnlenz, F.; Eveslage, I.; Fischer, J.; Fleming, K. M.; Lichtblau, B.; Milkereit, C.; Picozzi, M.

    2009-12-01

    The Self-Organising Seismic Early Warning Information Network (SOSEWIN) represents a new approach for Earthquake Early Warning Systems (EEWS), which takes advantage of novel wireless communications technologies without the need for a planned, centralised infrastructure. It also sets out to overcome the problem of insufficient node density, which typically affects existing early warning systems, by having the SOSEWIN seismological sensing units comprise low-cost components (generally bought "off-the-shelf"), with each unit initially costing 100's of Euros, in contrast to 1,000's to 10,000's for standard seismological stations. The reduced sensitivity of the new sensing units arising from the use of lower-cost components will be compensated by the network's density, which in the future is expected to number 100's to 1000's over areas served currently by the order of 10's of standard stations. The robustness, independence of infrastructure, and spontaneous extensibility due to a self-healing/self-organizing character in the case of removing/failing or adding sensors make SOSEWIN potentially useful for various use cases, e.g. monitoring of building structures or seismic microzonation. Nevertheless, its main purpose is earthquake early warning, for which reason the ground motion is continuously monitored by conventional accelerometers (3-component) and processed within a station. Based on this, the network itself decides whether an event is detected through cooperating stations. SEEDLink is used to store and provide access to the sensor data. Experiences and selected experiment results with the SOSEWIN prototype installation in the Ataköy district of Istanbul (Turkey) are presented. SOSEWIN also considers the needs of earthquake task forces, which want to set up a temporary seismic network rapidly with lightweight stations to record aftershocks. The wireless and self-organising character of this sensor network is of great value to do this

  14. The Self-Organising Seismic Early Warning Information Network: Scenarios

    NASA Astrophysics Data System (ADS)

    Kühnlenz, F.; Fischer, J.; Eveslage, I.

    2009-04-01

    SAFER and EDIM working groups, the Department of Computer Science, Humboldt-Universität zu Berlin, Berlin, Germany, and Section 2.1 Earthquake Risk and Early Warning, GFZ German Research Centre for Geosciences, Germany. Contact: Frank Kühnlenz, kuehnlenz@informatik.hu-berlin.de. The Self-Organising Seismic Early Warning Information Network (SOSEWIN) represents a new approach for Earthquake Early Warning Systems (EEWS), which takes advantage of novel wireless communications technologies without the need for a planned, centralised infrastructure. It also sets out to overcome the problem of insufficient node density, which typically affects existing early warning systems, by having the SOSEWIN seismological sensing units comprise low-cost components (generally bought "off-the-shelf"), with each unit initially costing 100's of Euros, in contrast to 1,000's to 10,000's for standard seismological stations. The reduced sensitivity of the new sensing units arising from the use of lower-cost components will be compensated by the network's density, which in the future is expected to number 100's to 1000's over areas served currently by the order of 10's of standard stations. The robustness, independence of infrastructure, and spontaneous extensibility due to a self-healing/self-organizing character in the case of removing/failing or adding sensors make SOSEWIN potentially useful for various use cases, e.g. monitoring of building structures or seismic microzonation. Nevertheless, its main purpose is earthquake early warning, for which reason the ground motion is continuously monitored by conventional accelerometers (3-component). It uses SEEDLink to store and provide access to the sensor data. SOSEWIN also considers the needs of earthquake task forces, which want to set up a temporary seismic network rapidly with lightweight stations to record aftershocks. The wireless and self-organising character of this sensor network should be of great value

  15. Architectural prospects for lunar mission support

    NASA Technical Reports Server (NTRS)

    Cesarone, Robert J.; Abraham, Douglas S.; Deutsch, Leslie J.; Noreen, Gary K.; Soloff, Jason A.

    2005-01-01

    A top-level architectural approach facilitates the provision of communications and navigation support services to the anticipated lunar mission set. Following the time-honored principles of systems architecting, i.e., form follows function, the first step is to define the functions or services to be provided, both in terms of character and degree. These will include communication as well as tracking and navigation services.

  16. How to Trigger Emergence and Self-Organisation in Learning Networks

    NASA Astrophysics Data System (ADS)

    Brouns, Francis; Fetter, Sibren; van Rosmalen, Peter

    The previous chapters of this section discussed why the social structure of Learning Networks is important and presented guidelines on how to maintain and allow the emergence of communities in Learning Networks. Chapter 2 explains how Learning Networks rely on social interaction and the active participation of the participants. Chapter 3 then continues by presenting guidelines and policies that should be incorporated into Learning Network Services in order to maintain existing communities by creating conditions that promote social interaction and knowledge sharing. Chapter 4 discusses the necessary conditions required for knowledge sharing to occur and to trigger communities to self-organise and emerge. As pointed out in Chap. 4, ad-hoc transient communities facilitate the emergence of social interaction in Learning Networks, self-organising them into communities, taking into account personal characteristics, community characteristics and general guidelines. As explained in Chap. 4, community members would benefit from a service that brings suitable people together for a specific purpose, because it will allow the participant to focus on the knowledge sharing process by reducing the effort or costs. In the current chapter, we describe an example of a peer support Learning Network Service based on the mechanism of peer tutoring in ad-hoc transient communities.

  17. A Self-Organising Model of Thermoregulatory Huddling.

    PubMed

    Glancy, Jonathan; Groß, Roderich; Stone, James V; Wilson, Stuart P

    2015-09-01

    Endotherms such as rats and mice huddle together to keep warm. The huddle is considered to be an example of a self-organising system, because complex properties of the collective group behaviour are thought to emerge spontaneously through simple interactions between individuals. Groups of rodent pups display two such emergent properties. First, huddling undergoes a 'phase transition', such that pups start to aggregate rapidly as the temperature of the environment falls below a critical temperature. Second, the huddle maintains a constant 'pup flow', where cooler pups at the periphery continually displace warmer pups at the centre. We set out to test whether these complex group behaviours can emerge spontaneously from local interactions between individuals. We designed a model using a minimal set of assumptions about how individual pups interact, by simply turning towards heat sources, and show in computer simulations that the model reproduces the first emergent property--the phase transition. However, this minimal model tends to produce an unnatural behaviour where several smaller aggregates emerge rather than one large huddle. We found that an extension of the minimal model to include heat exchange between pups allows the group to maintain one large huddle but eradicates the phase transition, whereas inclusion of an additional homeostatic term recovers the phase transition for large huddles. As an unanticipated consequence, the extended model also naturally gave rise to the second observed emergent property--a continuous pup flow. The model therefore serves as a minimal description of huddling as a self-organising system, and as an existence proof that group-level huddling dynamics emerge spontaneously through simple interactions between individuals. We derive a specific testable prediction: Increasing the capacity of the individual to generate or conserve heat will increase the range of ambient temperatures over which adaptive thermoregulatory huddling will
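
    A minimal sketch of the turn-towards-heat rule described above follows: each simulated pup senses heat emitted by the others, decaying with distance, and steps up the local heat gradient. The arena size, decay length and step size are illustrative assumptions, and the heat-exchange and homeostatic extensions of the published model are omitted.

        # Minimal turn-towards-heat sketch; parameters are illustrative assumptions and
        # the paper's heat-exchange and homeostatic extensions are not modelled here.
        import numpy as np

        rng = np.random.default_rng(0)
        N, STEPS, STEP_SIZE, DECAY = 30, 1000, 0.01, 0.5
        pos = rng.uniform(0, 1, size=(N, 2))           # pup positions in a unit arena

        def heat_gradient(p, others):
            diff = others - p                           # vectors towards every other pup
            dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-6
            # heat field ~ sum of exp(-d / DECAY); its gradient points towards warm neighbours
            return np.sum(np.exp(-dist / DECAY) * diff / dist, axis=0)

        for _ in range(STEPS):
            for i in range(N):
                g = heat_gradient(pos[i], np.delete(pos, i, axis=0))
                g_norm = np.linalg.norm(g)
                if g_norm > 0:
                    pos[i] = np.clip(pos[i] + STEP_SIZE * g / g_norm, 0.0, 1.0)

        # crude aggregation measure: mean distance of pups from the group centroid
        print("spread:", np.linalg.norm(pos - pos.mean(axis=0), axis=1).mean())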

  18. Functional Interface Considerations within an Exploration Life Support System Architecture

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad

    2016-01-01

    As notional life support system (LSS) architectures are developed and evaluated, myriad options must be considered pertaining to process technologies, components, and equipment assemblies. Each option must be evaluated relative to its impact on key functional interfaces within the LSS architecture. A leading notional architecture has been developed to guide the path toward realizing future crewed space exploration goals. This architecture includes atmosphere revitalization, water recovery and management, and environmental monitoring subsystems. Guiding requirements for developing this architecture are summarized and important interfaces within the architecture are discussed. The role of environmental monitoring within the architecture is described.

  19. Resilience of Self-Organised and Top-Down Planned Cities--A Case Study on London and Beijing Street Networks.

    PubMed

    Wang, Jiaqiu

    2015-01-01

    The success or failure of the street network depends on its reliability. In this article, using resilience analysis, the author studies how the shape and appearance of street networks in self-organised and top-down planned cities influences urban transport. Considering London and Beijing as proxies for self-organised and top-down planned cities, the structural properties of the London and Beijing networks are first investigated based on their primal and dual representations of planar graphs. The robustness of the street networks is then evaluated in primal space and dual space by deactivating road links under random and intentional attack scenarios. The results show that the reliability of the London street network differs from that of Beijing, which seems to rely more on its architecture and connectivity. It is found that top-down planned Beijing, with its higher average degree in the dual space and assortativity in the primal space, is more robust than self-organised London using the measures of maximum and second largest cluster size and network efficiency. The article offers an insight, from a network perspective, into the reliability of street patterns in self-organised and top-down planned city systems. PMID:26682551

  20. Resilience of Self-Organised and Top-Down Planned Cities—A Case Study on London and Beijing Street Networks

    PubMed Central

    Wang, Jiaqiu

    2015-01-01

    The success or failure of the street network depends on its reliability. In this article, using resilience analysis, the author studies how the shape and appearance of street networks in self-organised and top-down planned cities influences urban transport. Considering London and Beijing as proxies for self-organised and top-down planned cities, the structural properties of the London and Beijing networks are first investigated based on their primal and dual representations of planar graphs. The robustness of the street networks is then evaluated in primal space and dual space by deactivating road links under random and intentional attack scenarios. The results show that the reliability of the London street network differs from that of Beijing, which seems to rely more on its architecture and connectivity. It is found that top-down planned Beijing, with its higher average degree in the dual space and assortativity in the primal space, is more robust than self-organised London using the measures of maximum and second largest cluster size and network efficiency. The article offers an insight, from a network perspective, into the reliability of street patterns in self-organised and top-down planned city systems.
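
    The attack analysis described in the two records above can be sketched in a few lines: remove street links either at random or in order of edge betweenness (an "intentional" attack) and track the relative size of the largest connected component. The toy grid graph below merely stands in for the primal London and Beijing street networks used in the article.

        # Link-removal robustness sketch with networkx; the toy grid graph is an
        # illustrative stand-in for a real street network.
        import random
        import networkx as nx

        def attack(graph, intentional=False, seed=0):
            g = graph.copy()
            rng = random.Random(seed)
            sizes = []
            while g.number_of_edges() > 0:
                if intentional:
                    # remove the currently most "central" link first
                    edge = max(nx.edge_betweenness_centrality(g).items(),
                               key=lambda kv: kv[1])[0]
                else:
                    edge = rng.choice(list(g.edges()))
                g.remove_edge(*edge)
                giant = max((len(c) for c in nx.connected_components(g)), default=0)
                sizes.append(giant / graph.number_of_nodes())
            return sizes

        city = nx.grid_2d_graph(8, 8)      # toy planar "street network"
        print("random attack, giant component after 20 removals:", attack(city)[19])
        print("targeted attack, giant component after 20 removals:", attack(city, True)[19])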

  1. Cytoplasmic streaming emerges naturally from hydrodynamic self-organisation of a microfilament suspension

    NASA Astrophysics Data System (ADS)

    Woodhouse, Francis; Goldstein, Raymond

    2013-03-01

    Cytoplasmic streaming is the ubiquitous phenomenon of deliberate, active circulation of the entire liquid contents of a plant or animal cell by the walking of motor proteins on polymer filament tracks. Its manifestation in the plant kingdom is particularly striking, where many cells exhibit highly organised patterns of flow. How these regimented flow templates develop is biologically unclear, but there is growing experimental evidence to support hydrodynamically-mediated self-organisation of the underlying microfilament tracks. Using the spirally-streaming giant internodal cells of the characean algae Chara and Nitella as our prototype, we model the developing sub-cortical streaming cytoplasm as a continuum microfilament suspension subject to hydrodynamic and geometric forcing. We show that our model successfully reproduces emergent streaming behaviour by evolving from a totally disordered initial state into a steady characean ``conveyor belt'' configuration as a consequence of the cell geometry, and discuss applicability to other classes of steadily streaming plant cells.

  2. Self-organisation Processes In The Carbon ARC For Nanosynthesis

    SciTech Connect

    Ng, J.; Raitses, Yefgeny

    2014-02-02

    The atmospheric pressure carbon arc in inert gases such as helium is an important method for the production of nanomaterials. It has recently been shown that the formation of the carbon deposit on the cathode from gaseous carbon plays a crucial role in the operation of the arc, reaching the high temperatures necessary for thermionic emission to take place even with low melting point cathodes. Based on observed ablation and deposition rates, we explore the implications of deposit formation on the energy balance at the cathode surface, and show how the operation of the arc is a self-organised process. Our results suggest that the arc can operate in two different regimes, one of which has an important contribution from latent heat to the cathode energy balance. This regime is characterised by an enhanced ablation rate, which may be favourable for high-yield synthesis of nanomaterials. The second regime has a small and approximately constant ablation rate with a negligible contribution from latent heat.

  3. Self-organisation Processes In The Carbon ARC For Nanosynthesis

    SciTech Connect

    Ng, Jonathan; Raitses, Yevgeny

    2014-02-26

    The atmospheric pressure carbon arc in inert gases such as helium is an important method for the production of nanomaterials. It has recently been shown that the formation of the carbon deposit on the cathode from gaseous carbon plays a crucial role in the operation of the arc, reaching the high temperatures necessary for thermionic emission to take place even with low melting point cathodes. Based on observed ablation and deposition rates, we explore the implications of deposit formation on the energy balance at the cathode surface, and show how the operation of the arc is a self-organised process. Our results suggest that the arc can operate in two different regimes, one of which has an important contribution from latent heat to the cathode energy balance. This regime is characterised by an enhanced ablation rate, which may be favourable for high-yield synthesis of nanomaterials. The second regime has a small and approximately constant ablation rate with a negligible contribution from latent heat.

  4. Gait quality assessment using self-organising artificial neural networks.

    PubMed

    Barton, Gabor; Lisboa, Paulo; Lees, Adrian; Attfield, Steve

    2007-03-01

    In this study, the challenge to maximise the potential of gait analysis by employing advanced methods was addressed by using self-organising neural networks to quantify the deviation of patients' gait from normal. Data including three-dimensional joint angles, moments and powers of the two lower limbs and the pelvis were used to train Kohonen artificial neural networks to learn an abstract definition of normal gait. Subsequently, data from patients with gait problems were presented to the network which quantified the quality of gait in the form of a single curve by calculating the quantisation error during the gait cycle. A sensitivity analysis involving the manipulation of gait variables' weighting was able to highlight specific causes of the deviation including the anatomical location and the timing of wrong gait patterns. Use of the quantisation error can be regarded as an extension of previously described gait indices because it measures the goodness of gait and additionally provides information related to the causes underlying gait deviations.
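
    A sketch of the quantisation-error measure described above: each time-frame of a gait cycle (a vector of joint angles, moments and powers) is presented to a map trained on normal gait, and the distance to its best-matching unit gives one point of the deviation curve. The codebook below is a random stand-in for a trained map, and the number of gait variables is an assumption.

        # Quantisation-error curve sketch; the random codebook stands in for a Kohonen
        # map trained on normal gait, and 54 gait variables per frame is an assumption.
        import numpy as np

        def quantisation_error_curve(codebook, gait_cycle):
            """codebook: (n_nodes, n_vars); gait_cycle: (n_frames, n_vars)."""
            errors = []
            for frame in gait_cycle:
                d = np.linalg.norm(codebook - frame, axis=1)
                errors.append(d.min())            # distance to the best-matching unit
            return np.asarray(errors)

        rng = np.random.default_rng(0)
        codebook = rng.normal(size=(100, 54))      # stand-in for the trained map
        patient_cycle = rng.normal(size=(101, 54)) # frames at 0-100% of the gait cycle
        curve = quantisation_error_curve(codebook, patient_cycle)
        print("peak deviation at", int(curve.argmax()), "% of the gait cycle")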

  5. Self-organisation, orientation and magnetic properties of FePt nanoparticle arrays

    NASA Astrophysics Data System (ADS)

    Verdes, C.; Chantrell, R. W.; Satoh, A.; Harrell, J. W.; Nikles, D.

    2006-09-01

    Self-organised magnetic arrays (SOMA) of high anisotropy particles are a promising candidate for ultra-high-density recording media. In principle SOMA media have the capability of storing 1 bit per particle, leading to possible recording densities in excess of 10 Tbit/sq in. In this paper we consider two major aspects of SOMA media, namely the self-organisation process itself and the physics of the particle orientation process.

  6. Evolutionary self-organising modelling of a municipal wastewater treatment plant.

    PubMed

    Hong, Yoon-Seok; Bhamidimarri, Rao

    2003-03-01

    Building predictive models for highly time varying and complex multivariable aspects of the wastewater treatment plant is important both for understanding the dynamics of this complex system, and in the development of optimal control support and management schemes. This paper presents a new approach, which is called genetic programming as a self-organising modelling tool, to model dynamic performance of municipal activated-sludge wastewater treatment plants. Genetic programming evolves several process models automatically based on methods of natural selection ('survival of the fittest') that could predict the dynamics of MLSS and suspended solids in the effluent. The predictive accuracy of the genetic programming approach was compared with a nonlinear state-space model with neural network and a well-known IAWQ ASM2. The genetic programming system evolved some models that were an improvement over the neural network and ASM2 and showed that the transparency of the model evolved may allow inferences about underlying processes to be made. This work demonstrates that dynamic nonlinear processes in the wastewater treatment plant may be successfully modelled through the use of evolutionary model induction algorithms in the GP technique. Further, our results show that genetic programming can work as a cost-effective intelligent modelling tool, enabling us to create prototype process models quickly and inexpensively instead of an engineer developing the process model.

  7. Biologically relevant neural network architectures for support vector machines.

    PubMed

    Jändel, Magnus

    2014-01-01

    Neural network architectures that implement support vector machines (SVM) are investigated for the purpose of modeling perceptual one-shot learning in biological organisms. A family of SVM algorithms including variants of maximum margin, 1-norm, 2-norm and ν-SVM is considered. SVM training rules adapted for neural computation are derived. It is found that competitive queuing memory (CQM) is ideal for storing and retrieving support vectors. Several different CQM-based neural architectures are examined for each SVM algorithm. Although most of the sixty-four scanned architectures are unconvincing for biological modeling, four feasible candidates are found. The seemingly complex learning rule of a full ν-SVM implementation finds a particularly simple and natural implementation in bisymmetric architectures. Since CQM-like neural structures are thought to encode skilled action sequences and bisymmetry is ubiquitous in motor systems, it is speculated that trainable pattern recognition in low-level perception has evolved as an internalized motor programme.

  8. In vitro organogenesis in three dimensions: self-organising stem cells.

    PubMed

    Sasai, Yoshiki; Eiraku, Mototsugu; Suga, Hidetaka

    2012-11-01

    Organ formation during embryogenesis is a complex process that involves various local cell-cell interactions at the molecular and mechanical levels. Despite this complexity, organogenesis can be modelled in vitro. In this article, we focus on two recent examples in which embryonic stem cells can self-organise into three-dimensional structures - the optic cup and the pituitary epithelium; and one case of self-organising adult stem cells - the gut epithelium. We summarise how these approaches have revealed intrinsic programs that drive locally autonomous modes of organogenesis and homeostasis. We also attempt to interpret the results of previous in vivo studies of retinal development in light of the self-organising nature of the retina.

  9. Participatory sensing as an enabler for self-organisation in future cellular networks

    NASA Astrophysics Data System (ADS)

    Imran, Muhammad Ali; Imran, Ali; Onireti, Oluwakayode

    2013-12-01

    In this short review paper we summarise the emerging challenges in the field of participatory sensing for the self-organisation of the next generation of wireless cellular networks. We identify the potential of participatory sensing in enabling the self-organisation, deployment optimisation and radio resource management of wireless cellular networks. We also highlight how this approach can meet the future goals for the next generation of cellular system in terms of infrastructure sharing, management of multiple radio access techniques, flexible usage of spectrum and efficient management of very small data cells.

  10. The Psychology of Delivering a Psychological Service: Self-Organised Learning as a Model for Consultation

    ERIC Educational Resources Information Center

    Clarke, Steve; Jenner, Simon

    2006-01-01

    The article describes how one Educational Psychology Service in the UK developed a service delivery based on self-organised learning (SOL). This model is linked to the paradigms and discourses within which educational psychology and special educational needs work. The work described here is dedicated to the memory of Brian Roberts, academic, close…

  11. Discrete Self-Organising Migrating Algorithm for Flow Shop Scheduling with no Wait Makespan

    NASA Astrophysics Data System (ADS)

    Davendra, Donald; Zelinka, Ivan; Senkerik, Roman; Jasek, Roman

    2011-06-01

    This paper introduces a novel discrete Self Organising Migrating Algorithm for the task of flowshop scheduling with no-wait makespan. The new algorithm is tested with the small and medium Taillard benchmark problems and the obtained results are competitive with the best performing heuristics in literature.
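
    To make the objective concrete, the sketch below computes the no-wait flowshop makespan of a given job permutation, i.e. the quantity a discrete optimiser such as the one above would minimise. The 5-job, 4-machine processing-time matrix is a made-up toy instance, not one of the Taillard benchmarks.

        # No-wait flowshop makespan sketch; the processing-time matrix is a toy instance.
        def no_wait_makespan(perm, p):
            """perm: job order; p[j][m]: processing time of job j on machine m."""
            machines = len(p[0])

            def delay(i, j):
                # minimum gap between the start times of consecutive jobs i and j so
                # that job j never has to wait between machines
                return max(sum(p[i][:k]) - sum(p[j][:k - 1]) for k in range(1, machines + 1))

            start = 0
            for a, b in zip(perm, perm[1:]):
                start += delay(a, b)
            return start + sum(p[perm[-1]])

        times = [[5, 3, 6, 2],
                 [4, 6, 2, 5],
                 [3, 2, 4, 6],
                 [6, 4, 3, 3],
                 [2, 5, 5, 4]]

        print("makespan of job order 0-4:", no_wait_makespan([0, 1, 2, 3, 4], times))
        print("makespan of a permuted order:", no_wait_makespan([2, 0, 4, 1, 3], times))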

  12. Effect of weightlessness on colloidal particle transport and segregation in self-organising microtubule preparations.

    PubMed

    Tabony, James; Rigotti, Nathalie; Glade, Nicolas; Cortès, Sandra

    2007-05-01

    Weightlessness is known to affect cellular functions by as yet undetermined processes. Many experiments indicate a role of the cytoskeleton and microtubules. Under appropriate conditions, in vitro microtubule preparations behave as a complex system that self-organises by a combination of reaction and diffusion. This process also results in the collective transport and organisation of any colloidal particles present. In large centimetre-sized samples, self-organisation does not occur when samples are exposed to a brief early period of weightlessness. Here, we report both space-flight and ground-based (clinorotation) experiments on the effect of weightlessness on the transport and segregation of colloidal particles and chromosomes. In centimetre-sized containers, both methods show that a brief initial period of weightlessness strongly inhibits particle transport. In miniature cell-sized containers under normal gravity conditions, the particle transport that self-organisation causes results in their accumulation into segregated regions of high and low particle density. The gravity dependence of this behaviour is strongly shape dependent. In square wells, neither self-organisation nor particle transport and segregation occur under conditions of weightlessness. On the contrary, in rectangular canals, both phenomena are largely unaffected by weightlessness. These observations suggest, depending on factors such as cell and embryo shape, that major biological functions associated with microtubule-driven particle transport and organisation might be strongly perturbed by weightlessness.

  13. Reliability Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.

  14. Lunar Surface Architecture Utilization and Logistics Support Assessment

    NASA Astrophysics Data System (ADS)

    Bienhoff, Dallas; Findiesen, William; Bayer, Martin; Born, Andrew; McCormick, David

    2008-01-01

    Crew and equipment utilization and logistics support needs for the point of departure lunar outpost as presented by the NASA Lunar Architecture Team (LAT) and alternative surface architectures were assessed for the first ten years of operation. The lunar surface architectures were evaluated and manifests created for each mission. Distances between Lunar Surface Access Module (LSAM) landing sites and emplacement locations were estimated. Physical characteristics were assigned to each surface element and operational characteristics were assigned to each surface mobility element. Stochastic analysis was conducted to assess probable times to deploy surface elements, conduct exploration excursions, and perform defined crew activities. Crew time is divided into Outpost-related, exploration and science, overhead, and personal activities. Outpost-related time includes element deployment, EVA maintenance, IVA maintenance, and logistics resupply. Exploration and science activities include mapping, geological surveys, science experiment deployment, sample analysis and categorizing, and physiological and biological tests in the lunar environment. Personal activities include sleeping, eating, hygiene, exercising, and time off. Overhead activities include precursor or close-out tasks that must be accomplished but don't fit into the other three categories such as: suit donning and doffing, airlock cycle time, suit cleaning, suit maintenance, post-landing safing actions, and pre-departure preparations. Equipment usage time, spares, maintenance actions, and Outpost consumables are also estimated to provide input into logistics support planning. Results are normalized relative to the NASA LAT point of departure lunar surface architecture.

  15. Modular analytics management architecture for interoperability and decision support

    NASA Astrophysics Data System (ADS)

    Marotta, Stephen; Metzger, Max; Gorman, Joe; Sliva, Amy

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture is a new approach to information fusion and decision support systems. By combining cognitive systems engineering organizational analysis tools, such as decision trees, with the Dual Node Network (DNN) technical architecture for information fusion, the DNDW can align relevant data and information products with an organization's decision-making processes. In this paper, we present the Compositional Inference and Machine Learning Environment (CIMLE), a prototype framework based on the principles of the DNDW architecture. CIMLE provides a flexible environment so heterogeneous data sources, messaging frameworks, and analytic processes can interoperate to provide the specific information required for situation understanding and decision making. It was designed to support the creation of modular, distributed solutions over large monolithic systems. With CIMLE, users can repurpose individual analytics to address evolving decision-making requirements or to adapt to new mission contexts; CIMLE's modular design simplifies integration with new host operating environments. CIMLE's configurable system design enables model developers to build analytical systems that closely align with organizational structures and processes and support the organization's information needs.

  16. Self-organising learning control and its application to muscle relaxant anaesthesia.

    PubMed

    Linkens, D A; Hasnain, S B

    1990-11-01

    The concept of a self-organising control system is attractive in biomedicine because of the imprecise nature of available physiological models. In this paper a particular strategy called a self-organising controller (SOC) originating from the work of Barron on aerospace systems is applied to the control of muscle relaxant anaesthesia. The SOC algorithm, which requires no prior knowledge of system dynamics, is described, both in single variable and multivariable format. Simulation results are presented for SOC performance on a well-established pancuronium model. Three implementations are described, being the use of a general purpose language, a SUN workstation approach, and a parallel computer transputer solution. The latter approach becomes important for multivariable control because of the computing-intensive nature of SOC. The transputer is shown to be a suitable vehicle for implementation in terms of speed and parallelism for SOC.

  17. Bring It On, Complexity! Present and Future of Self-Organising Middle-Out Abstraction

    NASA Astrophysics Data System (ADS)

    Mammen, Sebastian Von; Steghöfer, Jan-Philipp

    The following sections are included:

      * The Great Complexity Challenge
      * Self-Organising Middle-Out Abstraction
      * Optimising Graphics, Physics and Artificial Intelligence
      * Emergence and Hierarchies in a Natural System
      * The Technical Concept of SOMO
      * Observation of interactions
      * Interaction pattern recognition and behavioural abstraction
      * Creating and adjusting hierarchies
      * Confidence measures
      * Execution model
      * Learning SOMO: parameters, knowledge propagation, and procreation
      * Current Implementations
      * Awareness Beyond Virtuality
      * Integration and emergence
      * Model inference
      * SOMO net
      * SOMO after me
      * The Future of SOMO

  18. Self-organisation and communication in groups of simulated and physical robots.

    PubMed

    Trianni, Vito; Dorigo, Marco

    2006-09-01

    In social insects, both self-organisation and communication play a crucial role for the accomplishment of many tasks at a collective level. Communication is performed with different modalities, which can be roughly classified into three classes: indirect (stigmergic) communication, direct interactions and direct communication. The use of stigmergic communication is predominant in social insects (e.g. the pheromone trails in ants), where, however, direct interactions (e.g. antennation in ants) and direct communication (e.g. the waggle dance in honey bees) can also be observed. Taking inspiration from insect societies, we present an experimental study of self-organising behaviours for a group of robots, which exploit communication to coordinate their activities. In particular, the robots are placed in an arena presenting holes and open borders, which they should avoid while moving coordinately. Artificial evolution is responsible for the synthesis in a simulated environment of the robot's neural controllers, which are subsequently tested on physical robots. We study different communication strategies among the robots: no direct communication, handcrafted signalling and a completely evolved approach. We show that the latter is the most efficient, suggesting that artificial evolution can produce behaviours that are more adaptive than those obtained with conventional design methodologies. Moreover, we show that the evolved controllers produce a self-organising system that is robust enough to be tested on physical robots, notwithstanding the huge gap between simulation and reality. PMID:16821036

  19. Self-organisation and communication in groups of simulated and physical robots.

    PubMed

    Trianni, Vito; Dorigo, Marco

    2006-09-01

    In social insects, both self-organisation and communication play a crucial role for the accomplishment of many tasks at a collective level. Communication is performed with different modalities, which can be roughly classified into three classes: indirect (stigmergic) communication, direct interactions and direct communication. The use of stigmergic communication is predominant in social insects (e.g. the pheromone trails in ants), where, however, direct interactions (e.g. antennation in ants) and direct communication (e.g. the waggle dance in honey bees) can also be observed. Taking inspiration from insect societies, we present an experimental study of self-organising behaviours for a group of robots, which exploit communication to coordinate their activities. In particular, the robots are placed in an arena presenting holes and open borders, which they should avoid while moving coordinately. Artificial evolution is responsible for the synthesis in a simulated environment of the robot's neural controllers, which are subsequently tested on physical robots. We study different communication strategies among the robots: no direct communication, handcrafted signalling and a completely evolved approach. We show that the latter is the most efficient, suggesting that artificial evolution can produce behaviours that are more adaptive than those obtained with conventional design methodologies. Moreover, we show that the evolved controllers produce a self-organising system that is robust enough to be tested on physical robots, notwithstanding the huge gap between simulation and reality.

  20. Do Performance-Based Codes Support Universal Design in Architecture?

    PubMed

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    The research project 'An analysis of the accessibility requirements' studies how Danish architectural firms experience the accessibility requirements of the Danish Building Regulations and examines their opinions on how future regulative models can support innovative and inclusive design - Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogeneous and that possibilities for differentiation and zoning are required. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support 'accessibility zoning', achieving flexibility through different levels of accessibility in a building according to its performance. The common understanding of accessibility and UD is directly related to buildings like hospitals and care centers. When the objective is both innovative and inclusive architecture, the request for a performance-based model should be followed up by a knowledge enhancement effort in the building sector. Bloom's taxonomy of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency. PMID:27534292

  1. Do Performance-Based Codes Support Universal Design in Architecture?

    PubMed

    Grangaard, Sidse; Frandsen, Anne Kathrine

    2016-01-01

    The research project 'An analysis of the accessibility requirements' studies how Danish architectural firms experience the accessibility requirements of the Danish Building Regulations and examines their opinions on how future regulative models can support innovative and inclusive design - Universal Design (UD). The empirical material consists of input from six workshops to which all 700 Danish architectural firms were invited, as well as eight group interviews. The analysis shows that the current prescriptive requirements are criticized for being too homogeneous and that possibilities for differentiation and zoning are required. Therefore, a majority of professionals are interested in a performance-based model because they think that such a model will support 'accessibility zoning', achieving flexibility through different levels of accessibility in a building according to its performance. The common understanding of accessibility and UD is directly related to buildings like hospitals and care centers. When the objective is both innovative and inclusive architecture, the request for a performance-based model should be followed up by a knowledge enhancement effort in the building sector. Bloom's taxonomy of educational objectives is suggested as a tool for such a boost. The research project has been financed by the Danish Transport and Construction Agency.

  2. Supporting shared data structures on distributed memory architectures

    NASA Technical Reports Server (NTRS)

    Koelbel, Charles; Mehrotra, Piyush; Vanrosendale, John

    1990-01-01

    Programming nonshared memory systems is more difficult than programming shared memory systems, since there is no support for shared data structures. Current programming languages for distributed memory architectures force the user to decompose all data structures into separate pieces, with each piece owned by one of the processors in the machine, and with all communication explicitly specified by low-level message-passing primitives. A new programming environment is presented for distributed memory architectures, providing a global name space and allowing direct access to remote parts of data values. The analysis and program transformations required to implement this environment are described, and the efficiency of the resulting code on the NCUBE/7 and IPSC/2 hypercubes is described.

  3. Information Architecture for Quality Management Support in Hospitals.

    PubMed

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture helps meet the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) lends organizations enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging and cooperation amongst the multiple actors of an organization. In the present investigation we planned the IA for the Quality Management System (QMS) of a Hospital, which allowed us to develop and implement QUALITUS (the computer application developed to support Quality Management in a Hospital Unit). This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in and information errors, amongst others.

  4. On Antenna-Architectures for Sensitive Radiometry to Support Oceanography

    NASA Astrophysics Data System (ADS)

    Van't Klooster, Cornelis; Cappellin, Cecilia; Pontoppidan, Knud; Heighwood Nielsen, Per; Skou, Niels; Ivashina, Marianna; Iupikov, Oleg; Ihle, Alexander

    The presentation discusses different antenna architectures supporting radiometric tasks for oceanographic observations. With Aquarius and SMOS in orbit with their associated resolution and revisit capability in L-band, further enhancements are of interest. Following studies into desirable resolution and frequency band interests for oceanographic applications (ref: Microwat - an ESA study, see also https://www.ghrsst.org/ ), breakthrough and desirable requirements have been derived, and investigations into potential antenna architectural realisations have been initiated. Radiometer sensor (read: antenna) scenarios based on conical scanning, interferometric 1D and pushbroom coverage are included. A wide coverage is available from the first two architectures, and a very good sensitivity is available with the pushbroom scenario. Several interesting aspects relate to polarimetry capabilities, resolution and sensitivity. The pushbroom architecture, at the cost of some complexity, offers very good sensitivity, with antenna architecture solutions that promise breakthrough capabilities, in particular concerning the sensitivity requirements in combination with polarimetric capabilities. Coverage comes with some infrastructural antenna complexity, requiring creativity for a deployable antenna configuration. Following initial considerations of all three antenna configurations at overview level, the pushbroom scenario is presented in more detail. Ongoing technology developments in other related fields, with refined results still to come, would make it possible to consider antenna architectures in which focal plane arrays are combined with shaped reflector assemblies. With processing capabilities further enhanced - with ongoing developments underway in other sectors, as radio astronomers can confirm - one would be able to further improve and refine sensitivity aspects in combination with polarimetric capabilities

  5. The influence of the physical environment on the self-organised foraging patterns of ants

    NASA Astrophysics Data System (ADS)

    Detrain, C.; Natan, C.; Deneubourg, J.-L.

    2001-04-01

    Among social insects such as ants, scouts that modulate their recruiting behaviour, following simple rules based on local information, generate collective patterns of foraging. Here we demonstrate that features of the abiotic environment, specifically the foraging substrate, may also be influential in the emergence of group-level decisions such as the choice of one foraging path. Experimental data and theoretical analyses show that the collective patterns can arise independently of behavioural changes of individual scouts and can result, through self-organising processes, from the physico-chemical properties of the environment that alter the dynamics of information transfer by chemical trails.
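
    The kind of self-organised path choice discussed above is often illustrated with a trail-reinforcement rule in the spirit of Deneubourg's choice function (not the authors' exact model): each ant picks one of two branches with probability proportional to (k + pheromone)^n and reinforces its choice, and the symmetry typically breaks towards one branch. The values of k, n, the deposit and the evaporation rate below are illustrative assumptions.

        # Two-branch trail-choice sketch in the spirit of the Deneubourg choice
        # function; all parameter values are illustrative assumptions.
        import random

        def simulate(ants=1000, k=20.0, n=2.0, deposit=1.0, evaporation=0.01, seed=0):
            rng = random.Random(seed)
            trail = [0.0, 0.0]                    # pheromone on branch A and branch B
            for _ in range(ants):
                wa = (k + trail[0]) ** n
                wb = (k + trail[1]) ** n
                choice = 0 if rng.random() < wa / (wa + wb) else 1
                trail[choice] += deposit          # the ant reinforces the chosen branch
                trail[0] *= (1 - evaporation)     # slow evaporation on both branches
                trail[1] *= (1 - evaporation)
            return trail

        print("final pheromone on the two branches:", simulate())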

  6. A self-organising neural network model of image velocity encoding.

    PubMed

    Gurney, K N; Wright, M J

    1992-01-01

    A self-organising neural network has been developed which maps the image velocities of rigid objects, moving in the fronto-parallel plane, topologically over a neural layer. The input is information in the Fourier domain about the spatial components of the image. The computation performed by the network may be viewed as a neural instantiation of the Intersection of Constraints solution to the aperture problem. The model has biological plausibility in that the connectivity develops simply as a result of exposure to inputs derived from rigid translation of textures and its overall organisation is consistent with psychophysical evidence.

  7. Visualisation of gait data with Kohonen self-organising neural maps.

    PubMed

    Barton, Gabor; Lees, Adrian; Lisboa, Paulo; Attfield, Steve

    2006-08-01

    Self-organising artificial neural networks were used to reduce the complexity of joint kinematic and kinetic data, which form part of a typical instrumented gait assessment. Three-dimensional joint angles, moments and powers during the gait cycle were projected from the multi-dimensional data space onto a topological neural map, which thereby identified characteristic gait patterns. Patients were positioned on the map in relation to each other, enabling them to be compared on the basis of their gait patterns. The visualisation of large amounts of complex data in a two-dimensional map labelled with gait patterns is a step towards more objective analysis protocols which may enhance decision making.
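    To illustrate the projection step described above, the following is a minimal hand-rolled self-organising map in Python; it is a generic sketch rather than the software used in the study, and the grid size, learning schedule and stand-in "gait" data are assumptions.

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    """Train a minimal rectangular self-organising map on row-vector data."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    # grid coordinates used by the neighbourhood function
    yy, xx = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighbourhood
        for x in data[rng.permutation(len(data))]:
            # best-matching unit = node whose weight vector is closest to x
            dist = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dist), dist.shape)
            # Gaussian neighbourhood around the BMU on the 2D grid
            g = np.exp(-((yy - bmu[0]) ** 2 + (xx - bmu[1]) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def project(data, weights):
    """Map each sample to its best-matching unit (its 2D position on the map)."""
    return [np.unravel_index(
                np.argmin(np.linalg.norm(weights - x, axis=2)), weights.shape[:2])
            for x in data]

if __name__ == "__main__":
    gait_like = np.random.default_rng(1).normal(size=(50, 12))  # stand-in data
    w = train_som(gait_like)
    print(project(gait_like, w)[:5])
```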

  8. Symmetries and pattern formation in hyperbolic versus parabolic models of self-organised aggregation.

    PubMed

    Buono, Pietro-Luciano; Eftimie, Raluca

    2015-10-01

    The study of self-organised collective animal behaviour, such as swarms of insects or schools of fish, has become over the last decade a very active research area in mathematical biology. Parabolic and hyperbolic models have been used intensively to describe the formation and movement of various aggregative behaviours. While both types of models can exhibit aggregation-type patterns, studies on hyperbolic models suggest that these models can display a larger variety of spatial and spatio-temporal patterns compared to their parabolic counterparts. Here we use stability, symmetry and bifurcation theory to investigate this observation more rigorously, an approach not attempted before to compare and contrast aggregation patterns in models for collective animal behaviors. To this end, we consider a class of nonlocal hyperbolic models for self-organised aggregations that incorporate various inter-individual communication mechanisms, and take the formal parabolic limit to transform them into nonlocal parabolic models. We then discuss the symmetry of these nonlocal hyperbolic and parabolic models, and the types of bifurcations present or lost when taking the parabolic limit. We show that the parabolic limit leads to a homogenisation of the inter-individual communication, and to a loss of bifurcation dynamics (in particular loss of Hopf bifurcations). This explains the less rich patterns exhibited by the nonlocal parabolic models. However, for multiple interacting populations, by breaking the population interchange symmetry of the model, one can preserve the Hopf bifurcations that lead to the formation of complex spatio-temporal patterns that describe moving aggregations.

  9. Self-organising mixture autoregressive model for non-stationary time series modelling.

    PubMed

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks a time series into underlying segments and, at the same time, fits local linear regressive models to the clusters of segments. In this way, a globally non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used to give the mixture model a more flexible structure. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial and benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches. PMID:19145663
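    The mixture-of-local-AR idea can be sketched crudely as follows; note that this uses plain k-means on fixed-length segments rather than the SOMAR network's neural-gas structure and custom similarity measure, so the segment length, cluster count and toy series are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

def embed(series, p=4):
    """Turn a 1-D series into overlapping segments of length p plus the next value."""
    X = np.array([series[i:i + p] for i in range(len(series) - p)])
    return X, series[p:]

def fit_local_ar(series, p=4, k=3, seed=0):
    """Cluster the segments, then fit one linear AR(p) model per cluster by
    least squares; returns the cluster model and the per-cluster coefficients."""
    X, y = embed(series, p)
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(X)
    coefs = []
    for c in range(k):
        mask = km.labels_ == c
        A = np.hstack([X[mask], np.ones((mask.sum(), 1))])  # AR terms + intercept
        coefs.append(np.linalg.lstsq(A, y[mask], rcond=None)[0])
    return km, coefs

def predict_next(window, km, coefs):
    """Route the latest window to its cluster and apply that cluster's AR model."""
    c = km.predict(window.reshape(1, -1))[0]
    return np.append(window, 1.0) @ coefs[c]

if __name__ == "__main__":
    t = np.arange(2000)
    series = np.sin(0.05 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
    km, coefs = fit_local_ar(series)
    print(predict_next(series[-4:], km, coefs))
```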

  10. Visual data mining with self-organising maps for ventricular fibrillation analysis.

    PubMed

    Rosado-Muñoz, Alfredo; Martínez-Martínez, José M; Escandell-Montero, Pablo; Soria-Olivas, Emilio

    2013-08-01

    Detection of ventricular fibrillation (VF) at an early stage is being studied in depth in order to lower the risk of sudden death and to allow the specialist greater reaction time to provide the patient with an effective recovery therapy. Some works focus on detecting VF through numerical analysis of time-frequency distributions, but in general the methods used do not provide insight into the problem. This study therefore proposes a new methodology to obtain such information. This work uses a supervised self-organising map (SOM) to obtain visual information about four important groups of patients: VF (ventricular fibrillation), VT (ventricular tachycardia), HP (healthy patients) and AHR (other anomalous heart rates and noise). A total of 27 variables were obtained from continuous surface ECG recordings in standard databases (MIT and AHA), providing information in the time, frequency, and time-frequency domains. Self-organising maps (SOMs), trained with 11 of the 27 variables, were used to extract knowledge about the variable values for each group of patients. Results show that the SOM technique makes it possible to determine the profile of each group of patients, assisting in gaining a deeper understanding of this clinical problem. Additionally, information about the most relevant variables is given by the SOM analysis.

  11. Female Dominance over Males in Primates: Self-Organisation and Sexual Dimorphism

    PubMed Central

    Hemelrijk, Charlotte K.; Wantia, Jan; Isler, Karin

    2008-01-01

    The processes that underlie the formation of the dominance hierarchy in a group have long been under debate. Models of self-organisation suggest that dominance hierarchies develop through the self-reinforcing effects of winning and losing fights (the so-called winner-loser effect), whereas according to ‘the prior attribute hypothesis’, dominance hierarchies develop from pre-existing individual differences, such as differences in body mass. In the present paper, we investigate the relevance of each of these two theories for the degree of female dominance over males. We do so in a correlative study in which we compare female dominance between groups of 22 species throughout the primate order. In our study female dominance may range from 0 (no female dominance) to 1 (complete female dominance). As regards ‘the prior attribute hypothesis’, we expected a negative correlation between female dominance over males and species-specific sexual dimorphism in body mass. However, to our surprise we found none (using the method of independent contrasts). Instead, we confirm the self-organisation hypothesis: our model based on the winner-loser effect predicts that female dominance over males increases with the percentage of males in the group. We confirm this pattern at several levels in empirical data (among groups of a single species, and between species of the same genus and of different genera). Since the winner-loser effect has been shown to operate in many taxa, including humans, these results may have broad implications. PMID:18628830
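    A toy simulation of the winner-loser effect conveys the self-reinforcing mechanism invoked here; the update rule, step size and initial dominance values are illustrative assumptions and do not reproduce the authors' model.

```python
import random

def fight(dom_a, dom_b, step=0.5):
    """One dyadic interaction with the winner-loser effect: the outcome shifts
    both dominance values, and unexpected outcomes shift them the most."""
    p_a_wins = dom_a / (dom_a + dom_b)
    outcome = 1.0 if random.random() < p_a_wins else 0.0
    delta = step * (outcome - p_a_wins)
    # keep dominance values positive so winning probabilities stay well defined
    return max(dom_a + delta, 0.01), max(dom_b - delta, 0.01)

def simulate(n_females=6, n_males=4, interactions=5000, seed=2):
    """Give males a higher initial value (a body-mass style advantage), run random
    pairwise fights, and report the fraction of males ending below the top female."""
    random.seed(seed)
    dom = [1.0] * n_females + [1.5] * n_males     # females first, then males
    for _ in range(interactions):
        i, j = random.sample(range(len(dom)), 2)
        dom[i], dom[j] = fight(dom[i], dom[j])
    top_female = max(dom[:n_females])
    return sum(d < top_female for d in dom[n_females:]) / n_males

if __name__ == "__main__":
    # vary group composition to explore how it interacts with the winner-loser effect
    for males in (2, 4, 8):
        print(males, "males:", simulate(n_males=males))
```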

  12. Exploration Clinical Decision Support System: Medical Data Architecture

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Shetye, Sandeep; Shaw, Tianna (Editor)

    2016-01-01

    The Exploration Clinical Decision Support (ECDS) System project is intended to enhance the Exploration Medical Capability (ExMC) Element for extended duration, deep-space mission planning in HRP. A major development guideline is the Risk of "Adverse Health Outcomes & Decrements in Performance due to Limitations of In-flight Medical Conditions". ECDS attempts to mitigate that Risk by providing crew-specific health information, actionable insight, crew guidance and advice based on computational algorithmic analysis. The availability of inflight health diagnostic computational methods has been identified as an essential capability for human exploration missions. Inflight electronic health data sources are often heterogeneous, and thus may be isolated or not examined as an aggregate whole. The ECDS System objective provides both a data architecture that collects and manages disparate health data, and an active knowledge system that analyzes health evidence to deliver case-specific advice. A single, cohesive space-ready decision support capability that considers all exploration clinical measurements is not commercially available at present. Hence, this Task is a newly coordinated development effort by which ECDS and its supporting data infrastructure will demonstrate the feasibility of intelligent data mining and predictive modeling as a biomedical diagnostic support mechanism on manned exploration missions. The initial step towards ground and flight demonstrations has been the research and development of both image and clinical text-based computer-aided patient diagnosis. Human anatomical images displaying abnormal/pathological features have been annotated using controlled terminology templates, marked-up, and then stored in compliance with the AIM standard. These images have been filtered and disease characterized based on machine learning of semantic and quantitative feature vectors. The next phase will evaluate disease treatment response via quantitative linear

  13. Architecture and life support systems for a rotating space habitat

    NASA Astrophysics Data System (ADS)

    Misra, Gaurav

    Life Support Systems are critical to sustain human habitation of space over long time periods. As orbiting space habitats become operational in the future, support systems such as atmosphere, food, water etc. will play a very pivotal role in sustaining life. To design a long-duration space habitat, it's important to consider the full gamut of human experience of the environment. Long-term viability depends on much more than just the structural or life support efficiency. A space habitat isn't just a machine; it's a life experience. To be viable, it needs to keep the inhabitants satisfied with their condition. This paper provides conceptual research on several key factors that influence the growth and sustainability of humans in a space habitat. Apart from the main life support system parameters, the architecture (both interior and exterior) of the habitat will play a crucial role in influencing the liveability in the space habitat. In order to ensure the best possible liveability for the inhabitants, a truncated (half cut) torus is proposed as the shape of the habitat. This structure rotating at an optimum rpm will ensure 1g pseudo gravity to the inhabitants. The truncated torus design has several advantages over other proposed shapes such as a cylinder or a sphere. The design provides minimal gravity variation (delta g) in the living area, since its flat outer pole ensures a constant gravity. The design is superior in economy of structural and atmospheric mass. Interior architecture of the habitat addresses the total built environment, drawing from diverse disciplines including physiology, psychology, and sociology. Furthermore, factors such as line of sight, natural sunlight and overhead clearance have been discussed in the interior architecture. Substantial radiation shielding is also required in order to prevent harmful cosmic radiations and solar flares from causing damage to inhabitants. Regolith shielding of 10 tons per meter square is proposed for the

  14. New architectures support for ALMA common software: lessons learned

    NASA Astrophysics Data System (ADS)

    Menay, Camilo E.; Zamora, Gabriel A.; Tobar, Rodrigo J.; Avarias, Jorge A.; Dahl-skog, Kevin R.; von Brand, Horst H.; Chiozzi, Gianluca

    2010-07-01

    ALMA Common Software (ACS) is a distributed control framework based on CORBA that provides communication between distributed pieces of software. Because of its size and complexity it provides its own compilation system, a mix of several technologies. The current ACS compilation process depends on specific tools, compilers, code generation, and a strict dependency model induced by the large number of software components. This document presents a summary of several porting and compatibility attempts at using ACS on platforms other than the officially supported one. Ports of ACS to the Microsoft Windows platform and to the ARM processor architecture were attempted, with varying degrees of success. Also, support for LINUX-PREEMPT (a set of real-time patches for the Linux kernel) using a new design for real-time services was implemented. Some of these efforts were integrated into the ACS building and compilation system, while others were incorporated into its design. Lessons learned in this process are presented, and a general approach is extracted from them.

  15. Self-organisation in protoplanetary discs. Global, non-stratified Hall-MHD simulations

    NASA Astrophysics Data System (ADS)

    Béthune, William; Lesur, Geoffroy; Ferreira, Jonathan

    2016-05-01

    Context. Recent observations have revealed organised structures in protoplanetary discs, such as axisymmetric rings or horseshoe concentrations, evocative of large-scale vortices. These structures are often interpreted as the result of planet-disc interactions. However, these discs are also known to be unstable to the magneto-rotational instability (MRI) which is believed to be one of the dominant angular momentum transport mechanisms in these objects. It is therefore natural to ask whether the MRI itself could produce these structures without invoking planets. Aims: The nonlinear evolution of the MRI is strongly affected by the low ionisation fraction in protoplanetary discs. The Hall effect in particular, which is dominant in dense and weakly ionised parts of these objects, has been shown to spontaneously drive self-organising flows in local, shearing box simulations. Here, we investigate the behaviour of global MRI-unstable disc models dominated by the Hall effect and characterise their dynamics. Methods: We validated our implementation of the Hall effect in the PLUTO code with predictions from a spectral method in cylindrical geometry. We then performed 3D unstratified Hall-MHD simulations of Keplerian discs for a broad range of Hall, Ohmic, and ambipolar Elsasser numbers. Results: We confirm the transition from a turbulent to an organised state as the intensity of the Hall effect is increased. We observe the formation of zonal flows, their number depending on the available magnetic flux and on the intensity of the Hall effect. For intermediate Hall intensity, the flow self-organises into long-lived magnetised vortices. Neither the addition of a toroidal field nor Ohmic or ambipolar diffusion change this picture drastically in the range of parameters we have explored. Conclusions: Self-organisation by the Hall effect is a robust phenomenon in global non-stratified simulations. It is able to quench turbulent transport and spontaneously produce axisymmetric

  16. Descriptive Characteristics of Surface Water Quality in Hong Kong by a Self-Organising Map

    PubMed Central

    An, Yan; Zou, Zhihong; Li, Ranran

    2016-01-01

    In this study, principal component analysis (PCA) and a self-organising map (SOM) were used to analyse a complex dataset obtained from the river water monitoring stations in the Tolo Harbor and Channel Water Control Zone (Hong Kong), covering the period of 2009–2011. PCA was initially applied to identify the principal components (PCs) among the nonlinear and complex surface water quality parameters. SOM followed PCA, and was implemented to analyze the complex relationships and behaviors of the parameters. The results reveal that PCA reduced the multidimensional parameters to four significant PCs which are combinations of the original ones. The positive and inverse relationships of the parameters were shown explicitly by pattern analysis in the component planes. It was found that PCA and SOM are efficient tools to capture and analyze the behavior of multivariable, complex, and nonlinear related surface water quality data. PMID:26761018

  17. Descriptive Characteristics of Surface Water Quality in Hong Kong by a Self-Organising Map.

    PubMed

    An, Yan; Zou, Zhihong; Li, Ranran

    2016-01-08

    In this study, principal component analysis (PCA) and a self-organising map (SOM) were used to analyse a complex dataset obtained from the river water monitoring stations in the Tolo Harbor and Channel Water Control Zone (Hong Kong), covering the period of 2009-2011. PCA was initially applied to identify the principal components (PCs) among the nonlinear and complex surface water quality parameters. SOM followed PCA, and was implemented to analyze the complex relationships and behaviors of the parameters. The results reveal that PCA reduced the multidimensional parameters to four significant PCs which are combinations of the original ones. The positive and inverse relationships of the parameters were shown explicitly by pattern analysis in the component planes. It was found that PCA and SOM are efficient tools to capture and analyze the behavior of multivariable, complex, and nonlinear related surface water quality data.
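    The PCA stage of such a pipeline can be sketched as below using scikit-learn; the variance threshold and the random stand-in for the monitoring data are assumptions, and the retained component scores would subsequently be passed to a self-organising map.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def significant_components(X, variance_threshold=0.8):
    """Standardise the monitoring data and keep enough principal components to
    explain the requested share of variance; returns the PCA model and the scores."""
    Xs = StandardScaler().fit_transform(X)
    pca = PCA().fit(Xs)
    cumulative = np.cumsum(pca.explained_variance_ratio_)
    n = int(np.searchsorted(cumulative, variance_threshold)) + 1
    return pca, pca.transform(Xs)[:, :n]

if __name__ == "__main__":
    # stand-in for multi-station, multi-parameter surface water quality measurements
    X = np.random.default_rng(0).normal(size=(300, 15))
    pca, scores = significant_components(X)
    print("retained components:", scores.shape[1])
    # these PC scores would then be the training inputs for a self-organising map
```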

  18. Tailoring broadband light trapping of GaAs and Si substrates by self-organised nanopatterning

    SciTech Connect

    Martella, C.; Chiappe, D.; Mennucci, C.; Buatier de Mongeot, F.

    2014-05-21

    We report on the formation of high aspect ratio anisotropic nanopatterns on crystalline GaAs (100) and Si (100) substrates exploiting defocused Ion Beam Sputtering assisted by a sacrificial self-organised Au stencil mask. The tailored optical properties of the substrates are characterised in terms of total reflectivity and haze by means of integrating sphere measurements as a function of the morphological modification at increasing ion fluence. Refractive index grading from sub-wavelength surface features induces polarisation dependent anti-reflection behaviour in the visible-near infrared (VIS-NIR) range, while light scattering at off-specular angles from larger structures leads to very high values of the haze functions in reflection. The results, obtained for an important class of technologically relevant materials, are appealing in view of photovoltaic and photonic applications aiming at photon harvesting in ultrathin crystalline solar cells.

  19. Tailoring broadband light trapping of GaAs and Si substrates by self-organised nanopatterning

    NASA Astrophysics Data System (ADS)

    Martella, C.; Chiappe, D.; Mennucci, C.; de Mongeot, F. Buatier

    2014-05-01

    We report on the formation of high aspect ratio anisotropic nanopatterns on crystalline GaAs (100) and Si (100) substrates exploiting defocused Ion Beam Sputtering assisted by a sacrificial self-organised Au stencil mask. The tailored optical properties of the substrates are characterised in terms of total reflectivity and haze by means of integrating sphere measurements as a function of the morphological modification at increasing ion fluence. Refractive index grading from sub-wavelength surface features induces polarisation dependent anti-reflection behaviour in the visible-near infrared (VIS-NIR) range, while light scattering at off-specular angles from larger structures leads to very high values of the haze functions in reflection. The results, obtained for an important class of technologically relevant materials, are appealing in view of photovoltaic and photonic applications aiming at photon harvesting in ultrathin crystalline solar cells.

  20. On multidimensional scaling and the embedding of self-organising maps.

    PubMed

    Yin, Hujun

    2008-01-01

    The self-organising map (SOM) and its variant, visualisation induced SOM (ViSOM), have been known to yield similar results to multidimensional scaling (MDS). However, the exact connection has not been established. In this paper, a review on the SOM and its cost function and topological measures is provided first. We then examine the exact scaling effect of the SOM and ViSOM from their objective functions. The SOM is shown to produce a qualitative, nonmetric scaling, while the local distance-preserving ViSOM produces a quantitative or metric scaling. Their relationship with the principal manifold is also discussed. The SOM-based methods not only produce topological or metric scaling but also provide a principal manifold. Furthermore a growing ViSOM is proposed to aid the adaptive embedding of highly nonlinear manifolds. Examples and comparisons with other embedding methods such as Isomap and local linear embedding are also presented.

  1. Provenance determination of Vinica terra cotta icons using self-organising maps.

    PubMed

    Tanevska, Vinka; Kuzmanovski, Igor; Grupce, Orhideja

    2007-07-01

    In the Vinica Fortress, Republic of Macedonia, 50 undamaged terra cotta icons and 100 fragments, all dated to the 6th-7th century, were found. In order to determine the provenance of these unique terra cotta icons, the mass fractions of 19 different chemical elements were previously determined in ten fragments of the terra cotta icons and thirty-three samples of clay from eight different sites in the region. Owing to the dimensionality and complexity of the experimental data, the archaeometric results were treated with self-organising maps (SOM). The results obtained using SOM were compared with those obtained using principal component analysis. Both chemometric methods revealed that the Vinica terra cotta icons were made from clay from Grncarka, 2.5 km south-east of the Vinica Fortress.

  2. Self-Organisation, Thermotropic and Lyotropic Properties of Glycolipids Related to their Biological Implications

    PubMed Central

    Garidel, Patrick; Kaconis, Yani; Heinbockel, Lena; Wulf, Matthias; Gerber, Sven; Munk, Ariane; Vill, Volkmar; Brandenburg, Klaus

    2015-01-01

    Glycolipids are amphiphilic molecules which bear an oligo- or polysaccharide as the hydrophilic head group and hydrocarbon chains of varying number and length as the hydrophobic part. They play an important role in life science as well as in material science. Their biological and physiological functions are quite diverse, ranging from mediators of cell-cell recognition processes to constituents of membrane domains or membrane-forming units. Glycolipids form an exceptional class of liquid-crystal mesophases because their self-organisation obeys more complex rules than those of classical monophilic liquid crystals. Like other amphiphiles, glycolipids form supra-molecular structures that are driven by their chemical structure; however, the details of this process are still poorly understood. Based on the synthesis of specific glycolipids with a clearly defined chemical structure, e.g., type and length of the sugar head group, acyl chain linkage, substitution pattern, hydrocarbon chain lengths and saturation, combined with a profound physico-chemical characterisation of the formed mesophases, the principles of the organisation of the glycolipids into different aggregate structures can be obtained. The importance of the observed and formed phases and their properties is discussed with respect to their biological and physiological relevance. The presented data briefly describe the strategies used for the synthesis of the glycolipids employed. The main focus, however, lies on the thermotropic as well as lyotropic characterisation of the self-organised structures and formed phases, based on physico-chemical and biophysical methods, linked to their potential biological implications and relevance. PMID:26464591

  3. Investigation of support vector machine for the detection of architectural distortion in mammographic images

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Shao, J.; Ruiz, V.

    2005-01-01

    This paper investigates the detection of architectural distortion in mammographic images using a support vector machine (SVM). The Hausdorff dimension is used to characterise the texture features of mammographic images. The support vector machine, a learning machine based on statistical learning theory, is trained through supervised learning to detect architectural distortion. Compared to Radial Basis Function neural networks, the SVM produced more accurate classification results in distinguishing architectural distortion from normal breast parenchyma.
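    A minimal sketch of the classification step is given below, assuming precomputed per-region texture features (the Hausdorff-dimension estimation itself is not reproduced); the feature dimensionality, kernel choice and toy data are placeholders.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def classify_regions(features, labels):
    """Train an SVM on per-region texture features to separate architectural
    distortion from normal parenchyma; returns the fitted model and CV accuracy."""
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    accuracy = cross_val_score(clf, features, labels, cv=5).mean()
    return clf.fit(features, labels), accuracy

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # placeholder feature vectors standing in for fractal-texture measures per region
    normal = rng.normal(loc=0.0, size=(60, 3))
    distorted = rng.normal(loc=1.0, size=(60, 3))
    X = np.vstack([normal, distorted])
    y = np.array([0] * 60 + [1] * 60)
    model, acc = classify_regions(X, y)
    print("cross-validated accuracy:", round(acc, 3))
```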

  4. Self-Organising Navigational Support in Lifelong Learning: How Predecessors Can Lead the Way

    ERIC Educational Resources Information Center

    Janssen, Jose; Tattersall, Colin; Waterink, Wim; van den Berg, Bert; van Es, Rene; Bolman, Catherine; Koper, Rob

    2007-01-01

    Increased flexibility and modularisation in higher education complicates the process of learners finding their way through the offerings of higher education institutions. In lifelong learning, where learning opportunities are diverse and reach beyond institutional boundaries, it becomes even more complex to decide on a learning path. However,…

  5. A Proposed Clinical Decision Support Architecture Capable of Supporting Whole Genome Sequence Information

    PubMed Central

    Welch, Brandon M.; Rodriguez Loya, Salvador; Eilbeck, Karen; Kawamoto, Kensaku

    2014-01-01

    Whole genome sequence (WGS) information may soon be widely available to help clinicians personalize the care and treatment of patients. However, considerable barriers exist, which may hinder the effective utilization of WGS information in a routine clinical care setting. Clinical decision support (CDS) offers a potential solution to overcome such barriers and to facilitate the effective use of WGS information in the clinic. However, genomic information is complex and will require significant considerations when developing CDS capabilities. As such, this manuscript lays out a conceptual framework for a CDS architecture designed to deliver WGS-guided CDS within the clinical workflow. To handle the complexity and breadth of WGS information, the proposed CDS framework leverages service-oriented capabilities and orchestrates the interaction of several independently-managed components. These independently-managed components include the genome variant knowledge base, the genome database, the CDS knowledge base, a CDS controller and the electronic health record (EHR). A key design feature is that genome data can be stored separately from the EHR. This paper describes in detail: (1) each component of the architecture; (2) the interaction of the components; and (3) how the architecture attempts to overcome the challenges associated with WGS information. We believe that service-oriented CDS capabilities will be essential to using WGS information for personalized medicine. PMID:25411644

  6. Network architectures in support of digital subscriber line (DSL) deployment

    NASA Astrophysics Data System (ADS)

    Peuch, Bruno

    1998-09-01

    DSL technology enables very high bandwidth transmission in a point-to-point fashion from a customer's premises to a central office (CO), wiring center, or other logical point of traffic aggregation. Unlike many technologies that enable broadband Internet access, DSL technology does not dictate a specific architecture to be deployed either at the customer's premises or in the service/access provider's network. In fact, DSL technology can be used in conjunction with a variety of network architectures. While DSL is agnostic with regard to higher-layer protocols, there are still several critical protocol-specific issues that need to be addressed when deploying DSL as a solution for IP (Internet/intranet) access. This paper addresses these issues and presents a range of network architectures that incorporate DSL technology, focusing only on those architectures that enable IP access. These architectures are divided into three categories: Traditional Dialled Model (TDM), frame-based (Frame Relay/Ethernet), and cell-based (ATM).

  7. An Information Architecture To Support the Visualization of Personal Histories.

    ERIC Educational Resources Information Center

    Plaisant, Catherine; Shneiderman, Ben; Mushlin, Rich

    1998-01-01

    Proposes an information architecture for personal-history data and describes how the data model can be extended to a runtime model for a compact visualization using graphical timelines. The model groups personal-history events into aggregates that are contained in facets, crosslinks are made, and data attributes are mapped. (Author/LRW)

  8. Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA

    NASA Astrophysics Data System (ADS)

    Chandra, Abhijit; Chattopadhyay, Sudipta

    2015-01-01

    In this communication, we propose a novel design strategy for a multiplier-less low-pass finite impulse response (FIR) filter with the aid of a recent evolutionary optimisation technique known as the self-organising random immigrants genetic algorithm. The individual impulse response coefficients of the proposed filter have been encoded as sums of signed powers-of-two. During the formulation of the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete-coefficient FIR filter have been considered. The role of the crossover probability of the optimisation technique has been evaluated with respect to the overall performance of the proposed strategy. For this purpose, the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples with different specifications have been taken into account. In order to substantiate the efficiency of our proposed structure, a number of state-of-the-art design strategies for multiplier-less FIR filters have also been included in this article for the purpose of comparison. Critical analysis of the results unambiguously establishes the usefulness of our proposed approach for the hardware-efficient design of digital filters.
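    The signed powers-of-two encoding and a combined response/hardware cost can be sketched as follows; the greedy quantiser, the term budget and the weighting are illustrative assumptions, and the genetic algorithm itself is omitted.

```python
import numpy as np

def to_spt(coef, n_terms=3, max_shift=8):
    """Greedily approximate a coefficient as a sum of signed powers of two;
    each retained term costs only a shift-and-add in hardware."""
    terms, residual = [], float(coef)
    for _ in range(n_terms):
        if residual == 0.0:
            break
        shift = int(round(-np.log2(abs(residual))))
        shift = max(0, min(shift, max_shift))
        term = np.sign(residual) * 2.0 ** (-shift)
        terms.append(term)
        residual -= term
    return terms

def cost(ideal, quantised, weight_hw=0.01):
    """Cost of a candidate filter: peak frequency-response deviation plus a
    hardware term counting the nonzero powers-of-two terms (adders)."""
    H_ideal = np.abs(np.fft.rfft(ideal, 512))
    H_quant = np.abs(np.fft.rfft(quantised, 512))
    adders = sum(len(to_spt(c)) for c in quantised)
    return float(np.max(np.abs(H_ideal - H_quant))) + weight_hw * adders

if __name__ == "__main__":
    ideal = np.array([0.05, 0.25, 0.40, 0.25, 0.05])        # toy low-pass taps
    quantised = np.array([sum(to_spt(c)) for c in ideal])   # SPT-quantised taps
    print("quantised taps:", quantised)
    print("cost:", round(cost(ideal, quantised), 4))
```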

  9. The metaphor-gestalt synergy underlying the self-organisation of perception as a semiotic process.

    PubMed

    Rail, David

    2013-04-01

    Recently the basis of concept and language formation has been redefined by the proposal that they both stem from perception and embodiment. The experiential revolution has led to a far more integrated and dynamic understanding of perception as a semiotic system. The emergence of meaning in the perceptual process stems from the interaction between two key mechanisms. The first is the generation of schemata through recurrent sensorimotor activity (SM), which underlies category and language formation (L). The second is the interaction between metaphor (M) and gestalt mechanisms (G), which generates invariant mappings beyond the SM domain that both conserve and diversify our understanding and meaning potential. We propose an important advance in our understanding of perception as a semiotic system through exploring the effect of self-organising to criticality, where hierarchical behaviour becomes widely integrated through 1/f processes and isomorphisms. Our proposal leads to several important implications. First, that SM and L form a functional isomorphism depicted as SM <=> L. We contend that SM <=> L is emergent, corresponding to the phenomenal self. Second, meaning structures the isomorphism SM <=> L through the synergy between M and G (M-G). M-G synergy is based on a combination of structuring and imagination. We contend that the interaction between M-G and SM <=> L functions as a macro-micro comutation that governs perception as semiosis. We discuss how our model relates to current research in fractal time and verb formation.

  10. Emergence and Dissolvence in the Self-organisation of Complex Systems

    NASA Astrophysics Data System (ADS)

    Testa, Bernard; Kier, Lemont B.

    2000-03-01

    The formation of complex systems is accompanied by the emergence of properties that are non-existent in the components. But what of the properties and behaviour of such components caught up in the formation of a system of a higher level of complexity? In this essay, we use a large variety of examples, from molecules to organisms and beyond, to show that systems merging into a complex system of higher order experience constraints with a partial loss of choice, options and independence. In other words, emergence in a complex system often implies reduction in the number of probable states of its components, a phenomenon we term dissolvence. This is seen in atoms when they merge to form molecules, in biomolecules when they form macromolecules such as proteins, and in macromolecules when they form aggregates such as molecular machines or membranes. At higher biological levels, dissolvence occurs for example in components of cells (e.g. organelles), tissues (cells), organs (tissues), organisms (organs) and societies (individuals). Far from being a destruction, dissolvence is understood here as a creative process in which information is generated to fuel the process of self-organisation of complex systems, allowing them to appear and evolve to higher states of organisation and emergence. Questions are raised about the relationship of dissolvence and adaptability; the interrelation with top-down causation; the reversibility of dissolvence; and the connection between dissolvence and anticipation.

  11. Using Self-Organising Maps (SOMs) to assess synchronies: an application to historical eucalypt flowering records.

    PubMed

    Hudson, Irene L; Keatley, Marie R; Lee, Shalem Y

    2011-11-01

    Self-Organising Map (SOM) clustering methods applied to the monthly and seasonally averaged flowering intensity records of eight Eucalypt species are shown to successfully quantify, visualise and model the synchronisation of multivariate time series. The SOM algorithm converts complex, nonlinear relationships between high-dimensional data into simple networks and a map based on the most likely patterns in the multiplicity of time series on which it is trained. Monthly- and seasonal-based SOMs identified three synchronous species groups (clusters): E. camaldulensis, E. melliodora, E. polyanthemos; E. goniocalyx, E. microcarpa, E. macrorhyncha; and E. leucoxylon, E. tricarpa. The main factor in synchronisation (clustering) appears to be the season in which flowering commences. SOMs also identified the asynchronous relationships among the eight species. Hence, the likelihood of the production, or not, of hybrids between sympatric species is also identified. The SOM pattern-based correlation values mirror earlier synchrony statistics gleaned from Moran correlations obtained from the raw flowering records. Synchronisation of flowering is shown to be a complex mechanism that incorporates all the flowering characteristics: flowering duration, timing of peak flowering, timing of the start and finish of flowering, as well as possibly specific climate drivers for flowering. SOMs can accommodate all this complexity, and we advocate their use by phenologists and ecologists as a powerful, accessible and interpretable tool for the visualisation and clustering of multivariate time series and for synchrony studies.

  12. Self-organising continuous attractor networks with multiple activity packets, and the representation of space.

    PubMed

    Stringer, S M; Rolls, E T; Trappenberg, T P

    2004-01-01

    'Continuous attractor' neural networks can maintain a localised packet of neuronal activity representing the current state of an agent in a continuous space, without external sensory input. In applications such as the representation of head direction or location in the environment, only one packet of activity is needed. For some spatial computations, however, a number of different locations, each with its own features, must be held in memory. We extend previous approaches to continuous attractor networks (in which one packet of activity is maintained active) by showing that a single continuous attractor network can maintain multiple packets of activity simultaneously, if each packet is in a different state space or map. We also show how such a network could, through learning, self-organise to enable the packets in each space to be moved continuously in that space by idiothetic (motion) inputs. We show how such multi-packet continuous attractor networks could be used to maintain different types of feature (such as form vs colour) simultaneously active at the correct location in a spatial representation. We also show how high-order synapses can improve the performance of these networks, and how the location of a packet could be read out by motor networks. The multiple-packet continuous attractor networks described here may be used for spatial representations in brain areas such as the parietal cortex and hippocampus.
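    A single-packet, one-dimensional continuous attractor can be sketched as below to show how local excitation and global inhibition sustain an activity packet after the input is removed; the weight profile, normalisation step and parameters are assumptions, and the multi-packet, multi-map and idiothetic-update machinery of the paper is not included.

```python
import numpy as np

def ring_weights(n, j_exc=6.0, j_inh=1.0, sigma=0.1):
    """Translation-invariant local excitation minus global inhibition on a ring;
    this connectivity can sustain a localised packet of activity without input."""
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, 1.0 - d)                      # distance around the ring
    return j_exc * np.exp(-d ** 2 / (2 * sigma ** 2)) - j_inh

def run(n=100, steps=300, cue_centre=30):
    """Cue one location briefly, remove the input, and check the packet persists."""
    W = ring_weights(n)
    r = np.zeros(n)
    cue = np.exp(-((np.arange(n) - cue_centre) / 5.0) ** 2)
    for t in range(steps):
        inp = cue if t < 50 else 0.0                # external input only at the start
        r = np.maximum(W @ r + inp, 0.0)            # rectified recurrent drive
        if r.max() > 0:
            r /= r.max()                            # crude normalisation keeps rates bounded
    return r

if __name__ == "__main__":
    rates = run()
    print("packet centre after cue removal:", int(np.argmax(rates)))  # stays near unit 30
```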

  13. Self-organising discovery, recognition and prediction of haemodynamic patterns in the intensive care unit.

    PubMed

    Spencer, R G; Lessard, C S; Davila, F; Etter, B

    1997-03-01

    To care properly for critically ill patients in the intensive care unit (ICU), clinicians must be aware of haemodynamic patterns. In a typical ICU, a variety of physiological measurements are made continuously and intermittently in an attempt to provide clinicians with the most accurate and precise data needed for recognising such patterns. However, the data are disjointed, yielding little information beyond that provided by instantaneous high/low limit checking. Although instantaneous limit checking is useful for determining immediate dangers, it does not provide much information about temporal patterns. As a result, the clinician is left to sift manually through an excess of data in the interest of generating information. In the study, an arrangement of self-organising artificial neural networks is used to automate the discovery, recognition and prediction of haemodynamic patterns in real time. It is shown that the network is capable of recognising the same haemodynamic patterns recognised by an expert system, DYNASCENE, without being explicitly programmed to do so. Consequently, the system is also capable of discovering new haemodynamic patterns. Results from real clinical data are presented.

  14. Self-Organisation and Intermittent Coherent Oscillations in the EXTRAP T2 Reversed Field Pinch

    NASA Astrophysics Data System (ADS)

    Cecconello, M.; Malmberg, J.-A.; Sallander, E.; Drake, J. R.

    Many reversed-field pinch (RFP) experiments exhibit a coherent oscillatory behaviour that is characteristic of discrete dynamo events and is associated with intermittent current profile self-organisation phenomena. However, in the vast majority of the discharges in the resistive shell RFP experiment EXTRAP T2, the dynamo activity does not show global, coherent oscillatory behaviour. The internally resonant tearing modes are phase-aligned and wall-locked resulting in a large localised magnetic perturbation. Equilibrium and plasma parameters have a level of high frequency fluctuations but the average values are quasi-steady. For some discharges, however, the equilibrium parameters exhibit the oscillatory behaviour characteristic of the discrete dynamo events. For these discharges, the trend observed in the tearing mode spectra, associated with the onset of the discrete relaxation event behaviour, is a relative higher amplitude of m = 0 mode activity and relative lower amplitude of the m = 1 mode activity compared with their average values. Global plasma parameters and model profile calculations for sample discharges representing the two types of relaxation dynamics are presented.

  15. The metaphor-gestalt synergy underlying the self-organisation of perception as a semiotic process.

    PubMed

    Rail, David

    2013-04-01

    Recently the basis of concept and language formation has been redefined by the proposal that they both stem from perception and embodiment. The experiential revolution has led to a far more integrated and dynamic understanding of perception as a semiotic system. The emergence of meaning in the perceptual process stems from the interaction between two key mechanisms. The first is the generation of schemata through recurrent sensorimotor activity (SM), which underlies category and language formation (L). The second is the interaction between metaphor (M) and gestalt mechanisms (G), which generates invariant mappings beyond the SM domain that both conserve and diversify our understanding and meaning potential. We propose an important advance in our understanding of perception as a semiotic system through exploring the effect of self-organising to criticality, where hierarchical behaviour becomes widely integrated through 1/f processes and isomorphisms. Our proposal leads to several important implications. First, that SM and L form a functional isomorphism depicted as SM <=> L. We contend that SM <=> L is emergent, corresponding to the phenomenal self. Second, meaning structures the isomorphism SM <=> L through the synergy between M and G (M-G). M-G synergy is based on a combination of structuring and imagination. We contend that the interaction between M-G and SM <=> L functions as a macro-micro comutation that governs perception as semiosis. We discuss how our model relates to current research in fractal time and verb formation. PMID:23517606

  16. Space Network IP Services (SNIS): An Architecture for Supporting Low Earth Orbiting IP Satellite Missions

    NASA Technical Reports Server (NTRS)

    Israel, David J.

    2005-01-01

    The NASA Space Network (SN) supports a variety of missions using the Tracking and Data Relay Satellite System (TDRSS), which includes ground stations in White Sands, New Mexico and Guam. A Space Network IP Services (SNIS) architecture is being developed to support future users with requirements for end-to-end Internet Protocol (IP) communications. This architecture will support all IP protocols, including Mobile IP, over TDRSS Single Access, Multiple Access, and Demand Access Radio Frequency (RF) links. This paper will describe this architecture and how it can enable Low Earth Orbiting IP satellite missions.

  17. The influence of receptor-mediated interactions on reaction-diffusion mechanisms of cellular self-organisation.

    PubMed

    Klika, Václav; Baker, Ruth E; Headon, Denis; Gaffney, Eamonn A

    2012-04-01

    Understanding the mechanisms governing and regulating self-organisation in the developing embryo is a key challenge that has puzzled and fascinated scientists for decades. Since its conception in 1952 the Turing model has been a paradigm for pattern formation, motivating numerous theoretical and experimental studies, though its verification at the molecular level in biological systems has remained elusive. In this work, we consider the influence of receptor-mediated dynamics within the framework of Turing models, showing how non-diffusing species impact the conditions for the emergence of self-organisation. We illustrate our results within the framework of hair follicle pre-patterning, showing how receptor interaction structures can be constrained by the requirement for patterning, without the need for detailed knowledge of the network dynamics. Finally, in the light of our results, we discuss the ability of such systems to pattern outside the classical limits of the Turing model, and the inherent dangers involved in model reduction. PMID:22072186
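    For orientation, a generic two-species check for diffusion-driven (Turing) instability is sketched below; the Jacobian entries and diffusion coefficients are placeholder values, and the sketch does not include the non-diffusing, receptor-bound species that are the subject of the paper.

```python
import numpy as np

def dispersion(k2, J, d=(1.0, 40.0)):
    """Largest real part of the eigenvalues of the linearised reaction-diffusion
    operator J - diag(d) * k^2 at spatial wavenumber squared k2."""
    A = np.array(J) - np.diag(d) * k2
    return np.max(np.real(np.linalg.eigvals(A)))

def turing_unstable(J, d=(1.0, 40.0)):
    """Diffusion-driven instability: stable without diffusion (k = 0) but some
    nonzero wavenumber grows. Returns the fastest-growing wavenumber, if any."""
    k2 = np.linspace(0.0, 5.0, 2000)
    growth = np.array([dispersion(x, J, d) for x in k2])
    if growth[0] < 0 and growth.max() > 0:
        return np.sqrt(k2[np.argmax(growth)])
    return None

if __name__ == "__main__":
    # Jacobian of a generic activator-inhibitor kinetics at its steady state
    # (self-activation, cross-inhibition / cross-activation, self-decay)
    J = [[0.5, -1.0],
         [1.0, -1.5]]
    print("fastest-growing wavenumber:", turing_unstable(J))
```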

  18. SANDS: A Service-Oriented Architecture for Clinical Decision Support in a National Health Information Network

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

    In this paper we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. PMID:18434256

  19. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background Combining different sources of knowledge to build improved structure-activity relationship models is not easy, owing to the variety of knowledge formats and the absence of a common framework for interoperating between learning techniques. Most current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility of directly combining these sources at the knowledge level, with the aim of harvesting potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge within a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge, and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems, along with an illustrative application to the prediction of mutagenicity. Conclusion It is possible to represent knowledge in the unified form of a hypothesis network, allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high-level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID

  20. The use of self-organising maps for anomalous behaviour detection in a digital investigation.

    PubMed

    Fei, B K L; Eloff, J H P; Olivier, M S; Venter, H S

    2006-10-16

    The dramatic increase in crime relating to the Internet and computers has caused a growing need for digital forensics. Digital forensic tools have been developed to assist investigators in conducting a proper investigation into digital crimes. In general, the bulk of the digital forensic tools available on the market permit investigators to analyse data that has been gathered from a computer system. However, current state-of-the-art digital forensic tools simply cannot handle large volumes of data in an efficient manner. With the advent of the Internet, many employees have been given access to new and more interesting possibilities via their desktop. Consequently, excessive Internet usage for non-job purposes and even blatant misuse of the Internet have become a problem in many organisations. Since storage media are steadily growing in size, the process of analysing multiple computer systems during a digital investigation can easily consume an enormous amount of time. Identifying a single suspicious computer from a set of candidates can therefore reduce human processing time and monetary costs involved in gathering evidence. The focus of this paper is to demonstrate how, in a digital investigation, digital forensic tools and the self-organising map (SOM)--an unsupervised neural network model--can aid investigators to determine anomalous behaviours (or activities) among employees (or computer systems) in a far more efficient manner. By analysing the different SOMs (one for each computer system), anomalous behaviours are identified and investigators are assisted to conduct the analysis more efficiently. The paper will demonstrate how the easy visualisation of the SOM enhances the ability of the investigators to interpret and explore the data generated by digital forensic tools so as to determine anomalous behaviours.

  1. Summarising climate and air quality (ozone) data on self-organising maps: a Sydney case study.

    PubMed

    Jiang, Ningbo; Betts, Alan; Riley, Matt

    2016-02-01

    This paper explores the classification and visualisation utility of the self-organising map (SOM) method in the context of New South Wales (NSW), Australia, using gridded NCEP/NCAR geopotential height reanalysis for east Australia, together with multi-site meteorological and air quality data for Sydney from the NSW Office of Environment and Heritage Air Quality Monitoring Network. A twice-daily synoptic classification has been derived for east Australia for the period of 1958-2012. The classification has not only reproduced the typical synoptic patterns previously identified in the literature but also provided an opportunity to visualise the subtle, non-linear change in the eastward-migrating synoptic systems influencing NSW (including Sydney). The summarisation of long-term, multi-site air quality/meteorological data from the Sydney basin on the SOM plane has identified a set of typical air pollution/meteorological spatial patterns in the region. Importantly, the examination of these patterns in relation to synoptic weather types has provided important visual insights into how local and synoptic meteorological conditions interact with each other and affect the variability of air quality in tandem. The study illustrates that while synoptic circulation types are influential, the within-type variability in mesoscale flows plays a critical role in determining local ozone levels in Sydney. These results indicate that the SOM can be a useful tool for assessing the impact of weather and climatic conditions on air quality in the regional airshed. This study further promotes the use of the SOM method in environmental research.

  2. Ecological hierarchies and self-organisation - Pattern analysis, modelling and process integration across scales

    USGS Publications Warehouse

    Reuter, H.; Jopp, F.; Blanco-Moreno, J. M.; Damgaard, C.; Matsinos, Y.; DeAngelis, D.L.

    2010-01-01

    A continuing discussion in applied and theoretical ecology focuses on the relationship of different organisational levels and on how ecological systems interact across scales. We address principal approaches to cope with complex across-level issues in ecology by applying elements of hierarchy theory and the theory of complex adaptive systems. A top-down approach, often characterised by the use of statistical techniques, can be applied to analyse large-scale dynamics and identify constraints exerted on lower levels. Current developments are illustrated with examples from the analysis of within-community spatial patterns and large-scale vegetation patterns. A bottom-up approach allows one to elucidate how interactions of individuals shape dynamics at higher levels in a self-organisation process; e.g., population development and community composition. This may be facilitated by various modelling tools, which provide the distinction between focal levels and resulting properties. For instance, resilience in grassland communities has been analysed with a cellular automaton approach, and the driving forces in rodent population oscillations have been identified with an agent-based model. Both modelling tools illustrate the principles of analysing higher level processes by representing the interactions of basic components. The focus of most ecological investigations on either top-down or bottom-up approaches may not be appropriate, if strong cross-scale relationships predominate. Here, we propose an 'across-scale-approach', closely interweaving the inherent potentials of both approaches. This combination of analytical and synthesising approaches will enable ecologists to establish a more coherent access to cross-level interactions in ecological systems. © 2010 Gesellschaft für Ökologie.

  3. A heavy rainfall sounding climatology over Gauteng, South Africa, using self-organising maps

    NASA Astrophysics Data System (ADS)

    Dyson, Liesl L.

    2015-12-01

    The daily weather at a particular place is largely influenced by the synoptic circulation and thermodynamic profile of the atmosphere. Heavy rainfall occurs from a particular subset of synoptic and thermodynamic states. Baseline climatologies provide objective information on heavy rainfall-producing circulation patterns and thermodynamic variables, and they identify the climatologically large or extreme values associated with heavy rainfall. The aim of this research is to provide a heavy rainfall sounding climatology in austral summer over Gauteng, South Africa, using self-organising maps (SOMs). The results show that the SOM captures the intra-seasonal variability of heavy rainfall soundings by clearly distinguishing between the atmospheric conditions on early summer (October-December) and late summer (January-March) heavy rainfall days. Heavy early summer rainfall is associated with large vertical wind shear and conditional instability, with the atmosphere drier and cooler than when heavy rainfall occurs in late summer. Heavy late summer rainfall is associated with higher convective instability and small vertical wind shear. The SOM climatology shows that some heavy rainfall days occur in both early and late summer when large-scale synoptic weather systems cause strong near-surface moisture flux and large values of wind shear. On these days, both the conditional and convective instability of the atmosphere are low and heavy rainfall results from the strong synoptic forcing. In contrast, heavy rainfall also occurs on days when the synoptic circulation is not very favourable and the air is relatively dry, but the atmosphere is unstable with warm surface conditions, so that heavy rainfall develops from locally favourable conditions. The SOM climatology provides guidance on critical values of sounding-derived parameters for all these scenarios.

  4. Self-Organised Criticality at the Onset of Aeolian Sediment Transport.

    NASA Astrophysics Data System (ADS)

    McMenamin, R.; Cassidy, R.; McCloskey, J.

    2002-12-01

    Despite decades of rigorous investigation, reliable prediction of aeolian sediment transport rates remains impossible. Transport rate formulae are based on the governing principle of steady state equilibrium such that wind velocity produces a linear response in sediment flux. Field experiments, however, demonstrate a highly non-linear response and considerable deviation exists between observed and predicted transport rates. The limited predictive ability of the transport rate equations is largely attributed to crude measurement techniques that characterise wind velocity and sediment flux as time averaged values on the order of minutes, effectively concealing a time scale on the order of seconds in which the equilibrium condition is established. All attempts to resolve a characteristic time scale persistently reveal complexity. From the study of multi-component systems, it is now becoming apparent that such non-linearity is a pervasive attribute of system dynamics. Wind tunnel experiments were conducted to examine the nature of steady state sand transport under uniform forcing. Images of grains traversing an illuminated plane in the tunnel were acquired by video camera at a rate of 10 frames per second. A suite of image analysis techniques were then applied to quantify the volume of sand recorded in sequences of thousands of images and a transport time series generated. Wind velocity measurements were also acquired simultaneously with transport measurements. In contradiction to the steady state hypothesis, sand transport events obeyed a clear power-law scaling (number - size) over about 2.5 orders of magnitude, consistent with the dynamics of self-organised critical systems and suggesting that the dynamics of aeolian sediment transport are similar to those of avalanches observed in a sand pile. Such systems are inherently unpredictable - a fact which may contribute to our understanding of the intractability of the aeolian transport problem.
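    Estimating the exponent of such a number-size distribution is straightforward with the continuous maximum-likelihood (Hill-type) estimator; the sketch below uses synthetic event sizes, so the generating exponent and threshold are assumptions rather than the wind-tunnel data.

```python
import numpy as np

def power_law_exponent(sizes, s_min=1.0):
    """Maximum-likelihood estimate of the exponent alpha of p(s) ~ s**(-alpha)
    for event sizes above s_min (continuous Hill/MLE estimator)."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + len(s) / np.sum(np.log(s / s_min))

if __name__ == "__main__":
    # synthetic transport "events" drawn from a power law with alpha = 2.2,
    # standing in for the grain-volume-per-frame series from the wind tunnel
    rng = np.random.default_rng(0)
    alpha_true = 2.2
    u = rng.random(5000)
    sizes = (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # inverse-CDF sampling above s_min = 1
    print("estimated exponent:", round(power_law_exponent(sizes), 2))
```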

  5. Rapid self-organised initiation of ad hoc sensor networks close above the percolation threshold

    NASA Astrophysics Data System (ADS)

    Korsnes, Reinert

    2010-07-01

    This work shows potentials for rapid self-organisation of sensor networks where nodes collaborate to relay messages to a common data collecting unit (sink node). The study problem is, in the sense of graph theory, to find a shortest path tree spanning a weighted graph. This is a well-studied problem where for example Dijkstra’s algorithm provides a solution for non-negative edge weights. The present contribution shows by simulation examples that simple modifications of known distributed approaches can here provide significant improvements in performance. Phase transition phenomena, which are known to take place in networks close to percolation thresholds, may explain these observations. An initial method, which here serves as reference, assumes the sink node starts organisation of the network (tree) by transmitting a control message advertising its availability to its neighbours. These neighbours then advertise their current cost estimate for routing a message to the sink. A node which in this way receives a message implying an improved route to the sink advertises its new finding and remembers which neighbouring node the message came from. This activity proceeds until there are no more improvements to advertise to neighbours. The result is a tree network for cost-effective transmission of messages to the sink (root). This distributed approach has potential for simple improvements which are of interest when minimisation of storage and communication of network information is a concern. Fast organisation of the network takes place when the number k of connections for each node (its degree) is close above the critical value for global network percolation and, at the same time, there is a threshold for the nodes to decide to advertise network route updates.
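
    The sink-initiated advertisement scheme used as the reference method in this record can be sketched as follows; a centralised event queue stands in for the truly distributed message passing, and the node names and edge weights are illustrative assumptions.

        import heapq

        def build_sink_tree(neighbours, weights, sink):
            """Nodes repeatedly advertise improved cost estimates towards the
            sink until no further improvement occurs; each node remembers the
            neighbour its best advertisement came from (its parent in the tree)."""
            cost = {n: float("inf") for n in neighbours}
            parent = {n: None for n in neighbours}
            cost[sink] = 0.0
            queue = [(0.0, sink)]                 # pending advertisements
            while queue:
                c, node = heapq.heappop(queue)
                if c > cost[node]:
                    continue                      # stale advertisement
                for nb in neighbours[node]:
                    new_cost = c + weights[(node, nb)]
                    if new_cost < cost[nb]:       # improved route found
                        cost[nb] = new_cost
                        parent[nb] = node
                        heapq.heappush(queue, (new_cost, nb))
            return parent, cost

        # Tiny illustrative network
        neighbours = {"sink": ["a", "b"], "a": ["sink", "b", "c"],
                      "b": ["sink", "a", "c"], "c": ["a", "b"]}
        weights = {("sink", "a"): 1, ("a", "sink"): 1, ("sink", "b"): 4, ("b", "sink"): 4,
                   ("a", "b"): 2, ("b", "a"): 2, ("a", "c"): 5, ("c", "a"): 5,
                   ("b", "c"): 1, ("c", "b"): 1}
        tree, cost = build_sink_tree(neighbours, weights, "sink")
        print(tree)   # parent of each node in the shortest path tree

    Run centrally this is simply Dijkstra's algorithm; the observation in the record is that the distributed variant organises the network fastest when the node degree is just above the percolation threshold.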

  6. Summarising climate and air quality (ozone) data on self-organising maps: a Sydney case study.

    PubMed

    Jiang, Ningbo; Betts, Alan; Riley, Matt

    2016-02-01

    This paper explores the classification and visualisation utility of the self-organising map (SOM) method in the context of New South Wales (NSW), Australia, using gridded NCEP/NCAR geopotential height reanalysis for east Australia, together with multi-site meteorological and air quality data for Sydney from the NSW Office of Environment and Heritage Air Quality Monitoring Network. A twice-daily synoptic classification has been derived for east Australia for the period of 1958-2012. The classification has not only reproduced the typical synoptic patterns previously identified in the literature but also provided an opportunity to visualise the subtle, non-linear change in the eastward-migrating synoptic systems influencing NSW (including Sydney). The summarisation of long-term, multi-site air quality/meteorological data from the Sydney basin on the SOM plane has identified a set of typical air pollution/meteorological spatial patterns in the region. Importantly, the examination of these patterns in relation to synoptic weather types has provided important visual insights into how local and synoptic meteorological conditions interact with each other and affect the variability of air quality in tandem. The study illustrates that while synoptic circulation types are influential, the within-type variability in mesoscale flows plays a critical role in determining local ozone levels in Sydney. These results indicate that the SOM can be a useful tool for assessing the impact of weather and climatic conditions on air quality in the regional airshed. This study further promotes the use of the SOM method in environmental research. PMID:26787272

  7. A novel EPON architecture for supporting direct communication between ONUs

    NASA Astrophysics Data System (ADS)

    Wang, Liqian; Chen, Xue; Wang, Zhen

    2008-11-01

    In the traditional EPON network, the optical signal from one ONU cannot reach other ONUs, so ONUs cannot transmit packets to each other directly. The packets must be relayed by the OLT, which consumes both upstream and downstream bandwidth. Bandwidth utilization is therefore low, and becomes lower as the amount of inter-ONU traffic grows. When the EPON network carries P2P (Peer-to-Peer) and VPN applications, there is a large volume of traffic between ONUs and the traditional EPON suffers from low bandwidth utilization; in the worst case the bandwidth utilization of a traditional EPON is only 50 percent. This paper proposes a novel EPON architecture and a novel medium access control protocol to realize direct packet transmission between ONUs. In the proposed EPON we adopt a novel circled architecture in the splitter. Owing to the circled splitter, optical signals from an ONU can reach the other ONUs and packets can be transmitted directly between two ONUs. The traffic between two ONUs then consumes only upstream bandwidth, and the bandwidth cost is reduced by 50 percent. Moreover, this kind of direct transmission reduces packet latency.
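
    The 50 percent figure quoted here follows from simple accounting: relaying an inter-ONU packet through the OLT occupies both the upstream and the downstream channel, whereas direct transmission occupies only the upstream channel. A toy illustration (the function and numbers are assumptions for illustration only):

        def bytes_moved(inter_onu_bytes, via_olt=True):
            """Bytes carried on the PON for a given inter-ONU payload:
            via the OLT the payload crosses upstream and then downstream,
            with direct ONU-to-ONU transmission it crosses upstream only."""
            upstream = inter_onu_bytes
            downstream = inter_onu_bytes if via_olt else 0
            return upstream + downstream

        # For purely inter-ONU traffic the OLT relay moves twice the bytes
        # for the same payload, i.e. 50 percent effective utilisation.
        print(bytes_moved(100, via_olt=True), bytes_moved(100, via_olt=False))   # 200 100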

  8. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1988-01-01

    The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept to structure software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.

  9. Scaling Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin

    2016-01-01

    For long-duration space missions outside of Earth orbit, reliability considerations will drive higher levels of redundancy and/or on-board spares for life support equipment. Component scaling will be a critical element in minimizing overall launch mass while maintaining an acceptable level of system reliability. Building on an earlier reliability study (AIAA 2012-3491), this paper considers the impact of alternative scaling approaches, including the design of technology assemblies and their individual components to maximum, nominal, survival, or other fractional requirements. The optimal level of life support system closure is evaluated for deep-space missions of varying duration using equivalent system mass (ESM) as the comparative basis. Reliability impacts are included in ESM by estimating the number of component spares required to meet a target system reliability. Common cause failures are included in the analysis. ISS and ISS-derived life support technologies are considered along with selected alternatives. This study focuses on minimizing launch mass, which may be enabling for deep-space missions.
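
    The spares-versus-reliability trade described here can be sketched with a simplified model: if component failures are taken to follow a Poisson process, the number of on-board spares needed to meet a target reliability is the smallest count whose cumulative Poisson probability reaches the target. The failure rate, mission length and target below are illustrative assumptions, and the study's actual reliability model (including common cause failures) is more detailed.

        from math import exp, factorial

        def spares_needed(failure_rate_per_hour, mission_hours, target_reliability):
            """Smallest number of spares k such that P(failures <= k) >= target,
            assuming Poisson-distributed failures (simplified sketch)."""
            mean_failures = failure_rate_per_hour * mission_hours
            k, prob = 0, exp(-mean_failures)
            while prob < target_reliability:
                k += 1
                prob += exp(-mean_failures) * mean_failures ** k / factorial(k)
            return k

        # Illustrative numbers only: one failure per 10,000 h, 3-year mission
        print(spares_needed(1e-4, 26280, 0.99))   # number of spares to carry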

  10. A network architecture supporting consistent rich behavior in collaborative interactive applications.

    PubMed

    Marsh, James; Glencross, Mashhuda; Pettifer, Steve; Hubbold, Roger

    2006-01-01

    Network architectures for collaborative virtual reality have traditionally been dominated by client-server and peer-to-peer approaches, with peer-to-peer strategies typically being favored where minimizing latency is a priority, and client-server where consistency is key. With increasingly sophisticated behavior models and the demand for better support for haptics, we argue that neither approach provides sufficient support for these scenarios and, thus, a hybrid architecture is required. We discuss the relative performance of different distribution strategies in the face of real network conditions and illustrate the problems they face. Finally, we present an architecture that successfully meets many of these challenges and demonstrate its use in a distributed virtual prototyping application which supports simultaneous collaboration for assembly, maintenance, and training applications utilizing haptics. PMID:16640254

  11. Applications of Self-Organising Map (SOM) for prioritisation of endemic zones of filariasis in Andhra Pradesh, India.

    PubMed

    Murty, Upadhayula Suryanaryana; Rao, Mutheneni Srinivasa; Sriram, K; Rao, K Madhusudhan

    2011-01-01

    Entomological and epidemiological data on Lymphatic Filariasis (LF) were collected from 120 villages in four districts of Andhra Pradesh, India. Self-Organising Maps (SOMs), a data-mining technique, were used to classify and prioritise the endemic zones of filariasis. The results show that SOMs classified all the villages into three major clusters based on the Microfilaria (MF) rate, infection rate, infectivity rate and Per Man Hour (PMH) data. From the cluster patterns, appropriate decisions can be drawn for each parameter that is responsible for the transmission of filariasis. Hence, SOM will certainly be a suitable tool for the management of filariasis. The detailed application of the SOM is discussed in this paper.

  12. A cloud based architecture to support Electronic Health Record.

    PubMed

    Zangara, Gianluca; Corso, Pietro Paolo; Cangemi, Francesco; Millonzi, Filippo; Collova, Francesco; Scarlatella, Antonio

    2014-01-01

    We introduce a novel framework of electronic healthcare enabled by a Cloud platform able to host both Hospital Information Systems (HIS) and Electronic Medical Record (EMR) systems and to implement an innovative model of Electronic Health Record (EHR) that is not only patient-oriented but also supports better governance of the whole healthcare system. The proposed EHR model adopts state-of-the-art Cloud technologies and is able to join the different clinical data of the patient stored within the HISs and EMRs, whether placed in a local Data Center or hosted on a Cloud platform, enabling new directions of data analysis. PMID:25488244

  13. Innovative use of self-organising maps (SOMs) in model validation.

    NASA Astrophysics Data System (ADS)

    Jolly, Ben; McDonald, Adrian; Coggins, Jack

    2016-04-01

    We present an innovative combination of techniques for validation of numerical weather prediction (NWP) output against both observations and reanalyses using two classification schemes, demonstrated by a validation of the operational NWP 'AMPS' (the Antarctic Mesoscale Prediction System). Historically, model validation techniques have centred on case studies or statistics at various time scales (yearly/seasonal/monthly). Within the past decade the latter technique has been expanded by the addition of classification schemes in place of time scales, allowing more precise analysis. Classifications are typically generated for either the model or the observations, then used to create composites for both which are compared. Our method creates and trains a single self-organising map (SOM) on both the model output and observations, which is then used to classify both datasets using the same class definitions. In addition to the standard statistics on class composites, we compare the classifications themselves between the model and the observations. To add further context to the area studied, we use the same techniques to compare the SOM classifications with regimes developed for another study to great effect. The AMPS validation study compares model output against surface observations from SNOWWEB and existing University of Wisconsin-Madison Antarctic Automatic Weather Stations (AWS) during two months over the austral summer of 2014-15. Twelve SOM classes were defined in a '4 x 3' pattern, trained on both model output and observations of 2 m wind components, then used to classify both training datasets. Simple statistics (correlation, bias and normalised root-mean-square-difference) computed for SOM class composites showed that AMPS performed well during extreme weather events, but less well during lighter winds and poorly during the more changeable conditions between either extreme. Comparison of the classification time-series showed that, while correlations were lower
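
    A sketch of the per-class composite statistics mentioned in this record (correlation, bias and normalised root-mean-square difference between model and observations within each SOM class) is given below; the data, the number of classes and the variable are illustrative assumptions.

        import numpy as np

        def class_statistics(model, obs, classes):
            """Correlation, bias (model - obs) and RMSD normalised by the
            standard deviation of the observations, computed separately for
            each SOM class."""
            stats = {}
            for c in np.unique(classes):
                m, o = model[classes == c], obs[classes == c]
                corr = np.corrcoef(m, o)[0, 1]
                bias = np.mean(m - o)
                nrmsd = np.sqrt(np.mean((m - o) ** 2)) / np.std(o)
                stats[c] = (corr, bias, nrmsd)
            return stats

        # Illustrative data: 1000 matched time steps, 12 SOM classes
        rng = np.random.default_rng(0)
        obs = rng.normal(size=1000)                       # e.g. observed 2 m wind component
        model = obs + rng.normal(scale=0.3, size=1000)    # model with random error
        classes = rng.integers(0, 12, size=1000)
        print(class_statistics(model, obs, classes)[0])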

  14. Knowledge Network Architecture in Support of International Science

    NASA Astrophysics Data System (ADS)

    Hugo, Wim

    2015-04-01

    ICSU (The International Council for Science) created the World Data System (WDS) as an interdisciplinary body at its General Assembly in Maputo in 2008, and since then the membership of the WDS has grown to include 86 members, of whom 56 are institutions or data centres focused on providing quality-assured data and services to the scientific community. In addition to its objective of providing universal and equitable access to such data and services, WDS is also active in promoting stewardship, standards and conventions, and improved access to products derived from data and services. Whereas WDS is in process of aggregating and harmonizing the meta-data collections of its membership, it is clear that additional benefits can be obtained by supplementing such traditional meta-data sources with information about members, authors, and the coverages of the data, as well as metrics such as citation indices, quality indicators, and usability. Moreover, the relationships between the actors and systems that populate this meta-data landscape can be seen as a knowledge network that describes a sub-set of global scientific endeavor. Such a knowledge network is useful in many ways, supporting both machine-based and human requests for contextual information related to a specific data set, institution, author, topic, or other entities in the network. Specific use cases that can be realised include decision and policy support for funding agencies, identification of collaborators, ranking of data sources, availability of data for specific coverages, and many more. The paper defines the scope of and conceptual background to such a knowledge network, discusses some initial work done by WDS to establish the network, and proposes an implementation model for rapid operationalisation. In this model, established interests such as DataCITE, ORCID, and CrossRef have well-defined roles, and the standards, services, and registries required to build a community-maintained, scalable knowledge

  15. WDS Knowledge Network Architecture in Support of International Science

    NASA Astrophysics Data System (ADS)

    Mokrane, M.; Minster, J. B. H.; Hugo, W.

    2014-12-01

    ICSU (International Council for Science) created the World Data System (WDS) as an interdisciplinary body at its General Assembly in Maputo in 2008, and since then the membership of the WDS has grown to include 86 members, of whom 56 are institutions or data centers focused on providing quality-assured data and services to the scientific community, and 10 more are entire networks of such data facilities and services. In addition to its objective of providing universal and equitable access to scientific data and services, WDS is also active in promoting stewardship, standards and conventions, and improved access to products derived from data and services. Whereas WDS is in process of aggregating and harmonizing the metadata collections of its membership, it is clear that additional benefits can be obtained by supplementing such traditional metadata sources with information about members, authors, and the coverages of the data, as well as metrics such as citation indices, quality indicators, and usability. Moreover, the relationships between the actors and systems that populate this metadata landscape can be seen as a knowledge network that describes a subset of global scientific endeavor. Such a knowledge network is useful in many ways, supporting both machine-based and human requests for contextual information related to a specific data set, institution, author, topic, or other entities in the network. Specific use cases that can be realized include decision and policy support for funding agencies, identification of collaborators, ranking of data sources, availability of data for specific coverages, and many more. The paper defines the scope of and conceptual background to such a knowledge network, discusses some initial work done by WDS to establish the network, and proposes an implementation model for rapid operationalization. In this model, established interests such as DataCite, ORCID, and CrossRef have well-defined roles, and the standards, services, and

  16. An integrative architecture for a sensor-supported trust management system.

    PubMed

    Trček, Denis

    2012-01-01

    Trust plays a key role not only in e-worlds and emerging pervasive computing environments, but has done so for millennia in human societies. Trust management solutions, which have been around for some fifteen years, were primarily developed for the above-mentioned cyber environments and are typically focused on artificial agents, sensors, etc. However, this paper presents extensions of a new methodology, together with an architecture, for trust management support that is focused on humans and human-like agents. In this methodology and architecture, sensors play a crucial role. The architecture presents an already deployable tool for multi- and interdisciplinary research in various areas where humans are involved. It provides new ways to obtain insight into the dynamics and evolution of such structures, not only in pervasive computing environments, but also in other important areas like management and decision-making support. PMID:23112628

  17. FY04 Advanced Life Support Architecture and Technology Studies: Mid-Year Presentation

    NASA Technical Reports Server (NTRS)

    Lange, Kevin; Anderson, Molly; Duffield, Bruce; Hanford, Tony; Jeng, Frank

    2004-01-01

    Long-Term Objective: Identify optimal advanced life support system designs that meet existing and projected requirements for future human spaceflight missions. a) Include failure-tolerance, reliability, and safe-haven requirements. b) Compare designs based on multiple criteria including equivalent system mass (ESM), technology readiness level (TRL), simplicity, commonality, etc. c) Develop and evaluate new, more optimal, architecture concepts and technology applications.

  18. Architecture, Design, and Development of an HTML/JavaScript Web-Based Group Support System.

    ERIC Educational Resources Information Center

    Romano, Nicholas C., Jr.; Nunamaker, Jay F., Jr.; Briggs, Robert O.; Vogel, Douglas R.

    1998-01-01

    Examines the need for virtual workspaces and describes the architecture, design, and development of GroupSystems for the World Wide Web (GSWeb), an HTML/JavaScript Web-based Group Support System (GSS). GSWeb, an application interface similar to a Graphical User Interface (GUI), is currently used by teams around the world and relies on user…

  19. The middleware architecture supports heterogeneous network systems for module-based personal robot system

    NASA Astrophysics Data System (ADS)

    Choo, Seongho; Li, Vitaly; Choi, Dong Hee; Jung, Gi Deck; Park, Hong Seong; Ryuh, Youngsun

    2005-12-01

    In the personal robot system currently under development, the internal architecture consists of modules, each responsible for a separate function, connected through a heterogeneous network system. This module-based architecture supports specialization and division of labor in both design and implementation and, as a result, can reduce module development time and cost. Furthermore, because every module is connected to the other modules through network systems, modules are easy to integrate and can cooperate to provide advanced combined functions. In this architecture, one of the most important technologies is the network middleware that handles communication among the modules connected through heterogeneous network systems. The network middleware acts like the nervous system of the personal robot: it relays, transmits, and translates information appropriately between modules, which are analogous to human organs. The network middleware supports various hardware platforms and heterogeneous network systems (Ethernet, Wireless LAN, USB, IEEE 1394, CAN, CDMA-SMS, RS-232C). This paper discusses mechanisms in our network middleware for intercommunication and routing among modules, and methods for real-time data communication and fault-tolerant network service. For these goals we have designed and implemented a layered network middleware scheme, distributed routing management, and network monitoring/notification technology on heterogeneous networks. The main theme is how routing information is built in our network middleware; with this routing information table, we have added further features. We are now designing and implementing a new version of the network middleware (which we call 'OO M/W') that supports object-oriented operation, and are updating the program sources for an object-oriented architecture. It is lighter and faster, and can support more operating systems and heterogeneous network systems, but other general
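
    As a rough illustration of the routing information discussed in this record, the sketch below keeps, for each destination module, the lowest-cost next hop and the network interface to use; the class layout, module names and costs are assumptions for illustration, not the middleware's actual design.

        class RoutingTable:
            """Per-destination routing entries: (next_hop, interface, cost)."""
            def __init__(self):
                self.routes = {}

            def update(self, destination, next_hop, interface, cost):
                """Keep only the lowest-cost route advertised for a destination;
                return True when the table changed (i.e. neighbours should be notified)."""
                current = self.routes.get(destination)
                if current is None or cost < current[2]:
                    self.routes[destination] = (next_hop, interface, cost)
                    return True
                return False

            def next_hop(self, destination):
                next_node, interface, _ = self.routes[destination]
                return next_node, interface

        table = RoutingTable()
        table.update("vision_module", "gateway_A", "Ethernet", cost=2)
        table.update("vision_module", "gateway_B", "IEEE1394", cost=1)
        print(table.next_hop("vision_module"))   # ('gateway_B', 'IEEE1394')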

  20. A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises

    NASA Astrophysics Data System (ADS)

    Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.

    2012-04-01

    The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer
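
    The publish/detect/consume mechanism described in this record can be illustrated with a minimal in-process event bus; the topic names and payload are assumptions for illustration, and a real TEWS would use a distributed messaging infrastructure rather than a single Python object.

        from collections import defaultdict

        class EventBus:
            """Minimal publish/subscribe bus: components publish events by topic
            and any number of consumers subscribe to those topics."""
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, event):
                for handler in self._subscribers[topic]:
                    handler(event)

        bus = EventBus()
        bus.subscribe("sensor.tide_gauge", lambda e: print("decision support received:", e))
        bus.publish("sensor.tide_gauge", {"station": "buoy_17", "anomaly_cm": 42})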

  1. The Setting is the Service: How the Architecture of Sober Living Residences Supports Community Based Recovery

    PubMed Central

    Wittman, Fried; Jee, Babette; Polcin, Douglas L.; Henderson, Diane

    2014-01-01

    The architecture of residential recovery settings is an important silent partner in the alcohol/drug recovery field. The settings significantly support or hinder recovery experiences of residents, and shape community reactions to the presence of sober living houses (SLH) in ordinary neighborhoods. Grounded in the principles of Alcoholics Anonymous, the SLH provides residents with settings designed to support peer based recovery; further, these settings operate in a community context that insists on sobriety and strongly encourages attendance at 12-step meetings. Little formal research has been conducted to show how architectural features of the recovery setting – building appearance, spatial layouts, furnishings and finishes, policies for use of the facilities, physical care and maintenance of the property, neighborhood features, aspects of location in the city – function to promote (or retard) recovery, and to build (or detract from) community support. This paper uses a case-study approach to analyze the architecture of a community-based residential recovery service that has demonstrated successful recovery outcomes for its residents, is popular in its community, and has achieved state-wide recognition. The Environmental Pattern Language (Alexander, Ishikawa, & Silverstein, 1977) is used to analyze its architecture in a format that can be tested, critiqued, and adapted for use by similar programs in many communities, providing a model for replication and further research. PMID:25328377

  2. Exploring Hardware Support For Scaling Irregular Applications on Multi-node Multi-core Architectures

    SciTech Connect

    Secchi, Simone; Ceriani, Marco; Tumeo, Antonino; Villa, Oreste; Palermo, Gianluca; Raffo, Luigi

    2013-06-05

    With the recent emergence of large-scale knowledge discovery, data mining and social network analysis, irregular applications have gained renewed interest. Classic cache-based high-performance architectures do not provide optimal performance with such workloads, mainly due to the very low spatial and temporal locality of the irregular control and memory access patterns. In this paper, we present a multi-node, multi-core, fine-grained multi-threaded shared-memory system architecture specifically designed for the execution of large-scale irregular applications, and built on top of three pillars that we believe are fundamental to support these workloads. First, we offer transparent hardware support for Partitioned Global Address Space (PGAS) to provide a large globally-shared address space with no software library overhead. Second, we employ multi-threaded multi-core processing nodes to achieve the necessary latency tolerance required by accessing global memory, which potentially resides in a remote node. Finally, we devise hardware support for inter-thread synchronization on the whole global address space. We first model the performance by using an analytical model that takes into account the main architecture and application characteristics. We describe the hardware design of the proposed custom architectural building blocks that provide support for the above-mentioned three pillars. Finally, we present a limited-scale evaluation of the system on a multi-board FPGA prototype with typical irregular kernels and benchmarks. The experimental evaluation demonstrates the architecture performance scalability for different configurations of the whole system.

  3. Service oriented architecture for clinical decision support: a systematic review and future directions.

    PubMed

    Loya, Salvador Rodriguez; Kawamoto, Kensaku; Chatwin, Chris; Huser, Vojtech

    2014-12-01

    The use of a service-oriented architecture (SOA) has been identified as a promising approach for improving health care by facilitating reliable clinical decision support (CDS). A review of the literature through October 2013 identified 44 articles on this topic. The review suggests that SOA-related technologies such as Business Process Model and Notation (BPMN) and Service Component Architecture (SCA) have not been generally adopted to impact health IT systems' performance for better care solutions. Additionally, technologies such as Enterprise Service Bus (ESB) and architectural approaches like Service Choreography have not been generally exploited among researchers and developers. Based on the experience of other industries and our observation of the evolution of SOA, we found that greater use of these approaches has the potential to significantly impact SOA implementations for CDS.

  4. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    PubMed Central

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-01-01

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of using jointly QoS and QoC parameters in distributed control systems. PMID:25723145

  5. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    PubMed

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of using jointly QoS and QoC parameters in distributed control systems.
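
    The Braitenberg vehicles used above as a validation benchmark are simple enough to sketch in a few lines; the wiring of vehicles 2a and 2b below follows Braitenberg's classic description, while the gain and sensor values are illustrative assumptions (the FSACtrl implementation itself is not reproduced here).

        def braitenberg_step(left_sensor, right_sensor, kind="2b", gain=1.0):
            """One control step: map two light-sensor readings to (left, right)
            wheel speeds. Vehicle 2a wires each sensor to the motor on the same
            side (it turns away from the stimulus); vehicle 2b crosses the wiring
            (it turns towards the stimulus)."""
            if kind == "2a":
                return gain * left_sensor, gain * right_sensor
            if kind == "2b":
                return gain * right_sensor, gain * left_sensor
            raise ValueError(f"unknown vehicle kind: {kind}")

        # Stimulus stronger on the right: a 2b vehicle drives its left wheel faster
        print(braitenberg_step(0.2, 0.9, kind="2b"))   # (0.9, 0.2) -> turns towards the stimulus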

  6. Multitier Portal Architecture for Thin- and Thick-client Neutron Scattering Experiment Support

    SciTech Connect

    Green, Mark L; Miller, Stephen D

    2007-01-01

    Integration of emerging technologies and design patterns into the three-tier client-server architecture is required in order to provide a scalable and flexible architecture for novice to sophisticated portal user groups. The ability to provide user customizable portal interfaces is rapidly becoming commonplace and is driving the expectations of researchers and scientists in the scientific community. This paper describes an architectural design that maximizes information technology service reuse while providing a customizable user interface that scales with user sophistication and requirements. The Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory provides a state-of-the-art facility ideal for implementation of this infrastructure. The SNS Java-based Science Portal (Tier I) and Open Grid Computing Environment (Tier II) provide thin-client support whereas the GumTree Eclipse Rich Client Platform (Tier III) and Eclipse Integrated Development Environment (Tier IV) provide thick-client support within a multitier portal architecture. Each tier incorporates all of the features of the previous tiers while adding new capabilities based on the user requirements.

  7. Lunar Outpost Life Support Architecture Study Based on a High-Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2010-01-01

    This paper presents results of a life support architecture study based on a 2009 NASA lunar surface exploration scenario known as Scenario 12. The study focuses on the assembly complete outpost configuration and includes pressurized rovers as part of a distributed outpost architecture in both stand-alone and integrated configurations. A range of life support architectures are examined reflecting different levels of closure and distributed functionality. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual Lander oxygen and hydrogen propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Surpluses or deficits of water and oxygen are reported for each architecture, along with fixed and 10-year total equivalent system mass estimates relative to a reference case. System robustness is discussed in terms of the probability of no water or oxygen resupply as determined from the Monte Carlo simulations.
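
    The Monte Carlo sensitivity analysis described here can be sketched as follows; the sampled ranges, the mass-balance relations and every number below are illustrative assumptions rather than values from the study, and serve only to show how a probability of a water deficit could be estimated from uncertain mission variables.

        import numpy as np

        def water_deficit_probability(n_trials=10000, seed=0):
            """Toy Monte Carlo: sample uncertain mission variables, form a crude
            mission water balance and report the fraction of trials in deficit."""
            rng = np.random.default_rng(seed)
            scavenged = rng.uniform(0, 300, n_trials)            # kg water from residual propellants
            away_fraction = rng.uniform(0.3, 0.7, n_trials)      # crew time away on excursions
            eva_hours = rng.uniform(500, 1500, n_trials)
            habitat_leak = rng.uniform(0, 50, n_trials)          # kg-equivalent losses
            recovered = 1200 * (1 - 0.2 * away_fraction)         # less recovery while away
            consumed = 1400 + 0.1 * eva_hours + habitat_leak     # EVA and leakage add demand
            balance = scavenged + recovered - consumed
            return float(np.mean(balance < 0))

        print(f"P(water deficit) ~ {water_deficit_probability():.2f}")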

  8. Chlorinated solvents in a petrochemical wastewater treatment plant: an assessment of their removal using self-organising maps.

    PubMed

    Tobiszewski, Marek; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek

    2012-05-01

    The self-organising map approach was used to assess the efficiency of chlorinated solvent removal from petrochemical wastewater in a refinery wastewater treatment plant. Chlorinated solvents and inorganic anions (11 variables) were determined in 72 wastewater samples, collected from three different purification streams. The classification of variables identified technical solvents, brine from oil desalting and runoff sulphates as pollution sources in the refinery, affecting the quality of wastewater treatment plant influent. The classification of samples revealed the formation of five clusters: the first three clusters contained samples collected from the drainage water, process water and oiled rainwater treatment streams. The fourth cluster consisted mainly of samples collected after biological treatment, and the fifth one of samples collected after an unusual event. SOM analysis showed that the biological treatment step significantly reduced concentrations of chlorinated solvents in wastewater. PMID:22356856

  9. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record driving data from a human pilot, consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode. PMID:11681754

  10. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record driving data from a human pilot, consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.
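
    As a toy illustration of the fuzzy rule-based control idea underlying this record, the sketch below hand-writes two rules relating distance to a bend to a target speed and blends them with a weighted average; in the actual work the rule base is extracted automatically from driving data by the POPFNN-CRI(S) network, and all membership functions and numbers here are assumptions.

        def near_membership(d):
            """Left-shoulder membership: fully 'near' up to 20 m, fading to 0 at 80 m."""
            if d <= 20:
                return 1.0
            if d >= 80:
                return 0.0
            return (80 - d) / (80 - 20)

        def far_membership(d):
            """Right-shoulder membership: 0 below 40 m, fully 'far' beyond 120 m."""
            if d <= 40:
                return 0.0
            if d >= 120:
                return 1.0
            return (d - 40) / (120 - 40)

        def fuzzy_target_speed(distance_to_bend_m):
            """Rule 1: IF the bend is near THEN speed is low (30 km/h).
               Rule 2: IF the bend is far  THEN speed is high (100 km/h).
               Weighted-average (Sugeno-style) defuzzification."""
            near = near_membership(distance_to_bend_m)
            far = far_membership(distance_to_bend_m)
            return (near * 30.0 + far * 100.0) / (near + far)

        print(fuzzy_target_speed(60))   # blends the two rules, 60.0 km/h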

  11. Hierarchical state space partitioning with a network self-organising map for the recognition of ST-T segment changes.

    PubMed

    Bezerianos, A; Vladutu, L; Papadimitriou, S

    2000-07-01

    The problem of maximising the performance of ST-T segment automatic recognition for ischaemia detection is a difficult pattern classification problem. The paper proposes the network self-organising map (NetSOM) model as an enhancement to the Kohonen self-organised map (SOM) model. This model is capable of effectively decomposing complex large-scale pattern classification problems into a number of partitions, each of which is more manageable with a local classification device. The NetSOM attempts to generalize the regularization and ordering potential of the basic SOM from the space of vectors to the space of approximating functions. It becomes a device for the ordering of local experts (i.e. independent neural networks) over its lattice of neurons and for their selection and co-ordination. Each local expert is an independent neural network that is trained and activated under the control of the NetSOM. This method is evaluated with examples from the European ST-T database. The first results obtained after the application of NetSOM to ST-T segment change recognition show a significant improvement in the performance compared with that obtained with monolithic approaches, i.e. with single network types. The basic SOM model has attained an average ischaemic beat sensitivity of 73.6% and an average ischaemic beat predictivity of 68.3%. The work reports and discusses the improvements that have been obtained from the implementation of a NetSOM classification system with both multilayer perceptrons and radial basis function (RBF) networks as local experts for the ST-T segment change problem. Specifically, the NetSOM with multilayer perceptrons (radial basis functions) as local experts has improved the results over the basic SOM to an average ischaemic beat sensitivity of 75.9% (77.7%) and an average ischaemic beat predictivity of 72.5% (74.1%).
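
    The two figures of merit quoted throughout this record can be computed directly from beat counts; the sketch below uses counts chosen only to reproduce the quoted 75.9%/72.5% pair and is not data from the European ST-T database.

        def sensitivity_predictivity(true_pos, false_neg, false_pos):
            """Ischaemic beat sensitivity = TP / (TP + FN);
            ischaemic beat predictivity (positive predictive value) = TP / (TP + FP)."""
            sensitivity = true_pos / (true_pos + false_neg)
            predictivity = true_pos / (true_pos + false_pos)
            return sensitivity, predictivity

        # Illustrative counts consistent with the multilayer-perceptron NetSOM figures
        print(sensitivity_predictivity(true_pos=759, false_neg=241, false_pos=288))
        # -> (0.759, ~0.725)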

  12. Earth Orbiting Support Systems for commercial low Earth orbit data relay: Assessing architectures through tradespace exploration

    NASA Astrophysics Data System (ADS)

    Palermo, Gianluca; Golkar, Alessandro; Gaudenzi, Paolo

    2015-06-01

    As small satellites and Sun Synchronous Earth Observation systems are assuming an increased role in present-day space activities, including commercial investments, it is of interest to assess how infrastructures could be developed to support the development of such systems and other spacecraft that could benefit from having a data relay service in Low Earth Orbit (LEO), as opposed to traditional Geostationary relays. This paper presents a tradespace exploration study of the architecture of such LEO commercial satellite data relay systems, here defined as Earth Orbiting Support Systems (EOSS). The paper proposes a methodology to formulate architectural decisions for EOSS constellations, and enumerate the corresponding tradespace of feasible architectures. Evaluation metrics are proposed to measure benefits and costs of architectures; lastly, a multicriteria Pareto criterion is used to downselect optimal architectures for subsequent analysis. The methodology is applied to two case studies for sets of 30 and 100 customer spacecraft, respectively, representing potential markets for LEO services in Exploration, Earth Observation, Science, and CubeSats. Pareto analysis shows how increased performance of the constellation is always achieved by an increased node size, as measured by the gain of the communications antenna mounted on EOSS spacecraft. On the other hand, nonlinear trends in optimal orbital altitude, number of satellites per plane, and number of orbital planes, are found in both cases. An upward trend in individual node memory capacity is found, although never exceeding 256 Gbits of onboard memory for both cases that have been considered, assuming the availability of a polar ground station for EOSS data downlink. System architects can use the proposed methodology to identify optimal EOSS constellations for a given service pricing strategy and customer target, thus identifying alternatives for selection by decision makers.
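
    The Pareto downselection step mentioned here can be sketched with a simple dominance filter over (cost, benefit) pairs; the architecture names and values are illustrative assumptions, and the study's actual metrics (antenna gain, memory capacity, orbital parameters, and so on) are richer than this two-axis toy.

        def pareto_front(architectures):
            """Keep an architecture unless another is at least as good on both
            axes (lower cost, higher benefit) and strictly better on one.
            architectures: list of (name, cost, benefit) tuples."""
            front = []
            for name, cost, benefit in architectures:
                dominated = any(
                    c <= cost and b >= benefit and (c < cost or b > benefit)
                    for _, c, b in architectures
                )
                if not dominated:
                    front.append(name)
            return front

        candidates = [("A", 10, 5), ("B", 12, 9), ("C", 15, 9), ("D", 8, 2)]
        print(pareto_front(candidates))   # ['A', 'B', 'D']; C is dominated by B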

  13. Lunar Outpost Life Support Architecture Study Based on a High Mobility Exploration Scenario

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2009-01-01

    As scenarios for lunar surface exploration and habitation continue to evolve within NASA's Constellation program, so must studies of optimal life support system architectures and technologies. This paper presents results of a life support architecture study based on a 2009 NASA scenario known as Scenario 12. Scenario 12 represents a consolidation of ideas from earlier NASA scenarios and includes an outpost near the Lunar South Pole comprised of three larger fixed surface elements and four attached pressurized rovers. The scenario places a high emphasis on surface mobility, with planning assuming that all four crewmembers spend roughly 50% of the time away from the outpost on 3-14 day excursions in two of the pressurized rovers. Some of the larger elements can also be mobilized for longer duration excursions. This emphasis on mobility poses a significant challenge for a regenerative life support system in terms of cost-effective waste collection and resource recovery across multiple elements, including rovers with very constrained infrastructure resources. The current study considers pressurized rovers as part of a distributed outpost life support architecture in both stand-alone and integrated configurations. A range of architectures are examined reflecting different levels of closure and distributed functionality. Different lander propellant scavenging options are also considered involving either initial conversion of residual oxygen and hydrogen propellants to water or initial direct oxygen scavenging. Monte Carlo simulations are used to assess the sensitivity of results to volatile high-impact mission variables, including the quantity of residual lander propellants available for scavenging, the fraction of crew time away from the outpost on excursions, total extravehicular activity hours, and habitat leakage. Architectures are evaluated by estimating surpluses or deficits of water and oxygen per 180-day mission and differences in fixed and 10-year

  14. An Architecture and Supporting Environment of Service-Oriented Computing Based-On Context Awareness

    NASA Astrophysics Data System (ADS)

    Ma, Tianxiao; Wu, Gang; Huang, Jun

    Service-oriented computing (SOC) is emerging as an important computing paradigm for the near future. Based on context awareness, this paper proposes an architecture for SOC. A definition of context in open environments such as the Internet is given, based on ontology. The paper also proposes a supporting environment for context-aware SOC, which focuses on on-demand service composition and the evolution of context awareness. A reference implementation of the supporting environment based on OSGi[11] is given at the end.

  15. Big Data Architectures for Operationalized Seismic and Subsurface Monitoring and Decision Support Workflows

    NASA Astrophysics Data System (ADS)

    Irving, D. H.; Rasheed, M.; Hillman, C.; O'Doherty, N.

    2012-12-01

    Oilfield management is moving to a more operational footing with near-realtime seismic and sensor monitoring governing drilling, fluid injection and hydrocarbon extraction workflows within safety, productivity and profitability constraints. To date, the geoscientific analytical architectures employed are configured for large volumes of data, computational power or analytical latency and compromises in system design must be made to achieve all three aspects. These challenges are encapsulated by the phrase 'Big Data' which has been employed for over a decade in the IT industry to describe the challenges presented by data sets that are too large, volatile and diverse for existing computational architectures and paradigms. We present a data-centric architecture developed to support a geoscientific and geotechnical workflow whereby:
    ● scientific insight is continuously applied to fresh data
    ● insights and derived information are incorporated into engineering and operational decisions
    ● data governance and provenance are routine within a broader data management framework
    Strategic decision support systems in large infrastructure projects such as oilfields are typically relational data environments; data modelling is pervasive across analytical functions. However, subsurface data and models are typically non-relational (i.e. file-based) in the form of large volumes of seismic imaging data or rapid streams of sensor feeds and are analysed and interpreted using niche applications. The key architectural challenge is to move data and insight from a non-relational to a relational, or structured, data environment for faster and more integrated analytics. We describe how a blend of MapReduce and relational database technologies can be applied in geoscientific decision support, and the strengths and weaknesses of each in such an analytical ecosystem. In addition we discuss hybrid technologies that use aspects of both and translational technologies for moving data and analytics
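
    The MapReduce side of the blend discussed in this record can be illustrated with a tiny in-process version of the pattern: map each record to key-value pairs, group by key, then reduce each group. The well names, readings and aggregation are assumptions for illustration, not part of the described architecture.

        from collections import defaultdict
        from functools import reduce

        def map_reduce(records, mapper, reducer):
            """Minimal MapReduce: mapper(record) yields (key, value) pairs,
            values are grouped by key and folded with reducer."""
            groups = defaultdict(list)
            for record in records:
                for key, value in mapper(record):
                    groups[key].append(value)
            return {key: reduce(reducer, values) for key, values in groups.items()}

        # Illustrative sensor stream: (well id, pressure reading)
        readings = [("well_A", 0.4), ("well_B", 1.2), ("well_A", 0.7)]
        peak_pressure = map_reduce(readings,
                                   mapper=lambda r: [(r[0], r[1])],
                                   reducer=max)
        print(peak_pressure)   # {'well_A': 0.7, 'well_B': 1.2}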

  16. Design and Parametric Sizing of Deep Space Habitats Supporting NASA'S Human Space Flight Architecture Team

    NASA Technical Reports Server (NTRS)

    Toups, Larry; Simon, Matthew; Smitherman, David; Spexarth, Gary

    2012-01-01

    NASA's Human Space Flight Architecture Team (HAT) is a multi-disciplinary, cross-agency study team that conducts strategic analysis of integrated development approaches for human and robotic space exploration architectures. During each analysis cycle, HAT iterates and refines the definition of design reference missions (DRMs), which inform the definition of a set of integrated capabilities required to explore multiple destinations. An important capability identified in this capability-driven approach is habitation, which is necessary for crewmembers to live and work effectively during long duration transits to and operations at exploration destinations beyond Low Earth Orbit (LEO). This capability is captured by an element referred to as the Deep Space Habitat (DSH), which provides all equipment and resources for the functions required to support crew safety, health, and work including: life support, food preparation, waste management, sleep quarters, and housekeeping. The purpose of this paper is to describe the design of the DSH capable of supporting crew during exploration missions. First, the paper describes the functionality required in a DSH to support the HAT defined exploration missions, the parameters affecting its design, and the assumptions used in the sizing of the habitat. Then, the process used for arriving at parametric sizing estimates to support additional HAT analyses is detailed. Finally, results from the HAT Cycle C DSH sizing are presented followed by a brief description of the remaining design trades and technological advancements necessary to enable the exploration habitation capability.

  17. Open multi-agent control architecture to support virtual-reality-based man-machine interfaces

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel

    2001-10-01

    Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task deduction component and automatic action planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems that are controlled by Virtual Reality based man-machine interfaces. The architecture does not just provide a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which facilitate the user's job of commanding and supervising a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate sensor information from sensors of different levels of abstraction in real time helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture will be described comprehensively, its main building blocks will be discussed and one realization built on an open-source real-time operating system will be presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications will be explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is only one example which will be described.

  18. Boeing Crew Exploration Vehicle Environmental Control and Life Support System Architecture Overview

    NASA Technical Reports Server (NTRS)

    Saiidi, Mo; Lewis, John F.

    2007-01-01

    Under a teaming agreement with the Northrop Grumman Systems Corporation and in compliance with the NASA Phase 1 contract, The Boeing Company was responsible for developing the CEV architecture of the Environmental Control and Life Support (ECLS) system. The ECLS system comprised the various subsystems that provide a shirt-sleeve habitable environment for the crew to live and work in the crew module of the CEV. This architecture met the NASA requirements to ferry cargo and crew to the ISS and to perform Lunar sortie missions, with extensibility to long-duration missions to the Moon and Mars. This paper provides a summary overview of the CEV ECLS subsystems proposed in compliance with the contract activities.

  19. Exploring Life Support Architectures for Evolution of Deep Space Human Exploration

    NASA Technical Reports Server (NTRS)

    Anderson, Molly; Stambaugh, Imelda

    2015-01-01

    Life support system architectures for long duration space missions are often explored analytically in the human spaceflight community to find optimum solutions for mass, performance, and reliability. But in reality, many other constraints can guide the design when the life support system is examined within the context of an overall vehicle, as well as specific programmatic goals and needs. Between the end of the Constellation program and the development of the "Evolvable Mars Campaign", NASA explored a broad range of mission possibilities. Most of these missions will never be implemented but the lessons learned during these concept development phases may color and guide future analytical studies and eventual life support system architectures. This paper discusses several iterations of design studies from the life support system perspective to examine which requirements and assumptions, programmatic needs, or interfaces drive design. When doing early concept studies, many assumptions have to be made about technology and operations. Data can be pulled from a variety of sources depending on the study needs, including parametric models, historical data, new technologies, and even predictive analysis. In the end, assumptions must be made in the face of uncertainty. Some of these may introduce more risk as to whether the solution for the conceptual design study will still work when designs mature and data becomes available.

  20. Unusual quasars from the Sloan Digital Sky Survey selected by means of Kohonen self-organising maps

    NASA Astrophysics Data System (ADS)

    Meusinger, H.; Schalldach, P.; Scholz, R.-D.; in der Au, A.; Newholm, M.; de Hoon, A.; Kaminsky, B.

    2012-05-01

    Context. Large spectroscopic surveys have discovered very peculiar and hitherto unknown types of active galactic nuclei (AGN). Such rare objects may hold clues to the accretion history of the supermassive black holes at the centres of galaxies. Aims: We aim to create a sizeable sample of unusual quasars from the unprecedented spectroscopic database of the Sloan Digital Sky Survey (SDSS). Methods: We exploit the spectral archive of the SDSS Data Release 7 to select unusual quasar spectra. The selection method is based on a combination of the power of self-organising maps and the visual inspection of a huge number of spectra. Self-organising maps were applied to nearly 10^5 spectra classified as quasars at redshifts from z = 0.6 to 4.3 by the SDSS pipeline. Particular attention was paid to minimise possible contamination by rare peculiar stellar spectral types. All selected quasar spectra were individually studied to determine the object type and the redshift. Results: We present a catalogue of 1005 quasars with unusual spectra. These spectra are dominated by either broad absorption lines (BALs; 42%), unusual red continua (27%), weak emission lines (18%), or conspicuously strong optical and/or UV iron emission (11%). This large sample provides a useful resource for both studying properties and relations of/between different types of unusual quasars and selecting particularly interesting objects, even though the compilation is not aimed at completeness in a quantifiable sense. The spectra are grouped into six types for which composite spectra are constructed and mean properties are computed. Remarkably, all these types turn out to be on average more luminous than comparison samples of normal quasars after a statistical correction is made for intrinsic reddening (E(B - V) ≈ 0 to 0.4 for SMC-like extinction). Both the unusual BAL quasars and the strong iron emitters have significantly lower radio luminosities than normal quasars. We also confirm that strong BALs avoid

  1. Clinical Decision Support for Whole Genome Sequence Information Leveraging a Service-Oriented Architecture: a Prototype

    PubMed Central

    Welch, Brandon M.; Rodriguez-Loya, Salvador; Eilbeck, Karen; Kawamoto, Kensaku

    2014-01-01

    Whole genome sequence (WGS) information could soon be routinely available to clinicians to support the personalized care of their patients. At such time, clinical decision support (CDS) integrated into the clinical workflow will likely be necessary to support genome-guided clinical care. Nevertheless, developing CDS capabilities for WGS information presents many unique challenges that need to be overcome for such approaches to be effective. In this manuscript, we describe the development of a prototype CDS system that is capable of providing genome-guided CDS at the point of care and within the clinical workflow. To demonstrate the functionality of this prototype, we implemented a clinical scenario of a hypothetical patient at high risk for Lynch Syndrome based on his genomic information. We demonstrate that this system can effectively use service-oriented architecture principles and standards-based components to deliver point of care CDS for WGS information in real-time. PMID:25954430
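
    A minimal sketch of the kind of genome-guided rule such a system evaluates; in a service-oriented setup, a function like this would sit behind a web-service endpoint called from the EHR at the point of care. The gene list is the standard set of Lynch-associated mismatch-repair genes, but the rule, field names and message are hypothetical and not the prototype's actual interface.

      # Hypothetical genome-guided CDS rule (illustrative only).
      LYNCH_GENES = {"MLH1", "MSH2", "MSH6", "PMS2"}   # mismatch-repair genes

      def evaluate_lynch_risk(patient_variants):
          """Return a CDS alert if a pathogenic variant hits a Lynch-associated gene."""
          hits = [v for v in patient_variants
                  if v["gene"] in LYNCH_GENES and v["classification"] == "pathogenic"]
          if hits:
              return {"alert": "Consider referral for Lynch syndrome evaluation",
                      "evidence": hits}
          return {"alert": None, "evidence": []}

      variants = [{"gene": "MSH2", "classification": "pathogenic"}]
      print(evaluate_lynch_risk(variants))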

  2. Coaching Doctoral Students--A Means to Enhance Progress and Support Self-Organisation in Doctoral Education

    ERIC Educational Resources Information Center

    Godskesen, Mirjam; Kobayashi, Sofie

    2016-01-01

    In this paper we focus on individual coaching carried out by an external coach as a new pedagogical element that can impact doctoral students' sense of progress in doctoral education. The study used a mixed-methods approach in that we draw on quantitative and qualitative data from the evaluation of a project on coaching doctoral students. We…

  3. Fully Distributed Monitoring Architecture Supporting Multiple Trackees and Trackers in Indoor Mobile Asset Management Application

    PubMed Central

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2014-01-01

    A tracking service like asset management is essential in a dynamic hospital environment consisting of numerous mobile assets (e.g., wheelchairs or infusion pumps) that are continuously relocated throughout a hospital. The tracking service is accomplished based on the key technologies of an indoor location-based service (LBS), such as locating and monitoring multiple mobile targets inside a building in real time. An indoor LBS such as a tracking service entails numerous resource lookups being requested concurrently and frequently from several locations, as well as a network infrastructure requiring support for high scalability in indoor environments. A traditional centralized architecture needs to maintain a geographic map of the entire building or complex in its central server, which can cause low scalability and traffic congestion. This paper presents a self-organizing and fully distributed indoor mobile asset management (MAM) platform, and proposes an architecture for multiple trackees (such as mobile assets) and trackers based on the proposed distributed platform in real time. In order to verify the suggested platform, scalability performance according to increases in the number of concurrent lookups was evaluated in a real test bed. Tracking latency and traffic load ratio in the proposed tracking architecture were also evaluated. PMID:24662407

  4. Changing vegetation self organisation affecting eco-hydrological and geomorphological processes under invasion of blue bush in SE South Africa

    NASA Astrophysics Data System (ADS)

    Cammeraat, L. H.; Kakembo, V.

    2012-04-01

    In southeastern South Africa, sub-humid grasslands on abandoned soils are spontaneously being invaded by the exotic shrub Pteronia incana (Blue bush) originating from the semi-arid and arid Karoo region. This eventually results in soil loss, rill and gully erosion and, consequently, a loss in agricultural production that affects the local rural economy. Soil degradation follows the replacement of grassland by unpalatable shrubs, which alters the spatial organization of the vegetation. This, in turn, changes the eco-hydrological response of the hillslopes, leading to a dramatic increase in runoff and erosion. However, the reason for this spontaneous vegetation replacement is not clear. Various explanations have been proposed and discussed, such as overgrazing, vegetation cover and rainfall, drought, climatic change, or slope exposure. The study presented aims at quantifying the observed changes in the plant and bare-spot patterns, which may help us unravel vegetation self-organisation processes in relation to environmental disturbances. We analyzed high-resolution, low-altitude images of vegetation patterns in combination with high-resolution digital terrain model analysis. We applied this procedure to different patterns reflecting a time series that covers the observed changes. These reflect the changing (re-)organization of the plant patterns during the shrub invasion and incorporate the interaction between vegetation, water redistribution and soil properties. By doing so, we may be able to unravel critical processes, indicated by changes in vegetation patterns, that might enable us to mitigate the degradation of dryland ecosystems.

  5. Detecting tactical patterns in basketball: comparison of merge self-organising maps and dynamic controlled neural networks.

    PubMed

    Kempe, Matthias; Grunz, Andreas; Memmert, Daniel

    2015-01-01

    The soaring amount of data, especially spatial-temporal data, recorded in recent years demands advanced analysis methods. Neural networks derived from self-organizing maps have established themselves as a useful tool for analysing static and temporal data. In this study, we applied the merge self-organising map (MSOM) to spatio-temporal data. To do so, we investigated the ability of MSOMs to analyse spatio-temporal data and compared their performance to the common dynamically controlled network (DyCoN) approach for analysing team sport position data. The position data of 10 players were recorded via the Ubisense tracking system during a basketball game. Furthermore, three different pre-selected plays were recorded for classification. Following data preparation, the different nets were trained with the data of the first half. The training success of both networks was evaluated by the achieved entropy. The second half of the basketball game was presented to both nets for automatic classification. Both approaches were able to represent the trained data extremely well and to detect the pre-selected plays correctly. In conclusion, MSOMs are a useful tool for analysing spatial-temporal data, especially in team sports. By directly accommodating tactical patterns of different temporal lengths, they open up new opportunities within team sports.
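
    A rough sketch of one common merge-SOM formulation, in which each unit keeps a weight vector for the current input and a context vector for the recent past; the toy position stream, the parameters, and the winner-only update (neighbourhood cooperation omitted for brevity) are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      stream = rng.normal(size=(2000, 2))   # stand-in for one player's (x, y) positions

      n_units, dim = 64, 2
      w = rng.normal(size=(n_units, dim))   # weight vectors (current input)
      c = np.zeros((n_units, dim))          # context vectors (temporal history)
      alpha, gamma, lr = 0.5, 0.5, 0.1      # context weighting, merge factor, learning rate

      context = np.zeros(dim)
      for x in stream:
          # the winner is chosen using both the current input and the merged context
          d = (1 - alpha) * ((w - x) ** 2).sum(1) + alpha * ((c - context) ** 2).sum(1)
          b = int(np.argmin(d))
          w[b] += lr * (x - w[b])
          c[b] += lr * (context - c[b])
          # recursive context: merge of the winner's weight and context vectors
          context = (1 - gamma) * w[b] + gamma * c[b]

      print("first trained prototypes:", w[:3])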

  6. Runtime and Architecture Support for Efficient Data Exchange in Multi-Accelerator Applications

    PubMed Central

    Cabezas, Javier; Gelado, Isaac; Stone, John E.; Navarro, Nacho; Kirk, David B.; Hwu, Wen-mei

    2014-01-01

    Heterogeneous parallel computing applications often process large data sets that require multiple GPUs to jointly meet their needs for physical memory capacity and compute throughput. However, the lack of high-level abstractions in previous heterogeneous parallel programming models forces programmers to resort to multiple code versions, complex data copy steps and synchronization schemes when exchanging data between multiple GPU devices, which results in high software development cost, poor maintainability, and even poor performance. This paper describes the HPE runtime system, and the associated architecture support, which enables a simple, efficient programming interface for exchanging data between multiple GPUs through either interconnects or cross-node network interfaces. The runtime and architecture support presented in this paper can also be used to support other types of accelerators. We show that the simplified programming interface reduces programming complexity. The research presented in this paper started in 2009. It has been implemented and tested extensively in several generations of HPE runtime systems as well as adopted into the NVIDIA GPU hardware and drivers for CUDA 4.0 and beyond since 2011. The availability of real hardware that supports key HPE features gives rise to a rare opportunity for studying the effectiveness of the hardware support by running important benchmarks on real runtime and hardware. Experimental results show that in an exemplar heterogeneous system, peer DMA and double-buffering, pinned buffers, and software techniques can improve the inter-accelerator data communication bandwidth by 2×. They can also improve the execution speed by 1.6× for a 3D finite difference, 2.5× for 1D FFT, and 1.6× for merge sort, all measured on real hardware. The proposed architecture support enables the HPE runtime to transparently deploy these optimizations under simple portable user code, allowing system designers to freely employ devices of

  7. A web-services architecture designed for intermittent connectivity to support medical response to disasters.

    PubMed

    Brown, Steve; Griswold, William; Lenert, Leslie A

    2005-01-01

    To support mobile computing systems for first responders at mass casualty sites, as part of the WIISARD (Wireless Internet Information System for Medical Response in Disasters) project, we have developed a data architecture that gracefully handles an environment with frequent network failures and multiple writers, and that also supports rapid dissemination of updates that could be critical to the safety of responders. This is accomplished by allowing for a subset of the overall information available in a disaster scene to be cached locally on a responder's device and locally modified with or without network access. When the network is available, the local subset of the model is automatically synchronized with a server that contains the full model, and conflicts are resolved. When changes from a device are committed, the changes are instantly sent to any connected devices where the local subset would be modified by the changes. PMID:16779191
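
    A minimal sketch of the cache-and-synchronise idea described above; the record fields and the last-write-wins policy are assumptions for illustration, not the WIISARD implementation's actual conflict-resolution logic.

      # Toy local cache that can be modified offline and reconciled with a server copy.
      def sync(local, server):
          """Merge two {record_id: (timestamp, value)} maps; the newest value wins."""
          merged = dict(server)
          for rid, (ts, value) in local.items():
              if rid not in merged or ts > merged[rid][0]:
                  merged[rid] = (ts, value)
          return merged

      local_cache = {"patient-7": (105, "triage: yellow")}   # edited while offline
      server_copy = {"patient-7": (100, "triage: green"),
                     "patient-9": (101, "triage: red")}

      server_copy = sync(local_cache, server_copy)           # reconcile on reconnect
      local_cache = dict(server_copy)                        # push the subset back down
      print(server_copy)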

  8. DNA Tetrominoes: The Construction of DNA Nanostructures Using Self-Organised Heterogeneous Deoxyribonucleic Acids Shapes

    PubMed Central

    Ong, Hui San; Rahim, Mohd Syafiq; Firdaus-Raih, Mohd; Ramlan, Effirul Ikhwan

    2015-01-01

    The unique programmability of nucleic acids offers an alternative route to constructing excitable and functional nanostructures. This work introduces an autonomous protocol to construct DNA Tetris shapes (L-Shape, B-Shape, T-Shape and I-Shape) using modular DNA blocks. The protocol exploits the rich number of sequence combinations available from the nucleic acid alphabets, thus allowing for diversity to be applied in designing various DNA nanostructures. Instead of a deterministic set of sequences corresponding to a particular design, the protocol promotes a large pool of DNA shapes that can assemble to conform to any desired structures. By utilising evolutionary programming in the design stage, DNA blocks are subjected to processes such as sequence insertion, deletion and base shifting in order to enrich the diversity of the resulting shapes based on a set of cascading filters. The optimisation algorithm allows mutation to be exerted indefinitely on the candidate sequences until they comply with all four fitness criteria. Generated candidates from the protocol are in agreement with the filter cascades and thermodynamic simulation. Further validation using gel electrophoresis indicated the formation of the designed shapes, thus supporting the plausibility of constructing DNA nanostructures in a more hierarchical, modular, and interchangeable manner. PMID:26258940

  9. Guiding Principles for Data Architecture to Support the Pathways Community HUB Model

    PubMed Central

    Zeigler, Bernard P.; Redding, Sarah; Leath, Brenda A.; Carter, Ernest L.; Russell, Cynthia

    2016-01-01

    Introduction: The Pathways Community HUB Model provides a unique strategy to effectively supplement health care services with social services needed to overcome barriers for those most at risk of poor health outcomes. Pathways are standardized measurement tools used to define and track health and social issues from identification through to a measurable completion point. The HUB uses Pathways to coordinate agencies and service providers in the community to eliminate the inefficiencies and duplication that exist among them. Pathways Community HUB Model and Formalization: Experience with the Model has brought out the need for better information technology solutions to support implementation of the Pathways themselves through decision-support tools for care coordinators and other users to track activities and outcomes, and to facilitate reporting. Here we provide a basis for discussing recommendations for such a data infrastructure by developing a conceptual model that formalizes the Pathway concept underlying current implementations. Requirements for Data Architecture to Support the Pathways Community HUB Model: The main contribution is a set of core recommendations as a framework for developing and implementing a data architecture to support implementation of the Pathways Community HUB Model. The objective is to present a tool for communities interested in adopting the Model to learn from and to adapt in their own development and implementation efforts. Problems with Quality of Data Extracted from the CHAP Database: Experience with the Community Health Access Project (CHAP) database system (the core implementation of the Model) has identified several issues and remedies that have been developed to address these issues. Based on analysis of issues and remedies, we present several key features for a data architecture meeting the recommendations just mentioned. Implementation of Features: Presentation of features is followed by a practical guide to their implementation

  10. Knowledge base and sensor bus messaging service architecture for critical tsunami warning and decision-support

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.

    2012-04-01

    The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure that is supported by an agile knowledge-base for critical decision-support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge-base is being implemented to enable on-demand access to semantically rich OGC SWE-compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge-base development work, coupled with that of the generic sensor bus platform, shall be presented to demonstrate advanced decision-support with situation awareness in the context of tsunami early warning and crisis management.

  11. 1H NMR metabonomics of plasma lipoprotein subclasses: elucidation of metabolic clustering by self-organising maps.

    PubMed

    Suna, Teemu; Salminen, Aino; Soininen, Pasi; Laatikainen, Reino; Ingman, Petri; Mäkelä, Sanna; Savolainen, Markku J; Hannuksela, Minna L; Jauhiainen, Matti; Taskinen, Marja-Riitta; Kaski, Kimmo; Ala-Korpela, Mika

    2007-11-01

    (1)H NMR spectra of plasma are known to provide specific information on lipoprotein subclasses in the form of complex overlapping resonances. A combination of (1)H NMR and self-organising map (SOM) analysis was applied to investigate if automated characterisation of subclass-related metabolic interactions can be achieved. To reliably assess the intrinsic capability of (1)H NMR for resolving lipoprotein subclass profiles, sum spectra representing the pure lipoprotein subclass part of actual plasma were simulated with the aid of experimentally derived model signals for 11 distinct lipoprotein subclasses. Two biochemically characteristic categories of spectra, representing normolipidaemic and metabolic syndrome status, were generated with corresponding lipoprotein subclass profiles. A set of spectra representing a metabolic pathway between the two categories was also generated. The SOM analysis, based solely on the aliphatic resonances of these simulated spectra, clearly revealed the lipoprotein subclass profiles and their changes. Comparable SOM analysis in a group of 69 experimental (1)H NMR spectra of serum samples, which according to biochemical analyses represented a wide range of lipoprotein lipid concentrations, corroborated the findings based on the simulated data. Interestingly, the choline-N(CH(3))(3) region seems to provide more resolved clustering of lipoprotein subclasses in the SOM analyses than the methyl-CH(3) region commonly used for subclass quantification. The results illustrate the inherent suitability of (1)H NMR metabonomics for automated studies of lipoprotein subclass-related metabolism and demonstrate the power of SOM analysis in an extensive and representative case of (1)H NMR metabonomics.

  12. Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan

    2015-10-01

    Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.

  13. Using self-organising maps to explore ozone profile validation results - SCIAMACHY limb compared to ground-based lidar observations

    NASA Astrophysics Data System (ADS)

    van Gijsel, J. A. E.; Zurita-Milla, R.; Stammes, P.; Godin-Beekmann, S.; Leblanc, T.; Marchand, M.; McDermid, I. S.; Stebel, K.; Steinbrecht, W.; Swart, D. P. J.

    2015-05-01

    Traditional validation of atmospheric profiles is based on the intercomparison of two or more data sets in predefined ranges or classes of a given observational characteristic such as latitude or solar zenith angle. In this study we trained a self-organising map (SOM) with a full time series of relative difference profiles of SCIAMACHY limb v5.02 and lidar ozone profiles from seven observation sites. Each individual observation characteristic was then mapped to the obtained SOM to investigate to what degree variation in this characteristic explains the variation seen in the SOM map. For the global data set, altitude-dependent relations were found between the difference profiles and the studied variables. Ascending from the lowest altitude studied (18 km), the most influential factors were found to be longitude, followed by solar zenith angle and latitude, then sensor age, and again solar zenith angle together with the day of the year at the highest altitudes studied here (up to 45 km). After accounting for both latitude and longitude, residual partial correlations with a reduced magnitude are seen for various factors. However, (partial) correlations cannot point out which (combination) of the factors drives the observed differences between the ground-based and satellite ozone profiles as most of the factors are inter-related. Clustering into three classes showed that there are also some local dependencies, with, for instance, one cluster having a much stronger correlation with the sensor age (days since launch) between 36 and 42 km. The proposed SOM-based approach provides a powerful tool for the exploration of differences between data sets without being limited to a priori defined data subsets.
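
    A rough sketch of the mapping step described above: each difference profile is assigned to its best-matching unit on an already-trained SOM, and a given observation characteristic is then averaged per unit to see whether it organises on the map. The synthetic profiles, the stand-in characteristic, and the placeholder "trained" prototypes are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      profiles = rng.normal(size=(500, 30))   # stand-in for relative-difference profiles
      sza = rng.uniform(20, 80, size=500)     # stand-in observation characteristic

      # placeholder for prototypes of a SOM already trained on the profiles
      n_units = 49
      weights = profiles[rng.choice(len(profiles), n_units, replace=False)].copy()

      # assign every profile to its best-matching unit
      bmu = np.argmin(((profiles[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)

      # per-unit mean of the characteristic: structure in this 7x7 map suggests the
      # characteristic explains part of the variation captured by the SOM
      node_mean = np.array([sza[bmu == k].mean() if np.any(bmu == k) else np.nan
                            for k in range(n_units)])
      print(node_mean.reshape(7, 7).round(1))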

  14. Insights into environmental drivers of acoustic angular response using a self-organising map and hierarchical clustering

    NASA Astrophysics Data System (ADS)

    Daniell, James; Siwabessy, Justy; Nichol, Scott; Brooke, Brendan

    2015-10-01

    Acoustic backscatter from the seafloor is a complex function of signal frequency, seabed roughness, grain size distribution, benthos, bioturbation, volume reverberation, and other factors. Angular response (AR) is the variation in acoustic backscatter with incident angle and is considered to be an intrinsic property of the seabed. An unsupervised classification technique combining a self-organising map (SOM) and hierarchical clustering was used to create an angular response facies map and explore the relationships between acoustic facies and ground truth data. Cluster validation routines indicated that a two cluster solution was optimal and separated sediment dominated environments from mixtures of sediment and hard ground. Low cluster separation prevented the cluster validation routines from identifying fine cluster structure visible in an AR density plot. Cluster validation, aided by a visual comparison with an AR density plot, indicated that a 14 cluster solution was also a suitable representation of the input dataset. Clusters that were a mixture of hard and unconsolidated substrates displayed an increase in backscatter with an increase in the occurrence of hard ground and highlighted the sensitivity of AR curves to the presence of even modest amounts of hard ground. Remapping video observations and sediment data onto the SOM matrix is innovative and depicts the relationship between ground truth data and cluster structure. Mapping environmental variables onto the SOM matrix can show broad trends and localised peaks and troughs and display the variability of ground truth data within designated clusters. These variables, when linked to AR curves via clusters, can indicate how environmental factors influence the shape of the curves. Once these links are established they can be incorporated into improved geoacoustic models that replicate field observations.
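
    A minimal sketch of the two-stage idea described above (prototype vectors followed by hierarchical clustering), using scipy's agglomerative clustering; the synthetic angular-response curves and the two-cluster cut are assumptions for illustration.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(3)
      # stand-in SOM prototype vectors, each a backscatter-vs-incident-angle curve
      prototypes = np.vstack([rng.normal(0.0, 1.0, size=(30, 40)),   # sediment-like
                              rng.normal(2.0, 1.0, size=(30, 40))])  # mixed hard ground

      # Ward linkage on the prototypes, then cut the dendrogram into two clusters,
      # mirroring the two-cluster solution reported above.
      Z = linkage(prototypes, method="ward")
      labels = fcluster(Z, t=2, criterion="maxclust")
      print("prototypes per cluster:", np.bincount(labels)[1:])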

  15. An attention-gating recurrent working memory architecture for emergent speech representation

    NASA Astrophysics Data System (ADS)

    Elshaw, Mark; Moore, Roger K.; Klein, Michael

    2010-06-01

    This paper describes an attention-gating recurrent self-organising map approach for emergent speech representation. Inspired by evidence from human cognitive processing, the architecture combines two main neural components. The first component, the attention-gating mechanism, uses actor-critic learning to perform selective attention towards speech. Through this selective attention approach, the attention-gating mechanism controls access to working memory processing. The second component, the recurrent self-organising map memory, develops a temporal-distributed representation of speech using phone-like structures. Representing speech in terms of phonetic features in an emergent self-organised fashion, according to research on child cognitive development, recreates the approach found in infants. Using this representational approach, in a fashion similar to infants, should improve the performance of automatic recognition systems through aiding speech segmentation and fast word learning.

  16. A cost-effective WDM-PON architecture simultaneously supporting wired, wireless and optical VPN services

    NASA Astrophysics Data System (ADS)

    Wu, Yanzhi; Ye, Tong; Zhang, Liang; Hu, Xiaofeng; Li, Xinwan; Su, Yikai

    2011-03-01

    It is believed that next-generation passive optical networks (PONs) are required to provide flexible and varied services to users in a cost-effective way. To address this issue, for the first time, this paper proposes and demonstrates a novel wavelength-division-multiplexed PON (WDM-PON) architecture to simultaneously support three types of services: 1) wireless access traffic, 2) optical virtual private network (VPN) communications, and 3) conventional wired services. In the optical line terminal (OLT), we use two cascaded Mach-Zehnder modulators (MZMs) on each wavelength channel to generate an optical carrier, and produce the wireless and the downstream traffic using the orthogonal modulation technique. In each optical network unit (ONU), the obtained optical carrier is modulated by a single MZM to provide the VPN and upstream communications. Consequently, the light sources in the ONUs are saved and the system cost is reduced. The feasibility of our proposal is experimentally and numerically verified.

  17. Requirements for Designing Life Support System Architectures for Crewed Exploration Missions Beyond Low-Earth Orbit

    NASA Technical Reports Server (NTRS)

    Howard, David; Perry, Jay; Sargusingh, Miriam; Toomarian, Nikzad

    2016-01-01

    NASA's technology development roadmaps provide guidance to focus technological development on areas that enable crewed exploration missions beyond low-Earth orbit. Specifically, the technology area roadmap on human health, life support and habitation systems describes the need for life support system (LSS) technologies that can improve reliability and in-situ maintainability within a minimally-sized package while enabling a high degree of mission autonomy. To address the needs outlined by the guiding technology area roadmap, NASA's Advanced Exploration Systems (AES) Program has commissioned the Life Support Systems (LSS) Project to lead technology development in the areas of water recovery and management, atmosphere revitalization, and environmental monitoring. A notional exploration LSS architecture derived from the International Space Station (ISS) has been developed and serves as the developmental basis for these efforts. Functional requirements and key performance parameters that guide the exploration LSS technology development efforts are presented and discussed. Areas where LSS flight operations aboard the ISS afford lessons learned that are relevant to exploration missions are highlighted.

  18. Supporting Undergraduate Computer Architecture Students Using a Visual MIPS64 CPU Simulator

    ERIC Educational Resources Information Center

    Patti, D.; Spadaccini, A.; Palesi, M.; Fazzino, F.; Catania, V.

    2012-01-01

    The topics of computer architecture are always taught using an Assembly dialect as an example. The most commonly used textbooks in this field use the MIPS64 Instruction Set Architecture (ISA) to help students in learning the fundamentals of computer architecture because of its orthogonality and its suitability for real-world applications. This…

  19. NASA's Earth Science Gateway: A Platform for Interoperable Services in Support of the GEOSS Architecture

    NASA Astrophysics Data System (ADS)

    Alameh, N.; Bambacus, M.; Cole, M.

    2006-12-01

    NASA's Earth Science as well as interdisciplinary research and applications activities require access to Earth observations, analytical models and specialized tools and services from diverse distributed sources. Interoperability and open standards for geospatial data access and processing greatly facilitate such access among the information and processing components related to spacecraft, airborne, and in situ sensors; predictive models; and decision support tools. To support this mission, NASA's Geosciences Interoperability Office (GIO) has been developing the Earth Science Gateway (ESG; online at http://esg.gsfc.nasa.gov) by adapting and deploying a standards-based commercial product. Thanks to extensive use of open standards, ESG can tap into a wide array of online data services, serve a variety of audiences and purposes, and adapt to technology and business changes. Most importantly, the use of open standards allows ESG to function as a platform within a larger context of distributed geoscience processing, such as the Global Earth Observing System of Systems (GEOSS). ESG shares the goals of GEOSS to ensure that observations and products shared by users will be accessible, comparable, and understandable by relying on common standards and adaptation to user needs. By maximizing interoperability, modularity, extensibility and scalability, ESG's architecture fully supports the stated goals of GEOSS. As such, ESG's role extends beyond that of a gateway to NASA science data to become a shared platform that can be leveraged by GEOSS via: a modular and extensible architecture; consensus and community-based standards (e.g. ISO and OGC standards); a variety of clients and visualization techniques, including WorldWind and Google Earth; a variety of services (including catalogs) with standard interfaces; data integration and interoperability; mechanisms for user involvement and collaboration; and mechanisms for supporting interdisciplinary and domain-specific applications. ESG

  20. Architectures and Evaluation for Adjustable Control Autonomy for Space-Based Life Support Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

    In the past five years, a number of automation applications for control of crew life support systems have been developed and evaluated in the Adjustable Autonomy Testbed at NASA's Johnson Space Center. This paper surveys progress on an adjustable autonomous control architecture for situations where software and human operators work together to manage anomalies and other system problems. When problems occur, the level of control autonomy can be adjusted, so that operators and software agents can work together on diagnosis and recovery. In 1997 adjustable autonomy software was developed to manage gas transfer and storage in a closed life support test. Four crewmembers lived and worked in a chamber for 91 days, with both air and water recycling. CO2 was converted to O2 by gas processing systems and wheat crops. With the automation software, significantly fewer hours were spent monitoring operations. System-level validation testing of the software by interactive hybrid simulation revealed problems both in software requirements and implementation. Since that time, we have been developing multi-agent approaches for automation software and human operators, to cooperatively control systems and manage problems. Each new capability has been tested and demonstrated in realistic dynamic anomaly scenarios, using the hybrid simulation tool.

  1. A Scalable, Out-of-Band Diagnostics Architecture for International Space Station Systems Support

    NASA Technical Reports Server (NTRS)

    Fletcher, Daryl P.; Alena, Rick; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The computational infrastructure of the International Space Station (ISS) is a dynamic system that supports multiple vehicle subsystems such as Caution and Warning, Electrical Power Systems and Command and Data Handling (C&DH), as well as scientific payloads of varying size and complexity. The dynamic nature of the ISS configuration coupled with the increased demand for payload support places a significant burden on the inherently resource constrained computational infrastructure of the ISS. Onboard system diagnostics applications are hosted on computers that are elements of the avionics network while ground-based diagnostic applications receive only a subset of available telemetry, down-linked via S-band communications. In this paper we propose a scalable, out-of-band diagnostics architecture for ISS systems support that uses a read-only connection for C&DH data acquisition, which provides a lower cost of deployment and maintenance (versus a higher-criticality read-write connection). The diagnostics processing burden is off-loaded from the avionics network to elements of the on-board LAN that have a lower overall cost of operation and increased computational capacity. A superset of diagnostic data, richer in content than the configured telemetry, is made available to Advanced Diagnostic System (ADS) clients running on wireless handheld devices, affording the crew greater mobility for troubleshooting and providing improved insight into vehicle state. The superset of diagnostic data is made available to the ground in near real-time via an out-of-band downlink, providing a high level of fidelity between vehicle state and test, training and operational facilities on the ground.

  2. Enhancing Architecture-Implementation Conformance with Change Management and Support for Behavioral Mapping

    ERIC Educational Resources Information Center

    Zheng, Yongjie

    2012-01-01

    Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…

  3. Architecture and Functionality of the Advanced Life Support On-Line Project Information System (OPIS)

    NASA Technical Reports Server (NTRS)

    Hogan, John A.; Levri, Julie A.; Morrow, Rich; Cavazzoni, Jim; Rodriguez, Luis F.; Riano, Rebecca; Whitaker, Dawn R.

    2004-01-01

    An ongoing effort is underway at NASA Ames Research Center (ARC) to develop an On-line Project Information System (OPIS) for the Advanced Life Support (ALS) Program. The objective of this three-year project is to develop, test, revise and deploy OPIS to enhance the quality of decision-making metrics and attainment of Program goals through improved knowledge sharing. OPIS will centrally locate detailed project information solicited from investigators on an annual basis and make it readily accessible by the ALS Community via a web-accessible interface. The data will be stored in an object-oriented relational database (created in MySQL) located on a secure server at NASA ARC. OPIS will simultaneously serve several functions, including being an R&TD status information hub that can potentially serve as the primary annual reporting mechanism. Using OPIS, ALS managers and element leads will be able to carry out informed research and technology development investment decisions, and allow analysts to perform accurate systems evaluations. Additionally, the range and specificity of information solicited will serve to educate technology developers about programmatic needs. OPIS will collect comprehensive information from all ALS projects as well as highly detailed information specific to technology development in each ALS area (Waste, Water, Air, Biomass, Food, Thermal, and Control). Because the scope of needed information can vary dramatically between areas, element-specific technology information is being compiled with the aid of multiple specialized working groups. This paper presents the current development status in terms of the architecture and functionality of OPIS. Possible implementation approaches for OPIS are also discussed.

  4. Architecture and Functionality of the Advanced Life Support On-Line Project Information System

    NASA Technical Reports Server (NTRS)

    Hogan, John A.; Levri, Julie A.; Morrow, Rich; Cavazzoni, Jim; Rodriguez, Luis F.; Riano, Rebecca; Whitaker, Dawn R.

    2004-01-01

    An ongoing effort is underway at NASA Ames Research Center (ARC) to develop an On-line Project Information System (OPIS) for the Advanced Life Support (ALS) Program. The objective of this three-year project is to develop, test, revise and deploy OPIS to enhance the quality of decision-making metrics and attainment of Program goals through improved knowledge sharing. OPIS will centrally locate detailed project information solicited from investigators on an annual basis and make it readily accessible by the ALS Community via a Web-accessible interface. The data will be stored in an object-oriented relational database (created in MySQL) located on a secure server at NASA ARC. OPIS will simultaneously serve several functions, including being a research and technology development (R&TD) status information hub that can potentially serve as the primary annual reporting mechanism for ALS-funded projects. Using OPIS, ALS managers and element leads will be able to carry out informed R&TD investment decisions, and allow analysts to perform accurate systems evaluations. Additionally, the range and specificity of information solicited will serve to educate technology developers about programmatic needs. OPIS will collect comprehensive information from all ALS projects as well as highly detailed information specific to technology development in each ALS area (Waste, Water, Air, Biomass, Food, Thermal, Controls and Systems Analysis). Because the scope of needed information can vary dramatically between areas, element-specific technology information is being compiled with the aid of multiple specialized working groups. This paper presents the current development status in terms of the architecture and functionality of OPIS. Possible implementation approaches for OPIS are also discussed.

  5. Usalpharma: A Cloud-Based Architecture to Support Quality Assurance Training Processes in Health Area Using Virtual Worlds

    PubMed Central

    García-Peñalvo, Francisco J.; Pérez-Blanco, Jonás Samuel; Martín-Suárez, Ana

    2014-01-01

    This paper discusses how cloud-based architectures can extend and enhance the functionality of training environments based on virtual worlds and how, from this cloud perspective, support can be provided for the analysis of training processes in the health area, specifically training processes in quality assurance for pharmaceutical laboratories. We present a tool for data retrieval and analysis that supports knowledge discovery from the activity taking place inside the virtual worlds. PMID:24778593

  6. The Use of Supporting Documentation for Information Architecture by Australian Libraries

    ERIC Educational Resources Information Center

    Hider, Philip; Burford, Sally; Ferguson, Stuart

    2009-01-01

    This article reports the results of an online survey that examined the development of information architecture of Australian library Web sites with reference to documented methods and guidelines. A broad sample of library Web managers responded from across the academic, public, and special sectors. A majority of libraries used either in-house or…

  7. The Architecture of Support: The Activation of Pre-existing Ties and Formation of New Ties for Tailored Support

    PubMed Central

    LaValley, Susan; Panagakis, Christina; Shelton, Rachel C.

    2015-01-01

    This study examines differences in the resources, information, and support parents coping with pediatric cancer accessed from different types of network contacts. Using interviews with parents of childhood cancer patients (N = 80 parents), we examine (1) if parents rely on different types of network ties to access tailored information, resources or support; (2) differences in the nature or utility of information, resources, and support offered by different types of network contacts; and (3) the role of health-related professionals in brokering new network ties. Findings show that after a child’s cancer diagnosis, parents received support from a broad portfolio of network members, which included preexisting network ties to friends and families as well as the formation of new ties to other cancer families and health-related professionals. Family, friends, and neighbors offered logistical support that aided balancing preexisting work and household responsibilities with new obligations. Parents formed new ties to other families coping with cancer for tailored health-related emotional and informational support. Health-related professionals served as network brokers, who fostered the development of new network ties and connected parents with supportive resources. PMID:25888807

  8. Architectural design of a data warehouse to support operational and analytical queries across disparate clinical databases.

    PubMed

    Chelico, John D; Wilcox, Adam; Wajngurt, David

    2007-10-11

    As the clinical data warehouse of the New York Presbyterian Hospital has evolved, innovative methods of integrating new data sources and providing more effective and efficient data reporting and analysis need to be explored. We designed and implemented a new clinical data warehouse architecture to handle the integration of disparate clinical databases in the institution. By examining the way downstream systems are populated and streamlining the way data is stored, we create a virtual clinical data warehouse that is adaptable to future needs of the organization.

  9. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  10. Development of Groundwater Modeling Support System Based on Service-Oriented Architecture

    NASA Astrophysics Data System (ADS)

    WANG, Y.; Tsai, J. P.; Hsiao, C. T.; Chang, L. C.

    2014-12-01

    Groundwater simulation has become an essential step in groundwater resources management and assessment. There are many stand-alone pre- and post-processing software packages to alleviate the model simulation workload, but these stand-alone packages neither provide centralized management of data and simulation results nor offer network sharing functions. Model building is still carried out independently, case by case, when using these packages. Hence, it is difficult to share and reuse the data and knowledge (simulation cases) systematically within or across companies. Therefore, this study develops a centralized, network-based groundwater model development system to assist model simulation. The system is based on a service-oriented architecture and allows remote users to develop their modeling cases over the Internet. The data and cases (knowledge) are thus easy to manage centrally. The modeling engine of the system is MODFLOW, the most popular groundwater model in the world. Other functions include database management and a variety of web services that assist model development, including automatic digitization of geology profile maps, assistance in recovering missing groundwater data, graphical data display, and automatic generation of MODFLOW input files from the database, which is the most important function of the system. Since the system architecture is service-oriented, it is scalable and flexible. The system can be easily extended to include scenario analysis and knowledge management to facilitate the reuse of groundwater modeling knowledge.

  11. Using a service oriented architecture approach to clinical decision support: performance results from two CDS Consortium demonstrations.

    PubMed

    Paterno, Marilyn D; Goldberg, Howard S; Simonaitis, Linas; Dixon, Brian E; Wright, Adam; Rocha, Beatriz H; Ramelson, Harley Z; Middleton, Blackford

    2012-01-01

    The Clinical Decision Support Consortium has completed two demonstration trials involving a web service for the execution of clinical decision support (CDS) rules in one or more electronic health record (EHR) systems. The initial trial ran in a local EHR at Partners HealthCare. A second EHR site, associated with Wishard Memorial Hospital, Indianapolis, IN, was added in the second trial. Data were gathered during each 6 month period and analyzed to assess performance, reliability, and response time in the form of means and standard deviations for all technical components of the service, including assembling and preparation of input data. The mean service call time for each period was just over 2 seconds. In this paper we report on the findings and analysis to date while describing the areas for further analysis and optimization as we continue to expand our use of a Services Oriented Architecture approach for CDS across multiple institutions.
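
    A small sketch of the kind of response-time bookkeeping described above (means and standard deviations over repeated service calls); the timing harness and the simulated two-second call are placeholders, not the consortium's measurement code.

      import time
      import statistics
      import random

      def call_cds_service():
          """Placeholder for one end-to-end CDS web-service call."""
          time.sleep(random.uniform(1.8, 2.4))    # simulate a roughly 2 s round trip

      durations = []
      for _ in range(10):
          start = time.perf_counter()
          call_cds_service()
          durations.append(time.perf_counter() - start)

      print(f"mean {statistics.mean(durations):.2f} s, "
            f"stdev {statistics.stdev(durations):.2f} s")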

  12. Challenges with Deploying and Integrating Environmental Control and Life Support Functions in a Lunar Architecture with High Degrees of Mobility

    NASA Technical Reports Server (NTRS)

    Bagdigian, Robert M.

    2009-01-01

    Visions of lunar outposts often depict a collection of fixed elements such as pressurized habitats, in and around which human inhabitants spend the large majority of their surface stay time. In such an outpost, an efficient deployment of environmental control and life support equipment can be achieved by centralizing certain functions within one or a minimum number of habitable elements and relying on the exchange of gases and liquids between elements via atmosphere ventilation and plumbed interfaces. However, a rigidly fixed outpost can constrain the degree to which the total lunar landscape can be explored. The capability to enable widespread access across the landscape makes a lunar architecture with a high degree of surface mobility attractive. Such mobility presents unique challenges to the efficient deployment of environmental control and life support functions in multiple elements that may for long periods of time be operated independently. This paper describes some of those anticipated challenges.

  13. Novel architectured metal-supported solid oxide fuel cells with Mo-doped SrFeO3-δ electrocatalysts

    NASA Astrophysics Data System (ADS)

    Zhou, Yucun; Meng, Xie; Liu, Xuejiao; Pan, Xin; Li, Junliang; Ye, Xiaofeng; Nie, Huaiwen; Xia, Changrong; Wang, Shaorong; Zhan, Zhongliang

    2014-12-01

    Barriers to technological advancement of metal-supported SOFCs include nickel coarsening in the anode, metallic interdiffusion between the anode and the metal substrate, as well as poor cathode adhesion. Here we report a robust and novel architectured metal-supported SOFC that consists of a thin dense yttria-stabilized zirconia (YSZ) electrolyte layer sandwiched between a porous 430L stainless steel substrate and a porous YSZ thin layer. The key feature is simultaneous use of impregnated nano-scale SrFe0.75Mo0.25O3-δ coatings on the internal surfaces of the porous 430L and YSZ backbones respectively as the anode and cathode catalyst. Such a fuel cell exhibits power densities of 0.74 W cm-2 at 800 °C and 0.40 W cm-2 at 700 °C when operating on hydrogen fuels and air oxidants.

  14. Spatial and temporal patterns of bank failure during extreme flood events: Evidence of nonlinearity and self-organised criticality at the basin scale?

    NASA Astrophysics Data System (ADS)

    Thompson, C. J.; Croke, J. C.; Grove, J. R.

    2012-04-01

    Non-linearity in physical systems provides a conceptual framework to explain complex patterns and form that are derived from complex internal dynamics rather than external forcings, and can be used to inform modeling and improve landscape management. One process that has previously been investigated to explore the existence of self-organised criticality (SOC) in river systems at the basin scale is bank failure. Spatial trends in bank failure have previously been quantified to determine if the distribution of bank failures at the basin scale exhibits the necessary power-law magnitude/frequency distribution. More commonly, bank failures are investigated at a small scale using several cross-sections, with a strong emphasis on local-scale factors such as bank height, cohesion and hydraulic properties. Advancing our understanding of non-linearity in such processes, however, requires many more studies where both the spatial and temporal measurements of the process can be used to investigate the existence or otherwise of non-linearity and self-organised criticality. This study presents measurements of bank failure throughout the Lockyer catchment in southeast Queensland, Australia, which experienced an extreme flood event in January 2011 resulting in the loss of human lives and geomorphic channel change. The dominant form of fluvial adjustment consisted of changes in channel geometry and, notably, widespread bank failures, which were readily identifiable as 'scalloped' failure scarps. The spatial extents of these were mapped using a high-resolution LiDAR-derived digital elevation model and were verified by field surveys and air photos. Pre-flood LiDAR coverage for the catchment also existed, allowing direct comparison of the magnitude and frequency of bank failures between the pre- and post-flood periods. Data were collected and analysed within a GIS framework and investigated for power-law relationships. Bank failures appeared random and occurred
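
    As a rough illustration of testing for the power-law magnitude/frequency behaviour discussed above, the sketch below fits the exponent of a synthetic set of failure sizes with the standard maximum-likelihood estimator and checks the straight-line signature of the survival distribution on log-log axes; the data are synthetic, not the Lockyer measurements.

      import numpy as np

      rng = np.random.default_rng(4)
      # synthetic bank-failure "sizes" (e.g. scar areas) drawn from a power law
      x_min, alpha_true = 1.0, 2.3
      sizes = x_min * (1 - rng.random(2000)) ** (-1 / (alpha_true - 1))

      # maximum-likelihood exponent for a continuous power law above x_min
      alpha_hat = 1 + len(sizes) / np.log(sizes / x_min).sum()
      print(f"fitted exponent: {alpha_hat:.2f} (true value {alpha_true})")

      # straight-line check: slope of the complementary CDF on log-log axes
      ranked = np.sort(sizes)[::-1]
      ccdf = np.arange(1, len(ranked) + 1) / len(ranked)
      slope = np.polyfit(np.log(ranked), np.log(ccdf), 1)[0]
      print(f"log-log CCDF slope: {slope:.2f} (expect about {-(alpha_true - 1):.2f})")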

  15. PNNI routing support for ad hoc mobile networking: A flat architecture

    SciTech Connect

    Martinez, L.; Sholander, P.; Tolendino, L.

    1997-12-01

    This contribution extends the Outside Nodal Hierarchy List (ONHL) procedures described in ATM Forum Contribution 97-0766. These extensions allow multiple mobile networks to form either an ad hoc network or an extension of a fixed PNNI infrastructure. This contribution covers the simplest case where the top-most Logical Group Nodes (LGNs) in those mobile networks all reside at the same level in a PNNI hierarchy. Future contributions will cover the general case where those top-most LGNs reside at different hierarchy levels. This contribution considers a flat ad hoc network architecture--in the sense that each mobile network always participates in the PNNI hierarchy at the preconfigured level of its top-most LGN.

  16. Honeycomb architecture of carbon quantum dots: a new efficient substrate to support gold for stronger SERS

    NASA Astrophysics Data System (ADS)

    Fan, Yueqiong; Cheng, Huhu; Zhou, Ce; Xie, Xuejun; Liu, Yong; Dai, Liming; Zhang, Jing; Qu, Liangti

    2012-02-01

    The rational assembly of quantum dots (QDs) in a geometrically well-defined fashion opens up the possibility of accessing the full potential of the material and allows new functions of the assembled QDs to be achieved. In this work, well-confined two-dimensional (2D) and 3D carbon quantum dot (CQD) honeycomb structures have been assembled by electrodeposition of oxygen-rich functional CQDs within the interstitial voids of assemblies of SiO2 nanospheres, followed by extraction of the SiO2 cores with HF treatment. Although made from quantum sized carbon dots, the CQD assemblies present a solid porous framework, which can be further used as a sacrificial template for the fabrication of new nanostructures made from other functional materials. Based on the unique honeycomb architecture of the CQDs, which allows the more efficient adsorption of molecules, the formed Au nanoparticles on the CQD honeycomb exhibit 8-11 times stronger surface enhanced Raman scattering (SERS) effect than the widely used Au nanoparticle SERS substrate for the highly sensitive detection of target molecules. This work provides a new approach for the design and fabrication of ultrasensitive SERS platforms for various applications.

  17. Self-organised silicide nanodot patterning by medium-energy ion beam sputtering of Si(100): local correlation between the morphology and metal content

    NASA Astrophysics Data System (ADS)

    Redondo-Cubero, A.; Galiana, B.; Lorenz, K.; Palomares, FJ; Bahena, D.; Ballesteros, C.; Hernandez-Calderón, I.; Vázquez, L.

    2016-11-01

    We have produced self-organised silicide nanodot patterns by medium-energy ion beam sputtering (IBS) of silicon targets with a simultaneous and isotropic molybdenum supply. Atomic force microscopy (AFM) studies show that these patterns are qualitatively similar to those produced thus far at low ion energies. We have determined the relevance of the ion species on the pattern ordering and properties. For the higher ordered patterns produced by Xe+ ions, the pattern wavelength depends linearly on the ion energy. The dot nanostructures are silicide-rich as assessed by x-ray photoelectron spectroscopy (XPS) and emerge in height due to their lower sputtering yield, as observed by electron microscopy. Remarkably, a long wavelength corrugation is observed on the surface which is correlated with both the Mo content and the dot pattern properties. Thus, as assessed by electron microscopy, the protrusions are Mo-rich with higher and more spaced dots on their surface whereas the valleys are Mo-poor with smaller dots that are closer to each other. These findings indicate that there is a correlation between the local metal content of the surface and the nanodot pattern properties both at the nanodot and the large corrugation scales. These results contribute to advancing the understanding of this interesting nanofabrication method and aid in developing a comprehensive theory of nanodot pattern formation and evolution.

  18. Self-organised silicide nanodot patterning by medium-energy ion beam sputtering of Si(100): local correlation between the morphology and metal content.

    PubMed

    Redondo-Cubero, A; Galiana, B; Lorenz, K; Palomares, F J; Bahena, D; Ballesteros, C; Hernandez-Calderón, I; Vázquez, L

    2016-11-01

    We have produced self-organised silicide nanodot patterns by medium-energy ion beam sputtering (IBS) of silicon targets with a simultaneous and isotropic molybdenum supply. Atomic force microscopy (AFM) studies show that these patterns are qualitatively similar to those produced thus far at low ion energies. We have determined the relevance of the ion species on the pattern ordering and properties. For the higher ordered patterns produced by Xe(+) ions, the pattern wavelength depends linearly on the ion energy. The dot nanostructures are silicide-rich as assessed by x-ray photoelectron spectroscopy (XPS) and emerge in height due to their lower sputtering yield, as observed by electron microscopy. Remarkably, a long wavelength corrugation is observed on the surface which is correlated with both the Mo content and the dot pattern properties. Thus, as assessed by electron microscopy, the protrusions are Mo-rich with higher and more spaced dots on their surface whereas the valleys are Mo-poor with smaller dots that are closer to each other. These findings indicate that there is a correlation between the local metal content of the surface and the nanodot pattern properties both at the nanodot and the large corrugation scales. These results contribute to advancing the understanding of this interesting nanofabrication method and aid in developing a comprehensive theory of nanodot pattern formation and evolution.

  19. Self-organised silicide nanodot patterning by medium-energy ion beam sputtering of Si(100): local correlation between the morphology and metal content.

    PubMed

    Redondo-Cubero, A; Galiana, B; Lorenz, K; Palomares, F J; Bahena, D; Ballesteros, C; Hernandez-Calderón, I; Vázquez, L

    2016-11-01

    We have produced self-organised silicide nanodot patterns by medium-energy ion beam sputtering (IBS) of silicon targets with a simultaneous and isotropic molybdenum supply. Atomic force microscopy (AFM) studies show that these patterns are qualitatively similar to those produced thus far at low ion energies. We have determined the relevance of the ion species on the pattern ordering and properties. For the higher ordered patterns produced by Xe(+) ions, the pattern wavelength depends linearly on the ion energy. The dot nanostructures are silicide-rich as assessed by x-ray photoelectron spectroscopy (XPS) and emerge in height due to their lower sputtering yield, as observed by electron microscopy. Remarkably, a long wavelength corrugation is observed on the surface which is correlated with both the Mo content and the dot pattern properties. Thus, as assessed by electron microscopy, the protrusions are Mo-rich with higher and more spaced dots on their surface whereas the valleys are Mo-poor with smaller dots that are closer to each other. These findings indicate that there is a correlation between the local metal content of the surface and the nanodot pattern properties both at the nanodot and the large corrugation scales. These results contribute to advancing the understanding of this interesting nanofabrication method and aid in developing a comprehensive theory of nanodot pattern formation and evolution. PMID:27670245

  20. Functional annotation of the mesophilic-like character of mutants in a cold-adapted enzyme by self-organising map analysis of their molecular dynamics.

    PubMed

    Fraccalvieri, Domenico; Tiberti, Matteo; Pandini, Alessandro; Bonati, Laura; Papaleo, Elena

    2012-10-01

    Multiple comparison of the Molecular Dynamics (MD) trajectories of mutants in a cold-adapted α-amylase (AHA) could be used to elucidate functional features required to restore mesophilic-like activity. Unfortunately it is challenging to identify the different dynamic behaviors and correctly relate them to functional activity by routine analysis. We here employed a previously developed and robust two-stage approach that combines Self-Organising Maps (SOMs) and hierarchical clustering to compare conformational ensembles of proteins. Moreover, we designed a novel strategy to identify the specific mutations that more efficiently convert the dynamic signature of the psychrophilic enzyme (AHA) to that of the mesophilic counterpart (PPA). The SOM trained on AHA and its variants was used to classify a PPA MD ensemble and successfully highlighted the relationships between the flexibilities of the target enzyme and of the different mutants. Moreover the local features of the mutants that mostly influence their global flexibility in a mesophilic-like direction were detected. It turns out that mutations of the cold-adapted enzyme to hydrophobic and aromatic residues are the most effective in restoring the PPA dynamic features and could guide the design of more mesophilic-like mutants. In conclusion, our strategy can efficiently extract specific dynamic signatures related to function from multiple comparisons of MD conformational ensembles. Therefore, it can be a promising tool for protein engineering.
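
    The two-stage idea (train a SOM on conformational descriptors, then hierarchically cluster the SOM prototypes) can be sketched generically as below. This is not the authors' implementation: the descriptor choice, grid size and training schedule are placeholders, and the ensemble is synthetic.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import cdist

def train_som(frames, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Tiny online SOM; frames is an (n_frames, n_features) array of
    MD-derived descriptors (e.g. flattened Calpha distances)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)
    weights = frames[rng.choice(len(frames), gx * gy, replace=True)].astype(float)
    n_iter = epochs * len(frames)
    for t, idx in enumerate(rng.integers(0, len(frames), n_iter)):
        x = frames[idx]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

# second stage: hierarchical clustering of the SOM prototypes
frames = np.random.default_rng(1).normal(size=(500, 30))   # placeholder ensemble
prototypes = train_som(frames)
Z = linkage(prototypes, method="ward")
proto_labels = fcluster(Z, t=4, criterion="maxclust")
# map every frame to the cluster of its best-matching prototype
frame_labels = proto_labels[np.argmin(cdist(frames, prototypes), axis=1)]
```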

  1. Reconfiguration of brain network architecture to support executive control in aging.

    PubMed

    Gallen, Courtney L; Turner, Gary R; Adnan, Areeba; D'Esposito, Mark

    2016-08-01

    Aging is accompanied by declines in executive control abilities and changes in underlying brain network architecture. Here, we examined brain networks in young and older adults during a task-free resting state and an N-back task and investigated age-related changes in the modular network organization of the brain. Compared with young adults, older adults showed larger changes in network organization between resting state and task. Although young adults exhibited increased connectivity between lateral frontal regions and other network modules during the most difficult task condition, older adults also exhibited this pattern of increased connectivity during less-demanding task conditions. Moreover, the increase in between-module connectivity in older adults was related to faster task performance and greater fractional anisotropy of the superior longitudinal fasciculus. These results demonstrate that older adults who exhibit more pronounced network changes between a resting state and task have better executive control performance and greater structural connectivity of a core frontal-posterior white matter pathway.
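
    A minimal illustration of the between-module connectivity quantity discussed above is sketched below. Community detection here uses networkx's greedy modularity maximisation as a stand-in for whatever partitioning the study employed, and the rest/task matrices are synthetic.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def between_module_connectivity(fc, modules):
    """Mean functional connectivity between nodes assigned to different modules."""
    label = np.empty(fc.shape[0], dtype=int)
    for m, nodes in enumerate(modules):
        label[list(nodes)] = m
    mask = label[:, None] != label[None, :]
    return fc[mask].mean()

def modules_from_fc(fc, threshold=0.3):
    """Derive a modular partition from a thresholded resting-state FC matrix."""
    adj = (fc > threshold).astype(float)
    np.fill_diagonal(adj, 0.0)
    return list(greedy_modularity_communities(nx.from_numpy_array(adj)))

# synthetic rest/task FC matrices with three built-in "systems"
rng = np.random.default_rng(0)
n, t = 90, 200
base = rng.normal(size=(3, t))
signals = np.repeat(base, n // 3, axis=0) + 0.8 * rng.normal(size=(n, t))
rest_fc = np.corrcoef(signals)
task_fc = np.corrcoef(signals + 0.3 * rng.normal(size=(n, t)))

mods = modules_from_fc(rest_fc)                      # partition defined at rest
delta = (between_module_connectivity(task_fc, mods)
         - between_module_connectivity(rest_fc, mods))
print(f"change in between-module connectivity (task - rest): {delta:+.3f}")
```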

  2. Structural architecture supports functional organization in the human aging brain at a regionwise and network level.

    PubMed

    Zimmermann, Joelle; Ritter, Petra; Shen, Kelly; Rothmeier, Simon; Schirner, Michael; McIntosh, Anthony R

    2016-07-01

    Functional interactions in the brain are constrained by the underlying anatomical architecture, and structural and functional networks share network features such as modularity. Accordingly, age-related changes of structural connectivity (SC) may be paralleled by changes in functional connectivity (FC). We provide a detailed qualitative and quantitative characterization of the SC-FC coupling in human aging as inferred from resting-state blood oxygen-level dependent functional magnetic resonance imaging and diffusion-weighted imaging in a sample of 47 adults with an age range of 18-82. We revealed that SC and FC decrease with age across most parts of the brain and there is a distinct age-dependency of regionwise SC-FC coupling and network-level SC-FC relations. A specific pattern of SC-FC coupling predicts age more reliably than does regionwise SC or FC alone (r = 0.73, 95% CI = [0.7093, 0.8522]). Hence, our data propose that regionwise SC-FC coupling can be used to characterize brain changes in aging. Hum Brain Mapp 37:2645-2661, 2016. © 2016 Wiley Periodicals, Inc. PMID:27041212
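
    Regionwise SC-FC coupling as described above amounts to correlating, for each region, its structural and functional connectivity profiles. The toy sketch below illustrates the computation and a naive least-squares fit of age on the coupling pattern; matrix sizes and data are placeholders, and a real analysis would use cross-validation rather than the in-sample fit shown here.

```python
import numpy as np
from scipy.stats import pearsonr

def regionwise_coupling(sc, fc):
    """Per-region correlation between structural and functional connectivity
    profiles (the off-diagonal entries of each region's row)."""
    n = sc.shape[0]
    out = np.zeros(n)
    for i in range(n):
        mask = np.arange(n) != i
        out[i] = pearsonr(sc[i, mask], fc[i, mask])[0]
    return out

# toy cohort: random SC/FC pairs and ages (real data would come from dMRI/fMRI)
rng = np.random.default_rng(0)
n_sub, n_reg = 47, 10
ages = rng.uniform(18, 82, n_sub)
X = np.stack([regionwise_coupling(np.abs(rng.normal(size=(n_reg, n_reg))),
                                  np.abs(rng.normal(size=(n_reg, n_reg))))
              for _ in range(n_sub)])

# ordinary least squares of age on the coupling pattern (illustration only)
A = np.column_stack([np.ones(n_sub), X])
beta, *_ = np.linalg.lstsq(A, ages, rcond=None)
r = np.corrcoef(A @ beta, ages)[0, 1]
print(f"in-sample correlation between fitted and true age: r = {r:.2f}")
```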

  3. Thioflavin T-Silent Denaturation Intermediates Support the Main-Chain-Dominated Architecture of Amyloid Fibrils.

    PubMed

    Noda, Sayaka; So, Masatomo; Adachi, Masayuki; Kardos, József; Akazawa-Ogawa, Yoko; Hagihara, Yoshihisa; Goto, Yuji

    2016-07-19

    Ultrasonication is considered one of the most effective agitations for inducing the spontaneous formation of amyloid fibrils. When we induced the ultrasonication-dependent fibrillation of β2-microglobulin and insulin monitored by amyloid-specific thioflavin T (ThT) fluorescence, both proteins showed a significant decrease in ThT fluorescence after the burst-phase increase. The decrease in ThT fluorescence was accelerated when the ultrasonic power was stronger, suggesting that this decrease was caused by the partial denaturation of preformed fibrils. The possible intermediates of denaturation retained amyloid-like morphologies, secondary structures, and seeding potentials. Similar denaturation intermediates were also observed when fibrils were denatured by guanidine hydrochloride or sodium dodecyl sulfate. The presence of these denaturation intermediates is consistent with the main-chain-dominated architecture of amyloid fibrils. Moreover, in the three types of denaturation experiments conducted, insulin fibrils were more stable than β2-microglobulin fibrils, suggesting that the relative stability of various fibrils is independent of the method of denaturation. PMID:27345358

  4. Architectural proteins Pita, Zw5, and ZIPIC contain homodimerization domain and support specific long-range interactions in Drosophila

    PubMed Central

    Zolotarev, Nikolay; Fedotova, Anna; Kyrchanova, Olga; Bonchuk, Artem; Penin, Aleksey A.; Lando, Andrey S.; Eliseeva, Irina A.; Kulakovskiy, Ivan V.; Maksimenko, Oksana; Georgiev, Pavel

    2016-01-01

    According to recent models, as yet poorly studied architectural proteins appear to be required for local regulation of enhancer–promoter interactions, as well as for global chromosome organization. Transcription factors ZIPIC, Pita and Zw5 belong to the class of chromatin insulator proteins and preferentially bind to promoters near the TSS and extensively colocalize with cohesin and condensin complexes. ZIPIC, Pita and Zw5 are structurally similar in containing the N-terminal zinc finger-associated domain (ZAD) and different numbers of C2H2-type zinc fingers at the C-terminus. Here we have shown that the ZAD domains of ZIPIC, Pita and Zw5 form homodimers. In Drosophila transgenic lines, these proteins are able to support long-distance interaction between GAL4 activator and the reporter gene promoter. However, no functional interaction between binding sites for different proteins has been revealed, suggesting that such interactions are highly specific. ZIPIC facilitates long-distance stimulation of the reporter gene by GAL4 activator in yeast model system. Many of the genomic binding sites of ZIPIC, Pita and Zw5 are located at the boundaries of topologically associated domains (TADs). Thus, ZAD-containing zinc-finger proteins can be attributed to the class of architectural proteins. PMID:27137890

  5. Architectural proteins Pita, Zw5, and ZIPIC contain homodimerization domain and support specific long-range interactions in Drosophila.

    PubMed

    Zolotarev, Nikolay; Fedotova, Anna; Kyrchanova, Olga; Bonchuk, Artem; Penin, Aleksey A; Lando, Andrey S; Eliseeva, Irina A; Kulakovskiy, Ivan V; Maksimenko, Oksana; Georgiev, Pavel

    2016-09-01

    According to recent models, as yet poorly studied architectural proteins appear to be required for local regulation of enhancer-promoter interactions, as well as for global chromosome organization. Transcription factors ZIPIC, Pita and Zw5 belong to the class of chromatin insulator proteins and preferentially bind to promoters near the TSS and extensively colocalize with cohesin and condensin complexes. ZIPIC, Pita and Zw5 are structurally similar in containing the N-terminal zinc finger-associated domain (ZAD) and different numbers of C2H2-type zinc fingers at the C-terminus. Here we have shown that the ZAD domains of ZIPIC, Pita and Zw5 form homodimers. In Drosophila transgenic lines, these proteins are able to support long-distance interaction between GAL4 activator and the reporter gene promoter. However, no functional interaction between binding sites for different proteins has been revealed, suggesting that such interactions are highly specific. ZIPIC facilitates long-distance stimulation of the reporter gene by GAL4 activator in yeast model system. Many of the genomic binding sites of ZIPIC, Pita and Zw5 are located at the boundaries of topologically associated domains (TADs). Thus, ZAD-containing zinc-finger proteins can be attributed to the class of architectural proteins. PMID:27137890

  6. Does Supporting Multiple Student Strategies Lead to Greater Learning and Motivation? Investigating a Source of Complexity in the Architecture of Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Waalkens, Maaike; Aleven, Vincent; Taatgen, Niels

    2013-01-01

    Intelligent tutoring systems (ITS) support students in learning a complex problem-solving skill. One feature that makes an ITS architecturally complex, and hard to build, is support for strategy freedom, that is, the ability to let students pursue multiple solution strategies within a given problem. But does greater freedom mean that students…

  7. Adaptive and Speculative Memory Consistency Support for Multi-core Architectures with On-Chip Local Memories

    NASA Astrophysics Data System (ADS)

    Vujic, Nikola; Alvarez, Lluc; Tallada, Marc Gonzalez; Martorell, Xavier; Ayguadé, Eduard

    Software caching has been shown to be a robust approach in multi-core systems with no hardware support for transparent data transfers between local and global memories. A software cache provides the user with a transparent view of the memory architecture and considerably improves the programmability of such systems. However, this software approach can suffer from poor performance due to the considerable overheads of the software mechanisms that maintain memory consistency. This paper presents a set of alternatives to reduce their impact. A specific write-back mechanism is introduced based on some degree of speculation regarding the number of threads actually modifying the same cache lines. A case study based on the Cell BE processor is described. Performance evaluation indicates that the improvements due to the optimized software-cache structures, combined with the proposed code optimizations, translate into speedup factors of 20% to 40% compared with a traditional software cache approach.

  8. Insight into the Supramolecular Architecture of Intact Diatom Biosilica from DNP-Supported Solid-State NMR Spectroscopy.

    PubMed

    Jantschke, Anne; Koers, Eline; Mance, Deni; Weingarth, Markus; Brunner, Eike; Baldus, Marc

    2015-12-01

    Diatom biosilica is an inorganic/organic hybrid with interesting properties. The molecular architecture of the organic material at the atomic and nanometer scale has so far remained unknown, in particular for intact biosilica. A DNP-supported ssNMR approach assisted by microscopy, MS, and MD simulations was applied to study the structural organization of intact biosilica. For the first time, the secondary structure elements of tightly biosilica-associated native proteins in diatom biosilica were characterized in situ. Our data suggest that these proteins are rich in a limited set of amino acids and adopt a mixture of random-coil and β-strand conformations. Furthermore, biosilica-associated long-chain polyamines and carbohydrates were characterized, thereby leading to a model for the supramolecular organization of intact biosilica.

  9. Insight into the Supramolecular Architecture of Intact Diatom Biosilica from DNP-Supported Solid-State NMR Spectroscopy.

    PubMed

    Jantschke, Anne; Koers, Eline; Mance, Deni; Weingarth, Markus; Brunner, Eike; Baldus, Marc

    2015-12-01

    Diatom biosilica is an inorganic/organic hybrid with interesting properties. The molecular architecture of the organic material at the atomic and nanometer scale has so far remained unknown, in particular for intact biosilica. A DNP-supported ssNMR approach assisted by microscopy, MS, and MD simulations was applied to study the structural organization of intact biosilica. For the first time, the secondary structure elements of tightly biosilica-associated native proteins in diatom biosilica were characterized in situ. Our data suggest that these proteins are rich in a limited set of amino acids and adopt a mixture of random-coil and β-strand conformations. Furthermore, biosilica-associated long-chain polyamines and carbohydrates were characterized, thereby leading to a model for the supramolecular organization of intact biosilica. PMID:26509491

  10. The kinematic architecture of the Active Headframe: A new head support for awake brain surgery.

    PubMed

    Malosio, Matteo; Negri, Simone Pio; Pedrocchi, Nicola; Vicentini, Federico; Cardinale, Francesco; Tosatti, Lorenzo Molinari

    2012-01-01

    This paper presents the novel hybrid kinematic structure of the Active Headframe, a robotic head support to be employed in brain surgery operations for an active and dynamic control of the patient's head position and orientation, particularly addressing awake surgery requirements. The topology has been conceived in order to satisfy all the installation, functional and dynamic requirements. A kinetostatic optimization has been performed to obtain the actual geometric dimensions of the prototype currently being developed. PMID:23366166

  11. The kinematic architecture of the Active Headframe: A new head support for awake brain surgery.

    PubMed

    Malosio, Matteo; Negri, Simone Pio; Pedrocchi, Nicola; Vicentini, Federico; Cardinale, Francesco; Tosatti, Lorenzo Molinari

    2012-01-01

    This paper presents the novel hybrid kinematic structure of the Active Headframe, a robotic head support to be employed in brain surgery operations for an active and dynamic control of the patient's head position and orientation, particularly addressing awake surgery requirements. The topology has been conceived in order to satisfy all the installation, functional and dynamic requirements. A kinetostatic optimization has been performed to obtain the actual geometric dimensions of the prototype currently being developed.

  12. Software Architecture to Support the Evolution of the ISRU RESOLVE Engineering Breadboard Unit 2 (EBU2)

    NASA Technical Reports Server (NTRS)

    Moss, Thomas; Nurge, Mark; Perusich, Stephen

    2011-01-01

    The In-Situ Resource Utilization (ISRU) Regolith & Environmental Science and Oxygen & Lunar Volatiles Extraction (RESOLVE) software provides operation of the physical plant from a remote location with a high-level interface that can access and control the data from external software applications of other subsystems. This software allows autonomous control over the entire system with manual computer control of individual system/process components. It gives non-programmer operators the capability to easily modify the high-level autonomous sequencing while the software is in operation, as well as the ability to modify the low-level, file-based sequences prior to the system operation. Local automated control in a distributed system is also enabled where component control is maintained during the loss of network connectivity with the remote workstation. This innovation also minimizes network traffic. The software architecture commands and controls the latest generation of RESOLVE processes used to obtain, process, and quantify lunar regolith. The system is grouped into six sub-processes: Drill, Crush, Reactor, Lunar Water Resource Demonstration (LWRD), Regolith Volatiles Characterization (RVC) (see example), and Regolith Oxygen Extraction (ROE). Some processes are independent, some are dependent on other processes, and some are independent but run concurrently with other processes. The first goal is to analyze the volatiles emanating from lunar regolith, such as water, carbon monoxide, carbon dioxide, ammonia, hydrogen, and others. This is done by heating the soil and analyzing and capturing the volatilized product. The second goal is to produce water by reducing the soil at high temperatures with hydrogen. This is done by raising the reactor temperature in the range of 800 to 900 C, causing the reaction to progress by adding hydrogen, and then capturing the water product in a desiccant bed. The software needs to run the entire unit and all sub-processes; however
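
    A highly simplified, hypothetical sketch of the orchestration pattern described above (file-based, operator-editable sequences driving named sub-processes, with local control retained if the remote link drops) is given below; the sequence format, class names and behaviour are illustrative assumptions, not the RESOLVE code.

```python
import json
import time

class SubProcess:
    """Minimal stand-in for one RESOLVE sub-process (Drill, Crush, Reactor, ...)."""
    def __init__(self, name):
        self.name = name
    def execute(self, action):
        print(f"[{self.name}] {action}")      # real code would command hardware

class Sequencer:
    """Runs a file-based, operator-editable sequence (format hypothetical)."""
    def __init__(self, subprocesses):
        self.subs = {s.name: s for s in subprocesses}
    def run(self, sequence_file, link_up=lambda: True):
        with open(sequence_file) as f:        # reloaded each run, editable offline
            steps = json.load(f)
        for step in steps:
            if not link_up():
                print("remote link lost: continuing under local automated control")
            self.subs[step["subprocess"]].execute(step["action"])
            time.sleep(step.get("dwell_s", 0))

# usage: a two-step sequence stored as JSON, e.g.
# [{"subprocess": "Reactor", "action": "heat to 850 C", "dwell_s": 0},
#  {"subprocess": "LWRD", "action": "capture water in desiccant bed", "dwell_s": 0}]
```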

  13. CranialCloud: A cloud-based architecture to support trans-institutional collaborative efforts in neuro-degenerative disorders

    PubMed Central

    D’Haese, Pierre-Francois; Konrad, Peter E.; Pallavaram, Srivatsan; Li, Rui; Prassad, Priyanka; Rodriguez, William; Dawant, Benoit M.

    2015-01-01

    Purpose Neurological diseases have a devastating impact on millions of individuals and their families. These diseases will continue to constitute a significant research focus for this century. The search for effective treatments and cures requires multiple teams of experts in clinical neurosciences, neuroradiology, engineering and industry. Hence, the ability to communicate a large amount of information with accuracy and precision is more necessary than ever for this specialty. Method In this paper, we present a distributed system that supports this vision, which we call the CranialVault Cloud (CranialCloud). It consists of a network of nodes, each with the capability to store and process data, that share the same spatial normalization processes, thus guaranteeing a common reference space. We detail and justify design choices, the architecture and functionality of individual nodes, the way these nodes interact, and how the distributed system can be used to support inter-institutional research. Results We discuss the current state of the system, which gathers data for more than 1,600 patients, and how we envision its growth. Conclusions We contend that the fastest way to find and develop promising treatments and cures is to permit teams of researchers to aggregate data, spatially normalize these data, and share them. The CranialVault system supports this vision. PMID:25861055

  14. Guiding Requirements for Designing Life Support System Architectures for Crewed Exploration Missions Beyond Low-Earth Orbit

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Sargusingh, Miriam J.; Toomarian, Nikzad

    2016-01-01

    The National Aeronautics and Space Administration's (NASA) technology development roadmaps provide guidance to focus technological development in areas that enable crewed exploration missions beyond low-Earth orbit. Specifically, the technology area roadmap on human health, life support and habitation systems describes the need for life support system (LSS) technologies that can improve reliability and in-flight maintainability within a minimally-sized package while enabling a high degree of mission autonomy. To address the needs outlined by the guiding technology area roadmap, NASA's Advanced Exploration Systems (AES) Program has commissioned the Life Support Systems (LSS) Project to lead technology development in the areas of water recovery and management, atmosphere revitalization, and environmental monitoring. A notional exploration LSS architecture derived from the International Space Station (ISS) has been developed and serves as the developmental basis for these efforts. Functional requirements and key performance parameters that guide the exploration LSS technology development efforts are presented and discussed. Areas where LSS flight operations aboard the ISS afford lessons learned that are relevant to exploration missions are highlighted.

  15. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

    SciTech Connect

    Sottille, Matthew

    2013-09-12

    This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments, technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.

  16. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    to execute the software in a modern single-processor workstation, and therefore real-time operation is currently not possible. A different number of iterations may be required to perform spectral data fitting per spectral sample. Yet, the OPAD system must be designed to maintain real-time performance in all cases. Although faster single-processor workstations are available for execution of the fitting and SPECTRA software, this option is unattractive due to the excessive cost associated with very fast workstations and also due to the fact that such hardware is not easily expandable to accommodate future versions of the software which may require more processing power. Initial research has already demonstrated that the OPAD software can take advantage of a parallel computer architecture to achieve the necessary speedup. Current work has improved the software by converting it into a form which is easily parallelizable. Timing experiments have been performed to establish the computational complexity and execution speed of major components of the software. This work provides the foundation of future work which will create a fully parallel version of the software executing in a shared-memory multiprocessor system.
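
    The abstract above argues that per-sample spectral fits are independent and can therefore be distributed across processors to recover real-time performance. The sketch below illustrates that idea with a generic data-parallel fit in Python; it is not the OPAD/SPECTRA code, which targeted a shared-memory multiprocessor, and the Gaussian model and sample data are placeholders.

```python
import numpy as np
from multiprocessing import Pool
from scipy.optimize import curve_fit

def gauss(x, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def fit_sample(args):
    """Fit one spectral sample; the iterative fit dominates the cost, so
    samples can be farmed out to worker processes independently."""
    x, y = args
    popt, _ = curve_fit(gauss, x, y, p0=[y.max(), x[np.argmax(y)], 1.0])
    return popt

if __name__ == "__main__":
    x = np.linspace(300.0, 900.0, 512)                 # wavelength grid (nm)
    rng = np.random.default_rng(0)
    samples = [(x, gauss(x, 5.0, 500 + 10 * i, 8.0) + rng.normal(0, 0.05, x.size))
               for i in range(64)]
    with Pool() as pool:
        params = pool.map(fit_sample, samples)          # one fit per sample, in parallel
    print(np.round(params[0], 2))
```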

  17. NASA's Earth Science Gateway within the GEOSS Architecture Framework and in Support of Distributed Global Systems

    NASA Astrophysics Data System (ADS)

    Alameh, N.; Cole, M.; Bambacus, M.; Thomas, R.

    2007-12-01

    Progress continues within the arena of interoperability towards greater discovery, access, and use of scientific data regarding improved societal benefit and decision solutions. The Group on Earth Observation System of Systems has developed multiple pilot projects in which many of these maturing and emerging technologies are being interconnected, tested, and implemented as operational systems. Within this network of components are data stores, registries, catalogs, portals, models, work-process flows, satellites, UAVs, and many more components. The pilots tackle the intricate process of ensuring these components properly work together within the framework of open-standards based interoperability, and enabling the vision of a distributed, comprehensible, global system of scientific tools and data for the lay-person as well as the researcher. This paper will concentrate on the NASA Earth Science Gateway (http://esg.gsfc.nasa.gov) view of interconnections to the registries, catalogs, models, and so on, that support this System of Systems. This paper covers what standards were used, how components can be connected and tested, where difficulties emerged, where we have seen return on investment, and how this level of interoperability is progressing.

  18. Self-Organizing Distributed Architecture Supporting Dynamic Space Expanding and Reducing in Indoor LBS Environment

    PubMed Central

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2015-01-01

    Indoor location-based services (iLBS) are extremely dynamic and changeable, and include numerous resources and mobile devices. In particular, the network infrastructure requires support for high scalability in the indoor environment, and various resource lookups are requested concurrently and frequently from several locations based on the dynamic network environment. A traditional map-based centralized approach for iLBSs has several disadvantages: it requires global knowledge to maintain a complete geographic indoor map; the central server is a single point of failure; it can also cause low scalability and traffic congestion; and it is hard to adapt to a change of service area in real time. This paper proposes a self-organizing and fully distributed platform for iLBSs. The proposed self-organizing distributed platform provides a dynamic reconfiguration of locality accuracy and service coverage by expanding and contracting dynamically. In order to verify the suggested platform, scalability performance according to the number of inserted or deleted nodes composing the dynamic infrastructure was evaluated through a simulation similar to the real environment. PMID:26016908
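
    A toy sketch of the expand/contract behaviour described above is given below: a node responsible for an indoor region delegates to finer-grained child nodes as its resource load grows and absorbs them again as it shrinks. The class names, quadrant split and capacity rule are illustrative assumptions, not the paper's protocol.

```python
class SpaceNode:
    """One infrastructure node responsible for an indoor region; splitting and
    merging here is a simplified stand-in for dynamic space expanding/reducing."""
    def __init__(self, region, capacity=100):
        self.region, self.capacity = region, capacity
        self.resources, self.children = {}, []

    def register(self, res_id, info):
        self.resources[res_id] = info
        if len(self.resources) > self.capacity and not self.children:
            self.expand()

    def expand(self):
        """Delegate to finer-grained child nodes when the local load grows."""
        self.children = [SpaceNode(f"{self.region}/{q}", self.capacity)
                         for q in ("NW", "NE", "SW", "SE")]

    def contract(self):
        """Absorb children when the load shrinks, coarsening the locality."""
        for c in self.children:
            self.resources.update(c.resources)
        self.children = []

    def lookup(self, res_id):
        if res_id in self.resources:
            return self.resources[res_id]
        for c in self.children:
            hit = c.lookup(res_id)
            if hit is not None:
                return hit
        return None
```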

  19. Self-organizing distributed architecture supporting dynamic space expanding and reducing in indoor LBS environment.

    PubMed

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2015-05-26

    Indoor location-based services (iLBS) are extremely dynamic and changeable, and include numerous resources and mobile devices. In particular, the network infrastructure requires support for high scalability in the indoor environment, and various resource lookups are requested concurrently and frequently from several locations based on the dynamic network environment. A traditional map-based centralized approach for iLBSs has several disadvantages: it requires global knowledge to maintain a complete geographic indoor map; the central server is a single point of failure; it can also cause low scalability and traffic congestion; and it is hard to adapt to a change of service area in real time. This paper proposes a self-organizing and fully distributed platform for iLBSs. The proposed self-organizing distributed platform provides a dynamic reconfiguration of locality accuracy and service coverage by expanding and contracting dynamically. In order to verify the suggested platform, scalability performance according to the number of inserted or deleted nodes composing the dynamic infrastructure was evaluated through a simulation similar to the real environment.

  20. Self-organizing distributed architecture supporting dynamic space expanding and reducing in indoor LBS environment.

    PubMed

    Jeong, Seol Young; Jo, Hyeong Gon; Kang, Soon Ju

    2015-01-01

    Indoor location-based services (iLBS) are extremely dynamic and changeable, and include numerous resources and mobile devices. In particular, the network infrastructure requires support for high scalability in the indoor environment, and various resource lookups are requested concurrently and frequently from several locations based on the dynamic network environment. A traditional map-based centralized approach for iLBSs has several disadvantages: it requires global knowledge to maintain a complete geographic indoor map; the central server is a single point of failure; it can also cause low scalability and traffic congestion; and it is hard to adapt to a change of service area in real time. This paper proposes a self-organizing and fully distributed platform for iLBSs. The proposed self-organizing distributed platform provides a dynamic reconfiguration of locality accuracy and service coverage by expanding and contracting dynamically. In order to verify the suggested platform, scalability performance according to the number of inserted or deleted nodes composing the dynamic infrastructure was evaluated through a simulation similar to the real environment. PMID:26016908

  1. NASA's Earth Observing Data and Information System - Supporting Interoperability through a Scalable Architecture (Invited)

    NASA Astrophysics Data System (ADS)

    Mitchell, A. E.; Lowe, D. R.; Murphy, K. J.; Ramapriyan, H. K.

    2011-12-01

    Initiated in 1990, NASA's Earth Observing System Data and Information System (EOSDIS) is currently a petabyte-scale archive of data designed to receive, process, distribute and archive several terabytes of science data per day from NASA's Earth science missions. Comprised of 12 discipline specific data centers collocated with centers of science discipline expertise, EOSDIS manages over 6800 data products from many science disciplines and sources. NASA supports global climate change research by providing scalable open application layers to the EOSDIS distributed information framework. This allows many other value-added services to access NASA's vast Earth Science Collection and allows EOSDIS to interoperate with data archives from other domestic and international organizations. EOSDIS is committed to NASA's Data Policy of full and open sharing of Earth science data. As metadata is used in all aspects of NASA's Earth science data lifecycle, EOSDIS provides a spatial and temporal metadata registry and order broker called the EOS Clearing House (ECHO) that allows efficient search and access of cross domain data and services through the Reverb Client and Application Programmer Interfaces (APIs). Another core metadata component of EOSDIS is NASA's Global Change Master Directory (GCMD) which represents more than 25,000 Earth science data set and service descriptions from all over the world, covering subject areas within the Earth and environmental sciences. With inputs from the ECHO, GCMD and Soil Moisture Active Passive (SMAP) mission metadata models, EOSDIS is developing a NASA ISO 19115 Best Practices Convention. Adoption of an international metadata standard enables a far greater level of interoperability among national and international data products. NASA recently concluded a 'Metadata Harmony Study' of EOSDIS metadata capabilities/processes of ECHO and NASA's Global Change Master Directory (GCMD), to evaluate opportunities for improved data access and use, reduce

  2. NASA's Earth Observing Data and Information System - Supporting Interoperability through a Scalable Architecture (Invited)

    NASA Astrophysics Data System (ADS)

    Mitchell, A. E.; Lowe, D. R.; Murphy, K. J.; Ramapriyan, H. K.

    2013-12-01

    Initiated in 1990, NASA's Earth Observing System Data and Information System (EOSDIS) is currently a petabyte-scale archive of data designed to receive, process, distribute and archive several terabytes of science data per day from NASA's Earth science missions. Comprised of 12 discipline specific data centers collocated with centers of science discipline expertise, EOSDIS manages over 6800 data products from many science disciplines and sources. NASA supports global climate change research by providing scalable open application layers to the EOSDIS distributed information framework. This allows many other value-added services to access NASA's vast Earth Science Collection and allows EOSDIS to interoperate with data archives from other domestic and international organizations. EOSDIS is committed to NASA's Data Policy of full and open sharing of Earth science data. As metadata is used in all aspects of NASA's Earth science data lifecycle, EOSDIS provides a spatial and temporal metadata registry and order broker called the EOS Clearing House (ECHO) that allows efficient search and access of cross domain data and services through the Reverb Client and Application Programmer Interfaces (APIs). Another core metadata component of EOSDIS is NASA's Global Change Master Directory (GCMD) which represents more than 25,000 Earth science data set and service descriptions from all over the world, covering subject areas within the Earth and environmental sciences. With inputs from the ECHO, GCMD and Soil Moisture Active Passive (SMAP) mission metadata models, EOSDIS is developing a NASA ISO 19115 Best Practices Convention. Adoption of an international metadata standard enables a far greater level of interoperability among national and international data products. NASA recently concluded a 'Metadata Harmony Study' of EOSDIS metadata capabilities/processes of ECHO and NASA's Global Change Master Directory (GCMD), to evaluate opportunities for improved data access and use, reduce

  3. A Scalable Architecture for Incremental Specification and Maintenance of Procedural and Declarative Clinical Decision-Support Knowledge

    PubMed Central

    Hatsek, Avner; Shahar, Yuval; Taieb-Maimon, Meirav; Shalom, Erez; Klimov, Denis; Lunenfeld, Eitan

    2010-01-01

    Clinical guidelines have been shown to improve the quality of medical care and to reduce its costs. However, most guidelines exist in a free-text representation and, without automation, are not sufficiently accessible to clinicians at the point of care. A prerequisite for automated guideline application is a machine-comprehensible representation of the guidelines. In this study, we designed and implemented a scalable architecture to support medical experts and knowledge engineers in specifying and maintaining the procedural and declarative aspects of clinical guideline knowledge, resulting in a machine-comprehensible representation. The new framework significantly extends our previous work on the Digital electronic Guidelines Library (DeGeL). The current study designed and implemented a graphical framework for specification of declarative and procedural clinical knowledge, Gesher. We performed three different experiments to evaluate the functionality and usability of the major aspects of the new framework: Specification of procedural clinical knowledge, specification of declarative clinical knowledge, and exploration of a given clinical guideline. The subjects included clinicians and knowledge engineers (overall, 27 participants). The evaluations indicated high levels of completeness and correctness of the guideline specification process by both the clinicians and the knowledge engineers, although the best results, in the case of declarative-knowledge specification, were achieved by teams including a clinician and a knowledge engineer. The usability scores were high as well, although the clinicians’ assessment was significantly lower than the assessment of the knowledge engineers. PMID:21611137

  4. Long-Term Patterns in the Population Dynamics of Daphnia longispina, Leptodora kindtii and Cyanobacteria in a Shallow Reservoir: A Self-Organising Map (SOM) Approach.

    PubMed

    Wojtal-Frankiewicz, Adrianna; Kruk, Andrzej; Frankiewicz, Piotr; Oleksińska, Zuzanna; Izydorczyk, Katarzyna

    2015-01-01

    The recognition of long-term patterns in the seasonal dynamics of Daphnia longispina, Leptodora kindtii and cyanobacteria is dependent upon their interactions, the water temperature and the hydrological conditions, which were all investigated between 1999 and 2008 in the lowland Sulejow Reservoir. The biomass of cyanobacteria, densities of D. longispina and L. kindtii, concentration of chlorophyll a and water temperature were assessed weekly from April to October at three sampling stations along the longitudinal reservoir axis. The retention time was calculated using data on the actual water inflow and reservoir volume. A self-organising map (SOM) was used due to high interannual variability in the studied parameters and their often non-linear relationships. Classification of the SOM output neurons into three clusters that grouped the sampling terms with similar biotic states allowed identification of the crucial abiotic factors responsible for the seasonal sequence of events: cluster CL-ExSp (extreme/spring) corresponded to hydrologically unstable cold periods (mostly spring) with extreme values and highly variable abiotic factors, which made abiotic control of the biota dominant; cluster CL-StSm (stable/summer) was associated with ordinary late spring and summer and was characterised by stable non-extreme abiotic conditions, which made biotic interactions more important; and the cluster CL-ExSm (extreme/summer), was associated with late spring/summer and characterised by thermal or hydrological extremes, which weakened the role of biotic factors. The significance of the differences between the SOM sub-clusters was verified by Kruskal-Wallis and post-hoc Dunn tests. The importance of the temperature and hydrological regimes as the key plankton-regulating factors in the dam reservoir, as shown by the SOM, was confirmed by the results of canonical correlation analyses (CCA) of each cluster. The demonstrated significance of hydrology in seasonal plankton dynamics
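
    The analysis pipeline described above (map weekly samples onto SOM neurons, group the neurons into three clusters, then test variables across clusters) can be sketched roughly as below. The prototype selection is a crude stand-in for an actually trained SOM, the data are synthetic, and only the Kruskal-Wallis step corresponds directly to a test named in the abstract.

```python
import numpy as np
from scipy.stats import kruskal
from scipy.cluster.hierarchy import linkage, fcluster

# weekly samples; columns might be cyanobacteria biomass, Daphnia density,
# Leptodora density, chlorophyll a, water temperature, retention time
rng = np.random.default_rng(0)
samples = rng.normal(size=(600, 6))

# stand-in for a trained SOM: prototype vectors plus each sample's BMU
prototypes = samples[rng.choice(len(samples), 64, replace=False)]
bmu = np.argmin(((samples[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1), axis=1)

# group the SOM output neurons into three clusters (cf. CL-ExSp, CL-StSm, CL-ExSm)
neuron_cluster = fcluster(linkage(prototypes, method="ward"), t=3, criterion="maxclust")
sample_cluster = neuron_cluster[bmu]

# Kruskal-Wallis test of an abiotic variable (e.g. temperature) across clusters
groups = [samples[sample_cluster == c, 4] for c in (1, 2, 3)]
print(kruskal(*groups))
```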

  5. Long-Term Patterns in the Population Dynamics of Daphnia longispina, Leptodora kindtii and Cyanobacteria in a Shallow Reservoir: A Self-Organising Map (SOM) Approach

    PubMed Central

    Wojtal-Frankiewicz, Adrianna; Kruk, Andrzej; Frankiewicz, Piotr; Oleksińska, Zuzanna; Izydorczyk, Katarzyna

    2015-01-01

    The recognition of long-term patterns in the seasonal dynamics of Daphnia longispina, Leptodora kindtii and cyanobacteria is dependent upon their interactions, the water temperature and the hydrological conditions, which were all investigated between 1999 and 2008 in the lowland Sulejow Reservoir. The biomass of cyanobacteria, densities of D. longispina and L. kindtii, concentration of chlorophyll a and water temperature were assessed weekly from April to October at three sampling stations along the longitudinal reservoir axis. The retention time was calculated using data on the actual water inflow and reservoir volume. A self-organising map (SOM) was used due to high interannual variability in the studied parameters and their often non-linear relationships. Classification of the SOM output neurons into three clusters that grouped the sampling terms with similar biotic states allowed identification of the crucial abiotic factors responsible for the seasonal sequence of events: cluster CL-ExSp (extreme/spring) corresponded to hydrologically unstable cold periods (mostly spring) with extreme values and highly variable abiotic factors, which made abiotic control of the biota dominant; cluster CL-StSm (stable/summer) was associated with ordinary late spring and summer and was characterised by stable non-extreme abiotic conditions, which made biotic interactions more important; and the cluster CL-ExSm (extreme/summer), was associated with late spring/summer and characterised by thermal or hydrological extremes, which weakened the role of biotic factors. The significance of the differences between the SOM sub-clusters was verified by Kruskal-Wallis and post-hoc Dunn tests. The importance of the temperature and hydrological regimes as the key plankton-regulating factors in the dam reservoir, as shown by the SOM, was confirmed by the results of canonical correlation analyses (CCA) of each cluster. The demonstrated significance of hydrology in seasonal plankton dynamics

  6. Long-Term Patterns in the Population Dynamics of Daphnia longispina, Leptodora kindtii and Cyanobacteria in a Shallow Reservoir: A Self-Organising Map (SOM) Approach.

    PubMed

    Wojtal-Frankiewicz, Adrianna; Kruk, Andrzej; Frankiewicz, Piotr; Oleksińska, Zuzanna; Izydorczyk, Katarzyna

    2015-01-01

    The recognition of long-term patterns in the seasonal dynamics of Daphnia longispina, Leptodora kindtii and cyanobacteria is dependent upon their interactions, the water temperature and the hydrological conditions, which were all investigated between 1999 and 2008 in the lowland Sulejow Reservoir. The biomass of cyanobacteria, densities of D. longispina and L. kindtii, concentration of chlorophyll a and water temperature were assessed weekly from April to October at three sampling stations along the longitudinal reservoir axis. The retention time was calculated using data on the actual water inflow and reservoir volume. A self-organising map (SOM) was used due to high interannual variability in the studied parameters and their often non-linear relationships. Classification of the SOM output neurons into three clusters that grouped the sampling terms with similar biotic states allowed identification of the crucial abiotic factors responsible for the seasonal sequence of events: cluster CL-ExSp (extreme/spring) corresponded to hydrologically unstable cold periods (mostly spring) with extreme values and highly variable abiotic factors, which made abiotic control of the biota dominant; cluster CL-StSm (stable/summer) was associated with ordinary late spring and summer and was characterised by stable non-extreme abiotic conditions, which made biotic interactions more important; and the cluster CL-ExSm (extreme/summer), was associated with late spring/summer and characterised by thermal or hydrological extremes, which weakened the role of biotic factors. The significance of the differences between the SOM sub-clusters was verified by Kruskal-Wallis and post-hoc Dunn tests. The importance of the temperature and hydrological regimes as the key plankton-regulating factors in the dam reservoir, as shown by the SOM, was confirmed by the results of canonical correlation analyses (CCA) of each cluster. The demonstrated significance of hydrology in seasonal plankton dynamics

  7. A fraud management system architecture for next-generation networks.

    PubMed

    Bihina Bella, M A; Eloff, J H P; Olivier, M S

    2009-03-10

    This paper proposes an original architecture for a fraud management system (FMS) for convergent next-generation networks (NGNs), which are based on the Internet Protocol (IP). The architecture has the potential to satisfy the requirements of flexibility and application independence for effective fraud detection in NGNs that cannot be met by traditional FMSs. The proposed architecture has a thorough four-stage detection process that analyses billing records in IP detail record (IPDR) format - an emerging IP-based billing standard - for signs of fraud. Its key feature is its usage of neural networks in the form of self-organising maps (SOMs) to help uncover unknown NGN fraud scenarios. A prototype was implemented to test the effectiveness of using a SOM for fraud detection and is also described in the paper.
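
    One way the SOM component of such an FMS can flag candidate fraud records is by thresholding each record's quantization error (its distance to the best-matching prototype). The sketch below illustrates that idea on synthetic IPDR-like features; the features, prototype selection and threshold are illustrative assumptions, not the paper's design.

```python
import numpy as np

def quantization_errors(records, prototypes):
    """Distance of each billing record to its best-matching SOM prototype."""
    d2 = ((records[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2.min(axis=1))

# hypothetical IPDR-derived features: duration, bytes up, bytes down, hour of day
rng = np.random.default_rng(0)
normal_traffic = rng.normal(size=(5000, 4))
# a real FMS would train the SOM on historical records; prototypes are a stand-in
prototypes = normal_traffic[rng.choice(len(normal_traffic), 100, replace=False)]

threshold = np.percentile(quantization_errors(normal_traffic, prototypes), 99)

new_records = np.vstack([rng.normal(size=(50, 4)),
                         rng.normal(loc=6.0, size=(3, 4))])   # 3 injected outliers
suspicious = np.where(quantization_errors(new_records, prototypes) > threshold)[0]
print("records flagged for fraud analysis:", suspicious)
```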

  8. A Sustainable, Reliable Mission-Systems Architecture that Supports a System of Systems Approach to Space Exploration

    NASA Technical Reports Server (NTRS)

    Watson, Steve; Orr, Jim; O'Neil, Graham

    2004-01-01

    A mission-systems architecture based on a highly modular "systems of systems" infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is absolutely essential for an affordable and sustainable space exploration program. This architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimum sustaining engineering. This paper proposes such an architecture. Lessons learned from the space shuttle program are applied to help define and refine the model.

  9. An open and flexible interface proposal and proof of concept implementation to support service orientated architectures and interoperability in the tactical environment

    NASA Astrophysics Data System (ADS)

    Peach, Nicholas

    2012-06-01

    The development of SOAs in the tactical domain has been hindered by a lack of interface standards suitable for the environment of unpredictable and low bandwidth communications, low powered computers and dynamic ad-hoc grouping of tactical participants. Existing commercial SOA standards have assumed reliable access to central servers and services and relatively static participants. The proposal describes an open and published message-oriented interface created to support the aims of the upcoming MOD Generic Base Architecture (GBA) Defence Standard and the associated Land Open Systems Architecture. The aims are: a) to support multiple open transport protocols, such as HTTP, SMTP, DDS and MQTT; b) to be suitable for integrating together low-level utility functions with their controlling systems (such as water, waste and power) and integrating together high-level mission-support functions (such as ISTAR and C2); c) reduce operator burden by using automated discovery and configuration where possible; d) dynamically integrate with MOD Generic Vehicle Architecture platforms to link base and vehicle mission and logistics systems over tactical radio links; e) extensible to support features such as security classification; f) to be lightweight in implementation and bandwidth and not dependent on central servers for operation. The paper will present the proposed interface and describe the features required for a military tactical rather than a commercial environment, and will report the outcome of a MOD-funded proof of concept that uses the proposed interface to interoperate several military systems.
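
    As a purely illustrative sketch (field names and endpoint are assumptions, not taken from the GBA or LOSA documents), a transport-neutral message could be wrapped in a small envelope and then bound to any of the listed transports; the example below binds it to HTTP using only the Python standard library.

```python
import datetime
import json
import urllib.request
import uuid

def make_envelope(topic, body, classification="OFFICIAL"):
    """A hypothetical transport-neutral message envelope; field names are
    illustrative, not taken from the GBA/LOSA specifications."""
    return {
        "id": str(uuid.uuid4()),
        "topic": topic,
        "issued": datetime.datetime.utcnow().isoformat() + "Z",
        "classification": classification,
        "body": body,
    }

def send_http(envelope, url):
    """One possible binding: POST the JSON envelope over HTTP; the same payload
    could equally be published over SMTP, DDS or MQTT."""
    req = urllib.request.Request(url, data=json.dumps(envelope).encode("utf-8"),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# usage (endpoint hypothetical):
# send_http(make_envelope("power/generator-1/status", {"load_kw": 12.5}),
#           "http://base-gateway.local/messages")
```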

  10. IAIMS Architecture

    PubMed Central

    Hripcsak, George

    1997-01-01

    Abstract An information system architecture defines the components of a system and the interfaces among the components. A good architecture is essential for creating an Integrated Advanced Information Management System (IAIMS) that works as an integrated whole yet is flexible enough to accommodate many users and roles, multiple applications, changing vendors, evolving user needs, and advancing technology. Modularity and layering promote flexibility by reducing the complexity of a system and by restricting the ways in which components may interact. Enterprise-wide mediation promotes integration by providing message routing, support for standards, dictionary-based code translation, a centralized conceptual data schema, business rule implementation, and consistent access to databases. Several IAIMS sites have adopted a client-server architecture, and some have adopted a three-tiered approach, separating user interface functions, application logic, and repositories. PMID:9067884

  11. An SMS-based System Architecture (Logical Model) to Support Management of Information Exchange in Emergency Situations. poLINT-112-SMS PROJECT

    NASA Astrophysics Data System (ADS)

    Vetulani, Zygmunt; Marciniak, Jacek; Konieczka, Pawel; Walkowska, Justyna

    In the paper we present the architecture of the POLINT-112-SMS system to support information management in emergency situations. The system interprets text input in the form of SMS messages and understands and interprets information provided by the human user. It is supposed to assist a human in making decisions. The main modules of the system presented here are the following: the SMS gate, the NLP Module (processing Polish), the Situation Analysis Module (SAM) and the Dialogue Maintenance Module (DMM).

  12. High-level specification of a proposed information architecture for support of a bioterrorism early-warning system.

    PubMed

    Berkowitz, Murray R

    2013-01-01

    Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.

  13. High-level specification of a proposed information architecture for support of a bioterrorism early-warning system.

    PubMed

    Berkowitz, Murray R

    2013-01-01

    Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified. PMID:23263311

  14. T and D-Bench--Innovative Combined Support for Education and Research in Computer Architecture and Embedded Systems

    ERIC Educational Resources Information Center

    Soares, S. N.; Wagner, F. R.

    2011-01-01

    Teaching and Design Workbench (T&D-Bench) is a framework aimed at education and research in the areas of computer architecture and embedded systems. It includes a set of features not found in other educational environments. This set of features is the result of an original combination of design requirements for T&D-Bench: that the framework should…

  15. A Tool for Managing Software Architecture Knowledge

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    This paper describes a tool for managing architectural knowledge and rationale. The tool has been developed to support a framework for capturing and using architectural knowledge to improve the architecture process. This paper describes the main architectural components and features of the tool. The paper also provides examples of using the tool for supporting well-known architecture design and analysis methods.

  16. Implementation of a metadata architecture and knowledge collection to support semantic interoperability in an enterprise data warehouse.

    PubMed

    Dhaval, Rakesh; Borlawsky, Tara; Ostrander, Michael; Santangelo, Jennifer; Kamal, Jyoti; Payne, Philip R O

    2008-11-06

    In order to enhance interoperability between enterprise systems, and improve data validity and reliability throughout The Ohio State University Medical Center (OSUMC), we have initiated the development of an ontology-anchored metadata architecture and knowledge collection for our enterprise data warehouse. The metadata and corresponding semantic relationships stored in the OSUMC knowledge collection are intended to promote consistency and interoperability across the heterogeneous clinical, research, business and education information managed within the data warehouse.

  17. A Multiprocessor SoC Architecture with Efficient Communication Infrastructure and Advanced Compiler Support for Easy Application Development

    NASA Astrophysics Data System (ADS)

    Urfianto, Mohammad Zalfany; Isshiki, Tsuyoshi; Khan, Arif Ullah; Li, Dongju; Kunieda, Hiroaki

    This paper presents a Multiprocessor System-on-Chip (MPSoC) architecture used as an execution platform for the new C-language-based MPSoC design framework we are currently developing. The MPSoC architecture is based on an existing SoC platform with a commercial RISC core acting as the host CPU. We extend the existing SoC with a multiprocessor-array block that is used as the main engine to run parallel applications modeled in our design framework. Utilizing several optimizations provided by our compiler, efficient communication between processing elements with minimal overhead is implemented. A host interface is designed to integrate the existing RISC core with the multiprocessor array. The experimental results show that an effective integration is achieved, proving that the designed communication module can be used to efficiently incorporate off-the-shelf processors as processing elements for MPSoC architectures designed using our framework.

  18. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    PubMed Central

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee

    2015-01-01

    Objectives To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functional services, and mobile services. Microsoft's Azure cloud computing for Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) was used. Results The functional and software views of an HSP were designed in a layered architecture. External systems can be interfaced with the HSP using SOAP and REST/JSON. The multi-tenancy model of the HSP was designed as a shared database, with a separate schema for each tenant through a single application, although healthcare data can be physically located on a cloud or in a hospital, depending on regulations. The CDS services were categorized into rule-based services for medications, alert registration services, and knowledge services. Conclusions We expect that cloud-based HSPs will allow small and mid-sized hospitals, in addition to large-sized hospitals, to adopt information infrastructures and health information technology with low system operation and maintenance costs. PMID:25995962
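
    The shared-database, separate-schema-per-tenant model mentioned above can be illustrated with a small routing helper: each request is pointed at the tenant's schema before its queries run. The sketch below assumes a PostgreSQL-style database and hypothetical table and tenant names; it is not the HSP implementation.

```python
import re

def schema_for_tenant(tenant_id):
    """Map a tenant identifier to its dedicated schema name (shared database,
    separate schema per tenant)."""
    if not re.fullmatch(r"[a-z0-9_]+", tenant_id):
        raise ValueError("unsafe tenant identifier")
    return f"tenant_{tenant_id}"

def run_for_tenant(conn, tenant_id, sql, params=()):
    """Point the session at the tenant's schema, then run the query.
    `conn` is any DB-API connection to a PostgreSQL-style database."""
    with conn.cursor() as cur:
        cur.execute(f"SET search_path TO {schema_for_tenant(tenant_id)}")
        cur.execute(sql, params)
        return cur.fetchall()

# usage (connection details, table and tenant names hypothetical):
# rows = run_for_tenant(conn, "hospital_a",
#                       "SELECT * FROM cds_alerts WHERE patient_id = %s", ("p123",))
```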

  19. Nitric oxide is required for determining root architecture and lignin composition in sunflower. Supporting evidence from microarray analyses.

    PubMed

    Corti Monzón, Georgina; Pinedo, Marcela; Di Rienzo, Julio; Novo-Uzal, Esther; Pomar, Federico; Lamattina, Lorenzo; de la Canal, Laura

    2014-05-30

    Nitric oxide (NO) is a signal molecule involved in several physiological processes in plants, including root development. Despite the importance of NO as a root growth regulator, knowledge of the genes and metabolic pathways modulated by NO in this process is still limited. A constraint on unravelling these pathways has been the use of exogenous applications of NO donors that may produce toxic effects. We have analyzed the role of NO in root architecture through the depletion of endogenous NO using the scavenger cPTIO. Sunflower seedlings growing in liquid medium supplemented with cPTIO showed unaltered primary root length while the number of lateral roots was strongly reduced, indicating that endogenous NO participates in determining root branching in sunflower. The transcriptional changes induced by NO depletion have been analyzed using a large-scale approach. A microarray analysis showed 330 genes regulated in the roots (p≤0.001) upon endogenous NO depletion. A general cPTIO-induced up-regulation of genes involved in the lignin biosynthetic pathway was observed. Although no changes in total lignin content were detected, cell wall analyses revealed that the G/S lignin ratio increased in roots treated with cPTIO. This means that endogenous NO may control lignin composition in planta. Our results suggest that fine-tuned regulation of NO levels could be used by plants to regulate root architecture and lignin composition. The functional implications of these findings are discussed.

  20. Evolving earth-based and in-situ satellite network architectures for Mars communications and navigation support

    NASA Technical Reports Server (NTRS)

    Hastrup, Rolf; Weinberg, Aaron; Mcomber, Robert

    1991-01-01

    Results of on-going studies to develop navigation/telecommunications network concepts to support future robotic and human missions to Mars are presented. The performance and connectivity improvements provided by the relay network will permit use of simpler, lower performance, and less costly telecom subsystems for the in-situ mission exploration elements. Orbiting relay satellites can serve as effective navigation aids by supporting earth-based tracking as well as providing Mars-centered radiometric data for mission elements approaching, in orbit, or on the surface of Mars. The relay satellite orbits may be selected to optimize navigation aid support and communication coverage for specific mission sets.

  1. FTS2000 network architecture

    NASA Technical Reports Server (NTRS)

    Klenart, John

    1991-01-01

    The network architecture of FTS2000 is graphically depicted. A map of network A topology is provided, with interservice nodes. Next, the four basic elements of the architecture are laid out. Then, the FTS2000 timeline is reproduced. A list of equipment supporting FTS2000 dedicated transmissions is given. Finally, access alternatives are shown.

  2. Robotic Intelligence Kernel: Architecture

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel Architecture (RIK-A) is a multi-level architecture that supports a dynamic autonomy structure. The RIK-A is used to coalesce hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a framework that can be used to create behaviors for humans to interact with the robot.

  3. Extending multi-tenant architectures: a database model for a multi-target support in SaaS applications

    NASA Astrophysics Data System (ADS)

    Rico, Antonio; Noguera, Manuel; Garrido, José Luis; Benghazi, Kawtar; Barjis, Joseph

    2016-05-01

    Multi-tenant architectures (MTAs) are considered a cornerstone in the success of Software as a Service as a new application distribution formula. Multi-tenancy allows multiple customers (i.e. tenants) to be consolidated into the same operational system. This way, tenants run and share the same application instance as well as costs, which are significantly reduced. Functional needs vary from one tenant to another; either companies from different sectors run different types of applications or, although deploying the same functionality, they do differ in the extent of their complexity. In any case, MTA leaves one major concern regarding the companies' data, their privacy and security, which requires special attention to the data layer. In this article, we propose an extended data model that enhances traditional MTAs in respect of this concern. This extension - called multi-target - allows MT applications to host, manage and serve multiple functionalities within the same multi-tenant (MT) environment. The practical deployment of this approach will allow SaaS vendors to target multiple markets or address different levels of functional complexity and yet commercialise just one single MT application. The applicability of the approach is demonstrated via a case study of a real multi-tenancy multi-target (MT2) implementation, called Globalgest.
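
    As a hedged illustration of the multi-target idea, the sketch below keys every row by both a tenant and a functional target so that one multi-tenant application can serve several functionalities from one shared database. Table and column names are invented for the example and are not taken from the Globalgest implementation.

      # Toy multi-tenant, multi-target (MT2) table: rows are scoped by both
      # the owning tenant and the functional target they belong to.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""
          CREATE TABLE mt2_records (
              id        INTEGER PRIMARY KEY,
              tenant_id TEXT NOT NULL,   -- which customer (tenant) owns the row
              target_id TEXT NOT NULL,   -- which functional target / market variant
              entity    TEXT NOT NULL,   -- logical entity name
              payload   TEXT NOT NULL    -- serialised entity data
          )""")

      rows = [("acme", "retail", "invoice", "{...}"),
              ("globex", "logistics", "shipment", "{...}")]
      conn.executemany(
          "INSERT INTO mt2_records (tenant_id, target_id, entity, payload) VALUES (?, ?, ?, ?)", rows)

      # Every query is scoped by both keys, so tenants and targets stay isolated.
      print(conn.execute(
          "SELECT entity, payload FROM mt2_records WHERE tenant_id = ? AND target_id = ?",
          ("acme", "retail")).fetchall())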

  4. MIC-SVM: Designing A Highly Efficient Support Vector Machine For Advanced Modern Multi-Core and Many-Core Architectures

    SciTech Connect

    You, Yang; Song, Shuaiwen; Fu, Haohuan; Marquez, Andres; Mehri Dehanavi, Maryam; Barker, Kevin J.; Cameron, Kirk; Randles, Amanda; Yang, Guangwen

    2014-08-16

    Support Vector Machine (SVM) has been widely used in data-mining and Big Data applications as modern commercial databases attach increasing importance to analytic capabilities. In recent years, SVM was adapted to the field of High Performance Computing for power/performance prediction, auto-tuning, and runtime scheduling. However, even at the risk of losing prediction accuracy due to insufficient runtime information, researchers can only afford to apply offline model training to avoid significant runtime training overhead. To address these challenges, we designed and implemented MIC-SVM, a highly efficient parallel SVM for x86-based multi-core and many-core architectures, such as the Intel Ivy Bridge CPUs and the Intel Xeon Phi coprocessor (MIC).
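
    The computational core that such multi-core and many-core SVM implementations parallelise is the kernel matrix evaluation. The NumPy sketch below is not the MIC-SVM code; it only illustrates how an RBF kernel matrix can be computed in a vectorised, data-parallel form that maps naturally onto wide SIMD units and many threads.

      # Vectorised RBF kernel matrix: the data-parallel core of SVM training
      # (illustrative only; not the MIC-SVM implementation).
      import numpy as np

      def rbf_kernel_matrix(X, gamma=0.5):
          # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 x_i . x_j for all pairs at once
          sq = np.sum(X * X, axis=1)
          d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
          return np.exp(-gamma * np.maximum(d2, 0.0))

      rng = np.random.default_rng(0)
      X = rng.standard_normal((1000, 64))
      K = rbf_kernel_matrix(X)
      print(K.shape)   # (1000, 1000)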

  5. An Approach for Hydrogen Recycling in a Closed-loop Life Support Architecture to Increase Oxygen Recovery Beyond State-of-the-Art

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Miller, Lee; Greenwood, Zachary; Alvarez, Giraldo

    2014-01-01

    State-of-the-art atmosphere revitalization life support technology on the International Space Station is theoretically capable of recovering 50% of the oxygen from metabolic carbon dioxide via the Carbon Dioxide Reduction Assembly (CRA). When coupled with a Plasma Pyrolysis Assembly (PPA), oxygen recovery increases dramatically, thus drastically reducing the logistical challenges associated with oxygen resupply. The PPA decomposes methane to predominantly form hydrogen and acetylene. Because of the unstable nature of acetylene, a down-stream separation system is required to remove acetylene from the hydrogen stream before it is recycled to the CRA. A new closed-loop architecture that includes a PPA and downstream Hydrogen Purification Assembly (HyPA) is proposed and discussed. Additionally, initial results of separation material testing are reported.
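
    The hydrogen bookkeeping behind this architecture can be made explicit with the standard reactions involved; the fractions below are an idealised illustration of the reasoning, not figures taken from the paper.

      % Idealised hydrogen balance (standard reactions; illustrative only).
      \begin{align*}
        \text{CRA (Sabatier):}   \quad & \mathrm{CO_2 + 4\,H_2 \rightarrow CH_4 + 2\,H_2O} \\
        \text{Electrolysis:}     \quad & \mathrm{2\,H_2O \rightarrow 2\,H_2 + O_2} \\
        \text{PPA (pyrolysis):}  \quad & \mathrm{2\,CH_4 \rightarrow C_2H_2 + 3\,H_2}
      \end{align*}
      % Without the PPA, only 2 of the 4 H2 consumed per CO2 return via electrolysis
      % (hydrogen-limited, roughly the 50% oxygen recovery quoted above); the PPA
      % returns about a further 1.5 H2 per CO2, so roughly 3.5/4 of the hydrogen loop closes.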

  6. The story of DB4GeO - A service-based geo-database architecture to support multi-dimensional data analysis and visualization

    NASA Astrophysics Data System (ADS)

    Breunig, Martin; Kuper, Paul V.; Butwilowski, Edgar; Thomsen, Andreas; Jahn, Markus; Dittrich, André; Al-Doori, Mulhim; Golovko, Darya; Menninghaus, Mathias

    2016-07-01

    Multi-dimensional data analysis and visualization need efficient data handling to archive original data, to reproduce results on large data sets, and to retrieve space and time partitions just in time. This article tells the story of more than twenty years of research resulting in the development of DB4GeO, a web service-based geo-database architecture for geo-objects to support the data handling of 3D/4D geo-applications. Starting from the roots and lessons learned, the concepts and implementation of DB4GeO are described in detail. Furthermore, experiences with and extensions to DB4GeO are presented. Finally, conclusions and an outlook on further research, also considering 3D/4D geo-applications for DB4GeO in the context of Dubai 2020, are given.

  7. Development of a real-time clinical decision support system upon the web mvc-based architecture for prostate cancer treatment

    PubMed Central

    2011-01-01

    Background A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve their quality of clinical care. We propose a RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. Methods We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. Results The resulting system can improve quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Conclusions Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily
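
    As a schematic illustration of the Model-View-Controller separation the framework is built on, the sketch below separates data access, decision logic, and presentation. The class names and the risk rule are hypothetical stand-ins, not components of the actual RTCDSS.

      # Toy MVC split: model fetches records, controller applies decision
      # factors, view renders a clinician-oriented summary (hypothetical names).
      class PatientRecordModel:
          _records = {"p001": {"psa": 7.8, "gleason": 7}}
          def pretreatment_parameters(self, patient_id):
              return self._records[patient_id]

      class TreatmentController:
          def __init__(self, model):
              self.model = model
          def assess(self, patient_id):
              p = self.model.pretreatment_parameters(patient_id)
              risk = "elevated" if p["psa"] >= 10 or p["gleason"] >= 8 else "standard"  # illustrative rule only
              return {"patient": patient_id, **p, "risk": risk}

      def render_view(a):
          return f"Patient {a['patient']}: PSA={a['psa']}, Gleason={a['gleason']} -> risk group: {a['risk']}"

      print(render_view(TreatmentController(PatientRecordModel()).assess("p001")))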

  8. Preparation of a self-supporting cell architecture mimic by water channel confined photocrosslinking within a lamellar structured hydrogel.

    SciTech Connect

    Grubjesic, S.; Lee, B.; Seifert, S.; Firestone, M. A.

    2011-01-01

    A self-supporting biomimetic chemical hydrogel that can be reversibly swollen in water is described. An aqueous dispersion of a diacrylate end-derivatized PEO-PPO-PEO macromer, a saturated phospholipid, and a zwitterionic co-surfactant self-assembles into a multilamellar-structured physical gel at room temperature, as determined by SAXS. The addition of a water-soluble PEGDA co-monomer and photoinitiator within the water layers does not alter the self-assembled structure. ATR/FT-IR spectroscopy reveals that photoirradiation initiates crosslinking between the acrylate end groups on the macromer and the PEGDA, forming a polymeric network within the aqueous domains. The primitive cytoskeleton mimic serves to stabilize the amphiphile bilayer, converting the physical gel into an elastic self-supporting chemical gel. Storage under ambient conditions causes dehydration of the hydrogel to 5 wt % water, which can be reversed by swelling in water. The fully water-swollen gel (85 wt % water) remains self-supporting but converts to a non-lamellar structure. As water is lost, the chemical gel regains its lamellar structure. Incubation of the hydrogel in nonpolar organic solvents that do not dissolve the uncrosslinked lipid component (hexane) allows swelling without loss of structural integrity. Chloroform, which readily solubilizes the lipid, causes irreversible loss of the lamellar structure.

  9. Three-Dimensional Nitrogen-Doped Reduced Graphene Oxide-Carbon Nanotubes Architecture Supporting Ultrafine Palladium Nanoparticles for Highly Efficient Methanol Electrooxidation.

    PubMed

    Song, Hejie; Yang, Liming; Tang, Yanhong; Yan, Dafeng; Liu, Chengbin; Luo, Shenglian

    2015-11-01

    A three-dimensional (3D) nitrogen-doped reduced graphene oxide (rGO)-carbon nanotubes (CNTs) architecture supporting ultrafine Pd nanoparticles is prepared and used as a highly efficient electrocatalyst. Graphene oxide (GO) is first used as a surfactant to disperse pristine CNTs for electrochemical preparation of 3D rGO@CNTs, and subsequently one-step electrodeposition of the stable colloidal GO-CNTs solution containing Na2PdCl4 affords rGO@CNTs-supported Pd nanoparticles. Further thermal treatment of the Pd/rGO@CNTs hybrid with ammonia achieves not only in situ nitrogen-doping of the rGO@CNTs support but also an extraordinary size decrease of the Pd nanoparticles to below 2.0 nm. The resulting catalyst is characterized by scanning and transmission electron microscopy, X-ray diffraction, Raman spectroscopy, and X-ray photoelectron spectroscopy. Catalyst performance for the methanol oxidation reaction is tested through cyclic voltammetry and chronoamperometry techniques, which shows exceedingly high mass activity and superior durability.

  10. The Development of a Remote Sensor System and Decision Support Systems Architecture to Monitor Resistance Development in Transgenic Crops

    NASA Technical Reports Server (NTRS)

    Cacas, Joseph; Glaser, John; Copenhaver, Kenneth; May, George; Stephens, Karen

    2008-01-01

    The United States Environmental Protection Agency (EPA) has declared that "significant benefits accrue to growers, the public, and the environment" from the use of transgenic pesticidal crops due to reductions in pesticide usage for crop pest management. Large increases in the global use of transgenic pesticidal crops have reduced the amounts of broad-spectrum pesticides used to manage pest populations, improved yields, and reduced the environmental impact of crop management. A significant threat to the continued use of this technology is the evolution of resistance in insect pest populations to the insecticidal Bt toxins expressed by the plants. Management of transgenic pesticidal crops with an emphasis on conservation of Bt toxicity in field populations of insect pests is important to the future of sustainable agriculture. A vital component of this management is establishing the proof-of-concept understanding, situational awareness, and monitoring and decision support system tools needed to track the development of insect resistance across more than 133,650 square kilometers (33 million acres) of bio-engineered corn and cotton. Early and recent joint NASA, US EPA, and ITD remote imagery flights and ground-based field experiments have provided very promising research results that will potentially address future requirements for crop management capabilities.

  11. An integrated Rotorcraft Avionics/Controls Architecture to support advanced controls and low-altitude guidance flight research

    NASA Technical Reports Server (NTRS)

    Jacobsen, Robert A.; Doane, Douglas H.; Eshow, Michelle M.; Aiken, Edwin W.; Hindson, William S.

    1992-01-01

    Salient design features of a new NASA/Army research rotorcraft--the Rotorcraft-Aircrew Systems Concepts Airborne Laboratory (RASCAL)--are described. Using a UH-60A Black Hawk helicopter as a baseline vehicle, the RASCAL will be a flying laboratory capable of supporting the research requirements of major NASA and Army guidance, control, and display research programs. The paper describes the research facility requirements of these programs together with other critical constraints on the design of the research system. Research program schedules demand a phased development approach, wherein specific research capability milestones are met and flight research projects are flown throughout the complete development cycle of the RASCAL. This development approach is summarized, and selected features of the research system are described. The research system includes a real-time obstacle detection and avoidance system which will generate low-altitude guidance commands to the pilot on a wide field-of-view, color helmet-mounted display and a full-authority, programmable, fault-tolerant/fail-safe, fly-by-wire flight control system.

  12. Architecture & Environment

    ERIC Educational Resources Information Center

    Erickson, Mary; Delahunt, Michael

    2010-01-01

    Most art teachers would agree that architecture is an important form of visual art, but they do not always include it in their curriculums. In this article, the authors share core ideas from "Architecture and Environment," a teaching resource that they developed out of a long-term interest in teaching architecture and their fascination with the…

  13. Project Integration Architecture: Application Architecture

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2005-01-01

    The Project Integration Architecture (PIA) implements a flexible, object-oriented, wrapping architecture which encapsulates all of the information associated with engineering applications. The architecture allows the progress of a project to be tracked and documented in its entirety. Additionally, by bringing all of the information sources and sinks of a project into a single architectural space, the ability to transport information between those applications is enabled.

  14. Green Architecture

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Ho

    Today, the environment has become a central subject in many scientific disciplines and in industrial development due to global warming. This paper presents an analysis of trends in Green Architecture in France along three axes: Regulations and Approaches for Sustainable Architecture (Certificates and Standards), Renewable Materials (Green Materials), and Strategies (Equipment) of Sustainable Technology. The definition of 'Green Architecture' is given in the introduction, and the question of interdisciplinarity in the technological development of 'Green Architecture' is raised in the conclusion.

  15. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on Linux clusters. aCe uses parallel programming constructs that facilitate the writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  16. Data management system advanced architectures

    NASA Technical Reports Server (NTRS)

    Chevers, ED

    1991-01-01

    The topics relating to the Space Station Freedom (SSF) are presented in view graph form and include: (1) the data management system (DMS) concept; (2) DMS evolution rationale; (3) the DMS advanced architecture task; (4) DMS group support for Ames payloads; (5) DMS testbed development; (6) the DMS architecture task status; (7) real-time multiprocessor testbed; (8) networked processor performance; and (9) the DMS advanced architecture task 1992 goals.

  17. Project Integration Architecture: Architectural Overview

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2001-01-01

    The Project Integration Architecture (PIA) implements a flexible, object-oriented, wrapping architecture which encapsulates all of the information associated with engineering applications. The architecture allows the progress of a project to be tracked and documented in its entirety. By being a single, self-revealing architecture, the ability to develop single tools, for example a single graphical user interface, to span all applications is enabled. Additionally, by bringing all of the information sources and sinks of a project into a single architectural space, the ability to transport information between those applications becomes possible. Object encapsulation further allows information to become, in a sense, self-aware, knowing things such as its own dimensionality and providing functionality appropriate to its kind.

  18. Experimental Architecture.

    ERIC Educational Resources Information Center

    Alter, Kevin

    2003-01-01

    Describes the design of the Centre for Architectural Structures and Technology at the University of Manitoba, including the educational context and design goals. Includes building plans and photographs. (EV)

  19. The flight telerobotic servicer: From functional architecture to computer architecture

    NASA Technical Reports Server (NTRS)

    Lumia, Ronald; Fiala, John

    1989-01-01

    After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.

  20. Lunar architecture and urbanism

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent

    1992-01-01

    Human civilization and architecture have defined each other for over 5000 years on Earth. Even in the novel environment of space, persistent issues of human urbanism will eclipse, within a historically short time, the technical challenges of space settlement that dominate our current view. By adding modern topics in space engineering, planetology, life support, human factors, material invention, and conservation to their already renaissance array of expertise, urban designers can responsibly apply ancient, proven standards to the exciting new opportunities afforded by space. Inescapable facts about the Moon set real boundaries within which tenable lunar urbanism and its component architecture must eventually develop.

  1. QUEST2: System architecture deliverable set

    SciTech Connect

    Braaten, F.D.

    1995-02-27

    This document contains the system architecture and related documents which were developed during the Preliminary Analysis/System Architecture phase of the Quality, Environmental, Safety Tracking System redesign (QUEST2) project. Each discrete document in this deliverable set applies to an analytic effort supporting the architectural model of QUEST2. The P+ methodology cites a list of P+ documents normally included in a "typical" system architecture. Some of these were deferred to the release development phase of the project. The documents included in this deliverable set represent the system architecture itself. Related to that architecture are some decision support documents which provided information needed for management reviews that occurred during April. Consequently, the deliverables in this set were logically grouped and provided to support customer requirements. The remaining System Architecture Phase deliverables will be provided as a "Supporting Documents" deliverable set for the first release.

  2. Terra Harvest software architecture

    NASA Astrophysics Data System (ADS)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  3. Avionics System Architecture Tool

    NASA Technical Reports Server (NTRS)

    Chau, Savio; Hall, Ronald; Traylor, Marcus; Whitfield, Adrian

    2005-01-01

    Avionics System Architecture Tool (ASAT) is a computer program intended for use during the avionics-system-architecture-design phase of the process of designing a spacecraft for a specific mission. ASAT enables simulation of the dynamics of the command-and-data-handling functions of the spacecraft avionics in the scenarios in which the spacecraft is expected to operate. ASAT is built upon I-Logix Statemate MAGNUM, providing a complement of dynamic system modeling tools, including a graphical user interface (GUI), model-checking capabilities, and a simulation engine. ASAT augments this with a library of predefined avionics components and additional software to support building and analyzing avionics hardware architectures using these components.

  4. Guiding Architects in Selecting Architectural Evolution Alternatives

    SciTech Connect

    Ciraci, Selim; Sozer, Hasan; Aksit, Mehmet

    2011-09-09

    Although there exist methods and tools to support architecture evolution, the derivation and evaluation of alternative evolution paths are realized manually. In this paper, we introduce an approach where the architecture specification is converted to a graph representation. Based on this representation, we automatically generate possible evolution paths, evaluate quality attributes for different architecture configurations, and optimize the selection of a particular path accordingly. We illustrate our approach by modeling the software architecture evolution of a crisis management system.

  5. Ontology-driven health information systems architectures.

    PubMed

    Blobel, Bernd; Oemig, Frank

    2009-01-01

    Following an architecture vision such as the Generic Component Model (GCM) architecture framework, health information systems for supporting personalized care have to be based on a component-oriented architecture. Representing concepts and their interrelations, the GCM perspectives (system architecture, domains, and development process) can be described by the domains' ontologies. The paper introduces ontology principles, ontology references to the GCM, as well as some practical aspects of ontology-driven approaches to semantically interoperable and sustainable health information systems.

  6. Class Architecture.

    ERIC Educational Resources Information Center

    Crosbie, Michael J.

    This compendium contains more than 40 schools that show new directions in design and the changing demands on this building type. It discusses the design challenges in new schools and how each one of the projects meets the demands of an architecture for learning. An introduction by architect Raymond Bordwell explains many of the trends in new…

  7. Architectural Tops

    ERIC Educational Resources Information Center

    Mahoney, Ellen

    2010-01-01

    The development of the skyscraper is an American story that combines architectural history, economic power, and technological achievement. Each city in the United States can be identified by the profile of its buildings. The design of the tops of skyscrapers was the inspiration for the students in the author's high-school ceramic class to develop…

  8. Architectural Drafting.

    ERIC Educational Resources Information Center

    Davis, Ronald; Yancey, Bruce

    Designed to be used as a supplement to a two-book course in basic drafting, these instructional materials consisting of 14 units cover the process of drawing all working drawings necessary for residential buildings. The following topics are covered in the individual units: introduction to architectural drafting, lettering and tools, site…

  9. Architectural Models

    ERIC Educational Resources Information Center

    Levenson, Harold E.; Hurni, Andre

    1978-01-01

    Suggests building models as a way to reinforce and enhance related subjects such as architectural drafting, structural carpentry, etc., and discusses time, materials, scales, tools or equipment needed, how to achieve realistic special effects, and the types of projects that can be built (model of complete building, a panoramic model, and model…

  10. Design, Implementation and Evaluation of an Architecture based on the CDA R2 Document Repository to Provide Support to the Contingency Plan.

    PubMed

    Campos, Fernando; Luna, Daniel; Sittig, Dean F; Bernaldo de Quirós, Fernán González

    2015-01-01

    The pervasive use of electronic records in healthcare increases the dependency on technology due to the lack of physical backup for the records. Downtime in the Electronic Health Record system is unavoidable, due to software, infrastructure and power failures as well as natural disasters, so there is a need to develop a contingency plan ensuring patient care continuity and minimizing risks for health care delivery. To mitigate these risks, two applications were developed allowing healthcare delivery providers to retrieve clinical information using the Clinical Document Architecture Release 2 (CDA R2) document repository as the information source. In this paper we describe the strategy, implementation, and results, and provide an evaluation of effectiveness.

  11. Information systems definition architecture

    SciTech Connect

    Calapristi, A.J.

    1996-06-20

    The Tank Waste Remediation System (TWRS) Information Systems Definition architecture evaluated Information Management (IM) processes in several key organizations. The intent of the study is to identify improvements in TWRS IM processes that will enable better support to the TWRS mission and accommodate changes in the TWRS business environment. The ultimate goals of the study are to reduce IM costs, manage the configuration of TWRS IM elements, and improve IM-related process performance.

  12. The NASA Space Communications Data Networking Architecture

    NASA Technical Reports Server (NTRS)

    Israel, David J.; Hooke, Adrian J.; Freeman, Kenneth; Rush, John J.

    2006-01-01

    The NASA Space Communications Architecture Working Group (SCAWG) has recently been developing an integrated agency-wide space communications architecture in order to provide the necessary communication and navigation capabilities to support NASA's new Exploration and Science Programs. A critical element of the space communications architecture is the end-to-end Data Networking Architecture, which must provide a wide range of services required for missions ranging from planetary rovers to human spaceflight, and from sub-orbital space to deep space. Requirements for a higher degree of user autonomy and interoperability between a variety of elements must be accommodated within an architecture that necessarily features minimum operational complexity. The architecture must also be scalable and evolvable to meet mission needs for the next 25 years. This paper will describe the recommended NASA Data Networking Architecture, present some of the rationale for the recommendations, and will illustrate an application of the architecture to example NASA missions.

  13. Space Telecommunications Radio Architecture (STRS)

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of spaceflight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  14. Software Architecture Review: The State of Practice

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2009-07-01

    This paper presents the results of a survey we carried out to investigate the state of practice of software architecture reviews. Of the survey results we describe, two are particularly significant for the software architecture research community. First, the survey respondents evaluate architectures mostly using informal, experience-based approaches. Second, the survey respondents rarely adopt the techniques that are highly recommended in architecture review research, such as the use of project-independent reviewers. We conclude that the software engineering practitioner community has yet to become fully aware of the methods and techniques available to support disciplined architecture review processes and their potential benefits. The architecture review research community needs to concentrate on helping practitioners by providing guidelines for justifying and institutionalizing architecture review processes, and associated tool support.

  15. Space station needs, attributes, and architectural options study. Volume 2: Program options, architecture, and technology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Mission scenarios and space station architectures are discussed. Electrical power subsystems (EPS), environmental control and life support subsystems (ECLSS), and reaction control subsystem (RCS) architectures are addressed. Thermal control subsystems (TCS), guidance/navigation and control (GN and C), information management systems (IMS), communications and tracking (C and T), and propellant transfer and storage systems architectures are discussed.

  16. Open architecture design and approach for the Integrated Sensor Architecture (ISA)

    NASA Astrophysics Data System (ADS)

    Moulton, Christine L.; Krzywicki, Alan T.; Hepp, Jared J.; Harrell, John; Kogut, Michael

    2015-05-01

    Integrated Sensor Architecture (ISA) is designed in response to stovepiped integration approaches. The design, based on the principles of Service Oriented Architectures (SOA) and Open Architectures, addresses the problem of integration, and is not designed for specific sensors or systems. The use of SOA and Open Architecture approaches has led to a flexible, extensible architecture. Using these approaches, and supported with common data formats, open protocol specifications, and Department of Defense Architecture Framework (DoDAF) system architecture documents, an integration-focused architecture has been developed. ISA can help move the Department of Defense (DoD) from costly stovepipe solutions to a more cost-effective plug-and-play design to support interoperability.

  17. Commanding Constellations (Pipeline Architecture)

    NASA Technical Reports Server (NTRS)

    Ray, Tim; Condron, Jeff

    2003-01-01

    Providing ground command software for constellations of spacecraft is a challenging problem. Reliable command delivery requires a feedback loop; for a constellation there will likely be an independent feedback loop for each constellation member. Each command must be sent via the proper Ground Station, which may change from one contact to the next (and may be different for different members). Dynamic configuration of the ground command software is usually required (e.g. directives to configure each member's feedback loop and assign the appropriate Ground Station). For testing purposes, there must be a way to insert command data at any level in the protocol stack. The Pipeline architecture described in this paper can support all these capabilities with a sequence of software modules (the pipeline), and a single self-identifying message format (for all types of command data and configuration directives). The Pipeline architecture is quite simple, yet it can solve some complex problems. The resulting solutions are conceptually simple, and therefore, reliable. They are also modular, and therefore, easy to distribute and extend. We first used the Pipeline architecture to design a CCSDS (Consultative Committee for Space Data Systems) Ground Telecommand system (to command one spacecraft at a time with a fixed Ground Station interface). This pipeline was later extended to include gateways to any of several Ground Stations. The resulting pipeline was then extended to handle a small constellation of spacecraft. The use of the Pipeline architecture allowed us to easily handle the increasing complexity. This paper will describe the Pipeline architecture, show how it was used to solve each of the above commanding situations, and how it can easily be extended to handle larger constellations.
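
    A minimal sketch of the Pipeline idea follows: a chain of modules that each consume, transform, or pass along a single self-identifying message type. The module and message names are invented for illustration and are not the actual ground command software.

      # Toy pipeline: each module handles messages it recognises and forwards
      # the rest downstream; commands and configuration share one message format.
      class Message:
          def __init__(self, kind, member, payload):
              self.kind, self.member, self.payload = kind, member, payload   # self-identifying

      class GroundStationGateway:
          def __init__(self, member, downstream):
              self.member, self.downstream = member, downstream
          def handle(self, msg):
              if msg.kind == "command" and msg.member == self.member:
                  print(f"uplink via gateway[{self.member}]: {msg.payload}")
              else:
                  self.downstream.handle(msg)

      class Sink:
          def handle(self, msg):
              print(f"unrouted message dropped: kind={msg.kind} member={msg.member}")

      # One pipeline; extending it to more constellation members just adds modules.
      pipeline = GroundStationGateway("SAT-A", GroundStationGateway("SAT-B", Sink()))
      pipeline.handle(Message("command", "SAT-B", "SAFE_MODE_EXIT"))
      pipeline.handle(Message("command", "SAT-C", "NOOP"))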

  18. Variation in photosynthetic performance and hydraulic architecture across European beech (Fagus sylvatica L.) populations supports the case for local adaptation to water stress.

    PubMed

    Aranda, Ismael; Cano, Francisco Javier; Gascó, Antonio; Cochard, Hervé; Nardini, Andrea; Mancha, Jose Antonio; López, Rosana; Sánchez-Gómez, David

    2015-01-01

    The aim of this study was to provide new insights into how intraspecific variability in the response of key functional traits to drought dictates the interplay between gas-exchange parameters and the hydraulic architecture of European beech (Fagus sylvatica L.). Considering the relationships between hydraulic and leaf functional traits, we tested whether local adaptation to water stress occurs in this species. To address these objectives, we conducted a glasshouse experiment in which 2-year-old saplings from six beech populations were subjected to different watering treatments. These populations encompassed central and marginal areas of the range, with variation in macro- and microclimatic water availability. The results highlight subtle but significant differences among populations in their functional response to drought. Interpopulation differences in hydraulic traits suggest that vulnerability to cavitation is higher in populations with higher sensitivity to drought. However, there was no clear relationship between variables related to hydraulic efficiency, such as xylem-specific hydraulic conductivity or stomatal conductance, and those that reflect resistance to xylem cavitation (i.e., Ψ(12), the water potential corresponding to a 12% loss of stem hydraulic conductivity). The results suggest that while a trade-off between photosynthetic capacity at the leaf level and hydraulic function of xylem could be established across populations, it functions independently of the compromise between safety and efficiency of the hydraulic system with regard to water use at the interpopulation level.

  19. Lab architecture

    NASA Astrophysics Data System (ADS)

    Crease, Robert P.

    2008-04-01

    There are few more dramatic illustrations of the vicissitudes of laboratory architecture than the contrast between Building 20 at the Massachusetts Institute of Technology (MIT) and its replacement, the Ray and Maria Stata Center. Building 20 was built hurriedly in 1943 as temporary housing for MIT's famous Rad Lab, the site of wartime radar research, and it remained a productive laboratory space for over half a century. A decade ago it was demolished to make way for the Stata Center, an architecturally striking building designed by Frank Gehry to house MIT's computer science and artificial intelligence labs. But in 2004 - just two years after the Stata Center officially opened - the building was criticized for being unsuitable for research and became the subject of still ongoing lawsuits alleging design and construction failures.

  20. Executable Architecture Research at Old Dominion University

    NASA Technical Reports Server (NTRS)

    Tolk, Andreas; Shuman, Edwin A.; Garcia, Johnny J.

    2011-01-01

    Executable architectures allow the evaluation of system architectures not only regarding their static, but also their dynamic behavior. However, the systems engineering community does not agree on a common formal specification of executable architectures. Closing this gap by identifying the necessary elements of an executable architecture, a modeling language, and a modeling formalism is the topic of ongoing PhD research. In addition, systems are generally defined and applied in an operational context to provide capabilities and enable missions. To maximize the benefits of executable architectures, a second PhD effort introduces the idea of creating an executable context in addition to the executable architecture. The results move the validation of architectures from the current information domain into the knowledge domain and improve the reliability of such validation efforts. The paper presents research and results of both doctoral research efforts and puts them into the common context of state-of-the-art systems engineering methods supporting more agility.

  1. Tank waste remediation system architecture tree

    SciTech Connect

    PECK, L.G.

    1999-05-13

    The TWRS Architecture Tree presented in this document is a hierarchical breakdown to support the TWRS systems engineering analysis of the TWRS physical system, including facilities, hardware, and software. The purpose of this systems engineering architecture tree is to describe and communicate the system's selected and existing architecture, to provide a common structure to improve the integration of work and resulting products, and to provide a framework as a basis for TWRS Specification Tree development.

  2. The influence of nano-architectured CeOx supports in RhPd/CeO₂ for the catalytic ethanol steam reforming reaction

    DOE PAGES Beta

    Divins, N. J.; Senanayake, S. D.; Casanovas, A.; Xu, W.; Trovarelli, A.; Llorca, J.

    2015-01-19

    The ethanol steam reforming (ESR) reaction has been tested over RhPd supported on polycrystalline ceria in comparison to structured supports composed of nanoshaped CeO₂ cubes and CeO₂ rods tailored towards the production of hydrogen. At 650-700 K the hydrogen yield follows the trend RhPd/CeO₂-cubes > RhPd/CeO₂-rods > RhPd/CeO₂-polycrystalline, whereas at temperatures higher than 800 K the catalytic performance of all samples is similar and close to the thermodynamic equilibrium. The improved performance of RhPd/CeO₂-cubes and RhPd/CeO₂-rods for ESR at low temperature is mainly ascribed to higher water-gas shift activity and a strong interaction between the bimetallic particles and the oxide support. STEM analysis shows the existence of RhPd alloyed nanoparticles in all samples, with no apparent relationship between ESR performance and RhPd particle size. X-ray diffraction under operating conditions shows metal reorganization on {100} and {110} ceria crystallographic planes during catalyst activation and ESR, but not on {111} ceria crystallographic planes. The RhPd restructuring and tuned activation over ceria nanocubes and nanorods is considered the main reason for the better catalytic activity with respect to conventional catalysts based on polycrystalline ceria.

  3. The influence of nano-architectured CeOx supports in RhPd/CeO₂ for the catalytic ethanol steam reforming reaction

    SciTech Connect

    Divins, N. J.; Senanayake, S. D.; Casanovas, A.; Xu, W.; Trovarelli, A.; Llorca, J.

    2015-01-19

    The ethanol steam reforming (ESR) reaction has been tested over RhPd supported on polycrystalline ceria in comparison to structured supports composed of nanoshaped CeO₂ cubes and CeO₂ rods tailored towards the production of hydrogen. At 650-700 K the hydrogen yield follows the trend RhPd/CeO₂-cubes > RhPd/CeO₂-rods > RhPd/CeO₂-polycrystalline, whereas at temperatures higher than 800 K the catalytic performance of all samples is similar and close to the thermodynamic equilibrium. The improved performance of RhPd/CeO₂-cubes and RhPd/CeO₂-rods for ESR at low temperature is mainly ascribed to higher water-gas shift activity and a strong interaction between the bimetallic particles and the oxide support. STEM analysis shows the existence of RhPd alloyed nanoparticles in all samples, with no apparent relationship between ESR performance and RhPd particle size. X-ray diffraction under operating conditions shows metal reorganization on {100} and {110} ceria crystallographic planes during catalyst activation and ESR, but not on {111} ceria crystallographic planes. The RhPd restructuring and tuned activation over ceria nanocubes and nanorods is considered the main reason for the better catalytic activity with respect to conventional catalysts based on polycrystalline ceria.

  4. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kinds of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); and Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  5. Marshall Application Realignment System (MARS) Architecture

    NASA Technical Reports Server (NTRS)

    Belshe, Andrea; Sutton, Mandy

    2010-01-01

    The Marshall Application Realignment System (MARS) Architecture project was established to meet the certification requirements of the Department of Defense Architecture Framework (DoDAF) V2.0 Federal Enterprise Architecture Certification (FEAC) Institute program and to provide added value to the Marshall Space Flight Center (MSFC) Application Portfolio Management process. The MARS Architecture aims to: (1) address the NASA MSFC Chief Information Officer (CIO) strategic initiative to improve Application Portfolio Management (APM) by optimizing investments and improving portfolio performance, and (2) develop a decision-aiding capability by which applications registered within the MSFC application portfolio can be analyzed and considered for retirement or decommission. The MARS Architecture describes a to-be target capability that supports application portfolio analysis against scoring measures (based on value) and overall portfolio performance objectives (based on enterprise needs and policies). This scoring and decision-aiding capability supports the process by which MSFC application investments are realigned or retired from the application portfolio. The MARS Architecture is a multi-phase effort to: (1) conduct strategic architecture planning and knowledge development based on the DoDAF V2.0 six-step methodology, (2) describe one architecture through multiple viewpoints, (3) conduct portfolio analyses based on a defined operational concept, and (4) enable a new capability to support the MSFC enterprise IT management mission, vision, and goals. This report documents Phase 1 (Strategy and Design), which includes discovery, planning, and development of initial architecture viewpoints. Phase 2 will move forward the process of building the architecture, widening the scope to include application realignment (in addition to application retirement), and validating the underlying architecture logic before moving into Phase 3. The MARS Architecture key stakeholders are most

  6. MSAT network architecture

    NASA Technical Reports Server (NTRS)

    Davies, N. G.; Skerry, B.

    1990-01-01

    The Mobile Satellite (MSAT) communications system will support mobile voice and data services using circuit switched and packet switched facilities with interconnection to the public switched telephone network and private networks. Control of the satellite network will reside in a Network Control System (NCS) which is being designed to be extremely flexible to provide for the operation of the system initially with one multi-beam satellite, but with capability to add additional satellites which may have other beam configurations. The architecture of the NCS is described. The signalling system must be capable of supporting the protocols for the assignment of circuits for mobile public telephone and private network calls as well as identifying packet data networks. The structure of a straw-man signalling system is discussed.

  7. Architecture as Design Study.

    ERIC Educational Resources Information Center

    Kauppinen, Heta

    1989-01-01

    Explores the use of analogies in architectural design and the importance of Gestalt theory and aesthetic canons in understanding and being sensitive to architecture. Emphasizes the variation between public and professional appreciation of architecture. Notes that an understanding of the architectural process enables students to improve the aesthetic…

  8. Elucidating the role of topological pattern discovery and support vector machine in generating predictive models for Indian summer monsoon rainfall

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Manojit; Chattopadhyay, Surajit

    2016-10-01

    The present paper reports a study in which a growing hierarchical self-organising map (GHSOM) has been applied to perform a visual cluster analysis of an Indian rainfall dataset spanning 142 years, so that yearly rainfall can be segregated into small groups and the clustering behaviour of yearly rainfall arising from changes in monthly rainfall can be visualised for each year. In addition, using a support vector machine (SVM), it has been observed that the generated clusters have a positive impact on the prediction of the Indian summer monsoon rainfall. Results are presented through statistical and graphical analyses.
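
    The SOM-then-SVM idea can be illustrated with a small, self-contained sketch: a flat self-organising map (standing in for the growing hierarchical variant) clusters yearly rainfall vectors, and the resulting cluster index is appended as a feature for a support vector classifier. The data below are synthetic; this is not the study's code or dataset.

      # Synthetic illustration of SOM clustering feeding an SVM classifier.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      X = rng.gamma(shape=2.0, scale=50.0, size=(142, 12))       # 142 "years" x 12 monthly totals
      monsoon = X[:, 5:9].sum(axis=1)                            # Jun-Sep totals
      y = (monsoon > np.median(monsoon)).astype(int)             # strong vs weak monsoon year

      # Minimal 4x4 SOM trained with Gaussian-neighbourhood updates.
      grid = rng.standard_normal((4, 4, 12)) * X.std() + X.mean()
      for t in range(2000):
          x = X[rng.integers(len(X))]
          d = np.linalg.norm(grid - x, axis=2)
          bi, bj = np.unravel_index(d.argmin(), d.shape)         # best-matching unit
          lr, sigma = 0.5 * np.exp(-t / 1000), 1.5 * np.exp(-t / 1000)
          ii, jj = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
          h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
          grid += lr * h[:, :, None] * (x - grid)

      def cluster_id(x):
          return int(np.linalg.norm(grid - x, axis=2).argmin())  # flattened BMU index

      features = np.column_stack([X, [cluster_id(x) for x in X]])
      clf = SVC(kernel="rbf", gamma="scale").fit(features[:100], y[:100])
      print("held-out accuracy:", clf.score(features[100:], y[100:]))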

  9. Space Telecommunications Radio System (STRS) Architecture, Tutorial Part 2 - Detailed

    NASA Technical Reports Server (NTRS)

    Handler, Louis

    2014-01-01

    The STRS architecture detail presentation presents each requirement in the STRS Architecture Standard with some examples and supporting information. The purpose is to give a platform provider, application provider, or application integrator a better, more detailed understanding of the STRS Architecture Standard and its use.

  10. Using Multimedia for Teaching Analysis in History of Modern Architecture.

    ERIC Educational Resources Information Center

    Perryman, Garry

    This paper presents a case for the development and support of a computer-based interactive multimedia program for teaching analysis in community college architecture design programs. Analysis in architecture design is an extremely important strategy for the teaching of higher-order thinking skills, which senior schools of architecture look for in…

  11. On the Inevitable Intertwining of Requirements and Architecture

    NASA Astrophysics Data System (ADS)

    Sutcliffe, Alistair

    The chapter investigates the relationship between architecture and requirements, arguing that architectural issues need to be addressed early in the RE process. Three trends are driving architectural implications for RE: the growth of intelligent, context-aware and adaptable systems. First the relationship between architecture and requirements is considered from a theoretical viewpoint of problem frames and abstract conceptual models. The relationships between architectural decisions and non-functional requirements is reviewed, and then the impact of architecture on the RE process is assessed using a case study of developing configurable, semi-intelligent software to support medical researchers in e-science domains.

  12. Lunar Navigation Architecture Design Considerations

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Getchius, Joel; Holt, Greg; Moreau, Michael

    2009-01-01

    The NASA Constellation Program is aiming to establish a long-term presence on the lunar surface. The Constellation elements (Orion, Altair, Earth Departure Stage, and Ares launch vehicles) will require a lunar navigation architecture for navigation state updates during lunar-class missions. Orion in particular has baselined earth-based ground direct tracking as the primary source for much of its absolute navigation needs. However, due to the uncertainty in the lunar navigation architecture, the Orion program has had to make certain assumptions on the capabilities of such architectures in order to adequately scale the vehicle design trade space. The following paper outlines lunar navigation requirements, the Orion program assumptions, and the impacts of these assumptions on the lunar navigation architecture design. The selection of potential sites was based upon geometric baselines, logistical feasibility, redundancy, and abort support capability. Simulated navigation covariances mapped to entry interface flight-path-angle uncertainties were used to evaluate knowledge errors. A minimum ground station architecture was identified consisting of Goldstone, Madrid, Canberra, Santiago, Hartebeeshoek, Dongora, Hawaii, Guam, and Ascension Island (or the geometric equivalent).

  13. The ALMA software architecture

    NASA Astrophysics Data System (ADS)

    Schwarz, Joseph; Farris, Allen; Sommer, Heiko

    2004-09-01

    The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 km baselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-oriented development team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
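
    The Container/Component separation described above lends itself to a short illustration. The sketch below shows how a container can inject technical services (here, just logging and component lookup) so that application components contain only functional code; the class and method names are illustrative assumptions, not the actual ALMA Common Software (ACS) API.

    ```python
    # Sketch of the Container/Component separation of concerns. Names are
    # illustrative assumptions, not the ACS API.
    import logging


    class Container:
        """Provides technical services (logging, component lookup) to components."""

        def __init__(self):
            self._components = {}
            logging.basicConfig(level=logging.INFO)

        def logger(self, name):
            return logging.getLogger(name)

        def activate(self, name, component_cls):
            comp = component_cls(services=self)    # inject container services
            self._components[name] = comp
            return comp

        def get_component(self, name):
            return self._components[name]          # stands in for remote lookup


    class CorrelatorComponent:
        """Application code: functional concerns only, no infrastructure code."""

        def __init__(self, services):
            self._log = services.logger("Correlator")

        def configure(self, baseline_km):
            self._log.info("configuring correlator for %.1f km baseline", baseline_km)


    container = Container()
    container.activate("CORRELATOR", CorrelatorComponent)
    container.get_component("CORRELATOR").configure(14.0)
    ```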

  14. Protocol Architecture Model Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. This report applies the methodology to three space Internet-based communications scenarios for future missions. CNS has conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. The scenarios are: Scenario 1: Unicast communications between a Low-Earth-Orbit (LEO) spacecraft in-space Internet node and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer; Scenario 2: Unicast communications between a Low-Earth-Orbit (LEO) International Space Station and a ground terminal Internet node via a TDRS transfer; Scenario 3: Multicast Communications (or "Multicasting"), 1 Spacecraft to N Ground Receivers, N Ground Transmitters to 1 Ground Receiver via a Spacecraft.

  15. Secure Storage Architectures

    SciTech Connect

    Aderholdt, Ferrol; Caldwell, Blake A; Hicks, Susan Elaine; Koch, Scott M; Naughton, III, Thomas J; Pogge, James R; Scott, Stephen L; Shipman, Galen M; Sorrillo, Lawrence

    2015-01-01

    The purpose of this report is to clarify the challenges associated with storage for secure enclaves. The major focus areas for the report are: - review of relevant parallel filesystem technologies to identify assets and gaps; - review of filesystem isolation/protection mechanisms, to include native filesystem capabilities and auxiliary/layered techniques; - definition of storage architectures that can be used for customizable compute enclaves (i.e., clarification of use-cases that must be supported for shared storage scenarios); - investigation of vendor products related to secure storage. This study provides technical details on the storage and filesystems used for HPC, with particular attention to elements that contribute to creating secure storage. We outline the pieces for a shared storage architecture that balances protection and performance by leveraging the isolation capabilities available in filesystems and virtualization technologies to maintain the integrity of the data. Key Points: There are a few existing and in-progress protection features in Lustre related to secure storage, which are discussed in (Chapter 3.1). These include authentication capabilities like GSSAPI/Kerberos and the in-progress work for GSSAPI/Host-keys. The GPFS filesystem provides native support for encryption, which is not directly available in Lustre. Additionally, GPFS includes authentication/authorization mechanisms for inter-cluster sharing of filesystems (Chapter 3.2). The limitations of key importance for secure storage/filesystems are: (i) restricting sub-tree mounts for parallel filesystems (which is not directly supported in Lustre or GPFS), and (ii) segregation of hosts on the storage network and practical complications with dynamic additions to the storage network, e.g., LNET. A challenge for VM based use cases will be to provide efficient IO forwarding of the parallel filesystem from the host to the guest (VM). There are promising options like para-virtualized filesystems to

  16. Data-driven array architectures: a rebirth?

    NASA Astrophysics Data System (ADS)

    Cardoso, Joao M. P.

    2005-06-01

    The von Neumann-style architectures have been tremendously successful by taking advantage of Moore's law. It is now understood that it will be very difficult to meet the supercomputing demands of future computing systems with this style of microprocessor architecture. Most applications nowadays require high performance for processing data streams. Since dataflow computing is a natural paradigm for processing data streams, architectures based on dataflow principles are emerging as a way to meet these supercomputing demands. Data-driven arrays, introduced in the '80s, are examples of such architectures. They provide a scalable and effective way to directly support the dataflow model of computation and have been revived by a number of reconfigurable architectures (e.g., KressArray, WaveScalar, and XPP). These coarse-grained reconfigurable architectures with dataflow semantics show interesting achievements with respect to performance and programming methodologies when compared to other computing platforms. This paper presents the most interesting data-driven array architectures. Trends and open issues related to a number of properties at the architectural level and to compilation techniques are enumerated and discussed. A number of features are illustrated, especially the support for hardware virtualization, speculative configuration, and software pipelining. Examples using the PACT XPP reconfigurable array are shown. These examples include the ADPCM decoder, from the MediaBench repository, and LeeDCT, an optimized DCT algorithm.
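
    The defining property of the data-driven arrays mentioned above is the dataflow firing rule: an operation executes as soon as all of its input tokens are available, with no program counter. The toy graph executor below illustrates just that rule; the node names and the two-node graph are assumptions for illustration and have nothing to do with the XPP hardware itself.

    ```python
    # Toy illustration of the data-driven firing rule: a node fires when all of
    # its input tokens have arrived. The graph below is illustrative only.
    class Node:
        def __init__(self, name, n_inputs, fn, downstream):
            self.name, self.n_inputs, self.fn = name, n_inputs, fn
            self.downstream = downstream           # list of (node_name, input_port)
            self.tokens = {}

        def ready(self):
            return len(self.tokens) == self.n_inputs


    def run(nodes, initial_tokens):
        """Push tokens through the graph, firing any node whose inputs are complete."""
        pending = list(initial_tokens)             # items are (node_name, port, value)
        while pending:
            name, port, value = pending.pop()
            node = nodes[name]
            node.tokens[port] = value
            if node.ready():
                result = node.fn(*[node.tokens[p] for p in sorted(node.tokens)])
                node.tokens = {}
                print(f"fired {node.name} -> {result}")
                for succ, succ_port in node.downstream:
                    pending.append((succ, succ_port, result))


    nodes = {
        "mul": Node("mul", 2, lambda a, b: a * b, [("add", 0)]),
        "add": Node("add", 2, lambda a, b: a + b, []),
    }
    run(nodes, [("mul", 0, 3), ("mul", 1, 4), ("add", 1, 10)])  # computes 3*4 + 10
    ```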

  17. The EPOS ICT Architecture

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Bailo, Daniele

    2016-04-01

    In parallel, the ICT team is tracking developments in ICT for relevance to EPOS-IP. In particular, the potential utilisation of e-Is (e-Infrastructures) such as GEANT (network), AARC (security), EGI (GRID computing), EUDAT (data curation), PRACE (High Performance Computing), and HELIX-Nebula / Open Science Cloud (Cloud computing) is being assessed. Similarly, relationships with other e-RIs (e-Research Infrastructures) such as ENVRI+, EXCELERATE and other ESFRI (European Strategic Forum for Research Infrastructures) projects are being developed to share experience and technology and to promote interoperability. EPOS ICT team members are also involved in VRE4EIC, a project developing a reference architecture and component software services for a Virtual Research Environment to be superimposed on EPOS-ICS. The challenge now being tackled is therefore to keep consistency and interoperability among the different modules, initiatives and actors which participate in the process of running the EPOS platform. This implies both continuously tracking the IT aspects of the initiatives mentioned and refining the e-architecture designed so far. One major aspect of EPOS-IP is the ICT support for legal, financial and governance aspects of the EPOS ERIC to be initiated during EPOS-IP. This implies a sophisticated AAAI (Authentication, Authorization, Accounting Infrastructure) with consistency throughout the software, communications and data stack.

  18. Information architecture. Volume 4: Vision

    SciTech Connect

    1998-03-01

    The Vision document marks the transition from definition to implementation of the Department of Energy (DOE) Information Architecture Program. A description of the possibilities for the future, supported by actual experience with a process model and tool set, points toward implementation options. The directions for future information technology investments are discussed. Practical examples of how technology answers the business and information needs of the organization through coordinated and meshed data, applications, and technology architectures are related. This document is the fourth and final volume in the planned series for defining and exhibiting the DOE information architecture. The targeted scope of this document includes DOE Program Offices, field sites, contractor-operated facilities, and laboratories. This document paints a picture of how, over the next 7 years, technology may be implemented, dramatically improving the ways business is conducted at DOE. While technology is mentioned throughout this document, the vision is not about technology. The vision concerns the transition afforded by technology and the process steps to be completed to ensure alignment with business needs. This goal can be met if those directing the changing business and mission-support processes understand the capabilities afforded by architectural processes.

  19. Demand Activated Manufacturing Architecture

    SciTech Connect

    Bender, T.R.; Zimmerman, J.J.

    2001-02-07

    Honeywell Federal Manufacturing & Technologies (FM&T) engineers John Zimmerman and Tom Bender directed separate projects within this CRADA. This Project Accomplishments Summary contains their reports independently. Zimmerman: In 1998 Honeywell FM&T partnered with the Demand Activated Manufacturing Architecture (DAMA) Cooperative Business Management Program to pilot the Supply Chain Integration Planning Prototype (SCIP). At the time, FM&T was developing an enterprise-wide supply chain management prototype called the Integrated Programmatic Scheduling System (IPSS) to improve the DOE's Nuclear Weapons Complex (NWC) supply chain. In the CRADA partnership, FM&T provided the IPSS technical and business infrastructure as a test bed for SCIP technology, and this would provide FM&T the opportunity to evaluate SCIP as the central schedule engine and decision support tool for IPSS. FM&T agreed to do the bulk of the work for piloting SCIP. In support of that aim, DAMA needed specific DOE Defense Programs opportunities to prove the value of its supply chain architecture and tools. In this partnership, FM&T teamed with Sandia National Labs (SNL), Division 6534, the other DAMA partner and developer of SCIP. FM&T tested SCIP in 1998 and 1999. Testing ended in 1999 when DAMA CRADA funding for FM&T ceased. Before entering the partnership, FM&T discovered that the DAMA SCIP technology had an array of applications in strategic, tactical, and operational planning and scheduling. At the time, FM&T planned to improve its supply chain performance by modernizing the NWC-wide planning and scheduling business processes and tools. The modernization took the form of a distributed client-server planning and scheduling system (IPSS) for planners and schedulers to use throughout the NWC on desktops through an off-the-shelf WEB browser. The planning and scheduling process within the NWC then, and today, is a labor-intensive paper-based method that plans and schedules more than 8,000 shipped parts

  20. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  1. High performance parallel architectures

    SciTech Connect

    Anderson, R.E. )

    1989-09-01

    In this paper the author describes current high performance parallel computer architectures. A taxonomy is presented to show computer architecture from the user programmer's point-of-view. The effects of the taxonomy upon the programming model are described. Some current architectures are described with respect to the taxonomy. Finally, some predictions about future systems are presented. 5 refs., 1 fig.

  2. HRST architecture modeling and assessments

    SciTech Connect

    Comstock, D.A.

    1997-01-01

    This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented. © 1997 American Institute of Physics.

  3. Launch Vehicle Control Center Architectures

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Epps, Amy; Woodruff, Van; Vachon, Michael Jacob; Monreal, Julio; Williams, Randall; McLaughlin, Tom

    2014-01-01

    This analysis is a survey of control center architectures of the NASA Space Launch System (SLS), United Launch Alliance (ULA) Atlas V and Delta IV, and the European Space Agency (ESA) Ariane 5. Each of these control center architectures has similarities in basic structure and differences in the functional distribution of responsibilities for the phases of operations: (a) Launch vehicles in the international community vary greatly in configuration and process; (b) Each launch site has a unique processing flow based on the specific configurations; (c) Launch and flight operations are managed through a set of control centers associated with each launch site; however, flight operations may be handled by a different control center than the launch center; and (d) The engineering support centers are primarily located at the design center with a small engineering support team at the launch site.

  4. Architecture for Adaptive Intelligent Systems

    NASA Technical Reports Server (NTRS)

    Hayes-Roth, Barbara

    1993-01-01

    We identify a class of niches to be occupied by 'adaptive intelligent systems (AISs)'. In contrast with niches occupied by typical AI agents, AIS niches present situations that vary dynamically along several key dimensions: different combinations of required tasks, different configurations of available resources, contextual conditions ranging from benign to stressful, and different performance criteria. We present a small class hierarchy of AIS niches that exhibit these dimensions of variability and describe a particular AIS niche, ICU (intensive care unit) patient monitoring, which we use for illustration throughout the paper. We have designed and implemented an agent architecture that supports all of these different kinds of adaptation by exploiting a single underlying theoretical concept: an agent dynamically constructs explicit control plans to guide its choices among situation-triggered behaviors. We illustrate the architecture and its support for adaptation with examples from Guardian, an experimental agent for ICU monitoring.
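
    The single underlying concept named in this record, an explicit control plan arbitrating among situation-triggered behaviours, can be shown in a few lines. The behaviours, trigger conditions and plan below are invented for illustration and are not taken from the Guardian system.

    ```python
    # Sketch: situation-triggered behaviours compete; an explicit control plan
    # (an ordered priority list) decides which one runs. All names are illustrative.
    def raise_alarm(s):      return f"ALARM: SpO2={s['spo2']}"
    def log_vitals(s):       return f"log: HR={s['hr']}, SpO2={s['spo2']}"
    def summarise_shift(s):  return "shift summary prepared"

    BEHAVIOURS = [
        # (name, trigger predicate, action)
        ("alarm",     lambda s: s["spo2"] < 90,  raise_alarm),
        ("monitor",   lambda s: True,            log_vitals),
        ("summarise", lambda s: s["end_shift"],  summarise_shift),
    ]

    # The control plan expresses the agent's current priorities; the agent would
    # rebuild it dynamically as the situation changes.
    control_plan = ["alarm", "summarise", "monitor"]

    def step(situation):
        triggered = {name for name, trig, _ in BEHAVIOURS if trig(situation)}
        actions = {name: act for name, _, act in BEHAVIOURS}
        for name in control_plan:                  # plan order = current priority
            if name in triggered:
                return actions[name](situation)
        return "idle"

    print(step({"hr": 80,  "spo2": 97, "end_shift": False}))   # -> vitals logged
    print(step({"hr": 120, "spo2": 85, "end_shift": False}))   # -> alarm raised
    ```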

  5. Information architecture. Volume 1, The foundations

    SciTech Connect

    1995-03-01

    The Information Management Planning and Architecture Coordinating Team was formed to establish an information architecture framework to meet DOE's current and future information needs. This department-wide activity was initiated in accordance with the DOE Information Management Strategic Plan; it also supports the Departmental Strategic Plan. It recognizes recent changes in emphasis as reflected in OMB Circular A-130 and the Information Resources Management Planning Process Improvement Team recommendations. Sections of this document provide the foundation for establishing DOE's Information Architecture: Background, Business Case (reduced duplication of effort, increased integration of activities, improved operational capabilities), Baseline (technology baseline currently in place within DOE), Vision (guiding principles for future DOE Information Architecture), Standards Process, Policy and Process Integration (describes relations between information architecture and business processes), and Next Steps. Following each section is a scenario. A glossary of terms is provided.

  6. Airport Surface Network Architecture Definition

    NASA Technical Reports Server (NTRS)

    Nguyen, Thanh C.; Eddy, Wesley M.; Bretmersky, Steven C.; Lawas-Grodek, Fran; Ellis, Brenda L.

    2006-01-01

    Currently, airport surface communications are fragmented across multiple types of systems. These communication systems for airport operations at most airports today are based on dedicated and separate architectures that cannot support system-wide interoperability and information sharing. The requirements placed upon the Communications, Navigation, and Surveillance (CNS) systems in airports are rapidly growing, and integration is urgently needed if the future vision of the National Airspace System (NAS) and the Next Generation Air Transportation System (NGATS) 2025 concept are to be realized. To address this and other problems such as airport surface congestion, the Space Based Technologies Project's Surface ICNS Network Architecture team at NASA Glenn Research Center has assessed airport surface communications requirements, analyzed existing and future surface applications, and defined a set of architecture functions that will help design a scalable, reliable and flexible surface network architecture to meet the current and future needs of airport operations. This paper describes the systems approach, or methodology, to networking that was employed to assess airport surface communications requirements, analyze applications, and define the surface network architecture functions as the building blocks or components of the network. The systems approach used for defining these functions is relatively new to networking. It views the surface network, along with its environment (everything that the surface network interacts with or impacts), as a system. Associated with this system are sets of services that are offered by the network to the rest of the system. Therefore, the surface network is considered as part of the larger system (such as the NAS), with interactions and dependencies between the surface network and its users, applications, and devices. The surface network architecture includes components such as addressing/routing, network management, network

  7. Adaptive reconfigurable distributed sensor architecture

    NASA Astrophysics Data System (ADS)

    Akey, Mark L.

    1997-07-01

    The infancy of unattended ground based sensors is quickly coming to an end with the arrival of on-board GPS, networking, and multiple sensing capabilities. Unfortunately, their use is only first-order at best: GPS assists with sensor report registration; networks push sensor reports back to the warfighter and forwards control information to the sensors; multispectral sensing is a preset, pre-deployment consideration; and the scalability of large sensor networks is questionable. Current architectures provide little synergy among or within the sensors either before or after deployment, and do not map well to the tactical user's organizational structures and constraints. A new distributed sensor architecture is defined which moves well beyond single sensor, single task architectures. Advantages include: (1) automatic mapping of tactical direction to multiple sensors' tasks; (2) decentralized, distributed management of sensor resources and tasks; (3) software reconfiguration of deployed sensors; (4) network scalability and flexibility to meet the constraints of tactical deployments, and traditional combat organizations and hierarchies; and (5) adaptability to new battlefield communication paradigms such as BADD (Battlefield Analysis and Data Dissemination). The architecture is supported in two areas: a recursive, structural definition of resource configuration and management via loose associations; and a hybridization of intelligent software agents with tele-programming capabilities. The distributed sensor architecture is examined within the context of air-deployed ground sensors with acoustic, communication direction finding, and infra-red capabilities. Advantages and disadvantages of the architecture are examined. Consideration is given to extended sensor life (up to 6 months), post-deployment sensor reconfiguration, limited on-board sensor resources (processor and memory), and bandwidth. It is shown that technical tasking of the sensor suite can be automatically

  8. Launch Vehicle Control Center Architectures

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Epps, Amy; Woodruff, Van; Vachon, Michael Jacob; Monreal, Julio; Levesque, Marl; Williams, Randall; Mclaughlin, Tom

    2014-01-01

    Launch vehicles within the international community vary greatly in their configuration and processing. Each launch site has a unique processing flow based on the specific launch vehicle configuration. Launch and flight operations are managed through a set of control centers associated with each launch site. Each launch site has a control center for launch operations; however flight operations support varies from being co-located with the launch site to being shared with the space vehicle control center. There is also a nuance of some having an engineering support center which may be co-located with either the launch or flight control center, or in a separate geographical location altogether. A survey of control center architectures is presented for various launch vehicles including the NASA Space Launch System (SLS), United Launch Alliance (ULA) Atlas V and Delta IV, and the European Space Agency (ESA) Ariane 5. Each of these control center architectures shares some similarities in basic structure while differences in functional distribution also exist. The driving functions which lead to these factors are considered and a model of control center architectures is proposed which supports these commonalities and variations.

  9. Space Telecommunications Radio Architecture (STRS): Technical Overview

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, the industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space based platforms.

  10. Distributed visualization framework architecture

    NASA Astrophysics Data System (ADS)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy to use and extensible framework for research in scientific visualization. The system provides both single-user and collaborative distributed environments. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These lightweight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back-end is supported by concrete implementations wherever needed (for instance, for rendering). A middle tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence if a new component is added that supports the IMaterial interface, any instances of this can be used in the various GUI components that work with this
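
    The proxy-and-asset-manager arrangement described above can be illustrated with a minimal sketch: proxies implement small interfaces, register with an asset manager, and GUI components populate themselves by querying for an interface. The classes below are stand-ins suggested by the abstract (IMaterial, AssetManager); the implementation is assumed, not the framework's actual API.

    ```python
    # Sketch of proxy objects registered with an asset manager and queried by
    # interface. Class names follow the abstract; the implementation is assumed.
    from abc import ABC, abstractmethod


    class IMaterial(ABC):
        @abstractmethod
        def shade(self) -> str: ...


    class PhongMaterialProxy(IMaterial):
        def __init__(self, name):
            self.name = name

        def shade(self):
            return f"{self.name}: phong shading"


    class AssetManager:
        """Tracks registered proxies and answers interface queries."""

        def __init__(self):
            self._proxies = []

        def register(self, proxy):
            self._proxies.append(proxy)

        def query(self, interface):
            return [p for p in self._proxies if isinstance(p, interface)]


    assets = AssetManager()
    assets.register(PhongMaterialProxy("gold"))
    assets.register(PhongMaterialProxy("glass"))

    # A material-editor GUI component could populate its list like this:
    for material in assets.query(IMaterial):
        print(material.shade())
    ```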

  11. Parallel machine architecture and compiler design facilities

    NASA Technical Reports Server (NTRS)

    Kuck, David J.; Yew, Pen-Chung; Padua, David; Sameh, Ahmed; Veidenbaum, Alex

    1990-01-01

    The objective is to provide an integrated simulation environment for studying and evaluating various issues in designing parallel systems, including machine architectures, parallelizing compiler techniques, and parallel algorithms. The status of the Delta project (whose objective is to provide a facility for rapid prototyping of parallelizing compilers that can target different machine architectures) is summarized. Included are surveys of the program manipulation tools developed, the environmental software supporting Delta, and the compiler research projects in which Delta has played a role.

  12. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  13. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  14. Franchisees in Crisis: Using Action Learning to Self-Organise

    ERIC Educational Resources Information Center

    O'Donoghue, Carol

    2011-01-01

    The present article describes the use of action learning by a group of 30 franchisees to organise themselves and work through a period of upheaval and uncertainty when their parent company faced liquidation. Written from the perspective of one of the franchisees who found herself adopting action learning principles to facilitate the group, it…

  15. Enzyme immobilisation on self-organised nanopatterned electrode surfaces.

    PubMed

    Gajdzik, Janine; Lenz, Jennifer; Natter, Harald; Hempelmann, Rolf; Kohring, Gert-Wieland; Giffhorn, Friedrich; Manolova, Mila; Kolb, Dieter M

    2010-10-21

    A new method is described for immobilisation of enzymes on polymer-coated Pt islands. These islands are deposited on top of a SAM-covered Au(111) electrode by a combination of electroless and electrochemical deposition, which allows for a variation of island size and distance between the islands. Here we describe the immobilisation of pyranose-2-oxidase (P2Ox) and the catalytic response to D-glucose on such a nanopatterned surface, which provides optimum access to the active centres of the enzyme.

  16. Grid Architecture 2

    SciTech Connect

    Taft, Jeffrey D.

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and describes work done to advance the science of Grid Architecture as well.

  17. A computer architecture for intelligent machines

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Saridis, G. N.

    1991-01-01

    The Theory of Intelligent Machines proposes a hierarchical organization for the functions of an autonomous robot based on the Principle of Increasing Precision With Decreasing Intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed in recent years. A computer architecture that implements the lower two levels of the intelligent machine is presented. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Details of Execution Level controllers for motion and vision systems are addressed, as well as the Petri net transducer software used to implement Coordination Level functions. Extensions to UNIX and VxWorks operating systems which enable the development of a heterogeneous, distributed application are described. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  18. Systems Architecture for a Nationwide Healthcare System.

    PubMed

    Abin, Jorge; Nemeth, Horacio; Friedmann, Ignacio

    2015-01-01

    To provide Internet technology support at the national level, the Nationwide Integrated Healthcare System in Uruguay requires an Information Systems Architecture model. This system has multiple healthcare providers (public and private) and a strong component of supplementary services. Thus, the data processing system should have an architecture that takes this into account, while integrating the central services provided by the Ministry of Public Health. The national electronic health record, as well as other related data processing systems, should be based on this architecture. The architecture model described here conceptualizes a federated framework of electronic health record systems, according to the IHE affinity model, HL7 standards, local standards on interoperability and security, as well as technical advice provided by AGESIC. It is the outcome of research done by AGESIC and the Systems Integration Laboratory (LINS) on the development and use of the e-Government Platform since 2008, as well as research done by the Salud.uy team since 2013.

  19. A computer architecture for intelligent machines

    NASA Technical Reports Server (NTRS)

    Lefebvre, D. R.; Saridis, G. N.

    1992-01-01

    The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.

  20. Virtual environment architecture for rapid application development

    NASA Technical Reports Server (NTRS)

    Grinstein, Georges G.; Southard, David A.; Lee, J. P.

    1993-01-01

    We describe the MITRE Virtual Environment Architecture (VEA), a product of nearly two years of investigations and prototypes of virtual environment technology. This paper discusses the requirements for rapid prototyping, and an architecture we are developing to support virtual environment construction. VEA supports rapid application development by providing a variety of pre-built modules that can be reconfigured for each application session. The modules supply interfaces for several types of interactive I/O devices, in addition to large-screen or head-mounted displays.

  1. Generic POCC architectures

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This document describes a generic POCC (Payload Operations Control Center) architecture based upon current POCC software practice, and several refinements to the architecture based upon object-oriented design principles and expected developments in teleoperations. The current-technology generic architecture is an abstraction based upon close analysis of the ERBS, COBE, and GRO POCC's. A series of three refinements is presented: these may be viewed as an approach to a phased transition to the recommended architecture. The third refinement constitutes the recommended architecture, which, together with associated rationales, will form the basis of the rapid synthesis environment to be developed in the remainder of this task. The document is organized into two parts. The first part describes the current generic architecture using several graphical as well as tabular representations or 'views.' The second part presents an analysis of the generic architecture in terms of object-oriented principles. On the basis of this discussion, refinements to the generic architecture are presented, again using a combination of graphical and tabular representations.

  2. Teaching American Indian Architecture.

    ERIC Educational Resources Information Center

    Winchell, Dick

    1991-01-01

    Reviews "Native American Architecture," by Nabokov and Easton, an encyclopedic work that examines technology, climate, social structure, economics, religion, and history in relation to house design and the "meaning" of space among tribes of nine regions. Describes this book's use in a college course on Native American architecture. (SV)

  3. Architectural Physics: Lighting.

    ERIC Educational Resources Information Center

    Hopkinson, R. G.

    The author coordinates the many diverse branches of knowledge which have dealt with the field of lighting--physiology, psychology, engineering, physics, and architectural design. Part I, "The Elements of Architectural Physics", discusses the physiological aspects of lighting, visual performance, lighting design, calculations and measurements of…

  4. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  5. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  6. Applying neuroscience to architecture.

    PubMed

    Eberhard, John P

    2009-06-25

    Architectural practice and neuroscience research use our brains and minds in much the same way. However, the link between neuroscience knowledge and architectural design--with rare exceptions--has yet to be made. The concept of linking these two fields is a challenge worth considering.

  7. The Technology of Architecture

    ERIC Educational Resources Information Center

    Reese, Susan

    2006-01-01

    This article discusses how career and technical education is helping students draw up plans for success in architectural technology. According to the College of DuPage (COD) in Glen Ellyn, Illinois, one of the two-year schools offering training in architectural technology, graduates have a number of opportunities available to them. They may work…

  8. Satellite ATM Networks: Architectures and Guidelines Developed

    NASA Technical Reports Server (NTRS)

    vonDeak, Thomas C.; Yegendu, Ferit

    1999-01-01

    An important element of satellite-supported asynchronous transfer mode (ATM) networking will involve support for the routing and rerouting of active connections. Work published under the auspices of the Telecommunications Industry Association (http://www.tiaonline.org), describes basic architectures and routing protocol issues for satellite ATM (SATATM) networks. The architectures and issues identified will serve as a basis for further development of technical specifications for these SATATM networks. Three ATM network architectures for bent pipe satellites and three ATM network architectures for satellites with onboard ATM switches were developed. The architectures differ from one another in terms of required level of mobility, supported data rates, supported terrestrial interfaces, and onboard processing and switching requirements. The documentation addresses low-, middle-, and geosynchronous-Earth-orbit satellite configurations. The satellite environment may require real-time routing to support the mobility of end devices and nodes of the ATM network itself. This requires the network to be able to reroute active circuits in real time. In addition to supporting mobility, rerouting can also be used to (1) optimize network routing, (2) respond to changing quality-of-service requirements, and (3) provide a fault tolerance mechanism. Traffic management and control functions are necessary in ATM to ensure that the quality-of-service requirements associated with each connection are not violated and also to provide flow and congestion control functions. Functions related to traffic management were identified and described. Most of these traffic management functions will be supported by on-ground ATM switches, but in a hybrid terrestrial-satellite ATM network, some of the traffic management functions may have to be supported by the onboard satellite ATM switch. Future work is planned to examine the tradeoffs of placing traffic management functions onboard a satellite as

  9. The Simulation Intranet Architecture

    SciTech Connect

    Holmes, V.P.; Linebarger, J.M.; Miller, D.J.; Vandewart, R.L.

    1998-12-02

    The Simulation Intranet (SI) is a term being used to describe one element of a multidisciplinary distributed and distance computing initiative known as DisCom2 at Sandia National Laboratory. The Simulation Intranet is an architecture for satisfying Sandia's long-term goal of providing an end-to-end set of services for high fidelity full physics simulations in a high performance, distributed, and distance computing environment. The Intranet Architecture group was formed to apply current distributed object technologies to this problem. For the hardware architectures and software models involved with the current simulation process, a CORBA-based architecture is best suited to meet Sandia's needs. This paper presents the initial design and implementation of this Intranet based on a three-tier Network Computing Architecture (NCA). The major parts of the architecture include: the Web Client, the Business Objects, and Data Persistence.

  10. Can architecture be barbaric?

    PubMed

    Hürol, Yonca

    2009-06-01

    The title of this article is adapted from Theodor W. Adorno's famous dictum: 'To write poetry after Auschwitz is barbaric.' After the catastrophic earthquake in Kocaeli, Turkey on the 17th of August 1999, in which more than 40,000 people died or were lost, Necdet Teymur, who was then the dean of the Faculty of Architecture of the Middle East Technical University, referred to Adorno in one of his 'earthquake poems' and asked: 'Is architecture possible after 17th of August?' The main objective of this article is to interpret Teymur's question in respect of its connection to Adorno's philosophy with a view to make a contribution to the politics and ethics of architecture in Turkey. Teymur's question helps in providing a new interpretation of a critical approach to architecture and architectural technology through Adorno's philosophy. The paper also presents a discussion of Adorno's dictum, which serves for a better understanding of its universality/particularity.

  11. The Planning Execution Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Ly, Bebe; Crocker, Alan; Schreckenghost, Debra; Mueller, Stephen; Phillips, Robert; Wadsworth, David; Sorensen, Charles

    2011-01-01

    The Planning Execution Monitoring (PEM) architecture is a design concept for developing autonomous cockpit command and control software. The PEM architecture is designed to reduce the operations costs in the space transportation system through the use of automation while improving safety and operability of the system. Specifically, the PEM autonomous framework enables automatic performance of many vehicle operations that would typically be performed by a human. Also, this framework supports varying levels of autonomous control, ranging from fully automatic to fully manual control. The PEM autonomous framework interfaces with the core flight software to perform flight procedures. It can either assist human operators in performing procedures or autonomously execute routine cockpit procedures based on the operational context. Most importantly, the PEM autonomous framework promotes and simplifies the capture, verification, and validation of the flight operations knowledge. Through a hierarchical decomposition of the domain knowledge, the vehicle command and control capabilities are divided into manageable functional "chunks" that can be captured and verified separately. These functional units, each of which has the responsibility to manage part of the vehicle command and control, are modular, re-usable, and extensible. Also, the functional units are self-contained and have the ability to plan and execute the necessary steps for accomplishing a task based upon the current mission state and available resources. The PEM architecture has potential for application outside the realm of spaceflight, including management of complex industrial processes, nuclear control, and control of complex vehicles such as submarines or unmanned air vehicles.
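
    The hierarchical decomposition into self-contained functional units described above is easy to sketch: each unit owns part of the vehicle's command and control, plans its own steps from the current mission state, and delegates to child units. The unit names, states and steps below are assumptions for illustration, not the PEM flight software.

    ```python
    # Sketch of hierarchical functional units that plan and execute their own steps.
    # All names, states and steps are illustrative.
    class FunctionalUnit:
        def __init__(self, name, children=()):
            self.name, self.children = name, list(children)

        def plan(self, state):
            """Steps this unit would execute for the given mission state."""
            return []

        def execute(self, state):
            for step in self.plan(state):
                print(f"[{self.name}] {step}")
            for child in self.children:            # delegate to sub-units
                child.execute(state)


    class PowerUnit(FunctionalUnit):
        def plan(self, state):
            return ["shed non-critical loads"] if state["battery_pct"] < 30 else ["monitor buses"]


    class CommUnit(FunctionalUnit):
        def plan(self, state):
            return ["point high-gain antenna"] if state["ground_pass"] else ["store telemetry"]


    vehicle = FunctionalUnit("vehicle", [PowerUnit("power"), CommUnit("comm")])
    vehicle.execute({"battery_pct": 25, "ground_pass": True})
    ```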

  12. Microcomponent sheet architecture

    DOEpatents

    Wegeng, R.S.; Drost, M.K..; McDonald, C.E.

    1997-03-18

    The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation. 14 figs.

  13. Microcomponent sheet architecture

    DOEpatents

    Wegeng, Robert S.; Drost, M. Kevin; McDonald, Carolyn E.

    1997-01-01

    The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation.

  14. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  15. RASSP signal processing architectures

    NASA Astrophysics Data System (ADS)

    Shirley, Fred; Bassett, Bob; Letellier, J. P.

    1995-06-01

    The rapid prototyping of application specific signal processors (RASSP) program is an ARPA/tri-service effort to dramatically improve the process by which complex digital systems, particularly embedded signal processors, are specified, designed, documented, manufactured, and supported. The domain of embedded signal processing was chosen because it is important to a variety of military and commercial applications as well as for the challenge it presents in terms of complexity and performance demands. The principal effort is being performed by two major contractors, Lockheed Sanders (Nashua, NH) and Martin Marietta (Camden, NJ). For both, improvements in methodology are to be exercised and refined through the performance of individual 'Demonstration' efforts. The Lockheed Sanders' Demonstration effort is to develop an infrared search and track (IRST) processor. In addition, both contractors' results are being measured by a series of externally administered (by Lincoln Labs) six-month Benchmark programs that measure process improvement as a function of time. The first two Benchmark programs are designing and implementing a synthetic aperture radar (SAR) processor. Our demonstration team is using commercially available VME modules from Mercury Computer to assemble a multiprocessor system scalable from one to hundreds of Intel i860 microprocessors. Custom modules for the sensor interface and display driver are also being developed. This system implements either proprietary or Navy owned algorithms to perform the compute-intensive IRST function in real time in an avionics environment. Our Benchmark team is designing custom modules using commercially available processor chip sets, communication submodules, and reconfigurable logic devices. One of the modules contains multiple vector processors optimized for fast Fourier transform processing. Another module is a fiberoptic interface that accepts high-rate input data from the sensors and provides video-rate output data to a

  16. Robot Electronics Architecture

    NASA Technical Reports Server (NTRS)

    Garrett, Michael; Magnone, Lee; Aghazarian, Hrand; Baumgartner, Eric; Kennedy, Brett

    2008-01-01

    An electronics architecture has been developed to enable the rapid construction and testing of prototypes of robotic systems. This architecture is designed to be a research vehicle of great stability, reliability, and versatility. A system according to this architecture can easily be reconfigured (including expanded or contracted) to satisfy a variety of needs with respect to input, output, processing of data, sensing, actuation, and power. The architecture affords a variety of expandable input/output options that enable ready integration of instruments, actuators, sensors, and other devices as independent modular units. The separation of different electrical functions onto independent circuit boards facilitates the development of corresponding simple and modular software interfaces. As a result, both hardware and software can be made to expand or contract in modular fashion while expending a minimum of time and effort.

  17. Utilizing Rapid Prototyping for Architectural Modeling

    ERIC Educational Resources Information Center

    Kirton, E. F.; Lavoie, S. D.

    2006-01-01

    This paper will discuss our approach to, success with and future direction in rapid prototyping for architectural modeling. The premise that this emerging technology has broad and exciting applications in the building design and construction industry will be supported by visual and physical evidence. This evidence will be presented in the form of…

  18. Network architecture functional description and design

    SciTech Connect

    Stans, L.; Bencoe, M.; Brown, D.; Kelly, S.; Pierson, L.; Schaldach, C.

    1989-05-25

    This report provides a top level functional description and design for the development and implementation of the central network to support the next generation of SNL, Albuquerque supercomputers in a UNIX® environment. It describes the network functions and provides an architecture and topology.

  19. A Practical Software Architecture for Virtual Universities

    ERIC Educational Resources Information Center

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  20. Beethoven: architecture for media telephony

    NASA Astrophysics Data System (ADS)

    Keskinarkaus, Anja; Ohtonen, Timo; Sauvola, Jaakko J.

    1999-11-01

    This paper presents a new architecture and techniques for media-based telephony over wireless/wireline IP networks, called 'Beethoven'. The platform supports complex media transport and mobile conferencing for multi-user environments with non-uniform access. New techniques are presented to provide advanced multimedia call management over different media types and their presentation. The routing and distribution of the media is rendered over standards-based protocols. Our approach offers a generic, distributed and object-oriented solution with interfaces where signal processing and unified messaging algorithms are embedded as instances of core classes. The platform services are divided into 'basic communication', 'conferencing' and 'media session'. The basic communication services form the platform core and support access from a scalable user interface to network end-points. Conferencing services take care of media filter adaptation, conversion, error resiliency, multi-party connection and event signaling, while the media session services offer resources for application-level communication between the terminals. The platform allows flexible attachment of any number of plug-in modules, and thus we use it as a test bench for multiparty/multi-point conferencing and as an evaluation bench for signal coding algorithms. In tests, our architecture showed the ability to be easily scaled from a simple voice terminal to a complex multi-user conference sharing virtual data.

  1. Message Bus Architectures - Simplicity in the Right Places

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2010-01-01

    There will always be a new latest and greatest architecture for satellite ground systems. This paper discusses the use of a proven message-oriented middleware (MOM) architecture using publish/subscribe functions and the strengths it brings to these mission critical systems. An even newer approach gaining popularity is Service Oriented Architectures (SOAs). SOAs are generally considered more powerful than the MOM approach and address many mission-critical system challenges. A MOM vs SOA discussion can highlight capabilities supported or enabled by the underlying architecture and can identify benefits of MOMs and SOAs when applied to differing sets of mission requirements or evaluation criteria.
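
    The publish/subscribe pattern at the heart of the MOM approach discussed above is simple to sketch in-process; a real ground system would of course use a distributed broker, and the topic names and payload below are illustrative assumptions.

    ```python
    # Minimal in-process sketch of publish/subscribe messaging. Topics and
    # payloads are illustrative; a real system would use a distributed broker.
    from collections import defaultdict


    class MessageBus:
        def __init__(self):
            self._subscribers = defaultdict(list)     # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:  # fan out to every subscriber
                callback(message)


    bus = MessageBus()
    bus.subscribe("telemetry.power", lambda m: print("display:", m))
    bus.subscribe("telemetry.power", lambda m: print("archiver:", m))
    bus.publish("telemetry.power", {"bus_voltage": 28.1})
    ```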

  2. Evaluating science return in space exploration initiative architectures

    NASA Astrophysics Data System (ADS)

    Budden, Nancy Ann; Spudis, Paul D.

    1993-03-01

    Science is an important aspect of the Space Exploration Initiative, a program to explore the Moon and Mars with people and machines. Different SEI mission architectures are evaluated on the basis of three variables: access (to the planet's surface), capability (including number of crew, equipment, and supporting infrastructure), and time (being the total number of man-hours available for scientific activities). This technique allows us to estimate the scientific return to be expected from different architectures and from different implementations of the same architecture. Our methodology allows us to maximize the scientific return from the initiative by illuminating the different emphases and returns that result from the alternative architectural decisions.

  3. Evaluating science return in space exploration initiative architectures

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann; Spudis, Paul D.

    1993-01-01

    Science is an important aspect of the Space Exploration Initiative, a program to explore the Moon and Mars with people and machines. Different SEI mission architectures are evaluated on the basis of three variables: access (to the planet's surface), capability (including number of crew, equipment, and supporting infrastructure), and time (being the total number of man-hours available for scientific activities). This technique allows us to estimate the scientific return to be expected from different architectures and from different implementations of the same architecture. Our methodology allows us to maximize the scientific return from the initiative by illuminating the different emphases and returns that result from the alternative architectural decisions.

  4. Specifying structural constraints of architectural patterns in the ARCHERY language

    SciTech Connect

    Sanchez, Alejandro; Barbosa, Luis S.; Riesco, Daniel

    2015-03-10

    ARCHERY is an architectural description language for modelling and reasoning about distributed, heterogeneous and dynamically reconfigurable systems in terms of architectural patterns. The language supports the specification of architectures and their reconfiguration. This paper introduces a language extension for precisely describing the structural design decisions that pattern instances must respect in their (re)configurations. The extension is a propositional modal logic with recursion and nominals referencing components, i.e., a hybrid µ-calculus. Its expressiveness allows specifying safety and liveness constraints, as well as paths and cycles over structures. Refinements of classic architectural patterns are specified.
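
    For readers unfamiliar with the notation, the formulas below are generic hybrid µ-calculus idioms rather than examples taken from the ARCHERY extension itself; the relation name attach and the nominal server are illustrative assumptions. They show how safety and liveness constraints over a pattern's connectivity structure are typically phrased.

```latex
% Safety (illustrative): along every attach-path, no component is ever
% in a dangling/ill-formed state.
\nu X.\, \big( \neg \mathit{dangling} \;\wedge\; [\mathit{attach}]\, X \big)

% Liveness (illustrative): some attach-path from the current component
% eventually reaches the component named by the nominal "server".
\mu X.\, \big( \mathit{server} \;\vee\; \langle \mathit{attach} \rangle\, X \big)
```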

  5. Technology architecture guidelines for a health care system.

    PubMed

    Jones, D T; Duncan, R; Langberg, M L; Shabot, M M

    2000-01-01

    Although the demand for use of information technology within the healthcare industry is intensifying, relatively little has been written about guidelines to optimize IT investments. A technology architecture is a set of guidelines for technology integration within an enterprise. The architecture is a critical tool in the effort to control information technology (IT) operating costs by constraining the number of technologies supported. A well-designed architecture is also an important aid to integrating disparate applications, data stores and networks. The authors led the development of a thorough, carefully designed technology architecture for a large and rapidly growing health care system. The purpose and design criteria are described, as well as the process for gaining consensus and disseminating the architecture. In addition, the processes for using, maintaining, and handling exceptions are described. The technology architecture is extremely valuable to health care organizations both in controlling costs and promoting integration.

  6. Neural Architectures for Control

    NASA Technical Reports Server (NTRS)

    Peterson, James K.

    1991-01-01

    The cerebellar model articulated controller (CMAC) neural architectures are shown to be viable for the purposes of real-time learning and control. Software tools for the exploration of CMAC performance are developed for three hardware platforms, the Macintosh, the IBM PC, and the SUN workstation. All algorithm development was done using the C programming language. These software tools were then used to implement an adaptive critic neuro-control design that learns in real-time how to back up a trailer truck. The truck backer-upper experiment is a standard performance measure in the neural network literature, but previously the training of the controllers was done off-line. With the CMAC neural architectures, it was possible to train the neuro-controllers on-line in real-time on an MS-DOS PC 386. CMAC neural architectures are also used in conjunction with a hierarchical planning approach to find collision-free paths over 2-D analog-valued obstacle fields. The method constructs a coarse-resolution version of the original problem and then finds the corresponding coarse optimal path using multipass dynamic programming. CMAC artificial neural architectures are used to estimate the analog transition costs that dynamic programming requires. The CMAC architectures are trained in real-time for each obstacle field presented. The coarse optimal path is then used as a baseline for the construction of a fine-scale optimal path through the original obstacle array. These results are a very good indication of the potential power of the neural architectures in control design. In order to reach as wide an audience as possible, we have run a seminar on neuro-control that has met once per week since 20 May 1991. This seminar has thoroughly discussed the CMAC architecture, relevant portions of classical control, back propagation through time, and adaptive critic designs.
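
    As a concrete illustration of the tile-coding idea behind CMAC, the sketch below is a minimal one-dimensional CMAC trained with the delta rule; it is not the controller described above, and the tiling sizes, learning rate, and sine target are assumptions chosen only for the demo.

```python
import numpy as np

class CMAC:
    """Minimal 1-D CMAC (tile coding) function approximator."""
    def __init__(self, n_tilings=8, n_tiles=16, x_min=0.0, x_max=1.0, lr=0.1):
        self.n_tilings, self.n_tiles = n_tilings, n_tiles
        self.x_min, self.width = x_min, (x_max - x_min) / n_tiles
        self.lr = lr
        self.w = np.zeros((n_tilings, n_tiles + 1))  # one weight table per tiling

    def _active_tiles(self, x):
        # each tiling is shifted by a fraction of a tile width
        return [min(max(int((x - self.x_min + t * self.width / self.n_tilings)
                            / self.width), 0), self.n_tiles)
                for t in range(self.n_tilings)]

    def predict(self, x):
        return sum(self.w[t, i] for t, i in enumerate(self._active_tiles(x)))

    def update(self, x, target):
        # delta rule: spread the error evenly over the active tiles
        err = target - self.predict(x)
        for t, i in enumerate(self._active_tiles(x)):
            self.w[t, i] += self.lr * err / self.n_tilings

# on-line training to approximate sin(2*pi*x) on [0, 1]
cmac, rng = CMAC(), np.random.default_rng(0)
for _ in range(5000):
    x = rng.random()
    cmac.update(x, np.sin(2 * np.pi * x))
print(round(cmac.predict(0.25), 2))  # close to 1.0
```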

  7. Evolution and development of inflorescence architectures.

    PubMed

    Prusinkiewicz, Przemyslaw; Erasmus, Yvette; Lane, Brendan; Harder, Lawrence D; Coen, Enrico

    2007-06-01

    To understand the constraints on biological diversity, we analyzed how selection and development interact to control the evolution of inflorescences, the branching structures that bear flowers. We show that a single developmental model accounts for the restricted range of inflorescence types observed in nature and that this model is supported by molecular genetic studies. The model predicts associations between inflorescence architecture, climate, and life history, which we validated empirically. Paths, or evolutionary wormholes, link different architectures in a multidimensional fitness space, but the rate of evolution along these paths is constrained by genetic and environmental factors, which explains why some evolutionary transitions are rare between closely related plant taxa.

  8. Unconventional Architectures for High-Throughput Sciences

    SciTech Connect

    Nieplocha, Jarek; Marquez, Andres; Petrini, Fabrizio; Chavarría-Miranda, Daniel

    2007-06-15

    Science laboratories and sophisticated simulations are producing data of increasing volumes and complexities, and that’s posing significant challenges to current data infrastructures as terabytes to petabytes of data must be processed and analyzed. Traditional computing platforms, originally designed to support model-driven applications, are unable to meet the demands of the data-intensive scientific applications. Pacific Northwest National Laboratory (PNNL) research goes beyond “traditional supercomputing” applications to address emerging problems that need scalable, real-time solutions. The outcome is new unconventional architectures for data-intensive applications specifically designed to process the deluge of scientific data, including FPGAs, multithreaded architectures and IBM's Cell.

  9. Advanced HF anti-jam network architecture

    NASA Astrophysics Data System (ADS)

    Jackson, E. M.; Horner, Robert W.; Cai, Khiem V.

    The Hughes HF2000 system was developed using a flexible architecture which utilizes a wideband RF front-end and extensive digital signal processing. The HF2000 antijamming (AJ) mode was field tested via an HF skywave path between Fullerton, CA and Carlsbad, CA (about 100 miles), and it was shown that reliable fast frequency-hopping data transmission is feasible at 2400 b/s without adaptive equalization. The necessary requirements of an HF communication network are discussed, and how the HF2000 AJ mode can be used to support those requirements is shown. The Hughes HF2000 AJ mode system architecture is presented.

  10. Space station needs, attributes and architectural options: Midterm main briefing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Space station missions, their requirements, and architectural solutions are presented. Analyses of the following five mission categories are summarized: (1) science/applications, (2) commercial, (3) national security, (4) operational support, and (5) technology development.

  11. The architectural relevance of cybernetics

    SciTech Connect

    Frazer, J.H.

    1993-12-31

    This title is taken from an article by Gordon Pask in Architectural Design September 1969. It raises a number of questions which this article attempts to answer. How did Gordon come to be writing for an architectural publication? What was his contribution to architecture? How does he now come to be on the faculty of a school of architecture? And what indeed is the architectural relevance of cybernetics? 12 refs.

  12. Using an Integrated Distributed Test Architecture to Develop an Architecture for Mars

    NASA Technical Reports Server (NTRS)

    Othon, William L.

    2016-01-01

    The creation of a crew-rated spacecraft architecture capable of sending humans to Mars requires the development and integration of multiple vehicle systems and subsystems. Important new technologies will be identified and matured within each technical discipline to support the mission. Architecture maturity also requires coordination with mission operations elements and ground infrastructure. During early architecture formulation, many of these assets will not be co-located and will require integrated, distributed testing to show that the technologies and systems are being developed in a coordinated way. When complete, technologies must be shown to function together to achieve mission goals. In this presentation, an architecture will be described that promotes and advances integration of disparate systems within JSC and across NASA centers.

  13. Data-driven parallel architecture for syntactic pattern recognition

    NASA Astrophysics Data System (ADS)

    Tseng, Chien-Chao; Hwang, Shu-Yuen

    1991-02-01

    Syntax analysis is the primary operation of a Syntactic Pattern Recognition (SPR) system. A real-time SPR system would require efficient architectural support for syntax analysis. The process of syntax analysis and the execution of a logic program are closely related. In this paper we propose a data-driven parallel architecture for syntax analysis based on the principle of parallel execution of logic programs. The proposed architecture is hybrid in the sense that its functional units, unlike those in the traditional fine-grain dataflow model, are coarse-grain macro operators capable of performing unification operations. The scheme for compiling the dataflow graphs eliminates the necessity of any operand matching unit in the data-driven architecture. All memory requests are tagged with register identification (similar to the IBM 360/91) to provide efficient hardware support for context switching. The experimental results indicate that the proposed architecture is promising.

  14. The social architecture of capitalism

    NASA Astrophysics Data System (ADS)

    Wright, Ian

    2005-02-01

    A dynamic model of the social relations between workers and capitalists is introduced. The model self-organises into a dynamic equilibrium with statistical properties that are in close qualitative and in many cases quantitative agreement with a broad range of known empirical distributions of developed capitalism, including the power-law firm size distribution, the Laplace firm and GDP growth distribution, the lognormal firm demises distribution, the exponential recession duration distribution, the lognormal-Pareto income distribution, and the gamma-like firm rate-of-profit distribution. Normally these distributions are studied in isolation, but this model unifies and connects them within a single causal framework. The model also generates business cycle phenomena, including fluctuating wage and profit shares in national income about values consistent with empirical studies. The generation of an approximately lognormal-Pareto income distribution and an exponential-Pareto wealth distribution demonstrates that the power-law regime of the income distribution can be explained by an additive process on a power-law network that models the social relation between employers and employees organised in firms, rather than a multiplicative process that models returns to investment in financial markets. A testable consequence of the model is the conjecture that the rate-of-profit distribution is consistent with a parameter-mix of a ratio of normal variates with means and variances that depend on a firm size parameter that is distributed according to a power-law.
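
    As a minimal numerical illustration of one of the regularities mentioned above, the sketch below draws firm sizes from a discrete power law and recovers the tail exponent with a rough Hill-type estimate; the exponent 2.0 and sample sizes are assumptions for the demo, not parameters taken from the model.

```python
import numpy as np

rng = np.random.default_rng(42)
sizes = rng.zipf(a=2.0, size=100_000)       # synthetic power-law firm sizes

# rough Hill estimate of the tail exponent over the largest observations
tail = np.sort(sizes)[-1000:].astype(float)
alpha_hat = 1.0 + len(tail) / np.sum(np.log(tail / tail.min()))
print(f"estimated tail exponent: {alpha_hat:.2f}")   # roughly 2
```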

  15. Agent Architectures for Compliance

    NASA Astrophysics Data System (ADS)

    Burgemeestre, Brigitte; Hulstijn, Joris; Tan, Yao-Hua

    A Normative Multi-Agent System consists of autonomous agents who must comply with social norms. Different kinds of norms make different assumptions about the cognitive architecture of the agents. For example, a principle-based norm assumes that agents can reflect upon the consequences of their actions; a rule-based formulation only assumes that agents can avoid violations. In this paper we present several cognitive agent architectures for self-monitoring and compliance. We show how different assumptions about the cognitive architecture lead to different information needs when assessing compliance. The approach is validated with a case study of horizontal monitoring, an approach to corporate tax auditing recently introduced by the Dutch Customs and Tax Authority.

  16. Protein domain architectures.

    PubMed

    Mulder, Nicola J

    2010-01-01

    Proteins are composed of functional units, or domains, that can be found alone or in combination with other domains. Analysis of protein domain architectures and the movement of protein domains within and across different genomes provide clues about the evolution of protein function. The classification of proteins into families and domains is provided through publicly available tools and databases that use known protein domains to predict other members in new protein sequences. Currently at least 80% of the main protein sequence databases can be classified using these tools, thus providing a large data set to work from for analyzing protein domain architectures. Each of the protein domain databases provides an intuitive web interface for viewing and analyzing its domain classifications and provides its data freely for download. Some of the main protein family and domain databases are described here, along with their Web-based tools for analyzing domain architectures.

  17. Nanorobot architecture for medical target identification

    NASA Astrophysics Data System (ADS)

    Cavalcanti, Adriano; Shirinzadeh, Bijan; Freitas, Robert A., Jr.; Hogg, Tad

    2008-01-01

    This work has an innovative approach for the development of nanorobots with sensors for medicine. The nanorobots operate in a virtual environment comparing random, thermal and chemical control techniques. The nanorobot architecture model has nanobioelectronics as the basis for manufacturing integrated system devices with embedded nanobiosensors and actuators, which facilitates its application for medical target identification and drug delivery. The nanorobot interaction with the described workspace shows how time actuation is improved based on sensor capabilities. Therefore, our work addresses the control and the architecture design for developing practical molecular machines. Advances in nanotechnology are enabling manufacturing nanosensors and actuators through nanobioelectronics and biologically inspired devices. Analysis of integrated system modeling is one important aspect for supporting nanotechnology in the fast development towards one of the most challenging new fields of science: molecular machines. The use of 3D simulation can provide interactive tools for addressing nanorobot choices on sensing, hardware architecture design, manufacturing approaches, and control methodology investigation.

  18. Exploration Architecture Options - ECLSS, EVA, TCS Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don; Lawrence, Carl

    2010-01-01

    Many options for exploration of space have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to and then inhabit and explore the Moon. The Augustine Commission evaluated human space flight for the Obama administration and identified many options for how to conduct human spaceflight in the future. This paper evaluates the options for exploration of space in terms of the implications of each architecture for the Environmental Control and Life Support System (ECLSS), ExtraVehicular Activity (EVA) and Thermal Control System (TCS). The advantages and disadvantages of each architecture and its options are presented.

  19. Exploration Architecture Options - ECLSS, EVA, TCS Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don; Lawrence, Carl

    2009-01-01

    Many options for exploration of the Moon and Mars have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to and then inhabit and explore the Moon. The Augustine Commission evaluated human space flight for the Obama administration and identified many options for how to conduct human spaceflight in the future. This paper evaluates the options for exploration of the Moon and Mars, and those of the Augustine human spaceflight commission, in terms of the implications of each architecture for the Environmental Control and Life Support, ExtraVehicular Activity and Thermal Control systems. The advantages and disadvantages of each architecture and its options are presented.

  20. TALOS: A distributed architecture for intelligent monitoring and anomaly diagnosis of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Cruse, Bryant G.

    1988-01-01

    Lockheed, the Hubble Space Telescope Mission Operations Contractor, is currently engaged in a project to develop a distributed architecture of communicating expert systems to support vehicle operations. This architecture, named Telemetry Analysis Logic for Operating Spacecraft (TALOS), has the potential for wide applicability in spacecraft operations. The architecture mirrors the organization of the human experts within an operations control center.

  1. Information architecture. Volume 3: Guidance

    SciTech Connect

    1997-04-01

    The purpose of this document, as presented in Volume 1, The Foundations, is to assist the Department of Energy (DOE) in developing and promulgating information architecture guidance. This guidance is aimed at increasing the development of information architecture as a Departmentwide management best practice. This document describes departmental information architecture principles and minimum design characteristics for systems and infrastructures within the DOE Information Architecture Conceptual Model, and establishes a Departmentwide standards-based architecture program. The publication of this document fulfills the commitment to address guiding principles, promote standard architectural practices, and provide technical guidance. This document guides the transition from the baseline or de facto Departmental architecture through approved information management program plans and budgets to the future vision architecture. This document also represents another major step toward establishing a well-organized, logical foundation for the DOE information architecture.

  2. Hybrid polarity SAR architecture

    NASA Astrophysics Data System (ADS)

    Raney, R. Keith

    2009-05-01

    A space-based synthetic aperture radar (SAR) designed to provide quantitative information on a global scale implies severe requirements to maximize coverage and to sustain reliable operational calibration. These requirements are best served by the hybrid-polarity architecture, in which the radar transmits in circular polarization, and receives on two orthogonal linear polarizations, coherently, retaining their relative phase. This paper summarizes key attributes of hybrid-polarity dual- and quadrature-polarized SARs, reviews the associated advantages, formalizes conditions under which the signal-to-noise ratio is conserved, and describes the evolution of this architecture from first principles.
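
    As a small worked sketch of the quantities a hybrid-polarity receiver delivers (variable names and data are assumptions, and sign conventions for S3 vary between references), the Stokes parameters below are formed from the two coherent linear receive channels and used to compute the degree of polarization.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic coherent receive channels (horizontal/vertical voltages)
E_h = rng.normal(size=1000) + 1j * rng.normal(size=1000)
E_v = rng.normal(size=1000) + 1j * rng.normal(size=1000)

# first-order Stokes parameters averaged over the samples
S0 = np.mean(np.abs(E_h) ** 2 + np.abs(E_v) ** 2)
S1 = np.mean(np.abs(E_h) ** 2 - np.abs(E_v) ** 2)
S2 = np.mean(2.0 * np.real(E_h * np.conj(E_v)))
S3 = np.mean(-2.0 * np.imag(E_h * np.conj(E_v)))

# degree of polarization, a basic quality/calibration check
m = np.sqrt(S1**2 + S2**2 + S3**2) / S0
print(round(m, 3))   # near 0 for this unpolarized synthetic noise
```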

  3. 3D Architectural Videomapping

    NASA Astrophysics Data System (ADS)

    Catanese, R.

    2013-07-01

    3D architectural mapping is a video projection technique that starts from a survey of the chosen building in order to achieve a perfect correspondence between its shapes and the projected images. As a performative kind of audiovisual artifact, the live event of 3D mapping combines a pre-registered video animation file with the real architecture. This new kind of visual art is becoming very popular, and its success with large audiences testifies to new expressive possibilities in the field of urban design. The case study presented here was carried out in Pisa for the Luminara feast in 2012.

  4. A component simulator architecture

    NASA Astrophysics Data System (ADS)

    Bégin, M.-E.; Walsh, T.

    2002-07-01

    This paper describes the current state of our new component simulator architecture. This design is being developed at VEGA GmbH, by the Technology Group, within the Space Business Unit. This paper describes our overall component architecture and attempts to explain how it can be used by model developers and end-users. At the time of writing, it appears clear that a certain level of automation is required to increase the usability of the system. This automation is only briefly discussed here.

  5. National Positioning, Navigation, and Timing Architecture Study

    NASA Astrophysics Data System (ADS)

    van Dyke, K.; Vicario, J.; Hothem, L.

    2007-12-01

    The purpose of the National Positioning, Navigation and Timing (PNT) Architecture effort is to help guide future PNT system-of-systems investment and implementation decisions. The Assistant Secretary of Defense for Networks and Information Integration and the Under Secretary of Transportation for Policy sponsored a National PNT Architecture study to provide more effective and efficient PNT capabilities focused on the 2025 timeframe and an evolutionary path for government provided systems and services. U.S. Space-Based PNT Policy states that the U.S. must continue to improve and maintain GPS, augmentations to GPS, and back-up capabilities to meet growing national, homeland, and economic security needs. PNT touches almost every aspect of people's lives today. PNT is essential for Defense and Civilian applications ranging from the Department of Defense's Joint network centric and precision operations to the transportation and telecommunications sectors, improving efficiency, increasing safety, and being more productive. Absence of an approved PNT architecture results in uncoordinated research efforts, lack of clear developmental paths, potentially wasteful procurements and inefficient deployment of PNT resources. The national PNT architecture effort evaluated alternative future mixes of global (space and non space-based) and regional PNT solutions, PNT augmentations, and autonomous PNT capabilities to address priorities identified in the DoD PNT Joint Capabilities Document (JCD) and civil equivalents. The path to achieving the Should-Be architecture is described by the National PNT Architecture's Guiding Principles, representing an overarching Vision of the US' role in PNT, an architectural Strategy to fulfill that Vision, and four Vectors which support the Strategy. The National PNT Architecture effort has developed nineteen recommendations. Five foundational recommendations are tied directly to the Strategy while the remaining fourteen individually support one of

  6. Reference Avionics Architecture for Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Somervill, Kevin M.; Lapin, Jonathan C.; Schmidt, Oron L.

    2010-01-01

    Developing and delivering infrastructure capable of supporting long-term manned operations to the lunar surface has been a primary objective of the Constellation Program in the Exploration Systems Mission Directorate. Several concepts have been developed related to the development and deployment of lunar exploration vehicles and assets that provide critical functionality such as transportation, habitation, and communication, to name a few. Together, these systems perform complex safety-critical functions, largely dependent on avionics for control and behavior of system functions. These functions are implemented using interchangeable, modular avionics designed for lunar transit and lunar surface deployment. Systems are optimized towards reuse and commonality of form and interface and can be configured via software or component integration for special purpose applications. There are two core concepts in the reference avionics architecture described in this report. The first concept uses distributed, smart systems to manage complexity, simplify integration, and facilitate commonality. The second core concept is to employ extensive commonality between elements and subsystems. These two concepts are used in the context of developing reference designs for many lunar surface exploration vehicles and elements. These concepts are repeated constantly as architectural patterns in a conceptual architectural framework. This report describes the use of these architectural patterns in a reference avionics architecture for Lunar surface systems elements.

  7. Service connectivity architecture for mobile augmented reality

    NASA Astrophysics Data System (ADS)

    Turunen, Tuukka; Pyssysalo, Tino; Roening, Juha

    2001-06-01

    Mobile augmented reality can be utilized in a number of different services, and it provides a lot of added value compared to the interfaces used in mobile multimedia today. Intelligent service connectivity architecture is needed for the emerging commercial mobile augmented reality services, to guarantee mobility and interoperability on a global scale. Some of the key responsibilities of this architecture are to find suitable service providers, to manage the connection with and utilization of such providers, and to allow smooth switching between them whenever the user moves out of the service area of the service provider she is currently connected to. We have studied the potential support technologies for such architectures and propose a way to create an intelligent service connectivity architecture based on current and upcoming wireless networks, an Internet backbone, and mechanisms to manage service connectivity in the upper layers of the protocol stack. In this paper, we explain the key issues of service connectivity, describe the properties of our architecture, and analyze the functionality of an example system. Based on these, we consider our proposition a good solution to the quest for global interoperability in mobile augmented reality services.

  8. Kernel methods for phenotyping complex plant architecture.

    PubMed

    Kawamura, Koji; Hibrand-Saint Oyant, Laurence; Foucher, Fabrice; Thouroude, Tatiana; Loustau, Sébastien

    2014-02-01

    The Quantitative Trait Loci (QTL) mapping of plant architecture is a critical step for understanding the genetic determinism of plant architecture. Previous studies adopted simple measurements, such as plant-height, stem-diameter and branching-intensity for QTL mapping of plant architecture. Many of these quantitative traits were generally correlated to each other, which gives rise to statistical problems in the detection of QTL. We aim to test the applicability of kernel methods to phenotyping inflorescence architecture and its QTL mapping. We first test Kernel Principal Component Analysis (KPCA) and Support Vector Machines (SVM) over an artificial dataset of simulated inflorescences with different types of flower distribution, which is coded as a sequence of flower-number per node along a shoot. The ability of discriminating the different inflorescence types by SVM and KPCA is illustrated. We then apply the KPCA representation to the real dataset of rose inflorescence shoots (n=1460) obtained from a 98 F1 hybrid mapping population. We find kernel principal components with high heritability (>0.7), and the QTL analysis identifies a new QTL, which was not detected by a trait-by-trait analysis of simple architectural measurements. The main tools developed in this paper could be used to tackle the general problem of QTL mapping of complex (sequences, 3D structure, graphs) phenotypic traits.
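
    The snippet below is a hedged sketch of the analysis pipeline described above, run on synthetic data rather than the rose dataset: shoots are coded as fixed-length sequences of flower counts per node, kernel principal components are extracted with an RBF kernel, and a support vector classifier separates two artificial inflorescence types. All parameter values are assumptions for the demo.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, length = 200, 20
# two artificial "inflorescence types": flowers concentrated at the tip vs. the base
tip = rng.poisson(lam=np.linspace(0.2, 3.0, length), size=(n // 2, length))
base = rng.poisson(lam=np.linspace(3.0, 0.2, length), size=(n // 2, length))
X = np.vstack([tip, base]).astype(float)
y = np.array([0] * (n // 2) + [1] * (n // 2))

# kernel principal components used as quantitative architecture traits
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1)
Z = kpca.fit_transform(X)

# discriminate the two types from the kernel components
clf = SVC(kernel="rbf", gamma="scale").fit(Z, y)
print(f"training accuracy: {clf.score(Z, y):.2f}")
```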

  9. NASA CEV Reference GN&C Architecture

    NASA Technical Reports Server (NTRS)

    Tamblyn, Scott; Hinkel, Heather; Saley, Dave

    2007-01-01

    The Orion Crew Exploration Vehicle (CEV) will be the first human spacecraft built by NASA in almost 3 decades and will be the first vehicle to perform both Low Earth Orbit (LEO) missions and lunar missions since Apollo. The awesome challenge of designing a Guidance, Navigation, and Control (GN&C) system for this vehicle that satisfies all of its various mission requirements is countered by the opportunity to take advantage of the improvements in algorithms, software, sensors, and other related GN&C technology over this period. This paper describes the CEV GN&C reference architecture developed to support the overall NASA reference configuration and validate the driving requirements of the Constellation (Cx) Architecture Requirements Document (CARD, Reference 1) and the CEV System Requirements Document (SRD, Reference 2). The Orion GN&C team designed the reference architecture based on the functional allocation of GN&C roles and responsibilities of CEV with respect to the other Cx vehicles, such as the Crew Launch Vehicle (CLV), Earth Departure Stage (EDS), and Lunar Surface Area Module (LSAM), across all flight phases. The specific challenges and responsibilities of the CEV GN&C system from launch pad to touchdown will be introduced along with an overview of the navigation sensor suite, its redundancy management, and flight software (FSW) architecture. Sensors will be discussed in terms of range of operation, data utility within the navigation system, and rationale for selection. The software architecture is illustrated via block diagrams, commensurate with the design aspects.

  10. Generic Software Architecture for Launchers

    NASA Astrophysics Data System (ADS)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of a generic software architecture for launchers is uncommon, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). While some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of a generic software architecture which could be envisaged for future launchers, based on the previously described principles and supported by model-driven engineering and automatic code generation.

  11. Hadl: HUMS Architectural Description Language

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.; Adavi, V.; Agarwal, N.; Gullapalli, S.; Kumar, P.; Sundaram, P.

    2004-01-01

    Specification of architectures is an important prerequisite for the evaluation of architectures. With the growth of health usage and monitoring systems (HUMS) in commercial and military domains, the need for the design and evaluation of HUMS architectures has also been increasing. In this paper, we describe HADL, the HUMS Architectural Description Language, which we have designed for this purpose. In particular, we describe the features of the language, illustrate them with examples, and show how we use it in designing domain-specific HUMS architectures. A companion paper contains details on our design methodology for HUMS architectures.

  12. American School & University Architectural Portfolio 2000 Awards: Landscape Architecture.

    ERIC Educational Resources Information Center

    American School & University, 2000

    2000-01-01

    Presents photographs and basic information on architectural design, costs, square footage, and principal designers of the award-winning school landscaping projects that competed in the American School & University Architectural Portfolio 2000. (GR)

  13. Geostar's system architectures

    NASA Technical Reports Server (NTRS)

    Lepkowski, Ronald J.

    1989-01-01

    Geostar is currently constructing a radiodetermination satellite system to provide position fixes and vehicle surveillance services, and has proposed a digital land mobile satellite service to provide data, facsimile and digitized voice services to low-cost mobile users. The different system architectures for these two systems are reviewed.

  14. INL Generic Robot Architecture

    SciTech Connect

    2005-03-30

    The INL Generic Robot Architecture is a generic, extensible software framework that can be applied across a variety of different robot geometries, sensor suites and low-level proprietary control application programming interfaces (e.g. mobility, aria, aware, player, etc.).

  15. Emulating an MIMD architecture

    SciTech Connect

    Su Bogong; Grishman, R.

    1982-01-01

    As part of a research effort in parallel processor architecture and programming, the ultracomputer group at New York University has performed extensive simulation of parallel programs. To speed up these simulations, a parallel processor emulator, using the microprogrammable Puma computer system previously designed and built at NYU, has been developed. 8 references.

  16. [Architecture, budget and dignity].

    PubMed

    Morel, Etienne

    2012-01-01

    Drawing on its dynamic strengths, a psychiatric unit develops various projects and care techniques. In this framework, the institute director must make a number of choices with regard to architecture. Why renovate the psychiatry building? What financial investments are required? What criteria should be followed? What if the major argument was based on the respect of the patient's dignity?

  17. [Architecture and movement].

    PubMed

    Rivallan, Armel

    2012-01-01

    Leading an architectural project means accompanying the movement which it induces within the teams. Between questioning, uncertainty and fear, the organisational changes inherent to the new facility must be subject to constructive and ongoing exchanges. Ethics, safety and training are revised and the unit projects are sometimes modified.

  18. Making Connections through Architecture.

    ERIC Educational Resources Information Center

    Hollingsworth, Patricia

    1993-01-01

    The Center for Arts and Sciences (Oklahoma) developed an interdisciplinary curriculum for disadvantaged gifted children on styles of architecture, called "Discovering Patterns in the Built Environment." This article describes the content and processes used in the curriculum, as well as other programs of the center, such as teacher workshops,…

  19. Tutorial on architectural acoustics

    NASA Astrophysics Data System (ADS)

    Shaw, Neil; Talaske, Rick; Bistafa, Sylvio

    2002-11-01

    This tutorial is intended to provide an overview of current knowledge and practice in architectural acoustics. Topics covered will include basic concepts and history, acoustics of small rooms (small rooms for speech such as classrooms and meeting rooms, music studios, small critical listening spaces such as home theatres) and the acoustics of large rooms (larger assembly halls, auditoria, and performance halls).

  20. GNU debugger internal architecture

    SciTech Connect

    Miller, P.; Nessett, D.; Pizzi, R.

    1993-12-16

    This document describes the internal architecture and implementation of the GNU debugger, gdb. Topics include inferior process management, command execution, symbol table management and remote debugging. Call graphs for specific functions are supplied. This document is not a complete description, but it offers a developer an overview that is the place to start before making modifications.

  1. Modeling Operations Costs for Human Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2013-01-01

    Operations and support (O&S) costs for human spaceflight have not received the same attention in the cost estimating community as have development costs. This is unfortunate as O&S costs typically comprise a majority of life-cycle costs (LCC) in such programs as the International Space Station (ISS) and the now-cancelled Constellation Program. Recognizing this, the Constellation Program and NASA HQs supported the development of an O&S cost model specifically for human spaceflight. This model, known as the Exploration Architectures Operations Cost Model (ExAOCM), provided the operations cost estimates for a variety of alternative human missions to the moon, Mars, and Near-Earth Objects (NEOs) in architectural studies. ExAOCM is philosophically based on the DoD Architecture Framework (DoDAF) concepts of operational nodes, systems, operational functions, and milestones. This paper presents some of the historical background surrounding the development of the model, and discusses the underlying structure, its unusual user interface, and lastly, previous examples of its use in the aforementioned architectural studies.

  2. Standardizing the information architecture for spacecraft operations

    NASA Technical Reports Server (NTRS)

    Easton, C. R.

    1994-01-01

    This paper presents an information architecture developed for the Space Station Freedom as a model from which to derive an information architecture standard for advanced spacecraft. The information architecture provides a way of making information available across a program, and among programs, assuming that the information will be in a variety of local formats, structures and representations. It provides a format that can be expanded to define all of the physical and logical elements that make up a program, add definitions as required, and import definitions from prior programs to a new program. It allows a spacecraft and its control center to work in different representations and formats, with the potential for supporting existing spacecraft from new control centers. It supports a common view of data and control of all spacecraft, regardless of their own internal view of their data and control characteristics, and of their communications standards, protocols and formats. This information architecture is central to standardizing spacecraft operations, in that it provides a basis for information transfer and translation, such that diverse spacecraft can be monitored and controlled in a common way.

  3. Next Generation Mass Memory Architecture

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Stahle, M.; Lonsdorfer, U.; Binzer, N.

    2010-08-01

    Future Mass Memory units will have to cope with various demanding requirements driven by onboard instruments (optical and SAR) that generate a huge amount of data (>10 Tbit) at a data rate > 6 Gbps. For the downlink, data rates around 3 Gbps will be feasible using the latest Ka-band technology together with Variable Coding and Modulation (VCM) techniques. These high data rates and storage capacities need to be effectively managed. Therefore, data structures and data management functions have to be improved and adapted to existing standards like the Packet Utilisation Standard (PUS). In this paper we will present a highly modular and scalable architectural approach for mass memories in order to support a wide range of mission requirements.
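
    A back-of-envelope check of the figures quoted above (the numbers are rounded from the abstract, and treating them as sustained rates is our assumption): at 3 Gbps it takes just under an hour to empty a 10 Tbit store, while the instruments can refill it in roughly half that time.

```python
store_bits = 10e12      # 10 Tbit of stored instrument data
downlink_bps = 3e9      # ~3 Gbps Ka-band downlink with VCM
fill_bps = 6e9          # >6 Gbps aggregate instrument data rate

print(f"time to downlink a full store: {store_bits / downlink_bps / 60:.0f} min")
print(f"time to fill the store:        {store_bits / fill_bps / 60:.0f} min")
```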

  4. Shaping plant architecture

    PubMed Central

    Teichmann, Thomas; Muhr, Merlin

    2015-01-01

    Plants exhibit phenotypical plasticity. Their general body plan is genetically determined, but plant architecture and branching patterns are variable and can be adjusted to the prevailing environmental conditions. The modular design of the plant facilitates such morphological adaptations. The prerequisite for the formation of a branch is the initiation of an axillary meristem. Here, we review the current knowledge about this process. After its establishment, the meristem can develop into a bud which can either become dormant or grow out and form a branch. Many endogenous factors, such as photoassimilate availability, and exogenous factors like nutrient availability or shading, have to be integrated in the decision whether a branch is formed. The underlying regulatory network is complex and involves phytohormones and transcription factors. The hormone auxin is derived from the shoot apex and inhibits bud outgrowth indirectly in a process termed apical dominance. Strigolactones appear to modulate apical dominance by modification of auxin fluxes. Furthermore, the transcription factor BRANCHED1 plays a central role. The exact interplay of all these factors still remains obscure and there are alternative models. We discuss recent findings in the field along with the major models. Plant architecture is economically significant because it affects important traits of crop and ornamental plants, as well as trees cultivated in forestry or on short rotation coppices. As a consequence, plant architecture has been modified during plant domestication. Research revealed that only few key genes have been the target of selection during plant domestication and in breeding programs. Here, we discuss such findings on the basis of various examples. Architectural ideotypes that provide advantages for crop plant management and yield are described. We also outline the potential of breeding and biotechnological approaches to further modify and improve plant architecture for economic needs

  5. Shaping plant architecture.

    PubMed

    Teichmann, Thomas; Muhr, Merlin

    2015-01-01

    Plants exhibit phenotypical plasticity. Their general body plan is genetically determined, but plant architecture and branching patterns are variable and can be adjusted to the prevailing environmental conditions. The modular design of the plant facilitates such morphological adaptations. The prerequisite for the formation of a branch is the initiation of an axillary meristem. Here, we review the current knowledge about this process. After its establishment, the meristem can develop into a bud which can either become dormant or grow out and form a branch. Many endogenous factors, such as photoassimilate availability, and exogenous factors like nutrient availability or shading, have to be integrated in the decision whether a branch is formed. The underlying regulatory network is complex and involves phytohormones and transcription factors. The hormone auxin is derived from the shoot apex and inhibits bud outgrowth indirectly in a process termed apical dominance. Strigolactones appear to modulate apical dominance by modification of auxin fluxes. Furthermore, the transcription factor BRANCHED1 plays a central role. The exact interplay of all these factors still remains obscure and there are alternative models. We discuss recent findings in the field along with the major models. Plant architecture is economically significant because it affects important traits of crop and ornamental plants, as well as trees cultivated in forestry or on short rotation coppices. As a consequence, plant architecture has been modified during plant domestication. Research revealed that only few key genes have been the target of selection during plant domestication and in breeding programs. Here, we discuss such findings on the basis of various examples. Architectural ideotypes that provide advantages for crop plant management and yield are described. We also outline the potential of breeding and biotechnological approaches to further modify and improve plant architecture for economic needs

  6. 11. Photocopy of architectural drawing (from National Archives Architectural and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Photocopy of architectural drawing (from National Archives Architectural and Cartographic Branch Alexandria, Va.) 'Non-Com-Officers Qrs.' Quartermaster General's Office Standard Plan 82, sheet 1. Lithograph on linen architectural drawing. April 1893 3 ELEVATIONS, 3 PLANS AND A PARTIAL SECTION - Fort Myer, Non-Commissioned Officers Quarters, Washington Avenue between Johnson Lane & Custer Road, Arlington, Arlington County, VA

  7. 12. Photocopy of architectural drawing (from National Archives Architectural and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Photocopy of architectural drawing (from National Archives Architectural and Cartographic Branch, Alexandria, Va.) 'Non-Com-Officers Qrs.' Quartermaster Generals Office Standard Plan 82, sheet 2, April 1893. Lithograph on linen architectural drawing. DETAILS - Fort Myer, Non-Commissioned Officers Quarters, Washington Avenue between Johnson Lane & Custer Road, Arlington, Arlington County, VA

  8. Acoustics in Architectural Design, an Annotated Bibliography on Architectural Acoustics.

    ERIC Educational Resources Information Center

    Doelle, Leslie L.

    The purpose of this annotated bibliography on architectural acoustics was--(1) to compile a classified bibliography, including most of those publications on architectural acoustics, published in English, French, and German, which can supply a useful and up-to-date source of information for those encountering any architectural-acoustic design…

  9. Architecture, Aesthetics, and Pluralism: Theories of Taste as a Determinant of Architectural Standards.

    ERIC Educational Resources Information Center

    Mann, Dennis Alan

    1979-01-01

    The author outlines and extends the description of "taste cultures" offered by Gans, indicating the aesthetic standards supported by different social classes in American society and suggesting how these standards operate in the form, content, composition, and contextual relationships of American architecture. (Author/SJL)

  10. An Experiment in Architectural Instruction.

    ERIC Educational Resources Information Center

    Dvorak, Robert W.

    1978-01-01

    Discusses the application of the PLATO IV computer-based educational system to a one-semester basic drawing course for freshman architecture, landscape architecture, and interior design students and relates student reactions to the experience. (RAO)

  11. Controlling Material Reactivity Using Architecture.

    PubMed

    Sullivan, Kyle T; Zhu, Cheng; Duoss, Eric B; Gash, Alexander E; Kolesky, David B; Kuntz, Joshua D; Lewis, Jennifer A; Spadaccini, Christopher M

    2016-03-01

    3D-printing methods are used to generate reactive material architectures. Several geometric parameters are observed to influence the resultant flame propagation velocity, indicating that the architecture can be utilized to control reactivity. Two different architectures, channels and hurdles, are generated, and thin films of thermite are deposited onto the surface. The architecture offers an additional route to control, at will, the energy release rate in reactive composite materials.

  12. Architecture for autonomy

    NASA Astrophysics Data System (ADS)

    Broten, Gregory S.; Monckton, Simon P.; Collier, Jack; Giesbrecht, Jared

    2006-05-01

    In 2002 Defence R&D Canada changed research direction from pure tele-operated land vehicles to general autonomy for land, air, and sea craft. The unique constraints of the military environment coupled with the complexity of autonomous systems drove DRDC to carefully plan a research and development infrastructure that would provide state-of-the-art tools without restricting research scope. DRDC's long-term objectives for its autonomy program address disparate unmanned ground vehicle (UGV), unattended ground sensor (UGS), air (UAV), and subsea and surface (UUV and USV) vehicles operating together with minimal human oversight. Individually, these systems will range in complexity from simple reconnaissance mini-UAVs streaming video to sophisticated autonomous combat UGVs exploiting embedded and remote sensing. Together, these systems can provide low-risk, long-endurance battlefield services assuming they can communicate and cooperate with manned and unmanned systems. A key enabling technology for this new research is a software architecture capable of meeting both DRDC's current and future requirements. DRDC built upon recent advances in the computing science field while developing its software architecture known as the Architecture for Autonomy (AFA). Although a well-established practice in computing science, frameworks have only recently entered common use by unmanned vehicles. For industry and government, the complexity, cost, and time to re-implement stable systems often exceeds the perceived benefits of adopting a modern software infrastructure. Thus, most persevere with legacy software, adapting and modifying software when and wherever possible or necessary -- adopting strategic software frameworks only when no justifiable legacy exists. Conversely, academic programs with short one- or two-year projects frequently exploit strategic software frameworks but with little enduring impact. The open-source movement radically changes this picture. Academic frameworks

  13. Architectural Adventures in Your Community

    ERIC Educational Resources Information Center

    Henn, Cynthia A.

    2007-01-01

    Due to architecture's complexity, it can be challenging to develop lessons for the students, and consequently, the teaching of architecture is frequently overlooked. Every community has an architectural history. For example, the community in which the author's students live has a variety of historic houses from when the community originated (the…

  14. Information architecture. Volume 2, Part 1: Baseline analysis summary

    SciTech Connect

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  15. Generic robot architecture

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2010-09-21

    The present invention provides methods, computer readable media, and apparatuses for a generic robot architecture providing a framework that is easily portable to a variety of robot platforms and is configured to provide hardware abstractions, abstractions for generic robot attributes, environment abstractions, and robot behaviors. The generic robot architecture includes a hardware abstraction level and a robot abstraction level. The hardware abstraction level is configured for developing hardware abstractions that define, monitor, and control hardware modules available on a robot platform. The robot abstraction level is configured for defining robot attributes and provides a software framework for building robot behaviors from the robot attributes. Each of the robot attributes includes hardware information from at least one hardware abstraction. In addition, each robot attribute is configured to substantially isolate the robot behaviors from the at least one hardware abstraction.
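
    The sketch below illustrates the layering described in the patent abstract with hypothetical names of our own (RangeSensor, Obstruction and guarded_motion are not from the patent): a hardware abstraction wraps a platform-specific driver, a robot attribute is built from it, and a behavior depends only on the attribute.

```python
from abc import ABC, abstractmethod

class RangeSensor(ABC):               # hardware abstraction level
    @abstractmethod
    def read_m(self) -> float: ...

class SickLidar(RangeSensor):         # one platform-specific implementation
    def read_m(self) -> float:
        return 4.2                    # stub: would call the vendor driver here

class Obstruction:                    # robot abstraction level (attribute)
    def __init__(self, sensor: RangeSensor, threshold_m: float = 1.0):
        self.sensor, self.threshold_m = sensor, threshold_m
    def blocked(self) -> bool:
        return self.sensor.read_m() < self.threshold_m

def guarded_motion(obstruction: Obstruction) -> str:   # robot behavior
    # the behavior never touches the hardware abstraction directly
    return "stop" if obstruction.blocked() else "drive"

print(guarded_motion(Obstruction(SickLidar())))         # -> "drive"
```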

  16. Parallel Subconvolution Filtering Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Andrew A.

    2003-01-01

    These architectures are based on methods of vector processing and the discrete-Fourier-transform/inverse-discrete- Fourier-transform (DFT-IDFT) overlap-and-save method, combined with time-block separation of digital filters into frequency-domain subfilters implemented by use of sub-convolutions. The parallel-processing method implemented in these architectures enables the use of relatively small DFT-IDFT pairs, while filter tap lengths are theoretically unlimited. The size of a DFT-IDFT pair is determined by the desired reduction in processing rate, rather than on the order of the filter that one seeks to implement. The emphasis in this report is on those aspects of the underlying theory and design rules that promote computational efficiency, parallel processing at reduced data rates, and simplification of the designs of very-large-scale integrated (VLSI) circuits needed to implement high-order filters and correlators.
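
    For reference, the block-filtering step that these architectures parallelise is the standard DFT-IDFT overlap-and-save method; the serial NumPy sketch below (FFT size and filter chosen arbitrarily for the demo) shows the discard-and-concatenate bookkeeping, without the subfilter decomposition or VLSI-oriented parallelism of the report.

```python
import numpy as np

def overlap_save(x, h, nfft=256):
    """Filter x with h using overlap-save FFT blocks (serial reference)."""
    M = len(h)
    step = nfft - M + 1                      # new samples consumed per block
    H = np.fft.fft(h, nfft)
    xp = np.concatenate([np.zeros(M - 1), x])  # prepend history for first block
    y = []
    for start in range(0, len(x), step):
        block = xp[start:start + nfft]
        if len(block) < nfft:
            block = np.pad(block, (0, nfft - len(block)))
        yb = np.fft.ifft(np.fft.fft(block) * H).real
        y.append(yb[M - 1:M - 1 + step])     # discard the circularly aliased samples
    return np.concatenate(y)[:len(x)]

x = np.random.default_rng(0).normal(size=1000)
h = np.ones(8) / 8.0                          # simple moving-average filter
print(np.allclose(overlap_save(x, h), np.convolve(x, h)[:len(x)]))  # True
```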

  17. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. Verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  18. Instrumented Architectural Simulation System

    NASA Technical Reports Server (NTRS)

    Delagi, B. A.; Saraiya, N.; Nishimura, S.; Byrd, G.

    1987-01-01

    Simulation of systems at an architectural level can offer an effective way to study critical design choices if (1) the performance of the simulator is adequate to examine designs executing significant code bodies, not just toy problems or small application fragments, (2) the details of the simulation include the critical details of the design, (3) the view of the design presented by the simulator instrumentation leads to useful insights on the problems with the design, and (4) there is enough flexibility in the simulation system so that the asking of unplanned questions is not suppressed by the weight of the mechanics involved in making changes either in the design or its measurement. A simulation system with these goals is described together with the approach to its implementation. Its application to the study of a particular class of multiprocessor hardware system architectures is illustrated.

  19. Staged Event Architecture

    SciTech Connect

    Hoschek, Wolfgang; Berket, Karlo

    2005-05-30

    Sea is a framework for a Staged Event Architecture, designed around non-blocking asynchronous communication facilities that are decoupled from the threading model chosen by any given application. Components for IP networking and in-memory communication are provided. The Sea Java library encapsulates these concepts. Sea is used to easily build efficient and flexible low-level network clients and servers, and in particular as a basic communication substrate for Peer-to-Peer applications.
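
    A rough Python analogue of the staged-event idea (illustrative only; this is not the Sea API) shows the decoupling the abstract refers to: each stage is a plain event handler, stages exchange events through queues, and the application decides how many threads drive each stage.

      import queue, threading

      def make_stage(handler, out_q=None):
          in_q = queue.Queue()
          def run():
              while True:
                  event = in_q.get()
                  if event is None:                  # poison pill shuts the stage down
                      break
                  result = handler(event)
                  if out_q is not None and result is not None:
                      out_q.put(result)
          return in_q, run

      results = queue.Queue()
      parse_q, parse_stage = make_stage(lambda raw: raw.strip().upper(), results)
      threading.Thread(target=parse_stage, daemon=True).start()   # threading chosen by the app

      for msg in ["  hello ", " sea "]:
          parse_q.put(msg)
      parse_q.put(None)
      print(results.get(), results.get())            # -> HELLO SEA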

  20. Aerobot Autonomy Architecture

    NASA Technical Reports Server (NTRS)

    Elfes, Alberto; Hall, Jeffery L.; Kulczycki, Eric A.; Cameron, Jonathan M.; Morfopoulos, Arin C.; Clouse, Daniel S.; Montgomery, James F.; Ansar, Adnan I.; Machuzak, Richard J.

    2009-01-01

    An architecture for autonomous operation of an aerobot (i.e., a robotic blimp) to be used in scientific exploration of planets and moons in the Solar system with an atmosphere (such as Titan and Venus) is undergoing development. This architecture is also applicable to autonomous airships that could be flown in the terrestrial atmosphere for scientific exploration, military reconnaissance and surveillance, and as radio-communication relay stations in disaster areas. The architecture was conceived to satisfy requirements to perform the following functions: a) Vehicle safing, that is, ensuring the integrity of the aerobot during its entire mission, including during extended communication blackouts. b) Accurate and robust autonomous flight control during operation in diverse modes, including launch, deployment of scientific instruments, long traverses, hovering or station-keeping, and maneuvers for touch-and-go surface sampling. c) Mapping and self-localization in the absence of a global positioning system. d) Advanced recognition of hazards and targets in conjunction with tracking of, and visual servoing toward, targets, all to enable the aerobot to detect and avoid atmospheric and topographic hazards and to identify, home in on, and hover over predefined terrain features or other targets of scientific interest. The architecture is an integrated combination of systems for accurate and robust vehicle and flight trajectory control; estimation of the state of the aerobot; perception-based detection and avoidance of hazards; monitoring of the integrity and functionality ("health") of the aerobot; reflexive safing actions; multi-modal localization and mapping; autonomous planning and execution of scientific observations; and long-range planning and monitoring of the mission of the aerobot. The prototype JPL aerobot (see figure) has been tested extensively in various areas in the California Mojave desert.

  1. Analyzing and Visualizing Whole Program Architectures

    SciTech Connect

    Panas, T; Quinlan, D; Vuduc, R

    2007-05-10

    This paper describes our work to develop new tool support for analyzing and visualizing the architecture of complete large-scale (millions or more lines of code) programs. Our approach consists of (i) creating a compact, accurate representation of a whole C or C++ program, (ii) analyzing the program in this representation, and (iii) visualizing the analysis results with respect to the program's architecture. We have implemented our approach by extending and combining a compiler infrastructure and a program visualization tool, and we believe our work will be of broad interest to those engaged in a variety of program understanding and transformation tasks. We have added new whole-program analysis support to ROSE [15, 14], a source-to-source C/C++ compiler infrastructure for creating customized analysis and transformation tools. Our whole-program work does not rely on procedure summaries; rather, we preserve all of the information present in the source while keeping our representation compact. In our representation, a million-line application fits in well less than 1 GB of memory. Because whole-program analyses can generate large amounts of data, we believe that abstracting and visualizing analysis results at the architecture level is critical to reducing the cognitive burden on the consumer of the analysis results. Therefore, we have extended Vizz3D [19], an interactive program visualization tool, with an appropriate metaphor and layout algorithm for representing a program's architecture. Our implementation provides developers with an intuitive, interactive way to view analysis results, such as those produced by ROSE, in the context of the program's architecture. The remainder of this paper summarizes our approach to whole-program analysis (Section 2) and provides an example of how we visualize the analysis results (Section 3).

  2. Modular robotic architecture

    NASA Astrophysics Data System (ADS)

    Smurlo, Richard P.; Laird, Robin T.

    1991-03-01

    The development of control architectures for mobile systems is typically a task undertaken with each new application. These architectures address different operational needs and tend to be difficult to adapt to more than the problem at hand. The development of a flexible and extendible control system with evolutionary growth potential for use on mobile robots will help alleviate these problems and, if made widely available, will promote standardization and compatibility among systems throughout the industry. The Modular Robotic Architecture (MRA) is a generic control system that meets the above needs by providing developers with a standard set of software and hardware tools that can be used to design modular robots (MODBOTs) with nearly unlimited growth potential. The MODBOT itself is a generic creature that must be customized by the developer for a particular application. The MRA facilitates customization of the MODBOT by providing sensor, actuator, and processing modules that can be configured in almost any manner as demanded by the application. The Mobile Security Robot (MOSER) is an instance of a MODBOT that is being developed using the MRA. (Figure 1 of the original report shows the remote platform module configuration of MOSER, including navigational and collision-avoidance sonar, RF and IR links, detection and near-IR proximity sensors, a fluxgate compass and rate gyro, an acoustical detection array, a stereoscopic pan-and-tilt module, and a high-level processing module on a mobile base.)

  3. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
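
    A minimal sketch of such declarative combinators (this is not the CERA pattern language itself, only an illustration of conjunction, disjunction, and negation over already-observed point events on named streams):

      def on(stream, name):                 # atomic pattern: event `name` seen on `stream`
          return lambda seen: (stream, name) in seen

      def all_of(*patterns):                # conjunction
          return lambda seen: all(p(seen) for p in patterns)

      def any_of(*patterns):                # disjunction
          return lambda seen: any(p(seen) for p in patterns)

      def none_of(pattern):                 # negation
          return lambda seen: not pattern(seen)

      # "Pressure spike and a temperature rise on either bus, with no operator override."
      anomaly = all_of(on("pressure", "spike"),
                       any_of(on("temp_a", "rise"), on("temp_b", "rise")),
                       none_of(on("console", "override")))

      seen_events = {("pressure", "spike"), ("temp_b", "rise")}
      print(anomaly(seen_events))           # -> True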

  4. Robust Software Architecture for Robots

    NASA Technical Reports Server (NTRS)

    Aghazanian, Hrand; Baumgartner, Eric; Garrett, Michael

    2009-01-01

    Robust Real-Time Reconfigurable Robotics Software Architecture (R4SA) is the name of both a software architecture and software that embodies the architecture. The architecture was conceived in the spirit of current practice in designing modular, hard real-time aerospace systems. The architecture facilitates the integration of new sensory, motor, and control software modules into the software of a given robotic system. R4SA was developed for initial application aboard exploratory mobile robots on Mars, but is adaptable to terrestrial robotic systems, real-time embedded computing systems in general, and robotic toys.

  5. BADD phase II: DDS information management architecture

    NASA Astrophysics Data System (ADS)

    Stephenson, Thomas P.; DeCleene, Brian T.; Speckert, Glen; Voorhees, Harry L.

    1997-06-01

    The DARPA Battlefield Awareness and Data Dissemination (BADD) Phase II Program will provide the next generation multimedia information management architecture to support the warfighter. One goal of this architecture is proactive dissemination of information to the warfighter through strategies such as multicast and 'smart push and pull' designed to minimize latency and make maximum use of available communications bandwidth. Another goal is to support integration of information from widely distributed legacy repositories. This will enable the next generation of battlefield awareness applications to form a common operational view of the battlefield to aid joint service and/or multi-national peacekeeping forces. This paper discusses the approach we are taking to realize such an architecture for BADD. Our architecture and its implementation, known as the Distributed Dissemination Services (DDS), are based on two key concepts: a global database schema and an intelligent, proactive caching scheme. A global schema provides a common logical view of the information space in which the warfighter operates. This schema (or subsets of it) is shared by all warfighters through a distributed object database providing local access to all relevant metadata. This approach provides both scalability to a large number of warfighters, and it supports tethered as well as autonomous operations. By utilizing DDS information integration services that provide transparent access to legacy databases, related information from multiple 'stovepipe' systems is now available to battlefield awareness applications. The second key concept embedded in our architecture is an intelligent, hierarchical caching system supported by proactive dissemination management services which push both lightweight and heavyweight data such as imagery and video to warfighters based on their information profiles. The goal of this approach is to transparently and proactively stage data which is likely to be requested by
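
    The proactive, profile-driven push described above can be caricatured in a few lines (field names and profiles are invented for illustration; the DDS design is far richer): items whose metadata match a user's information profile are copied into that user's local cache before any request is made.

      profiles = {"alpha": {"region": "sector-7", "type": "imagery"},
                  "bravo": {"region": "sector-2", "type": "video"}}

      def matches(profile, metadata):
          return all(metadata.get(key) == value for key, value in profile.items())

      def disseminate(metadata, payload, caches):
          for user, profile in profiles.items():
              if matches(profile, metadata):
                  caches.setdefault(user, []).append(payload)   # push ahead of demand

      caches = {}
      disseminate({"region": "sector-7", "type": "imagery"}, "img-001", caches)
      print(caches)                                             # -> {'alpha': ['img-001']}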

  6. Surface Buildup Scenarios and Outpost Architectures for Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Mazanek, Daniel D.; Troutman, Patrick A.; Culbert, Christopher J.; Leonard, Matthew J.; Spexarth, Gary R.

    2009-01-01

    The Constellation Program Architecture Team and the Lunar Surface Systems Project Office have developed an initial set of lunar surface buildup scenarios and associated polar outpost architectures, along with preliminary supporting element and system designs in support of NASA's Exploration Strategy. The surface scenarios are structured in such a way that outpost assembly can be suspended at any time to accommodate delivery contingencies or changes in mission emphasis. The modular nature of the architectures mitigates the impact of the loss of any one element and enhances the ability of international and commercial partners to contribute elements and systems. Additionally, the core lunar surface system technologies and outpost operations concepts are applicable to future Mars exploration. These buildup scenarios provide a point of departure for future trades and assessments of alternative architectures and surface elements.

  7. Capital Architecture: Situating symbolism parallel to architectural methods and technology

    NASA Astrophysics Data System (ADS)

    Daoud, Bassam

    Capital Architecture is a symbol of a nation's global presence and the cultural and social focal point of its inhabitants. Since the advent of High-Modernism in Western cities, and subsequently decolonised capitals, civic architecture no longer seems to be strictly grounded in the philosophy that national buildings shape the legacy of government and the way a nation is regarded through its built environment. Amidst an exceedingly globalized architectural practice and with the growing concern of key heritage foundations over the shortcomings of international modernism in representing its immediate socio-cultural context, the contextualization of public architecture within its sociological, cultural and economic framework in capital cities became the key denominator of this thesis. Civic architecture in capital cities is essential to confront the challenges of symbolizing a nation and demonstrating the legitimacy of the government. In today's dominantly secular Western societies, governmental architecture, especially where the seat of political power lies, is the ultimate form of architectural expression in conveying a sense of identity and underlining a nation's status. Departing from these convictions, this thesis investigates the embodied symbolic power, the representative capacity, and the inherent permanence in contemporary architecture, and in its modes of production. Through a vast study on Modern architectural ideals and heritage -- in parallel to methodologies -- the thesis stimulates the future of large-scale governmental building practices and aims to identify and index the key constituents that may respond to the lack of representation in civic architecture in capital cities.

  8. A Ground Systems Architecture Transition for A Distributed Operations System

    NASA Technical Reports Server (NTRS)

    Sellers, Donna; Bailey, Darrell (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center (MSFC) Ground Systems Department (GSD) recently undertook an architecture change in the product line that serves the ISS program. As a result, the architecture tradeoffs between data system product lines that serve remote users versus those that serve control center flight control teams were explored extensively. This paper describes the resulting architecture that will be used in the ISS payloads program, and the resulting functional breakdown of the products that support that architecture. It also describes the lessons learned from the path that was followed, as a migration of products caused the need to reevaluate the allocation of functions across the architecture. The result is a set of innovative ground system solutions that is scalable so it can support facilities of wide-ranging sizes, from a small site up to large control centers. Effective use of system automation, custom components, design optimization for data management, data storage, data transmissions, and advanced local and wide area networking architectures, plus the effective use of Commercial-Off-The-Shelf (COTS) products, provides flexible Remote Ground System options that can be tailored to the needs of each user. This paper offers a description of the efficiency and effectiveness of the Ground Systems architectural options that have been implemented, and includes successful implementation examples and lessons learned.

  9. A Ground Systems Architecture Transition for a Distributed Operations System

    NASA Technical Reports Server (NTRS)

    Sellers, Donna; Pitts, Lee; Bryant, Barry

    2003-01-01

    The Marshall Space Flight Center (MSFC) Ground Systems Department (GSD) recently undertook an architecture change in the product line that serves the ISS program. As a result, the architecture tradeoffs between data system product lines that serve remote users versus those that serve control center flight control teams were explored extensively. This paper describes the resulting architecture that will be used in the International Space Station (ISS) payloads program, and the resulting functional breakdown of the products that support this architecture. It also describes the lessons learned from the path that was followed, as a migration of products caused the need to reevaluate the allocation of functions across the architecture. The result is a set of innovative ground system solutions that is scalable so it can support facilities of wide-ranging sizes, from a small site up to large control centers. Effective use of system automation, custom components, design optimization for data management, data storage, data transmissions, and advanced local and wide area networking architectures, plus the effective use of Commercial-Off-The-Shelf (COTS) products, provides flexible Remote Ground System options that can be tailored to the needs of each user. This paper offers a description of the efficiency and effectiveness of the Ground Systems architectural options that have been implemented, and includes successful implementation examples and lessons learned.

  10. The REmote Patient Education in a Telemedicine Environment Architecture (REPETE).

    PubMed

    Lai, Albert M; Starren, Justin B; Kaufman, David R; Mendonça, Eneida A; Palmas, Walter; Nieh, Jason; Shea, Steven

    2008-05-01

    The objective of the study was to develop and implement an architecture for remote training that can be used in the narrowband home telemedicine environment. A remote training architecture, the REmote Patient Education in a Telemedicine Environment (REPETE) architecture, using a remote control protocol (RCP) was developed. A set of design criteria was specified. The developed architecture was integrated into the IDEATel home telemedicine unit (HTU) and evaluated against these design criteria using a combination of technical and expert evaluations. Technical evaluation of the architecture demonstrated that remote cursor movements and positioning displayed on the HTU were smooth and effectively real-time. The trainers were able to observe, with approximately 2 seconds of lag, what the patient sees on the HTU screen. Evaluation of the architecture by experts was favorable. Responses to a Likert scale questionnaire regarding audio quality and remote control performance indicated that the expert evaluators thought that the audio quality and remote control performance were adequate for remote training. All evaluators strongly agreed that the system would be useful for training patients. The REPETE architecture supports basic training needs over a narrowband dial-up connection. We were able to maintain an audio chat while performing a remote training session, maintaining both acceptable audio quality and remote control performance. The RCP provides a mechanism to provide training without requiring a trainer to go to the patient's home and effectively supports deictic referencing to on-screen objects.

  11. Exascale Hardware Architectures Working Group

    SciTech Connect

    Hemmert, S; Ang, J; Chiang, P; Carnes, B; Doerfler, D; Leininger, M; Dosanjh, S; Fields, P; Koch, K; Laros, J; Noe, J; Quinn, T; Torrellas, J; Vetter, J; Wampler, C; White, A

    2011-03-15

    The ASC Exascale Hardware Architecture working group is challenged to provide input on the following areas impacting the future use and usability of potential exascale computer systems: processor, memory, and interconnect architectures, as well as the power and resilience of these systems. Going forward, there are many challenging issues that will need to be addressed. First, power constraints in processor technologies will lead to steady increases in parallelism within a socket. Additionally, all cores may not be fully independent nor fully general purpose. Second, there is a clear trend toward less balanced machines, in terms of compute capability compared to memory and interconnect performance. In order to mitigate the memory issues, memory technologies will introduce 3D stacking, eventually moving on-socket and likely on-die, providing greatly increased bandwidth but unfortunately also likely providing smaller memory capacity per core. Off-socket memory, possibly in the form of non-volatile memory, will create a complex memory hierarchy. Third, communication energy will dominate the energy required to compute, such that interconnect power and bandwidth will have a significant impact. All of the above changes are driven by the need for greatly increased energy efficiency, as current technology will prove unsuitable for exascale, due to unsustainable power requirements of such a system. These changes will have the most significant impact on programming models and algorithms, but they will be felt across all layers of the machine. There is clear need to engage all ASC working groups in planning for how to deal with technological changes of this magnitude. The primary function of the Hardware Architecture Working Group is to facilitate codesign with hardware vendors to ensure future exascale platforms are capable of efficiently supporting the ASC applications, which in turn need to meet the mission needs of the NNSA Stockpile Stewardship Program. This issue is

  12. A multi-agent architecture for geosimulation of moving agents

    NASA Astrophysics Data System (ADS)

    Vahidnia, Mohammad H.; Alesheikh, Ali A.; Alavipanah, Seyed Kazem

    2015-10-01

    In this paper, a novel architecture is proposed in which an axiomatic derivation system in the form of first-order logic facilitates declarative explanation and spatial reasoning. Simulation of environmental perception and interaction between autonomous agents is designed with a geographic belief-desire-intention and a request-inform-query model. The architecture has a complementary quantitative component that supports collaborative planning based on the concept of equilibrium and game theory. This new architecture presents a departure from current best practice in geographic agent-based modelling. Implementation tasks are discussed in some detail, as well as scenarios for fleet management and disaster management.

  13. The Tera Multithreaded Architecture and Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Bokhari, Shahid H.; Mavriplis, Dimitri J.

    1998-01-01

    The Tera Multithreaded Architecture (MTA) is a new parallel supercomputer currently being installed at San Diego Supercomputing Center (SDSC). This machine has an architecture quite different from contemporary parallel machines. The computational processor is a custom design and the machine uses hardware to support very fine grained multithreading. The main memory is shared, hardware randomized and flat. These features make the machine highly suited to the execution of unstructured mesh problems, which are difficult to parallelize on other architectures. We report the results of a study carried out during July-August 1998 to evaluate the execution of EUL3D, a code that solves the Euler equations on an unstructured mesh, on the 2 processor Tera MTA at SDSC. Our investigation shows that parallelization of an unstructured code is extremely easy on the Tera. We were able to get an existing parallel code (designed for a shared memory machine), running on the Tera by changing only the compiler directives. Furthermore, a serial version of this code was compiled to run in parallel on the Tera by judicious use of directives to invoke the "full/empty" tag bits of the machine to obtain synchronization. This version achieves 212 and 406 Mflop/s on one and two processors respectively, and requires no attention to partitioning or placement of data issues that would be of paramount importance in other parallel architectures.
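
    The full/empty tag bits mentioned above are a hardware feature, but their effect can be mimicked in software to convey the idea (a hedged analogue, not MTA code): a read on an "empty" cell blocks until a writer fills it, and the read drains the cell again, giving fine-grained producer-consumer synchronisation without explicit locks in the user code.

      import threading

      class SyncCell:
          def __init__(self):
              self._full = False
              self._value = None
              self._cv = threading.Condition()

          def write_when_empty(self, value):     # analogous to write-and-set-full
              with self._cv:
                  while self._full:
                      self._cv.wait()
                  self._value, self._full = value, True
                  self._cv.notify_all()

          def read_when_full(self):              # analogous to read-and-set-empty
              with self._cv:
                  while not self._full:
                      self._cv.wait()
                  self._full = False
                  self._cv.notify_all()
                  return self._value

      cell = SyncCell()
      threading.Thread(target=lambda: cell.write_when_empty(406)).start()
      print(cell.read_when_full())               # -> 406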

  14. Versatile architecture for image recognition applications

    NASA Astrophysics Data System (ADS)

    Sacramone, Anthony; Scola, Joseph; Shazeer, Dov J.

    1992-03-01

    Architectures for the development of image recognition algorithms must support the implementation of systematic procedures for solving image recognition problems. All too often, designers develop image recognition architectures in an ad hoc fashion which lacks the structure to meet long term needs. Vendors typically supply customers with standard image processing libraries and display tools. Combining these tools and formulating development strategies have remained stumbling blocks in the design of complete image recognition algorithm development environments. In this paper, an architecture is presented which provides a well defined framework, and at the same time is sufficiently flexible to accommodate images of multiple sensor and data types. The primary components of the architecture are: ground-truthing, preprocessing (which includes image processing and segmentation), feature extraction, classification, and performance analysis. Powerful and well defined data structures are exploited for each of the primary components. Groups of programs called tasks manipulate one or more of these data structures, each task belonging to one of the primary components. Multiple tasks can be executed in an unsupervised mode over an entire database of images. Results are then subjected to performance analysis and feedback. A description of the primary components and how they are integrated to facilitate the rapid prototyping and development of image recognition algorithms is presented.
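
    At its core, the component chain described above reduces to tasks that read and extend a shared record and can be run unsupervised over a whole database before performance analysis. The toy sketch below assumes invented record fields and trivial stand-ins for segmentation, feature extraction, and classification:

      def preprocess(rec):
          rec["segments"] = [p for p in rec["pixels"] if p > 0]          # toy segmentation
          return rec

      def extract_features(rec):
          rec["features"] = (len(rec["segments"]), sum(rec["segments"]))
          return rec

      def classify(rec):
          rec["label"] = "target" if rec["features"][1] > 10 else "clutter"
          return rec

      def run(database, tasks):
          for rec in database:
              for task in tasks:
                  rec = task(rec)
          correct = sum(rec["label"] == rec["truth"] for rec in database)
          return correct / len(database)                                  # performance analysis

      database = [{"pixels": [0, 5, 9], "truth": "target"},
                  {"pixels": [0, 1, 2], "truth": "clutter"}]
      print(run(database, [preprocess, extract_features, classify]))      # -> 1.0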

  15. Space Architecture: The Role, Work and Aptitude

    NASA Technical Reports Server (NTRS)

    Griffin, Brand

    2014-01-01

    Space architecture has been an emerging discipline for at least 40 years. Has it arrived? Is space architecture a legitimate vocation or an avocation? If it leads to a job, what do employers want? In 2002, NASA Headquarters created a management position for a space architect whose job was to "lead the development of strategic architectures and identify high level requirements for systems that will accomplish the Nation's space exploration vision." This is a good job description with responsibility at the right level in NASA, but unfortunately, the office was discontinued two years later. Even though there is no accredited academic program or professional licensing for space architecture, there is a community of practitioners. They are civil servants, contractors and academicians supporting International Space Station and space exploration programs. In various ways, space architects currently contribute to human spaceflight, but there is a way for the discipline to be more effective in developing solutions to large scale complex problems. This paper organizes contributions from engineers, architects and psychologists into recommendations on the role of space architects in the organization, the process of creating and selecting options, and intrinsic personality traits including why they must have a high tolerance for ambiguity.

  16. Style grammars for interactive visualization of architecture.

    PubMed

    Aliaga, Daniel G; Rosen, Paul A; Bekins, Daniel R

    2007-01-01

    Interactive visualization of architecture provides a way to quickly visualize existing or novel buildings and structures. Such applications require both fast rendering and an effortless input regimen for creating and changing architecture using high-level editing operations that automatically fill in the necessary details. Procedural modeling and synthesis is a powerful paradigm that yields high data amplification and can be coupled with fast-rendering techniques to quickly generate plausible details of a scene without much or any user interaction. Previously, forward-generating procedural methods have been proposed where a procedure is explicitly created to generate particular content. In this paper, we present our work in inverse procedural modeling of buildings and describe how to use an extracted repertoire of building grammars to facilitate the visualization and quick modification of architectural structures and buildings. We demonstrate an interactive application where the user draws simple building blocks and, using our system, can automatically complete the building "in the style of" other buildings using view-dependent texture mapping or nonphotorealistic rendering techniques. Our system supports an arbitrary number of building grammars created from user-subdivided building models and captured photographs. Using only edit, copy, and paste metaphors, entire building styles can be altered and transferred from one building to another in a few operations, enhancing the ability to modify an existing architectural structure or to visualize a novel building in the style of the others.
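
    A toy illustration of the grammar idea (not the paper's formalism or data): rewrite rules repeatedly expand a symbol until only terminal facade elements remain, so altering a rule alters the "style" of every building derived from it.

      rules = {
          "Building":    ["GroundFloor", "Floor", "Floor", "Roof"],
          "GroundFloor": ["Door", "Window", "Window"],
          "Floor":       ["Window", "Window", "Window"],
      }

      def derive(symbol):
          if symbol not in rules:           # terminal: a concrete facade element
              return [symbol]
          out = []
          for child in rules[symbol]:
              out.extend(derive(child))
          return out

      print(derive("Building"))             # -> a door, eight windows, and a roof, in facade order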

  17. Systems Architecture for Fully Autonomous Space Missions

    NASA Technical Reports Server (NTRS)

    Esper, Jamie; Schnurr, R.; VanSteenberg, M.; Brumfield, Mark (Technical Monitor)

    2002-01-01

    The NASA Goddard Space Flight Center is working to develop a revolutionary new system architecture concept in support of fully autonomous missions. As part of GSFC's contribution to the New Millennium Program (NMP) Space Technology 7 Autonomy and on-Board Processing (ST7-A) Concept Definition Study, the system incorporates the latest commercial Internet and software development ideas and extends them into NASA ground and space segment architectures. The unique challenges facing the exploration of remote and inaccessible locales and the need to incorporate corresponding autonomy technologies within reasonable cost necessitate the re-thinking of traditional mission architectures. A measure of the resiliency of this architecture in its application to a broad range of future autonomy missions will depend on its effectiveness in leveraging commercial tools developed for the personal computer and Internet markets. Specialized test stations and supporting software become things of the past as spacecraft take advantage of the extensive tools and research investments of billion-dollar commercial ventures. The projected improvements of the Internet and supporting infrastructure go hand-in-hand with market pressures that provide continuity in research. By taking advantage of consumer-oriented methods and processes, space-flight missions will continue to leverage investments tailored to provide better services at reduced cost. The application of ground and space segment architectures each based on Local Area Networks (LAN), the use of personal computer-based operating systems, and the execution of activities and operations through a Wide Area Network (Internet) enable a revolution in spacecraft mission formulation, implementation, and flight operations. Hardware and software design, development, integration, test, and flight operations are all tied-in closely to a common thread that enables the smooth transitioning between program phases. The application of commercial software

  18. Spacecraft Architecture and environmental psychology

    NASA Astrophysics Data System (ADS)

    Ören, Ayşe

    2016-07-01

    As we embark on a journey toward new homes in new worlds and lay solid foundations there, we should consider not only the survival of the frontier but also the well-being of those who will live in zero gravity. As a versatile science, architecture encompasses abstract human needs as well. On this new direction in the course of Homo sapiens' evolution, we can do this with designs addressing both our needs and our senses. The well-being of humans can be achieved by creating environments that support the cognitive and social stages of the evolutionary process. Space stations are going through their own evolution process, and any step taken can serve as a reference for further attempts. In the history of architecture, window design is addressed in a later phase, which is the case for building a spaceship as well. We lean on the places we live in, both physically and metaphorically. The feeling of belonging is essential here, entailing trans-humanism, which is significant since the environment therein is like a dress comfortable enough to fit in, meeting needs without any burden. Utilizing the advent of technology, we can create moods and atmospheres to regulate night and day cycles, and thus turn claustrophobic places into cozy or dream-like places. The senses provoke psychological sensations that go beyond cultural codes, as they are rooted in consciousness; this allows designers to create a mood within a space that tells a story and evokes an emotional impact. Color, amount of light, sound, and odor are not superficial: as much as they are intangible, they are real and powerful tools with a physical presence. Tapping into induction, we can solve a whole system based on a part thereof. Therefore, although fractal designs are functional, they may not yield good results unless used correctly in terms of design, which makes geometric arrangement critical.

  19. Spacecraft Architecture and well-being

    NASA Astrophysics Data System (ADS)

    Ören, Ayşe

    2016-07-01

    As we embark on a journey toward new homes in new worlds and lay solid foundations there, we should consider not only the survival of the frontier but also the well-being of those who will live in zero gravity. As a versatile science, architecture encompasses abstract human needs as well. On this new direction in the course of Homo sapiens' evolution, we can do this with designs addressing both our needs and our senses. The well-being of humans can be achieved by creating environments that support the cognitive and social stages of the evolutionary process. Space stations are going through their own evolution process, and any step taken can serve as a reference for further attempts. In the history of architecture, window design is addressed in a later phase, which is the case for building a spaceship as well. We lean on the places we live in, both physically and metaphorically. The feeling of belonging is essential here, entailing trans-humanism, which is significant since the environment therein is like a dress comfortable enough to fit in, meeting needs without any burden. Utilizing the advent of technology, we can create moods and atmospheres to regulate night and day cycles, and thus turn claustrophobic places into cozy or dream-like places. The senses provoke psychological sensations that go beyond cultural codes, as they are rooted in consciousness; this allows designers to create a mood within a space that tells a story and evokes an emotional impact. Color, amount of light, sound, and odor are not superficial: as much as they are intangible, they are real and powerful tools with a physical presence. Tapping into induction, we can solve a whole system based on a part thereof. Therefore, although fractal designs are functional, they may not yield good results unless used correctly in terms of design, which makes geometric arrangement critical.

  20. The D3 Middleware Architecture

    NASA Technical Reports Server (NTRS)

    Walton, Joan; Filman, Robert E.; Korsmeyer, David J.; Lee, Diana D.; Mak, Ron; Patel, Tarang

    2002-01-01

    DARWIN is a NASA-developed, Internet-based system for enabling aerospace researchers to securely and remotely access and collaborate on the analysis of aerospace vehicle design data, primarily the results of wind-tunnel testing and numeric (e.g., computational fluid-dynamics) model executions. DARWIN captures, stores and indexes data; manages derived knowledge (such as visualizations across multiple datasets); and provides an environment for designers to collaborate in the analysis of test results. DARWIN is an interesting application because it supports high volumes of data, integrates multiple modalities of data display (e.g., images and data visualizations), and provides non-trivial access control mechanisms. DARWIN enables collaboration by allowing users to share not only visualizations of data, but also commentary about and views of data. Here we provide an overview of the architecture of D3, the third generation of DARWIN. Earlier versions of DARWIN were characterized by browser-based interfaces and a hodge-podge of server technologies: CGI scripts, applets, PERL, and so forth. But browsers proved difficult to control, and a proliferation of computational mechanisms proved inefficient and difficult to maintain. D3 substitutes a pure-Java approach for that medley: A Java client communicates (through RMI over HTTPS) with a Java-based application server. Code on the server accesses information from JDBC databases, distributed LDAP security services, and a collaborative information system. D3 is a three-tier architecture, but unlike 'E-commerce' applications, the data usage pattern suggests different strategies than traditional Enterprise Java Beans - we need to move volumes of related data together, considerable processing happens on the client, and the 'business logic' on the server-side is primarily data integration and collaboration. With D3, we are extending DARWIN to handle other data domains and to be a distributed system, where a single login allows a user

  1. Avionics Architectures for Exploration: Wireless Technologies and Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Goforth, Montgomery B.; Ratliff, James E.; Barton, Richard J.; Wagner, Raymond S.; Lansdowne, Chatwin

    2014-01-01

    The authors describe ongoing efforts by the Avionics Architectures for Exploration (AAE) project chartered by NASA's Advanced Exploration Systems (AES) Program to evaluate new avionics architectures and technologies, provide objective comparisons of them, and mature selected technologies for flight and for use by other AES projects. The AAE project team includes members from most NASA centers and from industry. This paper provides an overview of recent AAE efforts, with particular emphasis on the wireless technologies being evaluated under AES to support human spaceflight.

  2. Science Driven Supercomputing Architectures: AnalyzingArchitectural Bottlenecks with Applications and Benchmark Probes

    SciTech Connect

    Kamil, S.; Yelick, K.; Kramer, W.T.; Oliker, L.; Shalf, J.; Shan,H.; Strohmaier, E.

    2005-09-26

    There is a growing gap between the peak speed of parallel computing systems and the actual delivered performance for scientific applications. In general this gap is caused by inadequate architectural support for the requirements of modern scientific applications, as commercial applications, and the much larger market they represent, have driven the evolution of computer architectures. This gap has raised the importance of developing better benchmarking methodologies to characterize and to understand the performance requirements of scientific applications, and to communicate them efficiently in order to influence the design of future computer architectures. This improved understanding of the performance behavior of scientific applications will allow improved performance predictions, development of adequate benchmarks for identification of hardware and application features that work well or poorly together, and a more systematic performance evaluation in procurement situations. The Berkeley Institute for Performance Studies has developed a three-level approach to evaluating the design of high end machines and the software that runs on them: (1) A suite of representative applications; (2) A set of application kernels; and (3) Benchmarks to measure key system parameters. The three levels yield different types of information, all of which are useful in evaluating systems, and enable NSF and DOE centers to select computer architectures more suited for scientific applications. The analysis will further allow the centers to engage vendors in discussion of strategies to alleviate the present architectural bottlenecks using quantitative information. These may include small hardware changes or larger ones that may be outside the interests of non-scientific workloads. Providing quantitative models to the vendors allows them to assess the benefits of technology alternatives using their own internal cost-models in the broader marketplace, ideally facilitating the development of future computer

  3. 18. Photocopy of drawing (1961 architectural drawing by Kaiser Engineers) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. Photocopy of drawing (1961 architectural drawing by Kaiser Engineers) FLOOR PLAN, ELEVATIONS, AND SCHEDULE FOR VEHICLE SUPPORT BUILDING, SHEET A-1 - Vandenberg Air Force Base, Space Launch Complex 3, Vehicle Support Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  4. Integrating the services' imagery architectures

    NASA Astrophysics Data System (ADS)

    Mader, John F.

    1993-04-01

    Any military organization requiring imagery must deal with one or more of several architectures: the tactical architectures of the three military departments, the theater architectures, and their interfaces to a separate national architecture. A seamless, joint, integrated architecture must meet today's imagery requirements. The CIO's vision of 'the right imagery to the right people in the right format at the right time' would serve well as the objective of a joint, integrated architecture. A joint imagery strategy should be initially shaped by the four pillars of the National Military Strategy of the United States: strategic deterrence; forward presence; crisis response; and reconstitution. In a macro view, it must consist of a series of sub-strategies to include science and technology and research and development, maintenance of the imagery related industrial base, acquisition, resource management, and burden sharing. Common imagery doctrine must follow the imagery strategy. Most of all, control, continuity, and direction must be maintained with regard to organizations and systems development as the architecture evolves. These areas and more must be addressed to reach the long term goal of a joint, integrated imagery architecture. This will require the services and theaters to relinquish some sovereignty over at least systems development and acquisition. Nevertheless, the goal of a joint, integrated imagery architecture is feasible. The author presents arguments and specific recommendations to orient the imagery community in the direction of a joint, integrated imagery architecture.

  5. Multiuser Receiver Architectures for Space Modems

    NASA Astrophysics Data System (ADS)

    Bell, D.; Satorius, E.; Kuperman, I.; Koenig, J.

    2014-08-01

    In this article, we develop multiuser modem architectures suitable for augmentation of existing software-defined flight radios with important near-term enhancements to relay telecom services as well as support for missions requiring entry, descent, and landing (EDL). EDL support for missions like Mars Exploration Rover (MER), Phoenix, Mars Science Laboratory (MSL), and Mars 2020 obtains strong visibility within JPL and NASA headquarters. As part of an agency-wide commitment to support EDL, funding has been made available on past missions to prepare multiple Deep Space Network (DSN) ground station sites, non-DSN ground stations (Greenbank 100-m), and to prioritize in situ coverage from multiple orbiters. Multiuser operations open up new possibilities for simultaneous support of multiple surface landers that are in close proximity such that these surface elements simultaneously appear in the same coverage circle of a single relay orbiter. Simultaneous multiuser support is an important service type for many future surface mission paradigms. In this article, designs for field-programmable gate array (FPGA) implementation of multiuser modems are presented. A fixed-point model of the most promising architecture for space applications is presented as well as simulated performance results based on a fixed-point design that is suitable for FPGA implementation.

  6. Architecture for Teraflop Visualization

    SciTech Connect

    Breckenridge, A.R.; Haynes, R.A.

    1999-04-09

    Sandia Laboratories' computational scientists are addressing a very important question: How do we get insight from the human combined with the computer-generated information? The answer inevitably leads to using scientific visualization. Going one technology leap further is teraflop visualization, where the computing model and interactive graphics are an integral whole to provide computing for insight. In order to implement our teraflop visualization architecture, all hardware installed or software coded will be based on open modules and dynamic extensibility principles. We will illustrate these concepts with examples in our three main research areas: (1) authoring content (the computer), (2) enhancing precision and resolution (the human), and (3) adding behaviors (the physics).

  7. Architecture for robot intelligence

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard Alan (Inventor)

    2004-01-01

    An architecture for robot intelligence enables a robot to learn new behaviors and create new behavior sequences autonomously and interact with a dynamically changing environment. Sensory information is mapped onto a Sensory Ego-Sphere (SES) that rapidly identifies important changes in the environment and functions much like short term memory. Behaviors are stored in a DBAM that creates an active map from the robot's current state to a goal state and functions much like long term memory. A dream state converts recent activities stored in the SES and creates or modifies behaviors in the DBAM.

  8. Mind and Language Architecture

    PubMed Central

    Logan, Robert K

    2010-01-01

    A distinction is made between the brain and the mind. The architecture of the mind and language is then described within a neo-dualistic framework. A model for the origin of language based on emergence theory is presented. The complexity of hominid existence due to tool making, the control of fire and the social cooperation that fire required gave rise to a new level of order in mental activity and triggered the simultaneous emergence of language and conceptual thought. The mind is shown to have emerged as a bifurcation of the brain with the emergence of language. The role of language in the evolution of human culture is also described. PMID:20922045

  9. Architecture, constraints, and behavior

    PubMed Central

    Doyle, John C.; Csete, Marie

    2011-01-01

    This paper aims to bridge progress in neuroscience involving sophisticated quantitative analysis of behavior, including the use of robust control, with other relevant conceptual and theoretical frameworks from systems engineering, systems biology, and mathematics. Familiar and accessible case studies are used to illustrate concepts of robustness, organization, and architecture (modularity and protocols) that are central to understanding complex networks. These essential organizational features are hidden during normal function of a system but are fundamental for understanding the nature, design, and function of complex biologic and technologic systems. PMID:21788505

  10. Etruscan Divination and Architecture

    NASA Astrophysics Data System (ADS)

    Magli, Giulio

    The Etruscan religion was characterized by divination methods, aimed at interpreting the will of the gods. These methods were revealed by the gods themselves and written in the books of the Etrusca Disciplina. The books are lost, but parts of them are preserved in the accounts of later Latin sources. According to such traditions divination was tightly connected with the Etruscan cosmovision of a Pantheon distributed in equally spaced, specific sectors of the celestial realm. We explore here the possible reflections of such issues in the Etruscan architectural remains.

  11. TROPIX Power System Architecture

    NASA Technical Reports Server (NTRS)

    Manner, David B.; Hickman, J. Mark

    1995-01-01

    This document contains results obtained in the process of performing a power system definition study of the TROPIX power management and distribution system (PMAD). Requirements derived from the PMAD's interaction with other spacecraft systems are discussed first. Since the design is dependent on the performance of the photovoltaics, there is a comprehensive discussion of the appropriate models for cells and arrays. A trade study of the array operating voltage and its effect on array bus mass is also presented. A system architecture is developed which makes use of a combination of high-efficiency switching power converters and analog regulators. Mass and volume estimates are presented for all subsystems.

  12. Evolution of genome architecture.

    PubMed

    Koonin, Eugene V

    2009-02-01

    Charles Darwin believed that all traits of organisms have been honed to near perfection by natural selection. The empirical basis underlying Darwin's conclusions consisted of numerous observations made by him and other naturalists on the exquisite adaptations of animals and plants to their natural habitats and on the impressive results of artificial selection. Darwin fully appreciated the importance of heredity but was unaware of the nature and, in fact, the very existence of genomes. A century and a half after the publication of the "Origin", we have the opportunity to draw conclusions from the comparisons of hundreds of genome sequences from all walks of life. These comparisons suggest that the dominant mode of genome evolution is quite different from that of the phenotypic evolution. The genomes of vertebrates, those purported paragons of biological perfection, turned out to be veritable junkyards of selfish genetic elements where only a small fraction of the genetic material is dedicated to encoding biologically relevant information. In sharp contrast, genomes of microbes and viruses are incomparably more compact, with most of the genetic material assigned to distinct biological functions. However, even in these genomes, the specific genome organization (gene order) is poorly conserved. The results of comparative genomics lead to the conclusion that the genome architecture is not a straightforward result of continuous adaptation but rather is determined by the balance between the selection pressure, that is itself dependent on the effective population size and mutation rate, the level of recombination, and the activity of selfish elements. Although genes and, in many cases, multigene regions of genomes possess elaborate architectures that ensure regulation of expression, these arrangements are evolutionarily volatile and typically change substantially even on short evolutionary scales when gene sequences diverge minimally. Thus, the observed genome

  13. Towards a Domain Specific Software Architecture for Scientific Data Distribution

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Lindholm, D. M.

    2011-12-01

    A reference architecture is a "design that satisfies a clearly distinguished subset of the functional capabilities identified in the reference requirements within the boundaries of certain design and implementation constraints, also identified in reference requirements." [Tracz, 1995] Recognizing the value of a reference architecture, NASA's ESDSWG's Standards Process Group (SPG) is introducing a multi-disciplinary science data systems (SDS) reference architecture in order to provide an implementation-neutral template solution for an architecture to support scientific data systems in general [Burnett, et al, 2011]. This reference architecture describes common features and patterns in scientific data systems, and can thus provide guidelines in building and improving such systems. But guidelines alone may not be sufficient to actually build a system. A domain specific software architecture (DSSA) is "an assemblage of software components, specialized for a particular type of task (domain), generalized for effective use across that domain, composed in a standardized structure (topology) effective for building successful applications." [Tracz, 1995]. It can be thought of as a relatively specific reference architecture. The "DSSA Process" is a software life cycle developed at Carnegie Mellon's Software Engineering Institute that is based on the development and use of domain-specific software architectures, components, and tools. The process has four distinct activities: 1) develop a domain specific base/model, 2) populate and maintain the library, 3) build applications, 4) operate and maintain applications [Armitage, 1993]. The DSSA process may provide the missing link between guidelines and actual system construction. In this presentation we focus specifically on the realm of scientific data access and distribution. Assuming the role of domain experts in building data access systems, we report the results of creating a DSSA for scientific data distribution. We describe

  14. Options for a lunar base surface architecture

    NASA Astrophysics Data System (ADS)

    Roberts, Barney B.

    1992-02-01

    The Planet Surface Systems Office at the NASA Johnson Space Center has participated in an analysis of the Space Exploration Initiative architectures described in the Synthesis Group report. This effort involves a Systems Engineering and Integration effort to define point designs for evolving lunar and Mars bases that support substantial science, exploration, and resource production objectives. The analysis addresses systems-level designs; element requirements and conceptual designs; assessments of precursor and technology needs; and overall programmatics and schedules. This paper focuses on the results of the study of the Space Resource Utilization Architecture. This architecture develops the capability to extract useful materials from the indigenous resources of the Moon and Mars. On the Moon, a substantial infrastructure is emplaced which can support a crew of up to twelve. Two major process lines are developed: one produces oxygen, ceramics, and metals; the other produces hydrogen, helium, and other volatiles. The Moon is also used for a simulation of a Mars mission. Significant science capabilities are established in conjunction with resource development. Exploration includes remote global surveys and piloted sorties of local and regional areas. Science accommodations include planetary science, astronomy, and biomedical research. Greenhouses are established to provide a substantial amount of food needs.

  15. Medical nanorobot architecture based on nanobioelectronics.

    PubMed

    Cavalcanti, Adriano; Shirinzadeh, Bijan; Freitas, Robert A; Kretly, Luiz C

    2007-01-01

    This work describes an innovative medical nanorobot architecture based on important discoveries in nanotechnology, integrated circuit patents, and some publications, directly or indirectly related to one of the most challenging new fields of science: molecular machines. Thus, the architecture described in this paper reflects, and is supported by, some remarkable recent achievements and patents in nanoelectronics, wireless communication and power transmission techniques, nanotubes, lithography, biomedical instrumentation, genetics, and photonics. We also describe how medicine can benefit from the joint development of nanodevices which are derived, and which integrate techniques, from artificial intelligence, nanotechnology, and embedded smart sensors. Teleoperated surgical procedures, early disease diagnosis, and pervasive patient monitoring are some possible applications of nanorobots, reflecting progress along a roadmap for the gradual and practical development of nanorobots. To illustrate the described nanorobot architecture, a computational 3D approach with the application of nanorobots for diabetes is simulated using clinical data. Theoretical and practical analysis of system integration modeling is one important aspect for supporting the rapid development in the emerging field of nanotechnology. This provides useful directions for further research and development of medical nanorobotics and suggests a time frame in which nanorobots may be expected to be available for common utilization in therapeutic and medical procedures. PMID:19076015

  16. A Geosynchronous Orbit Optical Communications Relay Architecture

    NASA Technical Reports Server (NTRS)

    Edwards, Bernard L.; Israel, David J.

    2014-01-01

    NASA is planning to fly a Next Generation Tracking and Data Relay Satellite (TDRS) next decade. While the requirements and architecture for that satellite are unknown at this time, NASA is investing in communications technologies that could be deployed on the satellite to provide new communications services. One of those new technologies is optical communications. The Laser Communications Relay Demonstration (LCRD) project, scheduled for launch in December 2017 as a hosted payload on a commercial communications satellite, is a critical pathfinder towards NASA providing optical communications services on the Next Generation TDRS. While it is obvious that a small to medium sized optical communications terminal could be flown on a GEO satellite to provide support to Near Earth missions, it is also possible to deploy a large terminal on the satellite to support Deep Space missions. Onboard data processing and Delay Tolerant Networking (DTN) are two additional technologies that could be used to optimize optical communications link services and enable additional mission and network operations. This paper provides a possible architecture for the optical communications augmentation of a Next Generation TDRS and touches on the critical technology work currently being done at NASA. It will also describe the impact of clouds on such an architecture and possible mitigation techniques.

  17. Architectures Toward Reusable Science Data Systems

    NASA Astrophysics Data System (ADS)

    Moses, J. F.

    2014-12-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building ground systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research, NOAA's weather satellites and USGS's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience the goal is to recognize architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.

  18. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John

    2015-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience we expect to find architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.

  19. Options for a lunar base surface architecture

    NASA Technical Reports Server (NTRS)

    Roberts, Barney B.

    1992-01-01

    The Planet Surface Systems Office at the NASA Johnson Space Center has participated in an analysis of the Space Exploration Initiative architectures described in the Synthesis Group report. This effort involves a Systems Engineering and Integration effort to define point designs for evolving lunar and Mars bases that support substantial science, exploration, and resource production objectives. The analysis addresses systems-level designs; element requirements and conceptual designs; assessments of precursor and technology needs; and overall programmatics and schedules. This paper focuses on the results of the study of the Space Resource Utilization Architecture. This architecture develops the capability to extract useful materials from the indigenous resources of the Moon and Mars. On the Moon, a substantial infrastructure is emplaced which can support a crew of up to twelve. Two major process lines are developed: one produces oxygen, ceramics, and metals; the other produces hydrogen, helium, and other volatiles. The Moon is also used for a simulation of a Mars mission. Significant science capabilities are established in conjunction with resource development. Exploration includes remote global surveys and piloted sorties of local and regional areas. Science accommodations include planetary science, astronomy, and biomedical research. Greenhouses are established to provide a substantial amount of food needs.

  1. Architectures for intelligent machines

    NASA Technical Reports Server (NTRS)

    Saridis, George N.

    1991-01-01

    The theory of intelligent machines has been recently reformulated to incorporate new architectures that use neural and Petri nets. The analytic functions of an intelligent machine are implemented by intelligent controls, using entropy as a measure. The resulting hierarchical control structure is based on the principle of increasing precision with decreasing intelligence. Each of the three levels of the intelligent control uses a different architecture in order to satisfy the requirements of the principle: the organization level is modeled after a Boltzmann machine for abstract reasoning, task planning, and decision making; the coordination level is composed of a number of Petri net transducers supervised, for command exchange, by a dispatcher, which also serves as an interface to the organization level; the execution level includes the sensory, navigation-planning, and control hardware, which interacts one-to-one with the appropriate coordinators, while a VME bus provides a channel for database exchange among the several devices. This system is currently implemented on a robotic transporter, designed for space construction at the CIRSSE laboratories at the Rensselaer Polytechnic Institute. The progress of its development is reported.

  2. Rutger's CAM2000 chip architecture

    NASA Technical Reports Server (NTRS)

    Smith, Donald E.; Hall, J. Storrs; Miyake, Keith

    1993-01-01

    This report describes the architecture and instruction set of the Rutgers CAM2000 memory chip. The CAM2000 combines features of Associative Processing (AP), Content Addressable Memory (CAM), and Dynamic Random Access Memory (DRAM) in a single chip package that is not only DRAM compatible but also capable of applying simple massively parallel operations to memory. This document reflects the current status of the CAM2000 architecture and is continually updated to reflect the current state of the architecture and instruction set.
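
    A content-addressable memory is queried by value rather than by address: every stored word is compared against a search key in parallel, and the addresses of the matching words are returned. The toy Python sketch below emulates that behaviour sequentially; it is illustrative only and does not reflect the CAM2000 instruction set.

    ```python
    def cam_search(memory, key, mask=0xFFFF):
        """Emulate a masked CAM lookup: return the addresses of all words
        whose masked bits equal the masked search key. In a real CAM chip
        these comparisons happen in parallel across every word."""
        return [addr for addr, word in enumerate(memory)
                if (word & mask) == (key & mask)]

    memory = [0x1234, 0xBEEF, 0x12FF, 0x0000]
    print(cam_search(memory, key=0x1200, mask=0xFF00))  # -> [0, 2]
    ```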

  3. Software synthesis using generic architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    A framework for synthesizing software systems based on abstracting software system designs and the design process is described. The result of such an abstraction process is a generic architecture and the process knowledge for customizing the architecture. The customization process knowledge is used to assist a designer in customizing the architecture as opposed to completely automating the design of systems. Our approach using an implemented example of a generic tracking architecture which was customized in two different domains is illustrated. How the designs produced using KASE compare to the original designs of the two systems, and current work and plans for extending KASE to other application areas are described.

  4. A resource management architecture for metacomputing systems.

    SciTech Connect

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.
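
    The extensible resource specification language mentioned above expresses job requirements as attribute-value pairs that the broker and co-allocator components can interpret. The sketch below parses an RSL-like string into a dictionary; the syntax shown is a simplified illustration, not the exact Globus grammar.

    ```python
    import re

    def parse_spec(spec):
        """Parse a simple '&(attr=value)(attr=value)' requirement string
        into a dict. Illustrative only; a real specification language also
        supports nesting, multi-requests, and richer value types."""
        return dict(re.findall(r"\((\w+)\s*=\s*([^)]+)\)", spec))

    spec = "&(executable=/bin/simulate)(count=64)(max_wall_time=30)(queue=batch)"
    req = parse_spec(spec)
    print(req["count"], req["queue"])  # -> 64 batch
    ```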

  5. Roadmap to the SRS computing architecture

    SciTech Connect

    Johnson, A.

    1994-07-05

    This document outlines the major steps that must be taken by the Savannah River Site (SRS) to migrate the SRS information technology (IT) environment to the new architecture described in the Savannah River Site Computing Architecture. This document proposes an IT environment that is {open_quotes}...standards-based, data-driven, and workstation-oriented, with larger systems being utilized for the delivery of needed information to users in a client-server relationship.{close_quotes} Achieving this vision will require many substantial changes in the computing applications, systems, and supporting infrastructure at the site. This document consists of a set of roadmaps which provide explanations of the necessary changes for IT at the site and describes the milestones that must be completed to finish the migration.

  6. Architectural Implications for Spatial Object Association Algorithms*

    PubMed Central

    Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste

    2013-01-01

    Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
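
    The core of a positional crossmatch is a neighbour search within a matching radius. The sketch below performs such a match with a k-d tree over small, flat patches of sky; it is a simplified stand-in for the survey-scale algorithms evaluated in the paper, which run inside the database systems themselves.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def crossmatch(coords_a, coords_b, radius):
        """Return (i, j) index pairs where object i in catalogue A lies
        within `radius` of object j in catalogue B. Coordinates are treated
        as flat (tangent-plane) positions; a full-sky crossmatch would use
        angular separations instead."""
        tree = cKDTree(coords_b)
        pairs = []
        for i, point in enumerate(coords_a):
            for j in tree.query_ball_point(point, r=radius):
                pairs.append((i, j))
        return pairs

    a = np.array([[10.001, 41.002], [150.500, 2.300]])
    b = np.array([[10.000, 41.000], [187.000, 12.000]])
    print(crossmatch(a, b, radius=0.01))  # -> [(0, 0)]
    ```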

  7. Exploration Architecture Options - ECLSS, TCS, EVA Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don

    2011-01-01

    Many options for exploration of space have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. The Augustine Commission evaluated human space flight for the Obama administration, and then the Human Exploration Framework Teams (HEFT and HEFT2) evaluated potential exploration missions and the infrastructure and technology needs for those missions. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to, and then inhabit and explore, the Moon. This paper will evaluate the options for exploration of space for the implications of architectures on the Environmental Control and Life Support System (ECLSS), Thermal Control System (TCS), and Extravehicular Activity (EVA) systems.

  8. Architectural Implications for Spatial Object Association Algorithms

    SciTech Connect

    Kumar, V S; Kurc, T; Saltz, J; Abdulla, G; Kohn, S R; Matarazzo, C

    2009-01-29

    Spatial object association, also referred to as cross-match of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST).

  9. A computational architecture for social agents

    SciTech Connect

    Bond, A.H.

    1996-12-31

    This article describes a new class of information-processing models for social agents. They are derived from primate brain architecture, the processing in brain regions, the interactions among brain regions, and the social behavior of primates. In another paper, we have reviewed the neuroanatomical connections and functional involvements of cortical regions. We reviewed the evidence for a hierarchical architecture in the primate brain. By examining neuroanatomical evidence for connections among neural areas, we were able to establish anatomical regions and connections. We then examined evidence for specific functional involvements of the different neural areas and found some support for hierarchical functioning, not only for the perception hierarchies but also for the planning and action hierarchy in the frontal lobes.

  10. Integrated Network Architecture for NASA's Orion Missions

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul B.; Hayden, Jeffrey L.; Sartwell, Thomas; Miller, Ronald A.; Hudiburg, John J.

    2008-01-01

    NASA is planning a series of short and long duration human and robotic missions to explore the Moon and then Mars. The series of missions will begin with a new crew exploration vehicle (called Orion) that will initially provide crew exchange and cargo supply support to the International Space Station (ISS) and then become a human conveyance for travel to the Moon. The Orion vehicle will be mounted atop the Ares I launch vehicle for a series of pre-launch tests and then launched and inserted into low Earth orbit (LEO) for crew exchange missions to the ISS. The Orion and Ares I comprise the initial vehicles in the Constellation system of systems that later includes Ares V, the Earth departure stage, the lunar lander, and other lunar surface systems for the lunar exploration missions. These key systems will enable the lunar surface exploration missions to be initiated in 2018. The complexity of the Constellation system of systems and missions will require a communication and navigation infrastructure to provide low and high rate forward and return communication services, tracking services, and ground network services. The infrastructure must provide robust, reliable, safe, sustainable, and autonomous operations at minimum cost while maximizing the exploration capabilities and science return. The infrastructure will be based on a network of networks architecture that will integrate NASA legacy communication, modified elements, and navigation systems. New networks will be added to extend communication, navigation, and timing services for the Moon missions. Internet protocol (IP) and network management systems within the networks will enable interoperability throughout the Constellation system of systems. An integrated network architecture has been developed based on the emerging Constellation requirements for Orion missions. The architecture, as presented in this paper, addresses the early Orion missions to the ISS with communication, navigation, and network services over five

  11. Architecture for web-based image processing

    NASA Astrophysics Data System (ADS)

    Srini, Vason P.; Pini, David; Armstrong, Matt D.; Alalusi, Sayf H.; Thendean, John; Ueng, Sain-Zee; Bushong, David P.; Borowski, Erek S.; Chao, Elaine; Rabaey, Jan M.

    1997-09-01

    A computer systems architecture for processing medical images and other data coming over the Web is proposed. The architecture comprises a Java engine for communicating images over the Internet, storing data in local memory, doing floating point calculations, and a coprocessor MIMD parallel DSP for doing fine-grained operations found in video, graphics, and image processing applications. The local memory is shared between the Java engine and the parallel DSP. Data coming from the Web is stored in the local memory. This approach avoids the frequent movement of image data between a host processor's memory and an image processor's memory, found in many image processing systems. A low-power, high-performance parallel DSP architecture containing a large number of processors interconnected by a segmented hierarchical network has been developed. The instruction set of the 16-bit processor supports video, graphics, and image processing calculations. Two's complement arithmetic, saturation arithmetic, and packed instructions are supported. Higher data precision such as 32-bit and 64-bit can be achieved by cascading processors. A VLSI chip implementation of the architecture containing 64 processors organized in 16 clusters and interconnected by a statically programmable hierarchical bus is in progress. The buses are segmentable by programming switches on the bus. The instruction memory of each processor has sixteen 40-bit words. Data streaming through the processor is manipulated by the instructions. Multiple operations can be performed in a single cycle in a processor. A low-power handshake protocol is used for synchronization between the sender and the receiver of data. Temporary storage for data and filter coefficients is provided in each chip. A 256 by 16 memory unit is included in each of the 16 clusters. The memory unit can be used as a delay line, FIFO, lookup table or random access memory. The architecture is scalable with technology. Portable multimedia terminals like U
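
    Saturation arithmetic clamps results at the representable limits instead of wrapping around, which is why it is valuable for pixel data, and packed instructions apply one operation to several narrow values held in a single word. The short sketch below emulates a saturating packed add over four 8-bit pixels; it illustrates the concept only and is not the proposed DSP's instruction set.

    ```python
    def saturating_add_u8(a, b):
        """Add two 8-bit values, clamping at 255 instead of wrapping."""
        return min(a + b, 255)

    def packed_add_u8(word_a, word_b):
        """Apply a saturating add lane-by-lane to two 32-bit words holding
        four packed 8-bit pixels each, mimicking a SIMD 'packed' instruction."""
        result = 0
        for lane in range(4):
            a = (word_a >> (8 * lane)) & 0xFF
            b = (word_b >> (8 * lane)) & 0xFF
            result |= saturating_add_u8(a, b) << (8 * lane)
        return result

    # Brightening four pixels (0x10, 0x80, 0xF0, 0xFF) by 0x20 each:
    print(hex(packed_add_u8(0xFFF08010, 0x20202020)))  # -> 0xffffa030
    ```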

  12. Instrument calibration architecture of Radar Imaging Satellite (RISAT-1)

    NASA Astrophysics Data System (ADS)

    Misra, T.; Bhan, R.; Putrevu, D.; Mehrotra, P.; Nandy, P. S.; Shukla, S. D.; Rao, C. V. N.; Dave, D. B.; Desai, N. M.

    2016-05-01

    The Radar Imaging Satellite (RISAT-1) payload system is configured to perform self-calibration of transmit and receive paths before and after imaging sessions through a special instrument calibration technique. The instrument calibration architecture of RISAT-1 supported ground verification and validation of the payload, including the active array antenna. During on-ground validation of the 126 beams of the active array antenna, which required precise calibration of boresight pointing, a unique method called "collimation coefficient error estimation" was utilized. This method of antenna calibration was supported by the special hardware and software calibration architecture of RISAT-1. This paper concentrates on the RISAT-1 hardware and software architecture which supports in-orbit and on-ground instrument calibration. We also highlight the use of the special calibration scheme of the RISAT-1 instrument to evaluate system response during ground verification and validation.

  13. Integrated Operations Architecture Technology Assessment Study

    NASA Technical Reports Server (NTRS)

    2001-01-01

    As part of NASA's Integrated Operations Architecture (IOA) Baseline, NASA will consolidate all communications operations, including ground-based, near-earth, and deep-space communications, into a single integrated network. This network will make maximum use of commercial equipment, services, and standards. It will be an Internet Protocol (IP) based network. This study supports technology development planning for the IOA. The technical problems that may arise when LEO mission spacecraft interoperate with commercial satellite services were investigated. Commercial technology and services that could support the IOA were surveyed, and gaps in the capability of existing technology and techniques were identified. Recommendations were made on which gaps should be closed by means of NASA research and development funding. Several findings emerged from the interoperability assessment: in the NASA mission set, there is a preponderance of small, inexpensive, low-data-rate science missions; proposed commercial satellite communications services could potentially provide TDRSS-like data relay functions; and IP and related protocols, such as TCP, require augmentation to operate in the mobile networking environment required by the space-to-ground portion of the IOA. Five case studies were performed in the technology assessment. Each case represented a realistic implementation of the near-earth portion of the IOA. The cases included the use of frequencies at L-band, Ka-band, and the optical spectrum. The cases also represented both space relay architectures and direct-to-ground architectures. Some of the main recommendations resulting from the case studies are: select an architecture for the LEO/MEO communications network; pursue the development of a Ka-band space-qualified transmitter (and possibly a receiver), and a low-cost Ka-band ground terminal for a direct-to-ground network; and pursue the development of an Inmarsat (L-band) space-qualified transceiver to implement a global, low

  14. 9. Photocopy of architectural drawing (from National Archives Architectural and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. Photocopy of architectural drawing (from National Archives Architectural and Cartographic Branch, Alexandria, Va.) Annotated lithograph on paper. Standard plan used for construction of Commissary Sergeants Quarters, 1876. PLAN, FRONT AND SIDE ELEVATIONS, SECTION - Fort Myer, Commissary Sergeant's Quarters, Washington Avenue between Johnson Lane & Custer Road, Arlington, Arlington County, VA

  15. The Architecture of Exoplanets

    NASA Astrophysics Data System (ADS)

    Hatzes, Artie P.

    2016-05-01

    Prior to the discovery of exoplanets, our expectations of their architecture were largely driven by the properties of our solar system. We expected giant planets to lie in the outer regions and rocky planets in the inner regions. Planets should probably only occupy orbital distances of 0.3-30 AU from the star. Planetary orbits should be circular, prograde, and in the same plane. The reality of exoplanets has shattered these expectations. Jupiter-mass, Neptune-mass, Superearth, and even Earth-mass planets can orbit within 0.05 AU of the star, sometimes with orbital periods of less than one day. Exoplanetary orbits can be eccentric, misaligned, and even retrograde. Radial velocity surveys gave the first hints that the occurrence rate increases with decreasing mass. This was put on a firm statistical basis with the Kepler mission, which clearly demonstrated that there are more Neptune- and Superearth-sized planets than Jupiter-sized planets. These are often in multiple, densely packed systems where the planets all orbit within 0.3 AU of the star, a result also suggested by radial velocity surveys. Exoplanets also exhibit diversity along the main sequence. Massive stars tend to have a higher frequency of planets (≈20-25%) that tend to be more massive (M ≈ 5-10 M_Jup). Giant planets around low-mass stars are rare, but these stars show an abundance of small (Neptune and Superearth) planets in multiple systems. Planet formation is also not restricted to single stars, as the Kepler mission has discovered several circumbinary planets. Although we have learned much about the architecture of planets over the past 20 years, we know little about the census of small planets at relatively large (a > 1 AU) orbital distances. We have yet to find a planetary system that is analogous to our own solar system. The question of how unique the properties of our own solar system are remains unanswered. Advancements in the detection methods of small planets over a wide range

  16. Space station needs, attributes and architectural options study. Volume 4: Architectural options, subsystems, technology and programmatics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Space station architectural options, habitability considerations and subsystem analyses, technology, and programmatics are reviewed. The methodology employed for conceiving and defining space station concepts is presented. As a result of this approach, architectures were conceived and, along with their supporting rationale, are described within this portion of the report. The habitability considerations and subsystem analyses describe the human factors associated with space station operations and include subsections covering (1) data management, (2) communications and tracking, (3) environmental control and life support, (4) manipulator systems, (5) resupply, (6) pointing, (7) thermal management, and (8) interface standardization. A consolidated matrix of subsystems technology issues as related to meeting the mission needs for a 1990's era space station is presented. Within the programmatics portion, a brief description of costing and program strategies is outlined.

  17. Planning in subsumption architectures

    NASA Technical Reports Server (NTRS)

    Chalfant, Eugene C.

    1994-01-01

    A subsumption planner using a parallel distributed computational paradigm based on the subsumption architecture for control of real-world capable robots is described. Virtual sensor state space is used as a planning tool to visualize the robot's anticipated effect on its environment. Decision sequences are generated based on the environmental situation expected at the time the robot must commit to a decision. Between decision points, the robot performs in a preprogrammed manner. A rudimentary, domain-specific partial world model contains enough information to extrapolate the end results of the rote behavior between decision points. A collective network of predictors operates in parallel with the reactive network, forming a recurrent network which generates plans as a hierarchy. Details of a plan segment are generated only when its execution is imminent. The use of the subsumption planner is demonstrated by a simple maze navigation problem.
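
    In a subsumption-style controller, behaviours are layered by priority and a higher layer suppresses the output of the layers beneath it. The minimal arbitration sketch below (with hypothetical behaviour names) shows how a decision point might select among such layered behaviours.

    ```python
    def subsumption_arbitrate(behaviors, sensors):
        """Evaluate behaviours from highest to lowest priority and return
        the first command produced; higher layers thereby subsume lower ones."""
        for applies, command in behaviors:
            if applies(sensors):
                return command(sensors)
        return "idle"

    # Layered maze-navigation behaviours, highest priority first.
    behaviors = [
        (lambda s: s["front_blocked"], lambda s: "turn_left"),     # avoid walls
        (lambda s: s["goal_visible"],  lambda s: "drive_to_goal"),
        (lambda s: True,               lambda s: "wander"),        # default layer
    ]
    print(subsumption_arbitrate(behaviors, {"front_blocked": False, "goal_visible": True}))
    # -> drive_to_goal
    ```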

  18. BioArchitecture

    PubMed Central

    Gunning, Peter

    2012-01-01

    BioArchitecture is a term used to describe the organization and regulation of biological space. It applies to the principles which govern the structure of molecules, polymers and multiprotein complexes, organelles, membranes and their organization in the cytoplasm and the nucleus. It also covers the integration of cells into their three dimensional environment at the level of cell-matrix and cell-cell interactions, integration into tissue/organ structure and function, and finally into the structure of the organism. This review will highlight studies at all these levels which are providing a new way to think about the relationship between the organization of biological space and the function of biological systems. PMID:23267413

  19. The architecture of personality.

    PubMed

    Cervone, David

    2004-01-01

    This article presents a theoretical framework for analyzing psychological systems that contribute to the variability, consistency, and cross-situational coherence of personality functioning. In the proposed knowledge-and-appraisal personality architecture (KAPA), personality structures and processes are delineated by combining 2 principles: distinctions (a) between knowledge structures and appraisal processes and (b) among intentional cognitions with varying directions of fit, with the latter distinction differentiating among beliefs, evaluative standards, and aims. Basic principles of knowledge activation and use illuminate relations between knowledge and appraisal, yielding a synthetic account of personality structures and processes. Novel empirical data illustrate the heuristic value of the knowledge/appraisal distinction by showing how self-referent and situational knowledge combine to foster cross-situational coherence in appraisals of self-efficacy. PMID:14756593

  20. Functional Biomimetic Architectures

    NASA Astrophysics Data System (ADS)

    Levine, Paul M.

    N-substituted glycine oligomers, or 'peptoids,' are a class of sequence-specific foldamers composed of tertiary amide linkages, engendering proteolytic stability and enhanced cellular permeability. Peptoids are notable for their facile synthesis, sequence diversity, and ability to fold into distinct secondary structures. In an effort to establish new functional peptoid architectures, we utilize the copper-catalyzed azide-alkyne [3+2] cycloaddition (CuAAC) reaction to generate peptidomimetic assemblies bearing bioactive ligands that specifically target and modulate Androgen Receptor (AR) activity, a major therapeutic target for prostate cancer. Additionally, we explore chemical ligation protocols to generate semi-synthetic hybrid biomacromolecules capable of exhibiting novel structures and functions not accessible to fully biosynthesized proteins.

  1. Power Systems Control Architecture

    SciTech Connect

    James Davidson

    2005-01-01

    A diagram provided in the report depicts the complexity of the power systems control architecture used by the national power structure. It shows the structural hierarchy and the relationship of each system to the other systems interconnected with it. Each of these levels provides a different focus for vulnerability testing and has its own weaknesses. In evaluating each level, of prime concern is what vulnerabilities exist that provide a path into the system, either to cause the system to malfunction or to take control of a field device. An additional vulnerability to consider is whether the system can be compromised in such a manner that the attacker can obtain critical information about the system and the portion of the national power structure that it controls.

  2. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics' previous experience and accumulated knowledge of the multiprocessor field are used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined, and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance are discussed extensively, with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of the various register designs adopted by current computers and those of advanced design.
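
    The hit-ratio analysis referred to above reduces to a weighted average of cache and main-memory access times. A small worked example follows, with illustrative timing figures rather than those of the SUMC study.

    ```python
    def effective_access_time(hit_ratio, t_cache, t_memory):
        """Average memory access time for a processor with a private cache:
        hits cost t_cache, misses cost the main-memory access time."""
        return hit_ratio * t_cache + (1.0 - hit_ratio) * t_memory

    # With a 100 ns cache and a 1000 ns shared memory, raising the hit ratio
    # from 0.80 to 0.95 cuts the average access time roughly in half.
    for h in (0.80, 0.95):
        print(h, effective_access_time(h, 100, 1000))  # -> 280.0 ns, 145.0 ns
    ```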

  3. Kepler Science Operations Center Architecture

    NASA Technical Reports Server (NTRS)

    Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal; Allen, Christopher; Hall, Jennifer; Ibrahim, Khadeejah; Clarke, Bruce; Li, Jie; McCauliff, Sean; Quintana, Elisa; Sommers, Jeneen; Stroozas, Brett; Tenenbaum, Peter; Twicken, Joseph; Wu, Hayley; Caldwell, Doug; Bryson, Stephen; Bhavsar,Paresh

    2010-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
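
    The pipeline organises processing as an ordered sequence of software modules driven by a job-management framework. The sketch below shows that chaining pattern in the abstract; the stand-in stages are hypothetical and greatly simplified relative to the actual SOC modules.

    ```python
    def run_pipeline(raw_pixels, modules):
        """Thread a data product through an ordered list of pipeline modules,
        mirroring how a job-management framework sequences processing steps."""
        product = raw_pixels
        for module in modules:
            product = module(product)
        return product

    # Hypothetical stand-ins for pixel calibration, aperture photometry, and
    # flux normalisation stages.
    modules = [
        lambda pixels: [p - 10 for p in pixels],   # remove a bias level
        lambda pixels: sum(pixels),                # sum the photometric aperture
        lambda flux: flux / 1000.0,                # normalise the flux
    ]
    print(run_pipeline([1010, 1020, 1030], modules))  # -> 3.03
    ```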

  4. Kepler Science Operations Center architecture

    NASA Astrophysics Data System (ADS)

    Middour, Christopher; Klaus, Todd C.; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal; Allen, Christopher; Hall, Jennifer; Ibrahim, Khadeejah; Clarke, Bruce; Li, Jie; McCauliff, Sean; Quintana, Elisa; Sommers, Jeneen; Stroozas, Brett; Tenenbaum, Peter; Twicken, Joseph; Wu, Hayley; Caldwell, Doug; Bryson, Stephen; Bhavsar, Paresh; Wu, Michael; Stamper, Brian; Trombly, Terry; Page, Christopher; Santiago, Elaine

    2010-07-01

    We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We show how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Processing Pipeline.

  5. Mars Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Jordan, James F.; Miller, Sylvia L.

    2000-01-01

    The architecture of NASA's program of robotic Mars exploration missions received intense scrutiny during the summer months of 1998. We present here the results of that scrutiny and describe a list of Mars exploration missions which are now being proposed by the nation's space agency. The heart of the new program architecture consists of missions which will return samples of Martian rocks and soil to Earth for analysis. A primary scientific goal for these missions is to understand Mars as a possible abode of past or present life. The current level of sophistication for detecting markers of biological processes and fossil or extant life forms is much higher in Earth-based laboratories than is possible with remotely deployed instrumentation, and will remain so for at least the next decade. Hence, bringing Martian samples back to Earth is considered the best way to search for the desired evidence. A Mars sample return mission takes approximately three years to complete. Transit from Earth to Mars requires almost a year. After almost a year at Mars, during which orbital and surface operations can take place and the correct return launch energy constraints are met, a Mars-to-Earth return flight can be initiated. This return leg also takes approximately one year. Opportunities to launch these 3-year sample return missions occur about every 2 years. The figure depicts schedules for flights to and from Mars for Earth launches in 2003, 2005, 2007, and 2009. Transits for less than a 180 deg flight angle, measured from the sun, and more than 180 deg are both shown.
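
    The roughly two-year spacing of launch opportunities follows from the Earth-Mars synodic period, which can be checked with a short calculation:

    ```python
    def synodic_period(t_inner_days, t_outer_days):
        """Time between successive identical Earth-Mars alignments."""
        return 1.0 / (1.0 / t_inner_days - 1.0 / t_outer_days)

    t_syn = synodic_period(365.25, 687.0)   # orbital periods of Earth and Mars
    print(round(t_syn), round(t_syn / 365.25, 2))  # -> 780 days, about 2.14 years
    ```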

  6. SpaceWire Architectures: Present and Future

    NASA Technical Reports Server (NTRS)

    Rakow, Glen Parker

    2006-01-01

    A viewgraph presentation on current and future SpaceWire architectures is shown. The topics include: 1) Current SpaceWire Architectures: Swift Data Flow; 2) Current SpaceWire Architectures: LRO Data Flow; 3) Current SpaceWire Architectures: JWST Data Flow; 4) Current SpaceWire Architectures; 5) Traditional Systems; 6) Future Systems; 7) Advantages; and 8) System Engineer Toolkit.

  7. An OSI architecture for the deep space network

    NASA Technical Reports Server (NTRS)

    Heuser, W. Randy; Cooper, Lynne P.

    1993-01-01

    The flexibility and robustness of a monitor and control system are a direct result of the underlying inter-processor communications architecture. A new architecture for monitor and control at the Deep Space Network Communications Complexes has been developed based on the Open System Interconnection (OSI) standards. The suitability of OSI standards for DSN M&C has been proven in the laboratory. The laboratory success has resulted in choosing an OSI-based architecture for DSS-13 M&C. DSS-13 is the DSN experimental station and is not part of the 'operational' DSN; its role is to provide an environment in which new communications concepts can be tested and unique science experiments conducted. Therefore, DSS-13 must be robust enough to support operational activities, while also being flexible enough to enable experimentation. This paper describes the M&C architecture developed for DSS-13 and the results from system and operational testing.

  8. An epigenetic toolkit allows for diverse genome architectures in eukaryotes.

    PubMed

    Maurer-Alcalá, Xyrus X; Katz, Laura A

    2015-12-01

    Genome architecture varies considerably among eukaryotes in terms of both size and structure (e.g. distribution of sequences within the genome, elimination of DNA during formation of somatic nuclei). The diversity in eukaryotic genome architectures and the dynamic processes are only possible due to the well-developed epigenetic toolkit, which probably existed in the Last Eukaryotic Common Ancestor (LECA). This toolkit may have arisen as a means of navigating the genomic conflict that arose from the expansion of transposable elements within the ancestral eukaryotic genome. This toolkit has been coopted to support the dynamic nature of genomes in lineages across the eukaryotic tree of life. Here we highlight how the changes in genome architecture in diverse eukaryotes are regulated by epigenetic processes, such as DNA elimination, genome rearrangements, and adaptive changes to genome architecture. The ability to epigenetically modify and regulate genomes has contributed greatly to the diversity of eukaryotes observed today.

  9. Architectural Portfolio 2001: Main Winners.

    ERIC Educational Resources Information Center

    American School & University, 2001

    2001-01-01

    Presents descriptions and photographs of the following two American School and University Architectural Portfolio main winners for 2001: Chesterton, Indiana's Chesterton High School and Lied Library at the University of Nevada, Las Vegas. Included are each project's vital statistics, the architectural firm involved, and a list of designers.(GR)

  10. Dynamic Weather Routes Architecture Overview

    NASA Technical Reports Server (NTRS)

    Eslami, Hassan; Eshow, Michelle

    2014-01-01

    This document, Dynamic Weather Routes Architecture Overview, presents the high-level software architecture of DWR, based on the CTAS software framework and the Direct-To automation tool. The document also covers external and internal data flows, the required datasets, changes to the Direct-To software for DWR, collection of software statistics, and the code structure.

  11. Interior Design in Architectural Education

    ERIC Educational Resources Information Center

    Gurel, Meltem O.; Potthoff, Joy K.

    2006-01-01

    The domain of interiors constitutes a point of tension between practicing architects and interior designers. Design of interior spaces is a significant part of the architectural profession. Yet, to what extent does architectural education keep pace with changing demands in rendering topics that are identified as pertinent to the design of interiors?…

  12. Space Elevators Preliminary Architectural View

    NASA Astrophysics Data System (ADS)

    Pullum, L.; Swan, P. A.

    Space Systems Architecture has been expanded into a process by the US Department of Defense for its large-scale systems-of-systems development programs. This paper uses the steps in the process to establish a framework for Space Elevator systems to be developed and provides a methodology to manage complexity. This new approach to developing a family of systems is based upon three architectural views: Operational View (OV), Systems View (SV), and Technical Standards View (TV). The top-level view of the process establishes the stages for the development of the first Space Elevator and is called Architectural View - 1, Overview and Summary. This paper will show the guidelines and steps of the process while focusing upon components of the Space Elevator Preliminary Architecture View. This Preliminary Architecture View is presented as a draft starting point for the Space Elevator Project.

  13. Coral identity underpins architectural complexity on Caribbean reefs.

    PubMed

    Alvarez-Filip, Lorenzo; Dulvy, Nicholas K; Côte, Isabelle M; Watkinson, Andrew R; Gill, Jennifer A

    2011-09-01

    The architectural complexity of ecosystems can greatly influence their capacity to support biodiversity and deliver ecosystem services. Understanding the components underlying this complexity can aid the development of effective strategies for ecosystem conservation. Caribbean coral reefs support and protect millions of livelihoods, but recent anthropogenic change is shifting communities toward reefs dominated by stress-resistant coral species, which are often less architecturally complex. With the regionwide decline in reef fish abundance, it is becoming increasingly important to understand changes in coral reef community structure and function. We quantify the influence of coral composition, diversity, and morpho-functional traits on the architectural complexity of reefs across 91 sites at Cozumel, Mexico. Although reef architectural complexity increases with coral cover and species richness, it is highest on sites that are low in taxonomic evenness and dominated by morpho-functionally important, reef-building coral genera, particularly Montastraea. Sites with similar coral community composition also tend to occur on reefs with very similar architectural complexity, suggesting that reef structure tends to be determined by the same key species across sites. Our findings provide support for prioritizing and protecting particular reef types, especially those dominated by key reef-building corals, in order to enhance reef complexity.

  14. Information architecture: Profile of adopted standards

    SciTech Connect

    1997-09-01

    The Department of Energy (DOE), like other Federal agencies, is under increasing pressure to use information technology to improve efficiency in mission accomplishment as well as delivery of services to the public. Because users and systems have become interdependent, DOE has enterprise-wide needs for common application architectures, communication networks, databases, security, and management capabilities. Users need open systems that provide interoperability of products and portability of people, data, and applications that are distributed throughout heterogeneous computing environments. The level of interoperability necessary requires the adoption of DOE-wide standards, protocols, and best practices. The Department has developed an information architecture and a related standards adoption and retirement process to assist users in developing strategies and plans for acquiring information technology products and services based upon open systems standards that support application software interoperability, portability, and scalability. This set of Departmental Information Architecture standards represents guidance for achieving higher degrees of interoperability within the greater DOE community, business partners, and stakeholders. While these standards are not mandatory, due consideration of their application in contractual matters and their use in technology implementations Department-wide are goals of the Chief Information Officer.

  15. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and a suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.
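
    A shared description file of this kind gives every plugged-in component (terrain generator, rover dynamics, instrument models) the same view of the robot and its environment. The sketch below shows what such a configuration might look like; the field names are hypothetical and do not reflect the actual MST schema.

    ```python
    # Hypothetical shared description of the robot and its environment; each
    # plugged-in component reads the same structure so their assumptions stay
    # consistent across the simulation.
    description = {
        "environment": {"terrain": "rocky_plain", "gravity_m_s2": 3.71},
        "robot": {"wheel_base_m": 1.2, "max_speed_m_s": 0.05},
        "instruments": [{"type": "camera", "fov_deg": 45.0}],
    }

    def validate(desc, required=("environment", "robot", "instruments")):
        """Reject configurations missing a top-level section."""
        missing = [key for key in required if key not in desc]
        if missing:
            raise ValueError(f"description missing sections: {missing}")
        return desc

    validate(description)
    print(description["robot"]["max_speed_m_s"])  # -> 0.05
    ```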

  16. Comparing concepts for electronic health record architectures.

    PubMed

    Blobel, Bernd

    2002-01-01

    Keeping all relevant information directly or indirectly related to a patient's care, electronic health record (EHR) systems are supposed to be the kernel application of any kind of health information system. To facilitate shared care, managed care, or disease management, such EHR systems have to be scalable, portable, distributed, and interoperable, which has to be enabled by a proper architecture supporting informational and functional needs as well. Advanced EHR architectures are based on object-oriented or component-oriented paradigms and use modern tooling to design, specify, implement, and maintain EHR solutions. They reflect not only medical information but also underlying concepts and integrate an extended vocabulary. The most advanced EHR architecture approaches (CEN ENV 13606, G-CPR, the HL7 RIM and derived models, and the Australian GEHR project) are briefly characterised. For comparing the solutions, the ISO RM-ODP, the Generic Component Model, and the CORBA 3 methodology have been used. The HARP methodology for enhancing the current harmonisation of openEHR is briefly discussed.

  17. A layered architecture for critical database design

    SciTech Connect

    Chisholm, G.H.; Swietlik, C.E.

    1997-12-31

    Integrity, security, and safety are desired properties of database systems destined for use in critical applications. These properties are desirable because they determine a system's credibility. However, demonstrating that a system does, in fact, preserve these properties when implemented is a difficult task. The difficulty depends on the complexity of the associated design. The authors explore architectural paradigms that have been demonstrated to reduce system complexity and, thus, reduce the cost associated with certifying that the above properties are present in the final implementation. The approach is based on the tenet that the design is divided into multiple layers. The critical functions and data make up the bottom layer, where the requirements for integrity, security, and safety are most rigid. Certification is dependent on the use of formal methods to specify and analyze the system. Appropriate formal methods are required to support certification that multiple properties are present in the final implementation. These methods must assure a rigid mapping from the top-level specification down through the implementation details. Application of a layered architecture reduces the scope of the design that must be formally specified and analyzed. This paper describes a generic, layered architecture and a formal model for specification and analysis of complex systems that require rigid integrity, security, and safety properties.

  18. On the architecture of spacetime geometry

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Myers, Robert C.

    2014-11-01

    We propose entanglement entropy as a probe of the architecture of spacetime in quantum gravity. We argue that the leading contribution to this entropy satisfies an area law for any sufficiently large region in a smooth spacetime, which, in fact, is given by the Bekenstein-Hawking formula. This conjecture is supported by various lines of evidence from perturbative quantum gravity, simplified models of induced gravity, the AdS/CFT correspondence and loop quantum gravity, as well as Jacobson's ‘thermodynamic’ perspective of gravity.
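
    For reference, the Bekenstein-Hawking formula invoked above assigns to a region bounded by an area A the entropy

    ```latex
    S_{\mathrm{BH}} \,=\, \frac{k_{B}\,c^{3}}{4\,G\,\hbar}\,A \,=\, k_{B}\,\frac{A}{4\,\ell_{P}^{2}},
    \qquad \ell_{P}^{2} \equiv \frac{G\hbar}{c^{3}},
    ```

    so the conjectured leading contribution to the entanglement entropy scales with the boundary area rather than the enclosed volume.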

  19. NASA Laboratory telerobotic manipulator control system architecture

    NASA Technical Reports Server (NTRS)

    Rowe, J. C.; Butler, P. L.; Glassell, R. L.; Herndon, J. N.

    1991-01-01

    In support of the National Aeronautics and Space Administration (NASA) goals to increase the utilization of dexterous robotic systems in space, the Oak Ridge National Laboratory (ORNL) has developed the Laboratory Telerobotic Manipulator (LTM) system. It is a dexterous, dual-arm, force reflecting teleoperator system with robotic features for NASA ground-based research. This paper describes the overall control system architecture, including both the hardware and software. The control system is a distributed, modular, and hierarchical design with flexible expansion capabilities for future enhancements of both the hardware and software.

  20. Planetary cubesats - mission architectures

    NASA Astrophysics Data System (ADS)

    Bousquet, Pierre W.; Ulamec, Stephan; Jaumann, Ralf; Vane, Gregg; Baker, John; Clark, Pamela; Komarek, Tomas; Lebreton, Jean-Pierre; Yano, Hajime

    2016-07-01

    Miniaturisation of technologies over the last decade has made cubesats a valid solution for deep space missions. For example, a spectacular set of 13 cubesats will be delivered in 2018 to a high lunar orbit within the frame of SLS' first flight, referred to as Exploration Mission-1 (EM-1). Each of them will autonomously perform valuable scientific or technological investigations. Other situations are encountered, such as the auxiliary landers/rovers and autonomous camera that will be carried in 2018 to asteroid 1999 JU3 by JAXA's Hayabusa2 probe, and will provide complementary scientific return to their mothership. In this case, cubesats depend on a larger spacecraft for deployment and other resources, such as telecommunication relay or propulsion. For both situations, we will describe in this paper how cubesats can be used as remote observatories (such as NEO detection missions), as technology demonstrators, and how they can perform or contribute to all steps in the Deep Space exploration sequence: measurements during Deep Space cruise, body fly-bys, body orbiters, atmospheric probes (Jupiter probe, Venus atmospheric probes, …), static landers, mobile landers (such as balloons, wheeled rovers, small body rovers, drones, penetrators, floating devices, …), and sample return. We will elaborate on mission architectures for the most promising concepts where cubesat-sized devices offer an advantage in terms of affordability, feasibility, and increase of scientific return.

  1. Array processor architecture

    NASA Technical Reports Server (NTRS)

    Barnes, George H. (Inventor); Lundstrom, Stephen F. (Inventor); Shafer, Philip E. (Inventor)

    1983-01-01

    A high-speed parallel array data processing architecture fashioned under a computational envelope approach includes a data base memory for secondary storage of programs and data, and a plurality of memory modules interconnected to a plurality of processing modules by a connection network of the Omega gender. Programs and data are fed from the data base memory to the plurality of memory modules, and from there the programs are fed through the connection network to the array of processors (one copy of each program for each processor). Execution of the programs occurs with the processors operating normally quite independently of each other in a multiprocessing fashion. For data-dependent operations and other suitable operations, all processors are instructed to finish one given task or program branch before all are instructed to proceed in parallel processing fashion on the next instruction. Even when functioning in the parallel processing mode, however, the processors are not in lockstep but execute their own copy of the program individually unless or until another overall processor array synchronization instruction is issued.
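
    The requirement that all processors finish a given task or branch before any proceeds is a barrier synchronisation. The sketch below shows the pattern with Python threads standing in for the array's processors; it illustrates the synchronisation discipline only, not the Omega-network hardware.

    ```python
    import threading

    barrier = threading.Barrier(4)   # one party per processor in the array
    results = []

    def processor(rank, data):
        # Each processor runs its own copy of the program independently...
        partial = sum(data)
        results.append((rank, partial))
        barrier.wait()               # ...then all wait here before the next,
        # data-dependent phase begins together across the array.

    threads = [threading.Thread(target=processor, args=(r, range(r, r + 3)))
               for r in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(sorted(results))  # -> [(0, 3), (1, 6), (2, 9), (3, 12)]
    ```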

  2. Distributed multiport memory architecture

    NASA Technical Reports Server (NTRS)

    Kohl, W. H. (Inventor)

    1983-01-01

    A multiport memory architecture is disclosed for each of a plurality of task centers connected to a command and data bus. Each task center includes a memory and a plurality of devices which request direct memory access as needed. The memory includes an internal data bus and an internal address bus to which the devices are connected, and direct timing and control logic comprised of a 10-state ring counter for allocating memory access to devices by enabling AND gates connected to the request signal lines of the devices. The outputs of AND gates connected to the same device are combined by OR gates to form an acknowledgement signal that enables the device to address the memory during the next clock period. The length of the ring counter may be effectively lengthened to any multiple of ten to allow for more direct memory access intervals in one repetitive sequence. One device is a network bus adapter (NBA) which serially shifts a data word (8 bits plus control and parity bits) onto the command and data bus during the ten direct memory access intervals after it has been granted access. The NBA is therefore allocated only one access in every ten intervals, which is a predetermined interval for all centers. The ring counters of all centers are periodically synchronized by a DMA SYNC signal to assure that all NBAs are able to function in synchronism for data transfer from one center to another.
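
    A hypothetical software simulation, in Python, of the allocation scheme described above: a 10-state ring counter grants one direct-memory-access interval per clock period, with one fixed slot in every ten reserved for the NBA. The slot assignments and device names are invented for illustration.

      RING_LENGTH = 10
      NBA_SLOT = 0                      # assumed position of the NBA's reserved slot

      def run_ring_counter(requests_by_slot, periods):
          """requests_by_slot maps slot index -> device name requesting access."""
          grants = []
          for clock in range(periods):
              slot = clock % RING_LENGTH            # ring counter state
              if slot == NBA_SLOT:
                  grants.append((clock, "NBA"))     # the NBA's predetermined interval
              elif slot in requests_by_slot:
                  grants.append((clock, requests_by_slot[slot]))
              else:
                  grants.append((clock, "idle"))
          return grants

      if __name__ == "__main__":
          devices = {1: "sensor-A", 2: "sensor-B", 5: "telemetry"}
          for clock, who in run_ring_counter(devices, 20):
              print(f"interval {clock:2d}: {who}")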

  3. Lunar Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Perino, Maria Antonietta

    International space exploration plans foresee, in the coming decades, multiple robotic and human missions to the Moon and robotic missions to Mars, Phobos, and other destinations. Notably, the US has, since the announcement of the US space exploration vision by President G. W. Bush in 2004, made significant progress in the further definition of its exploration programme, focusing in particular on human missions to the Moon. Given the highly demanding nature of these missions, different initiatives have recently been taken at the international level to discuss how the lunar exploration missions currently planned at the national level could fit into a coordinated roadmap and contribute to lunar exploration. Thales Alenia Space - Italia is leading three studies for the European Space Agency focused on the analysis of the transportation, in-space, and surface architectures required to meet ESA-provided stakeholder exploration objectives and requirements. The main result of this activity is the identification of European near-term priorities for exploration missions and European long-term priorities for capability and technology developments related to planetary exploration missions. This paper presents the main results of the studies, drawing a European roadmap for exploration missions and for capability and technology developments related to lunar exploration infrastructure, taking into account the strategic and programmatic indications for exploration coming from ESA as well as the international exploration context.

  4. Superconducting Bolometer Array Architectures

    NASA Technical Reports Server (NTRS)

    Benford, Dominic; Chervenak, Jay; Irwin, Kent; Moseley, S. Harvey; Shafer, Rick; Staguhn, Johannes; Wollack, Ed; Oegerle, William (Technical Monitor)

    2002-01-01

    The next generation of far-infrared and submillimeter instruments require large arrays of detectors containing thousands of elements. These arrays will necessarily be multiplexed, and superconducting bolometer arrays are the most promising present prospect for these detectors. We discuss our current research into superconducting bolometer array technologies, which has recently resulted in the first multiplexed detections of submillimeter light and the first multiplexed astronomical observations. Prototype arrays containing 512 pixels are in production using the Pop-Up Detector (PUD) architecture, which can be extended easily to 1000-pixel arrays. Planar arrays of close-packed bolometers are being developed for the GBT (Green Bank Telescope) and for future space missions. For certain applications, such as a slewed far-infrared sky survey, feedhorn coupling of a large, sparsely filled array of bolometers is desirable, and is being developed using photolithographic feedhorn arrays. Individual detectors have achieved a Noise Equivalent Power (NEP) of ~10(exp -17) W/square root of Hz at 300 mK, but several orders of magnitude improvement are required and can be reached with existing technology. The testing of such ultralow-background detectors will prove difficult, as this requires optical loading of below 1 fW. Antenna-coupled bolometer designs have advantages for large-format array designs at low powers due to their mode selectivity.

  5. The architectural design of networks of protein domain architectures.

    PubMed

    Hsu, Chia-Hsin; Chen, Chien-Kuo; Hwang, Ming-Jing

    2013-08-23

    Protein domain architectures (PDAs), in which single domains are linked to form multiple-domain proteins, are a major molecular form used by evolution for the diversification of protein functions. However, the design principles of PDAs remain largely uninvestigated. In this study, for every single domain in the Pfam-A database we constructed a network connecting the domain architectures that had grown out from that domain, and found that there are three main distinctive types of these networks, which suggests that evolution can exploit PDAs in three different ways. Further analysis showed that these three different types of PDA networks are each adopted by different types of protein domains, although many networks exhibit the characteristics of more than one of the three types. Our results shed light on nature's blueprint for protein architecture and provide a framework for understanding architectural design from a network perspective.

  6. Integrated Sensor Architecture (ISA) for Live Virtual Constructive (LVC) environments

    NASA Astrophysics Data System (ADS)

    Moulton, Christine L.; Harkrider, Susan; Harrell, John; Hepp, Jared

    2014-06-01

    The Integrated Sensor Architecture (ISA) is an interoperability solution that allows for the sharing of information between sensors and systems in a dynamic tactical environment. The ISA created a Service Oriented Architecture (SOA) that identifies common standards and protocols which support a net-centric system of systems integration. Utilizing a common language, these systems are able to connect, publish their needs and capabilities, and interact with other systems even on disadvantaged networks. Within the ISA project, three levels of interoperability were defined and implemented and these levels were tested at many events. Extensible data models and capabilities that are scalable across multi-echelons are supported, as well as dynamic discovery of capabilities and sensor management. The ISA has been tested and integrated with multiple sensors, platforms, and over a variety of hardware architectures in operational environments.
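
    A minimal sketch, in Python, of the publish-and-discover pattern the abstract describes; the registry class and method names are assumptions for illustration and are not part of the actual ISA interfaces.

      class CapabilityRegistry:
          """Toy in-memory registry where systems publish and discover capabilities."""
          def __init__(self):
              self._capabilities = {}          # capability name -> list of providers

          def publish(self, provider, capability):
              self._capabilities.setdefault(capability, []).append(provider)

          def discover(self, capability):
              return list(self._capabilities.get(capability, []))

      registry = CapabilityRegistry()
      registry.publish("EO-sensor-12", "imagery/visible")
      registry.publish("acoustic-array-3", "detection/acoustic")

      # A consumer on the network asks who can provide visible imagery.
      print(registry.discover("imagery/visible"))   # ['EO-sensor-12']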

  7. A cognitive architecture for simulating bodies and minds.

    PubMed

    Nirenburg, Sergei; McShane, Marjorie; Beale, Stephen; Catizone, Roberta

    2011-01-01

    This paper presents an overview of a cognitive architecture, OntoAgent, that supports the creation and deployment of intelligent agents capable of simulating human-like abilities. The agents, which have a simulated mind and, if applicable, a simulated body, are intended to operate as members of multi-agent teams featuring both artificial and human agents. The agent architecture and its underlying knowledge resources and processors are being developed in a sufficiently generic way to support a variety of applications. In this paper we briefly describe the architecture and two applications being configured within it: the Maryland Virtual Patient (MVP) system for training medical personnel and the CLinician's ADvisor (CLAD). We organize the discussion around four aspects of agent modeling and how they are utilized in the two applications: physiological simulation, modeling an agent's knowledge and learning, decision-making and language processing.

  8. Systolic architecture for hierarchical clustering

    SciTech Connect

    Ku, L.C.

    1984-01-01

    Several hierarchical clustering methods (including single-linkage, complete-linkage, centroid, and absolute overlap methods) are reviewed. The absolute overlap clustering method is selected for the design of a systolic architecture mainly due to its simplicity. Two versions of systolic architectures for the absolute overlap hierarchical clustering algorithm are proposed: a one-dimensional version that leads to the development of a two-dimensional version which fully takes advantage of the underlying data structure of the problem. The two-dimensional systolic architecture can achieve a time complexity of O(m + n), in comparison with a time complexity of O(m^2 n) for a conventional computer implementation.
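
    For contrast with the systolic design, the following Python sketch shows a naive serial agglomerative pass whose repeated pairwise scans are what drive the conventional O(m^2 n)-style cost; it is a generic single-linkage illustration, not the absolute overlap algorithm from the record.

      def single_linkage(points, target_clusters):
          """Naive single-linkage agglomeration on 1-D points."""
          clusters = [[p] for p in points]
          while len(clusters) > target_clusters:
              best = None
              # Every pair of clusters is examined on each merge step; this
              # repeated pairwise scan is what makes the serial version costly.
              for i in range(len(clusters)):
                  for j in range(i + 1, len(clusters)):
                      d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                      if best is None or d < best[0]:
                          best = (d, i, j)
              _, i, j = best
              clusters[i].extend(clusters.pop(j))
          return clusters

      print(single_linkage([1.0, 1.2, 5.0, 5.3, 9.1], target_clusters=2))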

  9. Microcomponent chemical process sheet architecture

    DOEpatents

    Wegeng, R.S.; Drost, M.K.; Call, C.J.; Birmingham, J.G.; McDonald, C.E.; Kurath, D.E.; Friedrich, M.

    1998-09-22

    The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one chemical process unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation. 26 figs.

  10. Microcomponent chemical process sheet architecture

    DOEpatents

    Wegeng, Robert S.; Drost, M. Kevin; Call, Charles J.; Birmingham, Joseph G.; McDonald, Carolyn Evans; Kurath, Dean E.; Friedrich, Michele

    1998-01-01

    The invention is a microcomponent sheet architecture wherein macroscale unit processes are performed by microscale components. The sheet architecture may be a single laminate with a plurality of separate microcomponent sections or the sheet architecture may be a plurality of laminates with one or more microcomponent sections on each laminate. Each microcomponent or plurality of like microcomponents perform at least one chemical process unit operation. A first laminate having a plurality of like first microcomponents is combined with at least a second laminate having a plurality of like second microcomponents thereby combining at least two unit operations to achieve a system operation.

  11. Telemedicine system interoperability architecture: concept description and architecture overview.

    SciTech Connect

    Craft, Richard Layne, II

    2004-05-01

    In order for telemedicine to realize the vision of anywhere, anytime access to care, it must address the question of how to create a fully interoperable infrastructure. This paper describes the reasons for pursuing interoperability, outlines operational requirements that any interoperability approach needs to consider, proposes an abstract architecture for meeting these needs, identifies candidate technologies that might be used for rendering this architecture, and suggests a path forward that the telemedicine community might follow.

  12. Digital Architecture – Results From a Gap Analysis

    SciTech Connect

    Oxstrand, Johanna Helene; Thomas, Kenneth David; Fitzgerald, Kirk

    2015-09-01

    The digital architecture is defined as a collection of IT capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. The digital architecture can be thought of as an integration of the separate I&C and information systems already in place in NPPs, brought together for the purpose of creating new levels of automation in NPP work activities. In some cases, it might be an extension of the current communication systems, to provide digital communications where they are currently analog only. This collection of IT capabilities must in turn be based on a set of user requirements that must be supported for the interconnected technologies to operate in an integrated manner. These requirements, simply put, are a statement of what sorts of digital work functions will be exercised in a fully-implemented seamless digital environment and how much they will be used. The goal of the digital architecture research is to develop a methodology for mapping nuclear power plant operational and support activities into the digital architecture, which includes the development of a consensus model for advanced information and control architecture. The consensus model should be developed at a level of detail that is useful to the industry. In other words, not so detailed that it specifies specific protocols and not so vague that it only provides a high-level description of technology. The next step towards the model development is to determine the current state of digital architecture at typical NPPs. To investigate the current state, the researchers conducted a gap analysis to determine to what extent the NPPs can support the future digital technology environment with their existing I&C and IT structure, and where gaps exist with respect to the full deployment of technology over time. The methodology, result, and conclusions from the gap analysis are described in this report.

  13. High-performance solid oxide fuel cells based on a thin La0.8Sr0.2Ga0.8Mg0.2O3-δ electrolyte membrane supported by a nickel-based anode of unique architecture

    NASA Astrophysics Data System (ADS)

    Sun, Haibin; Chen, Yu; Chen, Fanglin; Zhang, Yujun; Liu, Meilin

    2016-01-01

    Solid oxide fuel cells (SOFCs) based on a thin La0.8Sr0.2Ga0.8Mg0.2O3-δ (LSGM) electrolyte membrane supported by a nickel-based anode often suffer from undesirable reaction/diffusion between the Ni anode and the LSGM during high-temperature co-firing. In this study, a high performance intermediate-temperature SOFC is fabricated by depositing thin LSGM electrolyte membranes on a LSGM backbone of unique architecture coated with nano-sized Ni and Gd0.1Ce0.9O2-δ (GDC) particles via a combination of freeze-drying tape-casting, slurry drop-coating, and solution infiltration. The thickness of the dense LSGM electrolyte membranes is ∼30 μm, while the undesirable reaction/diffusion between Ni and LSGM is effectively hindered because of the relatively low firing temperature, as confirmed by XRD analysis. Single cells show peak power densities of 1.61 W cm-2 at 700 °C and 0.52 W cm-2 at 600 °C using 3 vol% humidified H2 as fuel and ambient air as oxidant. The cell performance is very stable for 115 h at a constant current density of 0.303 A cm-2 at 600 °C.

  14. An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Lytle, John K. (Technical Monitor)

    2002-01-01

    Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never thought possible before. Full-engine, three-dimensional computational fluid dynamics propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT). This paper discusses the salient features of the NPSS Architecture, including its interface layer, object layer, implementation for accessing legacy codes, numerical zooming infrastructure, and its computing layer. The computing layer focuses on the use and deployment of these propulsion simulations on parallel and distributed computing platforms, which has been the focus of NASA Ames. Additional features of the object-oriented architecture that support MultiDisciplinary (MD) Coupling, computer aided design (CAD) access and MD coupling objects will be discussed. Included will be a discussion of the successes, challenges and benefits of implementing this architecture.

  15. An implementation of SISAL for distributed-memory architectures

    SciTech Connect

    Beard, P.C.

    1995-06-01

    This thesis describes a new implementation of the implicitly parallel functional programming language SISAL, for massively parallel processor supercomputers. The Optimizing SISAL Compiler (OSC), developed at Lawrence Livermore National Laboratory, was originally designed for shared-memory multiprocessor machines and has been adapted to distributed-memory architectures. OSC has been relatively portable between shared-memory architectures, because they are architecturally similar, and OSC generates portable C code. However, distributed-memory architectures are not standardized -- each has a different programming model. Distributed-memory SISAL depends on a layer of software that provides a portable, distributed, shared-memory abstraction. This layer is provided by Split-C, a dialect of the C programming language developed at U.C. Berkeley, which has demonstrated good performance on distributed-memory architectures. Split-C provides important capabilities for good performance: support for program-specific distributed data structures, and split-phase memory operations. Distributed data structures help achieve good memory locality, while split-phase memory operations help tolerate the longer communication latencies inherent in distributed-memory architectures. The distributed-memory SISAL compiler and run-time system take advantage of these capabilities. The result of these efforts is a compiler that runs identically on the Thinking Machines Connection Machine (CM-5) and the Meiko Computing Surface (CS-2).
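
    The split-phase idea can be illustrated outside Split-C; the following Python sketch (an analogy, not Split-C syntax) initiates a simulated remote fetch, overlaps it with local work, and synchronizes only when the value is actually needed. All names and timings are invented for illustration.

      from concurrent.futures import ThreadPoolExecutor
      import time

      def remote_fetch(address):
          time.sleep(0.1)                  # stand-in for communication latency
          return address * 2               # pretend this is the remote value

      with ThreadPoolExecutor() as pool:
          future = pool.submit(remote_fetch, 21)     # split-phase "get": initiate only

          local = sum(i * i for i in range(10_000))  # useful local work hides latency

          value = future.result()          # the "sync" point: wait for completion
          print(local, value)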

  16. A Mobile Service Oriented Multiple Object Tracking Augmented Reality Architecture for Education and Learning Experiences

    ERIC Educational Resources Information Center

    Rattanarungrot, Sasithorn; White, Martin; Newbury, Paul

    2014-01-01

    This paper describes the design of our service-oriented architecture to support mobile multiple object tracking augmented reality applications applied to education and learning scenarios. The architecture is composed of a mobile multiple object tracking augmented reality client, a web service framework, and dynamic content providers. Tracking of…

  17. Alternatives generation and analysis report for immobilized low-level waste interim storage architecture

    SciTech Connect

    Burbank, D.A., Westinghouse Hanford

    1996-09-01

    The Immobilized Low-Level Waste Interim Storage subproject will provide storage capacity for immobilized low-level waste product sold to the U.S. Department of Energy by the privatization contractor. This report describes alternative Immobilized Low-Level Waste storage system architectures, evaluation criteria, and evaluation results to support the Immobilized Low-Level Waste storage system architecture selection decision process.

  18. 76 FR 34287 - ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review... the Vehicle to Infrastructure (V2I) Core System Requirements and Architecture Proposal. The first... . The V2I Core System will support applications for safety, mobility, and sustainability for...

  19. 76 FR 36954 - ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ... ITS Joint Program Office; Core System Requirements Walkthrough and Architecture Proposal Review... accompanying webinars to discuss the Vehicle to Infrastructure (V2I) Core System Requirements and Architecture... . The V2I Core System will support applications for safety, mobility, and sustainability for...

  20. Collaborative Concept Mapping in a Web-Based Learning Environment: A Pedagogic Experience in Architectural Education.

    ERIC Educational Resources Information Center

    Madrazo, Leandro; Vidal, Jordi

    2002-01-01

    Describes a pedagogical work, carried out within a school of architecture, using a Web-based learning environment to support collaborative understanding of texts on architectural theory. Explains the use of concept maps, creation of a critical vocabulary, exploration of semantic spaces, and knowledge discovery through navigation. (Author/LRW)

  1. Adapted Verbal Feedback, Instructor Interaction and Student Emotions in the Landscape Architecture Studio

    ERIC Educational Resources Information Center

    Smith, Carl A.; Boyer, Mark E.

    2015-01-01

    In light of concerns with architectural students' emotional jeopardy during traditional desk and final-jury critiques, the authors pursue alternative approaches intended to provide more supportive and mentoring verbal assessment in landscape architecture studios. In addition to traditional studio-based critiques throughout a semester, we provide…

  2. The IVOA Architecture

    NASA Astrophysics Data System (ADS)

    Arviset, C.; Gaudet, S.; IVOA Technical Coordination Group

    2012-09-01

    Astronomy produces large amounts of data of many kinds, coming from various sources: science space missions, ground based telescopes, theoretical models, compilation of results, etc. These data and associated processing services are made available via the Internet by "providers", usually large data centres or smaller teams (see Figure 1). The "consumers", be they individual researchers, research teams or computer systems, access these services to do their science. However, inter-connection amongst all these services and between providers and consumers is usually not trivial. The Virtual Observatory (VO) is the necessary "middle layer" framework enabling interoperability between all these providers and consumers in a seamless and transparent manner. Like the web, which enables end users and machines to transparently access documents and services wherever and however they are stored, the VO enables the astronomy community to access data and service resources wherever and however they are provided. Over the last decade, the International Virtual Observatory Alliance (IVOA) has been defining various standards to build the VO technical framework for the providers to share their data and services ("Sharing"), and to allow users to find ("Finding") these resources, to get them ("Getting") and to use them ("Using"). To enable these functionalities, the definition of some core astronomically-oriented standards ("VO Core") has also been necessary. This paper will present the official and current IVOA Architecture[1], describing the various building blocks of the VO framework (see Figure 2) and their relation to all existing and in-progress IVOA standards. Additionally, it will show examples of these standards in action, connecting VO "consumers" to VO "providers".

  3. Project Integration Architecture

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2008-01-01

    The Project Integration Architecture (PIA) is a distributed, object-oriented, conceptual, software framework for the generation, organization, publication, integration, and consumption of all information involved in any complex technological process in a manner that is intelligible to both computers and humans. In the development of PIA, it was recognized that in order to provide a single computational environment in which all information associated with any given complex technological process could be viewed, reviewed, manipulated, and shared, it is necessary to formulate all the elements of such a process on the most fundamental level. In this formulation, any such element is regarded as being composed of any or all of three parts: input information, some transformation of that input information, and some useful output information. Another fundamental principle of PIA is the assumption that no consumer of information, whether human or computer, can be assumed to have any useful foreknowledge of an element presented to it. Consequently, a PIA-compliant computing system is required to be ready to respond to any questions, posed by the consumer, concerning the nature of the proffered element. In colloquial terms, a PIA-compliant system must be prepared to provide all the information needed to place the element in context. To satisfy this requirement, PIA extends the previously established object-oriented-programming concept of self-revelation and applies it on a grand scale. To enable pervasive use of self-revelation, PIA exploits another previously established object-oriented-programming concept - that of semantic infusion through class derivation. By means of self-revelation and semantic infusion through class derivation, a consumer of information can inquire about the contents of all information entities (e.g., databases and software) and can interact appropriately with those entities. Other key features of PIA are listed.
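
    A toy Python sketch of the self-revelation idea, assuming introspection as the mechanism: a consumer with no foreknowledge of an element can still ask it what it contains, and class derivation layers domain meaning onto the generic mechanism. Class names are invented; the real PIA framework is considerably richer.

      class SelfRevealing:
          def reveal(self):
              """Answer questions about this element's contents and lineage."""
              return {
                  "class": type(self).__name__,
                  "bases": [b.__name__ for b in type(self).__mro__[1:-1]],
                  "fields": dict(vars(self)),
              }

      class InformationElement(SelfRevealing):
          def __init__(self, inputs, transformation, outputs):
              self.inputs = inputs
              self.transformation = transformation
              self.outputs = outputs

      class InletFlowBoundary(InformationElement):   # semantic infusion by derivation
          pass

      element = InletFlowBoundary(["mach", "pressure"], "set inlet state", ["flowfield"])
      print(element.reveal())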

  4. Dynamic Information Architecture System

    1997-02-12

    The Dynamic Information Architecture System (DIAS) is a flexible object-based software framework for concurrent, multidisciplinary modeling of arbitrary (but related) processes. These processes are modeled as interrelated actions caused by and affecting the collection of diverse real-world objects represented in a simulation. The DIAS architecture allows independent process models to work together harmoniously in the same frame of reference and provides a wide range of data ingestion and output capabilities, including Geographic Information System (GIS) type map-based displays and photorealistic visualization of simulations in progress. In the DIAS implementation of the object-based approach, software objects carry within them not only the data which describe their static characteristics, but also the methods, or functions, which describe their dynamic behaviors. There are two categories of objects: (1) Entity objects which have real-world counterparts and are the actors in a simulation, and (2) Software infrastructure objects which make it possible to carry out the simulations. The Entity objects contain lists of Aspect objects, each of which addresses a single aspect of the Entity's behavior. For example, a DIAS Stream Entity representing a section of a river can have many aspects corresponding to its behavior in terms of hydrology (as a drainage system component), navigation (as a link in a waterborne transportation system), meteorology (in terms of moisture, heat, and momentum exchange with the atmospheric boundary layer), and visualization (for photorealistic visualization or map-type displays), etc. This makes it possible for each real-world object to exhibit any or all of its unique behaviors within the context of a single simulation.

  5. The Mothership Mission Architecture

    NASA Astrophysics Data System (ADS)

    Ernst, S. M.; DiCorcia, J. D.; Bonin, G.; Gump, D.; Lewis, J. S.; Foulds, C.; Faber, D.

    2015-12-01

    The Mothership is considered to be a dedicated deep space carrier spacecraft. It is currently being developed by Deep Space Industries (DSI) as a mission concept that enables a broad participation in the scientific exploration of small bodies - the Mothership mission architecture. A Mothership shall deliver third-party nano-sats, experiments and instruments to Near Earth Asteroids (NEOs), comets or moons. The Mothership service includes delivery of nano-sats, communication to Earth and visuals of the asteroid surface and surrounding area. The Mothership is designed to carry about 10 nano-sats, based upon a variation of the Cubesat standard, with some flexibility on the specific geometry. The Deep Space Nano-Sat reference design is a 14.5 cm cube, which accommodates the same volume as a traditional 3U CubeSat. To reduce cost, the Mothership is designed as a secondary payload aboard launches to GTO. DSI is offering slots for nano-sats to individual customers. This enables organizations with relatively low operating budgets to closely examine an asteroid with highly specialized sensors of their own choosing and carry out experiments in the proximity of or on the surface of an asteroid, while the nano-sats can be built or commissioned by a variety of smaller institutions, companies, or agencies. While the overall Mothership mission will have a financial volume somewhere between a European Space Agency (ESA) S-class and M-class mission, for instance, it can be funded through a number of small and individual funding sources and programs, hence avoiding the processes associated with traditional space exploration missions. DSI has been able to identify a significant interest in the planetary science and nano-satellite communities.

  6. Parallel architectures and neural networks

    SciTech Connect

    Calianiello, E.R.

    1989-01-01

    This book covers parallel computer architectures and neural networks. Topics include: neural modeling, use of ADA to simulate neural networks, VLSI technology, implementation of Boltzmann machines, and analysis of neural nets.

  7. Transverse pumped laser amplifier architecture

    SciTech Connect

    Bayramian, Andrew James; Manes, Kenneth R.; Deri, Robert; Erlandson, Alvin; Caird, John; Spaeth, Mary L.

    2015-05-19

    An optical gain architecture includes a pump source and a pump aperture. The architecture also includes a gain region including a gain element operable to amplify light at a laser wavelength. The gain region is characterized by a first side intersecting an optical path, a second side opposing the first side, a third side adjacent the first and second sides, and a fourth side opposing the third side. The architecture further includes a dichroic section disposed between the pump aperture and the first side of the gain region. The dichroic section is characterized by low reflectance at a pump wavelength and high reflectance at the laser wavelength. The architecture additionally includes a first cladding section proximate to the third side of the gain region and a second cladding section proximate to the fourth side of the gain region.

  8. Architecture and the Information Revolution.

    ERIC Educational Resources Information Center

    Driscoll, Porter; And Others

    1982-01-01

    Traces how technological changes affect the architecture of the workplace. Traces these effects from the industrial revolution up through the computer revolution. Offers suggested designs for the computerized office of today and tomorrow. (JM)

  9. Transverse pumped laser amplifier architecture

    DOEpatents

    Bayramian, Andrew James; Manes, Kenneth; Deri, Robert; Erlandson, Al; Caird, John; Spaeth, Mary

    2013-07-09

    An optical gain architecture includes a pump source and a pump aperture. The architecture also includes a gain region including a gain element operable to amplify light at a laser wavelength. The gain region is characterized by a first side intersecting an optical path, a second side opposing the first side, a third side adjacent the first and second sides, and a fourth side opposing the third side. The architecture further includes a dichroic section disposed between the pump aperture and the first side of the gain region. The dichroic section is characterized by low reflectance at a pump wavelength and high reflectance at the laser wavelength. The architecture additionally includes a first cladding section proximate to the third side of the gain region and a second cladding section proximate to the fourth side of the gain region.

  10. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

    A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri Net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable, steady-state, time-optimized performance. This simulator extends the ATAMM simulation capability from a homogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case with only one processor type. The simulator forms one tool in an ATAMM Integrated Environment which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.
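
    The following Python sketch illustrates the general flavor of dataflow-graph execution on a heterogeneous resource pool (a node fires when its inputs are ready and a functional unit of the required type is free); the graph, unit types, and firing policy are illustrative assumptions, not the ATAMM rule set itself.

      graph = {                       # node -> (required unit type, input nodes)
          "A": ("fpu", []),
          "B": ("fpu", []),
          "C": ("dsp", ["A", "B"]),
          "D": ("fpu", ["C"]),
      }
      units = {"fpu": 2, "dsp": 1}    # heterogeneous pool of functional units

      done, schedule, step = set(), [], 0
      while len(done) < len(graph):
          free = dict(units)
          fired = []
          for node, (unit, inputs) in graph.items():
              if node not in done and all(i in done for i in inputs) and free.get(unit, 0) > 0:
                  free[unit] -= 1       # claim a functional unit of the right type
                  fired.append(node)
          done.update(fired)
          schedule.append((step, fired))
          step += 1

      print(schedule)   # e.g. [(0, ['A', 'B']), (1, ['C']), (2, ['D'])]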

  11. Space station needs, attributes and architectural options study

    NASA Technical Reports Server (NTRS)

    1983-01-01

    All the candidate Technology Development missions investigated during the space station needs, attributes, and architectural options study are described. All the mission data forms, plus additional information such as cost, drawings, functional flows, etc., generated in support of these missions are included with a computer-generated mission data form.

  12. Educational JavaBeans: a Requirements Driven Architecture.

    ERIC Educational Resources Information Center

    Hall, Jon; Rapanotti, Lucia

    This paper investigates, through a case study, the development of a software architecture that is compatible with a system's high-level requirements. The case study is an example of an extended customer/supplier relationship (post-point of sale support) involved in e-universities and is representative of a class of enterprise without current…

  13. Information Architecture in JASIST: Just Where Did We Come From?

    ERIC Educational Resources Information Center

    Dillon, Andrew

    2002-01-01

    Traces information architecture (IA) to a historical summit, supported by the American Society for Information Science and Technology (ASIS&T) in May 2000 in Boston, MA, where several hundred gathered to thrash out the questions of just what IA was and what this field might become. Outlines the six IA issues discussed. (JMK)

  14. 18. Photocopy of Architectural Layout drawing, dated 25 June, 1993 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. Photocopy of Architectural Layout drawing, dated 25 June, 1993 by US Air Force Space Command. Original drawing property of United States Air Force, 21' Space Command AL-2 PAVE PAWS SUPPORT SYSTEMS - CAPE COD AFB, MASSACHUSETTS - SITE PLAN. DRAWING NO. AL-2 - SHEET 3 OF 21. - Cape Cod Air Station, Massachusetts Military Reservation, Sandwich, Barnstable County, MA

  15. 3. PHOTOCOPY OF DRAWING (1960 ARCHITECTURAL DRAWING BY THE RALPH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. PHOTOCOPY OF DRAWING (1960 ARCHITECTURAL DRAWING BY THE RALPH M. PARSONS COMPANY) FLOOR PLAN, ELEVATIONS, AND SECTION FOR THE SAMOS TECHNICAL SUPPORT BUILDING (BLDG. 761; NOW CALLED SLC-3 AIR FORCE BUILDING), SHEET A14 - Vandenberg Air Force Base, Space Launch Complex 3, SLC-3 Air Force Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  16. Re-engineering Nascom's network management architecture

    NASA Technical Reports Server (NTRS)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 kbps) were developed following existing standards, but there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist, and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as X-windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. The MSS, CAP, MACS, and Ecom projects have indicated

  17. The architecture of FAIM-1

    SciTech Connect

    Anderson, J.M.; Coates, W.S.; Davis, A.L.; Hon, R.W.; Robinson, I.N.; Robison, S.V.; Stevens, K.S.

    1987-01-01

    This article describes a symbolic multiprocessing system called FAIM-1. FAIM-1 is a highly concurrent, general-purpose, symbolic accelerator for parallel AI symbolic computation. The paramount goal of the FAIM project is to produce an architecture that can be scaled to a configuration capable of performance improvements of two to three orders of magnitude over conventional architectures. In the design of FAIM-1, prime consideration was given to programmability, performance, extensibility, fault tolerance, and the cost-effective use of technology.

  18. How architecture wins technology wars.

    PubMed

    Morris, C R; Ferguson, C H

    1993-01-01

    Signs of revolutionary transformation in the global computer industry are everywhere. A roll call of the major industry players reads like a waiting list in the emergency room. The usual explanations for the industry's turmoil are at best inadequate. Scale, friendly government policies, manufacturing capabilities, a strong position in desktop markets, excellent software, top design skills--none of these is sufficient, either by itself or in combination, to ensure competitive success in information technology. A new paradigm is required to explain patterns of success and failure. Simply stated, success flows to the company that manages to establish proprietary architectural control over a broad, fast-moving, competitive space. Architectural strategies have become crucial to information technology because of the astonishing rate of improvement in microprocessors and other semiconductor components. Since no single vendor can keep pace with the outpouring of cheap, powerful, mass-produced components, customers insist on stitching together their own local systems solutions. Architectures impose order on the system and make the interconnections possible. The architectural controller is the company that controls the standard by which the entire information package is assembled. Microsoft's Windows is an excellent example of this. Because of the popularity of Windows, companies like Lotus must conform their software to its parameters in order to compete for market share. In the 1990s, proprietary architectural control is not only possible but indispensable to competitive success. What's more, it has broader implications for organizational structure: architectural competition is giving rise to a new form of business organization. PMID:10124636

  19. Demand Activated Manufacturing Architecture (DAMA) model for supply chain collaboration

    SciTech Connect

    CHAPMAN,LEON D.; PETERSEN,MARJORIE B.

    2000-03-13

    The Demand Activated Manufacturing Architecture (DAMA) project during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors) has developed an inter-enterprise architecture and collaborative model for supply chains. This model will enable improved collaborative business across any supply chain. The DAMA Model for Supply Chain Collaboration is a high-level model for collaboration to achieve Demand Activated Manufacturing. The five major elements of the architecture to support collaboration are (1) activity or process, (2) information, (3) application, (4) data, and (5) infrastructure. These five elements are tied to the application of the DAMA architecture to three phases of collaboration - prepare, pilot, and scale. There are six collaborative activities that may be employed in this model: (1) Develop Business Planning Agreements, (2) Define Products, (3) Forecast and Plan Capacity Commitments, (4) Schedule Product and Product Delivery, (5) Expedite Production and Delivery Exceptions, and (6) Populate Supply Chain Utility. The Supply Chain Utility is a set of applications implemented to support collaborative product definition, forecast visibility, planning, scheduling, and execution. The DAMA architecture and model will be presented along with the process for implementing this DAMA model.

  20. An intelligent service-based network architecture for wearable robots.

    PubMed

    Lee, Ka Keung; Zhang, Ping; Xu, Yangsheng; Liang, Bin

    2004-08-01

    We are developing a novel robot concept called the wearable robot. Wearable robots are mobile information devices capable of supporting remote communication and intelligent interaction between networked entities. In this paper, we explore the possible functions of such a robotic network and will present a distributed network architecture based on service components. In order to support the interaction and communication between the components in the wearable robot system, we have developed an intelligent network architecture. This service-based architecture involves three major mechanisms. The first mechanism involves the use of a task coordinator service such that the execution of the services can be managed using a priority queue. The second mechanism enables the system to automatically push the required service proxy to the client intelligently based on certain system-related conditions. In the third mechanism, we allow the system to automatically deliver services based on contextual information. Using a fuzzy-logic-based decision making system, the matching service can determine whether the service should be automatically delivered utilizing the information provided by the service, client, lookup service, and context sensors. An application scenario has been implemented to demonstrate the feasibility of this distributed service-based robot architecture. The architecture is implemented as extensions to the Jini network model.
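
    A small Python sketch of context-driven service delivery in the spirit of the fuzzy matching described above; the membership functions, context variables, and threshold are illustrative assumptions rather than the authors' actual rule base.

      def membership_low_battery(level):           # 1.0 when empty, 0.0 when half full or more
          return max(0.0, min(1.0, (0.5 - level) / 0.5))

      def membership_user_idle(seconds_idle):      # ramps up between 0 and 60 s of idleness
          return max(0.0, min(1.0, seconds_idle / 60.0))

      def should_push_service(battery_level, seconds_idle, threshold=0.5):
          # A simple fuzzy AND (minimum) over the two context memberships.
          score = min(membership_low_battery(battery_level),
                      membership_user_idle(seconds_idle))
          return score >= threshold, score

      decision, score = should_push_service(battery_level=0.2, seconds_idle=90)
      print(decision, round(score, 2))    # True 0.6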

  1. Anatomy of a Decision Support System.

    ERIC Educational Resources Information Center

    Chachra, Vinod; Heterick, Robert C.

    1982-01-01

    The decision support system (DSS) environment, the functional requirements of a DSS, and the architectural requirements of the computer systems and communications network necessary to support a DSS are discussed. Changes in the computing environment that are necessary to implement decision support systems are suggested. (Author/MLW)

  2. Architecture-Aware Algorithms for Scalable Performance and Resilience on Heterogeneous Architectures. Final Report

    SciTech Connect

    Gropp, William D.

    2014-06-23

    With the coming end of Moore's law, it has become essential to develop new algorithms and techniques that can provide the performance needed by demanding computational science applications, especially those that are part of the DOE science mission. This work was part of a multi-institution, multi-investigator project that explored several approaches to develop algorithms that would be effective at the extreme scales and with the complex processor architectures that are expected at the end of this decade. The work by this group developed new performance models that have already helped guide the development of highly scalable versions of an algebraic multigrid solver, new programming approaches designed to support numerical algorithms on heterogeneous architectures, and a new, more scalable version of conjugate gradient, an important algorithm in the solution of very large linear systems of equations.
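
    For reference, the textbook conjugate gradient iteration mentioned above can be written in a few lines of Python/NumPy; this is the standard algorithm, not the project's more scalable variant.

      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          """Solve A x = b for symmetric positive definite A."""
          x = np.zeros_like(b)
          r = b - A @ x
          p = r.copy()
          rs_old = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs_old / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs_old) * p
              rs_old = rs_new
          return x

      A = np.array([[4.0, 1.0], [1.0, 3.0]])    # symmetric positive definite
      b = np.array([1.0, 2.0])
      print(conjugate_gradient(A, b))           # approx [0.0909, 0.6364]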

  3. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Mckendry, Martin S.

    1986-01-01

    The Clouds kernel design went through several design phases and is nearly complete. The object manager, the process manager, the storage manager, the communications manager, and the actions manager are examined.

  4. Architecture for Building Conversational Agents that Support Collaborative Learning

    ERIC Educational Resources Information Center

    Kumar, R.; Rose, C. P.

    2011-01-01

    Tutorial Dialog Systems that employ Conversational Agents (CAs) to deliver instructional content to learners in one-on-one tutoring settings have been shown to be effective in multiple learning domains by multiple research groups. Our work focuses on extending this successful learning technology to collaborative learning settings involving two or…

  5. Data Intensive Architecture for Scalable Cyber Analytics

    SciTech Connect

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-12-19

    Cyber analysts are tasked with the identification and mitigation of network exploits and threats. These compromises are difficult to identify due to the characteristics of cyber communication, the volume of traffic, and the duration of possible attack. In this paper, we describe a prototype implementation designed to provide cyber analysts an environment where they can interactively explore a month’s worth of cyber security data. This prototype utilized On-Line Analytical Processing (OLAP) techniques to present a data cube to the analysts. The cube provides a summary of the data, allowing trends to be easily identified as well as the ability to easily pull up the original records comprising an event of interest. The cube was built using SQL Server Analysis Services (SSAS), with the interface to the cube provided by Tableau. This software infrastructure was supported by a novel hardware architecture comprising a Netezza TwinFin® for the underlying data warehouse and a cube server with a FusionIO drive hosting the data cube. We evaluated this environment on a month’s worth of artificial, but realistic, data using multiple queries provided by our cyber analysts. As our results indicate, OLAP technology has progressed to the point where it is in a unique position to provide novel insights to cyber analysts, as long as it is supported by an appropriate data intensive architecture.
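
    As a rough analogy (an assumption, not the SSAS/Netezza/Tableau stack used in the prototype), a pandas pivot table in Python shows the kind of cube-style roll-up and drill-through the abstract describes: summarize flows by day and protocol, then pull the raw records behind one cell of interest.

      import pandas as pd

      flows = pd.DataFrame({
          "day":      ["2011-06-01", "2011-06-01", "2011-06-02", "2011-06-02"],
          "protocol": ["tcp", "udp", "tcp", "tcp"],
          "src":      ["10.0.0.5", "10.0.0.7", "10.0.0.5", "10.0.0.9"],
          "bytes":    [1200, 80, 560000, 900],
      })

      # Cube-style summary: one cell per (day, protocol).
      cube = flows.pivot_table(index="day", columns="protocol",
                               values="bytes", aggfunc="sum", fill_value=0)
      print(cube)

      # Drill-through: the original records behind a suspicious cell.
      print(flows[(flows.day == "2011-06-02") & (flows.protocol == "tcp")])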

  6. Brahms Mobile Agents: Architecture and Field Tests

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron

    2002-01-01

    We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, rover/All-Terrain Vehicle (ATV), robotic assistant, other personnel in a local habitat, and a remote mission support team (with time delay). Software processes, called agents, implemented in the Brahms language, run on multiple, mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations safer and more efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system (e.g., "return here later" and "bring this back to the habitat"). This combination of agents, rover, and model-based spoken dialogue interface constitutes a personal assistant. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a run-time system.

  7. AES Water Architecture Study Interim Results

    NASA Technical Reports Server (NTRS)

    Sarguisingh, Miriam J.

    2012-01-01

    The mission of the Advanced Exploration System (AES) Water Recovery Project (WRP) is to develop advanced water recovery systems in order to enable NASA human exploration missions beyond low earth orbit (LEO). The primary objective of the AES WRP is to develop water recovery technologies critical to near term missions beyond LEO. The secondary objective is to continue to advance mid-readiness level technologies to support future NASA missions. An effort is being undertaken to establish the architecture for the AES Water Recovery System (WRS) that meets both near and long term objectives. The resultant architecture will be used to guide future technical planning, establish a baseline development roadmap for technology infusion, and establish baseline assumptions for integrated ground and on-orbit environmental control and life support systems (ECLSS) definition. This study is being performed in three phases. Phase I of this study established the scope of the study through definition of the mission requirements and constraints, as well as identifying all possible WRS configurations that meet the mission requirements. Phase II of this study focused on the near term space exploration objectives by establishing an ISS-derived reference schematic for long-duration (>180 day) in-space habitation. Phase III will focus on the long term space exploration objectives, trading the viable WRS configurations identified in Phase I to identify the ideal exploration WRS. The results of Phases I and II are discussed in this paper.

  8. GEOSS Architecture Implementation Pilot Phase 2

    NASA Astrophysics Data System (ADS)

    Percivall, G.

    2009-04-01

    The Group on Earth Observations (GEO) is conducting a second phase of the Architecture Implementation Pilot (AIP-2) to integrate services into the Global Earth Observing System of Systems (GEOSS). The first phase of AIP contributed to the initial operating capability of the GEOSS Common Infrastructure (GCI) established in early 2008. AIP-2 will augment the GCI with services contributed by GEO Members and Participating Organizations. The activities of AIP-2 are conducted in working groups. Five working groups are developing the transverse technology that supports the multiple user communities. Four community working groups are applying the transverse technologies to support the following communities of practice: Energy, Biodiversity and Climate Change, Disasters, and Air Quality. The Air Quality Working Group is led by the ESIP AQ Cluster. AIP-2 testing and integration will integrate the use cases into demonstration scenarios. Persistent exemplar services will be nominated to augment the GCI. This presentation will describe the AIP-2 process, progress, and planned deliverables.

  9. Space Station data management system architecture

    NASA Technical Reports Server (NTRS)

    Mallary, William E.; Whitelaw, Virginia A.

    1987-01-01

    Within the Space Station program, the Data Management System (DMS) functions in a dual role. First, it provides the hardware resources and software services which support the data processing, data communications, and data storage functions of the onboard subsystems and payloads. Second, it functions as an integrating entity which provides a common operating environment and human-machine interface for the operation and control of the orbiting Space Station systems and payloads by both the crew and the ground operators. This paper discusses the evolution and derivation of the requirements and issues which have had significant effect on the design of the Space Station DMS, describes the DMS components and services which support system and payload operations, and presents the current architectural view of the system as it exists in October 1986; one-and-a-half years into the Space Station Phase B Definition and Preliminary Design Study.

  10. Space station needs, attributes, and architectural options: Brief analysis

    NASA Technical Reports Server (NTRS)

    Shepphird, F. H.

    1983-01-01

    A baseline set of model missions is thoroughly characterized in terms of support requirements, demands on the Space Station, operating regimes, payload properties, and statements of the mission goals and objectives. This baseline is a representative set of mission requirements covering the most likely extent of space station support requirements from which architectural options can be constructed and exercised. The baseline set of 90 missions is assessed collectively and individually in terms of the economic, performance, and social benefits.

  11. NASA's Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Tyburski, Timothy

    2006-01-01

    A Bold Vision for Space Exploration includes: 1) Complete the International Space Station; 2) Safely fly the Space Shuttle until 2010; 3) Develop and fly the Crew Exploration Vehicle no later than 2012; 4) Return to the moon no later than 2020; 5) Extend human presence across the solar system and beyond; 6) Implement a sustained and affordable human and robotic program; 7) Develop supporting innovative technologies, knowledge, and infrastructures; and 8) Promote international and commercial participation in exploration.

  12. Mission Architecture Comparison for Human Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Geffre, Jim; Robertson, Ed; Lenius, Jon

    2006-01-01

    The Vision for Space Exploration outlines a bold new national space exploration policy that holds as one of its primary objectives the extension of human presence outward into the Solar System, starting with a return to the Moon in preparation for the future exploration of Mars and beyond. The National Aeronautics and Space Administration is currently engaged in several preliminary analysis efforts in order to develop the requirements necessary for implementing this objective in a manner that is both sustainable and affordable. Such analyses investigate various operational concepts, or mission architectures, by which humans can best travel to the lunar surface, live and work there for increasing lengths of time, and then return to Earth. This paper reports on a trade study conducted in support of NASA's Exploration Systems Mission Directorate investigating the relative merits of three alternative lunar mission architecture strategies. The three architectures use for reference a lunar exploration campaign consisting of multiple 90-day expeditions to the Moon's polar regions, a strategy which was selected for its high perceived scientific and operational value. The first architecture discussed incorporates the lunar orbit rendezvous approach employed by the Apollo lunar exploration program. This concept has been adapted from Apollo to meet the particular demands of a long-stay polar exploration campaign while assuring the safe return of crew to Earth. Lunar orbit rendezvous is also used as the baseline against which the other alternate concepts are measured. The first such alternative, libration point rendezvous, utilizes the unique characteristics of the cislunar libration point instead of a low altitude lunar parking orbit as a rendezvous and staging node. Finally, a mission strategy which does not incorporate rendezvous after the crew ascends from the Moon is also studied. In this mission strategy, the crew returns directly to Earth from the lunar surface, and is

  13. STEEL TRUSS TENSION RING SUPPORTING DOME ROOF. TENSION RING COVERED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    STEEL TRUSS TENSION RING SUPPORTING DOME ROOF. TENSION RING COVERED BY ARCHITECTURAL FINISH. TENSION RING ROLLER SUPPORT AT COLUMN OBSCURED BY COLUMN COVERINGS. - Houston Astrodome, 8400 Kirby Drive, Houston, Harris County, TX

  14. Bipartite memory network architectures for parallel processing

    SciTech Connect

    Smith, W.; Kale, L.V. (Dept. of Computer Science)

    1990-01-01

    Parallel architectures are broadly classified as either shared memory or distributed memory architectures. In this paper, the authors propose a third family of architectures, called bipartite memory network architectures. In this architecture, processors and memory modules constitute a bipartite graph, where each processor is allowed to access a small subset of the memory modules, and each memory module allows access from a small set of processors. The architecture is particularly suitable for computations requiring dynamic load balancing. The authors explore the properties of this architecture by examining the Perfect Difference set based topology for the graph. Extensions of this topology are also suggested.
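    The perfect-difference-set topology mentioned in this abstract lends itself to a short illustration: connect processor p to memory modules (p + d) mod n for each element d of the set. The sketch below is purely illustrative and not taken from the paper; it assumes the well-known perfect difference set {0, 1, 3} modulo 7, under which any two distinct processors share exactly one memory module.

```python
# Minimal sketch (assumed example, not from the paper) of a bipartite
# processor/memory interconnection built from a perfect difference set.
# {0, 1, 3} is a perfect difference set modulo 7: every nonzero residue
# mod 7 occurs exactly once as a difference of two of its elements.

from collections import defaultdict

def bipartite_memory_network(n, difference_set):
    """Connect processor p to memory modules (p + d) mod n for each d in the
    difference set, so every processor reaches exactly len(difference_set)
    modules and every module is reached by the same number of processors."""
    proc_to_mem = {p: sorted((p + d) % n for d in difference_set) for p in range(n)}
    mem_to_proc = defaultdict(list)
    for p, mems in proc_to_mem.items():
        for m in mems:
            mem_to_proc[m].append(p)
    return proc_to_mem, dict(mem_to_proc)

if __name__ == "__main__":
    procs, mems = bipartite_memory_network(7, [0, 1, 3])
    print("processor 0 can access memory modules:", procs[0])
    print("memory module 0 is shared by processors:", mems[0])
    # With this particular set, any two distinct processors share exactly one
    # memory module, which is the property that supports dynamic load balancing.
```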

  15. Architectural Analysis of Dynamically Reconfigurable Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly

    2010-01-01

    Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and CFS example of opening some internal details.

  16. Selected Precepts in Lunar Architecture

    NASA Astrophysics Data System (ADS)

    Cohen, Marc M.

    2002-01-01

    This paper presents an overview of selected approaches to Lunar Architecture to describe the parameters of this design problem space. The paper identifies typologies of architecture based on Lunar site features, structural concepts, and habitable functions. It develops an analysis of these architectures based on the NASA Habitats and Surface Construction Road Map (1997), in which there are three major types of surface construction: Class 1) Preintegrated; Class 2) Assembled, Deployed, Erected or Inflated; and Class 3) Use of In Situ materials and site characteristics. Class 1 Architectures include the following. The Apollo Program was intended to extend to landing a 14-day base in enhanced Lunar Excursion Modules. The Air Force was the first to propose preintegrated cylindrical modules landed on the Lunar surface. The University of Wisconsin proposed building a module and hub system on the surface. Madhu Thangavelu proposed assembling such a module and hub base in orbit and then landing it intact on the Moon. Class 2 Architectures include: The NASA 90 Day Study proposed an inflatable sphere of about 20 m diameter for a lunar habitat. Jenine Abarbanel of Colorado State University proposed rectangular inflatable habitats, with lunar regolith as ballast on the flat top. Class 3 Architectures include: William Simon proposed a lunar base bored into a crater rim. Alice Eichold proposed a base within a crater ring. The paper presents a comparative characterization and analysis of these and other example paradigms of proposed Lunar construction. It evaluates both the architectures and the NASA Habitats and Surface Construction Road Map for how well they correlate to one another.
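    The three-way classification described in this abstract can be captured as a simple lookup structure. The sketch below is illustrative only; the enum and the concept names are taken from the abstract, not from the paper itself.

```python
# Illustrative sketch: encoding the three surface-construction classes from
# the NASA Habitats and Surface Construction Road Map (1997) and the example
# concepts listed in the abstract as a simple lookup table.

from enum import Enum

class ConstructionClass(Enum):
    PREINTEGRATED = 1            # Class 1: landed ready to use
    ASSEMBLED_OR_INFLATED = 2    # Class 2: assembled, deployed, erected or inflated
    IN_SITU = 3                  # Class 3: uses local materials and site features

LUNAR_HABITAT_CONCEPTS = {
    "Apollo extended-stay Lunar Excursion Module": ConstructionClass.PREINTEGRATED,
    "Air Force preintegrated cylindrical modules": ConstructionClass.PREINTEGRATED,
    "University of Wisconsin module-and-hub base": ConstructionClass.PREINTEGRATED,
    "NASA 90 Day Study inflatable sphere (~20 m)": ConstructionClass.ASSEMBLED_OR_INFLATED,
    "Colorado State regolith-ballasted inflatables": ConstructionClass.ASSEMBLED_OR_INFLATED,
    "Crater-rim bored base (Simon)": ConstructionClass.IN_SITU,
    "Base within a crater ring (Eichold)": ConstructionClass.IN_SITU,
}

def concepts_in_class(cls):
    """Return the concepts in the table that fall under a given class."""
    return [name for name, c in LUNAR_HABITAT_CONCEPTS.items() if c is cls]

if __name__ == "__main__":
    print(concepts_in_class(ConstructionClass.IN_SITU))
```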

  17. Space station needs, attributes, and architectural options study

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The top level, time-phased total space program support system architecture is described including progress from the use of ground-based space shuttle, teleoperator system, extended duration orbiter, and multimission spacecraft, to an initial 4-man crew station at 29 deg inclination in 1991, to a growth station with an 8-man crew with capabilities for OTV high energy orbit payload placement and servicing, assembly, and construction of mission payloads in 1994. System Z, proposed for Earth observation missions in high inclination orbit, can be accommodated in 1993 using a space station derivative platform. Mission definition, system architecture, and benefits are discussed.

  18. Programmable bandwidth management in software-defined EPON architecture

    NASA Astrophysics Data System (ADS)

    Li, Chengjun; Guo, Wei; Wang, Wei; Hu, Weisheng; Xia, Ming

    2016-07-01

    This paper proposes a software-defined EPON architecture which replaces the hardware-implemented DBA module with a reprogrammable DBA module. The DBA module allows pluggable bandwidth allocation algorithms among multiple ONUs, adaptive to traffic profiles and network states. We also introduce a bandwidth management scheme executed at the controller to manage the customized DBA algorithms for all data queues of the ONUs. Our performance investigation verifies the effectiveness of this new EPON architecture, and numerical results show that software-defined EPONs can achieve less traffic delay and provide better support for service differentiation in comparison with traditional EPONs.
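    The "pluggable DBA" idea can be sketched as a controller that holds interchangeable bandwidth-allocation callables and applies whichever one is currently installed to the ONUs' queue reports. The names and the two toy allocation policies below are assumptions for illustration, not the algorithms evaluated in the paper.

```python
# Hypothetical sketch of a software-defined DBA module: the controller can
# swap bandwidth-allocation algorithms at run time without new hardware.

from typing import Callable, Dict

# A DBA algorithm maps per-ONU bandwidth requests and a cycle capacity to grants.
DBAAlgorithm = Callable[[Dict[str, int], int], Dict[str, int]]

def fixed_share(requests: Dict[str, int], capacity: int) -> Dict[str, int]:
    """Split the cycle capacity evenly, regardless of demand."""
    share = capacity // max(len(requests), 1)
    return {onu: min(req, share) for onu, req in requests.items()}

def limited_service(requests: Dict[str, int], capacity: int) -> Dict[str, int]:
    """Grant each ONU its request up to a fixed per-ONU cap (a 'limited'-style policy)."""
    cap = capacity // max(len(requests), 1)
    grants, remaining = {}, capacity
    for onu, req in sorted(requests.items()):
        g = min(req, cap, remaining)
        grants[onu] = g
        remaining -= g
    return grants

class SDNController:
    """Holds the currently installed DBA algorithm; it can be replaced at run time."""
    def __init__(self, algorithm: DBAAlgorithm):
        self.algorithm = algorithm

    def install(self, algorithm: DBAAlgorithm):
        self.algorithm = algorithm          # reprogram the DBA module

    def allocate(self, queue_reports: Dict[str, int], cycle_capacity: int):
        return self.algorithm(queue_reports, cycle_capacity)

if __name__ == "__main__":
    ctrl = SDNController(fixed_share)
    print(ctrl.allocate({"onu1": 300, "onu2": 900, "onu3": 100}, 1200))
    ctrl.install(limited_service)           # switch algorithms without touching the OLT hardware
    print(ctrl.allocate({"onu1": 300, "onu2": 900, "onu3": 100}, 1200))
```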

  19. An architecture for a brain-image database

    NASA Technical Reports Server (NTRS)

    Herskovits, E. H.

    2000-01-01

    The widespread availability of methods for noninvasive assessment of brain structure has enabled researchers to investigate neuroimaging correlates of normal aging, cerebrovascular disease, and other processes; we designate such studies as image-based clinical trials (IBCTs). We propose an architecture for a brain-image database, which integrates image processing and statistical operators, and thus supports the implementation and analysis of IBCTs. The implementation of this architecture is described and results from the analysis of image and clinical data from two IBCTs are presented. We expect that systems such as this will play a central role in the management and analysis of complex research data sets.
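    The key architectural idea here, chaining an image-processing operator with a statistical operator over a subject database, can be sketched in a few lines. The toy records, operator names, and numbers below are assumptions for illustration; they do not reflect the paper's schema or data.

```python
# Hypothetical sketch of an image-based clinical trial (IBCT) query: an
# image-processing operator reduces each subject's image to a feature, and a
# statistical operator compares the feature across clinical groups.

import statistics

# Toy "database": one record per subject, with a clinical grouping variable and
# a pre-computed lesion volume standing in for an image-derived measure.
RECORDS = [
    {"subject": "s01", "group": "control", "lesion_volume_ml": 1.2},
    {"subject": "s02", "group": "control", "lesion_volume_ml": 0.9},
    {"subject": "s03", "group": "patient", "lesion_volume_ml": 4.8},
    {"subject": "s04", "group": "patient", "lesion_volume_ml": 3.5},
]

def image_operator(record):
    """Stand-in for an image-processing operator (e.g. segmentation) that
    reduces an image to a scalar feature."""
    return record["lesion_volume_ml"]

def statistical_operator(values_a, values_b):
    """Stand-in for a statistical operator: difference of group means."""
    return statistics.mean(values_b) - statistics.mean(values_a)

controls = [image_operator(r) for r in RECORDS if r["group"] == "control"]
patients = [image_operator(r) for r in RECORDS if r["group"] == "patient"]
print("mean lesion-volume difference (patient - control):",
      statistical_operator(controls, patients))
```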

  20. Architecture for a Generalized Emergency Management Software System

    SciTech Connect

    Hoza, Mark; Bower, John C.; Stoops, LaMar R.; Downing, Timothy R.; Carter, Richard J.; Millard, W. David

    2002-12-19

    The Federal Emergency Management Information System (FEMIS) was originally developed for the Chemical Stockpile Emergency Preparedness Program (CSEPP). It has evolved from a CSEPP-specific emergency management software system into a general-purpose system that supports multiple types of hazards. The latest step in this evolution is the adoption of a hazard analysis architecture that allows a hazard model for each hazard type to be seamlessly incorporated into the FEMIS hazard analysis subsystem. This paper describes that new architecture.
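    The kind of plug-in hazard-model architecture this abstract describes can be sketched as a common model interface plus a registry that dispatches to whichever model matches the event. The class and method names, and the toy formulas, are assumptions for illustration and are not FEMIS code.

```python
# Hypothetical sketch of a pluggable hazard-analysis subsystem: each hazard
# type supplies a model conforming to one interface, and new models can be
# registered without changing the callers.

from abc import ABC, abstractmethod

class HazardModel(ABC):
    """Common interface every hazard-specific model must implement."""
    @abstractmethod
    def affected_radius_km(self, release_quantity: float) -> float:
        ...

class ChemicalPlumeModel(HazardModel):
    def affected_radius_km(self, release_quantity: float) -> float:
        # Toy scaling, not a real dispersion model.
        return 0.5 * release_quantity ** 0.5

class FloodModel(HazardModel):
    def affected_radius_km(self, release_quantity: float) -> float:
        return 0.1 * release_quantity

class HazardAnalysisSubsystem:
    """Registry that dispatches analysis requests to the registered hazard model."""
    def __init__(self):
        self._models = {}

    def register(self, hazard_type: str, model: HazardModel):
        self._models[hazard_type] = model

    def analyze(self, hazard_type: str, release_quantity: float) -> float:
        return self._models[hazard_type].affected_radius_km(release_quantity)

if __name__ == "__main__":
    subsystem = HazardAnalysisSubsystem()
    subsystem.register("chemical", ChemicalPlumeModel())
    subsystem.register("flood", FloodModel())
    print(subsystem.analyze("chemical", 100.0), "km affected radius")
```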