Sample records for integrated analysis environment

  1. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, distributed environment that uses the Adaptive Modeling Language (AML) as the underlying framework. This report focuses on the work done in extending the aerodynamics and aerothermodynamics module using S/HABP, CBAERO, PREMIN and LANMIN. It also details the work done integrating EXITS as the TPS sizing tool.

  2. Integrated Design and Engineering Analysis (IDEA) Environment - Propulsion Related Module Development and Vehicle Integration

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2013-01-01

    This report documents the work performed during the period from May 2011 to October 2012 on the Integrated Design and Engineering Analysis (IDEA) environment. IDEA is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML). This report will focus on describing the work done in the areas of: (1) Integrating propulsion data (turbines, rockets, and scramjets) into the system, and using the data to perform trajectory analysis; (2) Developing a parametric packaging strategy for hypersonic air-breathing vehicles, allowing for tank resizing when multiple fuels and/or oxidizers are part of the configuration; and (3) Vehicle scaling and closure strategies.

  3. Advanced Satellite Workstation - An integrated workstation environment for operational support of satellite system planning and analysis

    NASA Astrophysics Data System (ADS)

    Hamilton, Marvin J.; Sutton, Stewart A.

    A prototype integrated environment, the Advanced Satellite Workstation (ASW), which was developed and delivered for evaluation and operator feedback in an operational satellite control center, is described. The current ASW hardware consists of a Sun workstation and a Macintosh II workstation connected via an Ethernet network, together with network hardware and software, a laser disk system, an optical storage system, and a telemetry data file interface. The central objective of ASW is to provide an intelligent decision support and training environment for operators and analysts of complex systems such as satellites. Compared to the many recent workstation implementations that incorporate graphical telemetry displays and expert systems, ASW provides a considerably broader look at intelligent, integrated environments for decision support, based on the premise that the central features of such an environment are intelligent data access and integrated toolsets.

  4. Analysis of methods. [information systems evolution environment]

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason, the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment, a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  5. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
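
    The abstract describes sensitivity analysis in general terms; as a concrete illustration of the kind of computation involved (not the DART Workbench or its integrated tools themselves), the following minimal Python sketch estimates normalized one-at-a-time sensitivities for a stand-in model. The cantilever-deflection response and all parameter values are hypothetical placeholders.

    ```python
    import numpy as np

    def model(x):
        """Hypothetical engineering response: a stand-in for a complex simulation."""
        length, load, modulus = x
        return load * length**3 / (3.0 * modulus)  # tip deflection of a cantilever beam

    # Nominal parameter values and perturbation size (illustrative numbers only)
    nominal = np.array([2.0, 1.0e3, 2.0e11])
    rel_step = 0.01

    # One-at-a-time central-difference sensitivities, scaled to be dimensionless
    base = model(nominal)
    sensitivities = []
    for i in range(len(nominal)):
        hi = nominal.copy()
        lo = nominal.copy()
        hi[i] *= (1 + rel_step)
        lo[i] *= (1 - rel_step)
        dydx = (model(hi) - model(lo)) / (hi[i] - lo[i])
        sensitivities.append(dydx * nominal[i] / base)  # normalized (elasticity) form

    for name, s in zip(["length", "load", "modulus"], sensitivities):
        print(f"{name:8s} normalized sensitivity: {s:+.3f}")
    ```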

  6. Systematic analysis of signaling pathways using an integrative environment.

    PubMed

    Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard

    2007-01-01

    Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment that has comprehensive capabilities. The environment should include tools for pathway design, visualization, simulation and a knowledge base concerning signaling pathways as one. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization and simulation, together with a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure and its functioning. The system is designed with a client-server architecture. It contains a pathway designing environment and a simulation environment as upper layers with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework. It was also useful for defining the structure of the knowledge base. Sensitivity analysis of this specific pathway was performed, providing simulation data. The model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us to perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.
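
    As an illustration of the simulation and sensitivity-analysis workflow the abstract describes (and emphatically not the authors' TNFa/NF-kB model or their integrative environment), the sketch below integrates a toy two-species signaling cascade with SciPy and compares a baseline trajectory against one with a perturbed rate constant. All species, rate constants and values are invented for illustration.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def cascade(y, t, k_act, k_deact, k_syn, k_deg):
        """Toy two-species signaling cascade (illustrative, not the paper's model)."""
        signal, response = y
        d_signal = k_act - k_deact * signal
        d_response = k_syn * signal - k_deg * response
        return [d_signal, d_response]

    t = np.linspace(0.0, 50.0, 501)
    params = dict(k_act=1.0, k_deact=0.2, k_syn=0.5, k_deg=0.1)

    # Baseline trajectory and a perturbed one for a crude local sensitivity check
    y_base = odeint(cascade, [0.0, 0.0], t, args=tuple(params.values()))
    params_perturbed = dict(params, k_syn=params["k_syn"] * 1.1)
    y_pert = odeint(cascade, [0.0, 0.0], t, args=tuple(params_perturbed.values()))

    print("final response (baseline): %.3f" % y_base[-1, 1])
    print("final response (+10%% k_syn): %.3f" % y_pert[-1, 1])
    ```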

  7. Scenario Analysis: An Integrative Study and Guide to Implementation in the United States Air Force

    DTIC Science & Technology

    1994-09-01

    [Excerpt from the report's table of contents: Environmental Analysis; Classifications of Environments; Characteristics of Environments; Components of the Environmental Analysis Process; Forecasting; Model of the Industry Environment; Model of the Macroenvironment.]

  8. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  9. 40 CFR 610.22 - Device integrity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR, Protection of Environment (2010-07-01): Device integrity, Section 610.22. Environmental Protection Agency (continued), Energy Policy, Fuel Economy Retrofit Devices, Test Procedures and Evaluation Criteria, Evaluation Criteria for the Preliminary Analysis...

  10. NASA Lighting Research, Test, & Analysis

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2015-01-01

    The Habitability and Human Factors Branch at Johnson Space Center in Houston, TX, provides technical guidance for the development of spaceflight lighting requirements, verification of light system performance, analysis of integrated environmental lighting systems, and research of lighting-related human performance issues. The Habitability & Human Factors Lighting Team maintains two physical facilities that are integrated to provide support. The Lighting Environment Test Facility (LETF) provides a controlled darkroom environment for physical verification of lighting systems with photometric and spectrographic measurement systems. The Graphics Research & Analysis Facility (GRAF) maintains the capability for computer-based analysis of operational lighting environments. The combined capabilities of the Lighting Team at Johnson Space Center have been used for a wide range of lighting-related issues.

  11. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
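
    Kepler itself is a Java-based graphical workflow system, so the abstract's workflow cannot be reproduced here; as a rough, language-neutral sketch of the same tool-chaining idea, the following Python snippet drives two external steps (an AltAnalyze-style command and an R comparison script) in sequence. The command names, script names and directory layout are assumptions, not the authors' actual configuration.

    ```python
    import subprocess
    from pathlib import Path

    # Hypothetical stand-ins for the external tools named in the abstract; a real
    # deployment would point at the actual AltAnalyze and R installations.
    ALTANALYZE_CMD = ["python", "AltAnalyze.py", "--platform", "RNASeq"]
    R_COMPARE_SCRIPT = "compare_expression.R"

    def run_step(cmd, workdir):
        """Run one pipeline step, failing loudly if the tool exits with an error."""
        result = subprocess.run(cmd, cwd=workdir, capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"step failed: {' '.join(cmd)}\n{result.stderr}")
        return result.stdout

    def microarray_pipeline(data_dir: str):
        workdir = Path(data_dir)
        run_step(ALTANALYZE_CMD + ["--input", str(workdir / "raw")], workdir)
        run_step(["Rscript", R_COMPARE_SCRIPT, str(workdir / "results")], workdir)

    if __name__ == "__main__":
        microarray_pipeline("./example_study")  # directory layout is assumed
    ```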

  12. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, and it is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  13. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  14. A Mathematics Teacher's Practice in a Technological Environment: A Case Study Analysis Using Two Complementary Theories

    ERIC Educational Resources Information Center

    Tabach, Michal

    2011-01-01

    Integrating technology in school mathematics has become more and more common. The teacher is a key person in integrating technology into everyday practice. To understand teacher practice in a technological environment, this study proposes using two theoretical perspectives: the theory of technological pedagogical content knowledge to analyze…

  15. Systems Integration Analysis for Future Tower Cab Configurations/Systems

    DOT National Transportation Integrated Search

    1978-06-01

    This report presents the results of the analysis of various aspects of the integration of future ATC systems into the tower cab. The impact on the tower cab environment is analyzed from several points of view: how the systems information and displays...

  16. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.
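
    The paper integrates the discipline codes through Java and a CORBA-compliant product; as a simplified stand-in for that pattern (not the authors' implementation), the sketch below exposes a hypothetical aerodynamics analysis as a remote service using Python's built-in XML-RPC, which an integration framework could then call alongside other discipline services. The function name, inputs and returned coefficients are invented.

    ```python
    # Server side: expose a discipline analysis code as a remote service.
    # This is a stand-in for the paper's Java/CORBA integration, not a reproduction of it.
    from xmlrpc.server import SimpleXMLRPCServer

    def run_aero_analysis(mach, alpha):
        """Hypothetical wrapper around a stand-alone aerodynamics code."""
        # A real wrapper would write input files, launch the code, and parse outputs.
        return {"CL": 0.1 * alpha, "CD": 0.02 + 0.001 * mach}

    if __name__ == "__main__":
        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(run_aero_analysis)
        server.serve_forever()

    # Client side (run separately):
    #   import xmlrpc.client
    #   proxy = xmlrpc.client.ServerProxy("http://localhost:8000")
    #   print(proxy.run_aero_analysis(6.0, 2.0))
    ```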

  17. The visual-landscape analysis during the integration of high-rise buildings within the historic urban environment

    NASA Astrophysics Data System (ADS)

    Akristiniy, Vera A.; Dikova, Elena A.

    2018-03-01

    The article is devoted to one of the types of urban planning studies, the visual-landscape analysis performed when integrating high-rise buildings within the historic urban environment, for the purposes of supporting pre-design and design studies aimed at preserving the historical urban environment and realizing the reconstructional resource of the area. The article forms and systematizes the stages and methods of conducting the visual-landscape analysis, taking into account the influence of high-rise buildings on objects of cultural heritage and valuable historical buildings of the city. Practical application of the visual-landscape analysis provides an opportunity to assess the influence of a hypothetical location of high-rise buildings on the perception of the historically developed environment and to determine optimal building parameters. The contents of the main stages of the visual-landscape analysis and their key aspects are described, concerning the construction of predicted zones of visibility of significant, historically valuable urban development objects and of the hypothetically planned high-rise buildings. The obtained data are oriented toward the successive development of the planning and typological structure of the city territory and the preservation of the compositional influence of valuable fragments of the historical environment in the structure of the urban landscape. On this basis, an information database is formed to determine the permissible urban development parameters of high-rise buildings for the preservation of the compositional integrity of the urban area.

  18. Multi-Disciplinary Analysis for Future Launch Systems Using NASA's Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Monell, D.; Mathias, D.; Reuther, J.; Garn, M.

    2003-01-01

    A new engineering environment constructed for the purposes of analyzing and designing Reusable Launch Vehicles (RLVs) is presented. The new environment has been developed to allow NASA to perform independent analysis and design of emerging RLV architectures and technologies. The new Advanced Engineering Environment (AEE) is both collaborative and distributed. It facilitates integration of the analyses by both vehicle performance disciplines and life-cycle disciplines. Current performance disciplines supported include: weights and sizing, aerodynamics, trajectories, propulsion, structural loads, and CAD-based geometries. Current life-cycle disciplines supported include: DDT&E cost, production costs, operations costs, flight rates, safety and reliability, and system economics. Involving six NASA centers (ARC, LaRC, MSFC, KSC, GRC and JSC), AEE has been tailored to serve as a web-accessed agency-wide source for all of NASA's future launch vehicle systems engineering functions. Thus, it is configured to facilitate (a) data management, (b) automated tool/process integration and execution, and (c) data visualization and presentation. The core components of the integrated framework are a customized PTC Windchill product data management server, a set of RLV analysis and design tools integrated using Phoenix Integration's Model Center, and an XML-based data capture and transfer protocol. The AEE system has seen production use during the Initial Architecture and Technology Review for the NASA 2nd Generation RLV program, and it continues to undergo development and enhancements in support of its current main customer, the NASA Next Generation Launch Technology (NGLT) program.

  19. Functional Integration

    NASA Astrophysics Data System (ADS)

    Cartier, Pierre; DeWitt-Morette, Cecile

    2006-11-01

    Acknowledgements; List symbols, conventions, and formulary; Part I. The Physical and Mathematical Environment: 1. The physical and mathematical environment; Part II. Quantum Mechanics: 2. First lesson: gaussian integrals; 3. Selected examples; 4. Semiclassical expansion: WKB; 5. Semiclassical expansion: beyond WKB; 6. Quantum dynamics: path integrals and operator formalism; Part III. Methods from Differential Geometry: 7. Symmetries; 8. Homotopy; 9. Grassmann analysis: basics; 10. Grassmann analysis: applications; 11. Volume elements, divergences, gradients; Part IV. Non-Gaussian Applications: 12. Poisson processes in physics; 13. A mathematical theory of Poisson processes; 14. First exit time: energy problems; Part V. Problems in Quantum Field Theory: 15. Renormalization 1: an introduction; 16. Renormalization 2: scaling; 17. Renormalization 3: combinatorics; 18. Volume elements in quantum field theory Bryce DeWitt; Part VI. Projects: 19. Projects; Appendix A. Forward and backward integrals: spaces of pointed paths; Appendix B. Product integrals; Appendix C. A compendium of gaussian integrals; Appendix D. Wick calculus Alexander Wurm; Appendix E. The Jacobi operator; Appendix F. Change of variables of integration; Appendix G. Analytic properties of covariances; Appendix H. Feynman's checkerboard; Bibliography; Index.

  20. Functional Integration

    NASA Astrophysics Data System (ADS)

    Cartier, Pierre; DeWitt-Morette, Cecile

    2010-06-01

    Acknowledgements; List symbols, conventions, and formulary; Part I. The Physical and Mathematical Environment: 1. The physical and mathematical environment; Part II. Quantum Mechanics: 2. First lesson: gaussian integrals; 3. Selected examples; 4. Semiclassical expansion: WKB; 5. Semiclassical expansion: beyond WKB; 6. Quantum dynamics: path integrals and operator formalism; Part III. Methods from Differential Geometry: 7. Symmetries; 8. Homotopy; 9. Grassmann analysis: basics; 10. Grassmann analysis: applications; 11. Volume elements, divergences, gradients; Part IV. Non-Gaussian Applications: 12. Poisson processes in physics; 13. A mathematical theory of Poisson processes; 14. First exit time: energy problems; Part V. Problems in Quantum Field Theory: 15. Renormalization 1: an introduction; 16. Renormalization 2: scaling; 17. Renormalization 3: combinatorics; 18. Volume elements in quantum field theory Bryce DeWitt; Part VI. Projects: 19. Projects; Appendix A. Forward and backward integrals: spaces of pointed paths; Appendix B. Product integrals; Appendix C. A compendium of gaussian integrals; Appendix D. Wick calculus Alexander Wurm; Appendix E. The Jacobi operator; Appendix F. Change of variables of integration; Appendix G. Analytic properties of covariances; Appendix H. Feynman's checkerboard; Bibliography; Index.

  1. Space Station Environment Control and Life Support System Pressure Control Pump Assembly Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Schunk, R. Gregory

    2002-01-01

    This paper presents the Modeling and Analysis of the Space Station Environment Control and Life Support System Pressure Control Pump Assembly (PCPA). The contents include: 1) Integrated PCPA/Manifold Analyses; 2) Manifold Performance Analysis; 3) PCPA Motor Heat Leak Study; and 4) Future Plans. This paper is presented in viewgraph form.

  2. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework that uses the Adaptive Modeling Language (AML) as the underlying framework and supports configuration design and parametric CFD grid generation. This report focuses on the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  3. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    PubMed

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data are now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, the ability to conduct genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on the Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily expanded as new computational tools become available; 2) it is easily modifiable by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; and 4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.
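
    To make the "on-demand cloud computing resources" point concrete, here is a minimal, hedged sketch of staging an input file to S3 and launching an analysis instance on EC2 with boto3. It is not the BioVLAB-MMIA/XBaya workflow itself; the bucket name, AMI ID, instance type and file name are placeholders.

    ```python
    import boto3

    # Placeholder identifiers; a real deployment would supply its own bucket,
    # AMI and key pair. This only illustrates the "on demand" resource pattern.
    BUCKET = "example-mmia-data"
    AMI_ID = "ami-0123456789abcdef0"

    def stage_inputs_and_launch(expression_file: str):
        s3 = boto3.client("s3")
        s3.upload_file(expression_file, BUCKET, f"inputs/{expression_file}")

        ec2 = boto3.client("ec2")
        response = ec2.run_instances(
            ImageId=AMI_ID,
            InstanceType="m5.large",
            MinCount=1,
            MaxCount=1,
        )
        instance_id = response["Instances"][0]["InstanceId"]
        print("analysis instance started:", instance_id)
        return instance_id

    if __name__ == "__main__":
        stage_inputs_and_launch("mirna_mrna_profiles.csv")
    ```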

  4. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  5. Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.; Martin, John G.

    2008-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs its design and analysis in relative isolation from the others. This is possible in most cases either because the amount of interdisciplinary coupling is minimal or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design & Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary launch vehicle configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, configuration, propulsion, aerodynamics, aerothermodynamics, trajectory, closure and structural analysis into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate. The environment is currently focused around a two-stage-to-orbit configuration with a turbine based combined cycle (TBCC) first stage and a reusable rocket second stage. This paper provides an overview of the development of the IDEA environment, a description of its current status, and details of future plans.

  6. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  7. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The present method of evaluating the quality of the environment (using MAC and MAL) does not make it possible to assess completely and objectively the working conditions of the building industry, because of multiple confounding elements. A solution to this complicated problem, which includes the analysis of various correlated elements of the system "human--work conditions--environment", may be supported by a social norm of morbidity, which is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the whole situation and to reveal the points at risk.

  8. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  9. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
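
    A software connector in the sense described above mediates access to a data source and applies transformation rules before handing data to an analysis tool. The sketch below is a deliberately simplified illustration of that idea, not the paper's ontology-based connector design; the CSV columns, target schema and file name are assumptions.

    ```python
    import csv
    from typing import Callable, Dict, Iterable, List

    # A minimal connector sketch: it reads records from a source, applies a
    # transformation rule, and delivers them in the form an analysis tool expects.
    # Names and formats here are illustrative, not the paper's connector design.
    class GeneExpressionConnector:
        def __init__(self, transform: Callable[[Dict[str, str]], Dict[str, object]]):
            self.transform = transform

        def read(self, path: str) -> Iterable[Dict[str, str]]:
            with open(path, newline="") as handle:
                yield from csv.DictReader(handle)

        def deliver(self, path: str) -> List[Dict[str, object]]:
            return [self.transform(row) for row in self.read(path)]

    # Example transformation rule: map source column names onto the target
    # tool's vocabulary and coerce expression values to floats.
    def to_target_schema(row: Dict[str, str]) -> Dict[str, object]:
        return {"gene_id": row["probe_id"], "expression": float(row["signal"])}

    if __name__ == "__main__":
        connector = GeneExpressionConnector(to_target_schema)
        print(connector.deliver("example_expression.csv")[:3])  # file layout assumed
    ```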

  10. Integrating freight into NEPA analysis

    DOT National Transportation Integrated Search

    2010-09-01

    The efficiency and effectiveness of freight transportation is critical for economic growth and development, ensuring livable communities, and creating a sustainable human and natural environment. It is increasingly important to integrate freight...

  11. Single event test methodology for integrated optoelectronics

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Cooley, James A.; Stassinopoulos, E. G.; Marshall, Paul; Crabtree, Christina

    1993-01-01

    A single event upset (SEU), defined as a transient or glitch on the output of a device, and its applicability to integrated optoelectronics are discussed in the context of spacecraft design and the need for more than a bit error rate viewpoint for testing and analysis. A methodology for testing integrated optoelectronic receivers and transmitters for SEUs is presented, focusing on the actual test requirements and system schemes needed for integrated optoelectronic devices. Two main causes of single event effects in the space environment, including protons and galactic cosmic rays, are considered along with ground test facilities for simulating the space environment.

  12. Using R to implement spatial analysis in open source environment

    NASA Astrophysics Data System (ADS)

    Shao, Yixi; Chen, Dong; Zhao, Bo

    2007-06-01

    R is an open source (GPL) language and environment for spatial analysis, statistical computing and graphics which provides a wide variety of statistical and graphical techniques and is highly extensible. In the Open Source environment it plays an important role in spatial analysis. Implementing spatial analysis in the Open Source environment, which we call Open Source geocomputation, means using the R data analysis language integrated with GRASS GIS and MySQL or PostgreSQL. This paper explains the architecture of the Open Source GIS environment and emphasizes the role R plays in spatial analysis. Furthermore, an illustration of the functions of R is given through the project of constructing CZPGIS (Cheng Zhou Population GIS), supported by the Changzhou Government, China. In this project we use R to implement geostatistics in the Open Source GIS environment in order to evaluate the spatial correlation of land price and estimate it by Kriging interpolation. We also use R integrated with MapServer and PHP to show how R and other Open Source software cooperate with each other in a WebGIS environment, which demonstrates the advantages of using R for spatial analysis in an Open Source GIS environment. In the end, we point out that the packages for spatial analysis in R are still scattered and that limited memory remains a bottleneck when a large number of clients connect at the same time. Further work is therefore to organize the extensive packages or design normative packages, and to make R cooperate better with commercial software such as ArcIMS. We also look forward to developing packages for land price evaluation.
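
    The land-price estimation step mentioned above relies on ordinary kriging; as a compact illustration of that computation (the project itself uses R with GRASS GIS, MySQL/PostgreSQL and MapServer, none of which are reproduced here), the following Python/NumPy sketch solves the ordinary kriging system for one target location with an exponential variogram. The sample coordinates, prices and variogram parameters are synthetic.

    ```python
    import numpy as np

    def exp_variogram(h, nugget=0.0, sill=1.0, rng=500.0):
        """Exponential variogram model (parameters are illustrative)."""
        return nugget + sill * (1.0 - np.exp(-h / rng))

    def ordinary_kriging(xy, values, target):
        """Ordinary kriging estimate at a single target location."""
        n = len(xy)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        a = np.ones((n + 1, n + 1))          # kriging matrix with unbiasedness row/column
        a[:n, :n] = exp_variogram(d)
        a[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(xy - target, axis=1))
        weights = np.linalg.solve(a, b)[:n]  # kriging weights (sum to one)
        return float(weights @ values)

    # Synthetic land-price samples (x, y in metres; price in arbitrary units)
    xy = np.array([[0.0, 0.0], [400.0, 100.0], [150.0, 600.0], [700.0, 650.0]])
    price = np.array([120.0, 95.0, 150.0, 80.0])
    print("kriged estimate:", ordinary_kriging(xy, price, np.array([300.0, 300.0])))
    ```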

  13. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.
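
    CONFIG itself is a NASA prototype and is not reproduced here; as a generic illustration of the discrete event simulation style it builds on, the sketch below models a single flow-path component with a failure event using the open-source SimPy library. The component name, time step and failure time are invented.

    ```python
    import simpy

    # A generic discrete-event sketch in the spirit of component/flow-path modeling.
    # SimPy is used here only as a convenient open-source engine; CONFIG itself is
    # a separate NASA prototype and is not reproduced by this example.
    def pump(env, name, fail_at):
        flow_on = True
        while True:
            yield env.timeout(1.0)  # advance simulated time by one step
            if env.now >= fail_at and flow_on:
                flow_on = False
                print(f"{env.now:5.1f}: {name} failed, flow path closed")
            elif flow_on:
                print(f"{env.now:5.1f}: {name} delivering nominal flow")

    env = simpy.Environment()
    env.process(pump(env, "transfer-pump-A", fail_at=3.0))
    env.run(until=5.0)
    ```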

  14. Integrated Modeling of Optical Systems (IMOS): An Assessment and Future Directions

    NASA Technical Reports Server (NTRS)

    Moore, Gregory; Broduer, Steve (Technical Monitor)

    2001-01-01

    Integrated Modeling of Optical Systems (IMOS) is a finite element-based code combining structural, thermal, and optical ray-tracing capabilities in a single environment for analysis of space-based optical systems. We'll present some recent examples of IMOS usage and discuss future development directions. Due to increasing model sizes and a greater emphasis on multidisciplinary analysis and design, much of the anticipated future work will be in the areas of improved architecture, numerics, and overall performance and analysis integration.

  15. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    The Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependency information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program into model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
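
    The WPS standard named above defines operations such as GetCapabilities, DescribeProcess and Execute over plain HTTP. As a minimal client-side illustration (not the authors' prototype system), the following Python sketch issues a GetCapabilities request against a hypothetical endpoint using the requests library.

    ```python
    import requests

    # Hypothetical WPS endpoint; the paper's prototype service is not named here.
    WPS_URL = "http://example.org/wps"

    def get_capabilities(url: str) -> str:
        """Issue a standard WPS GetCapabilities request and return the raw XML."""
        params = {
            "service": "WPS",
            "request": "GetCapabilities",
        }
        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()
        return response.text

    if __name__ == "__main__":
        xml = get_capabilities(WPS_URL)
        print(xml[:200])  # the XML lists the geo-analysis processes the server exposes
    ```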

  16. PATIKA: an integrated visual environment for collaborative construction and analysis of cellular pathways.

    PubMed

    Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M

    2002-07-01

    The availability of the sequences of entire genomes shifts scientific curiosity towards the identification of the function of genomes on a large scale, as in genome studies. In the near future, data produced about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing this data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors to provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of large-scale microarray-generated data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.

  17. Science Data Report for the Optical Properties Monitor (OPM) Experiment

    NASA Technical Reports Server (NTRS)

    Wilkes, Donald R.; Zwiener, James M.

    1999-01-01

    Long term stability of spacecraft materials when exposed to the space environment continues to be a major area of investigation. The natural and induced environment surrounding a spacecraft can decrease material performance and limit useful lifetimes. The Optical Properties Monitor (OPM) experiment provided the capability to perform the important flight testing of materials and was flown on the Russian Mir Station to study the long term effects of the natural and induced space environment on materials. The core of the OPM in-flight analysis was three independent optical instruments. These instruments included an integrating sphere spectral reflectometer, a vacuum ultraviolet spectrometer, and a Total Integrated Scatter instrument. The OPM also monitored selected components of the environment including molecular contamination. The OPM was exposed on the exterior of the Mir Docking Module for approximately 8-1/2 months. This report describes the OPM experiment, a brief background of its development, program organization, experiment description, mission overview including space environment definition, performance overview, materials data including flight and ground data, in-depth post flight analysis including ground analysis measurements and a summary discussion of the findings and results.

  18. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is hereby not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE, the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focusses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict the demand for the complete century. The initial study raised their data demands from a few GBs to 400 GB for the 3-year study, with tens of TBs expected for the full century.

  19. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.

    2016-01-01

    MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  20. Role of IAC in large space systems thermal analysis

    NASA Technical Reports Server (NTRS)

    Jones, G. K.; Skladany, J. T.; Young, J. P.

    1982-01-01

    Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One such needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented, interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed both as a core framework system which serves as an integrating base whereby users can readily add desired analysis modules, and as a self-contained interdisciplinary system analysis capability having a specific set of fully integrated multidisciplinary analysis programs that deal with the coupling of the thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.

  1. Assessment of Urbanization on the Integrated Land-Ocean-Atmosphere Environment in Coastal Metropolis in Preparation for HyspIRI

    NASA Technical Reports Server (NTRS)

    Sequera, Pedro; McDonald, Kyle C.; Gonzalez, Jorge; Arend, Mark; Krakauer, Nir; Bornstein, Robert; Luvall, Jeffrey

    2012-01-01

    The need for comprehensive studies of the relationships between past and projected changes of regional climate and human activity in complex urban environments has been well established. The HyspIRI preparatory airborne activities in California, associated science and applications research, and eventually HyspIRI itself provide an unprecedented opportunity for development and implementation of an integrated data and modeling analysis system focused on coastal urban environments. We will utilize HyspIRI preparatory data collections in developing new remote sensing-based tools for investigating the integrated urban environment, emphasizing weather, climate, and energy demands in complex coastal cities.

  2. Collaboration and Synergy among Government, Industry and Academia in M&S Domain: Turkey’s Approach

    DTIC Science & Technology

    2009-10-01

    Analysis, Decision Support System Design and Implementation, Simulation Output Analysis, Statistical Data Analysis, Virtual Reality, Artificial... virtual and constructive visual simulation systems as well as integrated advanced analytical models... simulation systems that are ready to use, credible, integrated with C4ISR systems. Creating synthetic environments and/or virtual prototypes of concepts

  3. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections, in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment: a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to address impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental state.

  4. Quality assessment of urban environment

    NASA Astrophysics Data System (ADS)

    Ovsiannikova, T. Y.; Nikolaenko, M. N.

    2015-01-01

    This paper addresses the applicability of quality management principles to construction products. It proposes expanding the boundaries of quality management in construction by transferring its principles to urban systems, economic systems of a higher level whose qualitative characteristics are substantially defined by the quality of construction products. Buildings and structures form the spatial-material basis of cities and the most important component of the life sphere - the urban environment. The authors justify the need for assessment of urban environment quality as an important factor of social welfare and life quality in urban areas, and suggest a definition of the term "urban environment". The methodology of quality assessment of the urban environment is based on an integrated approach which includes system analysis of all factors and application of both quantitative methods of assessment (calculation of particular and integrated indicators) and qualitative methods (expert estimates and surveys). The authors propose a system of indicators characterizing the quality of the urban environment; these indicators fall into four classes, and the methodology for their definition is shown. The paper presents results of quality assessment of the urban environment for several Siberian regions and a comparative analysis of these results.

  5. An Integrated Approach for Urban Earthquake Vulnerability Analyses

    NASA Astrophysics Data System (ADS)

    Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.

    2009-04-01

    The earthquake risk for an urban area has increased over the years due to the increasing complexity of urban environments. The main reasons are the location of major cities in hazard-prone areas, growth in urbanization and population, and rising wealth. In recent years the physical consequences of these factors have been observed through the growing costs of major disasters in urban areas, which have stimulated a demand for in-depth evaluation of possible strategies to manage the large-scale damaging effects of earthquakes. Understanding and formulating urban earthquake risk requires consideration of a wide range of risk aspects, which can be handled by developing an integrated approach. In such an integrated approach, an interdisciplinary view should be incorporated into the risk assessment. Risk assessment for an urban area requires prediction of vulnerabilities related to the elements at risk in the urban area and integration of the individual vulnerability assessments. However, due to the complex nature of an urban environment, estimating vulnerabilities and integrating them necessitates integrated approaches in which the vulnerabilities of social, economic, structural (building stock and infrastructure), cultural and historical heritage elements are estimated for a given urban area over a given time period. In this study an integrated urban earthquake vulnerability assessment framework is proposed which considers the vulnerability of the urban environment in a holistic manner and performs the vulnerability assessment for the smallest administrative unit, namely the neighborhood scale. The main motivation behind this approach is the inability to implement existing vulnerability assessment methodologies in countries like Turkey, where the required data are usually missing or inadequate and decision makers seek prioritization of their limited risk-reduction resources across the administrative districts for which they are responsible. The methodology integrates socio-economic, structural, coastal, ground-condition, and organizational vulnerabilities, as well as accessibility to critical services, within the framework. The proposed framework has the following eight components: seismic hazard analysis, soil response analysis, tsunami inundation analysis, structural vulnerability analysis, socio-economic vulnerability analysis, accessibility to critical services, GIS-based integrated vulnerability assessment, and visualization of vulnerabilities in a 3D virtual city model. The integrated model for the various vulnerabilities of the urban area is developed in a GIS environment using the individual vulnerability assessments for the considered elements at risk and serves as the backbone of the spatial decision support system. The stages followed in the model are: determination of a common mapping unit for each aspect of urban earthquake vulnerability, formation of a geo-database for the vulnerabilities, evaluation of urban vulnerability based on multi-attribute utility theory with various weighting algorithms, and mapping of the evaluated integrated earthquake risk in a geographic information system (GIS) at the neighborhood scale. The framework is also applicable at larger geographical mapping scales, for example the building scale. When illustrating results at the building scale, 3D visualizations with remote sensing data are used so that decision makers can easily interpret the outputs.
The proposed vulnerability assessment framework is flexible and can easily be applied to urban environments at various geographical scales with different mapping units. The total vulnerability maps obtained for the urban area provide a baseline for the development of risk reduction strategies by decision makers. Moreover, as several aspects of the elements at risk in an urban area are considered through the vulnerability analyses, the effect of changes in vulnerability conditions on the total can easily be determined. The developed approach also enables decision makers to monitor temporal and spatial changes in the urban environment due to the implementation of risk reduction strategies.
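
    As an illustration of the aggregation step only, the sketch below combines normalized neighborhood vulnerability scores with additive multi-attribute weights; the layer names, scores, and weights are hypothetical and do not come from the study or its GIS implementation.

```python
# Illustrative weighted multi-attribute aggregation of neighborhood
# vulnerability layers (hypothetical names, scores, and weights).

def aggregate_vulnerability(scores, weights):
    """Combine normalized [0, 1] vulnerability scores with additive weights that sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(weights[k] * scores[k] for k in weights)

# One hypothetical neighborhood: each aspect scored on [0, 1].
neighborhood = {
    "structural": 0.72,
    "socio_economic": 0.55,
    "ground_condition": 0.40,
    "tsunami_inundation": 0.10,
    "accessibility": 0.35,
}

weights = {
    "structural": 0.30,
    "socio_economic": 0.25,
    "ground_condition": 0.20,
    "tsunami_inundation": 0.10,
    "accessibility": 0.15,
}

total = aggregate_vulnerability(neighborhood, weights)
print(f"Integrated vulnerability index: {total:.2f}")
```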

  6. Expert diagnostics system as a part of analysis software for power mission operations

    NASA Technical Reports Server (NTRS)

    Harris, Jennifer A.; Bahrami, Khosrow A.

    1993-01-01

    The operation of interplanetary spacecraft at JPL has become an increasingly complex activity. This complexity is due to advanced spacecraft designs and ambitious mission objectives which lead to operations requirements that are more demanding than those of any previous mission. For this reason, several productivity enhancement measures are underway at JPL within mission operations, particularly in the spacecraft analysis area. These measures aimed at spacecraft analysis include: the development of a multi-mission, multi-subsystem operations environment; the introduction of automated tools into this environment; and the development of an expert diagnostics system. This paper discusses an effort to integrate the above mentioned productivity enhancement measures. A prototype was developed that integrates an expert diagnostics system into a multi-mission, multi-subsystem operations environment using the Galileo Power / Pyro Subsystem as a testbed. This prototype will be discussed in addition to background information associated with it.

  7. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in the Langley Integrated Fluid-Thermal-Structural (LIFTS) analyzer. The evolution and status of LIFTS are reviewed and illustrated through applications.
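
    The coupling described above can be pictured as a fixed-point iteration between the disciplines. The following minimal Python sketch uses placeholder one-line surrogates for the aerodynamic, thermal, and structural solvers; it only illustrates the loosely coupled interaction an integrated analyzer such as LIFTS is meant to supersede, and is not LIFTS code.

```python
# Minimal sketch of a loosely coupled aerothermostructural iteration
# (placeholder one-line "solvers"; not the LIFTS implementation).

def aero_heating(deflection):
    # Surrogate: heating rises slightly as the surface deflects.
    return 100.0 * (1.0 + 0.05 * deflection)

def thermal_response(heating):
    # Surrogate: wall temperature scales with heating.
    return 300.0 + 2.0 * heating

def structural_response(temperature):
    # Surrogate: thermal expansion drives a small deflection.
    return 1.0e-3 * (temperature - 300.0)

deflection, tol = 0.0, 1.0e-6
for it in range(100):
    q = aero_heating(deflection)             # aerodynamic flow field
    T = thermal_response(q)                  # structural heat transfer
    new_deflection = structural_response(T)  # structural response
    if abs(new_deflection - deflection) < tol:
        break
    deflection = new_deflection

print(f"Converged in {it + 1} iterations: deflection = {deflection:.4f}")
```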

  8. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of the Emeraude environment over the project time frame is summarized, and several related areas for future research are identified.

  9. MicroScope-an integrated resource for community expertise of gene functions and comparative analysis of microbial genomic and metabolic data.

    PubMed

    Médigue, Claudine; Calteau, Alexandra; Cruveiller, Stéphane; Gachet, Mathieu; Gautreau, Guillaume; Josso, Adrien; Lajus, Aurélie; Langlois, Jordan; Pereira, Hugo; Planel, Rémi; Roche, David; Rollin, Johan; Rouy, Zoe; Vallenet, David

    2017-09-12

    The overwhelming list of new bacterial genomes becoming available on a daily basis makes accurate genome annotation an essential step that ultimately determines the relevance of thousands of genomes stored in public databanks. The MicroScope platform (http://www.genoscope.cns.fr/agc/microscope) is an integrative resource that supports systematic and efficient revision of microbial genome annotation, data management and comparative analysis. Starting from the results of our syntactic, functional and relational annotation pipelines, MicroScope provides an integrated environment for the expert annotation and comparative analysis of prokaryotic genomes. It combines tools and graphical interfaces to analyze genomes and to perform the manual curation of gene function in a comparative genomics and metabolic context. In this article, we describe the free-of-charge MicroScope services for the annotation and analysis of microbial (meta)genomes, transcriptomic and re-sequencing data. The functionalities of the platform are then presented in a way that provides practical guidance and help to nonspecialists in bioinformatics. Newly integrated analysis tools (i.e. prediction of virulence and resistance genes in bacterial genomes) and a recently developed method (the pan-genome graph representation) are also described. Integrated environments such as MicroScope clearly contribute, through the user community, to maintaining accurate resources. © The Author 2017. Published by Oxford University Press.

  10. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE PAGES

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...

    2016-01-01

    We present MADNESS (multiresolution adaptive numerical environment for scientific simulation), a high-level software environment for solving integral and differential equations in many dimensions. It uses adaptive, fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  11. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  12. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing electro-optical (EO) sensors such as those designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratory. This funded effort is a "risk reduction effort" to help determine whether the combination of Comet and WaveTrain™, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  13. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

    An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near infrared spectroscopy (NIRS) and electroencephalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measurement and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that supports multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures that are directly and indirectly observable. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography.

  14. A Collaborative Extensible User Environment for Simulation and Knowledge Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Lansing, Carina S.; Porter, Ellen A.

    2015-06-01

    In scientific simulation, scientists use measured data to create numerical models, execute simulations and analyze results from advanced simulators executing on high performance computing platforms. This process usually requires a team of scientists collaborating on data collection, model creation and analysis, and on authorship of publications and data. This paper shows that scientific teams can benefit from a user environment called Akuna that permits subsurface scientists in disparate locations to collaborate on numerical modeling and analysis projects. The Akuna user environment is built on the Velo framework that provides both a rich client environment for conducting and analyzing simulations and a Web environment for data sharing and annotation. Akuna is an extensible toolset that integrates with Velo, and is designed to support any type of simulator. This is achieved through data-driven user interface generation, use of a customizable knowledge management platform, and an extensible framework for simulation execution, monitoring and analysis. This paper describes how the customized Velo content management system and the Akuna toolset are used to integrate and enhance an effective collaborative research and application environment. The extensible architecture of Akuna is also described and demonstrates its usage for creation and execution of a 3D subsurface simulation.

  15. System integration report

    NASA Technical Reports Server (NTRS)

    Badler, N. I.; Korein, J. D.; Meyer, C.; Manoochehri, K.; Rovins, J.; Beale, J.; Barr, B.

    1985-01-01

    Several areas that arise from the system integration issue were examined. Intersystem analysis is discussed as it relates to software development, shared data bases and interfaces between TEMPUS and PLAID, shaded graphics rendering systems, object design (BUILD), the TEMPUS animation system, anthropometric lab integration, ongoing TEMPUS support and maintenance, and the impact of UNIX and local workstations on the OSDS environment.

  16. Analysis of minimum rail size in heavy axle load environment

    DOT National Transportation Integrated Search

    2013-04-15

    The effects of increasing axle loads on rail integrity are examined in this paper. In the present context, rail integrity refers to the prevention and control of rail failures. Rail failures usually occur because cracks or defects develop and grow fr...

  17. Navy Collaborative Integrated Information Technology Initiative

    DTIC Science & Technology

    2000-01-11

    investigating the development and application of collaborative multimedia conferencing software for education and other groupwork activities. We are extending...an alternative environment for place-based synchronous groupwork. The new environment is based on the same collaborative infrastructure as the...alternative environment for place-based synchronous groupwork. This information is being used as an initial user profile, requirements analysis

  18. Methods for radiation detection and characterization using a multiple detector probe

    DOEpatents

    Akers, Douglas William; Roybal, Lyle Gene

    2014-11-04

    Apparatuses, methods, and systems relating to radiological characterization of environments are disclosed. Multi-detector probes with a plurality of detectors in a common housing may be used to substantially concurrently detect a plurality of different radiation activities and types. Multiple multi-detector probes may be used in a down-hole environment to substantially concurrently detect radioactive activity and contents of a buried waste container. Software may process, analyze, and integrate the data from the different multi-detector probes and the different detector types therein to provide source location and integrated analysis as to the source types and activity in the measured environment. Further, the integrated data may be used to compensate for differential density effects and the effects of radiation shielding materials within the volume being measured.

  19. An inexpensive, temporally-integrated system for monitoring occurrence and biological effects of aquatic contaminants in the field

    EPA Science Inventory

    Assessment of potential ecological risks of complex contaminant mixtures in the environment requires integrated chemical and biological approaches. Instrumental analysis of environmental samples alone can identify contaminants, but provides only limited insights as to possible a...

  20. The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: applications in neuroscience and neuroinformatics research

    PubMed Central

    Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.

    2014-01-01

    The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computed tomography (CT), electron microscopy and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinformatics and brain imaging research. PMID:24734019

  1. The Application of Virtual Intensive Care Unit Principles in the Aeromedical Evacuation Environment Can Improve Patient Safety, Lead to Better Patient Outcomes and Deliver Integrated Medical Care

    DTIC Science & Technology

    2016-02-01

    not reflect the official policy or position of the US government, the Department of Defense, or Air University. In accordance with Air Force...capability in the AE environment. Utilizing current civilian and Department of Defense (DoD) vICU research, an analysis of the principles demonstrates...deliver integrated medical care. This paper provides a historical narrative of telemedicine and vICU principles and highlights the utility of this

  2. An integrated environment for tactical guidance research and evaluation

    NASA Technical Reports Server (NTRS)

    Goodrich, Kenneth H.; Mcmanus, John W.

    1990-01-01

    NASA-Langley's Tactical Guidance Research and Evaluation System (TGRES) constitutes an integrated environment for the development of tactical guidance algorithms and evaluating the effects of novel technologies; the modularity of the system allows easy modification or replacement of system elements in order to conduct evaluations of alternative technologies. TGRES differs from existing systems in its capitalization on AI programming techniques for guidance-logic implementation. Its ability to encompass high-fidelity, six-DOF simulation models will facilitate the analysis of complete aircraft dynamics.

  3. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    NASA Technical Reports Server (NTRS)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S Environments and Infrastructure.

  4. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  5. True Concurrent Thermal Engineering Integrating CAD Model Building with Finite Element and Finite Difference Methods

    NASA Technical Reports Server (NTRS)

    Panczak, Tim; Ring, Steve; Welch, Mark

    1999-01-01

    Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from, and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, while fully integrated into a neutral, low-cost CAD system and utilizing both FEM and FD methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface-based radiation and FD-based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.

  6. Design for perception management system on offshore reef based on integrated management

    NASA Astrophysics Data System (ADS)

    Peng, Li; Qiankun, Wang

    2017-06-01

    Based on an analysis of actual monitoring demands using integrated management and information technology, a quad monitoring system is proposed to provide intelligent perception of offshore reefs, covering indoor building environments, architectural structures, facilities, and perimeter integrity. This will strengthen the ability to analyse and evaluate offshore reef operation and health, promoting efficiency in decision making.

  7. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
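
    To make the template idea concrete, the following Python/sqlite3 sketch retrieves a subset of a relation described by a template (relation, columns, selection criteria, sort order). The table and column names are invented for illustration; the actual system generates FORTRAN retrieval/storage routines from its SYSTEM LIBRARY rather than Python.

```python
# Sketch of template-driven data access in the spirit of EASIE
# (invented table/column names; the real system emits FORTRAN routines).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wing (case_id INTEGER, span REAL, area REAL, sweep REAL)")
db.executemany("INSERT INTO wing VALUES (?, ?, ?, ?)",
               [(1, 30.0, 250.0, 25.0), (2, 35.0, 300.0, 30.0), (3, 28.0, 220.0, 20.0)])

# A "template": which relation, columns, selection, and ordering an
# application needs as input.
template = {
    "relation": "wing",
    "columns": ["case_id", "span", "area"],
    "where": "sweep >= 25.0",
    "order_by": "span DESC",
}

def retrieve(db, t):
    sql = (f"SELECT {', '.join(t['columns'])} FROM {t['relation']} "
           f"WHERE {t['where']} ORDER BY {t['order_by']}")
    return db.execute(sql).fetchall()

for row in retrieve(db, template):
    print(row)
```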

  8. Integrated Optical Design Analysis (IODA): New Test Data and Modeling Features

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; Patrick, Brian

    2003-01-01

    A general overview of the capabilities of the IODA (Integrated Optical Design Analysis) software is presented. IODA promotes efficient exchange of data and modeling results between the thermal, structures, optical design, and testing engineering disciplines. This presentation focuses on new features added to the software that allow measured test data to be imported into the IODA environment for post-processing or comparison with pretest model predictions.

  9. 20-Year Status Report on the Human Systems Integration (HSI)/Manpower and Personnel Integration (MANPRINT) Soldier Survivability Domain

    DTIC Science & Technology

    2015-05-01

    ARL/SLAD Electromagnetic Vulnerability Analysis Facility: provides the results of radio frequency detection and jamming susceptibility analysis of the...Lecture 10: Environment/Weather*  Lecture 11: Habitability*  Lecture 12: Air Composition and Pressure*  Lecture 14: Communications ...Lecture 15: Radio Frequency/Microwave/E3I  Lecture 16: Vibration/Acoustic Noise  Lecture 17: Acceleration/Stabilization  Lecture 18

  10. 40 CFR 161.170 - Preliminary analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Preliminary analysis. (a) If the product is produced by an integrated system, the applicant must provide a preliminary analysis of each technical grade of active ingredient contained in the product to identify all... (40 CFR 161.170, Protection of Environment, revised 2010-07-01)

  11. IN13B-1660: Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Technical Reports Server (NTRS)

    Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris

    2016-01-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD), where we are developing a new QA pipeline for the 25PB system.

  12. Analytics and Visualization Pipelines for Big ­Data on the NASA Earth Exchange (NEX) and OpenNEX

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.

    2016-12-01

    We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD), where we are developing a new QA pipeline for the 25PB system.

  13. An Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.

    2011-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs design and analysis in relative isolation from others. This is possible, in most cases, either because the amount of interdisciplinary coupling is minimal, or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable, as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design and Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary designs for launch vehicle and high speed atmospheric flight configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, packaging, propulsion, trajectory, aerodynamics, aerothermodynamics, engine and airframe subsystem design, thermal and structural analysis, and vehicle closure into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate. The environment is currently focused around a two-stage-to-orbit configuration with a turbine-based combined cycle (TBCC) first stage and a reusable rocket second stage. IDEA will be rolled out in generations, with each successive generation providing a significant increase in capability, either through increased analytic fidelity, expansion of vehicle classes considered, or by the inclusion of advanced modeling techniques. This paper provides the motivation behind the current effort, an overview of the development of the IDEA environment (including the contents and capabilities to be included in Generation 1 and Generation 2), and a description of the current status and details of future plans.

  14. DELIVERing Library Resources to the Virtual Learning Environment

    ERIC Educational Resources Information Center

    Secker, Jane

    2005-01-01

    Purpose: Examines a project to integrate digital libraries and virtual learning environments (VLE) focusing on requirements for online reading list systems. Design/methodology/approach: Conducted a user needs analysis using interviews and focus groups and evaluated three reading or resource list management systems. Findings: Provides a technical…

  15. Integration of passive driver-assistance systems with on-board vehicle systems

    NASA Astrophysics Data System (ADS)

    Savchenko, V. V.; Poddubko, S. N.

    2018-02-01

    The implementation in OIAS of functions such as driver-state monitoring and high-precision calculation of the vehicle's current navigation coordinates, the modularity of the OIAS architecture, and the possible increase in functionality through integration with other onboard systems give the approach a promising development future. The development of intelligent transport systems and their components allows fundamentally new safety tasks to be set and solved for human-to-machine transport systems, and the automatic analysis of heterogeneous information flows provides a synergistic effect. Analysis of cross-modal information exchange in human-machine transport systems, from a uniform methodological point of view, will allow us, with an accuracy acceptable for solving applied problems, to form in real time an integrated assessment of the state of the basic components of the human-to-machine system and the dynamics of the changing situation-centered environment, including the external environment, in their interrelations.

  16. Energy Integrated Lighting-Heating-Cooling System.

    ERIC Educational Resources Information Center

    Meckler, Gershon; And Others

    1964-01-01

    Energy balance problems in the design of office buildings are analyzed. Through the use of integrated systems utilizing dual purpose products, a controlled environment with minimum expenditure of energy, equipment and space can be provided. Contents include--(1) office building occupancy loads, (2) office building heating load analysis, (3) office…

  17. Integration of a Professional Dancer into College

    ERIC Educational Resources Information Center

    Pulinkala, Ivan

    2011-01-01

    This paper evaluates the experience of integrating a professional dancer into a collegiate environment using a case study method of qualitative research. Interviews, focus groups, observation studies and document analysis techniques were used to gather qualitative data for this study. The diffusion of innovation theory was used to highlight the…

  18. Graphing Calculator Exposure of Mathematics Learning in a Partially Technology Incorporated Environment

    ERIC Educational Resources Information Center

    Kharuddin, Azrul Fazwan; Ismail, Noor Azina

    2017-01-01

    Integrating technology in the mathematics curriculum has become a necessary task for curriculum developers as well as mathematics practitioners across the world and time. In general, research studies seeking a better understanding of how best to integrate mathematics analysis tools with mathematics subject matter normally observe mathematics…

  19. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on computational tools and facilities for the analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  20. Cost Sensitivity Analysis for Consolidated Interim Storage of Spent Fuel: Evaluating the Effect of Economic Environment Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cumberland, Riley M.; Williams, Kent Alan; Jarrell, Joshua J.

    This report evaluates how the economic environment (i.e., discount rate, inflation rate, escalation rate) can impact previously estimated differences in lifecycle costs between an integrated waste management system with an interim storage facility (ISF) and a similar system without an ISF.

  1. International Management: Creating a More Realistic Global Planning Environment.

    ERIC Educational Resources Information Center

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  2. Power Grid Data Analysis with R and Hadoop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Gibson, Tara D.; Kleese van Dam, Kerstin

    This book chapter presents an approach to analysis of large-scale time-series sensor information based on our experience with power grid data. We use the R-Hadoop Integrated Programming Environment (RHIPE) to analyze a 2TB data set and present code and results for this analysis.
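
    The chapter's analysis is written in R with RHIPE; the Python sketch below only illustrates the underlying divide-and-recombine pattern (partition keyed time-series records, summarize each subset, recombine the results) and is neither the RHIPE API nor the chapter's code.

```python
# Divide-and-recombine sketch for keyed time-series sensor data
# (conceptual only; the chapter itself uses RHIPE on Hadoop in R).
from collections import defaultdict
from statistics import mean

readings = [  # (sensor_id, timestamp, frequency_hz) -- toy records
    ("pmu_1", 0, 59.98), ("pmu_1", 1, 60.01), ("pmu_1", 2, 60.02),
    ("pmu_2", 0, 59.95), ("pmu_2", 1, 59.97), ("pmu_2", 2, 60.00),
]

# Divide: group records by sensor (the partition step).
groups = defaultdict(list)
for sensor, t, hz in readings:
    groups[sensor].append(hz)

# Apply: summarize each subset independently.
summaries = {s: {"mean": mean(v), "min": min(v), "max": max(v)}
             for s, v in groups.items()}

# Recombine: merge per-subset results into one analysis product.
for sensor, stats in sorted(summaries.items()):
    print(sensor, stats)
```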

  3. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

    An analytical tool was developed to model, analyze and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
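
    As a toy illustration of the surface best-fit step named above, the sketch below least-squares fits a paraboloid z = r^2/(4f) + c to sampled surface points and reports the RMS deviation; the sample points are invented and this is not the Integrated Mesh Analysis System's implementation.

```python
# Least-squares best-fit paraboloid and RMS surface error sketch
# (invented sample points; not the Integrated Mesh Analysis System).
import numpy as np

# Sampled mesh surface points (x, y, z) with small distortions.
pts = np.array([
    [0.0, 0.0, 0.001], [1.0, 0.0, 0.126], [0.0, 1.0, 0.124],
    [-1.0, 0.0, 0.125], [0.0, -1.0, 0.127], [1.0, 1.0, 0.251],
    [-1.0, 1.0, 0.249], [1.0, -1.0, 0.250], [-1.0, -1.0, 0.252],
])
x, y, z = pts.T
r2 = x**2 + y**2

# Fit z = a*r2 + c  (with a = 1/(4f)) by linear least squares.
A = np.column_stack([r2, np.ones_like(r2)])
(a, c), *_ = np.linalg.lstsq(A, z, rcond=None)

residual = z - (a * r2 + c)
rms = np.sqrt(np.mean(residual**2))
print(f"focal length f = {1.0 / (4.0 * a):.3f}, RMS surface error = {rms:.4e}")
```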

  4. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis

    PubMed Central

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153

  5. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.
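
    The honeycomb idea of typed objects carrying metadata and hierarchical permissions behind one query interface can be caricatured in a few lines. The sketch below is a toy illustration only, with invented object types, users, and permission names; it is not the HIVE API.

```python
# Toy illustration of typed data objects with metadata and hierarchical
# permissions behind a single query interface (not the HIVE API).
from dataclasses import dataclass, field

@dataclass
class DataObject:
    obj_type: str                      # e.g. "fastq", "alignment", "report"
    metadata: dict
    acl: dict = field(default_factory=dict)      # user -> set of permissions
    parent: "DataObject | None" = None

    def can(self, user, perm):
        # Permissions inherit down the hierarchy unless set locally.
        if user in self.acl:
            return perm in self.acl[user]
        return self.parent.can(user, perm) if self.parent else False

project = DataObject("project", {"name": "flu-2016"}, {"alice": {"read", "write"}})
run = DataObject("fastq", {"sample": "S1", "reads": 1_200_000}, parent=project)

def search(objects, **criteria):
    """Single interface to query objects of any type by metadata."""
    return [o for o in objects
            if all(o.metadata.get(k) == v for k, v in criteria.items())]

print(search([project, run], sample="S1"))
print(run.can("alice", "read"), run.can("bob", "read"))
```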

  6. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region

    PubMed Central

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-01

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. The elaboration of the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity supports analysis of the causes of atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of the 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in the exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for the study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management. PMID:29342852

  7. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    PubMed

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. The elaboration of the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity supports analysis of the causes of atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of the 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in the exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for the study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.
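
    As a simplified illustration of the weighting-and-combination step only, the sketch below derives AHP weights from a pairwise-comparison matrix (principal eigenvector) and forms a composite vulnerability score per city; the matrix, city indices, and the linear combination are hypothetical and are not the paper's model.

```python
# Simplified AHP weighting + composite vulnerability sketch
# (hypothetical pairwise matrix and city indices; not the paper's model).
import numpy as np

# Pairwise comparison of Exposure, Sensitivity, Adaptive capacity (Saaty scale).
P = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])

# AHP weights = normalized principal eigenvector of P.
eigvals, eigvecs = np.linalg.eig(P)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()

cities = {  # normalized (exposure, sensitivity, adaptive capacity) in [0, 1]
    "city_A": (0.8, 0.6, 0.4),
    "city_B": (0.5, 0.7, 0.6),
    "city_C": (0.3, 0.4, 0.8),
}

for name, (E, S, A) in cities.items():
    # Higher adaptive capacity lowers vulnerability, hence (1 - A).
    V = w[0] * E + w[1] * S + w[2] * (1.0 - A)
    print(f"{name}: vulnerability = {V:.2f}")
```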

  8. Velo and REXAN - Integrated Data Management and High Speed Analysis for Experimental Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Carson, James P.; Corrigan, Abigail L.

    2013-01-10

    The Chemical Imaging Initiative at the Pacific Northwest National Laboratory (PNNL) is creating a 'Rapid Experimental Analysis' (REXAN) Framework, based on the concept of reusable component libraries. REXAN allows developers to quickly compose and customize high-throughput analysis pipelines for a range of experiments, as well as supporting the creation of multi-modal analysis pipelines. In addition, PNNL has coupled REXAN with its collaborative data management and analysis environment Velo to create easy-to-use data management and analysis environments for experimental facilities. This paper will discuss the benefits of Velo and REXAN in the context of three examples: (1) PNNL high-resolution mass spectrometry - reducing analysis times from hours to seconds and enabling the analysis of much larger data samples (100 KB to 40 GB) at the same time; (2) ALS X-ray tomography - reducing analysis times of combined STXM and EM data collected at the ALS from weeks to minutes, decreasing manual work and increasing the data volumes that can be analysed in a single step; and (3) multi-modal nano-scale analysis of STXM and TEM data - providing a semi-automated process for particle detection. The creation of REXAN has significantly shortened the development time for these analysis pipelines. The integration of Velo and REXAN has significantly increased the scientific productivity of the instruments and their users by creating easy-to-use data management and analysis environments with greatly reduced analysis times and improved analysis capabilities.
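
    The component-library idea can be illustrated by chaining small single-purpose steps into one pipeline callable. The step names below are invented stand-ins; the sketch shows only the composition pattern, not REXAN itself.

```python
# Composing an analysis pipeline from reusable single-purpose steps
# (invented step names; illustrates the composition idea, not REXAN).
from functools import reduce

def load(frames):              # stand-in for instrument data ingestion
    return list(frames)

def denoise(frames):           # stand-in for a filtering component
    return [max(f - 0.05, 0.0) for f in frames]

def detect_particles(frames):  # stand-in for a feature-detection component
    return [f for f in frames if f > 0.5]

def compose(*steps):
    """Chain components left-to-right into one pipeline callable."""
    return lambda data: reduce(lambda d, step: step(d), steps, data)

pipeline = compose(load, denoise, detect_particles)
print(pipeline([0.2, 0.9, 0.55, 0.48, 0.7]))
```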

  9. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment which allows sub-system performance specifications to be analyzed parametrically, and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and non-linear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly-effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly links to science requirements.
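
    For reference, the conventional root-sum-of-squares budget that the abstract describes as insufficient combines N contributors as independent, uncorrelated terms; it is exactly this independence assumption that breaks down for a tightly coupled observatory.

```latex
% Conventional RSS combination of N independent error contributors
\sigma_{\mathrm{WFE}} = \sqrt{\sum_{i=1}^{N} \sigma_i^{2}}
```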

  10. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    A GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  11. EWB: The Environment WorkBench Version 4.0

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Environment WorkBench (EWB) is a desktop integrated analysis tool for studying a spacecraft's interactions with its environment. Over 100 environment and analysis models are integrated into the menu-based tool. EWB, which was developed for and under the guidance of the NASA Lewis Research Center, is built atop the Module Integrator and Rule-based Intelligent Analytic Database (MIRIAD) architecture. This allows every module in EWB to communicate information to other modules in a transparent manner from the user's point of view. It removes the tedious and error-prone steps of entering data by hand from one model to another. EWB runs under UNIX operating systems (SGI and SUN workstations) and under MS Windows (3.x, 95, and NT) operating systems. MIRIAD, the unique software that makes up the core of EWB, provides the flexibility to easily modify old models and incorporate new ones as user needs change. The MIRIAD approach separates the computer-assisted engineering (CAE) tool into three distinct units: 1) a modern graphical user interface to present information; 2) a data dictionary interpreter to coordinate analysis; and 3) a database for storing system designs and analysis results. The user interface is externally programmable through ASCII data files, which contain the location and type of information to be displayed on the screen. This approach provides great flexibility in tailoring the look and feel of the code to individual user needs. MIRIAD-based applications, such as EWB, have utilities for viewing tabulated parametric study data, XY line plots, contour plots, and three-dimensional plots of contour data and system geometries. In addition, a Monte Carlo facility is provided to allow statistical assessments (including uncertainties) in models or data.

  12. UAS Integration into the NAS Project

    NASA Technical Reports Server (NTRS)

    Bauer, Jeff

    2010-01-01

    The goal of the UAS Integration in the NAS Project is to contribute capabilities that reduce technical barriers related to the safety and operational challenges associated with enabling routine UAS access to the NAS. This goal will be accomplished through a two-phased approach of system-level integration of key concepts, technologies and/or procedures, and demonstrations of integrated capabilities in an operationally relevant environment. Technical objectives include: PHASE 1: a) Validating the key technical areas identified by this project. System-level analyses, a State of the Art Analysis (SOAA), and a ConOps will identify the challenges and barriers preventing routine UAS access to the NAS. b) Developing a national roadmap and gap analysis identifying specific deliverables in the areas of operations, procedures, and technologies that will impact future policy decisions. PHASE 2: a) Provide regulators with a methodology for developing airworthiness requirements for UAS and data to support development of certification standards and regulatory guidance. b) Provide systems-level integrated testing of concepts and/or capabilities that address barriers to routine access to the NAS. Through simulation and flight testing, address issues including separation assurance, communications requirements, and Pilot Aircraft Interfaces (PAIs) in operationally relevant environments.

  13. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  14. Modelling Student Satisfaction and Motivation in the Integrated Educational Environment: An Empirical study

    ERIC Educational Resources Information Center

    Stukalina, Yulia

    2016-01-01

    Purpose: The purpose of this paper is to explore some issues related to enhancing the quality of educational services provided by a university in the agenda of integrating quality assurance activities and strategic management procedures. Design/methodology/approach: Employing multiple regression analysis the author has examined some factors that…

  15. Incorporating "motivation" into the functional analysis of challenging behavior: on the interactive and integrative potential of the motivating operation.

    PubMed

    Langthorne, Paul; McGill, Peter; O'Reilly, Mark

    2007-07-01

    Sensitivity theory attempts to account for the variability often observed in challenging behavior by recourse to the "aberrant motivation" of people with intellectual and developmental disabilities. In this article, we suggest that a functional analysis based on environmental (challenging environments) and biological (challenging needs) motivating operations provides a more parsimonious and empirically grounded account of challenging behavior than that proposed by sensitivity theory. It is argued that the concept of the motivating operation provides a means of integrating diverse strands of research without the undue inference of mentalistic constructs. An integrated model of challenging behavior is proposed, one that remains compatible with the central tenets of functional analysis.

  16. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex, large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may be needed to complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  17. PHYLOGEOrec: A QGIS plugin for spatial phylogeographic reconstruction from phylogenetic tree and geographical information data

    NASA Astrophysics Data System (ADS)

    Nashrulloh, Maulana Malik; Kurniawan, Nia; Rahardi, Brian

    2017-11-01

    The increasing availability of genetic sequence data associated with explicit geographic and environment (including biotic and abiotic components) information offers new opportunities to study the processes that shape biodiversity and its patterns. Developing phylogeography reconstruction, by integrating phylogenetic and biogeographic knowledge, provides richer and deeper visualization and information on diversification events than ever before. Geographical information systems such as QGIS provide an environment for spatial modeling, analysis, and dissemination by which phylogenetic models can be explicitly linked with their associated spatial data and subsequently integrated with other related georeferenced datasets describing the biotic and abiotic environment. We are introducing PHYLOGEOrec, a QGIS plugin for building spatial phylogeographic reconstructions from phylogenetic tree and geographical information data, based on QGIS2threejs. By using PHYLOGEOrec, researchers can integrate existing phylogeny and geographical information data, resulting in three-dimensional geographic visualizations of phylogenetic trees in the Keyhole Markup Language (KML) format. Such formats can be overlaid on a map using QGIS and finally viewed spatially in QGIS by means of the QGIS2threejs engine for further analysis. KML can also be viewed in reputable geobrowsers with KML support (e.g., Google Earth).
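
    The sketch below shows the kind of KML output described above (georeferenced tree nodes joined by branch lines), written here with the simplekml package rather than with the plugin itself; taxon names and coordinates are invented.

    ```python
    # Illustrative KML export of a tiny georeferenced tree, not PHYLOGEOrec's own code.
    import simplekml

    # Made-up taxa with (longitude, latitude) coordinates.
    nodes = {"taxonA": (112.6, -7.9), "taxonB": (113.4, -8.2), "ancestor": (113.0, -8.0)}
    edges = [("ancestor", "taxonA"), ("ancestor", "taxonB")]   # parent -> child branches

    kml = simplekml.Kml()
    for name, coord in nodes.items():
        kml.newpoint(name=name, coords=[coord])                # one placemark per node
    for parent, child in edges:
        kml.newlinestring(name=f"{parent}-{child}",
                          coords=[nodes[parent], nodes[child]])  # branch as a line
    kml.save("phylogeography.kml")   # can then be overlaid on a map in QGIS or Google Earth
    ```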

  18. Using GIS to analyze animal movements in the marine environment

    USGS Publications Warehouse

    Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.

    2001-01-01

    Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.
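
    As a simple stand-in for one of the home-range functions described above (not the AMAE ArcView code), the sketch below computes a 100% minimum convex polygon area from tracked relocations; the coordinates are invented.

    ```python
    # Minimum convex polygon (MCP) home-range area from relocation fixes, as an illustration.
    import numpy as np
    from scipy.spatial import ConvexHull

    # Made-up relocation coordinates (x, y) in metres for one tagged fish.
    fixes = np.array([[10.0, 12.0], [14.5, 18.2], [20.1, 11.3],
                      [17.8, 25.6], [8.3, 20.0], [13.0, 15.5]])

    hull = ConvexHull(fixes)
    print("MCP home-range area (m^2):", hull.volume)   # for 2-D points, .volume is the area
    ```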

  19. Incorporating Informal Learning Environments and Local Fossil Specimens in Earth Science Classrooms: A Recipe for Success

    ERIC Educational Resources Information Center

    Clary, Renee M.; Wandersee, James H.

    2009-01-01

    In an online graduate paleontology course taken by practicing Earth Science teachers, we designed an investigation using teachers' local informal educational environments. Teachers (N = 28) were responsible for photographing, describing, and integrating fossil specimens from two informal sites into a paleoenvironmental analysis of the landscape in…

  20. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  1. Assessing regional environmental quality by integrated use of remote sensing, GIS, and spatial multi-criteria evaluation for prioritization of environmental restoration.

    PubMed

    Rahman, Md Rejaur; Shi, Z H; Chongfa, Cai

    2014-11-01

    This study was an attempt to analyse regional environmental quality with the application of remote sensing, geographical information systems, and spatial multiple criteria decision analysis, and to develop a quantitative method for identifying the status of the regional environment of the study area. Using a spatial multi-criteria evaluation (SMCE) approach with expert knowledge, an integrated regional environmental quality index (REQI) was computed and classified into five levels of regional environmental quality, viz. worse, poor, moderate, good, and very good. During the process, a set of spatial criteria was selected (here, 15 criteria) together with the degree of importance of each criterion to the sustainability of the regional environment. Integrated remote sensing and GIS techniques and models were applied to generate the necessary factor (criterion) maps for the SMCE approach. The ranking and expected value methods were used to standardize the factors, and an analytical hierarchy process (AHP) was applied for calculating factor weights. The entire process was executed in the integrated land and water information system (ILWIS) software tool that supports SMCE. The analysis showed that the overall regional environmental quality of the area was at a moderate level and was partly determined by elevation. Areas under worse and poor quality of environment indicated that the regional environmental status showed decline in these parts of the county. The study also revealed that human activities, vegetation condition, soil erosion, topography, climate, and soil conditions have a serious influence on the regional environmental condition of the area. Considering the regional characteristics of environmental quality, priority, and practical needs for environmental restoration, the study area was further regionalized into four priority areas which may serve as base areas of decision making for the recovery, rebuilding, and protection of the environment.
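
    As a rough illustration of the AHP weighting step (not the authors' ILWIS implementation), the sketch below derives criterion weights and a consistency ratio from a hypothetical pairwise comparison matrix using the principal-eigenvector method; the study itself used 15 criteria.

    ```python
    # AHP weight derivation via the principal eigenvector; the 3x3 matrix values are invented.
    import numpy as np

    def ahp_weights(pairwise):
        """Return criterion weights and the consistency ratio for a pairwise comparison matrix."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                   # principal eigenvalue index
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                  # normalized weights
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
        return w, ci / ri                             # consistency ratio < 0.1 is usually acceptable

    # Example: three made-up criteria compared pairwise on Saaty's 1-9 scale.
    weights, cr = ahp_weights([[1,   3,   5],
                               [1/3, 1,   2],
                               [1/5, 1/2, 1]])
    print(weights, cr)
    ```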

  2. Jupiter Environment Tool

    NASA Technical Reports Server (NTRS)

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick; Boedeker, Jordan

    2012-01-01

    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  3. On meta- and mega-analyses for gene-environment interactions.

    PubMed

    Huang, Jing; Liu, Yulun; Vitale, Steve; Penning, Trevor M; Whitehead, Alexander S; Blair, Ian A; Vachani, Anil; Clapper, Margie L; Muscat, Joshua E; Lazarus, Philip; Scheet, Paul; Moore, Jason H; Chen, Yong

    2017-12-01

    Gene-by-environment (G × E) interactions are important in explaining the missing heritability and understanding the causation of complex diseases, but a single, moderately sized study often has limited statistical power to detect such interactions. With the increasing need for integrating data and reporting results from multiple collaborative studies or sites, debate over the choice between mega- versus meta-analysis continues. In principle, data from different sites can be integrated at the individual level into a "mega" data set, which can be fit by a joint "mega-analysis." Alternatively, analyses can be done at each site, and results across sites can be combined through a "meta-analysis" procedure without integrating individual level data across sites. Although mega-analysis has been advocated in several recent initiatives, meta-analysis has the advantages of simplicity and feasibility, and has recently led to several important findings in identifying main genetic effects. In this paper, we conducted empirical and simulation studies, using data from a G × E study of lung cancer, to compare the mega- and meta-analyses in four commonly used G × E analyses under the scenario that the number of studies is small and sample sizes of individual studies are relatively large. We compared the two data integration approaches in the context of fixed effect models and random effects models separately. Our investigations provide valuable insights in understanding the differences between mega- and meta-analyses in the practice of combining a small number of studies to identify G × E interactions. © 2017 WILEY PERIODICALS, INC.
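
    A minimal sketch of the meta-analysis side of this comparison is given below: per-site G × E interaction estimates are combined with fixed-effect inverse-variance weights rather than pooling individual-level data. The coefficients and standard errors are invented placeholders, and the paper's actual analyses are considerably richer.

    ```python
    # Fixed-effect (inverse-variance) meta-analysis of site-level interaction estimates.
    import numpy as np

    def fixed_effect_meta(betas, ses):
        """Combine per-site interaction coefficients using inverse-variance weights."""
        betas, ses = np.asarray(betas, float), np.asarray(ses, float)
        w = 1.0 / ses**2                                  # weight = 1 / variance
        beta_pooled = np.sum(w * betas) / np.sum(w)
        se_pooled = np.sqrt(1.0 / np.sum(w))
        z = beta_pooled / se_pooled                       # Wald statistic for the pooled effect
        return beta_pooled, se_pooled, z

    # Made-up G x E interaction estimates from three sites.
    beta, se, z = fixed_effect_meta([0.21, 0.35, 0.18], [0.10, 0.12, 0.09])
    print(beta, se, z)
    ```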

  4. Mission Operations and Navigation Toolkit Environment

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Hanna, Robert A.; Mossey, Michael P.; Duncan, Courtney B.; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.; Martin Mur, Tomas J.

    2009-01-01

    MONTE (Mission Operations and Navigation Toolkit Environment) Release 7.3 is an extensible software system designed to support trajectory and navigation analysis/design for space missions. MONTE is intended to replace the current navigation and trajectory analysis software systems, which, at the time of this reporting, are used by JPL's Navigation and Mission Design section. The software provides an integrated, simplified, and flexible system that can be easily maintained to serve the needs of future missions in need of navigation services.

  5. An Adaptive INS-Aided PLL Tracking Method for GNSS Receivers in Harsh Environments.

    PubMed

    Cong, Li; Li, Xin; Jin, Tian; Yue, Song; Xue, Rui

    2016-01-23

    As the weak link in global navigation satellite system (GNSS) signal processing, the phase-locked loop (PLL) is prone to frequent cycle slips and loss of lock as a result of higher vehicle dynamics and lower signal-to-noise ratios. With inertial navigation system (INS) aid, PLL tracking performance can be improved. However, for harsh environments with high dynamics and signal attenuation, the traditional INS-aided PLL with fixed loop parameters has limited ability to improve tracking adaptability. In this paper, an adaptive INS-aided PLL capable of adjusting its noise bandwidth and coherent integration time is proposed. Through theoretical analysis, the relation between INS-aided PLL phase tracking error and carrier-to-noise density ratio (C/N₀), vehicle dynamics, aiding-information update time, noise bandwidth, and coherent integration time is established. The relation formulae are used to choose the optimal integration time and bandwidth for a given application under the minimum-tracking-error criterion. Software and hardware simulation results verify the correctness of the theoretical analysis and demonstrate that the adaptive tracking method can effectively improve PLL tracking ability and integrated GNSS/INS navigation performance. For harsh environments, the tracking sensitivity is increased by 3 to 5 dB, velocity errors are decreased by 36% to 50%, and position errors are decreased by 6% to 24% when compared with other INS-aided PLL methods.
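
    The sketch below illustrates the kind of trade-off such an adaptive loop exploits, using the standard textbook expression for PLL thermal phase jitter as a function of C/N0, noise bandwidth, and coherent integration time, plus a crude bandwidth-dependent dynamic-stress term. The selection rule and all numbers are simplifications, not the paper's formulae.

    ```python
    # Illustrative sketch (not the paper's formulation) of choosing (Bn, T) for a PLL.
    import numpy as np

    def pll_thermal_jitter_deg(cn0_dbhz, bn_hz, t_s):
        cn0 = 10.0 ** (cn0_dbhz / 10.0)                      # linear C/N0 in Hz
        var_rad2 = (bn_hz / cn0) * (1.0 + 1.0 / (2.0 * t_s * cn0))
        return np.degrees(np.sqrt(var_rad2))                 # 1-sigma thermal jitter in degrees

    def choose_loop_settings(cn0_dbhz, dyn_term):
        """Pick (Bn, T) on a small grid that minimizes thermal jitter plus dynamic stress."""
        candidates = [(bn, t) for bn in (5.0, 10.0, 15.0, 20.0)
                              for t in (0.001, 0.005, 0.010, 0.020)]
        def total_error(bn_t):
            bn, t = bn_t
            # Crude dynamic-stress model: residual dynamics attenuated roughly as 1/Bn^2.
            return pll_thermal_jitter_deg(cn0_dbhz, bn, t) + dyn_term / bn**2
        return min(candidates, key=total_error)

    print(choose_loop_settings(cn0_dbhz=28.0, dyn_term=400.0))  # made-up weak-signal case
    ```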

  6. Design and implementation of spatial knowledge grid for integrated spatial analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xiangnan; Guan, Li; Wang, Ping

    2006-10-01

    Supported by the spatial information grid (SIG), the spatial knowledge grid (SKG) for integrated spatial analysis uses middleware technology to construct the spatial information grid computation environment and spatial information service system, develops spatial-entity-oriented data organization technology, and carries out in-depth computation of spatial structure and spatial process patterns on the basis of the Grid GIS infrastructure, the spatial data grid, and the spatial information grid (in its specialized definition). At the same time, it realizes complex spatial pattern expression and spatial process simulation by taking the spatial intelligent agent as the core of proactive spatial computation. Moreover, through the establishment of a virtual geographical environment with man-machine interactivity and blending, complex spatial modeling, networked cooperative work, and knowledge-driven spatial community decision making are achieved. The framework of the SKG is discussed systematically in this paper, and its implementation flow and key technologies are presented with examples of overlay analysis.

  7. Assessing the Nexus of Built, Natural, and Social Environments and Public Health Outcomes

    NASA Astrophysics Data System (ADS)

    Archer, R.; Alexander, S.; Douglas, J.

    2017-12-01

    This study investigates community-related environmental justice concerns and chemical and non-chemical health stressors from built, natural, and social environments in Southeast Los Angeles (SELA) County and East Oakland, California. The geographical distribution of health outcomes is related to the built and natural environments, as well as impacts from the social environment. A holistic systems view is important in assessing healthy behaviors within a community, because they do not occur in isolation. Geospatial analysis will be performed to integrate a total environment framework and explore the spatial patterns of exposure to chemical and non-chemical stressors and access to health-promoting environments. Geographic Information Systems (GIS) analysis using primary and secondary existing data will be performed to determine how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. This project will develop a comprehensive list of health-promoting built and natural environments (e.g., parks and community gardens) and polluting sites (e.g., shipping ports and sources of pollution not included in federal regulatory databases) in East Oakland and SELA. California Department of Public Health and U.S. Decennial Census data will also be included for geospatial analysis to overlay the distribution of air pollution-related morbidities (e.g. asthma, diabetes, and cancer) and access to health-promoting built and natural environments and related community assets, exposure to polluting industries, social disorganization, and public health outcomes in the target areas. This research will help identify the spatial and temporal distribution and cumulative impacts of critical pollution hotspots causing community environmental health impacts. The research team will also map how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. The process and outcomes of this research should empower communities and aid decision-makers to integrate environmental justice considerations into public health policies.

  8. An integrated, ethically driven environmental model of clinical decision making in emergency settings.

    PubMed

    Wolf, Lisa

    2013-02-01

    To explore the relationship between multiple variables within a model of critical thinking and moral reasoning. A quantitative descriptive correlational design using a purposive sample of 200 emergency nurses. Measured variables were accuracy in clinical decision making, moral reasoning, perceived care environment, and demographics. Analysis was by bivariate correlation using Pearson's product-moment correlation coefficients, chi square, and multiple linear regression analysis. The elements identified in the integrated, ethically driven environmental model of clinical decision making (IEDEM-CD) correctly depict moral reasoning and the environment of care as factors significantly affecting accuracy in decision making. The integrated, ethically driven environmental model of clinical decision making is a framework useful for predicting clinical decision-making accuracy for emergency nurses in practice, with further implications in education, research, and policy. It offers a diagnostic and therapeutic framework for identifying and remediating individual and environmental challenges to accurate clinical decision making. © 2012, The Author. International Journal of Nursing Knowledge © 2012, NANDA International.

  9. CONDUIT: A New Multidisciplinary Integration Environment for Flight Control Development

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Colbourne, Jason D.; Morel, Mark R.; Biezad, Daniel J.; Levine, William S.; Moldoveanu, Veronica

    1997-01-01

    A state-of-the-art computational facility for aircraft flight control design, evaluation, and integration called CONDUIT (Control Designer's Unified Interface) has been developed. This paper describes the CONDUIT tool and case study applications to complex rotary- and fixed-wing fly-by-wire flight control problems. Control system analysis and design optimization methods are presented, including definition of design specifications and system models within CONDUIT, and the multi-objective function optimization (CONSOL-OPTCAD) used to tune the selected design parameters. Design examples are based on flight test programs for which extensive data are available for validation. CONDUIT is used to analyze baseline control laws against pertinent military handling qualities and control system specifications. In both case studies, CONDUIT successfully exploits trade-offs between forward loop and feedback dynamics to significantly improve the expected handling qualities and minimize the required actuator authority. The CONDUIT system provides a new environment for integrated control system analysis and design, and has potential for significantly reducing the time and cost of control system flight test optimization.

  10. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and it enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of calibration observations directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools, and calibration capacity.
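
    The snippet below is not the OAT API, only a sketch of the kind of pre-processing it automates: reading a local CSV time-series, regularizing the sampling interval, and filling short gaps before the series is used as calibration observations. The file and column names are invented.

    ```python
    # Illustrative time-series pre-processing with pandas (placeholder file and columns).
    import pandas as pd

    series = (pd.read_csv("well_A_head.csv", parse_dates=["timestamp"])
                .set_index("timestamp")["head_m"]
                .sort_index())

    daily = series.resample("D").mean()     # regularize to a daily sampling interval
    daily = daily.interpolate(limit=3)      # fill gaps of up to three days

    daily.to_csv("well_A_head_daily.csv", header=["head_m"])
    print(daily.describe())
    ```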

  11. Skinner Meets Piaget on the Reggio Playground: Practical Synthesis of Applied Behavior Analysis and Developmentally Appropriate Practice Orientations

    ERIC Educational Resources Information Center

    Warash, Bobbie; Curtis, Reagan; Hursh, Dan; Tucci, Vicci

    2008-01-01

    We focus on integrating developmentally appropriate practices, the project approach of Reggio Emilia, and a behavior analytic model to support a quality preschool environment. While the above practices often are considered incompatible, we have found substantial overlap and room for integration of these perspectives in practical application. With…

  12. Integrated wetland management: an analysis with group model building based on system dynamics model.

    PubMed

    Chen, Hsin; Chang, Yang-Chi; Chen, Kung-Chen

    2014-12-15

    The wetland system possesses diverse functions such as preserving water sources, mediating flooding, providing habitats for wildlife, and stabilizing coastlines. Nonetheless, rapid economic growth and the increasing population have significantly degraded the wetland environment. To secure the sustainability of the wetland, it is essential to introduce integrated and systematic management. This paper examines the resource management of the Jiading Wetland by applying group model building (GMB) and system dynamics (SD). We systematically identify local stakeholders' mental model regarding the impact brought by the yacht industry, and further establish an SD model to simulate the dynamic wetland environment. The GMB process improves the stakeholders' understanding of the interaction between the wetland environment and management policies. Differences between the stakeholders' perceptions and the behaviors shown by the SD model also suggest that our analysis would help the stakeholders broaden their horizons and achieve consensus on wetland resource management. Copyright © 2014 Elsevier Ltd. All rights reserved.
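
    A toy stock-and-flow simulation in the spirit of a system dynamics model is sketched below; the stocks, flows, and parameter values are invented for illustration and do not come from the Jiading Wetland study.

    ```python
    # Toy system dynamics sketch: habitat quality vs. growing yacht traffic (all values invented).
    def simulate(years=20, dt=0.25):
        habitat = 100.0            # stock: habitat quality index
        yachts = 50.0              # stock: yacht traffic level
        history = []
        t = 0.0
        while t <= years:
            degradation = 0.002 * yachts * habitat    # flow: pressure from yacht activity
            recovery = 0.05 * (100.0 - habitat)       # flow: natural recovery toward baseline
            habitat += (recovery - degradation) * dt
            yachts += 0.03 * yachts * dt              # flow: industry growth
            history.append((round(t, 2), round(habitat, 1), round(yachts, 1)))
            t += dt
        return history

    for row in simulate()[::8]:    # print every second year
        print(row)
    ```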

  13. Report on Integration of Existing Grid Models for N-R HES Interaction Focused on Balancing Authorities for Sub-hour Penalties and Opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy; Epiney, Aaron; Rabiti, Cristian

    2017-06-01

    This report summarizes the effort in the Nuclear-Renewable Hybrid Energy System (N-R HES) project on the Level 4 milestone, which considers integrating existing grid models, resolved on shorter time intervals than those models typically use, into the Risk Analysis Virtual Environment (RAVEN) and Modelica [1] optimizations and economic analyses that have been the focus of the project to date.

  14. Evolution of the Tropical Cyclone Integrated Data Exchange And Analysis System (TC-IDEAS)

    NASA Technical Reports Server (NTRS)

    Turk, J.; Chao, Y.; Haddad, Z.; Hristova-Veleva, S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Licata, S.; Poulsen, W.; Su, H.

    2010-01-01

    The Tropical Cyclone Integrated Data Exchange and Analysis System (TC-IDEAS) is being jointly developed by the Jet Propulsion Laboratory (JPL) and the Marshall Space Flight Center (MSFC) as part of NASA's Hurricane Science Research Program. The long-term goal is to create a comprehensive tropical cyclone database of satellite and airborne observations, in-situ measurements and model simulations containing parameters that pertain to the thermodynamic and microphysical structure of the storms; the air-sea interaction processes; and the large-scale environment.

  15. Integrated communication and control systems. I - Analysis

    NASA Technical Reports Server (NTRS)

    Halevi, Yoram; Ray, Asok

    1988-01-01

    The paper presents the results of an ICCS analysis focusing on discrete-time control systems subject to time-varying delays. The present analytical technique is applicable to integrated dynamic systems such as those encountered in advanced aircraft, spacecraft, and the real-time control of robots and machine tools via a high-speed network within an autonomous manufacturing environment. The significance of data latency and missynchronization between individual system components in ICCS networks is discussed in view of the time-varying delays.

  16. Analysis of French Secondary School Teachers' Intention to Integrate Digital Work Environments into Their Teaching Practices

    ERIC Educational Resources Information Center

    Pacurar, Ecaterina; Abbas, Nargis

    2015-01-01

    This research investigates the effective use of Digital Work Environment (DWE) in France. A theoretical framework based on the pedagogical engineering approach is used to propose an hypothetical model, which results in an explained variable of intention for the pedagogical use of an educational technology. The sex, the teaching disciplines, the…

  17. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  18. Research environments that promote integrity.

    PubMed

    Jeffers, Brenda Recchia; Whittemore, Robin

    2005-01-01

    The body of empirical knowledge about research integrity and the factors that promote research integrity in nursing research environments remains small. To propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity. An internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model process components need further exploration to substantiate the use of the model in nursing research environments.

  19. Study of plasma environments for the integrated Space Station electromagnetic analysis system

    NASA Technical Reports Server (NTRS)

    Singh, Nagendra

    1992-01-01

    The final report includes an analysis of various plasma effects on the electromagnetic environment of the Space Station Freedom. Effects of arcing are presented. Concerns of control of arcing by a plasma contactor are highlighted. Generation of waves by contaminant ions are studied and amplitude levels of the waves are estimated. Generation of electromagnetic waves by currents in the structure of the space station, driven by motional EMF, is analyzed and the radiation level is estimated.

  20. Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.

    1985-01-01

    Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.

  1. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic, structured methodology.

  2. Aiding planning in air traffic control: an experimental investigation of the effects of perceptual information integration.

    PubMed

    Moertl, Peter M; Canning, John M; Gronlund, Scott D; Dougherty, Michael R P; Johansson, Joakim; Mills, Scott H

    2002-01-01

    Prior research examined how controllers plan in their traditional environment and identified various information uncertainties as detriments to planning. A planning aid was designed to reduce this uncertainty by perceptually representing important constraints. This included integrating spatial information on the radar screen with discrete information (planned sequences of air traffic). Previous research reported improved planning performance and decreased workload in the planning aid condition. The purpose of this paper was to determine the source of these performance improvements. Analysis of computer interactions using log-linear modeling showed that the planning interface led to less repetitive--but more integrated--information retrieval compared with the traditional planning environment. Ecological interface design principles helped explain how the integrated information retrieval gave rise to the performance improvements. Actual or potential applications of this research include the design and evaluation of interface automation that keeps users in active control by modification of perceptual task characteristics.

  3. Cloudy Solar Software - Enhanced Capabilities for Finding, Pre-processing, and Visualizing Solar Data

    NASA Astrophysics Data System (ADS)

    Istvan Etesi, Laszlo; Tolbert, K.; Schwartz, R.; Zarro, D.; Dennis, B.; Csillaghy, A.

    2010-05-01

    In our project "Extending the Virtual Solar Observatory (VSO)” we have combined some of the features available in Solar Software (SSW) to produce an integrated environment for data analysis, supporting the complete workflow from data location, retrieval, preparation, and analysis to creating publication-quality figures. Our goal is an integrated analysis experience in IDL, easy-to-use but flexible enough to allow more sophisticated procedures such as multi-instrument analysis. To that end, we have made the transition from a locally oriented setting where all the analysis is done on the user's computer, to an extended analysis environment where IDL has access to services available on the Internet. We have implemented a form of Cloud Computing that uses the VSO search and a new data retrieval and pre-processing server (PrepServer) that provides remote execution of instrument-specific data preparation. We have incorporated the interfaces to the VSO search and the PrepServer into an IDL widget (SHOW_SYNOP) that provides user-friendly searching and downloading of raw solar data and optionally sends search results for pre-processing to the PrepServer prior to downloading the data. The raw and pre-processed data can be displayed with our plotting suite, PLOTMAN, which can handle different data types (light curves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. PLOTMAN is highly configurable and suited for visual data analysis and for creating publishable figures. PLOTMAN and SHOW_SYNOP work hand-in-hand for a convenient working environment. Our environment supports a growing number of solar instruments that currently includes RHESSI, SOHO/EIT, TRACE, SECCHI/EUVI, HINODE/XRT, and HINODE/EIS.

  4. SPIRE: Systematic protein investigative research environment.

    PubMed

    Kolker, Eugene; Higdon, Roger; Morgan, Phil; Sedensky, Margaret; Welch, Dean; Bauman, Andrew; Stewart, Elizabeth; Haynes, Winston; Broomall, William; Kolker, Natali

    2011-12-10

    The SPIRE (Systematic Protein Investigative Research Environment) provides web-based experiment-specific mass spectrometry (MS) proteomics analysis (https://www.proteinspire.org). Its emphasis is on usability and integration of the best analytic tools. SPIRE provides an easy to use web-interface and generates results in both interactive and simple data formats. In contrast to run-based approaches, SPIRE conducts the analysis based on the experimental design. It employs novel methods to generate false discovery rates and local false discovery rates (FDR, LFDR) and integrates the best and complementary open-source search and data analysis methods. The SPIRE approach of integrating X!Tandem, OMSSA and SpectraST can produce an increase in protein IDs (52-88%) over current combinations of scoring and single search engines while also providing accurate multi-faceted error estimation. One of SPIRE's primary assets is combining the results with data on protein function, pathways and protein expression from model organisms. We demonstrate some of SPIRE's capabilities by analyzing mitochondrial proteins from the wild type and 3 mutants of C. elegans. SPIRE also connects results to publically available proteomics data through its Model Organism Protein Expression Database (MOPED). SPIRE can also provide analysis and annotation for user supplied protein ID and expression data. Copyright © 2011. Published by Elsevier B.V.
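
    As a simplified stand-in for the error-estimation step described above (SPIRE's actual FDR/LFDR machinery combines three search engines and is considerably more sophisticated), the sketch below applies a basic target-decoy FDR threshold to a list of scored peptide-spectrum matches; the scores are invented.

    ```python
    # Basic target-decoy FDR thresholding for illustration only.
    def fdr_threshold(psms, target_fdr=0.01):
        """psms: list of (score, is_decoy); return the lowest score retained at the target FDR."""
        ranked = sorted(psms, key=lambda p: p[0], reverse=True)   # best scores first
        decoys = targets = 0
        best_score = None
        for score, is_decoy in ranked:
            if is_decoy:
                decoys += 1
            else:
                targets += 1
            running_fdr = decoys / max(targets, 1)                # decoy-based FDR estimate
            if running_fdr <= target_fdr:
                best_score = score                                # still within the FDR budget
        return best_score

    psms = [(95.2, False), (91.0, False), (88.7, True), (85.3, False), (60.1, True)]
    print(fdr_threshold(psms))
    ```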

  5. Total Risk Integrated Methodology (TRIM) - TRIM.Expo

    EPA Pesticide Factsheets

    The Exposure Event module of TRIM (TRIM.Expo), similar to most human exposure models, provides an analysis of the relationships between various chemical concentrations in the environment and exposure levels of humans.

  6. A neural network ActiveX based integrated image processing environment.

    PubMed

    Ciuca, I; Jitaru, E; Alaicescu, M; Moisil, I

    2000-01-01

    The paper outlines an integrated image processing environment that uses neural network ActiveX technology for object recognition and classification. The image processing environment, which is Windows based, encapsulates a Multiple-Document Interface (MDI) and is menu driven. Object (shape) parameter extraction is focused on features that are invariant with respect to translation, rotation, and scale transformations. The neural network models that can be incorporated as ActiveX components into the environment allow both clustering and classification of objects from the analysed image. Mapping neural networks perform an input sensitivity analysis on the extracted feature measurements and thus facilitate the removal of irrelevant features and improvements in the degree of generalisation. The program has been used to evaluate the dimensions of hydrocephalus in a study calculating the Evans index and the angle of the frontal horns of the ventricular system modifications.
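
    The snippet below sketches one common choice of translation-, rotation-, and scale-invariant shape features of the sort described (Hu moments via OpenCV); it is not the original ActiveX implementation, and the input file name is a placeholder.

    ```python
    # Hu-moment shape features as an example of invariant descriptors (placeholder input file).
    import cv2
    import numpy as np

    img = cv2.imread("ventricle_mask.png", cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)

    moments = cv2.moments(binary)
    hu = cv2.HuMoments(moments).flatten()
    # Log-scale the Hu moments so they are numerically comparable across shapes.
    hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
    print(hu_log)    # seven invariant features per shape
    ```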

  7. On meta- and mega-analyses for gene–environment interactions

    PubMed Central

    Huang, Jing; Liu, Yulun; Vitale, Steve; Penning, Trevor M.; Whitehead, Alexander S.; Blair, Ian A.; Vachani, Anil; Clapper, Margie L.; Muscat, Joshua E.; Lazarus, Philip; Scheet, Paul; Moore, Jason H.; Chen, Yong

    2017-01-01

    Gene-by-environment (G × E) interactions are important in explaining the missing heritability and understanding the causation of complex diseases, but a single, moderately sized study often has limited statistical power to detect such interactions. With the increasing need for integrating data and reporting results from multiple collaborative studies or sites, debate over the choice between mega- versus meta-analysis continues. In principle, data from different sites can be integrated at the individual level into a “mega” data set, which can be fit by a joint “mega-analysis.” Alternatively, analyses can be done at each site, and results across sites can be combined through a “meta-analysis” procedure without integrating individual level data across sites. Although mega-analysis has been advocated in several recent initiatives, meta-analysis has the advantages of simplicity and feasibility, and has recently led to several important findings in identifying main genetic effects. In this paper, we conducted empirical and simulation studies, using data from a G × E study of lung cancer, to compare the mega- and meta-analyses in four commonly used G × E analyses under the scenario that the number of studies is small and sample sizes of individual studies are relatively large. We compared the two data integration approaches in the context of fixed effect models and random effects models separately. Our investigations provide valuable insights in understanding the differences between mega- and meta-analyses in the practice of combining a small number of studies to identify G × E interactions. PMID:29110346

  8. Integrating natural disturbances and management activities to examine risks and opportunities in the central Oregon landscape analysis

    Treesearch

    Miles A. Hemstrom; James Merzenich; Theresa Burcsu; Janet Ohmann; Ryan Singleton

    2010-01-01

    We used state and transition models to integrate natural disturbances and management activities for a 275 000-ha landscape in the central Oregon Cascades. The landscape consists of a diverse mix of land ownerships, land use allocations, and environments. Three different management scenarios were developed from public input: (1) no management except wildfire suppression...

  9. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    USDA-ARS?s Scientific Manuscript database

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  10. OpenICE medical device interoperability platform overview and requirement analysis.

    PubMed

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  11. Thermal-Acoustic Analysis of a Metallic Integrated Thermal Protection System Structure

    NASA Technical Reports Server (NTRS)

    Behnke, Marlana N.; Sharma, Anurag; Przekop, Adam; Rizzi, Stephen A.

    2010-01-01

    A study is undertaken to investigate the response of a representative integrated thermal protection system structure under combined thermal, aerodynamic pressure, and acoustic loadings. A two-step procedure is offered and consists of a heat transfer analysis followed by a nonlinear dynamic analysis under a combined loading environment. Both analyses are carried out in physical degrees-of-freedom using implicit and explicit solution techniques available in the Abaqus commercial finite-element code. The initial study is conducted on a reduced-size structure to keep the computational effort contained while validating the procedure and exploring the effects of individual loadings. An analysis of a full size integrated thermal protection system structure, which is of ultimate interest, is subsequently presented. The procedure is demonstrated to be a viable approach for analysis of spacecraft and hypersonic vehicle structures under a typical mission cycle with combined loadings characterized by largely different time-scales.

  12. On the collaborative design and simulation of space camera: STOP (structural/thermal/optical) analysis

    NASA Astrophysics Data System (ADS)

    Duan, Pengfei; Lei, Wenping

    2017-11-01

    A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a space camera. Separate design models are normally constructed with each discipline's CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to requirements that have been levied on each discipline, and technical interaction between the different disciplines is limited and infrequent. As a result, a unified view of the space camera design across discipline boundaries is not directly possible in this approach, and generating one would require a large, manual, and error-prone process. A collaborative environment built on abstract models and performance templates allows engineering data and CAD/CAE results to be shared across these discipline boundaries within a common interface, enabling rapid multivariate design and direct evaluation of optical performance under environmental loads. A small interdisciplinary engineering team from the Beijing Institute of Space Mechanics and Electricity has recently conducted a Structural/Thermal/Optical (STOP) analysis of a space camera with this collaborative environment. STOP analysis evaluates the changes in image quality that arise from the structural deformations when the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which: (1) independent thermal and structural meshes are generated; (2) the thermal mesh and relevant engineering data for material properties and thermal boundary conditions are used to compute temperature distributions at nodal points in both the thermal and structural meshes with Thermal Desktop, a COTS thermal design and analysis code; (3) thermally induced structural deformations of the camera are then evaluated in Nastran, an industry-standard code for structural design and analysis; (4) thermal and structural results are next imported into SigFit, another COTS tool that computes deformation and best-fit rigid-body displacements for the optical surfaces; and (5) SigFit creates a modified optical prescription that is imported into CODE V for evaluation of optical performance impacts. The integrated STOP analysis was validated using TVAC test data. For the four different TVAC tests, the relative errors between simulated and measured temperatures at the measuring points were around 5%, and in some test conditions as low as 1%. For image quality (MTF), the relative error between simulation and test was 8.3% in the worst condition; all others were below 5%. The validation shows that the collaborative design and simulation environment can carry out the integrated STOP analysis of a space camera efficiently. Furthermore, the collaborative environment allows an interdisciplinary analysis that formerly might take several months to be completed in two or three weeks, which is well suited to concept demonstration in the early stages of a project.
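
    The validation quoted above amounts to a per-point relative error between predicted and measured temperatures for each TVAC case; a minimal sketch with invented placeholder temperatures is shown below.

    ```python
    # Per-point relative error between STOP-model predictions and TVAC measurements (values invented).
    import numpy as np

    test_temp = np.array([21.3, 18.7, 25.1, 19.9])   # measured temperatures (deg C)
    sim_temp  = np.array([22.4, 19.5, 24.0, 20.7])   # model-predicted temperatures (deg C)

    rel_err_pct = np.abs(sim_temp - test_temp) / np.abs(test_temp) * 100.0
    print("per-point relative error (%):", np.round(rel_err_pct, 1))
    print("worst case (%):", float(rel_err_pct.max()))
    ```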

  13. Integration of bio-inspired, control-based visual and olfactory data for the detection of an elusive target

    NASA Astrophysics Data System (ADS)

    Duong, Tuan A.; Duong, Nghi; Le, Duong

    2017-01-01

    In this paper, we present an integration technique using a bio-inspired, control-based visual and olfactory receptor system to search for elusive targets in practical environments where the targets cannot be detected reliably from either sensory modality alone. The bio-inspired visual system is based on a model of the extended visual pathway, which consists of saccadic eye movements and the visual pathway (vertebrate retina, lateral geniculate nucleus, and visual cortex), to enable powerful target detection from noisy, partial, and incomplete visual data. The olfactory receptor algorithm, namely spatially invariant independent component analysis, which was developed based on data from Caltech's olfactory receptor-electronic nose (enose), is adopted to enable odorant target detection in an unknown environment. The integration of the two systems is a vital approach and sets a cornerstone for effective, low-cost miniaturized UAVs or fly robots for future DOD and NASA missions, as well as for security systems in Internet of Things environments.

  14. PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambers, F W; Raymond, B A; Falabella, S

    To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components, this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and to the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system, where it can read magnet current settings for real-time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for efficient convergence on a usable tune. We show real-time comparisons of simulation and experiment and explore the successes and limitations of this close-coupled approach.
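
    The SUICIDE/FITS coupling is facility software and is not reproduced here; purely as a sketch of the kind of on-line model-to-measurement comparison the abstract describes, the code below reads (here: fabricated) magnet current settings, evaluates a toy transport model, and scores the fit against measured beam radii at a few diagnostic stations. All function names, settings, and numbers are assumptions.

      import numpy as np

      def read_magnet_currents():
          # Placeholder for a control-system read; fabricated values (amperes).
          return {"solenoid_1": 110.0, "solenoid_2": 95.0, "solenoid_3": 120.0}

      def predict_beam_radii(currents, stations):
          # Toy stand-in for the transport model: radius shrinks with focusing current.
          base = 12.0  # mm
          focus = sum(currents.values()) / 1000.0
          return np.array([base * np.exp(-focus * s / stations[-1]) for s in stations])

      stations = np.array([1.0, 2.5, 4.0, 6.0])   # m along the beamline
      measured = np.array([10.9, 9.1, 7.8, 6.4])  # mm, fabricated "data"

      predicted = predict_beam_radii(read_magnet_currents(), stations)
      chi2 = np.sum((predicted - measured) ** 2 / measured)
      print("predicted radii:", np.round(predicted, 2))
      print("figure of merit (chi-square-like):", round(float(chi2), 3))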

  15. A framework for interactive visual analysis of heterogeneous marine data in an integrated problem solving environment

    NASA Astrophysics Data System (ADS)

    Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei

    2017-07-01

    This paper presents a novel integrated marine visualization framework that focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to interactively render marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and a transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the framework, an integrated visualization system is realized, and the effectiveness and efficiency of the framework are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.

  16. Web-based courses. More than curriculum.

    PubMed

    Mills, M E; Fisher, C; Stair, N

    2001-01-01

    Online program development depends on an educationally and technologically sound curriculum supported by a solid infrastructure. Creation of a virtual environment through the design of online registration and records, financial aid, orientation, advisement, resources, and evaluation and assessment provides students with access and program integrity. Planning an academic support system as an electronic environment presents challenges and institutional issues requiring systematic analysis.

  17. The natural space environment: Effects on spacecraft

    NASA Technical Reports Server (NTRS)

    James, Bonnie F.; Norton, O. W. (Compiler); Alexander, Margaret B. (Editor)

    1994-01-01

    The effects of the natural space environments on spacecraft design, development, and operation are the topic of a series of NASA Reference Publications currently being developed by the Electromagnetics and Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center. This primer provides an overview of the natural space environments and their effect on spacecraft design, development, and operations, and also highlights some of the new developments in science and technology for each space environment. It is hoped that a better understanding of the space environment and its effect on spacecraft will enable program management to more effectively minimize program risks and costs, optimize design quality, and successfully achieve mission objectives.

  18. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant simplifies and democratizes advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and from the Office of Nuclear Energy's research and development. The IDAE framework enhances Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits, creating a graphical end-to-end umbrella that guides end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  19. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data providence, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.

  20. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Mehl, S.; Velasco Mansilla, V.

    2015-12-01

    FREEWAT is a HORIZON 2020 EU project. Its main result will be an open-source and public-domain, GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and related directives. Specific objectives of the project are: to coordinate previous EU- and nationally funded research in order to integrate existing software modules for water management into the single GIS-based FREEWAT environment, and to support FREEWAT application in an innovative participatory approach gathering technical staff and relevant stakeholders (policy and decision makers) in designing scenarios for the application of water policies. The open-source character of the platform makes this an initiative "ad includendum", as further institutions or developers may contribute to the development. The core of the platform is the SID&GRID framework (a GIS-integrated, physically based, distributed numerical hydrological model based on a modified version of MODFLOW 2005; Rossetto et al. 2013) in its version ported to QGIS desktop. Activities are carried out along two lines: (i) integration of modules to fulfil end-user requirements, including tools for producing feasibility and management plans; and (ii) a set of activities to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: a module for water management and planning; calibration, uncertainty, and sensitivity analysis; a module for solute transport in the unsaturated zone; a module for crop growth and water requirements in agriculture; and tools for groundwater quality issues and for the analysis, interpretation, and visualization of hydrogeological data. By creating a common environment among water researchers/professionals, policy makers, and implementers, FREEWAT's main impact will be to enhance a science-based and participatory approach and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. Large stakeholder involvement is expected to guarantee the dissemination and exploitation of results.

  1. Integration Toolkit and Methods (ITKM) Corporate Data Integration Tools (CDIT). Review of the State-of-the-Art with Respect to Integration Toolkits and Methods (ITKM)

    DTIC Science & Technology

    1992-06-01

    system capabilities such as memory management and network communications are provided by a virtual machine-type operating environment. Various human ... thinking. The elements of this substrate include representational formality, genericity, a method of formal analysis, and augmentation of human analytical ... the form of identifying: the data entity itself; its aliases (including how the data is presented to programs or human users in the form of copy

  2. Promotion of Nursing Student Civility in Nursing Education: A Concept Analysis.

    PubMed

    Woodworth, Julie A

    2016-07-01

    Substantive research into the development of civility within nursing education is long overdue. Behaviors learned by nursing students while in the school of nursing transfer to the work environment and culture of nursing. This paper presents a concept analysis of civility within nursing education using Rodgers' evolutionary concept analysis method. Civility is defined to provide clarity for the current terminology of civility within nursing education. Nurse educators must set socially acceptable behavioral expectations in the learning environment, establish positive interpersonal relationships with students, maintain moral and academic integrity, and role-model civil behaviors. Suggestions are included to help nurse educators outline acceptable behaviors in the learning environment and promote the development of civility. The development of civil behaviors in nursing students will carry into professional practice after graduation. Civility is necessary to establish meaningful interpersonal relationships, supportive communication, and optimum learning environments to ensure quality patient care with optimum outcomes. © 2015 Wiley Periodicals, Inc.

  3. Computer-aided operations engineering with integrated models of systems and operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.

  4. Aeronautics Research Mission Directorate Integrated Systems Research Program (ISRP) and UAS Integration in the NAS Project

    NASA Technical Reports Server (NTRS)

    Wolfe, Jean

    2010-01-01

    Program Goal: Conduct research at an integrated system level on promising concepts and technologies and explore, assess, or demonstrate the benefits in a relevant environment. Criteria for selection of projects for Integrated Systems Research: a) the technology has attained enough maturity in the foundational research program to merit more in-depth evaluation at an integrated system level in a relevant environment; b) technologies which systems analysis indicates have the most potential for contributing to the simultaneous attainment of goals; c) technologies identified through stakeholder input as having potential for simultaneous attainment of goals; d) research not being done by other government agencies and appropriate for NASA to conduct; e) budget augmentation. Environmentally Responsible Aviation (ERA) Project: explore and assess new vehicle concepts and enabling technologies through system-level experimentation to simultaneously reduce fuel burn, noise, and emissions. Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project: contribute capabilities that reduce technical barriers related to the safety and operational challenges associated with enabling routine UAS access to the NAS. Innovative Concepts for Green Aviation (ICGA) Project: spur innovation by offering research opportunities to the broader aeronautics community through peer-reviewed proposals, with a focus on making aviation more eco-friendly, and establish incentive prizes similar to the Centennial Challenges and sponsor innovation demonstrations of selected technologies that show promise of reducing aviation's impact on the environment.

  5. Integrated Reporting as a Tool for Communicating with Stakeholders - Advantages and Disadvantages

    NASA Astrophysics Data System (ADS)

    Matuszyk, Iwona; Rymkiewicz, Bartosz

    2018-03-01

    Financial and non-financial reporting has, from the beginning of its existence, been the primary channel of communication between a company and a wide range of stakeholders. Over the decades it has adapted to the needs of a rapidly changing business and social environment. Currently, the latest link in the evolution of organizational reporting, integrated reporting, assumes the integration and mutual connectivity of both financial and non-financial data. The main interest in the concept of integrated reporting comes from the value it contributes to the organization. Undoubtedly, the concept of integrated reporting is a milestone in the evolution of organizational reporting. It is, however, important to consider whether it adequately addresses the information needs of a wide range of stakeholders, and whether it is a universal tool for communication between the company and its stakeholders. The aim of the paper is to discuss the advantages and disadvantages of the concept of integrated reporting as a tool for communication with stakeholders and to outline further directions for its development. The article uses research methods such as literature analysis, content analysis of corporate publications, and comparative analysis.

  6. Systems and methods for knowledge discovery in spatial data

    DOEpatents

    Obradovic, Zoran; Fiez, Timothy E.; Vucetic, Slobodan; Lazarevic, Aleksandar; Pokrajac, Dragoljub; Hoskinson, Reed L.

    2005-03-08

    Systems and methods are provided for knowledge discovery in spatial data as well as to systems and methods for optimizing recipes used in spatial environments such as may be found in precision agriculture. A spatial data analysis and modeling module is provided which allows users to interactively and flexibly analyze and mine spatial data. The spatial data analysis and modeling module applies spatial data mining algorithms through a number of steps. The data loading and generation module obtains or generates spatial data and allows for basic partitioning. The inspection module provides basic statistical analysis. The preprocessing module smoothes and cleans the data and allows for basic manipulation of the data. The partitioning module provides for more advanced data partitioning. The prediction module applies regression and classification algorithms on the spatial data. The integration module enhances prediction methods by combining and integrating models. The recommendation module provides the user with site-specific recommendations as to how to optimize a recipe for a spatial environment such as a fertilizer recipe for an agricultural field.
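
    As a rough sketch of the module chain described in the patent abstract (loading, inspection, preprocessing, partitioning, prediction, recommendation), and not of the patented algorithms themselves, the code below runs a minimal precision-agriculture-style workflow on synthetic field data; the yield model, thresholds, and recommendation rule are assumptions.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)

      # Data loading/generation: synthetic field grid with soil nitrogen and moisture.
      n = 1000
      x, y = rng.uniform(0, 100, n), rng.uniform(0, 100, n)
      nitrogen = 20 + 0.1 * x + rng.normal(0, 2, n)
      moisture = 30 + 0.05 * y + rng.normal(0, 3, n)
      yield_t = 2.0 + 0.08 * nitrogen + 0.03 * moisture + rng.normal(0, 0.3, n)

      features = np.column_stack([x, y, nitrogen, moisture])

      # Inspection: basic statistics.
      print("mean yield:", round(float(yield_t.mean()), 2))

      # Partitioning: train/test split of the spatial samples.
      X_tr, X_te, y_tr, y_te = train_test_split(features, yield_t, random_state=0)

      # Prediction: regression model for site-specific yield.
      model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print("R^2 on held-out sites:", round(model.score(X_te, y_te), 3))

      # Recommendation (illustrative rule): flag low-predicted-yield sites for more fertilizer.
      predicted = model.predict(X_te)
      flagged = X_te[predicted < np.percentile(predicted, 25)]
      print("sites flagged for a higher fertilizer rate:", len(flagged))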

  7. Thermal Analysis of the NASA Integrated Vehicle Health Monitoring Experiment Technology for X-Vehicles (NITEX)

    NASA Technical Reports Server (NTRS)

    Hegab, Hisham E.

    2002-01-01

    The purpose of this project was to perform a thermal analysis for the NASA Integrated Vehicle Health Monitoring (IVHM) Technology Experiment for X-vehicles (NITEX). This electronics package monitors vehicle sensor information in flight and downlinks vehicle health summary information via telemetry. The experiment will be tested on the X-34 in an unpressurized compartment, in the vicinity of one of the vehicle's liquid oxygen tanks. The transient temperature profile for the electronics package has been determined using finite element analysis for possible mission profiles that will most likely expose the package to the most extreme hot and cold environmental conditions. From the analyses, it was determined that temperature limits for the electronics would be exceeded for the worst case cold environment mission profile. The finite element model used for the analyses was modified to examine the use of insulation to address this problem. Recommendations for insulating the experiment for the cold environment are presented, and were analyzed to determine their effect on a nominal mission profile.
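
    The flight analysis above used a finite element model; as a much simpler, hedged illustration of how a transient temperature profile can be screened against electronics limits, the sketch below integrates a single-node lumped-capacitance energy balance for a hypothetical package. All properties, dissipation levels, and limits are assumed values, not NITEX data.

      # Single-node lumped-capacitance model: m*c*dT/dt = Q_int - h*A*(T - T_env)
      mass = 2.0          # kg, assumed package mass
      c_p = 900.0         # J/(kg K), assumed effective specific heat
      q_int = 2.0         # W, assumed internal dissipation
      h_a = 0.2           # W/K, assumed effective conductance to the environment
      t_env = -40.0       # deg C, assumed cold-case environment
      t_limit_low = -20.0 # deg C, assumed lower operating limit

      temperature = 20.0  # deg C, initial condition
      dt = 10.0           # s, time step
      below_limit_at = None

      for step in range(int(6 * 3600 / dt)):          # 6-hour cold soak
          d_temp = (q_int - h_a * (temperature - t_env)) / (mass * c_p)
          temperature += d_temp * dt
          if below_limit_at is None and temperature < t_limit_low:
              below_limit_at = (step + 1) * dt

      print("temperature after 6 h: %.1f degC" % temperature)
      if below_limit_at is None:
          print("lower limit not violated during the soak")
      else:
          print("lower limit first violated at %.0f s" % below_limit_at)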

  8. Thermal Analysis Of The NASA Integrated Vehicle Health Monitoring Experiment Technology For X-Vehicles (NITEX)

    NASA Technical Reports Server (NTRS)

    Hegab, Hisham E.

    2001-01-01

    The purpose of this project was to perform a thermal analysis for the NASA Integrated Vehicle Health Monitoring (IVHM) Technology Experiment for X-vehicles (NITEX). This electronics package monitors vehicle sensor information in flight and downlinks vehicle health summary information via telemetry. The experiment will be tested on the X-34 in an unpressurized compartment, in the vicinity of one of the vehicle's liquid oxygen tanks. The transient temperature profile for the electronics package has been determined using finite element analysis for possible mission profiles that will most likely expose the package to the most extreme hot and cold environmental conditions. From the analyses, it was determined that temperature limits for the electronics would be exceeded for the worst case cold environment mission profile. The finite element model used for the analyses was modified to examine the use of insulation to address this problem. Recommendations for insulating the experiment for the cold environment are presented, and were analyzed to determine their effect on a nominal mission profile.

  9. Computer aided design environment for the analysis and design of multi-body flexible structures

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant V.; Singh, Ramen P.

    1989-01-01

    A computer aided design environment consisting of the programs NASTRAN, TREETOPS and MATLAB is presented in this paper. With links for data transfer between these programs, the integrated design of multi-body flexible structures is significantly enhanced. The CAD environment is used to model the Space Shuttle/Pinhole Occulater Facility. Then a controller is designed and evaluated in the nonlinear time history sense. Recent enhancements and ongoing research to add more capabilities are also described.

  10. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  11. Electroencephalographic Correlates of Sensorimotor Integration and Embodiment during the Appreciation of Virtual Architectural Environments.

    PubMed

    Vecchiato, Giovanni; Tieri, Gaetano; Jelic, Andrea; De Matteis, Federico; Maglione, Anton G; Babiloni, Fabio

    2015-01-01

    Nowadays there is the hope that neuroscientific findings will contribute to the improvement of building design in order to create environments which satisfy man's demands. This can be achieved through the understanding of neurophysiological correlates of architectural perception. To this aim, the electroencephalographic (EEG) signals of 12 healthy subjects were recorded during the perception of three immersive virtual reality environments (VEs). Afterwards, participants were asked to describe their experience in terms of Familiarity, Novelty, Comfort, Pleasantness, Arousal, and Presence using a rating scale from 1 to 9. These perceptual dimensions are hypothesized to influence the pattern of cerebral spectral activity, while Presence is used to assess the realism of the virtual stimulation. Hence, the collected scores were used to analyze the Power Spectral Density (PSD) of the EEG for each behavioral dimension in the theta, alpha and mu bands by means of time-frequency analysis and topographic statistical maps. Analysis of Presence resulted in the activation of the frontal-midline theta, indicating the involvement of sensorimotor integration mechanisms when subjects expressed to feel more present in the VEs. Similar patterns also characterized the experience of familiar and comfortable VEs. In addition, pleasant VEs increased the theta power across visuomotor circuits and activated the alpha band in areas devoted to visuospatial exploration and processing of categorical spatial relations. Finally, the de-synchronization of the mu rhythm described the perception of pleasant and comfortable VEs, showing the involvement of left motor areas and embodied mechanisms for environment appreciation. Overall, these results show the possibility to measure EEG correlates of architectural perception involving the cerebral circuits of sensorimotor integration, spatial navigation, and embodiment. These observations can help testing architectural hypotheses in order to design environments matching the changing needs of humans.

  12. Electroencephalographic Correlates of Sensorimotor Integration and Embodiment during the Appreciation of Virtual Architectural Environments

    PubMed Central

    Vecchiato, Giovanni; Tieri, Gaetano; Jelic, Andrea; De Matteis, Federico; Maglione, Anton G.; Babiloni, Fabio

    2015-01-01

    Nowadays there is the hope that neuroscientific findings will contribute to the improvement of building design in order to create environments which satisfy man's demands. This can be achieved through the understanding of neurophysiological correlates of architectural perception. To this aim, the electroencephalographic (EEG) signals of 12 healthy subjects were recorded during the perception of three immersive virtual reality environments (VEs). Afterwards, participants were asked to describe their experience in terms of Familiarity, Novelty, Comfort, Pleasantness, Arousal, and Presence using a rating scale from 1 to 9. These perceptual dimensions are hypothesized to influence the pattern of cerebral spectral activity, while Presence is used to assess the realism of the virtual stimulation. Hence, the collected scores were used to analyze the Power Spectral Density (PSD) of the EEG for each behavioral dimension in the theta, alpha and mu bands by means of time-frequency analysis and topographic statistical maps. Analysis of Presence resulted in the activation of the frontal-midline theta, indicating the involvement of sensorimotor integration mechanisms when subjects expressed to feel more present in the VEs. Similar patterns also characterized the experience of familiar and comfortable VEs. In addition, pleasant VEs increased the theta power across visuomotor circuits and activated the alpha band in areas devoted to visuospatial exploration and processing of categorical spatial relations. Finally, the de-synchronization of the mu rhythm described the perception of pleasant and comfortable VEs, showing the involvement of left motor areas and embodied mechanisms for environment appreciation. Overall, these results show the possibility to measure EEG correlates of architectural perception involving the cerebral circuits of sensorimotor integration, spatial navigation, and embodiment. These observations can help testing architectural hypotheses in order to design environments matching the changing needs of humans. PMID:26733924

  13. Orbiter lessons learned: A guide to future vehicle development

    NASA Technical Reports Server (NTRS)

    Greenberg, Harry Stan

    1993-01-01

    Topics addressed are: (1) wind persistence loads methodology; (2) emphasize supportability in design of reusable vehicles; (3) design for robustness; (4) improved aerodynamic environment prediction methods for complex vehicles; (5) automated integration of aerothermal, manufacturing, and structures analysis; (6) continued electronic documentation of structural design and analysis; and (7) landing gear rollout load simulations.

  14. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  15. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  16. Parametric Modelling of As-Built Beam Framed Structure in Bim Environment

    NASA Astrophysics Data System (ADS)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, management of attribute and dynamic information, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management, and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides a platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it can derive the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data processing steps and different platforms into the single Revit environment.
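
    The Revit API plugin itself is written against Autodesk's .NET interfaces and is not reproduced here; as a hedged sketch of one core geometric step, deriving a beam's axis and length from surveyed or scanned points so that a parametric element can be placed along it, the code below fits a 3D line to a synthetic point cloud by principal component analysis. The points, noise level, and tolerances are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic scan points along a leaning beam, with a little measurement noise.
      t = rng.uniform(0.0, 4.0, 300)                       # positions along the beam (m)
      axis_true = np.array([0.9, 0.1, 0.42])               # leaning direction
      axis_true /= np.linalg.norm(axis_true)
      points = np.array([1.0, 2.0, 3.0]) + np.outer(t, axis_true)
      points += rng.normal(0.0, 0.005, points.shape)       # 5 mm noise

      # PCA line fit: the dominant right singular vector is the beam axis.
      centroid = points.mean(axis=0)
      _, _, vt = np.linalg.svd(points - centroid)
      axis_fit = vt[0]

      # Beam end points and length from the extreme projections onto the axis.
      proj = (points - centroid) @ axis_fit
      end_a = centroid + proj.min() * axis_fit
      end_b = centroid + proj.max() * axis_fit

      print("fitted axis:", np.round(axis_fit, 3))
      print("beam length: %.3f m" % np.linalg.norm(end_b - end_a))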

  17. Social support network, mental health and quality of life: a cross-sectional study in primary care.

    PubMed

    Portugal, Flávia Batista; Campos, Mônica Rodrigues; Correia, Celina Ragoni; Gonçalves, Daniel Almeida; Ballester, Dinarte; Tófoli, Luis Fernando; Mari, Jair de Jesus; Gask, Linda; Dowrick, Christopher; Bower, Peter; Fortes, Sandra

    2016-12-22

    The objective of this study was to identify the association between emotional distress and social support networks with quality of life in primary care patients. This was a cross-sectional study involving 1,466 patients in the cities of São Paulo and Rio de Janeiro, Brazil, in 2009/2010. The General Health Questionnaire, the Hospital Anxiety and Depression Scale and the brief version of the World Health Organization Quality of Life Instrument were used. The Social Support Network Index classified patients with the highest and lowest index as socially integrated or isolated. A bivariate analysis and four multiple linear regressions were conducted for each quality of life outcome. The mean scores for the physical, psychological, social relations, and environment domains were, respectively, 64.7; 64.2; 68.5 and 49.1. In the multivariate analysis, the psychological domain was negatively associated with isolation, whereas the social relations and environment domains were positively associated with integration. Integration and isolation proved to be important factors for those in emotional distress as they minimize or maximize negative effects on quality of life.
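
    As a generic, hedged illustration of the kind of multiple linear regression reported for each quality-of-life domain (not the study's actual data or model specification), the sketch below regresses a synthetic domain score on an emotional-distress score and a social-integration indicator using ordinary least squares.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 500

      # Fabricated predictors: distress score and a socially-integrated indicator (0/1).
      distress = rng.normal(10, 3, n)
      integrated = rng.integers(0, 2, n)

      # Fabricated quality-of-life domain score (0-100): worse with distress,
      # better with social integration.
      qol = 60 - 1.5 * distress + 6.0 * integrated + rng.normal(0, 5, n)

      # Ordinary least squares via the normal equations (design matrix with intercept).
      X = np.column_stack([np.ones(n), distress, integrated])
      beta, *_ = np.linalg.lstsq(X, qol, rcond=None)

      print("intercept, distress coef., integration coef.:", np.round(beta, 2))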

  18. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  19. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1993-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X DataSlice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  20. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1992-01-01

    We propose to develop an interactive environment for the analysis of large Earth science observation and model data sets. We will use a standard scientific data storage format and a large capacity (greater than 20 GB) optical disk system for data management; develop libraries for coordinate transformation and regridding of data sets; modify the NCSA X Image and X Data Slice software for typical Earth observation data sets by including map transformations and missing data handling; develop analysis tools for common mathematical and statistical operations; integrate the components described above into a system for the analysis and comparison of observations and model results; and distribute software and documentation to the scientific community.

  1. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
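
    The actual device is driven by its own instrument control software; purely as a sketch of how a programmable valve-array automaton can express a metering-and-mixing protocol as data, the code below encodes a short sequence of valve states and "plays" it back. The valve labels, states, and timing are assumptions, not the published device's protocol.

      import time

      # Hypothetical 3-valve lifting-gate unit: inlet, mixing chamber, outlet.
      VALVES = ("inlet", "chamber", "outlet")

      # A protocol is an ordered list of (valve-state map, hold time in seconds).
      METER_AND_MIX = [
          ({"inlet": "open",   "chamber": "open", "outlet": "closed"}, 0.2),  # load
          ({"inlet": "closed", "chamber": "open", "outlet": "closed"}, 0.5),  # meter/mix
          ({"inlet": "closed", "chamber": "open", "outlet": "open"},   0.2),  # route out
      ]

      def run_protocol(steps, actuate=lambda state: None):
          """Step through valve states; `actuate` would drive real pneumatic hardware."""
          for state, hold in steps:
              actuate(state)
              print("set valves:", state)
              time.sleep(hold)

      run_protocol(METER_AND_MIX)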

  2. Use of Cusp Catastrophe for Risk Analysis of Navigational Environment: A Case Study of Three Gorges Reservoir Area

    PubMed Central

    Hao, Guozhu

    2016-01-01

    A water traffic system is a huge, nonlinear, complex system, and its stability is affected by various factors. Water traffic accidents can be considered to be a kind of mutation of a water traffic system caused by the coupling of multiple navigational environment factors. In this study, the catastrophe theory, principal component analysis (PCA), and multivariate statistics are integrated to establish a situation recognition model for a navigational environment with the aim of performing a quantitative analysis of the situation of this environment via the extraction and classification of its key influencing factors; in this model, the natural environment and traffic environment are considered to be two control variables. The Three Gorges Reservoir area of the Yangtze River is considered as an example, and six critical factors, i.e., the visibility, wind, current velocity, route intersection, channel dimension, and traffic flow, are classified into two principal components: the natural environment and traffic environment. These two components are assumed to have the greatest influence on the navigation risk. Then, the cusp catastrophe model is employed to identify the safety situation of the regional navigational environment in the Three Gorges Reservoir area. The simulation results indicate that the situation of the navigational environment of this area is gradually worsening from downstream to upstream. PMID:27391057
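
    For reference, the standard cusp catastrophe used in this kind of model can be written as below, where the two control variables (here associated with the natural-environment and traffic-environment principal components) enter a potential whose critical points define the equilibrium surface; the exact normalization used by the authors may differ.

      V(x) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}\,a\,x^{2} + b\,x,
      \qquad
      \frac{\partial V}{\partial x} = x^{3} + a\,x + b = 0 \quad\text{(equilibrium surface)},
      \qquad
      \Delta = 4a^{3} + 27b^{2} = 0 \quad\text{(bifurcation set)}.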

  3. Use of Cusp Catastrophe for Risk Analysis of Navigational Environment: A Case Study of Three Gorges Reservoir Area.

    PubMed

    Jiang, Dan; Hao, Guozhu; Huang, Liwen; Zhang, Dan

    2016-01-01

    A water traffic system is a huge, nonlinear, complex system, and its stability is affected by various factors. Water traffic accidents can be considered to be a kind of mutation of a water traffic system caused by the coupling of multiple navigational environment factors. In this study, the catastrophe theory, principal component analysis (PCA), and multivariate statistics are integrated to establish a situation recognition model for a navigational environment with the aim of performing a quantitative analysis of the situation of this environment via the extraction and classification of its key influencing factors; in this model, the natural environment and traffic environment are considered to be two control variables. The Three Gorges Reservoir area of the Yangtze River is considered as an example, and six critical factors, i.e., the visibility, wind, current velocity, route intersection, channel dimension, and traffic flow, are classified into two principal components: the natural environment and traffic environment. These two components are assumed to have the greatest influence on the navigation risk. Then, the cusp catastrophe model is employed to identify the safety situation of the regional navigational environment in the Three Gorges Reservoir area. The simulation results indicate that the situation of the navigational environment of this area is gradually worsening from downstream to upstream.

  4. Psychosocial environment for the integrated education opportunities of the disabled in Lithuania

    PubMed Central

    Samsoniene, Laimute; Juozulynas, Algirdas; Surkiene, Gene; Jankauskiene, Konstancija; Lukšiene, Aloyza

    2006-01-01

    Background The policy of the diminution of the social isolation of the disabled is the main objective of the strategy of the EU new policy concerning the disabled. Lithuanian society faces this objective as well. For this reason, this study aiming at providing the theoretical basis for and predicting the possible psycho-social environment in an integrated education system, as well as at the evaluation of the reasons for the formation of a positive approach to the disabled, is especially relevant, since it creates the prerequisites for the optimisation of the process of the integration of disabled schoolchildren into the general system of education. Method The sample of the study consisted of 2471 children from the same schools: not integrated (1958), integrated (126) and special schools (382). Empirical methods: questionnaire poll, comparative analysis. The statistical analysis was carried out using SAS. Results Our study showed that the majority of schoolchildren without disabilities and disabled schoolchildren have positive intentions for interpersonal interactions (>82%) and positive emotions (>69%) independently of the discrepant character of interpersonal contacts, different conditions of education and family life, and despite of low level of knowledge. Conclusion The results of the study confirmed positive intentions for interpersonal interaction between disabled schoolchildren and schoolchildren without disabilities, as well as a positive character of emotions, and disprove the unsound myth of the opponents of the social integration of the disabled stating that disabled children in comprehensive schools would undoubtedly experience offence from their peers without disabilities. PMID:17173706

  5. Bioinformatics resource manager v2.3: an integrated software environment for systems biology with microRNA and cross-species analysis tools

    PubMed Central

    2012-01-01

    Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single software environment with the added capability to interact with public data sources and visual analytic tools for HTP data analysis at a systems level. BRM is developed using Java™ and other open-source technologies for free distribution (http://www.sysbio.org/dataresources/brm.stm). PMID:23174015

  6. MSFC/EV44 Natural Environment Capabilities

    NASA Technical Reports Server (NTRS)

    NeergaardParker, Linda; Willis, Emily M.; Minnow, Joseph I.; Coffey, Vic N.

    2014-01-01

    The Natural Environments Branch at Marshall Space Flight Center is an integral part of many NASA satellite and launch vehicle programs, providing analyses of the space and terrestrial environments that are used for program development efforts, operational support, and anomaly investigations. These capabilities include model development, instrument build and testing, analysis of space and terrestrial related data, spacecraft charging anomaly investigations, surface and internal charging modeling, space environment definition, and radiation assessments for electronic parts. All aspects of space and terrestrial design are implemented with the goal of devising missions that are successful from launch to operations in the space environment of LEO, polar, GEO, and interplanetary orbits.

  7. Stability analysis using SDSA tool

    NASA Astrophysics Data System (ADS)

    Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa

    2011-11-01

    The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of the aircraft just in the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand alone software, and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free flight experimental results to compare and validate the presented software. Two aircraft are considered, the EADS Ranger 2000 and the Warsaw University designed PW-6 glider. For the two cases considered here the SDSA software is shown to be an excellent tool for predicting dynamic characteristics of an aircraft.
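
    SDSA's internal models are not reproduced here; as a hedged sketch of the basic computation behind dynamic-stability characteristics, namely eigenvalues of a linearized state matrix yielding modal frequency and damping, the code below analyzes an illustrative longitudinal system matrix. The numerical entries are invented for demonstration and are not Ranger 2000 or PW-6 data.

      import numpy as np

      # Illustrative longitudinal state matrix A for states [u, w, q, theta];
      # the entries are invented values for illustration only.
      A = np.array([
          [-0.02,  0.10,  0.0, -9.81],
          [-0.10, -1.20, 60.0,  0.0 ],
          [ 0.001,-0.05, -1.50, 0.0 ],
          [ 0.0,   0.0,   1.0,  0.0 ],
      ])

      eigvals = np.linalg.eigvals(A)
      for lam in eigvals:
          if abs(lam.imag) > 1e-9:
              omega_n = abs(lam)                     # natural frequency (rad/s)
              zeta = -lam.real / omega_n             # damping ratio
              print("oscillatory mode: wn=%.3f rad/s, zeta=%.3f" % (omega_n, zeta))
          else:
              print("aperiodic mode: lambda=%.4f 1/s" % lam.real)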

  8. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (> or = .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.
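
    As a small, hedged illustration of checking interrater reliability between an automated recoder and a human video analyst (not the study's actual statistic or data), the sketch below computes simple percent agreement and Cohen's kappa for two fabricated sequences of task codes.

      from collections import Counter

      # Fabricated task codes assigned to the same time segments by two "raters".
      itme  = ["nav", "nav", "comms", "scan", "scan", "nav", "comms", "scan", "nav", "nav"]
      human = ["nav", "nav", "comms", "scan", "nav",  "nav", "comms", "scan", "nav", "scan"]

      n = len(itme)
      agreement = sum(a == b for a, b in zip(itme, human)) / n

      # Cohen's kappa: chance-corrected agreement based on the raters' marginals.
      c1, c2 = Counter(itme), Counter(human)
      p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
      kappa = (agreement - p_e) / (1 - p_e)

      print("percent agreement: %.2f" % agreement)
      print("Cohen's kappa: %.2f" % kappa)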

  9. Planetary/DOD entry technology flight experiments. Volume 2: Planetary entry flight experiments

    NASA Technical Reports Server (NTRS)

    Christensen, H. E.; Krieger, R. J.; Mcneilly, W. R.; Vetter, H. C.

    1976-01-01

    The technical feasibility of launching a high-speed Earth entry vehicle from the space shuttle to advance technology for the exploration of the outer planets' atmospheres was established. The disciplines of thermodynamics, orbital mechanics, aerodynamics, propulsion, structures, design, electronics, and system integration focused on the goal of producing outer planet environments on a probe-shaped vehicle during an Earth entry. Major aspects of analysis and vehicle design studied include: planetary environments, Earth entry environment capability, mission maneuvers, capabilities of shuttle upper stages, a comparison of Earth entry and planetary environments, experiment design, and vehicle design.

  10. Integrated instrumentation & computation environment for GRACE

    NASA Astrophysics Data System (ADS)

    Dhekne, P. S.

    2002-03-01

    The project GRACE (Gamma Ray Astrophysics with Coordinated Experiments) aims at setting up a state-of-the-art gamma-ray observatory at Mt. Abu, Rajasthan for undertaking comprehensive scientific exploration over a wide spectral window (tens of keV to hundreds of TeV) from a single location through four coordinated experiments. The cumulative data collection rate of all the telescopes is expected to be about 1 GB/hr, necessitating innovations in the data management environment. The real-time data acquisition and control as well as the off-line data processing, analysis, and visualization environments of these systems are based on the use of cutting-edge and affordable technologies in the fields of computers, communications, and the Internet. We propose to provide a single, unified environment through seamless integration of instrumentation and computation, taking advantage of recent advancements in Web-based technologies. This new environment will allow researchers better access to facilities, improve resource utilization, and enhance collaboration by providing identical environments for online as well as offline usage of this facility from any location. We present here a proposed implementation strategy for a platform-independent, web-based system that supplements automated functions with video-guided interactive and collaborative remote viewing, remote control through a virtual instrumentation console, remote acquisition of telescope data, data analysis, data visualization, and an active imaging system. This end-to-end web-based solution will enhance collaboration among researchers at the national and international level for undertaking scientific studies using the telescope systems of the GRACE project.

  11. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  12. The development of learning competency and skills for the 21st century to integrate "TPCK" of world life with local environment in students grade 11

    NASA Astrophysics Data System (ADS)

    Jedaman, Pornchai; Kinboon, Nittayaporn; Suksup, Chareon; Kinboon, Wutcharapong

    2018-01-01

    The teaching strategy applied here is technological pedagogical content knowledge (TPCK), which describes the knowledge of content, pedagogy, and technology that corresponds to effective teaching. This article addresses: 1) the learning competency and skills for the 21st century gained by integrating "TPCK" of world life with the local environment for students in grade 11; 2) the effective achievement of the learning competency and skills from integrating "TPCK" of world life with the local environment for students in grade 11, with a passing score of 80 percent or higher (EI.); and 3) the attitudes of grade 11 students toward the learning activities. The study employed both quantitative and qualitative approaches with 9 teachers from three schools in a participatory action research (PAR) design, collecting data through instruction, testing, and a questionnaire survey with 33 grade 11 students of Phuwiengwittayakom school. The participants were selected by cluster random sampling. The data were analyzed with descriptive statistics: percentage, mean, and standard deviation. The findings showed that the learning competency and skills for the 21st century from integrating "TPCK" of world life with the local environment were at an excellent level for TK, PK, CK, and TCK (X¯ = 3.62, 3.57, 3.54, 3.51, respectively) and at a good level for PCK, TPK, and TPCK (X¯ = 3.36, 3.23, 3.17, respectively). The effective achievement of the learning competency and skills from integrating "TPCK" of world life with the local environment in 30 grade 11 students reached a passing score (EI.) of 90.91 percent. The attitudes of the grade 11 students toward the learning activities were at a high level (X¯ = 3.29). In addition, the grade 11 students came to understand the importance of local learning resources, including the value of integrating technological knowledge and choosing the right technology to create effective information.

  13. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 Federal Acquisition Regulations System (2010-10-01): Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the clause "Integration of Environment, Safety, and Health Into Work Planning and Execution."

  14. Analysis of the possibility of a PGA309 integrated circuit application in pressure sensors

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Baczewski, Michal; Idzkowski, Adam

    2016-09-01

    This article presents the results of research concerning the analysis of the possibilities of applying a PGA309 integrated circuit in transducers used for pressure measurement. The experiments were done with the use of a PGA309EVM-USB evaluation circuit with a BD|SENSORS pressure sensor. A specially prepared MATLAB script was used to choose the calibration settings and to analyze the results. The article discusses the developed algorithm that processes the measurement results, i.e., the algorithm that calculates the desired gain and the offset-adjustment voltage of the transducer measurement bridge in relation to the input signal range of the integrated circuit and the temperature of the environment (temperature compensation). The checking procedure was conducted in a measurement laboratory, and the obtained results were analyzed and discussed.
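
    The PGA309's own register coding and coefficient format are documented in its datasheet and are not reproduced here; the sketch below only shows the generic first-order arithmetic behind such a calibration, i.e. choosing a gain and an input-referred offset so that a bridge's output span maps onto the desired output range. All sensor and range values are assumptions.

      # Assumed bridge characteristics at the calibration temperature.
      v_bridge_zero = 0.002   # V, bridge output at zero pressure
      v_bridge_full = 0.020   # V, bridge output at full-scale pressure

      # Assumed desired conditioned-output range.
      v_out_low = 0.5         # V
      v_out_high = 4.5        # V

      # First-order model: v_out = gain * (v_bridge + v_offset)
      gain = (v_out_high - v_out_low) / (v_bridge_full - v_bridge_zero)
      v_offset = v_out_low / gain - v_bridge_zero

      print("required gain: %.1f V/V" % gain)
      print("required input-referred offset: %.4f V" % v_offset)

      # Sanity check at both ends of the span.
      for v_b in (v_bridge_zero, v_bridge_full):
          print("bridge %.3f V -> output %.3f V" % (v_b, gain * (v_b + v_offset)))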

  15. DEPEND: A simulation-based environment for system level dependability analysis

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar; Iyer, Ravishankar K.

    1992-01-01

    The design and evaluation of highly reliable computer systems is a complex issue. Designers mostly develop such systems based on prior knowledge and experience and occasionally from analytical evaluations of simplified designs. A simulation-based environment called DEPEND, which is especially geared for the design and evaluation of fault-tolerant architectures, is presented. DEPEND is unique in that it exploits the properties of object-oriented programming to provide a flexible framework with which a user can rapidly model and evaluate various fault-tolerant systems. The key features of the DEPEND environment are described, and its capabilities are illustrated with a detailed analysis of a real design. In particular, DEPEND is used to simulate the Unix-based Tandem Integrity fault-tolerant system and evaluate how well it handles near-coincident errors caused by correlated and latent faults. Issues such as memory scrubbing, re-integration policies, and workload-dependent repair times, which affect how the system handles near-coincident errors, are also evaluated. The method used by DEPEND to simulate error latency and the time-acceleration technique that provides enormous simulation speed-up are also discussed. Unlike other simulation-based dependability studies, the use of these approaches and the accuracy of the simulation model are validated by comparing the results of the simulations with measurements obtained from fault injection experiments conducted on a production Tandem Integrity machine.
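
    DEPEND itself is a full simulation environment and its models are not reproduced here; the sketch below only illustrates, with assumed fault and repair parameters, how a simple event-driven simulation can estimate the frequency of near-coincident errors (a second fault arriving before recovery from the first completes), which is the kind of question the study above examines.

```python
# Illustrative sketch (not DEPEND): estimate how often a second fault arrives
# before recovery from the first completes, i.e. a near-coincident error.
# Fault rate, repair time, and horizon are assumed values.
import random

def near_coincident_fraction(fault_rate, repair_time, horizon, trials=200, seed=1):
    rng = random.Random(seed)
    pairs = hits = 0
    for _ in range(trials):
        t = rng.expovariate(fault_rate)          # first fault in this run
        while t < horizon:
            gap = rng.expovariate(fault_rate)    # time until the next fault
            pairs += 1
            if gap < repair_time:                # next fault lands during recovery
                hits += 1
            t += gap
    return hits / pairs if pairs else 0.0

if __name__ == "__main__":
    # Assumed: one fault per 100 hours, 0.5 hour recovery, 10,000 hour mission horizon.
    print(f"{near_coincident_fraction(1 / 100, 0.5, 10_000):.4f}")
```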

  16. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
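
    The hierarchical triangular mesh indexing and SciDB integration described above are substantial systems that cannot be reproduced here; the sketch below only illustrates the underlying idea of indexing observations by a shared spatial key so that co-located analysis touches matching bins, using a plain latitude/longitude grid as a stand-in for a real HTM, with the resolution and example values assumed.

```python
# Illustrative only: join observations from two datasets on a shared spatial
# index key. A coarse lat/lon grid stands in for the hierarchical triangular
# mesh described above; cell size and example values are assumed.
import math
from collections import defaultdict

def grid_key(lat, lon, cells_per_degree=4):
    return (math.floor(lat * cells_per_degree), math.floor(lon * cells_per_degree))

def join_by_location(dataset_a, dataset_b, cells_per_degree=4):
    """Yield (value_a, value_b) pairs whose observations fall in the same cell."""
    index_b = defaultdict(list)
    for lat, lon, value in dataset_b:
        index_b[grid_key(lat, lon, cells_per_degree)].append(value)
    for lat, lon, value in dataset_a:
        for other in index_b.get(grid_key(lat, lon, cells_per_degree), []):
            yield value, other

if __name__ == "__main__":
    satellite = [(38.10, -76.30, 291.4), (38.20, -76.20, 290.8)]   # invented observations
    model = [(38.12, -76.31, 289.9), (40.00, -75.00, 284.2)]       # invented model output
    print(list(join_by_location(satellite, model)))
```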

  17. Human Support Technology Research to Enable Exploration

    NASA Technical Reports Server (NTRS)

    Joshi, Jitendra

    2003-01-01

    Contents include the following: Advanced life support. System integration, modeling, and analysis. Progressive capabilities. Water processing. Air revitalization systems. Why advanced CO2 removal technology? Solid waste resource recovery systems: lyophilization. ISRU technologies for Mars life support. Atmospheric resources of Mars. N2 consumable/make-up for Mars life. Integrated test beds. Monitoring and controlling the environment. Ground-based commercial technology. Optimizing size vs capability. Water recovery systems. Flight verification topics.

  18. Multimedia-modeling integration development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelton, Mitchell A.; Hoopes, Bonnie L.

    2002-09-02

    There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR) while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.

  19. Integrated analysis of landscape management scenarios using state and transition models in the upper Grande Ronde River subbasin, Oregon, USA.

    Treesearch

    Miles A. Hemstrom; James Merzenich; Allison Reger; Barbara. Wales

    2007-01-01

    We modeled the integrated effects of natural disturbances and management activities for three disturbance scenarios on a 178 000-ha landscape in the upper Grande Ronde subbasin of northeast Oregon. The landscape included three forest environments (warm-dry, cool-moist, and cold) as well as a mixture of publicly and privately owned lands. Our models were state and...

  20. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic nature of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  1. Making authentic science accessible—the benefits and challenges of integrating bioinformatics into a high-school science curriculum

    PubMed Central

    Gelbart, Hadas; Ben-Dor, Shifra; Yarden, Anat

    2017-01-01

    Despite the central place held by bioinformatics in modern life sciences and related areas, it has only recently been integrated to a limited extent into high-school teaching and learning programs. Here we describe the assessment of a learning environment entitled ‘Bioinformatics in the Service of Biotechnology’. Students’ learning outcomes and attitudes toward the bioinformatics learning environment were measured by analyzing their answers to questions embedded within the activities, questionnaires, interviews and observations. Students’ difficulties and knowledge acquisition were characterized based on four categories: the required domain-specific knowledge (declarative, procedural, strategic or situational), the scientific field that each question stems from (biology, bioinformatics or their combination), the associated cognitive-process dimension (remember, understand, apply, analyze, evaluate, create) and the type of question (open-ended or multiple choice). Analysis of students’ cognitive outcomes revealed learning gains in bioinformatics and related scientific fields, as well as appropriation of the bioinformatics approach as part of the students’ scientific ‘toolbox’. For students, questions stemming from the ‘old world’ biology field and requiring declarative or strategic knowledge were harder to deal with. This stands in contrast to their teachers’ prediction. Analysis of students’ affective outcomes revealed positive attitudes toward bioinformatics and the learning environment, as well as their perception of the teacher’s role. Insights from this analysis yielded implications and recommendations for curriculum design, classroom enactment, teacher education and research. For example, we recommend teaching bioinformatics in an integrative and comprehensive manner, through an inquiry process, and linking it to the wider science curriculum. PMID:26801769

  2. Making authentic science accessible-the benefits and challenges of integrating bioinformatics into a high-school science curriculum.

    PubMed

    Machluf, Yossy; Gelbart, Hadas; Ben-Dor, Shifra; Yarden, Anat

    2017-01-01

    Despite the central place held by bioinformatics in modern life sciences and related areas, it has only recently been integrated to a limited extent into high-school teaching and learning programs. Here we describe the assessment of a learning environment entitled 'Bioinformatics in the Service of Biotechnology'. Students' learning outcomes and attitudes toward the bioinformatics learning environment were measured by analyzing their answers to questions embedded within the activities, questionnaires, interviews and observations. Students' difficulties and knowledge acquisition were characterized based on four categories: the required domain-specific knowledge (declarative, procedural, strategic or situational), the scientific field that each question stems from (biology, bioinformatics or their combination), the associated cognitive-process dimension (remember, understand, apply, analyze, evaluate, create) and the type of question (open-ended or multiple choice). Analysis of students' cognitive outcomes revealed learning gains in bioinformatics and related scientific fields, as well as appropriation of the bioinformatics approach as part of the students' scientific 'toolbox'. For students, questions stemming from the 'old world' biology field and requiring declarative or strategic knowledge were harder to deal with. This stands in contrast to their teachers' prediction. Analysis of students' affective outcomes revealed positive attitudes toward bioinformatics and the learning environment, as well as their perception of the teacher's role. Insights from this analysis yielded implications and recommendations for curriculum design, classroom enactment, teacher education and research. For example, we recommend teaching bioinformatics in an integrative and comprehensive manner, through an inquiry process, and linking it to the wider science curriculum. © The Author 2016. Published by Oxford University Press.

  3. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also lead to the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  4. Integrated software environment based on COMKAT for analyzing tracer pharmacokinetics with molecular imaging.

    PubMed

    Fang, Yu-Hua Dean; Asthana, Pravesh; Salinas, Cristian; Huang, Hsuan-Ming; Muzic, Raymond F

    2010-01-01

    An integrated software package, Compartment Model Kinetic Analysis Tool (COMKAT), is presented in this report. COMKAT is an open-source software package with many functions for incorporating pharmacokinetic analysis in molecular imaging research and has both command-line and graphical user interfaces. With COMKAT, users may load and display images, draw regions of interest, load input functions, select kinetic models from a predefined list, or create a novel model and perform parameter estimation, all without having to write any computer code. For image analysis, COMKAT image tool supports multiple image file formats, including the Digital Imaging and Communications in Medicine (DICOM) standard. Image contrast, zoom, reslicing, display color table, and frame summation can be adjusted in COMKAT image tool. It also displays and automatically registers images from 2 modalities. Parametric imaging capability is provided and can be combined with the distributed computing support to enhance computation speeds. For users without MATLAB licenses, a compiled, executable version of COMKAT is available, although it currently has only a subset of the full COMKAT capability. Both the compiled and the noncompiled versions of COMKAT are free for academic research use. Extensive documentation, examples, and COMKAT itself are available on its wiki-based Web site, http://comkat.case.edu. Users are encouraged to contribute, sharing their experience, examples, and extensions of COMKAT. With integrated functionality specifically designed for imaging and kinetic modeling analysis, COMKAT can be used as a software environment for molecular imaging and pharmacokinetic analysis.
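
    COMKAT itself is a MATLAB package and its models are not reproduced here; as a language-neutral illustration of the kind of compartment-model parameter estimation it supports, the sketch below fits a one-tissue compartment model to simulated time-activity data. The model form, rate constants, input function, and noise level are all assumed for illustration.

```python
# Illustrative one-tissue compartment model fit (not COMKAT itself):
#   dCt/dt = K1 * Cp(t) - k2 * Ct(t)
# where Cp is an assumed plasma input function and Ct the tissue concentration.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 61)            # minutes
cp = 10.0 * np.exp(-0.1 * t)          # assumed plasma input function

def tissue_curve(t_pts, K1, k2):
    def dct_dt(ct, ti):
        return K1 * np.interp(ti, t, cp) - k2 * ct
    return odeint(dct_dt, 0.0, t_pts).ravel()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_K1, true_k2 = 0.3, 0.05
    measured = tissue_curve(t, true_K1, true_k2) + rng.normal(0.0, 0.2, t.size)
    (K1_hat, k2_hat), _ = curve_fit(tissue_curve, t, measured, p0=[0.1, 0.1])
    print(f"estimated K1 = {K1_hat:.3f}, k2 = {k2_hat:.3f}")
```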

  5. Project Integration Architecture (PIA) and Computational Analysis Programming Interface (CAPRI) for Accessing Geometry Data from CAD Files

    NASA Technical Reports Server (NTRS)

    Benyo, Theresa L.

    2002-01-01

    Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD vendor neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD vendor neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farren Hunt

    Idaho National Laboratory (INL) performed an Annual Effectiveness Review of the Integrated Safety Management System (ISMS), per 48 Code of Federal Regulations (CFR) 970.5223-1, “Integration of Environment, Safety and Health into Work Planning and Execution.” The annual review assessed Integrated Safety Management (ISM) effectiveness, provided feedback to maintain system integrity, and identified target areas for focused improvements and assessments for fiscal year (FY) 2013. Results of the FY 2012 annual effectiveness review demonstrated that the INL’s ISMS program was significantly strengthened. Actions implemented by the INL demonstrate that the overall Integrated Safety Management System is sound and ensures safe and successful performance of work while protecting workers, the public, and the environment. This report also provides several opportunities for improvement that will help further strengthen the ISM Program and the pursuit of safety excellence. Demonstrated leadership and commitment, continued surveillance, and dedicated resources have been instrumental in maturing a sound ISMS program. Based upon interviews with personnel, reviews of assurance activities, and analysis of ISMS process implementation, this effectiveness review concludes that ISM is institutionalized and is “Effective”.

  7. Using EIGER for Antenna Design and Analysis

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.

    2007-01-01

    EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package that is built upon a flexible framework, designed using object-oriented techniques. The analysis methods used include moment-method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with a reasonable effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required for facilitating the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than when using linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.

  8. Microbial and viral chitinases: Attractive biopesticides for integrated pest management.

    PubMed

    Berini, Francesca; Katz, Chen; Gruzdev, Nady; Casartelli, Morena; Tettamanti, Gianluca; Marinelli, Flavia

    The negative impact of the massive use of synthetic pesticides on the environment and on human health has stimulated the search for environment-friendly practices for controlling plant diseases and pests. Among them, biocontrol, which relies on using beneficial organisms or their products (bioactive molecules and/or hydrolytic enzymes), holds the greatest promise and is considered a pillar of integrated pest management. Chitinases are particularly attractive to this purpose since they have fungicidal, insecticidal, and nematicidal activities. Here, current knowledge on the biopesticidal action of microbial and viral chitinases is reviewed, together with a critical analysis of their future development as biopesticides. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.
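
    The MACHETE link analysis itself is not described in enough detail here to reproduce; as a generic illustration of the kind of link-budget calculation such a tool evaluates, the sketch below computes received power from a textbook free-space path loss formula, with every parameter value assumed.

```python
# Illustrative free-space link budget (not MACHETE): received power in dBm.
# All antenna gains, distances, and frequencies below are assumed example values.
import math

def free_space_path_loss_db(distance_m, frequency_hz):
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * frequency_hz / c)

def received_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, frequency_hz):
    return (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
            - free_space_path_loss_db(distance_m, frequency_hz))

if __name__ == "__main__":
    # Assumed example: 10 W (40 dBm) transmitter, 3 dBi and 40 dBi antennas,
    # 2,000 km range, S-band carrier at 2.2 GHz.
    p_rx = received_power_dbm(40.0, 3.0, 40.0, 2.0e6, 2.2e9)
    print(f"received power = {p_rx:.1f} dBm")
```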

  10. 14 CFR 121.909 - Approval of Advanced Qualification Program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... methodology must incorporate a thorough analysis of the certificate holder's operations, aircraft, line environment and job functions. All AQP qualification and continuing qualification curriculums must integrate.... (ii) Initial job task listing. (iii) Instructional systems development methodology. (iv) Qualification...

  11. Integrating traffic operation with emission impact using dual-loop data.

    DOT National Transportation Integrated Search

    2012-02-22

    "Transportation contributes great amount of green house gases and other pollutant emissions to the global environment. Localized analysis of on-road traffic source emissions is often required by the U.S. Environmental Protection Agency (EPA) for proj...

  12. Neuro-Holistic Learning©: An Integrated Kinesthetic Approach to Cognitive Learning© Using Collaborative Interactive Thought Exchange© in a Blended Environment to Enhance the Learning of Young African American Males

    ERIC Educational Resources Information Center

    Osler, James Edward, II; Wright, Mark Anthony

    2016-01-01

    This paper is part two of the article entitled, "Dynamic Neuroscientific Systemology: Using Tri-Squared Meta-Analysis and Innovative Instructional Design to Develop a Novel Distance Education Model for the Systemic Creation of Engaging Online Learning Environments" published in the July-September 2015 issue of i-manager's "Journal…

  13. A proposed biophysical approach to Visual absorption capability (VAC)

    Treesearch

    W. C. Yeomans

    1979-01-01

    In British Columbia, visual analysis is in its formative stages and has only recently been accepted by Government as a resource component, notably within the Resource Analysis Branch, Ministry of Environment. Visual absorption capability (VAC) is an integral factor in visual resource assessment. VAC is examined by the author in the degree to which it relates to...

  14. In-field volatile collection and analysis method utilizing a portable GC-MS: in situ headspace analysis of intact and damaged yellow starthistle flower heads

    USDA-ARS?s Scientific Manuscript database

    Introduction: Understanding the complex chemical signaling of plants and insects is an important component of chemical ecology. Accordingly, the collection of chemical cues from plants in their normal environment is integral to elucidation of multifaceted plant-insect communications. Often times rem...

  15. Incorporating "Motivation" into the Functional Analysis of Challenging Behavior: On the Interactive and Integrative Potential of the Motivating Operation

    ERIC Educational Resources Information Center

    Langthorne, Paul; McGill, Peter; O'Reilly, Mark

    2007-01-01

    Sensitivity theory attempts to account for the variability often observed in challenging behavior by recourse to the "aberrant motivation" of people with intellectual and developmental disabilities. In this article, we suggest that a functional analysis based on environmental (challenging environments) and biological (challenging needs) motivating…

  16. Integration of health and environment through health impact assessment: cases from three continents.

    PubMed

    Negev, Maya; Levine, Hagai; Davidovitch, Nadav; Bhatia, Rajiv; Mindell, Jennifer

    2012-04-01

    Despite the strong linkage between environment and health, institutions responsible for these fields operate in largely fragmented ways with limited interaction. As illustrated in the recent engagement between health and urban planning institutions, inter-institutional cooperation could support more effective and politically acceptable solutions for both local and global problems. Analysis of three case-studies, from three different continents, shows that HIA might serve to promote synergies among health and environmental disciplines in different local contexts, and could lead to institutional and procedural changes that promote health. Case examples provided supportive evidence for these effects, despite differences in approaches to HIA and governance levels. Obstacles to the use of HIA for inter-institutional integration also differed between countries. Lessons learned could support cooperation in other common interests of health and environment disciplines such as research, training and preparedness, and mitigation of public health emergencies related to the environment. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Integration of statistical and physiological analyses of adaptation of near-isogenic barley lines.

    PubMed

    Romagosa, I; Fox, P N; García Del Moral, L F; Ramos, J M; García Del Moral, B; Roca de Togores, F; Molina-Cano, J L

    1993-08-01

    Seven near-isogenic barley lines, differing for three independent mutant genes, were grown in 15 environments in Spain. Genotype x environment interaction (G x E) for grain yield was examined with the Additive Main Effects and Multiplicative Interaction (AMMI) model. The results of this statistical analysis of multilocation yield data were compared with a morpho-physiological characterization of the lines at two sites (Molina-Cano et al. 1990). The first two principal component axes from the AMMI analysis were strongly associated with the morpho-physiological characters. The independent but parallel discrimination among genotypes reflects genetic differences and highlights the power of the AMMI analysis as a tool to investigate G x E. Characters which appear to be positively associated with yield in the germplasm under study could be identified for some environments.
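
    As an illustration of the AMMI decomposition used above (additive main effects followed by a singular value decomposition of the interaction residuals), the sketch below extracts the first two interaction principal component axes from an invented genotype-by-environment yield table; nothing here reproduces the paper's data or results, and score-scaling conventions vary between implementations.

```python
# Illustrative AMMI decomposition: additive main effects plus an SVD of the
# genotype-by-environment interaction residuals. The yield table is invented.
import numpy as np

def ammi(yields):
    """yields: genotypes x environments matrix of mean yields."""
    grand = yields.mean()
    g_eff = yields.mean(axis=1, keepdims=True) - grand    # genotype main effects
    e_eff = yields.mean(axis=0, keepdims=True) - grand    # environment main effects
    residual = yields - grand - g_eff - e_eff             # G x E interaction
    u, s, vt = np.linalg.svd(residual, full_matrices=False)
    scale = np.sqrt(s)                                    # one common scaling convention
    return u * scale, vt.T * scale, s                     # genotype scores, environment scores

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    data = rng.normal(4.0, 0.5, size=(7, 15))             # 7 lines x 15 environments, invented
    g_scores, e_scores, singular_values = ammi(data)
    print("first two genotype IPCA scores:\n", np.round(g_scores[:, :2], 2))
```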

  18. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on the models of pollutant loading (GWLF and PLOAD), the water quality of receiving water bodies (QUAL2E and WASP), and the watershed models that integrate pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. The other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trends and application prospects of watershed water environment pollution models were discussed.

  19. A Prototyping Effort for the Integrated Spacecraft Analysis System

    NASA Technical Reports Server (NTRS)

    Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre

    2011-01-01

    Computer modeling and simulation has recently become an essential technique for predicting and validating spacecraft performance. However, most computer models only examine spacecraft subsystems, and the independent nature of the models creates integration problems, which limits the possibility of simulating a spacecraft as an integrated unit despite a desire for this type of analysis. A new project called Integrated Spacecraft Analysis was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype would help future developers assess design issues. The prototype explores a service-oriented design paradigm that theoretically allows programs written in different languages to communicate with one another. It includes creating a uniform interface to the SPICE libraries such that different in-house tools like APGEN or SEQGEN can exchange information with it without much change. Service orientation may result in a slower system as compared to a single application, and more research needs to be done on the different available technologies, but a service-oriented approach could increase long-term maintainability and extensibility.

  20. Vulnerability assessment of atmospheric environment driven by human impacts.

    PubMed

    Zhang, Yang; Shen, Jing; Ding, Feng; Li, Yu; He, Li

    2016-11-15

    Atmospheric environment quality worsening is a substantial threat to public health worldwide, and in many places air pollution due to the intensification of human activity is increasing dramatically. However, no studies have investigated the integration of vulnerability assessment with the atmospheric environment as driven by human impacts. The objective of this study was to identify and prioritize undesirable environmental changes as an early warning system for environment managers and decision makers in terms of human, atmospheric environment, and socioeconomic elements. We develop a vulnerability assessment method for the atmospheric environment associated with human impact; this method integrates the spatial context of a Geographic Information System (GIS) tool, multi-criteria decision analysis (MCDA), and ordered weighted averaging (OWA) operators under the Exposure-Sensitivity-Adaptive Capacity (ESA) framework. Decision makers can obtain vulnerability assessment results corresponding to different decision attitudes. We further applied the developed method in the Beijing-Tianjin-Hebei (BTH) region, China, and found it to be reliable and consistent with the China Environmental Status Bulletin. Results indicate that the vulnerability of the atmospheric environment in the BTH region is not optimistic, and environment managers should do more about air pollution. Thus, the most appropriate strategic decisions and development programs for a city or region can be selected with the aid of the vulnerability results. Copyright © 2016 Elsevier B.V. All rights reserved.
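
    The study's ESA indicators and weights are not given in the abstract; the sketch below only illustrates the ordered weighted averaging (OWA) aggregation step the method relies on, with invented criterion scores and order weights standing in for an assumed decision attitude.

```python
# Illustrative OWA aggregation (not the study's indicators or weights): scores
# are sorted in descending order and combined with position weights, so the
# weight vector encodes the decision maker's attitude.

def owa(scores, order_weights):
    if abs(sum(order_weights) - 1.0) > 1e-9:
        raise ValueError("order weights must sum to 1")
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(order_weights, ranked))

if __name__ == "__main__":
    criteria = [0.82, 0.40, 0.65]        # invented normalized indicator scores
    optimistic = [0.6, 0.3, 0.1]         # weight concentrated on the highest scores
    pessimistic = [0.1, 0.3, 0.6]        # weight concentrated on the lowest scores
    print(f"optimistic = {owa(criteria, optimistic):.3f}, "
          f"pessimistic = {owa(criteria, pessimistic):.3f}")
```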

  1. 76 FR 17145 - Agency Information Collection Activities: Business Transformation-Automated Integrated Operating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-28

    ... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... Transformation--Integrated Operating Environment (IOE); OMB Control No. 1615-NEW. SUMMARY: USCIS is developing an automated Integrated Operating Environment (IOE) to process benefit applications. The IOE will collect...

  2. Managed Development Environment Successes for MSFC's VIPA Team

    NASA Technical Reports Server (NTRS)

    Finckenor, Jeff; Corder, Gary; Owens, James; Meehan, Jim; Tidwell, Paul H.

    2005-01-01

    This paper outlines the best practices of the Vehicle Design Team for VIPA. The functions of the VIPA Vehicle Design (VVD) discipline team are to maintain the controlled reference geometry and provide linked, simplified geometry for each of the other discipline analyses. The core of the VVD work, and the approach for VVD's first task of controlling the reference geometry, involves systems engineering, top-down, layout-based CAD modeling within a Product Data Manager (PDM) development environment. The top-down approach allows for simple control of very large, integrated assemblies and greatly enhances the ability to generate trade configurations and reuse data. The second VVD task, model simplification for analysis, is handled within the managed environment through application of the master model concept. In this approach, there is a single controlling, or master, product definition dataset. Connected to this master model are reference datasets with live geometric and expression links. The referenced models can be for drawings, manufacturing, visualization, embedded analysis, or analysis simplification. A discussion of web-based interaction, including visualization, between the design and other disciplines is included. Demonstrated examples are cited, including the Space Launch Initiative development cycle, the Saturn V systems integration and verification cycle, an Orbital Space Plane study, and NASA Exploration Office studies of Shuttle-derived and clean-sheet launch vehicles. The VIPA Team has brought an immense amount of detailed data to bear on program issues. A central piece of that success has been the Managed Development Environment and the VVD Team approach to modeling.

  3. Competitive assessment of aerospace systems using system dynamics

    NASA Astrophysics Data System (ADS)

    Pfaender, Jens Holger

    Aircraft design has recently experienced a trend away from performance-centric design toward a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside of the control of the designer, this traditionally has been modeled as noise affecting the uncertainty of the design. However, this approach currently lacks a strategic treatment of the necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market. Instead, they focus on modeling and predicting market share through logit regression models. The resulting models exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such integration, which had not previously been done, is one of the primary contributions of this work. This integration is achieved through the use of surrogate models, in this case neural networks. This enabled not only the practical integration of analysis techniques, but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250-seat large commercial aircraft market, exemplified by the Boeing 767-400ER and the Airbus A330-200. Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability compared to the conventional logit regression models. An additional advantage of this dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was then integrated into a prediction profiler environment with a time-variant Monte Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-offs between aircraft design decisions and economic considerations while allowing the exploration of potential market success in the light of varying external market conditions and scenarios. The resulting method is capable of reducing decision-support uncertainty and identifying robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. Furthermore, it was possible to demonstrate the increased importance of design and technology choices for competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Another use of the existing outputs of the Monte Carlo analysis was then realized by showing them on a multivariate scatter plot. This plot was then shown, through appropriate grouping of variables, to enable the top-down definition of an aircraft design, also known as inverse design. In other words, this enables the designer to define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.
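
    The dissertation's system dynamics and neural-network surrogate models are not reproduced here; the sketch below only illustrates the general pattern it describes of wrapping a small dynamic market-share model in a Monte Carlo sweep over an uncertain external factor (here, fuel price), with every equation and number invented for illustration.

```python
# Illustrative only: a toy dynamic market-share model driven by relative
# operating cost, swept with Monte Carlo over an uncertain fuel-price path.
# Equations and numbers are invented and do not reproduce the dissertation's models.
import random

def simulate_share(fuel_price_path, adjust_rate=0.3):
    share_a = 0.5
    for price in fuel_price_path:
        cost_a = 1.00 + 1.00 * price       # aircraft A: assumed ownership + fuel cost
        cost_b = 0.95 + 1.08 * price       # aircraft B: cheaper to own, burns more fuel
        target = cost_b / (cost_a + cost_b)            # lower cost attracts more orders
        share_a += adjust_rate * (target - share_a)    # first-order stock adjustment
    return share_a

if __name__ == "__main__":
    rng = random.Random(42)
    finals = []
    for _ in range(500):                               # Monte Carlo over price scenarios
        price, path = 1.0, []
        for _ in range(20):                            # 20 periods of a random-walk price
            price = max(0.2, price * (1.0 + rng.gauss(0.02, 0.10)))
            path.append(price)
        finals.append(simulate_share(path))
    finals.sort()
    print(f"median final share of aircraft A: {finals[250]:.3f}, "
          f"5th-95th percentile: {finals[25]:.3f}-{finals[475]:.3f}")
```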

  4. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    PubMed

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified in their garden budgets and purposeful in that each of the schools' gardens were determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to operationalize school gardening components and describe an evidence-based strategy of successful school garden integration. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
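
    The study's concept-mapping data are not available here; the sketch below only illustrates the analysis pattern named above (multidimensional scaling of a sorting-based similarity matrix followed by hierarchical clustering), using a small invented matrix and the scikit-learn and SciPy implementations of those methods.

```python
# Illustrative concept-mapping analysis: multidimensional scaling of an
# invented similarity matrix, then hierarchical clustering of the coordinates.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.manifold import MDS

# Invented: fraction of participants who sorted each pair of statements together.
similarity = np.array([
    [1.0, 0.8, 0.7, 0.1, 0.2],
    [0.8, 1.0, 0.6, 0.2, 0.1],
    [0.7, 0.6, 1.0, 0.3, 0.2],
    [0.1, 0.2, 0.3, 1.0, 0.9],
    [0.2, 0.1, 0.2, 0.9, 1.0],
])
distance = 1.0 - similarity

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)
clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")
print("2-D statement coordinates:\n", np.round(coords, 2))
print("cluster assignment:", clusters)
```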

  5. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts is accomplished in a user-interactive computing environment.

  6. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  7. Ground Systems Development Environment (GSDE) interface requirements analysis

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Philips, John; Hartenstein, Ray; Bassman, Mitchell; Ruskin, Leslie; Perez-Davila, Alfredo

    1991-01-01

    A set of procedural and functional requirements are presented for the interface between software development environments and software integration and test systems used for space station ground systems software. The requirements focus on the need for centralized configuration management of software as it is transitioned from development to formal, target based testing. This concludes the GSDE Interface Requirements study. A summary is presented of findings concerning the interface itself, possible interface and prototyping directions for further study, and results of the investigation of the Cronus distributed applications environment.

  8. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-109

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  9. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-110

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  10. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-105

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  11. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-104

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  12. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-108

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  13. MULTIMEDIA ENVIRONMENTAL DISTRIBUTION OF TOXICS (MEND-TOX): PART II, SOFTWARE IMPLEMENTATION AND CASE STUDIES

    EPA Science Inventory

    An integrated hybrid spatial-compartmental simulator is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass distribu...

  14. MULTIMEDIA ENVIRONMENTAL DISTRIBUTION OF TOXICS (MEND-TOX): PART I, HYBRID COMPARTMENTAL-SPATIAL MODELING FRAMEWORK

    EPA Science Inventory

    An integrated hybrid spatial-compartmental modeling approach is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass ...

  15. Software technology testbed softpanel prototype

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; and analysis of the simulation architecture.

  16. The component-based architecture of the HELIOS medical software engineering environment.

    PubMed

    Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C

    1994-12-01

    The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment, devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components, federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software re-usability, which are considered crucial steps towards the development of more reliable, coherent and integrated applications.
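
    Details of the HELIOS Unification Bus are not given in the abstract; the sketch below only illustrates the general pattern of federating loosely coupled components over a shared message channel, and every class, topic, and message name in it is invented rather than part of the HELIOS API.

```python
# Illustrative component-bus pattern (names invented, not the HELIOS API):
# components register handlers on a shared bus and exchange messages by topic.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._handlers[topic]:
            handler(payload)

class ImageService:
    """Stands in for an image-related service federated on the bus."""
    def __init__(self, bus):
        self._bus = bus
        bus.subscribe("image.requested", self.on_request)

    def on_request(self, payload):
        # A real service would fetch and decode the study; here we just reply.
        self._bus.publish("image.ready", {"study_id": payload["study_id"], "pixels": "..."})

if __name__ == "__main__":
    bus = MessageBus()
    ImageService(bus)
    bus.subscribe("image.ready", lambda msg: print("viewer received", msg["study_id"]))
    bus.publish("image.requested", {"study_id": "CT-001"})
```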

  17. Analysis on the overpressure characterization with respect to depositional environment facies: Case Study in Miri Area, Baram Delta

    NASA Astrophysics Data System (ADS)

    Mhd Hanapiah, N.; Yusoff, W. I. Wan; Zakariah, M. N. A.

    2017-10-01

    Overpressure studies in oil and gas exploration and production are carried out in order to mitigate losses that could occur while drilling. These concerns can be addressed by enhancing the understanding of overpressure characterization in the fields. This research emphasizes determining the pore pressure trend in the Miri area to assist pore pressure prediction for future hydrocarbon exploration and production. Generally, pore pressure trends are related to the mechanisms that contribute to overpressure generation. In the region, the predominant overpressure mechanism is disequilibrium compaction within the prodelta shales, whereas in the outer shelf, overpressure generation is controlled by fluid expansion in the deltaic sequences of the inner shelf area. The objective of this research is to analyze the pore pressure profiles of wells in order to determine vertical pore pressure trends for the various depositional environment facies of the Miri area. Rock physics and pore pressure analysis are integrated, and the trends are related to depositional environment facies within the shale intervals underlying sand. The analysis shows that the top of overpressure is characterized by the depositional environment facies within the shale intervals underlying sand.
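
    The specific prediction workflow used in the study is not given in the abstract; as a generic illustration of trend-based pore pressure prediction, the sketch below applies Eaton's sonic method (a standard technique, not attributed to the authors) against an assumed normal compaction trend, with every gradient and transit-time value invented.

```python
# Illustrative Eaton sonic method (a standard technique, not the study's workflow):
#   Pp = S - (S - Ph) * (dt_normal / dt_observed) ** n
# where S is overburden stress, Ph hydrostatic pressure, and dt sonic transit time.
def eaton_pore_pressure(depth_m, dt_observed, dt_normal, n=3.0,
                        overburden_grad=0.0226, hydro_grad=0.0105):  # MPa/m, assumed
    s = overburden_grad * depth_m
    ph = hydro_grad * depth_m
    return s - (s - ph) * (dt_normal / dt_observed) ** n

if __name__ == "__main__":
    # Invented example: at 2,500 m the observed transit time (120 us/ft) sits well
    # above an assumed normal-compaction value of 90 us/ft, indicating overpressure.
    pp = eaton_pore_pressure(2500.0, dt_observed=120.0, dt_normal=90.0)
    print(f"predicted pore pressure = {pp:.1f} MPa "
          f"(hydrostatic would be {0.0105 * 2500.0:.1f} MPa)")
```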

  18. Agile Combat Support Doctrine and Logistics Officer Training: Do We Need an Integrated Logistics School for the Expeditionary Air and Space Force?

    DTIC Science & Technology

    2003-02-01

    Rank-Order Correlation Coefficients statistical analysis via SPSS 8.0. Interview informants’ perceptions and perspectives are combined with...logistics training in facilitating the employment of doctrinal tenets in a deployed environment. Statistical Correlations: Confirmed Relationships...integration of technology and cross-functional training for the tactical practitioners. Statistical Correlations: Confirmed Relationships on the Need

  19. The use of COSMIC NASTRAN in an integrated conceptual design environment

    NASA Technical Reports Server (NTRS)

    White, Gil

    1989-01-01

    Changes in both software and hardware are rapidly bringing conceptual engineering tools like finite element analysis into mainstream mechanical design. Systems that integrate all phases of the manufacturing process provide the most cost benefits. The application of programming concepts like object-oriented programming allows for the encapsulation of intelligent data within the design geometry. This, combined with the declining cost of per-seat hardware, brings new alternatives to the user.

  20. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    1999-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  1. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Technical Reports Server (NTRS)

    Monell, Donald W.; Piland, William M.

    2000-01-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to an inability to assess critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  2. Aerospace Systems Design in NASA's Collaborative Engineering Environment

    NASA Astrophysics Data System (ADS)

    Monell, Donald W.; Piland, William M.

    2000-07-01

    Past designs of complex aerospace systems involved an environment consisting of collocated design teams with project managers, technical discipline experts, and other experts (e.g., manufacturing and systems operations). These experts were generally qualified only on the basis of past design experience and typically had access to a limited set of integrated analysis tools. These environments provided less than desirable design fidelity, often led to the inability of assessing critical programmatic and technical issues (e.g., cost, risk, technical impacts), and generally derived a design that was not necessarily optimized across the entire system. The continually changing, modern aerospace industry demands systems design processes that involve the best talent available (no matter where it resides) and access to the best design and analysis tools. A solution to these demands involves a design environment referred to as collaborative engineering. The collaborative engineering environment evolving within the National Aeronautics and Space Administration (NASA) is a capability that enables the Agency's engineering infrastructure to interact and use the best state-of-the-art tools and data across organizational boundaries. Using collaborative engineering, the collocated team is replaced with an interactive team structure where the team members are geographically distributed and the best engineering talent can be applied to the design effort regardless of physical location. In addition, a more efficient, higher quality design product is delivered by bringing together the best engineering talent with more up-to-date design and analysis tools. These tools are focused on interactive, multidisciplinary design and analysis with emphasis on the complete life cycle of the system, and they include nontraditional, integrated tools for life cycle cost estimation and risk assessment. NASA has made substantial progress during the last two years in developing a collaborative engineering environment. NASA is planning to use this collaborative engineering infrastructure to provide better aerospace systems life cycle design and analysis, which includes analytical assessment of the technical and programmatic aspects of a system from "cradle to grave." This paper describes the recent NASA developments in the area of collaborative engineering, the benefits (realized and anticipated) of using the developed capability, and the long-term plans for implementing this capability across the Agency.

  3. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    PubMed

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
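
    As a rough conceptual sketch of the kind of integrative analysis DR-Integrator performs (not the actual DRI algorithm, which uses its own statistics and multiple-testing procedure), the fragment below correlates paired copy number and expression values gene by gene on synthetic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic paired data: rows = genes, columns = tumour samples.
n_genes, n_samples = 500, 40
copy_number = rng.normal(0.0, 0.5, size=(n_genes, n_samples))
expression = 0.8 * copy_number + rng.normal(0.0, 1.0, size=(n_genes, n_samples))

# Per-gene Pearson correlation between DNA copy number and gene expression.
results = []
for g in range(n_genes):
    r, p = stats.pearsonr(copy_number[g], expression[g])
    results.append((g, r, p))

# Crude multiple-testing control: Bonferroni threshold on the p-values.
alpha = 0.05 / n_genes
hits = [(g, r) for g, r, p in results if p < alpha]
print(f"{len(hits)} genes with dosage-concordant expression (Bonferroni alpha = {alpha:.2e})")
```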

  4. AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.
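
    The loose-coupling strategy described above, stand-alone flow and structural solvers exchanging data at synchronization points until the coupled solution settles, can be illustrated with a toy fixed-point loop. The solver functions here are simple stand-ins, not the NASA or industry codes referenced in the abstract.

```python
import numpy as np


def fluid_solver(displacement: np.ndarray) -> np.ndarray:
    """Stand-in aerodynamic module: pressure load depends on the current shape."""
    return 1.0e3 * (1.0 + 0.1 * displacement)  # N/m^2, toy model


def structure_solver(pressure: np.ndarray, stiffness: float = 5.0e4) -> np.ndarray:
    """Stand-in structural module: static deflection under the pressure load."""
    return pressure / stiffness  # m, toy model


def coupled_iteration(n_nodes: int = 50, tol: float = 1e-8, max_iter: int = 100):
    disp = np.zeros(n_nodes)
    for it in range(max_iter):
        pressure = fluid_solver(disp)          # synchronized transfer: shape -> loads
        new_disp = structure_solver(pressure)  # synchronized transfer: loads -> shape
        if np.max(np.abs(new_disp - disp)) < tol:
            return new_disp, it + 1
        disp = 0.5 * disp + 0.5 * new_disp     # under-relaxation keeps the staggered loop stable
    return disp, max_iter


if __name__ == "__main__":
    disp, iters = coupled_iteration()
    print(f"converged after {iters} coupling iterations, max deflection = {disp.max():.4e} m")
```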

  5. Integrative Application of Life Cycle Assessment and Risk Assessment to Environmental Impacts of Anthropogenic Pollutants at a Watershed Scale.

    PubMed

    Lin, Xiaodan; Yu, Shen; Ma, Hwongwen

    2018-01-01

    Intense human activities have led to increasing deterioration of the watershed environment via pollutant discharge, which threatens human health and ecosystem function. To meet the need for comprehensive environmental impact/risk assessment supporting sustainable watershed development, this study proposes a biogeochemical process-based integration of life cycle assessment and risk assessment (RA) for pollutants, aided by a geographic information system. The integration frames a conceptual protocol of "watershed life cycle assessment (WLCA) for pollutants". The proposed WLCA protocol consists of (1) geographic and environmental characterization mapping; (2) life cycle inventory analysis; (3) integration of life-cycle impact assessment (LCIA) with RA via the characterization factor of the pollutant of interest; and (4) result analysis and interpretation. The WLCA protocol can visualize LCIA and RA results spatially for the pollutants of interest, which may be useful to decision and policy makers in mitigating the impacts of watershed development.

  6. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automated Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.

  7. Landscape maps as an aid to management of scenic mountain areas

    Treesearch

    Roland Baumgartner

    1979-01-01

    Before any question about wise management decisions concerning the visual resource of our environment can be answered, it is necessary to conduct a detailed analysis to determine the integral visual inventory of landscape, as it impresses any involved person. With this method of landscape analysis researchers and planners can specify the potential of any region with an...

  8. Proteometabolomic response of Deinococcus radiodurans exposed to UVC and vacuum conditions: Initial studies prior to the Tanpopo space mission.

    PubMed

    Ott, Emanuel; Kawaguchi, Yuko; Kölbl, Denise; Chaturvedi, Palak; Nakagawa, Kazumichi; Yamagishi, Akihiko; Weckwerth, Wolfram; Milojevic, Tetyana

    2017-01-01

    The multiple extremes resistant bacterium Deinococcus radiodurans is able to withstand harsh conditions of simulated outer space environment. The Tanpopo orbital mission performs a long-term space exposure of D. radiodurans aiming to investigate the possibility of interplanetary transfer of life. The revealing of molecular machinery responsible for survivability of D. radiodurans in the outer space environment can improve our understanding of underlying stress response mechanisms. In this paper, we have evaluated the molecular response of D. radiodurans after the exposure to space-related conditions of UVC irradiation and vacuum. Notably, scanning electron microscopy investigations showed that neither morphology nor cellular integrity of irradiated cells was affected, while integrated proteomic and metabolomic analysis revealed numerous molecular alterations in metabolic and stress response pathways. Several molecular key mechanisms of D. radiodurans, including the tricarboxylic acid cycle, the DNA damage response systems, ROS scavenging systems and transcriptional regulators responded in order to cope with the stressful situation caused by UVC irradiation under vacuum conditions. These results reveal the effectiveness of the integrative proteometabolomic approach as a tool in molecular analysis of microbial stress response caused by space-related factors.

  9. Proteometabolomic response of Deinococcus radiodurans exposed to UVC and vacuum conditions: Initial studies prior to the Tanpopo space mission

    PubMed Central

    Ott, Emanuel; Kawaguchi, Yuko; Kölbl, Denise; Chaturvedi, Palak; Nakagawa, Kazumichi; Yamagishi, Akihiko; Weckwerth, Wolfram

    2017-01-01

    The multiple extremes resistant bacterium Deinococcus radiodurans is able to withstand harsh conditions of simulated outer space environment. The Tanpopo orbital mission performs a long-term space exposure of D. radiodurans aiming to investigate the possibility of interplanetary transfer of life. The revealing of molecular machinery responsible for survivability of D. radiodurans in the outer space environment can improve our understanding of underlying stress response mechanisms. In this paper, we have evaluated the molecular response of D. radiodurans after the exposure to space-related conditions of UVC irradiation and vacuum. Notably, scanning electron microscopy investigations showed that neither morphology nor cellular integrity of irradiated cells was affected, while integrated proteomic and metabolomic analysis revealed numerous molecular alterations in metabolic and stress response pathways. Several molecular key mechanisms of D. radiodurans, including the tricarboxylic acid cycle, the DNA damage response systems, ROS scavenging systems and transcriptional regulators responded in order to cope with the stressful situation caused by UVC irradiation under vacuum conditions. These results reveal the effectiveness of the integrative proteometabolomic approach as a tool in molecular analysis of microbial stress response caused by space-related factors. PMID:29244852

  10. EVALUATING LANDSCAPE CHANGE AND HYDROLOGICAL CONSEQUENCES IN A SEMI-ARID ENVIRONMENT

    EPA Science Inventory

    During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial analysis technologies have been used to better understand the distribution of natural communities and ecosystems, and the ecological processes that affect these ...

  11. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.
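
    For readers unfamiliar with the metrics named above, the following minimal sketch shows how steady-state availability and mission reliability are commonly computed from failure and repair statistics. It is a generic illustration with made-up numbers, not part of the LMI-developed tool.

```python
import math


def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability from mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)


def mission_reliability(failure_rate_per_hour: float, mission_hours: float) -> float:
    """Probability of completing a mission without failure, assuming an exponential failure model."""
    return math.exp(-failure_rate_per_hour * mission_hours)


if __name__ == "__main__":
    # Hypothetical numbers for a synthetic-vision display channel.
    print(f"availability       = {availability(mtbf_hours=5000, mttr_hours=2):.6f}")
    print(f"5-hour reliability = {mission_reliability(1 / 5000, 5):.6f}")
```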

  12. QuakeSim 2.0

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant

    2012-01-01

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually, and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.
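
    The abstract notes that UAVSAR products in QuakeTables are exposed through browseable map and REST interfaces. A generic sketch of querying such a REST service from Python is shown below; the endpoint URL, query parameters, and field names are hypothetical placeholders, not the documented QuakeTables API.

```python
import requests

# Hypothetical endpoint and parameters, for illustration only; consult the
# QuakeSim/QuakeTables documentation for the real REST interface.
BASE_URL = "https://example.org/quaketables/api/interferograms"


def fetch_interferograms(fault_name: str, limit: int = 10) -> list[dict]:
    """Request a list of interferogram records intersecting a named fault."""
    response = requests.get(
        BASE_URL,
        params={"fault": fault_name, "limit": limit},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    for record in fetch_interferograms("San Andreas"):
        print(record.get("id"), record.get("acquisition_date"))
```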

  13. Bringing numerous methods for expression and promoter analysis to a public cloud computing service.

    PubMed

    Polanski, Krzysztof; Gao, Bo; Mason, Sam A; Brown, Paul; Ott, Sascha; Denby, Katherine J; Wild, David L

    2018-03-01

    Every year, a large number of novel algorithms are introduced to the scientific community for a myriad of applications, but using these across different research groups is often troublesome, due to suboptimal implementations and specific dependency requirements. This does not have to be the case, as public cloud computing services can easily house tractable implementations within self-contained dependency environments, making the methods easily accessible to a wider public. We have taken 14 popular methods, the majority related to expression data or promoter analysis, developed these up to a good implementation standard and housed the tools in isolated Docker containers which we integrated into the CyVerse Discovery Environment, making these easily usable for a wide community as part of the CyVerse UK project. The integrated apps can be found at http://www.cyverse.org/discovery-environment, while the raw code is available at https://github.com/cyversewarwick and the corresponding Docker images are housed at https://hub.docker.com/r/cyversewarwick/. info@cyverse.warwick.ac.uk or D.L.Wild@warwick.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  14. Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework

    NASA Astrophysics Data System (ADS)

    Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao

    2016-09-01

    Most previous studies on the vibration ride comfort of the human-vehicle system focused on only one or two aspects of the investigation. A hybrid approach is described that integrates several investigation methods across a real environment and a virtual environment. The real experimental environment includes the WBV (whole-body vibration) test, questionnaires for human subjective sensation, and motion capture. The virtual experimental environment includes a theoretical calculation on a simplified 5-DOF human body vibration model, vibration simulation and analysis within the ADAMS/Vibration module, and digital human biomechanics and occupational health analysis in Jack software. The real experimental environment provides realistic and accurate test results and also serves as the core of, and validation for, the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate sitting posture comfort in a human-vehicle system with various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits into the Industry 4.0 framework is also proposed.
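
    The lumped-parameter modelling step mentioned above can be illustrated with a reduced example: the paper uses a 5-DOF human body vibration model, whereas the sketch below integrates a single-DOF seat-occupant model under sinusoidal base excitation, with purely illustrative parameter values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative single-DOF seat-occupant parameters (not the paper's 5-DOF values).
m, c, k = 60.0, 800.0, 4.0e4   # kg, N*s/m, N/m
amp, freq = 0.005, 4.0         # base excitation: 5 mm amplitude at 4 Hz


def base_motion(t):
    """Base (floor) displacement and velocity at time t."""
    w = 2.0 * np.pi * freq
    return amp * np.sin(w * t), amp * w * np.cos(w * t)


def rhs(t, y):
    x, v = y
    xb, vb = base_motion(t)
    # Occupant mass driven through seat stiffness and damping by the base motion.
    a = (c * (vb - v) + k * (xb - x)) / m
    return [v, a]


sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], max_step=1e-3)
x, v = sol.y
xb, vb = base_motion(sol.t)
accel = (c * (vb - v) + k * (xb - x)) / m
print(f"peak occupant acceleration (steady state) ~ {np.max(np.abs(accel[sol.t > 5.0])):.2f} m/s^2")
```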

  15. Socio-Spatial Integration of Older Adults in Four Types of Residential Environments in Israel.

    PubMed

    Schorr, Adi Vitman; Iecovich, Esther; Alfasi, Nurit; Shamai, Shmuel

    2017-10-01

    The socio-spatial integration of older people in different types of residential environments is a key factor affecting the well-being of older people. This study, which included a convenience sample of 565 participants, examined the socio-spatial integration of older people living in two different regional areas (central and peripheral) and four different residential environments (metropolitan hub, city, and town and rural settlements) in Israel. Willing participants were asked to complete a self-administered questionnaire. Socio-spatial integration was assessed by recognition of their neighbors and sense of belonging to the residential environment. The findings show that older adults who resided in the town and in rural settlements were more socio-spatially integrated in their living environments compared with their counterparts who resided in cities. The best predictors of socio-spatial integration were a combination of personal characteristics and characteristics of the environment (perceived accessibility) except for rural settlements, where none of the variables predicted socio-spatial integration.

  16. Modeling and dynamic environment analysis technology for spacecraft

    NASA Astrophysics Data System (ADS)

    Fang, Ren; Zhaohong, Qin; Zhong, Zhang; Zhenhao, Liu; Kai, Yuan; Long, Wei

    Spacecraft sustain complex and severe vibration and acoustic environments during flight. Predicting the resulting structural dynamic environments, including numerical prediction of fluctuating pressure, model updating, and random vibration and acoustic analysis, plays an important role in the design, manufacture, and ground testing of spacecraft. In this paper, Monotonically Integrated Large Eddy Simulation (MILES) is introduced to predict the fluctuating pressure on the fairing. The detailed flow structures on the fairing wall surface at different Mach numbers are obtained, and a spacecraft model is then constructed using the finite element method (FEM). The model is updated by the penalty method according to the modal test data. On this basis, the random vibration and acoustic responses of the fairing and satellite are analyzed by different methods. The simulated results agree well with the experimental ones, which demonstrates the validity of the modeling and dynamic environment analysis technology. This information can better support test planning, the definition of test conditions, and the design of optimal structures.

  17. Airspace Systems Program: Next Generation Air Transportation System, NextGen Systems Analysis, Integration and Evaluation Project. Version 1.0; Project Plan

    NASA Technical Reports Server (NTRS)

    Quon, Leighton

    2010-01-01

    The key objectives of the NASA ASP are to: Improve mobility, capacity, efficiency, and access of the airspace system. Improve collaboration, predictability, and flexibility for the airspace users. Enable accurate modeling and simulation of air transportation systems. Accommodate operations of all classes of aircraft. Maintain system safety and environmental protection. In support of these program objectives, the major goal of the NextGen-SAIE Project is to enable the transition of key capacity and efficiency improvements to the NAS. Since many aspects of the NAS are unique to specific airport or airspace environments, demand on various parts of the NAS is not expected to increase equally as system demand grows. SAIE will provide systems-level analysis of the NAS characteristics, constraints, and demands such that a suite of capacity-increasing concepts and technologies for system solutions is enabled and facilitated. The technical objectives in support of this goal are the following: Integration, evaluation, and transition of more mature concepts and technologies in an environment that faithfully emulates real-world complexities. Interoperability research and analysis of ASP technologies across ATM functions is performed to facilitate integration and take ASP concepts and technologies to higher Technology Readiness Levels (TRLs). Analyses are conducted on the program's concepts to identify the system benefits or impacts. System-level analysis is conducted to increase understanding of the characteristics and constraints of the airspace system and its domains.

  18. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    DTIC Science & Technology

    2017-12-01

    Thesis by Lieutenant Justin K. Davis on the development of a system architecture to investigate the impact of integrated air and missile defense in a distributed lethality environment.

  19. Web-Based Integrated Research Environment for Aerodynamic Analyses and Design

    NASA Astrophysics Data System (ADS)

    Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won

    e-AIRS [1,2], an abbreviation of 'e-Science Aerospace Integrated Research System,' is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give full support to the aerodynamic research process. Currently, e-AIRS can handle both computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request wind tunnel experiments, perform comparative analyses between computational predictions and experimental measurements, and finally collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.

  20. Scene analysis in the natural environment

    PubMed Central

    Lewicki, Michael S.; Olshausen, Bruno A.; Surlykke, Annemarie; Moss, Cynthia F.

    2014-01-01

    The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to important insights into problems of scene analysis, but not all of these insights are widely appreciated, and there remain critical shortcomings in current approaches that hinder further progress. Here we take the view that scene analysis is a universal problem solved by all animals, and that we can gain new insight by studying the problems that animals face in complex natural environments. In particular, the jumping spider, songbird, echolocating bat, and electric fish, all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: (1) the ability to solve ill-posed problems, (2) the ability to integrate and store information across time and modality, (3) efficient recovery and representation of 3D scene structure, and (4) the use of optimal motor actions for acquiring information to progress toward behavioral goals. PMID:24744740

  1. Aeromechanics and man-machine integration technology opportunities for rotorcraft of the 1990s and beyond

    NASA Technical Reports Server (NTRS)

    Kerr, Andrew W.

    1989-01-01

    Programs related to rotorcraft aeromechanics and man-machine integration are discussed which will support advanced army rotorcraft design. In aeromechanics, recent advances in computational fluid dynamics will be used to characterize the complex unsteady flowfields of rotorcraft, and a second-generation comprehensive helicopter analysis system will be used along with models of aerodynamics, engines, and control systems to study the structural dynamics of rotor/body configurations. The man-machine integration program includes the development of advanced cockpit design technology and the evaluation of cockpit and mission equipment concepts in a real-time full-combat environment.

  2. Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Plantenga, Todd D.

    2010-06-01

    The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.

  3. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    PubMed

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface-based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.
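
    The core idea behind a vendor-neutral sequence description, an ordered list of timed RF, gradient, and ADC blocks serialized to a file that a scanner-side interpreter replays, can be sketched schematically as follows. This toy example does not reproduce the actual Pulseq file format or the Pulseq-GPI interface; all block names, fields, and timings are invented for illustration.

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class Block:
    """One timed event block in a toy sequence description (durations in ms)."""
    kind: str          # "rf", "gradient", or "adc"
    duration_ms: float
    params: dict


def gradient_echo_sequence(tr_ms: float = 20.0, te_ms: float = 5.0) -> list[Block]:
    """Assemble a schematic gradient-echo repetition as an ordered block list."""
    return [
        Block("rf", 1.0, {"flip_angle_deg": 15.0, "shape": "sinc"}),
        Block("gradient", te_ms - 1.0, {"axis": "x", "purpose": "readout prephase"}),
        Block("adc", 4.0, {"samples": 256}),
        Block("gradient", tr_ms - te_ms - 4.0, {"axis": "z", "purpose": "spoiler"}),
    ]


if __name__ == "__main__":
    seq = gradient_echo_sequence()
    # Serialize to a text file that a (hypothetical) interpreter could replay.
    with open("toy_sequence.json", "w") as fh:
        json.dump([asdict(b) for b in seq], fh, indent=2)
    print(f"wrote {len(seq)} blocks, total {sum(b.duration_ms for b in seq):.1f} ms")
```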

  4. Towards an integral computer environment supporting system operations analysis and conceptual design

    NASA Technical Reports Server (NTRS)

    Barro, E.; Delbufalo, A.; Rossi, F.

    1994-01-01

    VITROCISET has in house developed a prototype tool named System Dynamic Analysis Environment (SDAE) to support system engineering activities in the initial definition phase of a complex space system. The SDAE goal is to provide powerful means for the definition, analysis, and trade-off of operations and design concepts for the space and ground elements involved in a mission. For this purpose SDAE implements a dedicated modeling methodology based on the integration of different modern (static and dynamic) analysis and simulation techniques. The resulting 'system model' is capable of representing all the operational, functional, and behavioral aspects of the system elements which are part of a mission. The execution of customized model simulations enables: the validation of selected concepts with respect to mission requirements; the in-depth investigation of mission specific operational and/or architectural aspects; and the early assessment of performances required by the system elements to cope with mission constraints and objectives. Due to its characteristics, SDAE is particularly tailored for nonconventional or highly complex systems, which require a great analysis effort in their early definition stages. SDAE runs under PC-Windows and is currently used by VITROCISET system engineering group. This paper describes the SDAE main features, showing some tool output examples.

  5. Applications of the pipeline environment for visual informatics and genomics computations

    PubMed Central

    2011-01-01

    Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, user with or without access to advanced computational-resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102

  6. Development of a Multi-Disciplinary Computing Environment (MDICE)

    NASA Technical Reports Server (NTRS)

    Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.

    1999-01-01

    The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.

  7. EPOS Thematic Core Service Anthropogenic Hazards: Implementation Plan

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw; Grasso, Jean Robert; Schmittbuhl, Jean; Styles, Peter; Kwiatek, Grzegorz; Sterzel, Mariusz; Garcia, Alexander

    2015-04-01

    EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to integrate distributed research infrastructures (RI) to facilitate and stimulate research on anthropogenic hazards (AH) especially those associated with the exploration and exploitation of geo-resources. The innovative element is the uniqueness of the integrated RI which comprises two main deliverables: (1) Exceptional datasets, called "episodes", which comprehensively describe a geophysical process; induced or triggered by human technological activity, posing hazard for populations, infrastructure and the environment, (2) Problem-oriented, bespoke services uniquely designed for the discrimination and analysis of correlations between technology, geophysical response and resulting hazard. These objectives will be achieved through the Science-Industry Synergy (SIS) built by EPOS WG10, ensuring bi-directional information exchange, including unique and previously unavailable data furnished by industrial partners. The Episodes and services to be integrated have been selected using strict criteria during the EPOS PP. The data are related to a wide spectrum of inducing technologies, with seismic/aseismic deformation and production history as a minimum data set requirement and the quality of software services is confirmed and referenced in literature. Implementation of TCS AH is planned for four years and requires five major activities: (1) Strategic Activities and Governance: will define and establish the governance structure to ensure the long-term sustainability of these research infrastructures for data provision through EPOS. (2) Coordination and Interaction with the Community: will establish robust communication channels within the whole TCS AH community while supporting global EPOS communication strategy. (3) Interoperability with EPOS Integrated Core Service (ICS) and Testing Activities: will coordinate and ensure interoperability between the RIs and the ICS. Within this modality a functional e-research environment with access to High-Performance Computing will be built. A prototype for such an environment is already under construction and will become operational in mid -2015 (is-epos.eu). (4) Integration of AH Episodes: will address at least 20 global episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production which will be integrated into the e-environment of TCS AH. All the multi-disciplinary heterogeneous data from these particular episodes will be transformed to unified structures to form integrated data sets articulated with the defined standards of ICS and other TCS's. (5) Implementation of services for analyzing Episodes: will deliver the protocols and methodologies for analysis of the seismic/deformation response to time-varying georesource exploitation technologies on long and short time scales and the related time- and technology-dependent seismic hazard issues.

  8. An integrated genetic linkage map and comparative genome analysis for the estuarine Atlantic killifish, Fundulus heteroclitus

    USDA-ARS?s Scientific Manuscript database

    Background: Fundulus heteroclitus (Atlantic killifish), a non-migratory estuarine fish, exhibits high allelic and phenotypic diversity, partitioned among subpopulations that reside in disparate environmental conditions. An ideal candidate model organism for studying gene-environment reactions, th...

  9. An inexpensive, temporally-integrated system for monitoring occurrence and biological effects of contaminants in the field

    EPA Science Inventory

    Assessing potential biological impacts of complex mixtures of contaminants in aquatic environments is an ongoing challenge for ecotoxicologists. Instrumental analysis of site waters alone can identify contaminants but provides only limited insights as to possible adverse effects...

  10. Technology in HRD.

    ERIC Educational Resources Information Center

    1998

    This document contains four papers from a symposium on technology in human resource development (HRD). "COBRA, an Electronic Performance Support System for the Analysis of Jobs and Tasks" (Theo J. Bastiaens) is described as an integrated computerized environment that provides tools, information, advice, and training to help employees do…

  11. Systems Analysis Initiated for All-Electric Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Kohout, Lisa L.

    2003-01-01

    A multidisciplinary effort is underway at the NASA Glenn Research Center to develop concepts for revolutionary, nontraditional fuel cell power and propulsion systems for aircraft applications. There is a growing interest in the use of fuel cells as a power source for electric propulsion as well as an auxiliary power unit to substantially reduce or eliminate environmentally harmful emissions. A systems analysis effort was initiated to assess potential concepts in an effort to identify those configurations with the highest payoff potential. Among the technologies under consideration are advanced proton exchange membrane (PEM) and solid oxide fuel cells, alternative fuels and fuel processing, and fuel storage. Prior to this effort, the majority of fuel cell analysis done at Glenn was done for space applications. Because of this, a new suite of models was developed. These models include the hydrogen-air PEM fuel cell; internal reforming solid oxide fuel cell; balance-of-plant components (compressor, humidifier, separator, and heat exchangers); compressed gas, cryogenic, and liquid fuel storage tanks; and gas turbine/generator models for hybrid system applications. Initial mass, volume, and performance estimates of a variety of PEM systems operating on hydrogen and reformate have been completed for a baseline general aviation aircraft. Solid oxide/turbine hybrid systems are being analyzed. In conjunction with the analysis efforts, a joint effort has been initiated with Glenn's Computer Services Division to integrate fuel cell stack and component models with the visualization environment that supports the GRUVE lab, Glenn's virtual reality facility. The objective of this work is to provide an environment to assist engineers in the integration of fuel cell propulsion systems into aircraft and provide a better understanding of the interaction between system components and the resulting effect on the overall design and performance of the aircraft. Initially, three-dimensional computer-aided design (CAD) models of representative PEM fuel cell stack and components were developed and integrated into the virtual reality environment along with an Excel-based model used to calculate fuel cell electrical performance on the basis of cell dimensions (see the figure). CAD models of a representative general aviation aircraft were also developed and added to the environment. With the use of special headgear, users will be able to virtually manipulate the fuel cell's physical characteristics and its placement within the aircraft while receiving information on the resultant fuel cell output power and performance. As the systems analysis effort progresses, we will add more component models to the GRUVE environment to help us more fully understand the effect of various system configurations on the aircraft.
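
    The Excel-based performance model mentioned above computes fuel cell electrical output from cell dimensions. A comparable back-of-the-envelope calculation, shown here only as an illustration with assumed operating-point values (not the Glenn model), scales a single-cell polarization point up to stack power and mass.

```python
def stack_performance(active_area_cm2: float,
                      n_cells: int,
                      current_density_a_cm2: float = 0.6,
                      cell_voltage_v: float = 0.65,
                      area_specific_mass_kg_cm2: float = 0.004) -> dict:
    """Rough PEM stack estimate from per-cell dimensions and an assumed operating point."""
    current_a = current_density_a_cm2 * active_area_cm2
    power_w = n_cells * cell_voltage_v * current_a
    mass_kg = n_cells * area_specific_mass_kg_cm2 * active_area_cm2
    return {
        "stack_power_kW": power_w / 1e3,
        "stack_voltage_V": n_cells * cell_voltage_v,
        "stack_mass_kg": mass_kg,
        "specific_power_W_per_kg": power_w / mass_kg,
    }


if __name__ == "__main__":
    # Hypothetical 300 cm^2, 200-cell stack; all parameter values are illustrative.
    for key, value in stack_performance(300.0, 200).items():
        print(f"{key}: {value:.1f}")
```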

  12. A blended learning approach for teaching computer programming: design for large classes in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Bayu Bati, Tesfaye; Gelderblom, Helene; van Biljon, Judy

    2014-01-01

    The challenge of teaching programming in higher education is complicated by problems associated with large class teaching, a prevalent situation in many developing countries. This paper reports on an investigation into the use of a blended learning approach to teaching and learning of programming in a class of more than 200 students. A course and learning environment was designed by integrating constructivist learning models of Constructive Alignment, Conversational Framework and the Three-Stage Learning Model. Design science research is used for the course redesign and development of the learning environment, and action research is integrated to undertake participatory evaluation of the intervention. The action research involved the Students' Approach to Learning survey, a comparative analysis of students' performance, and qualitative data analysis of data gathered from various sources. The paper makes a theoretical contribution in presenting a design of a blended learning solution for large class teaching of programming grounded in constructivist learning theory and use of free and open source technologies.

  13. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the following clause: Integration of Environment, Safety, and Health Into Work Planning and... danger to the environment or health and safety of employees or the public, the Contracting Officer may...

  14. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the following clause: Integration of Environment, Safety, and Health Into Work Planning and... danger to the environment or health and safety of employees or the public, the Contracting Officer may...

  15. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the following clause: Integration of Environment, Safety, and Health Into Work Planning and... danger to the environment or health and safety of employees or the public, the Contracting Officer may...

  16. 48 CFR 970.5223-1 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Integration of environment, safety, and health into work planning and execution. As prescribed in 970.2303-3(b), insert the following clause: Integration of Environment, Safety, and Health Into Work Planning and... danger to the environment or health and safety of employees or the public, the Contracting Officer may...

  17. A dedicated database system for handling multi-level data in systems biology.

    PubMed

    Pornputtapong, Natapol; Wanichthanarak, Kwanjeera; Nilsson, Avlant; Nookaew, Intawat; Nielsen, Jens

    2014-01-01

    Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that can serve and solve the vital issues in data management and thereby facilitate data integration, modeling and analysis in systems biology within a sole database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific for two sample cases: 1) Detecting the pheromone pathway in protein interaction networks; and 2) Finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system that offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the yeast integrated data clearly demonstrate the value of a sole database environment for systems biology research.
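
    The first sample case, detecting the pheromone pathway in protein interaction networks, can be illustrated generically with a small graph query. The sketch below is not the repository's actual script; it uses a toy, heavily simplified edge list around the yeast pheromone response and extracts a receptor-to-transcription-factor path with networkx.

```python
import networkx as nx

# Toy, highly simplified interaction edges around the yeast pheromone response
# pathway (illustrative only; a real analysis would load a curated interactome).
edges = [
    ("STE2", "STE4"), ("STE4", "STE5"), ("STE5", "STE11"),
    ("STE11", "STE7"), ("STE7", "FUS3"), ("FUS3", "STE12"),
    ("STE4", "STE20"), ("STE20", "STE11"), ("FUS3", "DIG1"),
]
g = nx.Graph(edges)

# "Detect" the pathway as the shortest interaction path from receptor to
# transcription factor, then take the induced subnetwork around that path.
path = nx.shortest_path(g, source="STE2", target="STE12")
subnet = g.subgraph(path)

print("receptor-to-TF path:", " -> ".join(path))
print("induced subnetwork edges:", list(subnet.edges()))
```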

  18. High-performance liquid chromatography separation and intact mass analysis of detergent-solubilized integral membrane proteins

    PubMed Central

    Berridge, Georgina; Chalk, Rod; D’Avanzo, Nazzareno; Dong, Liang; Doyle, Declan; Kim, Jung-In; Xia, Xiaobing; Burgess-Brown, Nicola; deRiso, Antonio; Carpenter, Elisabeth Paula; Gileadi, Opher

    2011-01-01

    We have developed a method for intact mass analysis of detergent-solubilized and purified integral membrane proteins using liquid chromatography–mass spectrometry (LC–MS) with methanol as the organic mobile phase. Membrane proteins and detergents are separated chromatographically during the isocratic stage of the gradient profile from a 150-mm C3 reversed-phase column. The mass accuracy is comparable to standard methods employed for soluble proteins; the sensitivity is 10-fold lower, requiring 0.2–5 μg of protein. The method is also compatible with our standard LC–MS method used for intact mass analysis of soluble proteins and may therefore be applied on a multiuser instrument or in a high-throughput environment. PMID:21093405

  19. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-45

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1992-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center (KSC) Photo/Video Analysis, reports from Johnson Space Center, Marshall Space Flight Center, and Rockwell International-Downey are also included to provide an integrated assessment of each Shuttle mission.

  20. Space Environmental Viewing and Analysis Network (SEVAN)

    NASA Astrophysics Data System (ADS)

    Chilingarian, Ashot

    A network of particle detectors located at middle to low latitudes, SEVAN (Space Environmental Viewing and Analysis Network), aims to improve fundamental research on particle acceleration in the vicinity of the Sun and on the space environment. The new type of particle detector will simultaneously measure changing fluxes of most species of secondary cosmic rays, making it a powerful integrated device for exploring solar modulation effects. The first SEVAN modules are under test operation at the Aragats Space Environmental Center in Armenia, in Bulgaria and in Croatia. We present the first results of SEVAN operation, as well as some characteristics of the detector setup.

  1. [Management, quality of health and occupational safety and hospital organization: is integration possible?].

    PubMed

    Corrao, Carmela Romana Natalina

    2011-01-01

    The evolution of national and European legislation has progressively transformed workplaces into organized environments. Specific models for their management are being proposed, and these should be integrated into general management strategies. In the case of hospitals, this integration must take into account a peculiar organizational complexity in which occupational risk management needs to be integrated with clinical risk management and economic risk management. Resource management should also consider that occupational medicine brings no direct monetary benefit to the organisation, only indirect health benefits in the form of fewer accidents and occupational diseases. A close, simultaneous analysis of current general management systems and of current methods for managing occupational safety and health protection makes it possible to hypothesise an integration between them. In both, top management bears primary responsibility for quality management strategies, and the use of specific documents in the managerial process, such as the risk evaluation document in occupational management and the quality manual in general management, is of paramount importance. Integrated management also aims to pursue a particular kind of quality management, in which ethics and job satisfaction are innovative elements, as established by recent European guidelines, management systems and national legislation.

  2. Visualizing Complex Environments in the Geo- and BioSciences

    NASA Astrophysics Data System (ADS)

    Prabhu, A.; Fox, P. A.; Zhong, H.; Eleish, A.; Ma, X.; Zednik, S.; Morrison, S. M.; Moore, E. K.; Muscente, D.; Meyer, M.; Hazen, R. M.

    2017-12-01

    Earth's living and non-living components have co-evolved for 4 billion years through numerous positive and negative feedbacks. Earth and life scientists have amassed vast amounts of data in diverse fields related to planetary evolution through deep time: mineralogy and petrology, paleobiology and paleontology, paleotectonics and paleomagnetism, geochemistry and geochronology, genomics and proteomics, and more. Integrating the data from these complementary disciplines is very useful in gaining an understanding of the evolution of our planet's environment. The integrated data, however, represent many extremely complex environments. In order to gain insights and make discoveries using these data, it is important to model and visualize these complex environments. As part of work on understanding the "Co-Evolution of Geo and Biospheres using Data Driven Methodologies," we have developed several visualizations to help represent the information stored in the datasets from complementary disciplines. These visualizations include 2D and 3D force-directed Networks, Chord Diagrams, 3D Klee Diagrams, Evolving Network Diagrams, Skyline Diagrams and Tree Diagrams. Combining these visualizations with the results of machine learning and data analysis methods provides a powerful way to discover patterns and relationships about the Earth's past and today's changing environment.
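
    As a hedged illustration of the force-directed network visualizations mentioned above (not the authors' actual tooling), the following minimal Python sketch lays out a small co-occurrence network with networkx's spring (force-directed) layout; the mineral and element edges are invented placeholders.

```python
# Minimal sketch of a force-directed network visualization.
# The mineral/element co-occurrence edges below are invented placeholders,
# not data from the study described above.
import networkx as nx
import matplotlib.pyplot as plt

edges = [
    ("quartz", "Si"), ("quartz", "O"),
    ("pyrite", "Fe"), ("pyrite", "S"),
    ("calcite", "Ca"), ("calcite", "C"), ("calcite", "O"),
]

G = nx.Graph()
G.add_edges_from(edges)

# spring_layout implements a force-directed placement of the nodes.
pos = nx.spring_layout(G, seed=42)

nx.draw_networkx_nodes(G, pos, node_size=600)
nx.draw_networkx_edges(G, pos)
nx.draw_networkx_labels(G, pos, font_size=8)
plt.axis("off")
plt.savefig("cooccurrence_network.png", dpi=150)
```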

  3. Crew Launch Vehicle Mobile Launcher Solid Rocket Motor Plume Induced Environment

    NASA Technical Reports Server (NTRS)

    Vu, Bruce T.; Sulyma, Peter

    2008-01-01

    The plume-induced environment created by the Ares I first stage, a five-segment reusable solid rocket motor (RSRMV), will impose high heating rates and impact pressures on Launch Complex 39. The extremes of these environments pose a potential threat of weakening, or even causing failure of, structural components that are insufficiently designed. The ability to accurately predict these environments is therefore critical for specifying structural design requirements that ensure overall structural integrity and flight safety. This paper presents the predicted thermal and pressure environments induced by the launch of the Crew Launch Vehicle (CLV) from Launch Complex (LC) 39. Once the environments are predicted, a follow-on thermal analysis is required to determine the surface temperature response and the degradation rate of the materials. An example of structures responding to the plume-induced environment is provided.

  4. Integrated Analysis of Environment, Cattle and Human Serological Data: Risks and Mechanisms of Transmission of Rift Valley Fever in Madagascar

    PubMed Central

    Olive, Marie-Marie; Chevalier, Véronique; Grosbois, Vladimir; Tran, Annelise; Andriamandimby, Soa-Fy; Durand, Benoit; Ravalohery, Jean-Pierre; Andriamamonjy, Seta; Rakotomanana, Fanjasoa; Rogier, Christophe; Heraud, Jean-Michel

    2016-01-01

    Background: Rift Valley fever (RVF) is a vector-borne disease affecting ruminants and humans. Madagascar was heavily affected by RVF in 2008–2009, with evidence of a large and heterogeneous spread of the disease. The identification of at-risk environments is essential to optimize the available resources by targeting RVF surveillance in Madagascar. The objectives of our study were: (i) to identify the environmental factors and areas favorable to RVF transmission to both cattle and humans and (ii) to identify human behaviors favoring human infections in the Malagasy context. Methodology/Principal Findings: First, we characterized the environments of Malagasy communes using a Multiple Factor Analysis (MFA). Then, we analyzed cattle and human serological data collected at the national level using Generalized Linear Mixed Models, with the individual serological status (cattle or human) as the response, and MFA factors, as well as other potential risk factors (cattle density, human behavior), as explanatory variables. Cattle and human seroprevalence rates were positively associated with humid environments (p<0.001). Areas with high cattle density were at risk (p<0.01; OR = 2.6). Furthermore, our analysis showed that frequent contact with raw milk contributed to explaining human infection (OR = 1.6). Finally, our study highlighted the eastern-coast, western and north-western parts as high-risk areas for RVF transmission in cattle. Conclusions/Significance: Our integrated approach analyzing environmental, cattle and human datasets allows us to bring new insights on RVF transmission patterns in Madagascar. The association between cattle seroprevalence, humid environments and high cattle density suggests that concomitant vectorial and direct transmission is critical to maintaining enzootic RVF transmission. Additionally, in the at-risk humid environments of the western, north-western and eastern-coast areas, suitable for Culex and Anopheles mosquitoes, vectorial transmission probably occurs in both cattle and humans. The relative contributions of vectorial and direct transmission could be further assessed by mathematical modelling. PMID:27415438

  5. Integrated Analysis of Environment, Cattle and Human Serological Data: Risks and Mechanisms of Transmission of Rift Valley Fever in Madagascar.

    PubMed

    Olive, Marie-Marie; Chevalier, Véronique; Grosbois, Vladimir; Tran, Annelise; Andriamandimby, Soa-Fy; Durand, Benoit; Ravalohery, Jean-Pierre; Andriamamonjy, Seta; Rakotomanana, Fanjasoa; Rogier, Christophe; Heraud, Jean-Michel

    2016-07-01

    Rift Valley fever (RVF) is a vector-borne disease affecting ruminants and humans. Madagascar was heavily affected by RVF in 2008-2009, with evidence of a large and heterogeneous spread of the disease. The identification of at-risk environments is essential to optimize the available resources by targeting RVF surveillance in Madagascar. The objectives of our study were: (i) to identify the environmental factors and areas favorable to RVF transmission to both cattle and humans and (ii) to identify human behaviors favoring human infections in the Malagasy context. First, we characterized the environments of Malagasy communes using a Multiple Factor Analysis (MFA). Then, we analyzed cattle and human serological data collected at the national level using Generalized Linear Mixed Models, with the individual serological status (cattle or human) as the response, and MFA factors, as well as other potential risk factors (cattle density, human behavior), as explanatory variables. Cattle and human seroprevalence rates were positively associated with humid environments (p<0.001). Areas with high cattle density were at risk (p<0.01; OR = 2.6). Furthermore, our analysis showed that frequent contact with raw milk contributed to explaining human infection (OR = 1.6). Finally, our study highlighted the eastern-coast, western and north-western parts as high-risk areas for RVF transmission in cattle. Our integrated approach analyzing environmental, cattle and human datasets allows us to bring new insights on RVF transmission patterns in Madagascar. The association between cattle seroprevalence, humid environments and high cattle density suggests that concomitant vectorial and direct transmission is critical to maintaining enzootic RVF transmission. Additionally, in the at-risk humid environments of the western, north-western and eastern-coast areas, suitable for Culex and Anopheles mosquitoes, vectorial transmission probably occurs in both cattle and humans. The relative contributions of vectorial and direct transmission could be further assessed by mathematical modelling.
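
    The regression structure described above (individual serological status as a binary response, with MFA-derived environmental factors and exposure variables as predictors) can be sketched in Python with statsmodels. A plain logistic GLM stands in for the study's Generalized Linear Mixed Model, so commune-level random effects are omitted, and the file and column names (seropositive, mfa_humidity, cattle_density, raw_milk_contact) are hypothetical.

```python
# Minimal sketch of a serological risk-factor regression, assuming a CSV of
# individual-level data. A plain logistic GLM stands in for the GLMM used in
# the study; commune-level random effects are not modelled here.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("serology.csv")  # hypothetical file and column names

model = smf.glm(
    "seropositive ~ mfa_humidity + cattle_density + raw_milk_contact",
    data=df,
    family=sm.families.Binomial(),
).fit()

print(model.summary())
# Odds ratios for each explanatory variable.
print(np.exp(model.params))
```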

  6. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific data sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, together with a user-friendly library of model-output analysis routines that can be called from any language supporting C. The CCMC is developing data interpolation tools that enable model output to be presented in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.

  7. White paper on the promotion of an integrated risk assessment concept in European regulatory frameworks for chemicals.

    PubMed

    Wilks, M F; Roth, N; Aicher, L; Faust, M; Papadaki, P; Marchis, A; Calliera, M; Ginebreda, A; Andres, S; Kühne, R; Schüürmann, G

    2015-07-15

    The vision of a sustainable and safe use of chemicals to protect human health, preserve the environment and maintain the ecosystem requires innovative and more holistic approaches to risk assessment (RA) in order to better inform decision making. Integrated risk assessment (IRA) has been proposed as a solution to current scientific, societal and policy needs. It is defined as the mutual exploitation of environmental risk assessment (ERA) for human health risk assessment (HHRA) and vice versa in order to coherently and more efficiently characterize an overall risk to humans and the environment for better informing the risk analysis process. Extrapolating between species which are relevant for HHRA and ERA requires a detailed understanding of pathways of toxicity/modes of action (MoA) for the various toxicological endpoints. Significant scientific advances, changes in chemical legislation, and increasing environmental consciousness have created a favourable scientific and regulatory environment to develop and promote the concept and vision of IRA. An initial proof of concept is needed to foster the incorporation of IRA approaches into different chemical sectorial regulations and demonstrate their reliability for regulatory purposes. More familiarity and confidence with IRA will ultimately contribute to an overall reduction in in vivo toxicity testing requirements. However, significant progress will only be made if long-term support for MoA-related research is secured. In the short term, further exchange and harmonization of RA terminology, models and methodologies across chemical categories and regulatory agencies will support these efforts. Since societal values, public perceptions and cultural factors are of increasing importance for the acceptance of risk analysis and successful implementation of risk mitigation measures, the integration of socio-economic analysis and socio-behavioural considerations into the risk analysis process may help to produce a more effective risk evaluation and consideration of the risks and benefits associated with the use of chemicals. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Integration of analytical and biological measurements for assessing the effects of contaminants present at Great Lakes areas of concern

    EPA Science Inventory

    Assessing the potential biological impacts of complex contaminant mixtures in aquatic environments is often a challenge for ecotoxicologists. Instrumental analysis of site waters provides insights into the occurrence of contaminants, but provides little information about possibl...

  9. An inexpensive, temporally-integrated system for monitoring occurrence and biological effects of contaminants in the field (Poster)

    EPA Science Inventory

    Assessing potential biological impacts of complex mixtures of contaminants in aquatic environments is an ongoing challenge for ecotoxicologists. Instrumental analysis of site waters alone can identify contaminants but provides only limited insights as to possible adverse effects...

  10. HUMAN AND ECOLOGICAL RISK: CORRELATIONS AMONG HUMAN HEALTH, ECOLOGICAL AND ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

    While all life is affected by the quality of the environment, environmental risk factors for human and wildlife health are typically assessed using independent processes that are dissimilar in scale and scope. However, the integrated analysis of human, ecological, and environmen...

  11. HUMAN AND ECOLOGICAL RISK ASSESSMENT: ASSOCIATIONS AMONG HUMAN HEALTH, ECOLOGICAL, AND ENVIRONMENTAL MONITORING

    EPA Science Inventory

    While all life is affected by the quality of the environment, environmental risk factors for human and wildlife health are typically assessed using independent processes that are dissimilar in scale and scope. However, the integrated analysis of human, ecological, and environmen...

  12. HUMAN AND ECOLOGICAL RISK: CORRELATIONS AMONG HUMAN HEALTH, ECOLOGICAL AND ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

    While all life is affected by the quality of the environment, environmental risk factors for human and wildlife health are typically assessed using independent processes that are dissimilar in scale and scope. However, the integrated analysis of human, ecological, and environmen...

  13. HUMAN AND ECOLOGICAL RISK ASSESSMENT: ASSOCIATIONS AMONG HUMAN HEALTH, ECOLOGICAL AND ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

    While all life is affected by the quality of the environment, environmental risk factors for human and wildlife health are typically assessed using independent processes that are dissimilar in scale and scope. However, the integrated analysis of human, ecological, and environmen...

  14. HUMAN AND ECOLOGICAL RISK ASSESSMENT: ASSOCIATIONS AMONG HUMAN HEALTH, ECOLOGICAL AND ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

    While all life is affected by the quality of the environment, environmental risk factors for human and wildlife health are typically assessed using independent processes that are dissimilar in scale and scope. However, the integrated analysis of human, ecological, and environmen...

  15. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  16. Delineation of karst terranes in complex environments: Application of modern developments in the wavelet theory and data mining

    NASA Astrophysics Data System (ADS)

    Alperovich, Leonid; Averbuch, Amir; Eppelbaum, Lev; Zheludev, Valery

    2013-04-01

    Karst areas occupy about 14% of the world's land area. Karst terranes of different origins create difficult conditions for construction, industrial activity and tourism, and are a source of heightened danger for the environment. Mapping of karst (sinkhole) hazards will be one of the most significant problems of engineering geophysics in the 21st century. Given the complexity of geological media, unfavourable environments and the known ambiguity of geophysical data analysis, examination with a single geophysical method may be insufficient. Wavelet methodology as a whole has a significant impact on cardinal problems of geophysical signal processing such as denoising, enhancement and discrimination of signals with closely related characteristics, and the integrated analysis of different geophysical fields (satellite, airborne, surface or underground observations). We have developed a three-phase approach to the integrated geophysical localization of subsurface karst (the same approach could be used for subsequent monitoring of karst dynamics). The first phase consists of modeling to compute the various geophysical effects that characterize karst phenomena. The second phase develops signal-processing approaches for analyzing profile or areal geophysical observations. Finally, the third phase integrates these methods to create a new method for the combined interpretation of different geophysical data. Our combined geophysical analysis is based on modern developments in wavelet techniques for signal and image processing. The development of this integrated methodology of geophysical field examination will enable recognition of karst terranes even at a small signal-to-noise ratio in complex geological environments. For analyzing the geophysical data, we used a technique based on an algorithm that characterizes a geophysical image by a limited number of parameters. This set of parameters serves as a signature of the image and is used to discriminate images containing a karst cavity (K) from images not containing karst (N). The algorithm consists of the following main phases: (a) collection of the database, (b) characterization of the geophysical images, and (c) dimensionality reduction. Each image is then characterized by the histogram of its coherency directions. As a result of these steps we obtain two sets, K and N, of signature vectors for images from sections containing a karst cavity and from non-karst subsurface, respectively.
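
    A minimal sketch of the signature-and-classification idea described above: characterize each geophysical image patch by an orientation histogram (a simplified stand-in for the coherency-direction signature produced by the authors' wavelet processing) and train a classifier to separate karst (K) from non-karst (N) patches. The random data and the choice of a random forest are placeholders.

```python
# Minimal sketch: characterize each geophysical image by a histogram of local
# gradient orientations (a simplified stand-in for the coherency-direction
# signature), then train a classifier to separate karst (K) from non-karst (N).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def orientation_histogram(image, n_bins=18):
    """Histogram of local gradient directions, used as the image signature."""
    gy, gx = np.gradient(image.astype(float))
    angles = np.arctan2(gy, gx)          # range [-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    return hist / hist.sum()             # normalized signature vector

# Hypothetical data: replace with real geophysical image patches.
rng = np.random.default_rng(0)
images = rng.normal(size=(40, 64, 64))   # 40 patches of 64x64 samples
labels = np.array([1] * 20 + [0] * 20)   # 1 = karst (K), 0 = non-karst (N)

X = np.array([orientation_histogram(img) for img in images])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())
```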

  17. Towards systems genetic analyses in barley: Integration of phenotypic, expression and genotype data into GeneNetwork.

    PubMed

    Druka, Arnis; Druka, Ilze; Centeno, Arthur G; Li, Hongqiang; Sun, Zhaohui; Thomas, William T B; Bonar, Nicola; Steffenson, Brian J; Ullrich, Steven E; Kleinhofs, Andris; Wise, Roger P; Close, Timothy J; Potokina, Elena; Luo, Zewei; Wagner, Carola; Schweizer, Günther F; Marshall, David F; Kearsey, Michael J; Williams, Robert W; Waugh, Robbie

    2008-11-18

    A typical genetical genomics experiment results in four separate data sets; genotype, gene expression, higher-order phenotypic data and metadata that describe the protocols, processing and the array platform. Used in concert, these data sets provide the opportunity to perform genetic analysis at a systems level. Their predictive power is largely determined by the gene expression dataset where tens of millions of data points can be generated using currently available mRNA profiling technologies. Such large, multidimensional data sets often have value beyond that extracted during their initial analysis and interpretation, particularly if conducted on widely distributed reference genetic materials. Besides quality and scale, access to the data is of primary importance as accessibility potentially allows the extraction of considerable added value from the same primary dataset by the wider research community. Although the number of genetical genomics experiments in different plant species is rapidly increasing, none to date has been presented in a form that allows quick and efficient on-line testing for possible associations between genes, loci and traits of interest by an entire research community. Using a reference population of 150 recombinant doubled haploid barley lines we generated novel phenotypic, mRNA abundance and SNP-based genotyping data sets, added them to a considerable volume of legacy trait data and entered them into the GeneNetwork http://www.genenetwork.org. GeneNetwork is a unified on-line analytical environment that enables the user to test genetic hypotheses about how component traits, such as mRNA abundance, may interact to condition more complex biological phenotypes (higher-order traits). Here we describe these barley data sets and demonstrate some of the functionalities GeneNetwork provides as an easily accessible and integrated analytical environment for exploring them. By integrating barley genotypic, phenotypic and mRNA abundance data sets directly within GeneNetwork's analytical environment we provide simple web access to the data for the research community. In this environment, a combination of correlation analysis and linkage mapping provides the potential to identify and substantiate gene targets for saturation mapping and positional cloning. By integrating datasets from an unsequenced crop plant (barley) in a database that has been designed for an animal model species (mouse) with a well established genome sequence, we prove the importance of the concept and practice of modular development and interoperability of software engineering for biological data sets.
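
    The kind of analysis GeneNetwork enables, correlating mRNA abundance with a higher-order trait across the doubled-haploid lines and testing a marker for association, can be sketched outside the web environment as follows. This is not GeneNetwork's implementation; the files, column names and marker coding are hypothetical.

```python
# Minimal sketch of trait/expression correlation and a single-marker test
# across a set of recombinant lines. File and column names are hypothetical.
import pandas as pd
from scipy import stats

expr = pd.read_csv("expression.csv", index_col="line")     # lines x probes
traits = pd.read_csv("traits.csv", index_col="line")       # lines x traits
geno = pd.read_csv("genotypes.csv", index_col="line")      # lines x markers (0/1)

lines = expr.index.intersection(traits.index).intersection(geno.index)
y = traits.loc[lines, "grain_yield"]

# Correlate every probe's abundance with the trait across lines.
cors = expr.loc[lines].apply(lambda col: stats.pearsonr(col, y)[0])
print(cors.sort_values(ascending=False).head(10))

# Simple single-marker association: compare trait means between genotype classes.
marker = geno.loc[lines, "SNP_001"]
t, p = stats.ttest_ind(y[marker == 0], y[marker == 1], equal_var=False)
print(f"SNP_001: t = {t:.2f}, p = {p:.3g}")
```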

  18. Demonstration of NICT Space Weather Cloud --Integration of Supercomputer into Analysis and Visualization Environment--

    NASA Astrophysics Data System (ADS)

    Watari, S.; Morikawa, Y.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Kato, H.; Shimojo, S.; Murata, K. T.

    2010-12-01

    In the solar-terrestrial physics (STP) field, the spatio-temporal resolution of computer simulations keeps increasing thanks to the tremendous advancement of supercomputers. A further advance is Grid computing, which integrates distributed computational resources to provide scalable computing capacity. Simulation research is most effective when researchers design their own physical models, perform calculations on a supercomputer, and analyze and visualize the results with familiar methods. A supercomputer, however, is far removed from the analysis and visualization environment. In general, researchers analyze and visualize on locally managed workstations (WS), because installing and operating software on a WS is easy. Data must therefore be copied manually from the supercomputer to the WS, and the time required for data transfer over a long-delay network in practice hampers high-accuracy simulations. In terms of usability, it is therefore important to integrate a supercomputer seamlessly with an analysis and visualization environment using methods familiar to the researcher. NICT has been developing a cloud computing environment (the NICT Space Weather Cloud). In the NICT Space Weather Cloud, disk servers are located near its supercomputer and the WSs used for data analysis and visualization. They are connected to JGN2plus, a high-speed network for research and development. Distributed virtual high-capacity storage is also constructed using Grid Datafarm (Gfarm v2). Large data sets output from the supercomputer are transferred to the virtual storage through JGN2plus. Researchers can thus concentrate on their research using familiar methods, regardless of the distance between the supercomputer and the analysis and visualization environment. Currently, a total of 16 disk servers are set up at NICT headquarters (Koganei, Tokyo), the JGN2plus NOC (Otemachi, Tokyo), the Okinawa Subtropical Environment Remote-Sensing Center, and the Cybermedia Center, Osaka University. They are connected over JGN2plus and constitute 1 PB (physical size) of virtual storage under Gfarm v2. These disk servers are connected to the supercomputers of NICT and Osaka University, and a system has been built in which data output from the supercomputers is automatically transferred to the virtual storage. The measured transfer rate is about 50 GB/h. This performance is estimated to be reasonable for typical simulation and analysis tasks such as reconstruction of the coronal magnetic field. This work serves as an experiment with the system, and verification of its practicality is proceeding in parallel. Herein we introduce an overview of the space weather cloud system developed so far and demonstrate several scientific results obtained with it. We also introduce several web applications of the cloud, offered as a service named "e-SpaceWeather" (e-SW), which provides a variety of online space weather services.

  19. Designing Computer Learning Environments for Engineering and Computer Science: The Scaffolded Knowledge Integration Framework.

    ERIC Educational Resources Information Center

    Linn, Marcia C.

    1995-01-01

    Describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering: the LISP Knowledge Integration Environment and the spatial reasoning environment. (101 references) (Author/MKR)

  20. Object-oriented developmental environment for image-analysis applications: implementation for 2D gel electrophoretogram analysis.

    PubMed

    Pánek, J; Vohradský, J

    1997-06-01

    The principal motivation was to design an environment for the development of image-analysis applications that would allow independent modules to be integrated into a single framework and would provide tools for their construction, execution, management and mutual communication. The system was designed to be modular, consisting of a core and work modules. The system core focuses on overall management and provides a library of classes for building the work modules, their user interfaces and data communication. The work modules carry the practical implementation of algorithms and data structures for the solution of a particular problem and are implemented as dynamic-link libraries. They are mutually independent and run as individual threads, communicating with each other via a unified mechanism. The environment was designed to simplify the development and testing of new algorithms and applications. An example implementation for the particular problem of analyzing two-dimensional (2D) gel electrophoretograms is presented. The environment was developed for the Windows NT operating system using the Microsoft Foundation Class Library and the capabilities of the C++ programming language. Available on request from the authors.
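
    A minimal Python sketch of the architecture described above, assuming a queue-based message bus: independent work modules run as threads and communicate only through a unified mechanism owned by the system core. It illustrates the design only; it is not the original Windows NT / MFC implementation.

```python
# Minimal sketch of a core that loads independent work modules, runs each in
# its own thread, and routes messages between them through a shared queue.
# The message bus is a simplified stand-in for the original implementation.
import queue
import threading

class WorkModule(threading.Thread):
    """Base class for work modules; subclasses implement handle()."""
    def __init__(self, name, bus):
        super().__init__(daemon=True)
        self.name, self.bus = name, bus
        self.inbox = queue.Queue()

    def run(self):
        while True:
            msg = self.inbox.get()
            if msg is None:          # shutdown signal from the core
                break
            self.handle(msg)

    def handle(self, msg):
        raise NotImplementedError

class SpotDetector(WorkModule):
    def handle(self, msg):
        # Pretend to detect spots in a 2D gel image and publish the result.
        self.bus.put(("quantifier", {"spots": [(10, 12), (40, 7)], "src": msg}))

class SpotQuantifier(WorkModule):
    def handle(self, msg):
        print(f"quantifying {len(msg['spots'])} spots from {msg['src']}")

class Core:
    """System core: owns the bus and dispatches messages to module inboxes."""
    def __init__(self):
        self.bus = queue.Queue()
        self.modules = {}

    def register(self, module):
        self.modules[module.name] = module
        module.start()

    def dispatch_once(self):
        target, payload = self.bus.get()
        self.modules[target].inbox.put(payload)

core = Core()
core.register(SpotDetector("detector", core.bus))
core.register(SpotQuantifier("quantifier", core.bus))
core.modules["detector"].inbox.put("gel_image_001.tif")
core.dispatch_once()

# Orderly shutdown: signal the modules and wait for them to finish.
for m in core.modules.values():
    m.inbox.put(None)
for m in core.modules.values():
    m.join()
```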

  1. Integrated Multidisciplinary Optimization Objects

    NASA Technical Reports Server (NTRS)

    Alston, Katherine

    2014-01-01

    OpenMDAO is an open-source MDAO framework. It is used to develop an integrated analysis and design environment for engineering challenges. This Phase II project integrated additional modules and design tools into OpenMDAO to perform discipline-specific analysis across multiple flight regimes at varying levels of fidelity. It also showcased a refined system architecture that allows the system to be less customized to a specific configuration (i.e., system and configuration separation). By delivering a capable and validated MDAO system along with a set of example applications to be used as a template for future users, this work greatly expands NASA's high-fidelity, physics-based MDAO capabilities and enables the design of revolutionary vehicles in a cost-effective manner. This proposed work complements M4 Engineering's expertise in developing modeling and simulation toolsets that solve relevant subsonic, supersonic, and hypersonic demonstration applications.
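
    As a hedged illustration of the component-based analysis style OpenMDAO supports (not the discipline modules or vehicle configurations developed in this project), the sketch below wraps a toy analytic function as an OpenMDAO component and drives it with an optimizer, following the framework's publicly documented paraboloid-style usage.

```python
# Minimal OpenMDAO sketch: a single analytic "discipline" wrapped as a
# component and driven by an optimizer. This is generic OpenMDAO usage,
# not the project-specific modules described above.
import openmdao.api as om

class Paraboloid(om.ExplicitComponent):
    """Toy analysis: f(x, y) = (x - 3)^2 + x*y + (y + 4)^2 - 3."""
    def setup(self):
        self.add_input("x", val=0.0)
        self.add_input("y", val=0.0)
        self.add_output("f", val=0.0)
        self.declare_partials("*", "*", method="fd")  # finite-difference derivatives

    def compute(self, inputs, outputs):
        x, y = inputs["x"], inputs["y"]
        outputs["f"] = (x - 3.0) ** 2 + x * y + (y + 4.0) ** 2 - 3.0

prob = om.Problem()
prob.model.add_subsystem("parab", Paraboloid(), promotes=["*"])

prob.driver = om.ScipyOptimizeDriver(optimizer="SLSQP")
prob.model.add_design_var("x", lower=-50.0, upper=50.0)
prob.model.add_design_var("y", lower=-50.0, upper=50.0)
prob.model.add_objective("f")

prob.setup()
prob.run_driver()
print(prob.get_val("x"), prob.get_val("y"), prob.get_val("f"))
```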

  2. Reliability, Safety and Error Recovery for Advanced Control Software

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2003-01-01

    For long-duration automated operation of regenerative life support systems in space environments, there is a need for advanced integration and control systems that are significantly more reliable and safe, and that support error recovery and minimization of operational failures. This presentation outlines some challenges of hazardous space environments and complex system interactions that can lead to system accidents. It discusses approaches to hazard analysis and error recovery for control software and challenges of supporting effective intervention by safety software and the crew.

  3. ATLAS I/O performance optimization in as-deployed environments

    NASA Astrophysics Data System (ADS)

    Maier, T.; Benjamin, D.; Bhimji, W.; Elmsheuser, J.; van Gemmeren, P.; Malon, D.; Krumnack, N.

    2015-12-01

    This paper provides an overview of an integrated program of work underway within the ATLAS experiment to optimise I/O performance for large-scale physics data analysis in a range of deployment environments. It proceeds to examine in greater detail one component of that work, the tuning of job-level I/O parameters in response to changes to the ATLAS event data model, and considers the implications of such tuning for a number of measures of I/O performance.

  4. Planetary exploration with nanosatellites: a space campus for future technology development

    NASA Astrophysics Data System (ADS)

    Drossart, P.; Mosser, B.; Segret, B.

    2017-09-01

    Planetary exploration is on the eve of a revolution through nanosatellites accompanying larger missions or cruising freely in the solar system, providing a man-made cosmic web for in situ or remote-sensing exploration of the Solar System. A first step is to build a dedicated place for nanosatellite development. The CCERES PSL space campus provides an environment for nanosatellite testing and integration, a concurrent engineering facility for project analysis, and a science environment dedicated to this task.

  5. Construction of integrated case environments.

    PubMed

    Losavio, Francisca; Matteo, Alfredo; Pérez, María

    2003-01-01

    The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. Technically speaking, the tendency of the CASE approach is toward the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been studied for two decades. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields of constructing an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive-systems domain. In addition, a process for constructing the integrated CASE environment is given.
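
    A minimal Python sketch of the intermediary (Mediator) idea described above: tools register with a mediator and communicate only through it, so they remain decoupled from one another. The tool names and routing rules are hypothetical.

```python
# Minimal sketch of the Mediator (intermediary) pattern applied to CASE tool
# integration: tools communicate only through the mediator, never directly.
class Mediator:
    def __init__(self):
        self._tools = {}

    def register(self, name, tool):
        self._tools[name] = tool
        tool.mediator = self

    def notify(self, sender, event, payload):
        # Central routing policy: the "control" perspective of the environment.
        if event == "model_saved":
            self._tools["checker"].check(payload)
        elif event == "check_failed":
            self._tools["editor"].show_errors(payload)

class DiagramEditor:
    mediator = None

    def save(self, model):
        print("editor: model saved")
        self.mediator.notify(self, "model_saved", model)

    def show_errors(self, errors):
        print(f"editor: displaying {len(errors)} consistency error(s)")

class ConsistencyChecker:
    mediator = None

    def check(self, model):
        errors = [e for e in model.get("elements", []) if not e.get("name")]
        if errors:
            self.mediator.notify(self, "check_failed", errors)

mediator = Mediator()
editor = DiagramEditor()
mediator.register("editor", editor)
mediator.register("checker", ConsistencyChecker())
editor.save({"elements": [{"name": ""}, {"name": "Beam"}]})
```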

  6. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1993-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS (Earth Observing System) timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  7. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Vermillion, Charles H.; Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1992-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  8. High diffraction efficiency of three-layer diffractive optics designed for wide temperature range and large incident angle.

    PubMed

    Mao, Shan; Cui, Qingfeng; Piao, Mingxu; Zhao, Lidong

    2016-05-01

    A mathematical model of the diffraction efficiency and polychromatic integral diffraction efficiency of three-layer diffractive optics made from different dispersion materials, as affected by environmental temperature change and incident angle, is put forward and its effects are analyzed. Taking the optical materials N-FK5 and N-SF1 as the substrates of the multilayer diffractive optics, with POLYCARB as the intermediate material, the effects of environmental temperature change and incident angle on diffraction efficiency and polychromatic integral diffraction efficiency are analyzed. Three-layer diffractive optics can therefore be applied over wider environmental temperature ranges and at larger incident angles in refractive-diffractive hybrid optical systems, yielding better image quality. The analysis results can be used to guide the design of hybrid imaging optical systems.
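
    Under scalar diffraction theory, the efficiency of the design order m can be written as eta_m(lambda) = sinc^2(m - phi(lambda)/2pi), and the polychromatic integral diffraction efficiency (PIDE) is the average of eta_1 over the waveband. The sketch below illustrates only this generic relationship; the layer thicknesses, phase model and Cauchy dispersion coefficients are invented placeholders, not the N-FK5/POLYCARB/N-SF1 design analyzed in the paper.

```python
# Generic scalar-theory sketch: diffraction efficiency of the design order
# and its polychromatic integral over a waveband. The phase model and the
# Cauchy dispersion coefficients below are illustrative placeholders only.
import numpy as np

def n_cauchy(lam_um, a, b):
    """Toy Cauchy dispersion n(lambda) = a + b / lambda^2 (lambda in microns)."""
    return a + b / lam_um**2

def phase_delay(lam_um, layers):
    """Total phase (radians) of stacked relief layers at wavelength lam_um.
    Each layer is (thickness_um, (a, b)) with index from n_cauchy."""
    return sum(2.0 * np.pi / lam_um * d * (n_cauchy(lam_um, *coeff) - 1.0)
               for d, coeff in layers)

def efficiency(lam_um, layers, order=1):
    """Scalar-theory efficiency: sinc^2(m - phi/(2*pi)); np.sinc is normalized."""
    return np.sinc(order - phase_delay(lam_um, layers) / (2.0 * np.pi)) ** 2

# Hypothetical two-layer relief stack, roughly tuned to 0.55 um (thicknesses in um).
layers = [(0.60, (1.48, 0.004)), (0.42, (1.57, 0.010))]

lam = np.linspace(0.45, 0.65, 201)            # visible waveband, microns
eta = efficiency(lam, layers)

# Polychromatic integral diffraction efficiency: waveband average of eta.
pide = np.trapz(eta, lam) / (lam[-1] - lam[0])
print(f"PIDE over 0.45-0.65 um: {pide:.3f}")
```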

  9. GenomeGraphs: integrated genomic data visualization with R.

    PubMed

    Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine

    2009-01-06

    Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.

  10. PCSIM: A Parallel Simulation Environment for Neural Circuits Fully Integrated with Python

    PubMed Central

    Pecevski, Dejan; Natschläger, Thomas; Schuch, Klaus

    2008-01-01

    The Parallel Circuit SIMulator (PCSIM) is a software package for simulation of neural circuits. It is primarily designed for distributed simulation of large scale networks of spiking point neurons. Although its computational core is written in C++, PCSIM's primary interface is implemented in the Python programming language, which is a powerful programming environment and allows the user to easily integrate the neural circuit simulator with data analysis and visualization tools to manage the full neural modeling life cycle. The main focus of this paper is to describe PCSIM's full integration into Python and the benefits thereof. In particular we will investigate how the automatically generated bidirectional interface and PCSIM's object-oriented modular framework enable the user to adopt a hybrid modeling approach: using and extending PCSIM's functionality either employing pure Python or C++ and thus combining the advantages of both worlds. Furthermore, we describe several supplementary PCSIM packages written in pure Python and tailored towards setting up and analyzing neural simulations. PMID:19543450

  11. Integration of sustainability into process simulation of a dairy process

    USDA-ARS?s Scientific Manuscript database

    Life cycle analysis, a method used to quantify the energy and environmental flows of a process or product on the environment, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...

  12. Electronic Performance Support Systems: Comparison of Types of Integration Levels on Performance Outcomes

    ERIC Educational Resources Information Center

    Phillips, Sharon A.

    2013-01-01

    Selecting appropriate performance improvement interventions is a critical component of a comprehensive model of performance improvement. Intervention selection is an interconnected process involving analysis of an organization's environment, definition of the performance problem, and identification of a performance gap and identification of causal…

  13. City of Minneapolis, Minnesota Municipal Tree Resource Analysis

    Treesearch

    E.G. McPherson; J.R. Simpson; P.J. Peper; S.E. Maco; S.L. Gardner; K.E. Vargas; S. Cozad; Q. Xiao

    2005-01-01

    Minneapolis, a vibrant city, renowned for its lakes, its livability, and its cultural wealth, maintains trees as an integral component of the urban infrastructure. Research indicates that healthy trees can mitigate impacts associated with the built environment by reducing stormwater runoff, energy consumption, and air pollutants. Trees improve urban life, making...

  14. Teaching Persons with Severe Disabilities to Use Self-Instruction in Community Settings: An Analysis of Applications.

    ERIC Educational Resources Information Center

    Hughes, Carolyn; Agran, Martin

    1993-01-01

    This literature review examines the effects of self-instructional programs on increasing independence of persons with moderate/severe mental retardation in integrated environments. The article discusses methodological issues, research needs, and recommendations for program implementation. The feasibility of using self-instruction to promote…

  15. EMERGY ANALYSIS OF HUMAN HEALTH: WELL-BEING IN WEST VIRGINIA

    EPA Science Inventory

    Human health depends on the underlying state of stress experienced by people. The stress people are exposed to in turn depends on the combined influences of environment, economy, and society which are integrated in their daily life. The net stress experienced by people from the i...

  16. SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING AND RISK ASSESSMENT (SLIDE PRESENTATION)

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  17. MEETING IN CHICAGO: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND ENVIRONMENTAL RISK ASSESSMENT

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  18. MEETING IN CZECH REPUBLIC: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND RISK ASSESSMENT

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  19. Composing across Modes: A Comparative Analysis of Adolescents' Multimodal Composing Processes

    ERIC Educational Resources Information Center

    Smith, Blaine E.

    2017-01-01

    Although the shift from page to screen has dramatically redefined conceptions of writing, very little is known about how youth compose with multiple modes in digital environments. Integrating multimodality and multiliteracies theoretical frameworks, this comparative case study examined how urban twelfth-grade students collaboratively composed…

  20. The Scenario Analysis Tool Suite: A User’s Guide

    DTIC Science & Technology

    2009-01-01

    be exported at any stage and continued manually. The free, open-source integrated development environment (IDE) NetBeans [14] was used in the creation...and Technology Organisation, Australia. 14. Sun Microsystems & CollabNet (2008) NetBeans IDE 6.0, http://wwwnetbeans.org. 15. Tri, N., Boswell, S

  1. Using Geographic Information Systems to Evaluate Energy Initiatives in Austere Environments

    DTIC Science & Technology

    2013-03-01

    conducting economic analysis of energy reduction initiatives. This research examined the energy savings potential of improving the thermal properties...shelter improvements in any climate and location in the world. Specifically, solar flies developed through Solar Integrated Power Shelter System...94 Improvements to the Existing Model

  2. Developing smartphone apps for behavioural studies: The AlcoRisk app case study.

    PubMed

    Smith, Anthony; de Salas, Kristy; Lewis, Ian; Schüz, Benjamin

    2017-08-01

    Smartphone apps have emerged as valuable research tools to sample human behaviours at their time of occurrence within natural environments. Human behaviour sampling methods, such as Ecological Momentary Assessment (EMA), aim to facilitate research that is situated in ecologically valid real world environments rather than laboratory environments. Researchers have trialled a range of EMA smartphone apps to sample human behaviours such as dieting, physical activity and smoking. Software development processes for EMA smartphone apps, however, are not widely documented, with little guidance provided for the integration of complex multidisciplinary behavioural and technical fields. In this paper, the AlcoRisk app for studying alcohol consumption and risk taking tendencies is presented alongside a software development process that integrates these multidisciplinary fields. The software development process consists of three stages: requirements analysis, feature and interface design, followed by app implementation. Results from a preliminary feasibility study support the efficacy of the AlcoRisk app's software development process. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Spatially explicit integrated modeling and economic valuation of climate driven land use change and its indirect effects.

    PubMed

    Bateman, Ian; Agarwala, Matthew; Binner, Amy; Coombes, Emma; Day, Brett; Ferrini, Silvia; Fezzi, Carlo; Hutchins, Michael; Lovett, Andrew; Posen, Paulette

    2016-10-01

    We present an integrated model of the direct consequences of climate change on land use, and the indirect effects of induced land use change upon the natural environment. The model predicts climate-driven shifts in the profitability of alternative uses of agricultural land. Both the direct impact of climate change and the induced shift in land use patterns will cause secondary effects on the water environment, for which agriculture is the major source of diffuse pollution. We model the impact of changes in such pollution on riverine ecosystems showing that these will be spatially heterogeneous. Moreover, we consider further knock-on effects upon the recreational benefits derived from water environments, which we assess using revealed preference methods. This analysis permits a multi-layered examination of the economic consequences of climate change, assessing the sequence of impacts from climate change through farm gross margins, land use, water quality and recreation, both at the individual and catchment scale. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Future of lignite resources: a life cycle analysis.

    PubMed

    Wang, Qingsong; Liu, Wei; Yuan, Xueliang; Zheng, Xiaoning; Zuo, Jian

    2016-12-01

    Lignite is a low-quality energy source which accounts for 13 % of China's coal reserves. It is imperative to improve the quality of lignite for large-scale utilization. To further explore and analyze the influence of various key processes on the environment and economic costs, a lignite drying and compression technology is evaluated using an integrated approach of life cycle assessment and life cycle costs. Results showed that lignite mining, direct air emissions, and electricity consumption have most significant impacts on the environment. An integrated evaluation of life cycle assessment and life cycle costs showed that the most significant contributor to the environmental impacts and economic costs was the lignite mining process. The impact of transportation and wastewater treatment process on the environment and economic costs was small enough to be ignored. Critical factors were identified for reducing the environmental and economic impacts of lignite drying and compression technology. These findings provide useful inputs for both industrial practice and policy making for exploitation, processing, and utilization of lignite resources.

  5. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS "CLIMATE" with a dedicated statistical package developed in the R language. The web-GIS "CLIMATE" is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on the combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes powerful new methods of time-dependent statistics of extremes, quantile regression and a copula approach for the detailed analysis of various climate extreme events. In particular, the very promising copula approach allows the structural connections between the extremes and various environmental characteristics to be obtained. The new statistical methods integrated into the web-GIS "CLIMATE" can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
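
    One of the tools mentioned above, quantile regression for extremes, can be sketched in Python with statsmodels (standing in for the R package integrated into the web-GIS); the input file and column names are hypothetical.

```python
# Minimal sketch of quantile regression on a climate time series: fit the
# 0.95 quantile of daily maximum temperature as a linear trend in time.
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("station_tmax.csv", parse_dates=["date"])
df["year"] = df["date"].dt.year + df["date"].dt.dayofyear / 365.25

# Median trend versus trend in the upper tail of the distribution.
fit_median = smf.quantreg("tmax ~ year", df).fit(q=0.50)
fit_upper = smf.quantreg("tmax ~ year", df).fit(q=0.95)

print("0.50 quantile trend (deg/yr):", fit_median.params["year"])
print("0.95 quantile trend (deg/yr):", fit_upper.params["year"])
```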

  6. A Simple Tool for the Design and Analysis of Multiple-Reflector Antennas in a Multi-Disciplinary Environment

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea

    2000-01-01

    The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.

  7. Supervised classification of continental shelf sediment off western Donegal, Ireland

    NASA Astrophysics Data System (ADS)

    Monteys, X.; Craven, K.; McCarron, S. G.

    2017-12-01

    Managing human impacts on marine ecosystems requires natural regions to be identified and mapped over a range of hierarchically nested scales. In recent years (2000-present), the Irish National Seabed Survey (INSS) and the Integrated Mapping for the Sustainable Development of Ireland's Marine Resources programme (INFOMAR), collaborations between Geological Survey Ireland and the Marine Institute, have provided unprecedented quantities of high-quality data on Ireland's offshore territories. The increasing availability of large, detailed digital representations of these environments requires the application of objective and quantitative analyses. This study presents results of a new approach to seafloor sediment mapping based on an integrated analysis of INFOMAR multibeam bathymetric data (including the derivatives slope and relative position), backscatter data (including derivatives of angular response analysis) and sediment ground-truthing over the continental shelf west of Donegal. It applies a Geographic Object-Based Image Analysis software package to provide a supervised classification of the surface sediment. This approach can provide a statistically robust, high-resolution classification of the seafloor. Initial results display a differentiation of sediment classes and a reduction in artefacts compared with previously applied methodologies. These results indicate a methodology that could be used during physical habitat mapping and classification of marine environments.
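
    A minimal Python sketch of the supervised-classification step described above, with a random forest standing in for the GEOBIA package: per-object bathymetry and backscatter features are trained against sediment ground-truth samples. All file and column names are hypothetical.

```python
# Minimal sketch: supervised classification of seafloor sediment from
# per-object features (bathymetry, slope, relative position, backscatter
# statistics) trained on grab-sample ground truth. Column names are hypothetical
# and the feature columns are assumed to be complete (no missing values).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

objects = pd.read_csv("seabed_objects.csv")     # one row per image object
features = ["depth", "slope", "rel_position", "backscatter_mean", "backscatter_std"]

labelled = objects.dropna(subset=["sediment_class"])
X_train, X_test, y_train, y_test = train_test_split(
    labelled[features], labelled["sediment_class"], test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Apply the trained model to every object to produce the sediment map.
objects["predicted_class"] = clf.predict(objects[features])
```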

  8. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC.

    PubMed

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V; Petway, Joy R

    2017-07-12

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model of the WSP, using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. A comprehensive modeling analysis was used to simulate and assess nine parameters and the concentrations of ON-N, NH₃-N and NO₃-N. Results indicate that the integrated FME-GLUE-based model, with good Nash-Sutcliffe coefficients (0.53-0.69) and correlation coefficients (0.76-0.83), successfully simulates the concentrations of ON-N, NH₃-N and NO₃-N. Moreover, the Arrhenius constant was the only parameter to which the model performance for the ON-N and NH₃-N simulations was sensitive, whereas the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were sensitive for the ON-N and NO₃-N simulations, as measured using global sensitivity analysis.
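
    The GLUE step described above (sample parameter sets, run the model, and retain the "behavioural" sets whose Nash-Sutcliffe efficiency exceeds a threshold) can be illustrated with a short Python sketch. This is not the R-FME implementation used in the study, and the toy first-order decay model merely stands in for the WSP nitrogen model.

```python
# Minimal sketch of GLUE-style uncertainty analysis: Monte Carlo sampling of
# parameters, a toy model run per sample, and retention of "behavioural" sets
# by Nash-Sutcliffe efficiency. The decay model is a stand-in, not the WSP model.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 30.0, 1.0)                       # days
observed = 25.0 * np.exp(-0.18 * t) + rng.normal(0, 0.8, t.size)  # synthetic data

def model(k, c0):
    """Toy first-order nitrogen decay: C(t) = C0 * exp(-k t)."""
    return c0 * np.exp(-k * t)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling over uniform prior parameter ranges.
n = 5000
k_samples = rng.uniform(0.05, 0.40, n)
c0_samples = rng.uniform(15.0, 35.0, n)
ns = np.array([nash_sutcliffe(model(k, c0), observed)
               for k, c0 in zip(k_samples, c0_samples)])

behavioural = ns > 0.6                            # GLUE acceptance threshold
print(f"{behavioural.sum()} behavioural sets of {n}")
print("k  5-95%:", np.percentile(k_samples[behavioural], [5, 95]))
print("C0 5-95%:", np.percentile(c0_samples[behavioural], [5, 95]))
```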

  9. Current Activities and Capabilities of the Terrestrial Environment Group at NASA's Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Roberts, Barry C.; Batts, Wade

    1997-01-01

    The National Aeronautics and Space Administration (NASA) designated Marshall Space Flight Center (MSFC) the center of excellence for space transportation. The Aerospace Environments and Effects (AEE) team of the Electromagnetics and Aerospace Environments Branch (EL23) in the Systems Analysis and Integration Laboratory at MSFC, supports the center of excellence designation by providing near-Earth space, deep space, planetary, and terrestrial environments expertise to projects as required. The Terrestrial Environment (TE) group within the AEE team maintains an extensive TE data base. Statistics and models derived from this data are applied to the design and development of new aerospace vehicles, as well as performance enhancement of operational vehicles such as the Space Shuttle. The TE is defined as the Earth's atmospheric environment extending from the surface to orbital insertion altitudes (approximately 90 km).

  10. Using a social capital framework to enhance measurement of the nursing work environment.

    PubMed

    Sheingold, Brenda Helen; Sheingold, Steven H

    2013-07-01

    To develop, field test and analyse a social capital survey instrument for measuring the nursing work environment. The concept of social capital, which focuses on improving productive capacity by examining relationships and networks, may provide a promising framework to measure and evaluate the nurse work environment in a variety of settings. A survey instrument for measuring social capital in the nurse work environment was developed by adapting the World Bank's Social Capital - Integrated Questionnaire (SC-IQ). Exploratory factor analysis and multiple regression analyses were applied to assess the properties of the instrument. The exploratory factor analysis yielded five factors that align well with the social capital framework, while reflecting unique aspects of the nurse work environment. The results suggest that the social capital framework provides a promising context to assess the nurse work environment. Further work is needed to refine the instrument for a diverse range of health-care providers and to correlate social capital measures with quality of patient care. Social capital measurement of the nurse work environment has the potential to provide managers with an enhanced set of tools for building productive capacity in health-care organisations and achieving desired outcomes. © 2013 John Wiley & Sons Ltd.

  11. A semantic problem solving environment for integrative parasite research: identification of intervention targets for Trypanosoma cruzi.

    PubMed

    Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P

    2012-01-01

    Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Framework (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results. The SPSE helps parasitologists leverage the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques, while keeping their workload increase minimal.
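
    The flavor of an ontology-backed query over RDF data of the kind SPSE supports can be illustrated with Python's rdflib. The file name, namespace, and predicates below are hypothetical and are not the actual PKB schema or the Cuebee interface.

        from rdflib import Graph

        g = Graph()
        g.parse("tcruzi_annotations.ttl", format="turtle")   # hypothetical local RDF dump

        # Find genes annotated as expressed in the amastigote stage that also have a
        # recorded knockout phenotype -- the kind of cross-source question SPSE targets.
        query = """
        PREFIX pkb: <http://example.org/parasite#>
        SELECT ?gene ?phenotype WHERE {
            ?gene pkb:expressedIn pkb:AmastigoteStage .
            ?gene pkb:hasKnockoutPhenotype ?phenotype .
        }
        """
        for gene, phenotype in g.query(query):
            print(gene, phenotype)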

  12. Application Architecture of Avian Influenza Research Collaboration Network in Korea e-Science

    NASA Astrophysics Data System (ADS)

    Choi, Hoon; Lee, Junehawk

    In pursuit of globalization of the AI e-Science environment, KISTI is working to extend the AI research community to the AI research institutes of neighboring countries and to share the AI e-Science environment with them in the near future. In this paper we introduce the application architecture of the AI research collaboration network (AIRCoN). AIRCoN is a global e-Science environment for AI research conducted by KISTI. It consists of an AI virus sequence information sharing system that meets the data requirements of the research community, an integrated analysis environment for analyzing the mutation patterns of AI viruses and their risks, an epidemic modeling and simulation environment for establishing an effective national readiness strategy against AI pandemics, and a knowledge portal for sharing expertise in epidemic study and unpublished research results with community members.

  13. Geoinformatics 2007: data to knowledge

    USGS Publications Warehouse

    Brady, Shailaja R.; Sinha, A. Krishna; Gundersen, Linda C.

    2007-01-01

    Geoinformatics is the term used to describe a variety of efforts to promote collaboration between the computer sciences and the geosciences to solve complex scientific questions. It refers to the distributed, integrated digital information system and working environment that provides innovative means for the study of the Earth systems, as well as other planets, through the use of advanced information technologies. Geoinformatics activities range from major research and development efforts creating new technologies to provide high-quality, sustained production-level services for data discovery, integration and analysis, to small, discipline-specific efforts that develop earth science data collections and data analysis tools serving the needs of individual communities. The ultimate vision of Geoinformatics is a highly interconnected data system populated with high quality, freely available data, as well as a robust set of software for analysis, visualization, and modeling.

  14. Spacecraft System Failures and Anomalies Attributed to the Natural Space Environment

    NASA Technical Reports Server (NTRS)

    Bedingfield, Keith, L.; Leach, Richard D.; Alexander, Margaret B. (Editor)

    1996-01-01

    The natural space environment is characterized by many complex and subtle phenomena hostile to spacecraft. The effects of these phenomena impact spacecraft design, development, and operations. Space systems become increasingly susceptible to the space environment as use of composite materials and smaller, faster electronics increases. This trend makes an understanding of the natural space environment essential to accomplish overall mission objectives, especially in the current climate of better/cheaper/faster. This primer provides a brief overview of the natural space environment - definition, related programmatic issues, and effects on various spacecraft subsystems. The primary focus, however, is to catalog, through representative case histories, spacecraft failures and anomalies attributed to the natural space environment. This primer is one in a series of NASA Reference Publications currently being developed by the Electromagnetics and Aerospace Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center (MSFC), National Aeronautics and Space Administration (NASA).

  15. Information Management for a Large Multidisciplinary Project

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.

    1992-01-01

    In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.

  16. Framework for Infectious Disease Analysis: A comprehensive and integrative multi-modeling approach to disease prediction and management.

    PubMed

    Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark

    2017-12-01

    The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
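
    The classification step named above (support vector machines, random forests, and boosting) can be illustrated with a short scikit-learn sketch. The features, labels, and model settings are synthetic placeholders, not part of the Framework for Infectious Disease Analysis itself.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Placeholder surveillance features (e.g., climate, vector counts, case reports)
        # and binary outbreak labels -- synthetic data, not the framework's real inputs.
        rng = np.random.default_rng(42)
        X = rng.normal(size=(300, 6))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

        models = {
            "svm": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
            "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
            "boosting": GradientBoostingClassifier(random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5)
            print(f"{name}: mean CV accuracy = {scores.mean():.3f}")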

  17. Envision: An interactive system for the management and visualization of large geophysical data sets

    NASA Technical Reports Server (NTRS)

    Searight, K. R.; Wojtowicz, D. P.; Walsh, J. E.; Pathi, S.; Bowman, K. P.; Wilhelmson, R. B.

    1995-01-01

    Envision is a software project at the University of Illinois and Texas A&M, funded by NASA's Applied Information Systems Research Project. It provides researchers in the geophysical sciences convenient ways to manage, browse, and visualize large observed or model data sets. Envision integrates data management, analysis, and visualization of geophysical data in an interactive environment. It employs commonly used standards in data formats, operating systems, networking, and graphics. It also attempts, wherever possible, to integrate with existing scientific visualization and analysis software. Envision has an easy-to-use graphical interface, distributed process components, and an extensible design. It is a public domain package, freely available to the scientific community.

  18. Integrated Model Reduction and Control of Aircraft with Flexible Wings

    NASA Technical Reports Server (NTRS)

    Swei, Sean Shan-Min; Zhu, Guoming G.; Nguyen, Nhan T.

    2013-01-01

    This paper presents an integrated approach to the modeling and control of aircraft with flexible wings. The coupled aircraft rigid body dynamics with a high-order elastic wing model can be represented in a finite-dimensional state-space form. Given a set of desired output covariances, a model reduction process is performed by using the weighted Modal Cost Analysis (MCA). A dynamic output feedback controller, which is designed based on the reduced-order model, is developed by utilizing the output covariance constraint (OCC) algorithm, and the resulting OCC design weighting matrix is used for the next iteration of the weighted cost analysis. This controller is then validated against the full-order evaluation model to ensure that the aircraft's handling qualities are met and the flutter motion of the wings is suppressed. An iterative algorithm is developed in the CONDUIT environment to realize the integration of model reduction and controller design. The proposed integrated approach is applied to the NASA Generic Transport Model (GTM) for demonstration.
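
    To make the cost-ranked truncation idea concrete, the sketch below ranks the modes of a small state-space model by a simple input-output cost proxy. It is a minimal Python illustration under stated assumptions: the toy system and the cost proxy are invented for demonstration, and it is not the paper's weighted MCA/OCC iteration.

        import numpy as np

        def modal_costs(A, B, C):
            # Rank the modes of a state-space model (A, B, C) by a simple input-output
            # cost proxy: coupling strength scaled by how slowly the mode decays.
            eigvals, V = np.linalg.eig(A)
            Bm = np.linalg.inv(V) @ B            # modal input coupling
            Cm = C @ V                           # modal output coupling
            cost = (np.linalg.norm(Bm, axis=1) * np.linalg.norm(Cm, axis=0)
                    / np.abs(eigvals.real))
            return eigvals, cost

        # Toy 4-state model with two lightly damped oscillatory modes.
        A = np.array([[0.0, 1.0, 0.0, 0.0],
                      [-1.0, -0.02, 0.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0],
                      [0.0, 0.0, -25.0, -0.5]])
        B = np.array([[0.0], [1.0], [0.0], [0.2]])
        C = np.array([[1.0, 0.0, 0.3, 0.0]])

        eigvals, costs = modal_costs(A, B, C)
        for i in np.argsort(costs)[::-1]:        # highest-cost modes would be retained
            print("mode at", eigvals[i], "cost proxy", round(float(costs[i]), 2))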

  19. Architecture and Children: Learning Environments and Design Education.

    ERIC Educational Resources Information Center

    Taylor, Anne, Ed.; Muhlberger, Joe, Ed.

    1998-01-01

    This issue addresses (1) growing international interest in learning environments and their effects on behavior, and (2) design education, an integrated model for visual-spatial lifelong learning. It focuses on this new and emerging field, which integrates elements in education, new learning environment design, and the use of more two-…

  20. Study the Effectiveness of Technology-Enhanced Interactive Teaching Environment on Student Learning of Junior High School Biology

    ERIC Educational Resources Information Center

    Yang, Kai-Ti; Wang, Tzu-Hua; Chiu, Mei-Hung

    2015-01-01

    This research investigates the effectiveness of integrating Interactive Whiteboard (IWB) into the junior high school biology teaching. This research adopts a quasi-experimental design and divides the participating students into the conventional ICT-integrated learning environment and IWB-integrated learning environment. Before teaching, students…

  1. Man-Machine Integration Design and Analysis System (MIDAS) v5: Augmentations, Motivations, and Directions for Aeronautics Applications

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2011-01-01

    As automation and advanced technologies are introduced into transport systems ranging from the Next Generation Air Transportation System termed NextGen, to the advanced surface transportation systems as exemplified by the Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool termed the Man-machine Integration Design and Analysis System (MIDAS) is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper will discuss a range of aviation-specific applications including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. This chapter will culminate by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: that of (1) model transparency and (2) model validation.

  2. Palynology in coal systems analysis-The key to floras, climate, and stratigraphy of coal-forming environments

    USGS Publications Warehouse

    Nichols, D.J.

    2005-01-01

    Palynology can be effectively used in coal systems analysis to understand the nature of ancient coal-forming peat mires. Pollen and spores preserved in coal effectively reveal the floristic composition of mires, which differed substantially through geologic time, and contribute to determination of depositional environment and paleoclimate. Such applications are most effective when integrated with paleobotanical and coal-petrographic data. Examples of previous studies of Miocene, Carboniferous, and Paleogene coal beds illustrate the methods and results. Palynological age determinations and correlations of deposits are also important in coal systems analysis to establish stratigraphic setting. Application to studies of coalbed methane generation shows potential because certain kinds of pollen are associated with gas-prone lithotypes. © 2005 Geological Society of America.

  3. QUEST/Ada (Query Utility Environment for Software Testing) of Ada: The development of a program analysis environment for Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1988-01-01

    A history of the Query Utility Environment for Software Testing (QUEST)/Ada is presented. A fairly comprehensive literature review which is targeted toward issues of Ada testing is given. The definition of the system structure and the high level interfaces are then presented. The design of the three major components is described. The QUEST/Ada IORL System Specifications to this point in time are included in the Appendix. A paper is also included in the appendix which gives statistical evidence of the validity of the test case generation approach which is being integrated into QUEST/Ada.

  4. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  5. Is inequality harmful for the environment? An empirical analysis applied to developing and transition countries.

    PubMed

    Clement, Matthieu; Meunie, Andre

    2010-01-01

    The object of this article is to examine the relation between social inequalities and pollution. First, we provide a survey demonstrating that, from a theoretical point of view, a decrease in inequality has an uncertain impact on the environment. Second, on the basis of these conceptual considerations, we propose an econometric analysis based on panel data (fixed-effects and dynamic panel data models) covering developing and transition countries for the 1988-2003 period. We examine specifically the effect of inequality on the extent of local pollution (sulphur dioxide emissions and organic water pollution) by integrating the Gini index into the formulation of the environmental Kuznets curve.
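
    For orientation, a generic form of such a specification (the exact controls and estimator used in the article may differ) is

        \ln E_{it} = \alpha_i + \beta_1 \ln y_{it} + \beta_2 (\ln y_{it})^2 + \beta_3\,\mathrm{Gini}_{it} + \gamma' X_{it} + \varepsilon_{it}

    where E_{it} is per-capita pollution in country i and year t, y_{it} is per-capita income, \alpha_i is a country fixed effect, and X_{it} collects other controls; an inverted-U income-pollution profile corresponds to \beta_1 > 0 and \beta_2 < 0.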

  6. Thou Shalt Be Reproducible! A Technology Perspective

    PubMed Central

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  7. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting residual strength of fuselage shell-type structures; and the development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  8. The Integrality of Situated Caring in Nursing and the Environment

    PubMed Central

    Jarrin, Olga F.

    2012-01-01

    Much emphasis has been placed on the importance of the environment as a determinant of health; however, little theoretical work in nursing has specifically articulated the importance of the nursing practice environment as a factor in patient outcomes. This work advances the unitary-transformative-caring paradigm by focusing on the concept of integrality and exploring the nursing meta-paradigm concepts (nursing, environment, human being, and health) through integral philosophical inquiry. PMID:22222236

  9. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370

  10. Extensive Radiation Shielding Analysis for Different Spacecraft Orbits

    NASA Astrophysics Data System (ADS)

    Çay, Yiǧit; Kaymaz, Zerefsan

    2016-07-01

    The radiation environment around Earth poses a great danger to spacecraft and can cause premature de-orbiting or loss of the spacecraft in the near-Earth space environment. In this study, a student project has been designed to build a CubeSat, PolarBeeSail (PBS), with an orbit having an inclination of 80°, a perigee of 4 Re, and an apogee of 20 Re to study the polar magnetospheric environment. Extensive radiation dose analyses were carried out for the PBS orbit, and integral and differential fluxes were calculated using SPENVIS tools. A shielding analysis was performed and an optimum aluminum thickness of 3 mm was obtained. These results for PBS were then compared with those for other orbits at different altitudes, both polar and equatorial. For this purpose, the orbital characteristics of POES-19 and GOES-15 were used. The resulting proton flux analyses, TID analyses, and further shielding studies were conducted; comparisons and recommendations were made for the future design of spacecraft that will use these environments.

  11. 48 CFR 952.223-71 - Integration of environment, safety, and health into work planning and execution.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 952.223-71 - Integration of environment, safety, and health into work planning and execution (Federal Acquisition Regulations System, Provisions and Clauses; 2010-10-01)...

  12. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis FuncNet.

  13. Injection Locking Techniques for Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-01

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.
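
    For background, the locking behavior underlying the oscillator-array method is commonly described by Adler's analysis; for a weak injected signal, the single-sided lock range of an LC oscillator is approximately

        \Delta\omega_L \approx \frac{\omega_0}{2Q}\,\frac{I_{\mathrm{inj}}}{I_{\mathrm{osc}}}

    where \omega_0 is the free-running frequency, Q is the tank quality factor, and I_inj/I_osc is the ratio of injected to oscillator current; an input tone is then registered by whichever oscillator in the array it pulls into lock. This relation is offered as orientation and is not taken from the paper itself.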

  14. Electricity prices in a competitive environment: Marginal cost pricing of generation services and financial status of electric utilities. A preliminary analysis through 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-08-01

    The emergence of competitive markets for electricity generation services is changing the way that electricity is and will be priced in the United States. This report presents the results of an analysis that focuses on two questions: (1) How are prices for competitive generation services likely to differ from regulated prices if competitive prices are based on marginal costs rather than regulated "cost-of-service" pricing? (2) What impacts will the competitive pricing of generation services (based on marginal costs) have on electricity consumption patterns, production costs, and the financial integrity of electricity suppliers? This study is not intended to be a cost-benefit analysis of wholesale or retail competition, nor does this report include an analysis of the macroeconomic impacts of competitive electricity prices.

  15. Integrating biophysical and socioeconomic information for prioritizing watersheds in a Kashmir Himalayan lake: a remote sensing and GIS approach.

    PubMed

    Badar, Bazigha; Romshoo, Shakil A; Khan, M A

    2013-08-01

    Dal Lake, a cradle of Kashmiri civilization, has strong linkage with the socioeconomics of the state of Jammu and Kashmir. During the last few decades, anthropogenic pressures in the Dal Lake catchment have caused environmental deterioration, impairing, inter alia, sustained biotic communities and water quality. The present research was an integrated impact analysis of socioeconomic and biophysical processes at the watershed level on the current status of Dal Lake using multi-sensor and multi-temporal satellite data and simulation modelling, together with field data verification. Thirteen watersheds (designated 'W1-W13') were identified and investigated for land use/land cover change detection, quantification of erosion and sediment loads, and socioeconomic analysis (total population, total households, literacy rate, and economic development status). All data for the respective watersheds were integrated in the GIS environment, and a knowledge-based weighting system built on multi-criteria analysis was adopted for watershed prioritization, informed by careful observation of the field situation. The land use/land cover change detection revealed significant changes, with a uniform trend of decreased vegetation and increased impervious surface cover. Increased erosion and sediment loadings were recorded for the watersheds corresponding to their changing land systems, with bare and agricultural lands being the major contributors. The prioritization analysis revealed that W5 > W2 > W6 > W8 > W1 ranked highest in priority and W13 > W3 > W4 > W11 > W7 under medium priority; W12 > W9 > W10 belonged to the low-priority category. The integration of the biophysical and socioeconomic environment at the watershed level using modern geospatial tools would be of vital importance for the conservation and management strategies of the Dal Lake ecosystem.
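
    The knowledge-based weighting step can be illustrated with a minimal sketch of normalized, weighted multi-criteria scoring. The indicators, values, and weights below are illustrative assumptions, not those used in the study.

        import numpy as np

        # Illustrative per-watershed indicators (rows: W1..W5) -- not the study's data.
        indicators = {
            "erosion":        np.array([42.0, 55.0, 18.0, 30.0, 61.0]),   # t/ha/yr
            "sediment_load":  np.array([1.2, 1.8, 0.4, 0.9, 2.1]),        # relative units
            "impervious_pct": np.array([35.0, 48.0, 12.0, 22.0, 52.0]),
            "population_k":   np.array([14.0, 22.0, 6.0, 9.0, 25.0]),
        }
        # Knowledge-based weights summing to 1 (illustrative).
        weights = {"erosion": 0.35, "sediment_load": 0.25, "impervious_pct": 0.2, "population_k": 0.2}

        def min_max(x):
            # Rescale an indicator to [0, 1] so indicators with different units are comparable.
            return (x - x.min()) / (x.max() - x.min())

        score = sum(weights[k] * min_max(v) for k, v in indicators.items())
        ranking = np.argsort(score)[::-1]        # highest composite score = highest priority
        for rank, idx in enumerate(ranking, 1):
            print(f"priority {rank}: W{idx + 1} (score {score[idx]:.2f})")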

  16. The Rapid Integration and Test Environment: A Process for Achieving Software Test Acceptance

    DTIC Science & Technology

    2010-05-01

    The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office...

  17. 40 CFR 255.31 - Integration with other acts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    40 CFR 255.31 - Integration with other acts (Protection of Environment, Environmental Protection Agency, Solid Wastes, Relationship to Other Programs; 2010-07-01). The Governor shall integrate the...

  18. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874

  19. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.

  20. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
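
    The regression step named in the two records above, in which squat motion is synthesized by Gaussian process regression from a set of independent variables, can be illustrated with scikit-learn. The inputs, output, data, and kernel below are placeholders, not the study's variables or trained model.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Placeholder training set: independent variables (body height m, load kg)
        # mapped to one output of the motion representation (peak knee flexion, deg).
        X = np.array([[1.60, 20], [1.60, 60], [1.75, 20], [1.75, 60], [1.90, 20], [1.90, 60]])
        y = np.array([95.0, 102.0, 98.0, 106.0, 101.0, 110.0])

        kernel = RBF(length_scale=[0.2, 30.0]) + WhiteKernel(noise_level=1.0)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        # Predict the output (with uncertainty) for a new subject/load combination.
        mean, std = gpr.predict(np.array([[1.70, 40]]), return_std=True)
        print(f"predicted peak knee flexion: {mean[0]:.1f} deg (+/- {std[0]:.1f})")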

  1. Affordances of ICT in Science Learning: Implications for an Integrated Pedagogy. Research Report

    ERIC Educational Resources Information Center

    Webb, Mary E.

    2005-01-01

    This paper presents an analysis of how affordances of ICT-rich environments identified from a recent review of the research literature can support students in learning science in schools within a proposed framework for pedagogical practice in science education. Furthermore, other pedagogical and curriculum innovations in science education…

  2. City of Charleston, South Carolina Municipal Forest Resource Analysis

    Treesearch

    E.G. McPherson; J.R. Simpson; P.J. Peper; S.L. Gardner; K.E. Vargas; S.E. Maco; Q. Xiao

    2006-01-01

    Charleston, a charming Southern city appreciated for its rich history and culture, maintains trees as an integral component of the urban infrastructure (Figure 1). Research indicates that healthy trees can lessen impacts associated with the built environment by reducing stormwater runoff, energy consumption, and air pollutants. Trees improve urban life, making...

  3. City of Charlotte, North Carolina Municipal Forest Resource Analysis

    Treesearch

    E.G. McPherson; J.R. Simpson; P.J. Peper; S.L. Gardner; K.E. Vargas; S.E. Maco; Q. Xiao

    2005-01-01

    Charlotte, a vibrant Southern city appreciated for its rich history and cultural wealth, maintains trees as an integral component of the urban infrastructure (Figure 1). Research indicates that healthy trees can lessen impacts associated with the built environment by reducing stormwater runoff, energy consumption, and air pollutants. Trees improve urban life, making...

  4. Integrating 3D Visualization and GIS in Planning Education

    ERIC Educational Resources Information Center

    Yin, Li

    2010-01-01

    Most GIS-related planning practices and education are currently limited to two-dimensional mapping and analysis although 3D GIS is a powerful tool to study the complex urban environment in its full spatial extent. This paper reviews current GIS and 3D visualization uses and development in planning practice and education. Current literature…

  5. Relationship of Class-Size to Classroom Processes, Teacher Satisfaction and Pupil Affect: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Smith, Mary Lee; Glass, Gene V.

    Using data from previously completed research, the authors of this report attempted to examine the relationship between class size and measures of outcomes such as student attitudes and behavior, classroom processes and learning environment, and teacher satisfaction. The authors report that statistical integration of the existing research…

  6. City of Berkeley, California Municipal Tree Resource Analysis

    Treesearch

    S.E. Maco; E.G. McPherson; J.R. Simpson; P.J. Peper; Q. Xiao

    2005-01-01

    Vibrant, renowned for its livability and cultural wealth, the city of Berkeley maintains trees as an integral component of the urban infrastructure. Research indicates that healthy trees can mitigate impacts associated with the built environment by reducing stormwater runoff, energy consumption, and air pollutants. Put simply, trees improve urban life, making Berkeley...

  7. Development, Demonstration, and Analysis of an Integrated Iodine Hall Thruster Feed System

    NASA Technical Reports Server (NTRS)

    Polzin, Kurt A.; Peeples, Steven R.; Burt, Adam O.; Martin, Adam K.; Martinez, Armando; Seixal, Joao F.; Mauro, Stephanie

    2016-01-01

    The design of an in-space iodine-vapor-fed Hall effect thruster propellant management system is described. The solid-iodine propellant tank has unique issues associated with the microgravity environment, requiring a solution where the iodine is maintained in intimate thermal contact with the heated tank walls. The flow control valves required alterations from earlier iterations to survive for extended periods of time in the corrosive iodine-vapor environment. Materials have been selected for the entire feed system that can chemically resist the iodine vapor, with the design now featuring Hastelloy or Inconel for almost all the wetted components. An integrated iodine feed system/Hall thruster demonstration unit was fabricated and tested, with all control being handled by an onboard electronics card specifically designed to operate the feed system. Structural analysis shows that the feed system can survive launch loads after the implementation of some minor reinforcement. Flow modeling, while still requiring significant additional validation, is presented to show its potential in capturing the behavior of components in this low-flow, low-pressure system.

  8. An Update on Improvements to NiCE Support for PROTEUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Andrew; McCaskey, Alexander J.; Billings, Jay Jay

    2015-09-01

    The Department of Energy Office of Nuclear Energy's Nuclear Energy Advanced Modeling and Simulation (NEAMS) program has supported the development of the NEAMS Integrated Computational Environment (NiCE), a modeling and simulation workflow environment that provides services and plugins to facilitate tasks such as code execution, model input construction, visualization, and data analysis. This report details the development of workflows for the reactor core neutronics application, PROTEUS. This advanced neutronics application (primarily developed at Argonne National Laboratory) aims to improve nuclear reactor design and analysis by providing an extensible and massively parallel, finite-element solver for current and advanced reactor fuel neutronics modeling. The integration of PROTEUS-specific tools into NiCE is intended to make the advanced capabilities that PROTEUS provides more accessible to the nuclear energy research and development community. This report will detail the work done to improve existing PROTEUS workflow support in NiCE. We will demonstrate and discuss these improvements, including the development of flexible IO services, an improved interface for input generation, and the addition of advanced Fortran development tools natively in the platform.

  9. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
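
    The central-executive pattern described above, in which performance indices are obtained by launching external discipline codes through system calls, can be sketched as follows. This is a hypothetical Python illustration: the executable names, file formats, and the evaluate() helper are assumptions, and the actual CEM is written in FORTRAN.

        import subprocess

        # Hypothetical discipline analyses keyed by performance index; each entry is the
        # external command the executive would launch and the file it reads back.
        ANALYSES = {
            "weight":  (["./run_weight_estimate"],  "weight.out"),
            "stress":  (["./run_stress_solver"],    "stress.out"),
            "flutter": (["./run_flutter_analysis"], "flutter.out"),
        }

        def evaluate(design_vars, analyses=ANALYSES):
            # Write design variables, launch each discipline code, collect the indices.
            with open("design_vars.in", "w") as f:
                f.writelines(f"{name} {value}\n" for name, value in design_vars.items())
            results = {}
            for index, (command, output_file) in analyses.items():
                subprocess.run(command, check=True)      # analogous to a FORTRAN system call
                with open(output_file) as f:
                    results[index] = float(f.read().strip())
            return results

        # An optimizer would call evaluate() once per candidate design and use the
        # returned performance indices as objectives and constraints.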

  10. Student Engagement: A Principle-Based Concept Analysis.

    PubMed

    Bernard, Jean S

    2015-08-04

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.

  11. Investigation of learning environment for arithmetic word problems by problem posing as sentence integration in Indonesian language

    NASA Astrophysics Data System (ADS)

    Hasanah, N.; Hayashi, Y.; Hirashima, T.

    2017-02-01

    Arithmetic word problems remain one of the most difficult areas of teaching mathematics. Learning by problem posing has been suggested as an effective way to improve students' understanding. However, the practice in the usual classroom is difficult due to the extra time needed for assessing and giving feedback on students' posed problems. To address this issue, we have developed tablet PC software named Monsakun for learning by posing arithmetic word problems based on the Triplet Structure Model. It uses the mechanism of sentence integration, an efficient implementation of problem posing that enables agent assessment of posed problems. The learning environment has been used in actual Japanese elementary school classrooms and its effectiveness has been confirmed in previous research. In this study, ten Indonesian elementary school students living in Japan participated in a learning session of problem posing using Monsakun in the Indonesian language. We analyzed their learning activities and show that the students were able to interact with the structure of simple word problems using this learning environment. The results of the data analysis and questionnaire suggest that the use of Monsakun provides a way of creating an interactive and fun environment for learning by problem posing for Indonesian elementary school students.

  12. Soundscape elaboration from anthrophonic adaptation of community noise

    NASA Astrophysics Data System (ADS)

    Teddy Badai Samodra, FX

    2018-03-01

    In an urban environment, noise has become a critical issue affecting the indoor environment, and a reliable approach is required for evaluating community noise as an anthrophonic factor of that environment. This research investigates the level of noise exposure from different community noise sources and elaborates how that disadvantage can be turned to advantage for soundscape innovation. Integrated building-element design, serving as a protector for noise control and for speech intelligibility compliance, is also carried out using field experiments together with MATLAB programming and modeling. For simulation analysis and building acoustic optimization, sound reduction, speech intelligibility, and reverberation time are the main parameters used to characterize the tropical building model that serves as the case study object. The results show that noise control should be integrated with the other critical issue in an urban environment, thermal control. With a reverberation time of 1.1 s for speech activities and a noise reduction of more than 28.66 dBA at the critical frequency (20 Hz), a speech intelligibility index above the fair rating, 0.45, could be reached. Furthermore, the environmental psychology adaptation identified "Close The Opening" as the best method under high-noise conditions and personal adjustment as the easiest and most adaptable approach.
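
    For orientation, reverberation-time targets such as the 1.1 s figure above are conventionally estimated with Sabine's formula (the study's MATLAB model may use a different estimator):

        T_{60} \approx \frac{0.161\,V}{\sum_i S_i \alpha_i}

    where V is the room volume in m^3, S_i are the surface areas in m^2, and \alpha_i are their absorption coefficients.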

  13. An integrated healthcare system for personalized chronic disease care in home-hospital environments.

    PubMed

    Jeong, Sangjin; Youn, Chan-Hyun; Shim, Eun Bo; Kim, Moonjung; Cho, Young Min; Peng, Limei

    2012-07-01

    Facing the increasing demands and challenges in the area of chronic disease care, various studies have been conducted on healthcare systems that can, whenever and wherever, extract and process patient data. Chronic diseases are long-term conditions that require real-time monitoring, multidimensional quantitative analysis, and classification of patients' diagnostic information. A healthcare system for chronic diseases is characterized as an at-hospital or at-home service according to the targeted environment. Both services basically aim to provide patients with accurate diagnoses by monitoring a variety of physical states with a number of monitoring methods, but there are differences between the home and hospital environments, and these different characteristics should be considered in order to provide more accurate diagnoses, especially for patients with chronic diseases. In this paper, we propose a patient status classification method for effectively identifying and classifying chronic diseases and show the validity of the proposed method. Furthermore, we present a new healthcare system architecture that integrates the at-home and at-hospital environments and discuss the applicability of the architecture using practical target services.

  14. Genome-environment interactions and prospective technology assessment: evolution from pharmacogenomics to nutrigenomics and ecogenomics.

    PubMed

    Ozdemir, Vural; Motulsky, Arno G; Kolker, Eugene; Godard, Béatrice

    2009-02-01

    The relationships between food, nutrition science, and health outcomes have been mapped over the past century. Genomic variation among individuals and populations is a new factor that enriches and challenges our understanding of these complex relationships. Hence, the confluence of nutritional science and genomics-nutrigenomics--was the focus of the OMICS: A Journal of Integrative Biology in December 2008 (Part 1). The 2009 Special Issue (Part 2) concludes the analysis of nutrigenomics research and innovations. Together, these two issues expand the scope and depth of critical scholarship in nutrigenomics, in keeping with an integrated multidisciplinary analysis across the bioscience, omics technology, social, ethical, intellectual property and policy dimensions. Historically, the field of pharmacogenetics provided the first examples of specifically identifiable gene variants predisposing to unexpected responses to drugs since the 1950s. Brewer coined the term ecogenetics in 1971 to broaden the concept of gene-environment interactions from drugs and nutrition to include environmental agents in general. In the mid-1990s, introduction of high-throughput technologies led to the terms pharmacogenomics, nutrigenomics and ecogenomics to describe, respectively, the contribution of genomic variability to differential responses to drugs, food, and environment defined in the broadest sense. The distinctions, if any, between these newer fields (e.g., nutrigenomics) and their predecessors (e.g., nutrigenetics) remain to be delineated. For nutrigenomics, its reliance on genome-wide analyses may lead to detection of new biological mechanisms governing host response to food. Recognizing "genome-environment interactions" as the conceptual thread that connects and runs through pharmacogenomics, nutrigenomics, and ecogenomics may contribute toward anticipatory governance and prospective real-time analysis of these omics fields. Such real-time analysis of omics technologies and innovations is crucial, because it can influence and positively shape them as these approaches develop, and help avoid predictable pitfalls, and thus ensure their effective and ethical application in the laboratory, clinic, and society.

  15. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  16. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
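
    As a small illustration of the downstream network analysis that such toolkit integration enables, the sketch below builds a graph from a toy connectivity matrix with NetworkX and reports two common node measures. The region labels, matrix values, and chosen measures are placeholder choices, not the Connectome File Format API or Connectome Mapper output.

        import networkx as nx
        import numpy as np

        # Toy structural connectivity matrix (e.g., streamline counts among 4 regions).
        labels = ["lh.precentral", "lh.postcentral", "rh.precentral", "rh.postcentral"]
        conn = np.array([[0, 120, 30, 5],
                         [120, 0, 10, 25],
                         [30, 10, 0, 140],
                         [5, 25, 140, 0]])

        G = nx.relabel_nodes(nx.from_numpy_array(conn), dict(enumerate(labels)))

        # Two node measures commonly reported for connectomes.
        strength = dict(G.degree(weight="weight"))
        clustering = nx.clustering(G, weight="weight")
        for node in G:
            print(f"{node}: strength={strength[node]}, weighted clustering={clustering[node]:.3f}")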

  17. Analysis, Mining and Visualization Service at NCSA

    NASA Astrophysics Data System (ADS)

    Wilhelmson, R.; Cox, D.; Welge, M.

    2004-12-01

    NCSA's goal is to create a balanced system that fully supports high-end computing as well as: 1) high-end data management and analysis; 2) visualization of massive, highly complex data collections; 3) large databases; 4) geographically distributed Grid computing; and 5) collaboratories, all based on a secure computational environment and driven with workflow-based services. To this end, NCSA has defined a new technology path that includes the integration and provision of cyberservices in support of data analysis, mining, and visualization. NCSA has begun to develop and apply a data mining system, NCSA Data-to-Knowledge (D2K), in conjunction with both the application and research communities. NCSA D2K will enable the formation of model-based application workflows and visual programming interfaces for rapid data analysis. The Java-based D2K framework, which integrates analytical data mining methods with data management, data transformation, and information visualization tools, will be configurable from the cyberservices (web and grid services, tools, etc.) viewpoint to solve a wide range of important data mining problems. This effort will use modules such as new classification methods for the detection of high-risk geoscience events, as well as existing D2K data management, machine learning, and information visualization modules. A D2K cyberservices interface will be developed to seamlessly connect client applications with remote back-end D2K servers, providing computational resources for data mining and integration with local or remote data stores. This work is being coordinated with SDSC's data and services efforts. The new NCSA Visualization embedded workflow environment (NVIEW) will be integrated with D2K functionality to tightly couple informatics and scientific visualization with the data analysis and management services. Visualization services will access and filter disparate data sources, simplifying tasks such as fusing related data from distinct sources into a coherent visual representation. This approach enables collaboration among geographically dispersed researchers via portals and front-end clients, and the coupling with data management services enables recording associations among datasets and building annotation systems into visualization tools and portals, giving scientists a persistent, shareable, virtual lab notebook. To facilitate provision of these cyberservices to the national community, NCSA will be providing a computational environment for large-scale data assimilation, analysis, mining, and visualization. This will be initially implemented on the new 512-processor shared-memory SGIs recently purchased by NCSA. In addition to standard batch capabilities, NCSA will provide on-demand capabilities for those projects requiring rapid response (e.g., development of severe weather, earthquake events) for decision makers. It will also be used for non-sequential interactive analysis of data sets where it is important to have access to large data volumes over space and time.

  18. Environmental Control of Root System Biology.

    PubMed

    Rellán-Álvarez, Rubén; Lobet, Guillaume; Dinneny, José R

    2016-04-29

    The plant root system traverses one of the most complex environments on earth. Understanding how roots support plant life on land requires knowing how soil properties affect the availability of nutrients and water and how roots manipulate the soil environment to optimize acquisition of these resources. Imaging of roots in soil allows the integrated analysis and modeling of environmental interactions occurring at micro- to macroscales. Advances in phenotyping of root systems are driving innovation in cross-platform-compatible methods for data analysis. Root systems acclimate to the environment through architectural changes that act at the root-type level as well as through tissue-specific changes that affect the metabolic needs of the root and the efficiency of nutrient uptake. A molecular understanding of the signaling mechanisms that guide local and systemic signaling is providing insight into the regulatory logic of environmental responses and has identified points where crosstalk between pathways occurs.

  19. Integrated self-cleaning window assembly for optical transmission in combustion environments

    DOEpatents

    Kass, Michael D [Oak Ridge, TN

    2007-07-24

    An integrated window design for optical transmission in combustion environments is described. The invention consists of an integrated optical window design that prevents and removes the accumulation of carbon-based particulate matter and gaseous hydrocarbons through a combination of heat and catalysis. These windows will enable established optical technologies to be applied to combustion environments and their exhaust systems.

  20. Meteoroids and Orbital Debris: Effects on Spacecraft

    NASA Technical Reports Server (NTRS)

    Belk, Cynthia A.; Robinson, Jennifer H.; Alexander, Margaret B.; Cooke, William J.; Pavelitz, Steven D.

    1997-01-01

    The natural space environment is characterized by many complex and subtle phenomena hostile to spacecraft. The effects of these phenomena impact spacecraft design, development, and operations. Space systems become increasingly susceptible to the space environment as use of composite materials and smaller, faster electronics increases. This trend makes an understanding of the natural space environment essential to accomplish overall mission objectives, especially in the current climate of better/cheaper/faster. Meteoroids are naturally occurring phenomena in the natural space environment. Orbital debris is manmade space litter accumulated in Earth orbit from the exploration of space. Descriptions are presented of orbital debris source, distribution, size, lifetime, and mitigation measures. This primer is one in a series of NASA Reference Publications currently being developed by the Electromagnetics and Aerospace Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center, National Aeronautics and Space Administration.

  1. Performance Analysis on Carrier Phase-Based Tightly-Coupled GPS/BDS/INS Integration in GNSS Degraded and Denied Environments

    PubMed Central

    Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong

    2015-01-01

    The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information; however, it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty-seven-state TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity-fixed double-difference (DD) carrier phase measurements with the INS-predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, and a two-step estimation procedure is applied with an adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability compared to GPS/BDS-only or single-satellite integrated strategies, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with an adaptive covariance matrix in the presence of low-elevation multipath, relative to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that the TC integration system achieves positioning accuracy of a few centimeters, based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons), demonstrating the advantage that the combined GPS/BDS constellation brings to positioning at high cut-off elevations. PMID:25875191

  2. Performance analysis on carrier phase-based tightly-coupled GPS/BDS/INS integration in GNSS degraded and denied environments.

    PubMed

    Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong

    2015-04-14

    The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information; however, it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty-seven-state TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity-fixed double-difference (DD) carrier phase measurements with the INS-predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, and a two-step estimation procedure is applied with an adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability compared to GPS/BDS-only or single-satellite integrated strategies, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with an adaptive covariance matrix in the presence of low-elevation multipath, relative to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that the TC integration system achieves positioning accuracy of a few centimeters, based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons), demonstrating the advantage that the combined GPS/BDS constellation brings to positioning at high cut-off elevations.
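
    The two records above describe fusing ambiguity-fixed double-difference carrier-phase measurements with INS-predicted ranges in an extended Kalman filter. The sketch below shows only the generic EKF predict/update cycle on a drastically reduced state (one-dimensional position and velocity), not the paper's twenty-seven-state tightly-coupled model; the process and measurement noise values are illustrative assumptions.

    ```python
    # Generic EKF predict/update sketch (1-D position/velocity), illustrating the
    # fusion step conceptually; NOT the 27-state TC GPS/BDS/INS filter of the paper.
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (constant velocity)
    Q = np.diag([0.01, 0.01])                    # process noise (assumed)
    H = np.array([[1.0, 0.0]])                   # we "measure" position only
    R = np.array([[0.05**2]])                    # measurement noise (assumed)

    x = np.array([[0.0], [1.0]])                 # state estimate [pos, vel]
    P = np.eye(2)                                # state covariance

    def ekf_step(x, P, z):
        # Predict with the dynamic (INS-like) model.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update with a range-like measurement (linear here, so EKF == KF).
        y = z - H @ x_pred                       # innovation
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

    for epoch, z in enumerate([1.02, 2.05, 2.96, 4.01]):
        x, P = ekf_step(x, P, np.array([[z]]))
        print(f"epoch {epoch}: pos={x[0, 0]:.3f}, vel={x[1, 0]:.3f}")
    ```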

  3. Feasibility analysis on integration of luminous environment measuring and design based on exposure curve calibration

    NASA Astrophysics Data System (ADS)

    Zou, Yuan; Shen, Tianxing

    2013-03-01

    Beyond illumination calculation during architectural and luminous environment design, and to provide a wider variety of photometric data, this paper presents the integration of luminous environment design with the SM light environment measuring system, which comprises a set of experimental devices, including light-information collection and processing modules, and can offer various types of photometric data. We introduce a simulation-based calibration method that mainly involves rebuilding experimental scenes in 3ds Max Design, calibrating this computer-aided design software in a simulated environment under various typical light sources, and fitting the exposure curves of rendered images. The operational sequence and key considerations for the simulated calibration are summarized, and connections between the Mental Ray renderer and the SM light environment measuring system are established. The paper thus provides a useful reference for coordinating luminous environment design with the SM light environment measuring system.
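
    As a rough illustration of the exposure-curve fitting step mentioned above, the sketch below fits a sigmoidal response of pixel value versus log exposure to synthetic data. The functional form and the data are assumptions made for illustration; they are not the SM system's actual calibration model.

    ```python
    # Illustrative exposure-curve fit: pixel value vs. log exposure.
    # The sigmoid model and synthetic data are assumptions, not the SM system's curve.
    import numpy as np
    from scipy.optimize import curve_fit

    def response(log_e, vmax, k, e0):
        """Sigmoidal pixel response as a function of log exposure."""
        return vmax / (1.0 + np.exp(-k * (log_e - e0)))

    # Synthetic "measured" pixel values from rendered images at known exposures.
    log_exposure = np.linspace(-3, 3, 13)
    pixel_value = response(log_exposure, 255.0, 1.4, 0.2) + np.random.normal(0, 3, 13)

    params, _ = curve_fit(response, log_exposure, pixel_value, p0=[255.0, 1.0, 0.0])
    vmax, k, e0 = params
    print(f"fitted: vmax={vmax:.1f}, slope k={k:.2f}, midpoint e0={e0:.2f}")
    ```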

  4. Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II

    2010-01-01

    The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2, used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.

  5. Service-Oriented Security Framework for Remote Medical Services in the Internet of Things Environment

    PubMed Central

    Lee, Jae Dong; Yoon, Tae Sik; Chung, Seung Hyun

    2015-01-01

    Objectives Remote medical services have been expanding globally, and this expansion is steadily increasing. It has had many positive effects, including medical access convenience, timeliness of service, and cost reduction. The speed of research and development in remote medical technology has been gradually accelerating. Therefore, it is expected to expand to enable various high-tech information and communications technology (ICT)-based remote medical services. However, the current state lacks an appropriate security framework that can resolve security issues centered on the Internet of things (IoT) environment that will be utilized significantly in telemedicine. Methods This study developed a medical service-oriented framework for secure remote medical services, possessing flexibility regarding new service and security elements through its service-oriented structure. First, the common architecture of remote medical services is defined. Next, medical-oriented security threats and requirements within the IoT environment are identified. Finally, we propose a "service-oriented security framework for remote medical services" based on previous work and requirements for secure remote medical services in the IoT. Results The proposed framework is a secure framework based on service-oriented cases in the medical environment. A comparative analysis focusing on the security elements (confidentiality, integrity, availability, privacy) was conducted, and the analysis results demonstrate the security of the proposed framework for remote medical services with IoT. Conclusions The proposed framework has a service-oriented structure. It can support dynamic security elements in accordance with demands related to new remote medical services which will be diversely generated in the IoT environment. We anticipate that it will enable secure services to be provided that can guarantee confidentiality, integrity, and availability for all, including patients, non-patients, and medical staff. PMID:26618034

  6. Service-Oriented Security Framework for Remote Medical Services in the Internet of Things Environment.

    PubMed

    Lee, Jae Dong; Yoon, Tae Sik; Chung, Seung Hyun; Cha, Hyo Soung

    2015-10-01

    Remote medical services have been expanding globally, and this expansion is steadily increasing. It has had many positive effects, including medical access convenience, timeliness of service, and cost reduction. The speed of research and development in remote medical technology has been gradually accelerating. Therefore, it is expected to expand to enable various high-tech information and communications technology (ICT)-based remote medical services. However, the current state lacks an appropriate security framework that can resolve security issues centered on the Internet of things (IoT) environment that will be utilized significantly in telemedicine. This study developed a medical service-oriented framework for secure remote medical services, possessing flexibility regarding new service and security elements through its service-oriented structure. First, the common architecture of remote medical services is defined. Next, medical-oriented security threats and requirements within the IoT environment are identified. Finally, we propose a "service-oriented security framework for remote medical services" based on previous work and requirements for secure remote medical services in the IoT. The proposed framework is a secure framework based on service-oriented cases in the medical environment. A comparative analysis focusing on the security elements (confidentiality, integrity, availability, privacy) was conducted, and the analysis results demonstrate the security of the proposed framework for remote medical services with IoT. The proposed framework has a service-oriented structure. It can support dynamic security elements in accordance with demands related to new remote medical services which will be diversely generated in the IoT environment. We anticipate that it will enable secure services to be provided that can guarantee confidentiality, integrity, and availability for all, including patients, non-patients, and medical staff.

  7. Integrating Environmental Education With Beginning Reading and Phonics Instruction

    NASA Astrophysics Data System (ADS)

    Bolsem, Anne M.

    The practice of integrating methods of instruction shows signs of promise as teachers seek new ways to raise student achievement. Although researchers have focused on the efficacy of environment-based education, little research has been done to investigate a framework for integration. The research question that guided this case study was designed to explore the instructional practices teachers employ to integrate environmental education with beginning reading and phonics instruction. The conceptual framework is based on the seminal idea of using the environment as an integrating context, which postulates that the natural world becomes the overarching theme for teaching and learning. Data were obtained through interviews and lesson observations using a purposeful sample of 4 kindergarten and first grade teacher participants. A focus group interview also included the on-site naturalist. Inductive data analysis was used to discover categories and themes. Findings indicated that teachers integrate instruction primarily in the areas of vocabulary and writing after purposeful planning, collaboration, and a deep understanding of broader student goals. Findings also specified that teachers' integration strategies represent a combination of who the teachers are, what they believe, and what action they take in the classroom. Site-specific recommendations for action include ongoing professional development for teachers and support staff, time for collaboration, and review of teacher personal beliefs. Implications for social change include reflection and improvement of practice by teachers at the study site, as well as continued evaluation and discussion about teaching practices and teacher roles leading to more cohesive, enriching integrated instruction rooted in authentic, place-based experiences.

  8. Integration of GIS and Bim for Indoor Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Wu, B.; Zhang, S.

    2016-06-01

    This paper presents an endeavour of integrating GIS (Geographical Information System) and BIM (Building Information Modelling) for indoor geovisual analytics. The merits of the two technologies, GIS and BIM, are first analysed in the context of indoor environments. GIS has well-developed capabilities of spatial analysis such as network analysis, while BIM has advantages for indoor 3D modelling and dynamic simulation. The paper then investigates the important aspects of integrating GIS and BIM. Different data standards and formats such as the IFC (Industry Foundation Classes) and GML (Geography Markup Language) are discussed. Their merits and limitations in data transformation between GIS and BIM are analysed in terms of semantic and geometric information. An optimized approach for data exchange between GIS and BIM datasets is then proposed. After that, a strategy of using BIM for 3D indoor modelling, GIS for spatial analysis, and BIM again for visualization and dynamic simulation of the analysis results is presented. Based on these developments, this paper selects a typical problem, optimized indoor emergency evacuation, to demonstrate the integration of GIS and BIM for indoor geovisual analytics. Block Z of the Hong Kong Polytechnic University is selected as a test site. Detailed indoor and outdoor 3D models of block Z are created using the BIM software Revit. The 3D models are transferred to the GIS software ArcGIS to carry out spatial analysis. Optimized evacuation plans considering dynamic constraints are generated based on network analysis in ArcGIS, assuming there is a fire accident inside the building. The analysis results are then transferred back to the BIM software for visualization and dynamic simulation. The developed methods and results are of significance in facilitating future development of integrated GIS and BIM solutions in various applications.
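
    The network-analysis step of such a GIS/BIM workflow can be illustrated with a small graph: nodes are rooms, corridors, and exits extracted from the building model, edge weights are travel times, and a blocked route near the fire is penalized. This is a generic networkx sketch with hypothetical room identifiers, not the ArcGIS workflow used in the paper.

    ```python
    # Simplified indoor evacuation routing on a hypothetical floor graph;
    # a stand-in for the ArcGIS network analysis described in the record.
    import networkx as nx

    G = nx.Graph()
    # (node_a, node_b, walking time in seconds) -- hypothetical corridors/doors.
    corridors = [("Z401", "corridor_4F", 10), ("Z402", "corridor_4F", 12),
                 ("corridor_4F", "stair_A", 20), ("corridor_4F", "stair_B", 25),
                 ("stair_A", "exit_ground", 60), ("stair_B", "exit_ground", 70)]
    G.add_weighted_edges_from(corridors)

    # Dynamic constraint: a fire near stair A makes that route effectively unusable.
    G["corridor_4F"]["stair_A"]["weight"] = 10_000

    path = nx.shortest_path(G, "Z401", "exit_ground", weight="weight")
    time = nx.shortest_path_length(G, "Z401", "exit_ground", weight="weight")
    print("evacuation route:", " -> ".join(path), f"({time} s)")
    ```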

  9. Performance assessment of human resource by integration of HSE and ergonomics and EFQM management system.

    PubMed

    Sadegh Amalnick, Mohsen; Zarrin, Mansour

    2017-03-13

    Purpose The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resource (HR) with respect to the factors of the health, safety, environment and ergonomics (HSEE) management system, and also the criteria of the European Foundation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS) along with fuzzy data envelopment analysis (FDEA) is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance, as well as their strengths and weaknesses, is identified by conducting a sensitivity analysis on the results. Similarly, a design of experiments is performed to prioritize the factors in order of importance. Findings The results show that the EFQM model has a far greater impact upon the company's performance than the HSEE management system. According to the obtained results, it can be argued that integration of HSEE and EFQM leads to performance improvement in the company. Practical implications In the current study, the required data for executing the proposed framework are collected via valid questionnaires filled in by the staff of an aviation industry located in Tehran, Iran. Originality/value Managing HR performance results in improving usability, maintainability and reliability, and finally in a significant reduction in the commercial aviation accident rate. Also, studying the factors affecting HR performance helps authorities participate in developing systems that help operators better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.

  10. Genomic insights into the Acidobacteria reveal strategies for their success in terrestrial environments

    PubMed Central

    Trojan, Daniela; Roux, Simon; Herbold, Craig; Rattei, Thomas; Woebken, Dagmar

    2018-01-01

    Summary Members of the phylum Acidobacteria are abundant and ubiquitous across soils. We performed a large‐scale comparative genome analysis spanning subdivisions 1, 3, 4, 6, 8 and 23 (n = 24) with the goal of identifying features that help explain their prevalence in soils and of understanding their ecophysiology. Our analysis revealed that bacteriophage integration events along with transposable and mobile elements influenced the structure and plasticity of these genomes. Low‐ and high‐affinity respiratory oxygen reductases were detected in multiple genomes, suggesting the capacity for growing across different oxygen gradients. Among many genomes, the capacity to use a diverse collection of carbohydrates, as well as inorganic and organic nitrogen sources (such as via extracellular peptidases), was detected; both are advantageous traits in environments with fluctuating nutrient availability. We also identified multiple soil acidobacteria with the potential to scavenge atmospheric concentrations of H2, now encompassing mesophilic soil strains within subdivisions 1 and 3, in addition to a previously identified thermophilic strain in subdivision 4. This large‐scale acidobacteria genome analysis reveals traits that provide genomic, physiological and metabolic versatility, presumably allowing flexibility in the challenging and fluctuating soil environment. PMID:29327410

  11. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantifications of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As a major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented in the C-based macro language of the widespread integrated development environment IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
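
    The densitometric quantification that GelQuant performs can be sketched in a few lines: integrate pixel intensities over each lane region of a digitized gel image and compare band intensities above background. The sketch below uses numpy on a synthetic image; it is not GelQuant's IGOR Pro code, and the lane geometry and intensities are assumed.

    ```python
    # Minimal densitometric quantification of a digitized gel image (synthetic data);
    # a conceptual stand-in for GelQuant's IGOR Pro implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    gel = rng.normal(10, 2, size=(200, 300))           # background noise
    gel[80:100, 20:60] += 120                          # band in lane 1
    gel[80:100, 120:160] += 60                         # fainter band in lane 2

    lanes = {"lane1": slice(20, 60), "lane2": slice(120, 160)}
    background = np.median(gel)

    for name, cols in lanes.items():
        profile = gel[:, cols].sum(axis=1)             # intensity profile along the lane
        band = profile - background * (cols.stop - cols.start)
        band[band < 0] = 0
        print(f"{name}: integrated band intensity = {band.sum():.0f}")
    ```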

  12. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    PubMed

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  13. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator

    PubMed Central

    Drewes, Rich; Zou, Quan; Goodman, Philip H.

    2008-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading “glue” tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS. PMID:19506707

  14. Research-Based Monitoring, Prediction, and Analysis Tools of the Spacecraft Charging Environment for Spacecraft Users

    NASA Technical Reports Server (NTRS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila

    2015-01-01

    The Space Weather Research Center (http://swrc. gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.

  15. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  16. Advanced Technology Lifecycle Analysis System (ATLAS)

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts into system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprising missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
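
    A drastically simplified version of the kind of spreadsheet-style rollup ATLAS performs is sketched below: system models expose parametric mass values and a cost estimating relationship, and an integrator accumulates launched mass and cost over a multi-year campaign. The model parameters and cost relationship are invented for illustration and do not come from ATLAS.

    ```python
    # Toy architecture rollup in the spirit of ATLAS: parametric system models plus
    # an integrator that accumulates campaign mass and cost. All numbers are invented.
    def cost_estimate(dry_mass_kg, complexity):
        """Hypothetical parametric cost estimating relationship (arbitrary $M units)."""
        return 0.8 * dry_mass_kg**0.7 * complexity

    systems = {
        "launcher":  {"dry_mass_kg": 25_000, "complexity": 1.0},
        "transfer":  {"dry_mass_kg":  8_000, "complexity": 1.3},
        "excursion": {"dry_mass_kg":  5_000, "complexity": 1.6},
    }
    campaign = {2026: ["launcher", "transfer"], 2027: ["launcher", "excursion"]}

    total_mass, total_cost = 0.0, 0.0
    for year, flown in campaign.items():
        year_mass = sum(systems[s]["dry_mass_kg"] for s in flown)
        year_cost = sum(cost_estimate(**systems[s]) for s in flown)
        total_mass += year_mass
        total_cost += year_cost
        print(f"{year}: launched mass {year_mass:,} kg, cost {year_cost:,.0f} $M")
    print(f"campaign totals: {total_mass:,.0f} kg, {total_cost:,.0f} $M")
    ```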

  17. Towards systems genetic analyses in barley: Integration of phenotypic, expression and genotype data into GeneNetwork

    PubMed Central

    Druka, Arnis; Druka, Ilze; Centeno, Arthur G; Li, Hongqiang; Sun, Zhaohui; Thomas, William TB; Bonar, Nicola; Steffenson, Brian J; Ullrich, Steven E; Kleinhofs, Andris; Wise, Roger P; Close, Timothy J; Potokina, Elena; Luo, Zewei; Wagner, Carola; Schweizer, Günther F; Marshall, David F; Kearsey, Michael J; Williams, Robert W; Waugh, Robbie

    2008-01-01

    Background A typical genetical genomics experiment results in four separate data sets: genotype, gene expression, higher-order phenotypic data and metadata that describe the protocols, processing and the array platform. Used in concert, these data sets provide the opportunity to perform genetic analysis at a systems level. Their predictive power is largely determined by the gene expression dataset where tens of millions of data points can be generated using currently available mRNA profiling technologies. Such large, multidimensional data sets often have value beyond that extracted during their initial analysis and interpretation, particularly if conducted on widely distributed reference genetic materials. Besides quality and scale, access to the data is of primary importance as accessibility potentially allows the extraction of considerable added value from the same primary dataset by the wider research community. Although the number of genetical genomics experiments in different plant species is rapidly increasing, none to date has been presented in a form that allows quick and efficient on-line testing for possible associations between genes, loci and traits of interest by an entire research community. Description Using a reference population of 150 recombinant doubled haploid barley lines we generated novel phenotypic, mRNA abundance and SNP-based genotyping data sets, added them to a considerable volume of legacy trait data and entered them into GeneNetwork. GeneNetwork is a unified on-line analytical environment that enables the user to test genetic hypotheses about how component traits, such as mRNA abundance, may interact to condition more complex biological phenotypes (higher-order traits). Here we describe these barley data sets and demonstrate some of the functionalities GeneNetwork provides as an easily accessible and integrated analytical environment for exploring them. Conclusion By integrating barley genotypic, phenotypic and mRNA abundance data sets directly within GeneNetwork's analytical environment we provide simple web access to the data for the research community. In this environment, a combination of correlation analysis and linkage mapping provides the potential to identify and substantiate gene targets for saturation mapping and positional cloning. By integrating datasets from an unsequenced crop plant (barley) in a database that has been designed for an animal model species (mouse) with a well established genome sequence, we prove the importance of the concept and practice of modular development and interoperability of software engineering for biological data sets. PMID:19017390
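
    The kind of correlation analysis that GeneNetwork exposes can be illustrated with a toy example: correlate the mRNA abundance of one transcript across recombinant lines with a higher-order trait measured on the same lines. The arrays below are synthetic; this is not GeneNetwork code.

    ```python
    # Toy correlation between transcript abundance and a higher-order trait across
    # recombinant lines, illustrating the kind of analysis GeneNetwork supports.
    import numpy as np

    rng = np.random.default_rng(1)
    n_lines = 150                                        # e.g. doubled-haploid lines
    mrna = rng.normal(8.0, 1.0, n_lines)                 # log2 expression of one transcript
    trait = 0.6 * mrna + rng.normal(0.0, 1.0, n_lines)   # synthetic correlated trait

    r = np.corrcoef(mrna, trait)[0, 1]
    print(f"Pearson r between transcript abundance and trait: {r:.2f}")
    ```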

  18. Evolution of Integrated Causal Structures in Animats Exposed to Environments of Increasing Complexity

    PubMed Central

    Albantakis, Larissa; Hintze, Arend; Koch, Christof; Adami, Christoph; Tononi, Giulio

    2014-01-01

    Natural selection favors the evolution of brains that can capture fitness-relevant features of the environment's causal structure. We investigated the evolution of small, adaptive logic-gate networks (“animats”) in task environments where falling blocks of different sizes have to be caught or avoided in a ‘Tetris-like’ game. Solving these tasks requires the integration of sensor inputs and memory. Evolved networks were evaluated using measures of information integration, including the number of evolved concepts and the total amount of integrated conceptual information. The results show that, over the course of the animats' adaptation, i) the number of concepts grows; ii) integrated conceptual information increases; iii) this increase depends on the complexity of the environment, especially on the requirement for sequential memory. These results suggest that the need to capture the causal structure of a rich environment, given limited sensors and internal mechanisms, is an important driving force for organisms to develop highly integrated networks (“brains”) with many concepts, leading to an increase in their internal complexity. PMID:25521484

  19. Model-based analysis of pattern motion processing in mouse primary visual cortex

    PubMed Central

    Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.

    2015-01-01

    Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. PMID:26300738

  20. Content Integration as a Factor in Math-Game Effectiveness

    ERIC Educational Resources Information Center

    Vandercruysse, Sylke; ter Vrugte, Judith; de Jong, Ton; Wouters, Pieter; van Oostendorp, Herre; Verschaffel, Lieven; Elen, Jan

    2017-01-01

    In this study we focus on the integration of mathematical learning content (i.e., proportional reasoning) in game-based learning environments (GBLE). More specifically, two kinds of GBLEs are set up: an extrinsically integrated GBLE and an intrinsically integrated GBLE. In the former environment, the mathematical content is not part of the core…

  1. A Model for Integrating New Technologies into Pre-Service Teacher Training Programs Ajman University (A Case Study)

    ERIC Educational Resources Information Center

    Shaqour, Ali Zuhdi H.

    2005-01-01

    This study introduces a "Technology Integration Model" for a learning environment utilizing constructivist learning principles and integrating new technologies namely computers and the Internet into pre-service teacher training programs. The technology integrated programs and learning environments may assist learners to gain experiences…

  2. Ecosystem services and livelihoods in deltaic environments

    NASA Astrophysics Data System (ADS)

    Nicholls, R. J.; Rahman, M. M.; Salehin, M.; Hutton, C.

    2015-12-01

    While overall, deltas account for only 1% of global land area, they are home to more than a half billion people or ca. 7% of the world's population. In many deltas, livelihoods and food security are strongly dependent on ecosystem services, which in turn are affected by various environmental change factors, including climate variability and change, modifications to upstream river, sediment and nutrient fluxes, evolving nearshore ecosystems, and delta-level change factors such as subsidence, changing land use and management interventions such as polders. Key limits include scarcity of fresh water, saline water intrusion and the impacts of extreme events (e.g. river floods, cyclones and storm surges), which constrain land use choices and livelihood opportunities for the deltaic populations. The ESPA Deltas project takes a systemic perspective of the interaction between the coupled bio-physical environment and the livelihoods of rural delta residents. The methods emphasise poverty reduction and use coastal Bangladesh as an example. This includes a set of consistent biophysical analyses of the delta and the upstream catchments and the downstream Bay of Bengal, as well as governance and policy analysis and socio-demographic analysis, including an innovative household survey on ecosystem utilization. These results are encapsulated in an integrated model that analyses ecosystem services and livelihood implications. This integrated approach is designed to support delta-level policy formulation. It allows the exploration of contrasting development trajectories, including issues such as robustness of different governance options on ecosystem services and livelihoods. The method is strongly participatory including an ongoing series of stakeholder workshops addressing issue identification, scenario development and consideration of policy responses. The methods presented are generic and transferable to other deltas. The paper will consider the overall ESPA Deltas project and its wider lessons for integrated assessment of deltaic environments.

  3. Vendor-buyer inventory models with trade credit financing under both non-cooperative and integrated environments

    NASA Astrophysics Data System (ADS)

    Teng, Jinn-Tsair; Chang, Chun-Tao; Chern, Maw-Sheng

    2012-11-01

    Most researchers have studied vendor-buyer supply chain inventory policies only from the perspective of an integrated model, which provides the best cooperative solution. However, in reality, not many vendors and buyers are wholly integrated. Hence, it is necessary to study the optimal policies not only under an integrated environment but also under a non-cooperative environment. In this article, we develop a supply chain vendor-buyer inventory model with trade credit financing linked to order quantity. We then study the optimal policies for both the vendor and the buyer, first under a non-cooperative environment and then under a cooperative integrated situation. Further, we provide some numerical examples to illustrate the theoretical results, compare the differences between these two distinct solutions, and obtain some managerial insights. For example, in a cooperative environment, to reduce the total cost for both parties, the vendor should either provide a simple permissible delay without order quantity restriction or offer a long permissible delay linked to order quantity. By contrast, in a non-cooperative environment, the vendor should provide a short permissible delay to reduce its total cost.
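
    The effect of a permissible delay in payments can be illustrated with a highly simplified buyer cost model: ordering cost plus holding cost, plus interest charged on stock still unpaid after the credit period, minus interest earned on revenue banked before the credit period expires, minimized by a grid search over the order quantity. The cost function and all numbers below are illustrative assumptions, not the vendor-buyer model developed in the article.

    ```python
    # Simplified buyer cost under a permissible payment delay M (grid search over Q).
    # Illustrative assumptions only; this is not the article's vendor-buyer model.
    import numpy as np

    D, A, h = 1200.0, 50.0, 2.0      # demand/yr, ordering cost, holding cost/unit/yr
    c, p = 8.0, 10.0                 # unit purchase cost and selling price
    Ic, Ie = 0.12, 0.06              # interest charged / earned per $ per year
    M = 60 / 365                     # permissible delay in payments (years)

    def annual_cost(Q):
        """Buyer's annual cost when the cycle Q/D is at least as long as M."""
        ordering = A * D / Q
        holding = h * Q / 2.0
        interest_charged = Ic * c * (Q - D * M) ** 2 / (2.0 * Q)   # unpaid stock after M
        interest_earned = Ie * p * (D * M) ** 2 / (2.0 * Q)        # revenue banked before M
        return ordering + holding + interest_charged - interest_earned

    Q_grid = np.linspace(D * M, 800, 1000)    # restrict to cycles no shorter than M
    costs = np.array([annual_cost(Q) for Q in Q_grid])
    print(f"best Q ~ {Q_grid[costs.argmin()]:.0f} units, "
          f"annual cost ~ {costs.min():.1f}")
    ```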

  4. Using animal models of enriched environments to inform research on sensory integration intervention for the rehabilitation of neurodevelopmental disorders.

    PubMed

    Reynolds, Stacey; Lane, Shelly J; Richards, Lorie

    2010-09-01

    The field of behavioral neuroscience has been successful in using an animal model of enriched environments for over five decades to measure the rehabilitative and preventative effects of sensory, cognitive and motor stimulation in animal models. Several key principles of enriched environments match those used in sensory integration therapy, a treatment used for children with neurodevelopmental disorders. This paper reviews the paradigm of environmental enrichment, compares animal models of enriched environments to principles of sensory integration treatment, and discusses applications for the rehabilitation of neurodevelopmental disorders. Based on this review, the essential features in the enriched environment paradigm which should be included in sensory integration treatment are multiple sensory experiences, novelty in the environment, and active engagement in challenging cognitive, sensory, and motor tasks. Use of sensory integration treatment may be most applicable for children with anxiety, hypersensitivity, repetitive behaviors or heightened levels of stress. Additionally, individuals with deficits in social behavior, social participation, or impairments in learning and memory may show gains with this type of treatment.

  5. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

    From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services, as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization, coupled with technologies from others in the community, are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded, large multi-institutional Information Technology Research effort. The goal of LEAD is to create an integrated and scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. To that end, LEAD will create a series of interconnected, heterogeneous Grid environments to provide a complete framework for mesoscale research, including a set of integrated Grid and Web Services. This talk will focus on the transition from today's end-to-end systems into the types of systems that the LEAD project envisions and the multidisciplinary research problems they will enable.
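
    Much of the data flow described above ultimately lands in netCDF files. The sketch below shows how a downstream analysis script might open one with the netCDF4 Python library and pull a gridded field; the file name and variable names are hypothetical, and the script assumes a 2-D field with 1-D coordinate variables.

    ```python
    # Reading a gridded field from a (hypothetical) netCDF file delivered via an
    # IDD/LDM-style data flow; file and variable names are assumptions for illustration.
    from netCDF4 import Dataset
    import numpy as np

    with Dataset("wrf_forecast_2km.nc") as ds:           # hypothetical file name
        print("variables:", list(ds.variables))
        precip = ds.variables["precip_rate"][:]          # hypothetical 2-D variable
        lat = ds.variables["lat"][:]                     # hypothetical 1-D coordinates
        lon = ds.variables["lon"][:]

    # Quick look at the domain maximum, e.g. to flag potentially severe precipitation.
    i, j = np.unravel_index(np.argmax(precip), precip.shape)
    print(f"max precip {precip[i, j]:.2f} at lat={lat[i]:.2f}, lon={lon[j]:.2f}")
    ```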

  6. Agricultural land management options after the Chernobyl and Fukushima accidents: The articulation of science, technology, and society.

    PubMed

    Vandenhove, Hildegarde; Turcanu, Catrinel

    2016-10-01

    The options adopted for recovery of agricultural land after the Chernobyl and Fukushima accidents are compared by examining their technical and socio-economic aspects. The analysis highlights commonalities such as the implementation of tillage and other types of countermeasures and differences in approach, such as preferences for topsoil removal in Fukushima and the application of K fertilizers in Chernobyl. This analysis shows that the recovery approach needs to be context-specific to best suit the physical, social, and political environment. The complex nature of the decision problem calls for a formal process for engaging stakeholders and the development of adequate decision support tools. Integr Environ Assess Manag 2016;12:662-666. © 2016 SETAC.

  7. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
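
    The reported spread ("variation about the mean values of between 26% and 42%") is essentially a relative range across laboratories. The snippet below shows that computation, together with a coefficient of variation, on hypothetical sensitivity results; the values are invented and not IDCA data.

    ```python
    # Relative spread of a sensitivity result across participating labs
    # (hypothetical impact drop-height values, arbitrary units).
    import numpy as np

    drop_height = np.array([17.0, 19.5, 22.0, 15.5, 20.0, 18.5])     # one lab per entry
    mean = drop_height.mean()
    rel_range = (drop_height.max() - drop_height.min()) / mean       # range about the mean
    cv = drop_height.std(ddof=1) / mean                              # coefficient of variation

    print(f"mean = {mean:.1f}, relative range = {rel_range:.0%}, CV = {cv:.0%}")
    ```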

  8. The Impact of the Physical Environment on the Social Integration of Individuals with Disabilities in Community

    ERIC Educational Resources Information Center

    Christensen, Keith M.

    2010-01-01

    Social integration in the community is especially important for the well-being of individuals with disabilities. Although individuals with disabilities reside within the community's physical environment, they are often marginalized in the social environment. This may be the result of individuals with disabilities residing in physical environments that…

  9. Pre-Service English Teachers in Blended Learning Environment in Respect to Their Learning Approaches

    ERIC Educational Resources Information Center

    Yilmaz, M. Betul; Orhan, Feza

    2010-01-01

    Blended learning environments (BLEs) are increasingly used around the world, especially in university degree programmes, and are based on integrating web-based learning and face-to-face (FTF) learning environments. Besides integrating different learning environments, a BLE also addresses students with different learning approaches. The "learning…

  10. Fault tree safety analysis of a large Li/SOCl(sub)2 spacecraft battery

    NASA Technical Reports Server (NTRS)

    Uy, O. Manuel; Maurer, R. H.

    1987-01-01

    The results of the safety fault tree analysis on the eight module, 576 F cell Li/SOCl2 battery on the spacecraft and in the integration and test environment prior to launch on the ground are presented. The analysis showed that with the right combination of blocking diodes, electrical fuses, thermal fuses, thermal switches, cell balance, cell vents, and battery module vents the probability of a single cell or a 72 cell module exploding can be reduced to .000001, essentially the probability due to explosion for unexplained reasons.
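
    The arithmetic behind such a fault tree reduction is simple: for independent basic events, an AND gate multiplies probabilities, and an OR gate combines them as 1 minus the product of the complements. The sketch below evaluates a tiny, made-up tree; the gate structure and event probabilities are illustrative only and are not taken from the Li/SOCl2 battery analysis.

    ```python
    # Toy fault-tree evaluation with independent basic events.
    # Gate structure and probabilities are invented for illustration only.
    from functools import reduce

    def p_and(*probs):
        """AND gate: all events must occur."""
        return reduce(lambda a, b: a * b, probs, 1.0)

    def p_or(*probs):
        """OR gate: at least one event occurs (independent events)."""
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # Hypothetical tree: a cell explosion requires an overcharge path (either
    # initiator) AND failure of the protection chain (fuse AND thermal switch).
    p_diode_fails = 1e-3
    p_external_short = 5e-4
    p_overcharge_path = p_or(p_diode_fails, p_external_short)
    p_protection_fails = p_and(1e-3, 1e-2)          # fuse fails AND switch fails
    p_cell_explodes = p_and(p_overcharge_path, p_protection_fails)

    print(f"P(protection fails) = {p_protection_fails:.1e}")
    print(f"P(cell explodes)    = {p_cell_explodes:.1e}")
    ```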

  11. Evolution of a Mediterranean coastal zone: human impacts on the marine environment of Cape Creus.

    PubMed

    Lloret, Josep; Riera, Victòria

    2008-12-01

    This study presents an integrated analysis of the evolution of the marine environment and the human uses in Cape Creus, a Mediterranean coastal area where intense commercial fisheries and recreational uses have coexisted over the last fifty years. The investigation synthesizes the documented impacts of human activities on the marine environment of Cap de Creus and integrates them with new data. In particular, the evolution of vulnerable, exploited species is used to evaluate the fishing impacts. The effects of area protection through the establishment of a marine reserve in the late 1990s and the potential climate change impacts are also considered. The evolution of the human uses is marked by the increasing socioeconomic importance of recreational activities (which affect species and habitats) to the detriment of artisanal and red coral fisheries (which principally have impacts at the species level). Overall, populations of sedentary, vulnerable exploited species, hard sessile benthic invertebrates, and ecologically fragile habitats, such as seagrass meadows and the coralligenous and infralittoral algal assemblages, have been the most negatively impacted by anthropogenic activities. Although human uses currently constitute the largest negative impact on the marine environment of Cap de Creus, climate change is emerging as a key factor that could have considerable implications for the marine environment and tourism activities. The establishment of the marine reserve appears to have had little socioeconomic impact, but there is some evidence that it had some positive biological effects on sedentary, littoral fishes. Results demonstrate that the declaration of a marine reserve alone does not guarantee the sustainability of marine resources and habitats; it should be accompanied by an integrated coastal management plan.

  12. Evolution of a Mediterranean Coastal Zone: Human Impacts on the Marine Environment of Cape Creus

    NASA Astrophysics Data System (ADS)

    Lloret, Josep; Riera, Victòria

    2008-12-01

    This study presents an integrated analysis of the evolution of the marine environment and the human uses in Cape Creus, a Mediterranean coastal area where intense commercial fisheries and recreational uses have coexisted over the last fifty years. The investigation synthesizes the documented impacts of human activities on the marine environment of Cap de Creus and integrates them with new data. In particular, the evolution of vulnerable, exploited species is used to evaluate the fishing impacts. The effects of area protection through the establishment of a marine reserve in the late 1990s and the potential climate change impacts are also considered. The evolution of the human uses is marked by the increasing socioeconomic importance of recreational activities (which affect species and habitats) to the detriment of artisanal and red coral fisheries (which principally have impacts at the species level). Overall, populations of sedentary, vulnerable exploited species, hard sessile benthic invertebrates, and ecologically fragile habitats, such as seagrass meadows and the coralligenous and infralittoral algal assemblages, have been the most negatively impacted by anthropogenic activities. Although human uses currently constitute the largest negative impact on the marine environment of Cap de Creus, climate change is emerging as a key factor that could have considerable implications for the marine environment and tourism activities. The establishment of the marine reserve appears to have had little socioeconomic impact, but there is some evidence that it had some positive biological effects on sedentary, littoral fishes. Results demonstrate that the declaration of a marine reserve alone does not guarantee the sustainability of marine resources and habitats; it should be accompanied by an integrated coastal management plan.

  13. Supporting secure programming in web applications through interactive static analysis.

    PubMed

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, which integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions about how programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  14. Supporting secure programming in web applications through interactive static analysis

    PubMed Central

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

    Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities, but their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the integrated development environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are any assumptions made about how programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced by knowledgeable developers who fail to practice secure programming. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. A technical evaluation of the prototype detected multiple zero-day vulnerabilities in a large open-source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513

  15. Numerical methods for engine-airframe integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

    Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of second-generation low-order panel methods to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic, supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.

  16. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC

    PubMed Central

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V.; Petway, Joy R.

    2017-01-01

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model for the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and concentrations of ON-N, NH3-N and NO3-N. Results indicate that the integrated FME-GLUE-based model, with good Nash–Sutcliffe coefficients (0.53–0.69) and correlation coefficients (0.76–0.83), successfully simulates the concentrations of ON-N, NH3-N and NO3-N. Moreover, the Arrhenius constant was the only parameter to which the ON-N and NH3-N simulations were sensitive. However, the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were influential parameters for the ON-N and NO3-N simulations, as measured by global sensitivity analysis. PMID:28704958
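
    The Nash–Sutcliffe coefficient cited above is a standard goodness-of-fit measure; its usual definition (not restated in the abstract) compares simulated concentrations S_t with observations O_t:

        NSE = 1 - \frac{\sum_{t} (O_t - S_t)^2}{\sum_{t} (O_t - \bar{O})^2}

    Values approach 1 for a perfect fit and drop below 0 when the model predicts worse than the observed mean, so the reported 0.53–0.69 indicates that the simulations clearly outperform the observed-mean baseline.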

  17. Integrating health and environmental impact analysis.

    PubMed

    Reis, S; Morris, G; Fleming, L E; Beck, S; Taylor, T; White, M; Depledge, M H; Steinle, S; Sabel, C E; Cowie, H; Hurley, F; Dick, J McP; Smith, R I; Austen, M

    2015-10-01

    Scientific investigations have progressively refined our understanding of the influence of the environment on human health, and the many adverse impacts that human activities exert on the environment, from the local to the planetary level. Nonetheless, throughout the modern public health era, health has been pursued as though our lives and lifestyles are disconnected from ecosystems and their component organisms. The inadequacy of the societal and public health response to obesity, health inequities, and especially global environmental and climate change now calls for an ecological approach which addresses human activity in all its social, economic and cultural complexity. The new approach must be integral to, and interactive with, the natural environment. We see the continuing failure to truly integrate human health and environmental impact analysis as deeply damaging, and we propose a new conceptual model, the ecosystems-enriched Drivers, Pressures, State, Exposure, Effects, Actions or 'eDPSEEA' model, to address this shortcoming. The model recognizes convergence between the concept of ecosystem services, which provides a human health and well-being slant to the value of ecosystems while equally emphasizing the health of the environment, and the growing calls for 'ecological public health' as a response to global environmental concerns now suffusing the discourse in public health. More revolution than evolution, ecological public health will demand new perspectives regarding the interconnections among society, the economy, the environment and our health and well-being. Success must be built on collaborations between the disparate scientific communities of the environmental sciences and public health as well as interactions with social scientists, economists and the legal profession. It will require outreach to political and other stakeholders including a currently largely disengaged general public. The need for an effective and robust science-policy interface has never been more pressing. Conceptual models can facilitate this by providing theoretical frameworks and supporting simplifications of the stakeholder engagement process for inherently complex situations involving environment and human health and well-being. They can be tools to think with, to engage, to communicate and to help navigate in a sea of complexity. We believe models such as eDPSEEA can help frame many of the issues which have become the challenges of the new public health era and can provide the essential platforms necessary for progress. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  18. 40 CFR 147.3107 - Mechanical integrity.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Title 40, Protection of Environment; Environmental Protection Agency (continued), Water Programs... Oklahoma Indian Tribes, § 147.3107 Mechanical integrity. (a) Monitoring of annulus pressure conducted...

  19. An Integrated Computer Modeling Environment for Regional Land Use, Air Quality, and Transportation Planning

    DOT National Transportation Integrated Search

    1997-04-01

    The Land Use, Air Quality, and Transportation Integrated Modeling Environment (LATIME) represents an integrated approach to computer modeling and simulation of land use allocation, travel demand, and mobile source emissions for the Albuquerque, New M...

  20. Evaluating the Usability of Authoring Environments for Serious Games.

    PubMed

    Slootmaker, Aad; Hummel, Hans; Koper, Rob

    2017-08-01

    Background. The EMERGO method and online platform enable the development and delivery of scenario-based serious games that foster students to acquire professional competence. One of the main goals of the platform is to provide a user-friendly authoring environment for creating virtual environments where students can perform authentic tasks. Aim. We present the findings of an in-depth qualitative case study of the platform's authoring environment and compare our findings on usability with those found for comparable environments in literature. Method. We carried out semi-structured interviews, with two experienced game developers who have authored a game for higher education, and a literature review of comparable environments. Findings. The analysis shows that the usability of the authoring environment is problematic, especially regarding understandability and learnability, which is in line with findings of comparable environments. Other findings are that authoring is well integrated with the EMERGO method and that functionality and reliability of the authoring environment are valued. Practical implications. The lessons learned are presented in the form of general guidelines to improve the understandability and learnability of authoring environments for serious games.

  1. Evaluating the Usability of Authoring Environments for Serious Games

    PubMed Central

    Slootmaker, Aad; Hummel, Hans; Koper, Rob

    2017-01-01

    Background. The EMERGO method and online platform enable the development and delivery of scenario-based serious games that foster students to acquire professional competence. One of the main goals of the platform is to provide a user-friendly authoring environment for creating virtual environments where students can perform authentic tasks. Aim. We present the findings of an in-depth qualitative case study of the platform’s authoring environment and compare our findings on usability with those found for comparable environments in literature. Method. We carried out semi-structured interviews, with two experienced game developers who have authored a game for higher education, and a literature review of comparable environments. Findings. The analysis shows that the usability of the authoring environment is problematic, especially regarding understandability and learnability, which is in line with findings of comparable environments. Other findings are that authoring is well integrated with the EMERGO method and that functionality and reliability of the authoring environment are valued. Practical implications. The lessons learned are presented in the form of general guidelines to improve the understandability and learnability of authoring environments for serious games. PMID:29081638

  2. Training and Labour Market Integration of Education Science Graduates

    ERIC Educational Resources Information Center

    Vivas, Amparo Jimenez

    2009-01-01

    The aim of the research behind this article is to identify the relationships that must exist between university training and the social and occupational environment. One of the many functions that derive from the university-society relationship is to train students to carry out certain professions. As a result, the analysis of the labour market…

  3. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  4. A Digital Mixed Methods Research Design: Integrating Multimodal Analysis with Data Mining and Information Visualization for Big Data Analytics

    ERIC Educational Resources Information Center

    O'Halloran, Kay L.; Tan, Sabine; Pham, Duc-Son; Bateman, John; Vande Moere, Andrew

    2018-01-01

    This article demonstrates how a digital environment offers new opportunities for transforming qualitative data into quantitative data in order to use data mining and information visualization for mixed methods research. The digital approach to mixed methods research is illustrated by a framework which combines qualitative methods of multimodal…

  5. Data-Intensive Scientific Management, Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Goranova, Mariana; Shishedjiev, Bogdan; Georgieva, Juliana

    2012-11-01

    The proposed integrated system provides a suite of services for data-intensive sciences that enables scientists to describe, manage, analyze and visualize data from experiments and numerical simulations in a distributed and heterogeneous environment. This paper describes the advisor and the converter services and presents an example from the monitoring of the slant column content of atmospheric minor gases.

  6. Range of sound levels in the outdoor environment

    Treesearch

    Lewis S. Goodfriend

    1977-01-01

    Current methods of measuring and rating noise in a metropolitan area are examined, including real-time spectrum analysis and sound-level integration, producing a single-number value representing the noise impact for each hour or each day. Methods of noise rating for metropolitan areas are reviewed, and the various measures from multidimensional rating methods such as...

  7. Analysis of the Technology Acceptance Model in Examining Students' Behavioural Intention to Use an e-Portfolio System

    ERIC Educational Resources Information Center

    Shroff, Ronnie H.; Deneen, Christopher C.; Ng, Eugenia M. W.

    2011-01-01

    In recent years, instructors have had an increasing interest in integrating Internet-based technologies into their classrooms as part of the learning environment. Compared to studies on other information systems, student users' behaviour towards e-portfolios has not been assessed and thoroughly understood. This paper analyses the "Technology…

  8. An integrated study of earth resources in the State of California based on Skylab and supporting aircraft data. [environmental monitoring, tectonics, ecology, and forest management in California

    NASA Technical Reports Server (NTRS)

    Colwell, R. N. (Principal Investigator)

    1974-01-01

    Skylab data has been used: (1) as an aid to resource management in Northern California; (2) to assess and monitor change in the Southern California environment; and (3) for resource inventory and analysis of The California Desert Program.

  9. Enabling the democratization of the genomics revolution with a fully integrated web-based bioinformatics platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Po-E; Lo, Chien -Chi; Anderson, Joseph J.

    Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. As a result, this bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research.

  10. A domain-specific compiler for a parallel multiresolution adaptive numerical simulation environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajbhandari, Samyam; Kim, Jinsung; Krishnamoorthy, Sriram

    This paper describes the design and implementation of a layered domain-specific compiler to support MADNESS, the Multiresolution ADaptive Numerical Environment for Scientific Simulation. MADNESS is a high-level software environment for the solution of integral and differential equations in many dimensions, using adaptive and fast harmonic analysis methods with guaranteed precision. MADNESS uses k-d trees to represent spatial functions and implements operators like addition, multiplication, differentiation, and integration on the numerical representation of functions. The MADNESS runtime system provides global namespace support and a task-based execution model including futures. MADNESS is currently deployed on massively parallel supercomputers and has enabled many science advances. Due to the highly irregular and statically unpredictable structure of the k-d trees representing the spatial functions encountered in MADNESS applications, only purely runtime approaches to optimization have previously been implemented in the MADNESS framework. This paper describes a layered domain-specific compiler developed to address some performance bottlenecks in MADNESS. The newly developed static compile-time optimizations, in conjunction with the MADNESS runtime support, enable significant performance improvement for the MADNESS framework.
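
    As a purely illustrative analogue of the adaptive function representation described above (and bearing no relation to the actual MADNESS data structures or compiler), the toy sketch below refines a one-dimensional interval tree until a local precision target is met and then evaluates the stored function by tree lookup.

        # Toy 1-D analogue of an adaptive function tree: subdivide an interval until a
        # midpoint check meets a precision target, then evaluate by walking to a leaf.
        from dataclasses import dataclass
        from typing import Optional
        import math

        @dataclass
        class Node:
            lo: float
            hi: float
            left: Optional["Node"] = None
            right: Optional["Node"] = None

        def build(f, lo, hi, tol=1e-3, depth=0, max_depth=20):
            """Refine [lo, hi] while linear interpolation of f misses the midpoint by > tol."""
            node = Node(lo, hi)
            mid = 0.5 * (lo + hi)
            interp = 0.5 * (f(lo) + f(hi))
            if abs(interp - f(mid)) > tol and depth < max_depth:
                node.left = build(f, lo, mid, tol, depth + 1, max_depth)
                node.right = build(f, mid, hi, tol, depth + 1, max_depth)
            return node

        def evaluate(node, f, x):
            """Walk down to the leaf containing x and interpolate f there."""
            while node.left is not None:
                node = node.left if x < 0.5 * (node.lo + node.hi) else node.right
            w = (x - node.lo) / (node.hi - node.lo)
            return (1 - w) * f(node.lo) + w * f(node.hi)

        tree = build(math.sin, 0.0, math.pi)
        print(evaluate(tree, math.sin, 1.0), math.sin(1.0))  # the two values agree to ~1e-3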

  11. Enabling the democratization of the genomics revolution with a fully integrated web-based bioinformatics platform

    PubMed Central

    Li, Po-E; Lo, Chien-Chi; Anderson, Joseph J.; Davenport, Karen W.; Bishop-Lilly, Kimberly A.; Xu, Yan; Ahmed, Sanaa; Feng, Shihai; Mokashi, Vishwesh P.; Chain, Patrick S.G.

    2017-01-01

    Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. This bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research. PMID:27899609
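
    The pre-configured workflows described above can be pictured as ordered tool chains; the fragment below is a purely hypothetical configuration sketch, with placeholder step names that do not reflect EDGE's actual tools or internals.

        # Hypothetical sketch of a "pre-configured workflow" as an ordered chain of
        # analysis steps; names are placeholders and do not reflect EDGE internals.
        from typing import Callable

        Step = Callable[[str], str]  # each step takes an input path and returns an output path

        def run_workflow(steps: list[tuple[str, Step]], reads_path: str) -> str:
            """Run each step in order, feeding the previous output forward."""
            current = reads_path
            for name, step in steps:
                print(f"running {name} on {current}")
                current = step(current)
            return current

        # Placeholder step implementations for illustration only.
        qc = lambda p: p + ".trimmed"
        assemble = lambda p: p + ".contigs"
        annotate = lambda p: p + ".annotated"

        microbial_workflow = [("quality-control", qc), ("assembly", assemble), ("annotation", annotate)]
        print(run_workflow(microbial_workflow, "sample_reads.fastq"))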

  12. Envirotyping for deciphering environmental impacts on crop plants.

    PubMed

    Xu, Yunbi

    2016-04-01

    Global climate change imposes increasing impacts on our environments and crop production. To decipher environmental impacts on crop plants, the concept "envirotyping" is proposed as a third "typing" technology, complementing genotyping and phenotyping. Environmental factors can be collected through multiple environmental trials, geographic and soil information systems, measurement of soil and canopy properties, and evaluation of companion organisms. Envirotyping contributes to crop modeling and phenotype prediction through its functional components, including genotype-by-environment interaction (GEI), genes responsive to environmental signals, biotic and abiotic stresses, and integrative phenotyping. Envirotyping, driven by information and support systems, has a wide range of applications, including environmental characterization, GEI analysis, phenotype prediction, near-iso-environment construction, agronomic genomics, precision agriculture and breeding, and development of a four-dimensional profile of crop science involving genotype (G), phenotype (P), envirotype (E) and time (T) (developmental stage). In the future, envirotyping needs to zoom into specific experimental plots and individual plants, along with the development of high-throughput and precision envirotyping platforms, to integrate genotypic, phenotypic and envirotypic information for establishing a highly efficient precision breeding and sustainable crop production system based on deciphered environmental impacts.
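
    The genotype-by-environment interaction (GEI) mentioned above is conventionally expressed through an additive phenotype decomposition; the abstract gives no equations, so the following is the standard textbook form rather than the author's own model:

        P_{ij} = \mu + G_i + E_j + (GE)_{ij} + \varepsilon_{ij}

    where P_{ij} is the phenotype of genotype i in environment j, \mu the overall mean, G_i and E_j the genotype and environment main effects, (GE)_{ij} their interaction, and \varepsilon_{ij} the residual error. Envirotyping aims to characterize E_j and (GE)_{ij} explicitly rather than treating them as unexplained noise.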

  13. Enabling the democratization of the genomics revolution with a fully integrated web-based bioinformatics platform

    DOE PAGES

    Li, Po-E; Lo, Chien -Chi; Anderson, Joseph J.; ...

    2016-11-24

    Continued advancements in sequencing technologies have fueled the development of new sequencing applications and promise to flood current databases with raw data. A number of factors prevent the seamless and easy use of these data, including the breadth of project goals, the wide array of tools that individually perform fractions of any given analysis, the large number of associated software/hardware dependencies, and the detailed expertise required to perform these analyses. To address these issues, we have developed an intuitive web-based environment with a wide assortment of integrated and cutting-edge bioinformatics tools in pre-configured workflows. These workflows, coupled with the ease of use of the environment, provide even novice next-generation sequencing users with the ability to perform many complex analyses with only a few mouse clicks and, within the context of the same environment, to visualize and further interrogate their results. As a result, this bioinformatics platform is an initial attempt at Empowering the Development of Genomics Expertise (EDGE) in a wide range of applications for microbial research.

  14. Applied behavior analysis: behavior management of children with autism spectrum disorders in dental environments.

    PubMed

    Hernandez, Purnima; Ikkanda, Zachary

    2011-03-01

    There are a limited number of studies addressing behavior management techniques and procedural modifications that dentists can use to treat people with an autism spectrum disorder (ASD). The authors conducted a search of the dental and behavioral analytic literature to identify management techniques that address problem behaviors exhibited by children with ASDs in dental and other health-related environments. Applied behavior analysis (ABA) is a science in which procedures are based on the principles of behavior through systematic experimentation. Clinicians have used ABA procedures successfully to modify socially significant behaviors of people with ASD. Basic behavior management techniques currently used in dentistry may not encourage people with cognitive and behavioral disabilities, such as ASD, to tolerate simple in-office dental procedures consistently. Instead, dental care providers often are required to use advanced behavior management techniques to complete simple in-office procedures such as prophylaxis, sealant placement and obtaining radiographs. ABA procedures can be integrated in the dental environment to manage problem behaviors often exhibited by children with an ASD. The authors found no evidence-based procedural modifications that address the behavioral characteristics and problematic behaviors of children with an ASD in a dental environment. Further research in this area should be conducted. Knowledge and in-depth understanding of behavioral principles is essential when a dentist is concerned with modifying behaviors. Using ABA procedures can help dentists manage problem behaviors effectively and systematically when performing routine dental treatment. Being knowledgeable about each patient's behavioral characteristics and the parents' level of involvement is important in the successful integration of the procedures and reduction of in-office time.

  15. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success, the IHA integrator. Discussions will be provided to describe both the mindset required as well as deleterious assumptions/behaviors to avoid when integrating within a large scale system.

  16. Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robustness

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Heater, Daniel; Lee, Ashley

    2013-01-01

    Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.

  17. Knowledge and power in integrated coastal management. For a political anthropology of the sea combined with the sciences of the marine environment

    NASA Astrophysics Data System (ADS)

    Mazé, Camille; Dahou, Tarik; Ragueneau, Olivier; Danto, Anatole; Mariat-Roy, Emilie; Raimonet, Mélanie; Weisbein, Julien

    2017-10-01

    This article presents an innovative collaborative approach, which aims to reinforce and institutionalize the field of the political anthropology of the sea combined with the natural sciences. It begins by relating the evolution in coastal areas, from integrated coastal zone management to the notion of adaptive co-management. It then sets out what contribution the social sciences of politics may bring to our understanding of the government/governance of the sea in terms of sustainable development, starting with political science and then highlighting the importance of a deep anthropological and socio-historical approach. Finally, it gives us a glimpse of the benefits of combining the human and social sciences with the natural sciences to produce a critical analysis of the categories of thought and action associated with the systemic management of the environment, especially the coastal areas.

  18. Barriers to Practical Learning in the Field: A Qualitative Study of Iranian Nursing Students' Experiences.

    PubMed

    Jahanpour, Faezeh; Azodi, Parviz; Azodi, Farzan; Khansir, Ali Akbar

    2016-06-01

    Clinical training is an integral part of nursing education; however, some studies have shown that it is not always efficient. This study aimed to find out the factors that can impede nursing students' clinical learning. In this qualitative study, data were collected via reflective journal writing. Purposeful sampling was used, and 12 senior nursing students were recruited to the study. The data were analyzed using a content analysis method. Three main categories were derived, including inappropriate communication, ineffective role models, and theory-practice gaps. Students perceived that inappropriate communication between instructors, staff members, and students had the greatest impact on student learning. The competence of clinical instructors and staff is an important factor affecting students' training. The clinical learning environment does not always integrate theory and practice together. Nursing students did not experience effective clinical learning. Having expert instructors and supportive communication are important factors in creating a clinical learning environment.

  19. Integrated model for pricing, delivery time setting, and scheduling in make-to-order environments

    NASA Astrophysics Data System (ADS)

    Garmdare, Hamid Sattari; Lotfi, M. M.; Honarvar, Mahboobeh

    2018-03-01

    In make-to-order environments, which operate only in response to customer orders, manufacturers seeking to maximize profit must offer the best price and delivery time for each order, considering the existing capacity and the customer's sensitivity to both factors. In this paper, an integrated approach for pricing, delivery-time setting and scheduling of newly arriving orders is proposed, based on the existing capacity and the orders already accepted in the system. In the problem, the acquired market demand depends on the price and delivery time of both the manufacturer and its competitors. A mixed-integer non-linear programming model is presented for the problem. After conversion to a pure non-linear model, it is validated through a case study. The efficiency of the proposed model is confirmed by comparing it to both the literature and current practice. Finally, a sensitivity analysis of the key parameters is carried out.
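
    The abstract does not reproduce the demand or profit functions; a common illustrative form for such price- and lead-time-sensitive models (an assumption here, not the authors' exact formulation) is:

        D(p, l) = a - \beta p - \gamma l + \beta_c p_c + \gamma_c l_c, \qquad \max_{p,\, l} \; \Pi = (p - c)\, D(p, l) - h(l)

    where p and l are the firm's quoted price and delivery time, p_c and l_c those of its competitors, a, \beta, \gamma, \beta_c and \gamma_c demand sensitivities, c the unit cost, and h(l) the capacity or scheduling cost of honouring the quoted lead time; scheduling constraints on the accepted orders then couple l to the available capacity.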

  20. Integrated optical 3D digital imaging based on DSP scheme

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.

    2008-03-01

    We present a scheme for integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently without PC support. The scheme relies on a parallel hardware structure combining the DSP with a field programmable gate array (FPGA) to realize 3-D imaging, and adopts phase measurement profilometry. To pipeline the fringe projection, image acquisition and fringe pattern analysis, we developed a multi-threaded application program under the DSP/BIOS RTOS (real-time operating system). The RTOS provides a preemptive kernel and a powerful configuration tool, with which we achieve real-time scheduling and synchronization. To accelerate automatic fringe analysis and phase unwrapping, we apply software optimization techniques. The proposed scheme reaches a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and fast 3-D imaging. Experimental results are also presented to show the validity of the proposed scheme.
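
    The phase measurement profilometry step recovers a wrapped phase map from several phase-shifted fringe images; the abstract does not state the shift count, so the sketch below assumes a standard four-step algorithm with pi/2 shifts (an assumption, not necessarily the authors' choice).

        # Four-step phase-shifting profilometry: wrapped phase from four fringe images
        # shifted by pi/2. Assumes equally spaced shifts; phase unwrapping is not shown.
        import numpy as np

        def wrapped_phase(i1, i2, i3, i4):
            """phi = atan2(I4 - I2, I1 - I3), wrapped to (-pi, pi]."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic check: build four shifted fringe patterns from a known phase map.
        x = np.linspace(0, 4 * np.pi, 256)
        phi_true = np.tile(x, (256, 1))                      # known phase ramp
        shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
        frames = [0.5 + 0.5 * np.cos(phi_true + s) for s in shifts]
        phi = wrapped_phase(*frames)
        print(np.allclose(np.cos(phi), np.cos(phi_true)))    # True, up to 2*pi wrapping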

  1. Map Matching and Real World Integrated Sensor Data Warehousing (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burton, E.

    2014-02-01

    The inclusion of interlinked temporal and spatial elements within integrated sensor data enables a tremendous degree of flexibility when analyzing multi-component datasets. The presentation illustrates how to warehouse, process, and analyze high-resolution integrated sensor datasets to support complex system analysis at the entity and system levels. The example cases presented utilize in-vehicle sensor system data to assess vehicle performance, while integrating a map matching algorithm to link vehicle data to roads to demonstrate the enhanced analysis possible via interlinking data elements. Furthermore, in addition to the flexibility provided, the examples presented illustrate concepts of maintaining proprietary operational information (Fleet DNA) and privacy of study participants (Transportation Secure Data Center) while producing widely distributed data products. Should real-time operational data be logged at high resolution across multiple infrastructure types, map matched to their associated infrastructure, and distributed employing a similar approach, dependencies between urban environment infrastructure components could be better understood. This understanding is especially crucial for the cities of the future where transportation will rely more on grid infrastructure to support its energy demands.
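
    The map matching step links each sensor/GPS point to its most plausible road; the minimal sketch below uses a naive nearest-segment match to show the core geometric operation, whereas production matchers (and presumably the one referenced here) also exploit heading, topology and probabilistic models.

        # Naive map matching: snap each GPS point to the nearest road segment by
        # perpendicular distance. Real matchers also use heading, topology and HMMs.
        import math

        def point_segment_distance(p, a, b):
            """Distance from point p to segment a-b (all (x, y) tuples)."""
            (ax, ay), (bx, by), (px, py) = a, b, p
            dx, dy = bx - ax, by - ay
            if dx == 0 and dy == 0:
                return math.hypot(px - ax, py - ay)
            t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
            cx, cy = ax + t * dx, ay + t * dy
            return math.hypot(px - cx, py - cy)

        def match(points, roads):
            """Return, for each point, the id of the nearest road segment."""
            return [min(roads, key=lambda rid: point_segment_distance(p, *roads[rid]))
                    for p in points]

        roads = {"main_st": ((0, 0), (10, 0)), "oak_ave": ((0, 5), (10, 5))}
        trace = [(2.0, 0.4), (5.0, 4.6)]
        print(match(trace, roads))  # ['main_st', 'oak_ave']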

  2. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  3. PathCase-SB architecture and database design

    PubMed Central

    2011-01-01

    Background. Integrating metabolic pathway resources and regulatory metabolic network models, and deploying new tools on the integrated platform, can enable more effective and more efficient systems biology research on the regulation of metabolic networks. Therefore, the tasks of (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis, are desirable and intellectually challenging computational tasks. Description. PathCase Systems Biology (PathCase-SB) has been built and released. The PathCase-SB database provides data and an API for multiple user interfaces and software tools. The current PathCase-SB system provides a database-enabled framework and web-based computational tools for facilitating the development of kinetic models of biological systems. PathCase-SB aims to integrate data from selected biological data sources on the web (currently, the BioModels database and KEGG), and to provide more powerful and/or new capabilities via the new web-based integrative framework. This paper describes architecture and database design issues encountered in PathCase-SB's design and implementation, and presents the current design of PathCase-SB's architecture and database. Conclusions. The PathCase-SB architecture and database provide a highly extensible and scalable environment with easy and fast (real-time) access to the data in the database. PathCase-SB itself is already being used by researchers across the world. PMID:22070889

  4. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  5. Using an architectural approach to integrate heterogeneous, distributed software components

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Purtilo, James M.

    1995-01-01

    Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.

  6. Supporting community annotation and user collaboration in the integrated microbial genomes (IMG) system.

    PubMed

    Chen, I-Min A; Markowitz, Victor M; Palaniappan, Krishna; Szeto, Ernest; Chu, Ken; Huang, Jinghua; Ratner, Anna; Pillay, Manoj; Hadjithomas, Michalis; Huntemann, Marcel; Mikhailova, Natalia; Ovchinnikova, Galina; Ivanova, Natalia N; Kyrpides, Nikos C

    2016-04-26

    The exponential growth of genomic data from next generation technologies renders the traditional manual expert curation effort unsustainable. Many genomic systems have included community annotation tools to address the problem. Most of these systems adopted a "Wiki-based" approach to take advantage of existing wiki technologies, but encountered obstacles such as usability, authorship recognition, information reliability and incentive for community participation. Here, we present a different approach, relying on a tightly integrated method rather than a "Wiki-based" method, to support community annotation and user collaboration in the Integrated Microbial Genomes (IMG) system. The IMG approach allows users to use existing IMG data warehouse and analysis tools to add gene, pathway and biosynthetic cluster annotations, to analyze/reorganize contigs, genes and functions using workspace datasets, and to share private user annotations and workspace datasets with collaborators. We show that the annotation effort using IMG can be part of the research process to overcome the user incentive and authorship recognition problems, thus fostering collaboration among domain experts. The usability and reliability issues are addressed by the integration of curated information and analysis tools in IMG, together with DOE Joint Genome Institute (JGI) expert review. By incorporating annotation operations into IMG, we provide an integrated environment for users to perform deeper and extended data analysis and annotation in a single system that can lead to publications and community knowledge sharing, as shown in the case studies.

  7. Quantifying long-term human impact in contrasting environments: Statistical analysis of modern and fossil pollen records

    NASA Astrophysics Data System (ADS)

    Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert

    2017-04-01

    Reconstructing and quantifying human impact is an important step towards understanding human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analysis of fossil pollen records has been proposed to characterize vegetation changes and to gain insights into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it still cannot be used as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The information on the modern pollen and vegetation dataset can be used to obtain a better interpretation of the representativeness of the fossil pollen records, and can result in a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated, together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory, and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards. The data from central Spain show, for several sites, that arboreal cover decreased below 5% from the Feudal period onwards (ca. 850 cal a BP), related to increasing human impact on the landscape. At other study sites, arboreal cover remained above 25% despite significant human impact. Overall, the presented examples from two contrasting environments show how cluster analysis and NMDS of modern and fossil pollen data can help to provide quantitative insights into anthropogenic land cover changes. Our study extensively discusses and illustrates the possibilities and limitations of statistical analysis of pollen data for quantifying human-induced land use changes.
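
    The cluster/NMDS workflow summarized above can be outlined with standard tooling; the sketch below is illustrative only (toy data, Bray-Curtis dissimilarities) and is not the authors' pipeline.

        # Illustrative NMDS ordination of pollen samples (rows) by taxa percentages
        # (columns) using Bray-Curtis dissimilarities; not the authors' exact workflow.
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from sklearn.manifold import MDS

        rng = np.random.default_rng(0)
        samples = rng.random((12, 8))                      # 12 samples x 8 pollen taxa (toy data)
        samples /= samples.sum(axis=1, keepdims=True)      # convert counts to proportions

        dissim = squareform(pdist(samples, metric="braycurtis"))
        nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                   n_init=10, random_state=0)
        scores = nmds.fit_transform(dissim)
        print(scores.shape, round(nmds.stress_, 3))        # (12, 2) and the NMDS stress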

  8. BiologicalNetworks 2.0 - an integrative view of genome biology data

    PubMed Central

    2010-01-01

    Background. A significant problem in the study of mechanisms of an organism's development is the elucidation of interrelated factors which have an impact on the different levels of the organism, such as genes, biological molecules, cells, and cell systems. Numerous sources of heterogeneous data which exist for these subsystems are still not sufficiently integrated to give researchers a straightforward opportunity to analyze them together in the same frame of study. Systematic application of data integration methods is also hampered by a multitude of such factors as the orthogonal nature of the integrated data and naming problems. Results. Here we report on a new version of BiologicalNetworks, a research environment for the integral visualization and analysis of heterogeneous biological data. BiologicalNetworks can be queried for properties of thousands of different types of biological entities (genes/proteins, promoters, COGs, pathways, binding sites, and others) and their relations (interactions, co-expression, co-citations, and others). The system includes the build-pathways infrastructure for molecular interactions/relations and module discovery in high-throughput experiments. Also implemented in BiologicalNetworks are the Integrated Genome Viewer and Comparative Genomics Browser applications, which allow for the search and analysis of gene regulatory regions and their conservation in multiple species in conjunction with molecular pathways/networks, experimental data and functional annotations. Conclusions. The new release of BiologicalNetworks together with its back-end database introduces extensive functionality for a more efficient integrated multi-level analysis of microarray, sequence, regulatory, and other data. BiologicalNetworks is freely available at http://www.biologicalnetworks.org. PMID:21190573

  9. Relationships among supervisor feedback environment, work-related stressors, and employee deviance.

    PubMed

    Peng, Jei-Chen; Tseng, Mei-Man; Lee, Yin-Ling

    2011-03-01

    Previous research has demonstrated that employee deviance imposes enormous costs on organizational performance and productivity. Similar research supports the positive effect of favorable supervisor feedback on employee job performance. In light of this, it is important to understand the interaction between the supervisor feedback environment and employee deviant behavior in order to streamline organizational operations. The purposes of this study were to explore how the supervisor feedback environment influences employee deviance and to examine the mediating role played by work-related stressors. Data were collected from 276 subordinate-supervisor dyads at a regional hospital in Yilan. Structural equation modeling analyses were conducted to test hypotheses. Structural equation modeling analysis results show that the supervisor feedback environment was negatively related to interpersonal and organizational deviance. Moreover, work-related stressors were found to partially mediate the relationship between the supervisor feedback environment and employee deviance. Study findings suggest that when employees (nurses in this case) perceive an appropriate supervisor-provided feedback environment, their deviance is suppressed because of the related reduction in work-related stressors. Thus, to decrease deviant behavior, organizations may foster supervisor integration of disseminated knowledge such as (a) how to improve employees' actual performance, (b) how to effectively clarify expected performance, and (c) how to improve continuous performance feedback. If supervisors absorb this integrated feedback knowledge, they should be in a better position to enhance their own daily interactions with nurses and reduce nurses' work-related stress and, consequently, decrease deviant behavior.

  10. Nursing Faculty and Academic Integrity

    ERIC Educational Resources Information Center

    Wilson, Cecilia E.

    2013-01-01

    Insufficient information exists regarding the process influencing faculty decisions, specifically in the area of maintaining academic integrity in an online environment. The purpose of the study was to explore the experiences and decision-making process of nursing faculty related to maintaining academic integrity in an online environment. The…

  11. Science Laboratory Environment and Academic Performance

    NASA Astrophysics Data System (ADS)

    Aladejana, Francisca; Aderibigbe, Oluyemisi

    2007-12-01

    The study determined how students assess the various components of their science laboratory environment. It also identified how the laboratory environment affects students' learning outcomes. The modified ex-post facto design was used. A sample of 328 randomly selected students was taken from a population of all Senior Secondary School chemistry students in a state in Nigeria. The research instrument, the Science Laboratory Environment Inventory (SLEI), designed and validated by Fraser et al. (Sci Educ 77:1-24, 1993), was administered to the selected students. Data analysis was done using descriptive statistics and Product Moment Correlation. Findings revealed that students could assess the five components (Student cohesiveness, Open-endedness, Integration, Rule clarity, and Material Environment) of the laboratory environment. Student cohesiveness received the highest assessment while material environment received the least. The results also showed that the five components of the science laboratory environment are positively correlated with students' academic performance. The findings are discussed with a view to improving the quality of the laboratory environment, subsequent academic performance in science and ultimately the enrolment and retention of learners in science.

  12. Space Operations Analysis Using the Synergistic Engineering Environment

    NASA Technical Reports Server (NTRS)

    Angster, Scott; Brewer, Laura

    2002-01-01

    The Synergistic Engineering Environment has been under development at the NASA Langley Research Center to aid in the understanding of the operations of spacecraft. This is accomplished through the integration of multiple data sets, analysis tools, spacecraft geometric models, and a visualization environment to create an interactive virtual simulation of the spacecraft. Initially designed to support the needs of the International Space Station, the SEE has broadened the scope to include spacecraft ranging from low-earth orbit to deep space missions. Analysis capabilities within the SEE include rigid body dynamics, kinematics, orbital mechanics, and payload operations. This provides the user the ability to perform real-time interactive engineering analyses in areas including flight attitudes and maneuvers, visiting vehicle docking scenarios, robotic operations, plume impingement, field of view obscuration, and alternative assembly configurations. The SEE has been used to aid in the understanding of several operational procedures related to the International Space Station. This paper will address the capabilities of the first build of the SEE, present several use cases of the SEE, and discuss the next build of the SEE.

  13. Building integrated business environments: analysing open-source ESB

    NASA Astrophysics Data System (ADS)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy-applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  14. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  15. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    DTIC Science & Technology

    2016-12-01

    Award Number: W81XWH-12-C-0154. Title: "Enabling Medical Device Interoperability for the Integrated Clinical Environment." The recoverable portion of the report front matter describes collaborative work to mature the healthcare technology ecosystem and enable the next generation of safe and intelligent medical devices.

  16. Space Medicine in the Human System Integration Process

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.

    2010-01-01

    This slide presentation reviews the importance of integrating space medicine into the human system integration process for lunar exploration. Historical precedent with reference to lunar surface operations is reviewed. The integration process is presented in a chart showing the steps from research through requirements development, requirements integration, design, verification, and operations, with lessons learned feeding back additional information and items for research. These steps are reviewed in light of specific space medicine issues. Some of the operational testing is undertaken in an environment that is an analog of the exploration environment. Several of these analog environments are reviewed, along with the benefits of using an analog environment to test the derived processes.

  17. Stratigraphy, facies analysis and depositional environments of the Upper Unit of Abu Roash "E" member in the Abu Gharadig field, Western Desert, Egypt

    NASA Astrophysics Data System (ADS)

    Hewaidy, Abdel Galil; Elshahat, O. R.; Kamal, Samy

    2018-03-01

    Abu Roach "E" member is of an important hydrocarbon reservoir-producing horizon in the Abu Gharadig Field (north Western Desert, Egypt). This study is used to build facies analysis and depositional environments model for the Upper Unit of the Abu Roash "E" member in Abu Gharadig Field. This target has been achieved throughout the sedimentological, wire line logs, lithostratigraphic and biostratigraphic analyses of more than 528 feet cores. The high-resolution biostratigraphic analysis provides a calibration for the paleo-bathymetry and depositional environmental interpretations. Biozonation and lithostratigraphic markers are used to constrain stratigraphic correlation. Integration between the core description and petorographic microfacies analysis by microscope examination provide an excellent indication for the rock types and depositional environments. Five depositional facies types are detected including carbonate inner ramp, tidal flats, tidal channels, supra-tidal and tide dominated delta facies. This model helps in the understanding of the Upper Unit of Abu Roash "E" member reservoir distribution as well as lateral and vertical facies changes that contribute to the development strategy for the remaining hydrocarbon reserves for this important oil reservoir.

  18. 49 CFR 190.239 - Safety orders.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... pipeline integrity risk to public safety, property, or the environment, the Associate Administrator may... existence of a condition that poses a pipeline integrity risk to public safety, property, or the environment... public safety, property, or the environment. (5) Post-hearing action. Following a hearing under this...

  19. Hippocampus and Retrosplenial Cortex Combine Path Integration Signals for Successful Navigation

    PubMed Central

    Erdem, Uğur M.; Ross, Robert S.; Brown, Thackery I.; Hasselmo, Michael E.; Stern, Chantal E.

    2013-01-01

    The current study used fMRI in humans to examine goal-directed navigation in an open field environment. We designed a task that required participants to encode survey-level spatial information and subsequently navigate to a goal location in either first person, third person, or survey perspectives. Critically, no distinguishing landmarks or goal location markers were present in the environment, thereby requiring participants to rely on path integration mechanisms for successful navigation. We focused our analysis on mechanisms related to navigation and mechanisms tracking linear distance to the goal location. Successful navigation required translation of encoded survey-level map information for orientation and implementation of a planned route to the goal. Our results demonstrate that successful first and third person navigation trials recruited the anterior hippocampus more than trials in which the goal location was not successfully reached. When examining only successful trials, the retrosplenial and posterior parietal cortices were recruited for goal-directed navigation in both first person and third person perspectives. Unique to first person perspective navigation, the hippocampus was recruited to path integrate self-motion cues with location computations toward the goal location. Last, our results demonstrate that the hippocampus supports goal-directed navigation by actively tracking proximity to the goal throughout navigation. When using path integration mechanisms in first person and third person perspective navigation, the posterior hippocampus was more strongly recruited as participants approached the goal. These findings provide critical insight into the neural mechanisms by which we are able to use map-level representations of our environment to reach our navigational goals. PMID:24305826

  20. Dynamics of Hierarchical Urban Green Space Patches and Implications for Management Policy.

    PubMed

    Yu, Zhoulu; Wang, Yaohui; Deng, Jinsong; Shen, Zhangquan; Wang, Ke; Zhu, Jinxia; Gan, Muye

    2017-06-06

    Accurately quantifying the variation of urban green space is the prerequisite for fully understanding its ecosystem services. However, knowledge about the spatiotemporal dynamics of urban green space is still insufficient due to multiple challenges that remain in mapping green spaces within heterogeneous urban environments. This paper uses the city of Hangzhou to demonstrate an analysis methodology that integrates sub-pixel mapping technology and landscape analysis to fully investigate the spatiotemporal pattern and variation of hierarchical urban green space patches. Firstly, multiple endmember spectral mixture analysis was applied to time series Landsat data to derive green space coverage at the sub-pixel level. Landscape metric analysis was then employed to characterize the variation pattern of urban green space patches. Results indicate that Hangzhou has experienced a significant loss of urban greenness, producing a more fragmented and isolated vegetation landscape. Additionally, a remarkable amelioration of urban greenness occurred in the city core from 2002 to 2013, characterized by the significant increase of small-sized green space patches. The green space network has been formed as a consequence of new urban greening strategies in Hangzhou. These strategies have greatly fragmented the built-up areas and enriched the diversity of the urban landscape. Gradient analysis further revealed a distinct pattern of urban green space landscape variation in the process of urbanization. By integrating both sub-pixel mapping technology and landscape analysis, our approach revealed the subtle variation of urban green space patches which are otherwise easy to overlook. Findings from this study will help us to refine our understanding of the evolution of heterogeneous urban environments.
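
    As a rough illustration of the sub-pixel mapping step described above, the sketch below unmixes a single multispectral pixel into vegetation, impervious-surface and soil fractions using non-negative least squares. The endmember spectra and pixel reflectances are invented for the example; the study itself applied multiple endmember spectral mixture analysis to Landsat time series, which additionally searches over candidate endmember sets for each pixel.

      # Linear spectral unmixing of one pixel into endmember fractions
      # (synthetic reflectance values; illustrative only).
      import numpy as np
      from scipy.optimize import nnls

      # Columns: vegetation, impervious surface, soil (rows are 4 spectral bands).
      endmembers = np.array([
          [0.05, 0.10, 0.12],
          [0.08, 0.15, 0.20],
          [0.45, 0.18, 0.30],
          [0.30, 0.22, 0.40],
      ])
      pixel = np.array([0.075, 0.115, 0.315, 0.270])

      fractions, residual = nnls(endmembers, pixel)
      fractions = fractions / fractions.sum()   # normalize so fractions sum to one
      print(dict(zip(["vegetation", "impervious", "soil"], fractions.round(3))))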

  1. Dynamics of Hierarchical Urban Green Space Patches and Implications for Management Policy

    PubMed Central

    Yu, Zhoulu; Wang, Yaohui; Deng, Jinsong; Shen, Zhangquan; Wang, Ke; Zhu, Jinxia; Gan, Muye

    2017-01-01

    Accurately quantifying the variation of urban green space is the prerequisite for fully understanding its ecosystem services. However, knowledge about the spatiotemporal dynamics of urban green space is still insufficient due to multiple challenges that remain in mapping green spaces within heterogeneous urban environments. This paper uses the city of Hangzhou to demonstrate an analysis methodology that integrates sub-pixel mapping technology and landscape analysis to fully investigate the spatiotemporal pattern and variation of hierarchical urban green space patches. Firstly, multiple endmember spectral mixture analysis was applied to time series Landsat data to derive green space coverage at the sub-pixel level. Landscape metric analysis was then employed to characterize the variation pattern of urban green space patches. Results indicate that Hangzhou has experienced a significant loss of urban greenness, producing a more fragmented and isolated vegetation landscape. Additionally, a remarkable amelioration of urban greenness occurred in the city core from 2002 to 2013, characterized by the significant increase of small-sized green space patches. The green space network has been formed as a consequence of new urban greening strategies in Hangzhou. These strategies have greatly fragmented the built-up areas and enriched the diversity of the urban landscape. Gradient analysis further revealed a distinct pattern of urban green space landscape variation in the process of urbanization. By integrating both sub-pixel mapping technology and landscape analysis, our approach revealed the subtle variation of urban green space patches which are otherwise easy to overlook. Findings from this study will help us to refine our understanding of the evolution of heterogeneous urban environments. PMID:28587309

  2. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process and provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  3. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT BUCKLING EVALUATION METHODS & RESULTS FOR THE PRIMARY TANKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY, T.C.

    2006-03-17

    This report documents a detailed buckling evaluation of the primary tanks in the Hanford double-shell waste tanks. The analysis is part of a comprehensive structural review for the Double-Shell Tank Integrity Project. This work also provides information on tank integrity that specifically responds to concerns raised by the Office of Environment, Safety, and Health (ES&H) Oversight (EH-22) during a review (in April and May 2001) of work being performed on the double-shell tank farms and the operation of the aging waste facility (AWF) primary tank ventilation system.

  4. Monocular camera/IMU/GNSS integration for ground vehicle navigation in challenging GNSS environments.

    PubMed

    Chu, Tianxing; Guo, Ningyan; Backén, Staffan; Akos, Dennis

    2012-01-01

    Low-cost MEMS-based IMUs, video cameras and portable GNSS devices are commercially available for automotive applications and some manufacturers have already integrated such facilities into their vehicle systems. GNSS provides positioning, navigation and timing solutions to users worldwide. However, signal attenuation, reflections or blockages may give rise to positioning difficulties. As opposed to GNSS, a generic IMU, which is independent of electromagnetic wave reception, can calculate a high-bandwidth navigation solution; however, the output from a self-contained IMU accumulates errors over time. In addition, video cameras also possess great potential as alternate sensors in the navigation community, particularly in challenging GNSS environments, and are becoming more common as options in vehicles. Aiming to take advantage of these existing onboard technologies for ground vehicle navigation in challenging environments, this paper develops an integrated camera/IMU/GNSS system based on the extended Kalman filter (EKF). Our proposed integration architecture is examined using a live dataset collected in an operational traffic environment. The experimental results demonstrate that the proposed integrated system provides accurate estimations and potentially outperforms the tightly coupled GNSS/IMU integration in challenging environments with sparse GNSS observations.
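
    The extended Kalman filter at the heart of the proposed integration alternates an IMU-driven prediction with a measurement update whenever a GNSS (or camera-derived) observation arrives. The sketch below shows that predict/update cycle for a deliberately simplified one-dimensional position/velocity state; the actual system estimates full vehicle pose and also fuses monocular vision, so the matrices and noise values here are placeholders only.

      # Minimal 1-D loosely coupled IMU/GNSS Kalman filter cycle
      # (state = [position, velocity]; all numerical values are placeholders).
      import numpy as np

      dt = 0.01                                 # IMU sample interval, s
      F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition model
      B = np.array([[0.5 * dt**2], [dt]])       # accelerometer input model
      H = np.array([[1.0, 0.0]])                # GNSS observes position only
      Q = 0.05 * np.eye(2)                      # process noise covariance
      R = np.array([[4.0]])                     # GNSS measurement noise, m^2

      def predict(x, P, accel):
          x = F @ x + B * accel                 # propagate with IMU acceleration
          P = F @ P @ F.T + Q
          return x, P

      def update(x, P, gnss_pos):
          y = np.array([[gnss_pos]]) - H @ x    # innovation
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
          x = x + K @ y
          P = (np.eye(2) - K @ H) @ P
          return x, P

      x, P = np.zeros((2, 1)), np.eye(2)
      for _ in range(100):                      # 1 s of IMU data at 100 Hz
          x, P = predict(x, P, accel=0.2)
      x, P = update(x, P, gnss_pos=0.12)        # one GNSS position fix
      print(x.ravel())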

  5. Monocular Camera/IMU/GNSS Integration for Ground Vehicle Navigation in Challenging GNSS Environments

    PubMed Central

    Chu, Tianxing; Guo, Ningyan; Backén, Staffan; Akos, Dennis

    2012-01-01

    Low-cost MEMS-based IMUs, video cameras and portable GNSS devices are commercially available for automotive applications and some manufacturers have already integrated such facilities into their vehicle systems. GNSS provides positioning, navigation and timing solutions to users worldwide. However, signal attenuation, reflections or blockages may give rise to positioning difficulties. As opposed to GNSS, a generic IMU, which is independent of electromagnetic wave reception, can calculate a high-bandwidth navigation solution; however, the output from a self-contained IMU accumulates errors over time. In addition, video cameras also possess great potential as alternate sensors in the navigation community, particularly in challenging GNSS environments, and are becoming more common as options in vehicles. Aiming to take advantage of these existing onboard technologies for ground vehicle navigation in challenging environments, this paper develops an integrated camera/IMU/GNSS system based on the extended Kalman filter (EKF). Our proposed integration architecture is examined using a live dataset collected in an operational traffic environment. The experimental results demonstrate that the proposed integrated system provides accurate estimations and potentially outperforms the tightly coupled GNSS/IMU integration in challenging environments with sparse GNSS observations. PMID:22736999

  6. MIMI: multimodality, multiresource, information integration environment for biomedical core facilities.

    PubMed

    Szymanski, Jacek; Wilson, David L; Zhang, Guo-Qiang

    2009-10-01

    The rapid expansion of biomedical research has brought substantial scientific and administrative data management challenges to modern core facilities. Scientifically, a core facility must be able to manage experimental workflow and the corresponding set of large and complex scientific data. It must also disseminate experimental data to relevant researchers in a secure and expedient manner that facilitates collaboration and provides support for data interpretation and analysis. Administratively, a core facility must be able to manage the scheduling of its equipment and to maintain a flexible and effective billing system to track material, resource, and personnel costs and charge for services to sustain its operation. It must also have the ability to regularly monitor the usage and performance of its equipment and to provide summary statistics on resources spent on different categories of research. To address these informatics challenges, we introduce a comprehensive system called MIMI (multimodality, multiresource, information integration environment) that integrates the administrative and scientific support of a core facility into a single web-based environment. We report the design, development, and deployment experience of a baseline MIMI system at an imaging core facility and discuss the general applicability of such a system in other types of core facilities. These initial results suggest that MIMI will be a unique, cost-effective approach to addressing the informatics infrastructure needs of core facilities and similar research laboratories.

  7. Integrity and moral residue: nurses as participants in a moral community.

    PubMed

    Hardingham, Lorraine B

    2004-07-01

    This paper will examine the concepts of integrity and moral residue as they relate to nursing practice in the current health care environment. I will begin with my definition and conception of ethical practice, and, based on that, will go on to argue for the importance of recognizing that nurses often find themselves in the position of compromising their moral integrity in order to maintain their self-survival in the hospital or health care environment. I will argue that moral integrity is necessary to a moral life, and is relational in nature. When integrity is threatened, the result is moral distress, moral residue, and in some cases, abandonment of the profession. The solution will require more than teaching bioethics to nursing students and nurses. It will require changes in the health care environment, organizational culture and the education of nurses, with an emphasis on building a moral community as an environment in which to practise ethically.

  8. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    PubMed

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6 000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide html and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD and combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.
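
    The disease-specific sub-network idea can be illustrated outside the COPDKB interface with a generic graph query: start from seed genes and keep everything within a fixed number of interaction steps. The sketch below does this with networkx on a toy interaction graph; the gene names and edges are invented, and the real knowledge base exposes this functionality through its own web-based network search rather than a Python API.

      # Extract a seed-centred sub-network from a toy gene interaction graph.
      import networkx as nx

      interactions = [                 # invented gene-gene edges
          ("SERPINA1", "ELANE"), ("ELANE", "CXCL8"), ("CXCL8", "IL6"),
          ("IL6", "TNF"), ("TNF", "NFKB1"), ("NFKB1", "MMP9"),
          ("MMP9", "TIMP1"), ("HHIP", "FAM13A"),
      ]
      G = nx.Graph(interactions)

      seeds = ["CXCL8", "MMP9"]
      nodes = set(seeds)
      for s in seeds:                  # neighbourhood within two interaction steps
          nodes |= set(nx.single_source_shortest_path_length(G, s, cutoff=2))
      subnetwork = G.subgraph(nodes)

      print(sorted(subnetwork.nodes()))
      print(subnetwork.number_of_edges(), "edges")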

  9. Packetized Video On MAGNET

    NASA Astrophysics Data System (ADS)

    Lazar, Aurel A.; White, John S.

    1987-07-01

    Theoretical analysis of the integrated local area network model of MAGNET, an integrated network testbed developed at Columbia University, shows that the bandwidth freed up during video and voice calls, in periods of little movement in the images and of silence in the speech signals, could be utilized efficiently for graphics and data transmission. Based on these investigations, an architecture supporting adaptive protocols that are dynamically controlled by the requirements of a fluctuating load and a changing user environment has been advanced. To further analyze the behavior of the network, a real-time packetized video system has been implemented. This system is embedded in the real-time multimedia workstation EDDY, which integrates video, voice, and data traffic flows. Protocols supporting variable-bandwidth, fixed-quality packetized video transport are described in detail.

  10. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel.

    PubMed

    Grapov, Dmitry; Newman, John W

    2012-09-01

    Interactive modules for Data Exploration and Visualization (imDEV) is a Microsoft Excel spreadsheet-embedded application providing an integrated environment for the analysis of omics data through a user-friendly interface. Individual modules enable interactive and dynamic analyses of large data sets by interfacing R's multivariate statistics and highly customizable visualizations with the spreadsheet environment, aiding robust inferences and generating information-rich data visualizations. This tool provides access to multiple comparisons with false discovery correction, hierarchical clustering, principal and independent component analyses, partial least squares regression and discriminant analysis, through an intuitive interface for creating high-quality two- and three-dimensional visualizations including scatter plot matrices, distribution plots, dendrograms, heat maps, biplots, trellis biplots and correlation networks. Freely available for download at http://sourceforge.net/projects/imdev/. Implemented in R and VBA and supported by Microsoft Excel (2003, 2007 and 2010).
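
    imDEV itself is written in R and VBA inside Excel, but the kind of analysis it wraps, dimension reduction followed by hierarchical clustering, can be sketched in a few lines of Python for readers who want a feel for the workflow. The matrix below is a random stand-in for an omics data set (samples by variables).

      # PCA followed by hierarchical clustering, analogous to an imDEV workflow.
      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(0)
      X = rng.normal(size=(30, 200))                  # 30 samples x 200 variables (stand-in)

      scores = PCA(n_components=3).fit_transform(X)   # project onto 3 principal components
      Z = linkage(scores, method="ward")              # agglomerative clustering of the scores
      labels = fcluster(Z, t=3, criterion="maxclust") # cut the tree into 3 clusters

      print(labels)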

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and to manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  12. An analysis of space environment effects on performance and missions of a Solar Electric Propulsion Stage (SEPS)

    NASA Technical Reports Server (NTRS)

    Mcglathery, D. M.

    1975-01-01

    The development of an analysis which addresses the problems of degrading space environmental effects on the performance and missions of a Solar Electric Propulsion Stage (SEPS) is reported. A detailed study concerning the degrading effects of the Van Allen Belt charged-particle radiation on specific spacecraft subsystems is included, along with some of the thermal problems caused by electromagnetic radiation from the sun. The analytical methods used require the integration of two distinct analyses. The first is a low-thrust trajectory analysis which uses analytical approximations to optimum steering for orbit raising, including three-dimensional plane-change cases. The second is the conversion of the Vette time-averaged differential energy spectra for protons and electrons into a 1-MeV electron equivalent environment as a function of spatial position and thickness of various shielding materials and solar-cell cover slides.
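
    The second of the two analyses, reducing a trapped-particle spectrum to a 1-MeV electron equivalent environment, amounts to weighting the differential spectrum by relative damage coefficients and integrating over energy. The sketch below shows that bookkeeping; the spectrum and damage coefficients are placeholders rather than Vette-model output, and a real analysis tabulates the coefficients separately for each cover-slide thickness.

      # 1-MeV electron equivalent fluence from a differential electron spectrum
      # (all numerical values are placeholders for illustration).
      import numpy as np

      energy = np.array([0.5, 1.0, 2.0, 3.0, 5.0])     # MeV
      flux   = np.array([3e7, 1e7, 2e6, 5e5, 8e4])     # electrons/(cm^2 s MeV)
      damage = np.array([0.4, 1.0, 2.4, 3.9, 6.8])     # damage relative to a 1-MeV electron

      # Trapezoidal integration of damage-weighted flux over energy,
      # then scale to a one-year exposure.
      fd = flux * damage
      eq_flux = np.sum(0.5 * (fd[:-1] + fd[1:]) * np.diff(energy))   # e-/(cm^2 s)
      eq_fluence = eq_flux * 365.25 * 86400.0
      print(f"1-MeV equivalent fluence ~ {eq_fluence:.2e} electrons/cm^2")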

  13. Apparatus and methods for packaging integrated circuit chips with antenna modules providing closed electromagnetic environment for integrated antennas

    NASA Technical Reports Server (NTRS)

    Gaucher, Brian P. (Inventor); Grzyb, Janusz (Inventor); Liu, Duixian (Inventor); Pfeiffer, Ullrich R. (Inventor)

    2008-01-01

    Apparatus and methods are provided for packaging IC chips together with integrated antenna modules designed to provide a closed EM (electromagnetic) environment for antenna radiators, thereby allowing antennas to be designed independently of the packaging technology.

  14. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration of geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.

  15. Lacustrine Environment Reservoir Properties on Sandstone Minerals and Hydrocarbon Content: A Case Study on Doba Basin, Southern Chad

    NASA Astrophysics Data System (ADS)

    Sumery, N. F. Mohd; Lo, S. Z.; Salim, A. M. A.

    2017-10-01

    The contribution of the lacustrine environment as a hydrocarbon reservoir has been widely recognized. However, despite its growing importance, lacustrine petroleum geology has received far less attention than marine settings because of its sedimentological complexity. This study therefore aims to develop an understanding of the unique aspects of lacustrine reservoirs, which ultimately impacts future exploration decisions. Hydrocarbon production in the Doba Basin, particularly along its northern boundary, has not yet succeeded owing to limited awareness of its depositional environment. The drilling results show that the problems were due to radioactive sand and waxy oil/formation damage, all of which are related to the lacustrine depositional environment. A detailed integrated geological and petrophysical study of wireline logs and petrographic thin-section analysis of this environment helps in distinguishing reservoir from non-reservoir areas and in determining the possible mechanism causing the failed DST results. The interpretations show that the correlation of all log types and rho matrix analysis are capable of identifying sand and shale beds despite the presence of radioactive sand. The failed DST results were due to the presence of arkose in the sand and waxy oil in the reservoir bed. This was confirmed by the petrographic thin-section analysis, where the arkose shows mineral twinning indicative of feldspar and the waxy oil shows a bright colour under fluorescent light. Understanding these special lacustrine environment characteristics and features will lead to a better interpretation of hydrocarbon prospectivity for future exploration.
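
    The rho matrix analysis mentioned above can be reduced to a simple quick-look calculation: an apparent matrix density computed from the bulk density log and a total porosity estimate, which shifts when feldspar-rich (arkosic) sand replaces clean quartz sand. The helper below applies the standard relation rho_maa = (rho_b - phi*rho_fl)/(1 - phi); the log values are invented and the fluid density assumes a fresh-water mud filtrate.

      # Apparent matrix density from bulk density and total porosity
      # (illustrative log values; rho_fluid assumes fresh-water mud filtrate).
      def apparent_matrix_density(rho_bulk, phi_total, rho_fluid=1.0):
          """rho_maa = (rho_b - phi * rho_fl) / (1 - phi), all in g/cm^3."""
          return (rho_bulk - phi_total * rho_fluid) / (1.0 - phi_total)

      samples = [(2.41, 0.15), (2.48, 0.12), (2.35, 0.18)]   # (rho_b, phi_t) pairs
      for rho_b, phi_t in samples:
          print(f"rho_b={rho_b:.2f}  phi={phi_t:.2f}  "
                f"rho_maa={apparent_matrix_density(rho_b, phi_t):.2f} g/cm^3")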

  16. Enabling a systems biology knowledgebase with gaggle and firegoose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baliga, Nitin S.

    The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we have made substantial progress on development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an OpenCPU server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the OpenCPU server. The cloud-based framework facilitates collaboration between researchers from multiple organizations. We have made a number of enhancements to the cmonkey2 application to enable and improve the integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.
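
    The OpenCPU piece of the architecture works over plain HTTP: an R function in a deployed package is exposed as a URL endpoint and invoked with a POST request. Because the project's own packages and EC2 endpoint are not public, the sketch below calls the public OpenCPU demo server and the stock stats::rnorm function instead; the URL pattern and the /json output suffix follow the publicly documented OpenCPU API, but treat the exact endpoint as an assumption.

      # Call an R function on an OpenCPU server over HTTP
      # (public demo server and stats::rnorm used as stand-ins).
      import requests

      url = "https://cloud.opencpu.org/ocpu/library/stats/R/rnorm/json"
      resp = requests.post(url, data={"n": 5, "mean": 0, "sd": 1})
      resp.raise_for_status()
      print(resp.json())      # five random draws computed server-side in R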

  17. Low Power Shoe Integrated Intelligent Wireless Gait Measurement System

    NASA Astrophysics Data System (ADS)

    Wahab, Y.; Mazalan, M.; Bakar, N. A.; Anuar, A. F.; Zainol, M. Z.; Hamzah, F.

    2014-04-01

    Gait analysis is a method to assess and identify gait events and to measure dynamic, motion and pressure parameters involving the lowest part of the body. This analysis is widely used in sports and rehabilitation, as well as in other health diagnostics aimed at improving the quality of life. This paper presents a new system empowered by an Inertial Measurement Unit (IMU), ultrasonic sensors, a piezoceramic sensor array, XBee wireless modules and an Arduino processing unit. This research focuses on the design and development of a low-power, ultra-portable, shoe-integrated wireless intelligent gait measurement system using MEMS and recent microelectronic devices for foot clearance, orientation, error correction, gait events and pressure measurement. It is developed to be cheap, low power, wireless, real time and suitable for real-life indoor and outdoor environments.

  18. Simulation analysis of an integrated model for dynamic cellular manufacturing system

    NASA Astrophysics Data System (ADS)

    Hao, Chunfeng; Luan, Shichao; Kong, Jili

    2017-05-01

    Application of a dynamic cellular manufacturing system (DCMS) is a well-known strategy to improve manufacturing efficiency in production environments with high variety and low volume of production. Often, neither the trade-off between inter- and intra-cell material movements nor the trade-off between hiring and firing of operators is examined in detail. This paper presents simulation results for an integrated mixed-integer model, including sensitivity analysis, for several numerical examples. The comprehensive model includes cell formation, inter- and intra-cell material handling, inventory and backorder holding, operator assignment (including resource adjustment) and flexible production routing. The model considers multi-period production planning with flexible resources (machines and operators) where each period has different demands. The results verify the validity and sensitivity of the proposed model using a genetic algorithm.

  19. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink(Registered TradeMark) (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
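
    For readers unfamiliar with Cantera, the flavour of the object-oriented property calls that the new T-MATS elements wrap is shown below using Cantera's Python interface; T-MATS itself lives in Simulink/MATLAB, so this is an analogy rather than T-MATS code, and it assumes the GRI-Mech 3.0 mechanism file shipped with recent Cantera releases.

      # Thermodynamic properties of a fuel/air mixture via Cantera's Python API.
      import cantera as ct

      gas = ct.Solution("gri30.yaml")          # mechanism shipped with recent Cantera versions
      gas.TPX = 800.0, 10 * ct.one_atm, "CH4:1, O2:2, N2:7.52"   # stoichiometric CH4/air

      print(f"h  = {gas.enthalpy_mass / 1e3:9.1f} kJ/kg")
      print(f"cp = {gas.cp_mass / 1e3:9.3f} kJ/(kg*K)")

      gas.equilibrate("HP")                    # constant enthalpy/pressure combustion
      print(f"adiabatic flame temperature = {gas.T:.0f} K")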

  20. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.

  1. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  2. An Expert System for the Development of Efficient Parallel Code

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Hao-Qiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    We have built the prototype of an expert system to assist the user in the development of efficient parallel code. The system was integrated into the parallel programming environment that is currently being developed at NASA Ames. The expert system interfaces to tools for automatic parallelization and performance analysis. It uses static program structure information and performance data in order to automatically determine causes of poor performance and to make suggestions for improvements. In this paper we give an overview of our programming environment, describe the prototype implementation of our expert system, and demonstrate its usefulness with several case studies.

  3. Overview of Heatshield for Extreme Entry Environment Technology (HEEET)

    NASA Technical Reports Server (NTRS)

    Driver, David M.; Ellerby, Donald T.; Gasch, Matthew J.; Mahzari, Milad; Milos, Frank S.; Nishioka, Owen S.; Stackpoole, Margaret M.; Venkatapathy, Ethiraj; Young, Zion W.; Gage, Peter J.

    2018-01-01

    The Heatshield for Extreme Entry Environment Technology (HEEET) project's objective is to mature a 3-D Woven Thermal Protection System (TPS) to Technology Readiness Level (TRL) 6 to support future NASA missions to destinations such as Venus and Saturn. The scope of the project, the status of which will be discussed, encompasses development of manufacturing and integration processes, fabrication of a prototype 1 m diameter engineering test unit (ETU) that will undergo a series of structural tests, characterization of material aerothermal performance including development of a material response model, and structural testing and analysis to develop tools to support design and establish system capability.

  4. Searching for justice for body and self in a coercive environment: sex work in Kerala, India.

    PubMed

    Jayasree, A K

    2004-05-01

    Sex workers in Kerala, India, live in a coercive environment and face violence from the police and criminals, lack of shelter, lack of childcare support and have many physical and mental health problems. This paper documents the environment in which women have been selling sex in Kerala since 1995, and their efforts to claim their rights. It is based on sex workers' own reports and experiences, a situation analysis and a needs assessment study by the Foundation for Integrated Research in Mental Health. Involvement in HIV/AIDS prevention projects first gave sex workers in Kerala an opportunity to come together. Some have become peer educators and distribute condoms but they continue to be harassed by police. Most anti-trafficking interventions, including rescue and rehabilitation, either criminalise or victimise sex workers, and sex workers reject them as a solution to sex work. They understand that the lack of sexual fulfillment in other relationships and their own lack of access to other work and resources are the reasons why commercial sex flourishes. Sex workers are not mere victims without agency. They have a right to bodily integrity, pleasure, livelihood, self-determination and a safe working environment. Sex workers are organising themselves for these objectives and demand decriminalisation of sex work.

  5. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    NASA Astrophysics Data System (ADS)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.

  6. Motivational Support in Web 2.0 Learning Environments: A Regression Analysis Based on the Integrative Theory of Motivation, Volition and Performance

    ERIC Educational Resources Information Center

    Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo

    2014-01-01

    Web 2.0 applications have been widely applied for teaching and learning in US higher education in recent years. Their potential impact on learning motivation and learner performance, however, has not attracted substantial research efforts. To better understand how Web 2.0 applications might impact learners' motivation in higher education…

  7. Specifications of a Simulation Model for a Local Area Network Design in Support of a Stock Point Logistics Integrated Communication Environment (SPLICE).

    DTIC Science & Technology

    1983-06-01

    Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the system is characterized by projecting the arrival of units of work at SPLICE processing facilities (workload analysis) and the amount of processing resources consumed.

  8. Integrating Space Systems Operations at the Marine Expeditionary Force Level

    DTIC Science & Technology

    2015-06-01

    Acronyms defined in the document include Electromagnetic Interference, ENVI (Environment for Visualizing Images), EW (Electronic Warfare), FA40 (Space Operations Officer), SFE (Space Force Enhancement), SIGINT (Signals Intelligence), SSA (Space Situational Awareness) and SSE (Space Support Element). Tools discussed include the GPS Interference and Navigation Tool (GIANT) for providing GPS accuracy prediction reports and the Systems Toolkit (STK) for analysis.

  9. Influence of Problem-Based Learning Strategy on Enhancing Student's Industrial Oriented Competences Learned: An Action Research on Learning Weblog Analysis

    ERIC Educational Resources Information Center

    Chung, Pansy; Yeh, Ron Chuen; Chen, Yi-Cheng

    2016-01-01

    In order to respond to the ever-changing global economic environment, the technological and vocational education system in Taiwan needs to be dramatically reformed to meet the changing needs of the domestic industrial structure. Integrating practical talents with practical industrial experiences and competences can help avoid discrepancy and close the…

  10. Has God Been Expelled from School: An Investigation into the Evolution of the Church-State Debate within the American Public School System

    ERIC Educational Resources Information Center

    Swick, Matthew W.

    2009-01-01

    This study focused upon the infamous "Wall of Separation" between church and state--its cornerstones, apparently eternal construction process, and current structural integrity. Within the context of providing truly caring learning environments for all students, a critical analysis was completed in order to help determine whether or not…

  11. Thermal Response Modeling System for a Mars Sample Return Vehicle

    NASA Technical Reports Server (NTRS)

    Chen, Y.-K.; Miles, Frank S.; Arnold, Jim (Technical Monitor)

    2001-01-01

    A multi-dimensional, coupled thermal response modeling system for analysis of hypersonic entry vehicles is presented. The system consists of a high-fidelity Navier-Stokes equation solver (GIANTS), a two-dimensional implicit thermal response, pyrolysis and ablation program (TITAN), and a commercial finite-element thermal and mechanical analysis code (MARC). The simulations performed by this integrated system include hypersonic flowfield, fluid and solid interaction, ablation, shape change, pyrolysis gas generation and flow, and thermal response of the heatshield and structure. The thermal response of the heatshield is simulated using TITAN, and that of the underlying structure is simulated using MARC. The ablating heatshield is treated as an outer boundary condition of the structure, and continuity conditions of temperature and heat flux are imposed at the interface between TITAN and MARC. Aerothermal environments with fluid and solid interaction are predicted by coupling TITAN and GIANTS through surface energy balance equations. With this integrated system, the aerothermal environments for an entry vehicle and the thermal response of the entire vehicle can be obtained simultaneously. Representative computations for a flat-faced arc-jet test model and a proposed Mars sample return capsule are presented and discussed.
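
    The coupling between GIANTS and TITAN is driven by a surface energy balance. In its simplest non-ablating, radiative-equilibrium form that balance reduces to equating convective heating with reradiation, which the sketch below solves for wall temperature; this is far simpler than the ablating, pyrolyzing balance used in the paper, and the heating values are made up for illustration.

      # Radiative-equilibrium wall temperature from a reduced surface energy balance:
      #   q_conv = sigma * emissivity * T_w**4   (ablation and conduction neglected).
      SIGMA = 5.670374419e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)

      def radiative_equilibrium_temperature(q_conv, emissivity=0.85):
          """Wall temperature (K) for a given convective heat flux (W/m^2)."""
          return (q_conv / (SIGMA * emissivity)) ** 0.25

      for q in (0.2e6, 0.5e6, 1.0e6):          # illustrative heat fluxes, W/m^2
          t_w = radiative_equilibrium_temperature(q)
          print(f"q = {q / 1e6:.1f} MW/m^2 -> T_w = {t_w:.0f} K")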

  12. Thermal Response Modeling System for a Mars Sample Return Vehicle

    NASA Technical Reports Server (NTRS)

    Chen, Y.-K.; Milos, F. S.

    2002-01-01

    A multi-dimensional, coupled thermal response modeling system for analysis of hypersonic entry vehicles is presented. The system consists of a high fidelity Navier-Stokes equation solver (GIANTS), a two-dimensional implicit thermal response, pyrolysis and ablation program (TITAN), and a commercial finite element thermal and mechanical analysis code (MARC). The simulations performed by this integrated system include hypersonic flowfield, fluid and solid interaction, ablation, shape change, pyrolysis gas generation and flow, and thermal response of heatshield and structure. The thermal response of the heatshield is simulated using TITAN, and that of the underlying structure is simulated using MARC. The ablating heatshield is treated as an outer boundary condition of the structure, and continuity conditions of temperature and heat flux are imposed at the interface between TITAN and MARC. Aerothermal environments with fluid and solid interaction are predicted by coupling TITAN and GIANTS through surface energy balance equations. With this integrated system, the aerothermal environments for an entry vehicle and the thermal response of the entire vehicle can be obtained simultaneously. Representative computations for a flat-faced arc-jet test model and a proposed Mars sample return capsule are presented and discussed.

  13. An ethnographic object-oriented analysis of explorer presence in a volcanic terrain environment: Claims and evidence

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1994-01-01

    An ethnographic field study was conducted to investigate the nature of presence in field geology, and to develop specifications for domain-based planetary exploration systems utilizing virtual presence. Two planetary geologists were accompanied on a multi-day geologic field trip that they had arranged for their own scientific purposes, which centered on an investigation of the extraordinary xenolith/nodule deposits in the Kaupulehu lava flow of Hualalai Volcano, on the island of Hawaii. The geologists were observed during the course of their field investigations and interviewed regarding their activities and ideas. Analysis of the interview resulted in the identification of key domain entities and their attributes, relations among the entities, and explorer interactions with the environment. The results support and extend the author's previously reported continuity theory of presence, indicating that presence in field geology is characterized by persistent engagement with objects associated by metonymic relations. The results also provide design specifications for virtual planetary exploration systems, including an integrating structure for disparate data integration. Finally, the results suggest that unobtrusive participant observation coupled with field interviews is an effective research methodology for engineering ethnography.

  14. Environmental impact assessment of transportation projects: An analysis using an integrated GIS, remote sensing, and spatial modeling approach

    NASA Astrophysics Data System (ADS)

    El-Gafy, Mohamed Anwar

    Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads are closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geo-spatial information in order to make an assessment, there are no fixed rules for how to conduct an environmental assessment; the particular objective of each assessment is dictated case by case, based on what information and analyses are required. The conventional approach to an Environmental Impact Assessment (EIA) study is time consuming because a large number of dependent and independent variables, each with different consequences, have to be taken into account. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the Environmental Impact Assessment (EIA) for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision-making process; (2) be applicable both to urban and inter-urban projects, regardless of transport mode; and (3) present the data and perform the appropriate analysis to support the decisions of decision-makers and allow them to present these data at public hearings in a simple manner. Case studies of transportation projects in the State of Florida were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities. This cohesive and integrated system will facilitate rational decisions through cost-effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
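
    The map-overlay half of the framework can be reduced to a weighted sum of normalized criterion rasters, each describing one environmental factor along the proposed corridor. The toy arrays and weights below are invented; an actual application would pull the rasters from the GIS and remote-sensing layers described in the study and choose the weights with stakeholders.

      # Weighted map overlay on toy criterion rasters (cells already scaled to 0-1,
      # where 1 means most environmentally vulnerable).
      import numpy as np

      land_cover   = np.array([[0.2, 0.8], [0.5, 0.9]])
      water_bodies = np.array([[0.1, 0.9], [0.3, 0.7]])
      noise        = np.array([[0.4, 0.6], [0.2, 0.5]])

      weights = {"land_cover": 0.5, "water_bodies": 0.3, "noise": 0.2}

      vulnerability = (weights["land_cover"] * land_cover
                       + weights["water_bodies"] * water_bodies
                       + weights["noise"] * noise)
      print(vulnerability)        # composite environmental-vulnerability surface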

  15. Overview of ICE Project: Integration of Computational Fluid Dynamics and Experiments

    NASA Technical Reports Server (NTRS)

    Stegeman, James D.; Blech, Richard A.; Babrauckas, Theresa L.; Jones, William H.

    2001-01-01

    Researchers at the NASA Glenn Research Center have developed a prototype integrated environment for interactively exploring, analyzing, and validating information from computational fluid dynamics (CFD) computations and experiments. The Integrated CFD and Experiments (ICE) project is a first attempt at providing a researcher with a common user interface for control, manipulation, analysis, and data storage for both experiments and simulation. ICE can be used as a live, on-line system that displays and archives data as they are gathered; as a postprocessing system for dataset manipulation and analysis; and as a control interface or "steering mechanism" for simulation codes while visualizing the results. Although the full capabilities of ICE have not been completely demonstrated, this report documents the current system. Various applications of ICE are discussed: a low-speed compressor, a supersonic inlet, real-time data visualization, and a parallel-processing simulation code interface. A detailed data model for the compressor application is included in the appendix.

  16. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
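
    The Micro-Manager core that AutoMicromanager scripts against can also be driven directly from Python. A minimal acquisition with the standard MMCorePy binding and the demo hardware configuration might look like the sketch below; the method names follow the commonly documented CMMCore API, and the configuration file name is an assumption to adjust for your installation.

      # Snap one image from Micro-Manager's demo camera via the MMCorePy binding.
      # (Configuration file name is an assumption; adjust to your installation.)
      import MMCorePy

      mmc = MMCorePy.CMMCore()
      mmc.loadSystemConfiguration("MMConfig_demo.cfg")   # demo camera/stage/shutter set
      mmc.setExposure(50)                                # exposure time in ms
      mmc.snapImage()
      img = mmc.getImage()                               # image as a numpy array
      print(img.shape, img.dtype)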

  17. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  18. The Correlation of Geo-Ecological Environment and Mountain Urban planning

    NASA Astrophysics Data System (ADS)

    Yang, Chun; Zeng, Wei

    2018-01-01

    As a special area with a complex geological structure, a mountain city is more prone to geological disasters. As geo-ecological environment problems such as air pollution, ground subsidence, serious water pollution, earthquakes and floods have become increasingly serious, mountain urban planning faces more severe challenges. Therefore, this article is based on research into the correlation between the geo-ecological environment and mountain urban planning. It re-examines mountain urban planning from a geo-ecological perspective, coordinates the relationship between humans and nature through geo-ecological thinking, and raises the questions to which urban planning needs to pay attention. It advocates creating an integrated system of geo-ecological and mountain urban planning, and analyzes the status and dynamics of present mountain urban planning.

  19. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Irwin, S. H.; NELSON; Roleyni, G.

    1977-01-01

    Optimal design studies of MLS angle receivers and a theoretical design study of MLS DME receivers are reported. The angle-receiver results include an integration of the scan data processor and tracking filter components of the optimal receiver into a unified structure. An extensive simulation study was made comparing the performance of the optimal and threshold receivers in a wide variety of representative dynamical interference environments; the optimal receiver was generally superior. A simulation of the performance of the threshold and delay-and-compare receivers in various signal environments was also performed. An analysis of combined errors due to lateral reflections from vertical structures with small differential path delays, specular ground reflections with negligible differential path delays, and thermal noise in the receivers is provided.

  20. Open discovery: An integrated live Linux platform of Bioinformatics tools.

    PubMed

    Vetrivel, Umashankar; Pilla, Kalabharath

    2008-01-01

    Historically, live Linux distributions for bioinformatics have paved the way for portability of the bioinformatics workbench in a platform-independent manner. However, most existing live Linux distributions limit their usage to sequence analysis and basic molecular visualization programs and are devoid of data persistence. Hence, Open Discovery, a live Linux distribution, has been developed with the capability to perform complex tasks like molecular modeling, docking and molecular dynamics in a swift manner. Furthermore, it is also equipped with a complete sequence analysis environment and is capable of running Windows executable programs in a Linux environment. Open Discovery portrays the advanced customizable configuration of Fedora, with data persistency accessible via USB drive or DVD. Open Discovery is distributed free under the Academic Free License (AFL) and can be downloaded from http://www.OpenDiscovery.org.in.

  1. MONTE: the next generation of mission design and navigation software

    NASA Astrophysics Data System (ADS)

    Evans, Scott; Taber, William; Drain, Theodore; Smith, Jonathon; Wu, Hsi-Cheng; Guevara, Michelle; Sunseri, Richard; Evans, James

    2018-03-01

    The Mission analysis, Operations and Navigation Toolkit Environment (MONTE) (Sunseri et al. in NASA Tech Briefs 36(9), 2012) is an astrodynamic toolkit produced by the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory. It provides a single integrated environment for all phases of deep space and Earth-orbiting missions. Capabilities include trajectory optimization and analysis, operational orbit determination, flight path control, and 2D/3D visualization. MONTE is presented to the user as an importable Python language module, which allows a simple but powerful user interface via a command-line user interface (CLUI) or scripts. In addition, the Python interface allows MONTE to be used seamlessly with other canonical scientific programming tools such as SciPy, NumPy, and Matplotlib. MONTE is the prime operational orbit determination software for all JPL-navigated missions.
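
    MONTE's own interfaces are not described in this abstract, so the sketch below only illustrates the style of scripted workflow that an importable Python module enables alongside NumPy, SciPy, and Matplotlib: a plain two-body propagation of a low Earth orbit, not a MONTE computation.

        # Not MONTE itself: a minimal stand-in for the NumPy/SciPy/Matplotlib
        # workflow an importable astrodynamics module plugs into.
        # Two-body propagation of a roughly circular low Earth orbit.
        import numpy as np
        from scipy.integrate import solve_ivp
        import matplotlib.pyplot as plt

        MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

        def two_body(t, y):
            r = y[:3]
            a = -MU * r / np.linalg.norm(r) ** 3
            return np.concatenate((y[3:], a))

        y0 = [7000.0, 0.0, 0.0, 0.0, 7.546, 0.0]   # km and km/s
        t_eval = np.linspace(0, 6000, 2000)        # about one orbital period
        sol = solve_ivp(two_body, (0, 6000), y0, t_eval=t_eval, rtol=1e-9)

        plt.plot(sol.y[0], sol.y[1])
        plt.xlabel("x [km]"); plt.ylabel("y [km]"); plt.axis("equal")
        plt.title("Two-body propagation (illustrative)")
        plt.show()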

  2. GRAPE: a graphical pipeline environment for image analysis in adaptive magnetic resonance imaging.

    PubMed

    Gabr, Refaat E; Tefera, Getaneh B; Allen, William J; Pednekar, Amol S; Narayana, Ponnada A

    2017-03-01

    We present a platform, GRAphical Pipeline Environment (GRAPE), to facilitate the development of patient-adaptive magnetic resonance imaging (MRI) protocols. GRAPE is an open-source project implemented in the Qt C++ framework to enable graphical creation, execution, and debugging of real-time image analysis algorithms integrated with the MRI scanner. The platform provides the tools and infrastructure to design new algorithms, build and execute an array of image analysis routines, and incorporate existing analysis libraries, all within a graphical environment. The application of GRAPE is demonstrated in multiple MRI applications, and the software is described in detail for both the user and the developer. GRAPE was successfully used to implement and execute three applications in MRI of the brain, performed on a 3.0-T MRI scanner: (i) a multi-parametric pipeline for segmenting the brain tissue and detecting lesions in multiple sclerosis (MS), (ii) patient-specific optimization of the 3D fluid-attenuated inversion recovery MRI scan parameters to enhance the contrast of brain lesions in MS, and (iii) an algebraic image method for combining two MR images for improved lesion contrast. GRAPE allows graphical development and execution of image analysis algorithms for inline, real-time, and adaptive MRI applications.

  3. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    PubMed

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of a semantic data integration environment that is part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.

  4. Core Values | NREL

    Science.gov Websites

    NREL's core values are rooted in a safe and supportive work environment and guide everyday actions and efforts: a safe and supportive work environment; respect for rights and for the physical and social environment; and integrity, maintaining the highest standard of ethics, honesty, and integrity.

  5. The five-box method: The "four-box method" for the Catholic physician.

    PubMed

    Marugg, Lindsey; Atkinson, Marie-Noelle; Fernandes, Ashley

    2014-11-01

    The traditional ethical model of the "Four-Box Method" can be adapted to integrate the perspective of a Catholic physician. In an increasingly secularist environment, medical students and physicians are often asked to "leave religious beliefs at the door" and not consider the care and stewardship of our own morality and involvement as a provider. We reject this view. A patient's own religious and moral beliefs should be respected to the extent that they do not destroy our own; for us, the Catholic viewpoint can shine a light into dark corners and aid us in translating true things to patients of any religion. We analyzed a sample case in five different categories: medical indications, patient preferences, quality of life, contextual features, and the Catholic context. We explored how to methodically integrate the perspective of a Catholic physician into the analysis of this case to make the best decision for the patients. We felt that we were successfully able to integrate this perspective and create a "fifth box" based on the principles of Catholic social teaching. There were also points during the analysis that the perspective of the Catholic physician was integrated into the discussion of medical indications, proving to us that the "Catholic perspective" cannot be just put in one box either. By expanding to a "fifth box" of Catholic social teaching, the Catholic physician finds a way to methodically analyze an ethical scenario. This case study is an example of this type of "five-box" analysis.

  6. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  7. The collaboratory for MS3D: a new cyberinfrastructure for the structural elucidation of biological macromolecules and their assemblies using mass spectrometry-based approaches.

    PubMed

    Yu, Eizadora T; Hawkins, Arie; Kuntz, Irwin D; Rahn, Larry A; Rothfuss, Andrew; Sale, Kenneth; Young, Malin M; Yang, Christine L; Pancerella, Carmen M; Fabris, Daniele

    2008-11-01

    Modern biomedical research is evolving with the rapid growth of diverse data types, biophysical characterization methods, computational tools and extensive collaboration among researchers spanning various communities and having complementary backgrounds and expertise. Collaborating researchers are increasingly dependent on shared data and tools made available by other investigators with common interests, thus forming communities that transcend the traditional boundaries of the single research laboratory or institution. Barriers, however, remain to the formation of these virtual communities, usually due to the steep learning curve associated with becoming familiar with new tools, or to the difficulties associated with transferring data between tools. Recognizing the need for shared reference data and analysis tools, we are developing an integrated knowledge environment that supports productive interactions among researchers. Here we report on our current collaborative environment, the Collaboratory for MS3D (C-MS3D), which focuses on bringing together structural biologists working in the area of mass spectrometry-based methods for the analysis of tertiary and quaternary macromolecular structures (MS3D). C-MS3D is a web portal designed to provide collaborators with a shared work environment that integrates data storage and management with data analysis tools. Files are stored and archived along with pertinent metadata in such a way as to allow file handling to be tracked (data provenance) and data files to be searched using keywords and modification dates. While at this time the portal is designed around a specific application, the shared work environment is a general approach to building collaborative work groups. The goal is not only to provide a common data sharing and archiving system, but also to assist in the building of new collaborations and to spur the development of new tools and technologies.

  8. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.

  9. BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.

    PubMed

    Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron

    2009-06-01

    BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.

  10. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing-type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDF) and probability density functions (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments subjected to random pressure loading. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.
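
    IPACS itself is not publicly scriptable here, so the following minimal sketch illustrates the general idea behind the reported outputs: sample uncertain inputs, propagate them through an assumed Basquin-type fatigue-life relation, and read off an empirical distribution and a crude sensitivity measure. The input distributions and the life model are assumptions for illustration, not IPACS's models.

        # Illustrative Monte Carlo treatment of fatigue life under uncertain
        # inputs, in the spirit of the CDF/PDF and sensitivity outputs the
        # abstract describes. Distributions and the Basquin-type model are
        # assumptions, not IPACS's.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Uncertain inputs (assumed distributions)
        stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
        coeff  = rng.lognormal(mean=np.log(2.0e12), sigma=0.15, size=n)  # Basquin coefficient
        expo   = rng.normal(loc=-3.0, scale=0.05, size=n)                # Basquin exponent

        life = coeff * stress ** expo   # cycles to failure, N = C * S^b

        # Empirical distribution of fatigue life
        print("Median life :", np.median(life))
        print("1st pct life:", np.percentile(life, 1.0))
        design_life = 2.0e4
        print("P(life < %g cycles) = %.4f" % (design_life, np.mean(life < design_life)))

        # Crude sensitivity: correlation of log-life with each log-input
        for name, x in [("stress", stress), ("coeff", coeff)]:
            r = np.corrcoef(np.log(life), np.log(x))[0, 1]
            print("corr(log life, log %s) = %+.2f" % (name, r))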

  11. ISTAR: Intelligent System for Telemetry Analysis in Real-time

    NASA Technical Reports Server (NTRS)

    Simmons, Charles

    1994-01-01

    The intelligent system for telemetry analysis in real-time (ISTAR) is an advanced vehicle monitoring environment incorporating expert systems, analysis tools, and on-line hypermedia documentation. The system was developed for the Air Force Space and Missile Systems Center (SMC) in Los Angeles, California, in support of the inertial upper stage (IUS) booster vehicle. Over a five-year period the system progressed from rapid prototype to operational system. ISTAR has been used to support five IUS missions and countless mission simulations. A significant number of lessons were learned with respect to integrating an expert system capability into an existing ground system.

  12. Decision Making in Nursing Practice: A Concept Analysis.

    PubMed

    Johansen, Mary L; O'Brien, Janice L

    2016-01-01

    The study aims to gain an understanding of the concept of decision making as it relates to the nurse practice environment. Rodgers' evolutionary method on concept analysis was used as a framework for the study of the concept. Articles from 1952 to 2014 were reviewed from PsycINFO, Medline, Cumulative Index to Nursing and Allied Health Literature (CINAHL), JSTOR, PubMed, and Science Direct. Findings suggest that decision making in the nurse practice environment is a complex process, integral to the nursing profession. The definition of decision making, and the attributes, antecedents, and consequences, are discussed. Contextual factors that influence the process are also discussed. An exemplar is presented to illustrate the concept. Decision making in the nurse practice environment is a dynamic conceptual process that may affect patient outcomes. Nurses need to call upon ways of knowing to make sound decisions and should be self-reflective in order to develop the process further in the professional arena. The need for further research is discussed. © 2015 Wiley Periodicals, Inc.

  13. An Integrated Product Environment

    NASA Technical Reports Server (NTRS)

    Higgins, Chuck

    1997-01-01

    Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things - performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process - not the controller of it. Mechanical Advantage is a performance modeler that allows engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. Should you desire an industry-standard solid model, we would produce an ACIS-based solid model. Should you desire an ANSI/ISO-standard drawing, we would produce this as well at the virtual push of a button. For more information on this and other Advantage Series products, please contact the author.

  14. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    PubMed

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign for industry, the first thing is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical change of the business environment after beginning the zero accident campaign through quantitative time series analysis methods. These methods include sum of squared errors (SSE), regression analysis method (RAM), exponential smoothing method (ESM), double exponential smoothing method (DESM), auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). The program is developed to estimate the accident rate, zero accident time and achievement probability of an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) software of Visual Studio 2008 was used to develop a zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
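
    The following sketch shows how one of the listed methods, an ARIMA model, can be fitted to a yearly accident-rate series and projected forward to read off an approximate near-zero-accident time. The data are synthetic and the statsmodels library stands in for the paper's MFC program; both are assumptions for illustration.

        # Sketch under assumptions: synthetic yearly accident-rate data and the
        # statsmodels library stand in for the paper's MFC program and data.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        # Synthetic declining accident rate (per 100 workers), 1990-2009
        years = pd.date_range("1990", periods=20, freq="YS")
        noise = np.random.default_rng(1).normal(0, 0.03, 20)
        rate = 1.8 * np.exp(-0.06 * np.arange(20)) + noise
        series = pd.Series(rate, index=years)

        model = ARIMA(series, order=(1, 1, 0)).fit()
        forecast = model.forecast(steps=10)   # project the next 10 years
        print(forecast.round(3))

        # A simple "near-zero accident time" read-off from the projection
        near_zero = forecast[forecast < 0.5]
        if len(near_zero):
            print("First projected year below 0.5:", near_zero.index[0].year)
        else:
            print("Rate stays above 0.5 within the forecast horizon")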

  15. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
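
    The abstract specifies only that results are written in text and CSV formats; the sketch below assumes a hypothetical CSV layout (one row per spectrum, one column per integral region) to show the kind of downstream batch analysis this output enables in a general-purpose environment.

        # Hypothetical downstream analysis of ImatraNMR batch output. The file
        # name and column layout ("sample" plus one column per integral region)
        # are assumptions; the abstract only states that results are text/CSV.
        import pandas as pd

        df = pd.read_csv("imatra_integrals.csv")   # e.g. sample, peak_1.2ppm, peak_3.4ppm, ...
        peaks = df.drop(columns=["sample"])

        # Normalize each spectrum to its total integral, then summarize the batch
        normalized = peaks.div(peaks.sum(axis=1), axis=0)
        summary = normalized.agg(["mean", "std"]).T
        print(summary)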

  16. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    PubMed Central

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
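
    NeuroMatic runs inside Igor Pro, so its own code is not shown here; as a language-neutral illustration of the simplest model class it lists, the following is a generic leaky integrate-and-fire simulation with arbitrary parameters.

        # Generic leaky integrate-and-fire simulation, illustrating the class of
        # single-compartment model the abstract lists. Parameters are arbitrary;
        # this is not NeuroMatic code (NeuroMatic runs inside Igor Pro).
        import numpy as np

        dt, T = 0.1e-3, 0.5            # time step and duration (s)
        tau, R = 20e-3, 100e6          # membrane time constant (s), resistance (ohm)
        v_rest, v_thresh, v_reset = -70e-3, -50e-3, -65e-3
        i_inj = 0.25e-9                # injected current (A)

        t = np.arange(0, T, dt)
        v = np.full(t.size, v_rest)
        spikes = []

        for k in range(1, t.size):
            dv = (-(v[k-1] - v_rest) + R * i_inj) / tau * dt
            v[k] = v[k-1] + dv
            if v[k] >= v_thresh:       # threshold crossing: record spike, reset
                spikes.append(t[k])
                v[k] = v_reset

        print(f"{len(spikes)} spikes, mean rate {len(spikes)/T:.1f} Hz")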

  17. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, Noel

    2013-04-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration, and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based datasets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it is required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
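
    A brief sketch of the lazy computation model described above, assuming the Earth Engine Python client library and an authenticated account; the Landsat collection ID, band, and region are examples only.

        # Sketch assuming the Earth Engine Python client (`ee`) and an
        # authenticated account; collection ID, band, and region are examples.
        import ee

        ee.Initialize()

        region = ee.Geometry.Rectangle([-122.6, 37.3, -121.8, 37.9])

        # Building this expression sends nothing to the servers yet:
        # computation is deferred until a result is requested.
        composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                     .filterDate("2020-06-01", "2020-09-01")
                     .filterBounds(region)
                     .median())

        stats = composite.select("SR_B5").reduceRegion(
            reducer=ee.Reducer.mean(), geometry=region, scale=500)

        # Only this call triggers server-side evaluation and returns the value.
        print(stats.getInfo())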

  18. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, N.

    2012-12-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration, and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based datasets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it is required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  19. The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments

    ERIC Educational Resources Information Center

    Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne

    2018-01-01

    This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…

  20. Integrated care reform in urban China: a qualitative study on design, supporting environment and implementation.

    PubMed

    Qian, Yi; Hou, Zhiyuan; Wang, Wei; Zhang, Donglan; Yan, Fei

    2017-10-25

    Initiatives on integrated care between hospitals and community health centers (CHCs) have been introduced to transform the current fragmented health care delivery system in China into an integrated system. To date, no research has analyzed in depth the experiences of these initiatives from the perspectives of various stakeholders. This study analyzed the integrated care pilot in Hangzhou City by investigating stakeholders' perspectives on its design features and supporting environment, their acceptability of the pilot, and the enabling and constraining factors that may influence the implementation of the integrated care reform. The qualitative study was carried out based on in-depth interviews and focus group discussions with 50 key informants who were involved in the policy-making process and implementation. Relevant policy documents were also collected for analysis. The pilot in Hangzhou was established as a CHC-led delivery system based on a cooperation agreement between CHCs and hospitals to deliver primary and specialty care together for patients with chronic diseases. An innovative learning-from-practice mentorship system between specialists and general practitioners was also introduced to address the limited capacity of general practitioners. The design of the pilot, its governance and organizational structure, and its human resources were enabling factors that facilitated the integrated care reform. However, the main constraining factors were the lack of an integrated payment mechanism from health insurance and the lack of a tailored information system to ensure its sustainability. The integrated care pilot in Hangzhou enabled CHCs to act as gatekeepers and care coordinators for the full continuum of services across health care providers. The government made integrated care a priority and constructed an efficient design, governance and organizational structure to enable its implementation. Health insurance should play a proactive role and adopt a shared financial incentive system to support integrated care across providers in the future.
