Sample records for integrated performance analysis

  1. Role of IAC in large space systems thermal analysis

    NASA Technical Reports Server (NTRS)

    Jones, G. K.; Skladany, J. T.; Young, J. P.

    1982-01-01

Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented, interactive-graphics-based analysis capability. The Integrated Analysis Capability (IAC) system can be viewed both as a core framework that serves as an integrating base to which users can readily add desired analysis modules, and as a self-contained interdisciplinary system analysis capability with a specific set of fully integrated multidisciplinary analysis programs addressing the coupling of the thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.

  2. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  3. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  4. The Function of Neuroendocrine Cells in Prostate Cancer

    DTIC Science & Technology

    2013-04-01

integration site. We then performed deep sequencing and aligned reads to the genome. Our analysis revealed that both histological phenotypes are derived from...lentiviral integration site analysis. (B) Laser capture microdissection was performed on individual glands containing both squamous and...lentiviral integration site analysis. LTR: long terminal repeat (viral DNA), PCR: polymerase chain reaction. (D) Venn diagrams depict shared lentiviral

  5. Box truss analysis and technology development. Task 1: Mesh analysis and control

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.

    1985-01-01

An analytical tool was developed to model, analyze, and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system can determine the RF performance of antennas under any type of manufacturing or operating environment by integrating the disciplines of design, finite element analysis, surface best-fit analysis, and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver, and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
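
    The record does not give the RF computation itself; one standard way to relate mesh surface error to antenna gain loss is the Ruze equation. The Python sketch below is a minimal illustration of that relation only, not the Integrated Mesh Analysis System's actual code, and the example frequency and error level are assumptions.

```python
import numpy as np

def ruze_gain_loss_db(rms_surface_error_m, wavelength_m):
    """Approximate antenna gain loss (dB) from RMS surface error (Ruze equation).

    G/G0 = exp(-(4*pi*eps/lambda)**2), where eps is the RMS surface error.
    """
    gain_ratio = np.exp(-(4.0 * np.pi * rms_surface_error_m / wavelength_m) ** 2)
    return -10.0 * np.log10(gain_ratio)

# Example (assumed numbers): 0.5 mm RMS mesh error at 20 GHz (lambda ~ 15 mm).
wavelength = 3.0e8 / 20.0e9
print(f"Gain loss: {ruze_gain_loss_db(0.5e-3, wavelength):.2f} dB")
```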

  6. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process that draws experienced people from various disciplines and focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  7. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.
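
    The abstract does not describe the algorithms inside the integrated environment; as a minimal illustration of the underlying idea, the sketch below computes one-at-a-time finite-difference sensitivities of a hypothetical response function. The model and step size are assumptions, not part of the DART Workbench.

```python
import numpy as np

def model(x):
    # Hypothetical engineering response; a real study would call the simulation here.
    return x[0] ** 2 + 3.0 * x[1] + np.sin(x[2])

def local_sensitivities(f, x0, rel_step=1e-6):
    """One-at-a-time finite-difference sensitivities df/dx_i evaluated at x0."""
    x0 = np.asarray(x0, dtype=float)
    f0 = f(x0)
    sens = np.zeros_like(x0)
    for i in range(x0.size):
        h = rel_step * max(abs(x0[i]), 1.0)
        xp = x0.copy()
        xp[i] += h
        sens[i] = (f(xp) - f0) / h
    return sens

print(local_sensitivities(model, [1.0, 2.0, 0.5]))  # ~[2.0, 3.0, cos(0.5)]
```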

  8. Analysis of integrated healthcare networks' performance: a contingency-strategic management perspective.

    PubMed

    Lin, B Y; Wan, T T

    1999-12-01

Few empirical analyses have been done in organizational research on integrated healthcare networks (IHNs) or integrated healthcare delivery systems. Using a contingency-derived context-process-performance model, this study attempts to explore the relationships among an IHN's strategic direction, structural design, and performance. A cross-sectional analysis of 100 IHNs suggests that certain contextual factors, such as market competition, network age, and tax status, have statistically significant effects on the implementation of an IHN's service differentiation strategy, which addresses coordination and control in the market. An IHN's service differentiation strategy is positively related to its integrated structural design, which is characterized by the integration of administration, patient care, and information systems across different settings. However, no evidence supports the idea that developing an integrated structural design benefits an IHN's performance in terms of clinical efficiency and financial viability.

  9. Integrated Modeling for the James Webb Space Telescope (JWST) Project: Structural Analysis Activities

    NASA Technical Reports Server (NTRS)

    Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark

    2004-01-01

    This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.

  10. Integrated corridor management initiative : demonstration phase evaluation – Dallas corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor Management (ICM) Initiative Demonstration. The ICM ...

  11. Integrated corridor management initiative : demonstration phase evaluation – San Diego corridor performance analysis test plan.

    DOT National Transportation Integrated Search

    2012-08-01

    This report presents the test plan for conducting the Corridor Performance Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  12. The application of integral performance criteria to the analysis of discrete maneuvers in a driving simulator

    NASA Technical Reports Server (NTRS)

    Repa, B. S.; Zucker, R. S.; Wierwille, W. W.

    1977-01-01

    The influence of vehicle transient response characteristics on driver-vehicle performance in discrete maneuvers as measured by integral performance criteria was investigated. A group of eight ordinary drivers was presented with a series of eight vehicle transfer function configurations in a driving simulator. Performance in two discrete maneuvers was analyzed by means of integral performance criteria. Results are presented.

  13. International Space Station Configuration Analysis and Integration

    NASA Technical Reports Server (NTRS)

    Anchondo, Rebekah

    2016-01-01

    Ambitious engineering projects, such as NASA's International Space Station (ISS), require dependable modeling, analysis, visualization, and robotics to ensure that complex mission strategies are carried out cost effectively, sustainably, and safely. Learn how Booz Allen Hamilton's Modeling, Analysis, Visualization, and Robotics Integration Center (MAVRIC) team performs engineering analysis of the ISS Configuration based primarily on the use of 3D CAD models. To support mission planning and execution, the team tracks the configuration of ISS and maintains configuration requirements to ensure operational goals are met. The MAVRIC team performs multi-disciplinary integration and trade studies to ensure future configurations meet stakeholder needs.

  14. Integration of Pharmacy Practice and Pharmaceutical Analysis: Quality Assessment of Laboratory Performance.

    ERIC Educational Resources Information Center

    McGill, Julian E.; Holly, Deborah R.

    1996-01-01

    Laboratory portions of courses in pharmacy practice and pharmaceutical analysis at the Medical University of South Carolina are integrated and coordinated to provide feedback on student performance in compounding medications. Students analyze the products they prepare, with early exposure to compendia requirements and other references. Student…

  15. Integrative analysis of environmental sequences using MEGAN4.

    PubMed

    Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C

    2011-09-01

    A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.
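
    MEGAN assigns reads to taxa with a lowest-common-ancestor (LCA) style algorithm over the NCBI taxonomy; the toy sketch below illustrates LCA binning on a tiny hand-built taxonomy (an assumption for illustration only) and is not MEGAN4's implementation.

```python
# Toy LCA-style read binning: each read is assigned to the deepest taxon shared
# by all of its hits. The tiny taxonomy below is a made-up example.
parent = {
    "E. coli": "Escherichia", "Escherichia": "Enterobacteriaceae",
    "Salmonella enterica": "Salmonella", "Salmonella": "Enterobacteriaceae",
    "Enterobacteriaceae": "Bacteria", "Bacteria": "root",
}

def lineage(taxon):
    """Path from the root down to the given taxon."""
    path = [taxon]
    while taxon in parent:
        taxon = parent[taxon]
        path.append(taxon)
    return path[::-1]

def lca(taxa):
    """Deepest taxon common to every lineage (lowest common ancestor)."""
    paths = [lineage(t) for t in taxa]
    common = None
    for nodes in zip(*paths):
        if len(set(nodes)) == 1:
            common = nodes[0]
        else:
            break
    return common

print(lca(["E. coli", "Salmonella enterica"]))  # -> Enterobacteriaceae
```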

  16. Integration mechanisms and hospital efficiency in integrated health care delivery systems.

    PubMed

    Wan, Thomas T H; Lin, Blossom Yen-Ju; Ma, Allen

    2002-04-01

This study analyzes integration mechanisms that affect system performance as measured by indicators of efficiency in integrated delivery systems (IDSs) in the United States. The research question is: do integration mechanisms improve IDSs' efficiency in hospital care? The American Hospital Association's Annual Survey (1998) and Dorenfest's Survey on Information Systems in Integrated Healthcare Delivery Systems (1998) were used to conduct the study, using the IDS as the unit of analysis. A covariance structure equation model of the effects of system integration mechanisms on IDS performance was formulated and validated by an empirical examination of IDSs. The study sample includes 973 hospital-based integrated healthcare delivery systems operating in the United States, carried in the list of Dorenfest's Survey on Information Systems in Integrated Healthcare Delivery Systems. The measurement indicators of system integration mechanisms are categorized into six related domains: informatic integration, case management, hybrid physician-hospital integration, forward integration, backward integration, and high-tech medical services. The multivariate analysis reveals that integration mechanisms in system operation are positively correlated and positively affect IDSs' efficiency. The six domains of integration mechanisms account for 58.9% of the total variance in hospital performance. Service differentiation strategies, such as offering more high-tech medical services, have a much stronger influence on efficiency than other integration mechanisms do. The beneficial effects of integration mechanisms have been realized in IDS performance. High efficiency in hospital care can be achieved by employing proper integration strategies in operations.

  17. Development and Testing of an Integrated Sandia Cooler Thermoelectric Device (SCTD).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Terry A.; Staats, Wayne Lawrence,; Leick, Michael Thomas

This report describes a FY14 effort to develop an integrated Sandia Cooler Thermoelectric Device (SCTD). The project included a review of feasible thermoelectric (TE) cooling applications, baseline performance testing of an existing TE device, analysis and design development of an integrated SCTD assembly, and performance measurement and validation of the integrated SCTD prototype.

  18. Integrated Design and Engineering Analysis (IDEA) Environment - Propulsion Related Module Development and Vehicle Integration

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2013-01-01

This report documents the work performed during the period from May 2011 to October 2012 on the Integrated Design and Engineering Analysis (IDEA) environment. IDEA is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML). This report focuses on the work done in the areas of: (1) integrating propulsion data (turbines, rockets, and scramjets) into the system and using the data to perform trajectory analysis; (2) developing a parametric packaging strategy for hypersonic air-breathing vehicles that allows tank resizing when multiple fuels and/or oxidizers are part of the configuration; and (3) vehicle scaling and closure strategies.

  19. Integrated Aero-Propulsion CFD Methodology for the Hyper-X Flight Experiment

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.; Engelund, Walter C.; Bittner, Robert D.; Dilley, Arthur D.; Jentink, Tom N.; Frendi, Abdelkader

    2000-01-01

    Computational fluid dynamics (CFD) tools have been used extensively in the analysis and development of the X-43A Hyper-X Research Vehicle (HXRV). A significant element of this analysis is the prediction of integrated vehicle aero-propulsive performance, which includes an integration of aerodynamic and propulsion flow fields. This paper describes analysis tools used and the methodology for obtaining pre-flight predictions of longitudinal performance increments. The use of higher-fidelity methods to examine flow-field characteristics and scramjet flowpath component performance is also discussed. Limited comparisons with available ground test data are shown to illustrate the approach used to calibrate methods and assess solution accuracy. Inviscid calculations to evaluate lateral-directional stability characteristics are discussed. The methodology behind 3D tip-to-tail calculations is described and the impact of 3D exhaust plume expansion in the afterbody region is illustrated. Finally, future technology development needs in the area of hypersonic propulsion-airframe integration analysis are discussed.

  20. Performance analysis of an integrated GPS/inertial attitude determination system. M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Sullivan, Wendy I.

    1994-01-01

    The performance of an integrated GPS/inertial attitude determination system is investigated using a linear covariance analysis. The principles of GPS interferometry are reviewed, and the major error sources of both interferometers and gyroscopes are discussed and modeled. A new figure of merit, attitude dilution of precision (ADOP), is defined for two possible GPS attitude determination methods, namely single difference and double difference interferometry. Based on this figure of merit, a satellite selection scheme is proposed. The performance of the integrated GPS/inertial attitude determination system is determined using a linear covariance analysis. Based on this analysis, it is concluded that the baseline errors (i.e., knowledge of the GPS interferometer baseline relative to the vehicle coordinate system) are the limiting factor in system performance. By reducing baseline errors, it should be possible to use lower quality gyroscopes without significantly reducing performance. For the cases considered, single difference interferometry is only marginally better than double difference interferometry. Finally, the performance of the system is found to be relatively insensitive to the satellite selection technique.
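
    The thesis's exact ADOP formulation is not reproduced here; as a hedged illustration of a dilution-of-precision style figure of merit, the sketch below computes sqrt(trace((H^T H)^-1)) for an assumed geometry matrix H. Only the linear-algebra step is shown, not the interferometry model itself.

```python
import numpy as np

def dop(H):
    """Generic dilution-of-precision style metric: sqrt(trace((H^T H)^-1)).

    For GPS attitude determination, rows of H would be built from the baseline
    and satellite line-of-sight geometry (single- or double-differenced).
    """
    H = np.asarray(H, dtype=float)
    return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

# Assumed 4x3 geometry matrix (four observations, three unknowns).
H = np.array([[ 0.6,  0.8, 1.0],
              [-0.7,  0.1, 1.0],
              [ 0.2, -0.9, 1.0],
              [-0.3, -0.4, 1.0]])
print(dop(H))
```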

  1. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

Designer-control parameters were identified at interdiscipline interfaces to optimize structural system performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance-discipline integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted to be shapes, dimensions, probability range factors, and cost. A structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, and it is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.
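
    As a minimal illustration of the first-order reliability ideas mentioned above, the sketch below computes the classic stress-strength reliability index assuming independent, normally distributed strength and load. The numbers are assumptions, and this is not the paper's combined deterministic-reliability technique.

```python
import math

def reliability_index(mean_strength, std_strength, mean_load, std_load):
    """First-order reliability index beta for the limit state g = R - S,
    assuming strength R and load S are independent and normally distributed."""
    return (mean_strength - mean_load) / math.sqrt(std_strength**2 + std_load**2)

def failure_probability(beta):
    """P(g < 0) = Phi(-beta) for the normal case."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

beta = reliability_index(mean_strength=400.0, std_strength=25.0,
                         mean_load=300.0, std_load=30.0)
print(beta, failure_probability(beta))
```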

  2. United we stand, divided we fall: a meta-analysis of experiments on clonal integration and its relationship to invasiveness.

    PubMed

    Song, Yao-Bin; Yu, Fei-Hai; Keser, Lidewij H; Dawson, Wayne; Fischer, Markus; Dong, Ming; van Kleunen, Mark

    2013-02-01

    Many ecosystems are dominated by clonal plants. Among the most distinctive characteristics of clonal plants is their potential for clonal integration (i.e. the translocation of resources between interconnected ramets), suggesting that integration may play a role in their success. However, a general synthesis of effects of clonal integration on plant performance is lacking. We conducted a meta-analysis on the effects of clonal integration on biomass production and asexual reproduction of the whole clone, the recipient part (i.e. the part of a clone that imports resources) and the donor part (i.e. the part of a clone that exports resources). The final dataset contained 389 effect sizes from 84 studies covering 57 taxa. Overall, clonal integration increased performance of recipient parts without decreasing that of donor parts, and thus increased performance of whole clones. Among the studies and taxa considered, the benefits of clonal integration did not differ between two types of experimental approaches, between stoloniferous and rhizomatous growth forms, between directions of resource translocation (from younger to older ramet or vice versa), or among types of translocated resources (water, nutrients and carbohydrates). Clonal taxa with larger benefits of integration on whole-clone performance were not more invasive globally, but taxa in which recipient parts in unfavorable patches benefited more from integration were. Our results demonstrate general performance benefits of clonal integration, at least in the short term, and suggest that clonal integration contributes to the success of clonal plants.
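
    As a hedged illustration of how such a meta-analysis pools studies, the sketch below computes a log response ratio per study (connected vs. severed ramets) and an unweighted mean effect. The data are invented, and the paper's actual effect-size metric and weighting scheme may differ.

```python
import numpy as np

def log_response_ratio(mean_treatment, mean_control):
    """Effect size ln(X_t / X_c), a common metric in ecological meta-analysis."""
    return np.log(mean_treatment / mean_control)

# Invented per-study means: biomass of connected (integration allowed) vs. severed ramets.
connected = np.array([12.3, 8.1, 20.5, 5.4])
severed = np.array([9.8, 7.9, 14.2, 5.5])

effects = log_response_ratio(connected, severed)
print("per-study effects:", np.round(effects, 3))
print("mean effect:", round(float(effects.mean()), 3))  # > 0 suggests integration helps
```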

  3. Using integrated information systems in supply chain management

    NASA Astrophysics Data System (ADS)

    Gonzálvez-Gallego, Nicolás; Molina-Castillo, Francisco-Jose; Soto-Acosta, Pedro; Varajao, Joao; Trigo, Antonio

    2015-02-01

The aim of this paper is to empirically test not only the direct effects of information and communication technology (ICT) capabilities and integrated information systems (IS) on firm performance, but also the moderating role of IS integration along the supply chain in the relationship between ICT capabilities and business performance. Data collected from 102 large Iberian firms from Spain and Portugal are used to test the research model. Hierarchical multiple regression analysis is employed to test the direct effects and the moderating relationships proposed. Results show that external and internal ICT capabilities are important drivers of firm performance, while merely having integrated IS does not lead to better firm performance. In addition, a moderating effect of IS integration on the relationship between ICT capabilities and business performance is found, although this integration contributes to firm performance only when it is directed at connecting with suppliers or customers, rather than when integrating the whole supply chain.
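
    A common way to test such a moderating effect is hierarchical regression with an interaction term; the sketch below shows that pattern with statsmodels on synthetic data. The variable names and coefficients are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-level data; variable names and coefficients are assumptions.
rng = np.random.default_rng(0)
n = 102
df = pd.DataFrame({
    "ict_capability": rng.normal(size=n),
    "is_integration": rng.normal(size=n),
})
df["performance"] = (0.5 * df["ict_capability"] + 0.1 * df["is_integration"]
                     + 0.3 * df["ict_capability"] * df["is_integration"]
                     + rng.normal(scale=0.5, size=n))

# Step 1: main effects only. Step 2: add the interaction (moderation) term.
step1 = smf.ols("performance ~ ict_capability + is_integration", data=df).fit()
step2 = smf.ols("performance ~ ict_capability * is_integration", data=df).fit()
print(step1.rsquared, step2.rsquared)                  # R^2 gain suggests moderation
print(step2.params["ict_capability:is_integration"])  # interaction coefficient
```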

  4. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  5. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  6. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  7. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  8. Integrated analysis of large space systems

    NASA Technical Reports Server (NTRS)

    Young, J. P.

    1980-01-01

Based on the belief that actual flight hardware development of large space systems will necessitate a formalized method of integrating the various engineering discipline analyses, an efficient, highly user-oriented software system capable of performing interdisciplinary design analyses with tolerable solution turnaround time is planned. Specific analysis capability goals were set forth, with initial emphasis given to sequential and quasi-static thermal/structural analysis and fully coupled structural/control system analysis. Subsequently, the IAC would be expanded to include fully coupled thermal/structural/control system, electromagnetic radiation, and optical performance analyses.

  9. An integrated analysis for determining the geographical origin of medicinal herbs using ICP-AES/ICP-MS and (1)H NMR analysis.

    PubMed

    Kwon, Yong-Kook; Bong, Yeon-Sik; Lee, Kwang-Sik; Hwang, Geum-Sook

    2014-10-15

    ICP-MS and (1)H NMR are commonly used to determine the geographical origin of food and crops. In this study, data from multielemental analysis performed by ICP-AES/ICP-MS and metabolomic data obtained from (1)H NMR were integrated to improve the reliability of determining the geographical origin of medicinal herbs. Astragalus membranaceus and Paeonia albiflora with different origins in Korea and China were analysed by (1)H NMR and ICP-AES/ICP-MS, and an integrated multivariate analysis was performed to characterise the differences between their origins. Four classification methods were applied: linear discriminant analysis (LDA), k-nearest neighbour classification (KNN), support vector machines (SVM), and partial least squares-discriminant analysis (PLS-DA). Results were compared using leave-one-out cross-validation and external validation. The integration of multielemental and metabolomic data was more suitable for determining geographical origin than the use of each individual data set alone. The integration of the two analytical techniques allowed diverse environmental factors such as climate and geology, to be considered. Our study suggests that an appropriate integration of different types of analytical data is useful for determining the geographical origin of food and crops with a high degree of reliability. Copyright © 2014 Elsevier Ltd. All rights reserved.
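
    As a hedged sketch of the classification-with-leave-one-out-validation step on concatenated elemental and NMR features, the scikit-learn snippet below compares LDA, KNN, and a linear SVM on synthetic stand-in data. PLS-DA, also used in the paper, is omitted here; it can be built from PLSRegression.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: 40 samples, 20 "element" + 50 "NMR bin" features, two origins.
rng = np.random.default_rng(1)
X_elem = rng.normal(size=(40, 20))
X_nmr = rng.normal(size=(40, 50))
y = np.repeat([0, 1], 20)                 # geographic origin labels (assumed)
X_elem[y == 1, :5] += 1.5                 # inject a small origin-related shift
X = np.hstack([X_elem, X_nmr])            # "integration": concatenate data blocks

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=3),
    "SVM": SVC(kernel="linear"),
}
for name, clf in models.items():
    pipe = make_pipeline(StandardScaler(), clf)
    acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
    print(f"{name}: leave-one-out accuracy = {acc:.2f}")
```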

  10. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  11. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    DTIC Science & Technology

    2016-04-05

applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the...performance analysis of hierarchical group key management integrated with adaptive intrusion detection in mobile ad hoc...Keywords: mobile ad hoc networks, intrusion detection, group communication systems, group

  12. Integrated vehicle-based safety systems (IVBSS) : light vehicle platform field operational test data analysis plan.

    DOT National Transportation Integrated Search

    2009-12-22

This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progr...

  13. Integrated vehicle-based safety systems (IVBSS) : heavy truck platform field operational test data analysis plan.

    DOT National Transportation Integrated Search

    2009-11-23

This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progra...

  14. VISPA2: a scalable pipeline for high-throughput identification and annotation of vector integration sites.

    PubMed

    Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio

    2017-11-25

Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires computational software that is both highly accurate and efficient and able to correctly process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis, with the following features: (1) the sequence analysis for integration site processing is fully compliant with paired-end reads and includes a sequence quality filter before and after alignment on the target genome; (2) a heuristic algorithm to reduce false positive integration sites at the nucleotide level, limiting the impact of polymerase chain reaction or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface as the researcher front-end, enabling integration site analyses without computational skills; (5) speedup of all steps through parallelization (Hadoop free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines. These performances indicate that VISPA2 is a fast, reliable, and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, web access to VISPA2 ( http://openserver.itb.cnr.it/vispa/ ) ensures accessibility and ease of use for researchers of a complex analytical tool. We released the source code of VISPA2 in a public repository ( https://bitbucket.org/andreacalabria/vispa2 ).
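
    The reported precision and recall compare called integration sites against a known truth set; the sketch below shows that evaluation for exact-match site calls. It is a simplification (real pipelines typically allow a small positional tolerance) and is not VISPA2 code.

```python
def precision_recall(called_sites, true_sites):
    """Precision/recall of integration-site calls, each a (chrom, pos, strand) tuple."""
    called, truth = set(called_sites), set(true_sites)
    tp = len(called & truth)
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

truth = {("chr1", 10000, "+"), ("chr2", 55432, "-"), ("chr7", 1234, "+")}
calls = {("chr1", 10000, "+"), ("chr2", 55432, "-")}
print(precision_recall(calls, truth))  # -> (1.0, 0.667): no false calls, one missed site
```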

  15. Integrated flight/propulsion control - Subsystem specifications for performance

    NASA Technical Reports Server (NTRS)

    Neighbors, W. K.; Rock, Stephen M.

    1993-01-01

    A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.

  16. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
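
    As a generic illustration of the setup the CEM supports (an objective and constraint functions built from performance indices, plus side constraints on design variables), the SciPy sketch below minimizes a hypothetical weight index subject to a hypothetical stiffness constraint. The functions, bounds, and method choice are assumptions, not the O3 tool's interface.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical performance indices standing in for discipline analysis codes.
def structural_weight(x):
    """Objective: minimize a weight-like index of the design variables."""
    return 4.0 * x[0] + 2.0 * x[1]

def stiffness_margin(x):
    """Inequality constraint: must remain non-negative (enough stiffness)."""
    return 0.2 * x[0] + 0.5 * x[1] - 1.0

result = minimize(
    structural_weight,
    x0=np.array([1.0, 1.0]),
    method="SLSQP",                              # chosen optimization methodology
    bounds=[(0.1, 5.0), (0.1, 5.0)],             # side constraints on design variables
    constraints=[{"type": "ineq", "fun": stiffness_margin}],
)
print(result.x, result.fun)
```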

  17. The 19th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1981-01-01

The Flat-Plate Solar Array Project is described. Project analysis and integration are discussed, along with technology research in silicon material, large-area silicon sheet and environmental isolation, cell and module formation, engineering sciences, and module performance and failure analysis. The document includes a report on, and copies of visual presentations made at, the 19th Project Integration Meeting held at Pasadena, California, on November 11, 1981.

  18. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design and analysis of integrated spacecraft and automatic spacecraft modeling for lattice structures. Capabilities and performance of the multidiscipline application modules, the executive and data management software, and the graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run-time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  19. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  20. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  1. High-Performance Computing in Neuroscience for Data-Driven Discovery, Integration, and Dissemination

    DOE PAGES

    Bouchard, Kristofer E.; Aimone, James B.; Chun, Miyoung; ...

    2016-11-01

    A lack of coherent plans to analyze, manage, and understand data threatens the various opportunities offered by new neuro-technologies. High-performance computing will allow exploratory analysis of massive datasets stored in standardized formats, hosted in open repositories, and integrated with simulations.

  2. Optimal Parameter Design of Coarse Alignment for Fiber Optic Gyro Inertial Navigation System.

    PubMed

    Lu, Baofeng; Wang, Qiuying; Yu, Chunmei; Gao, Wei

    2015-06-25

Two different coarse alignment algorithms for Fiber Optic Gyro (FOG) Inertial Navigation System (INS) based on the inertial reference frame are discussed in this paper. Both are based on gravity vector integration; therefore, the performance of these algorithms is determined by the integration time. In previous works, the integration time was selected by experience. In order to give a criterion for the selection process, and to make the selection of the integration time more accurate, optimal parameter design of these algorithms for FOG INS is performed in this paper. The design process is accomplished based on an analysis of the error characteristics of these two coarse alignment algorithms. Moreover, this analysis and optimal parameter design allow us to make an adequate selection of the most accurate algorithm for FOG INS according to the actual operational conditions. The analysis and simulation results show that the parameter provided by this work is the optimal value, and indicate that in different operational conditions, the coarse alignment algorithms adopted for FOG INS are different in order to achieve better performance. Lastly, the experiment results validate the effectiveness of the proposed algorithm.

  3. Integrative prescreening in analysis of multiple cancer genomic studies

    PubMed Central

    2012-01-01

Background: In high-throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost-effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all genes profiled may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in a parallel manner, and has low computational cost. Results: An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening has better performance than alternatives, particularly including prescreening with individual datasets, an intensity approach, and meta-analysis. We also analyze multiple microarray gene profiling studies on liver and pancreatic cancers using the proposed approach. Conclusions: The proposed integrative prescreening provides an effective way to reduce the dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
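
    As a simplified stand-in for integrative prescreening, the sketch below fits a marginal test per gene in each dataset and keeps genes significant in all of them. The paper's actual marginal models and selection rule may differ; the data and thresholds here are assumptions.

```python
import numpy as np
from scipy import stats

def marginal_prescreen(datasets, alpha=0.01):
    """Keep genes whose marginal association (two-sample t-test) is significant
    in every dataset; each dataset is (X, y) with an expression matrix X and a
    binary label vector y. A simplified stand-in, not the paper's estimator."""
    keep = None
    for X, y in datasets:
        _, p = stats.ttest_ind(X[y == 1], X[y == 0], axis=0)
        selected = set(np.where(p < alpha)[0])
        keep = selected if keep is None else keep & selected
    return sorted(keep)

rng = np.random.default_rng(2)

def make_dataset(n=30, genes=500, signal=(0, 1, 2)):
    """Synthetic dataset with a mean shift injected into a few 'signal' genes."""
    y = np.repeat([0, 1], n // 2)
    X = rng.normal(size=(n, genes))
    X[np.ix_(y == 1, list(signal))] += 2.0
    return X, y

datasets = [make_dataset() for _ in range(3)]
print(marginal_prescreen(datasets))  # typically recovers genes 0, 1, 2
```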

  4. Assessing the Use of Employment Screening for Sexual Assault Prevention

    DTIC Science & Technology

    2017-01-01

designed to address. For example, one meta-analysis found the average validity coefficient, or association, between integrity tests and job performance...or a more-general trait), and the group that is tested. For example, one meta-analysis found stronger associations between integrity tests and self... Tests Overt and personality-based integrity tests use questions designed to address somewhat different but overlapping content areas (Ones

  5. Meta-Analysis of Integrity Tests: A Critical Examination of Validity Generalization and Moderator Variables.

    DTIC Science & Technology

    1992-06-01

predicting both job performance and counterproductive behaviors on the job such as theft, disciplinary problems, and absenteeism. Validities were found to...be generalizable. The estimated mean operational predictive validity of integrity tests for supervisory ratings of job performance is .41. For the

  6. Investigating the Validity of an Integrated Listening-Speaking Task: A Discourse-Based Analysis of Test Takers' Oral Performances

    ERIC Educational Resources Information Center

    Frost, Kellie; Elder, Catherine; Wigglesworth, Gillian

    2012-01-01

    Performance on integrated tasks requires candidates to engage skills and strategies beyond language proficiency alone, in ways that can be difficult to define and measure for testing purposes. While it has been widely recognized that stimulus materials impact test performance, our understanding of the way in which test takers make use of these…

  7. Integrated Modeling of Optical Systems (IMOS): An Assessment and Future Directions

    NASA Technical Reports Server (NTRS)

    Moore, Gregory; Broduer, Steve (Technical Monitor)

    2001-01-01

    Integrated Modeling of Optical Systems (IMOS) is a finite element-based code combining structural, thermal, and optical ray-tracing capabilities in a single environment for analysis of space-based optical systems. We'll present some recent examples of IMOS usage and discuss future development directions. Due to increasing model sizes and a greater emphasis on multidisciplinary analysis and design, much of the anticipated future work will be in the areas of improved architecture, numerics, and overall performance and analysis integration.

  8. Air Vehicle Integration and Technology Research (AVIATR). Task Order 0003: Condition-Based Maintenance Plus Structural Integrity (CBM+SI) Demonstration (April 2011 to August 2011)

    DTIC Science & Technology

    2011-08-01

investigated. Implementation of this technology into the maintenance framework depends on several factors, including safety of the structural system, cost... Maintenance Parameters The F-15 Program has indicated that, in practice, maintenance actions are generally performed on flight hour multiples of 200...Risk Analysis or the Perform Cost Benefit Analysis sections of the flowchart. 4.6. Determine System Configurations The current maintenance practice

  9. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  10. The Integration Process of Very Thin Mirror Shells with a Particular Regard to Simbol-X

    NASA Astrophysics Data System (ADS)

    Basso, S.; Pareschi, G.; Tagliaferri, G.; Mazzoleni, F.; Valtolina, R.; Citterio, O.; Conconi, P.

    2009-05-01

The optics of Simbol-X are very thin compared to those of previous X-ray missions (like XMM). Their shells are therefore floppy and unable to maintain the correct shape. To avoid deformation of these very thin X-ray optics during the integration process, we adopt two stiffening rings with good roundness. In this article the procedure used for the first three prototypes of the Simbol-X optics is presented, with a description of the problems involved and an analysis of the degradation of performance during integration. This analysis has been performed with UV vertical bench measurements at INAF-OAB.

  11. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  12. New Integrated Modeling Capabilities: MIDAS' Recent Behavioral Enhancements

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.; Jarvis, Peter A.

    2005-01-01

    The Man-machine Integration Design and Analysis System (MIDAS) is an integrated human performance modeling software tool that is based on mechanisms that underlie and cause human behavior. A PC-Windows version of MIDAS has been created that integrates the anthropometric character "Jack (TM)" with MIDAS' validated perceptual and attention mechanisms. MIDAS now models multiple simulated humans engaging in goal-related behaviors. New capabilities include the ability to predict situations in which errors and/or performance decrements are likely due to a variety of factors including concurrent workload and performance influencing factors (PIFs). This paper describes a new model that predicts the effects of microgravity on a mission specialist's performance, and its first application to simulating the task of conducting a Life Sciences experiment in space according to a sequential or parallel schedule of performance.

  13. Integral Airframe Structures (IAS): Validated Feasibility Study of Integrally Stiffened Metallic Fuselage Panels for Reducing Manufacturing Costs

    NASA Technical Reports Server (NTRS)

    Munroe, J.; Wilkins, K.; Gruber, M.; Domack, Marcia S. (Technical Monitor)

    2000-01-01

    The Integral Airframe Structures (IAS) program investigated the feasibility of using "integrally stiffened" construction for commercial transport fuselage structure. The objective of the program was to demonstrate structural performance and weight equal to current "built-up" structure with lower manufacturing cost. Testing evaluated mechanical properties, structural details, joint performance, repair, static compression, and two-bay crack residual strength panels. Alloys evaluated included 7050-T7451 plate, 7050-T74511 extrusion, 6013-T6511x extrusion, and 7475-T7351 plate. Structural performance was evaluated with a large 7475-T7351 pressure test that included the arrest of a two-bay longitudinal crack, and a measure of residual strength for a two-bay crack centered on a broken frame. Analysis predictions for the two-bay longitudinal crack panel correlated well with the test results. Analysis activity conducted by the IAS team strongly indicates that current analysis tools predict integral structural behavior as accurately as built-up structure. The cost study results indicated that, compared to built-up fabrication methods, high-speed machining structure from aluminum plate would yield a recurring cost savings of 61%. Part count dropped from 78 individual parts on a baseline panel to just 7 parts for machined IAS structure.

  14. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    PubMed

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract: Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  15. Measuring Integration of Cancer Services to Support Performance Improvement: The CSI Survey

    PubMed Central

    Dobrow, Mark J.; Paszat, Lawrence; Golden, Brian; Brown, Adalsteinn D.; Holowaty, Eric; Orchard, Margo C.; Monga, Neerav; Sullivan, Terrence

    2009-01-01

    Objective: To develop a measure of cancer services integration (CSI) that can inform clinical and administrative decision-makers in their efforts to monitor and improve cancer system performance. Methods: We employed a systematic approach to measurement development, including review of existing cancer/health services integration measures, key-informant interviews and focus groups with cancer system leaders. The research team constructed a Web-based survey that was field- and pilot-tested, refined and then formally conducted on a sample of cancer care providers and administrators in Ontario, Canada. We then conducted exploratory factor analysis to identify key dimensions of CSI. Results: A total of 1,769 physicians, other clinicians and administrators participated in the survey, responding to a 67-item questionnaire. The exploratory factor analysis identified 12 factors that were linked to three broader dimensions: clinical, functional and vertical system integration. Conclusions: The CSI Survey provides important insights on a range of typically unmeasured aspects of the coordination and integration of cancer services, representing a new tool to inform performance improvement efforts. PMID:20676250
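
    For illustration, the exploratory factor analysis step described above can be sketched as follows; the respondent count, item count, and three-factor structure are placeholders, not the CSI Survey's actual instrument or results.

```python
# Illustrative exploratory factor analysis of survey items (assumed data),
# analogous in spirit to the CSI Survey analysis described above.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 500, 12           # hypothetical survey size
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

fa = FactorAnalysis(n_components=3, random_state=0)   # look for 3 latent dimensions
fa.fit(responses)

loadings = fa.components_.T                # item-by-factor loading matrix
for item, row in enumerate(loadings, start=1):
    dominant = int(np.argmax(np.abs(row))) + 1
    print(f"item {item:2d}: loads most strongly on factor {dominant}")
```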

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, L.; Owel, W.R.

    This paper discusses the VISA (Vulnerability of Integrated Safeguards Analysis) method, developed in 1976-77 for the Nuclear Regulatory Commission and more recently adapted to a broader range of uses. The performance of safeguards systems is evaluated in terms of how they function as an integrated safeguards/security system. The resulting method has been designated VISA-2. 7 refs.

  17. The Role of Reading Strategies in Integrated L2 Writing Tasks

    ERIC Educational Resources Information Center

    Plakans, Lia

    2009-01-01

    Integrated second-language writing tasks elicit writing performances that involve other abilities such as reading or listening. Thus, understanding the role of these other abilities is necessary for interpreting performance on such tasks. This study used an inductive analysis of think-aloud protocol data and interviews to uncover the reading…

  18. Large-scale network integration in the human brain tracks temporal fluctuations in memory encoding performance.

    PubMed

    Keerativittayayut, Ruedeerat; Aoki, Ryuta; Sarabi, Mitra Taghizadeh; Jimura, Koji; Nakahara, Kiyoshi

    2018-06-18

    Although activation/deactivation of specific brain regions has been shown to be predictive of successful memory encoding, the relationship between time-varying large-scale brain networks and fluctuations of memory encoding performance remains unclear. Here we investigated time-varying functional connectivity patterns across the human brain in periods of 30-40 s, which have recently been implicated in various cognitive functions. During functional magnetic resonance imaging, participants performed a memory encoding task, and their performance was assessed with a subsequent surprise memory test. A graph analysis of functional connectivity patterns revealed that increased integration of the subcortical, default-mode, salience, and visual subnetworks with other subnetworks is a hallmark of successful memory encoding. Moreover, multivariate analysis using the graph metrics of integration reliably classified the brain network states into periods of high (vs. low) memory encoding performance. Our findings suggest that a diverse set of brain systems dynamically interact to support successful memory encoding. © 2018, Keerativittayayut et al.
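
    As a rough illustration of one common graph "integration" metric used in such studies, the sketch below computes a participation coefficient from a connectivity matrix; the random matrix and the four subnetwork labels are assumptions for illustration, not the study's data or exact metric.

```python
# Participation coefficient: high values mean a node's connections are spread
# across subnetworks (integration); computed here on a placeholder matrix.
import numpy as np

rng = np.random.default_rng(1)
n = 20
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)  # symmetric weights
labels = rng.integers(0, 4, size=n)        # assumed assignment to 4 subnetworks

strength = W.sum(axis=1)                   # total connection strength per node
participation = np.zeros(n)
for s in np.unique(labels):
    within = W[:, labels == s].sum(axis=1)     # strength toward subnetwork s
    participation += (within / strength) ** 2
participation = 1.0 - participation

print("mean participation coefficient:", participation.mean())
```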

  19. Optical design and tolerancing of an ophthalmological system

    NASA Astrophysics Data System (ADS)

    Sieber, Ingo; Martin, Thomas; Yi, Allen; Li, Likai; Rübenach, Olaf

    2014-09-01

    Tolerance analysis by means of simulation is an essential step in system integration. Tolerance analysis allows for predicting the performance of a system set up from real manufactured parts and for an estimation of the yield with respect to evaluation figures, such as performance requirements, systems specification or cost demands. Currently, freeform optics are gaining importance in optical system design. The performance of freeform optics often strongly depends on the manufacturing accuracy of the surfaces. For this reason, a tolerance analysis with respect to the fabrication accuracy is of crucial importance. The characterization of form tolerances caused by the manufacturing process is based on the definition of straightness, flatness, roundness, and cylindricity. In the case of freeform components, however, it is often impossible to define a form deviation by means of this standard classification. Hence, prediction of the impact of manufacturing tolerances on the optical performance is not possible by means of a conventional tolerance analysis. To carry out a tolerance analysis of the optical subsystem, including freeform optics, metrology data of the fabricated surfaces have to be integrated into the optical model. The focus of this article is on design for manufacturability of freeform optics with integrated alignment structures and on tolerance analysis of the optical subsystem based on the measured surface data of manufactured optical freeform components with respect to assembly and manufacturing tolerances. This approach will be reported here using an ophthalmological system as an example.

  20. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  1. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Grosvenor, Sandy; Jones, Jeremy; Koratkar, Anuradha; Li, Connie; Mackey, Jennifer; Neher, Ken; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations more efficiently. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper examines the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what have been its successes and challenges.

  2. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper will examine the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what have been its successes and challenges.

  3. Determining a Method of Enabling and Disabling the Integral Torque in the SDO Science and Inertial Mode Controllers

    NASA Technical Reports Server (NTRS)

    Vess, Melissa F.; Starin, Scott R.

    2007-01-01

    During design of the SDO Science and Inertial mode PID controllers, the decision was made to disable the integral torque whenever system stability was in question. Three different schemes were developed to determine when to disable or enable the integral torque, and a trade study was performed to determine which scheme to implement. The trade study compared complexity of the control logic, risk of not reenabling the integral gain in time to reject steady-state error, and the amount of integral torque space used. The first scheme calculated a simplified Routh criterion to determine when to disable the integral torque. The second scheme calculated the PD part of the torque and checked whether that torque would cause actuator saturation. If so, only the PD torque was used. If not, the integral torque was added. Finally, the third scheme compared the attitude and rate errors to limits and disabled the integral torque if either error exceeded its limit. Based on the trade study results, the third scheme was selected. Once it was decided when to disable the integral torque, analysis was performed to determine how to disable the integral torque and whether or not to reset the integrator once the integral torque was reenabled. Three ways to disable the integral torque were investigated: zero the input into the integrator, which causes the integral part of the PID control torque to be held constant; zero the integral torque directly but allow the integrator to continue integrating; or zero the integral torque directly and reset the integrator on integral torque reactivation. The analysis looked at complexity of the control logic, slew time plus settling time between each calibration maneuver step, and ability to reject steady-state error. Based on the results of the analysis, the decision was made to zero the input into the integrator without resetting it. Throughout the analysis, a high fidelity simulation was used to test the various implementation methods.
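
    A minimal sketch of the selected logic, assuming illustrative gains and error limits rather than SDO flight values: the integral term is frozen by zeroing the integrator input (so the accumulated value is held, never reset) whenever the attitude or rate error exceeds its limit.

```python
# Hedged sketch of a PID attitude controller with integral-torque freeze logic.
# Gains, limits, and the single-axis form are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class PidWithIntegralFreeze:
    kp: float = 2.0
    ki: float = 0.1
    kd: float = 1.5
    att_limit: float = 0.05      # rad, assumed attitude-error threshold
    rate_limit: float = 0.01     # rad/s, assumed rate-error threshold
    integ: float = 0.0           # integrator state (held, never reset)

    def torque(self, att_err: float, rate_err: float, dt: float) -> float:
        integral_enabled = abs(att_err) < self.att_limit and abs(rate_err) < self.rate_limit
        if integral_enabled:
            self.integ += att_err * dt   # normal integration
        # else: integrator input is zeroed, so the accumulated value is simply held
        return self.kp * att_err + self.kd * rate_err + self.ki * self.integ

ctrl = PidWithIntegralFreeze()
print(ctrl.torque(att_err=0.02, rate_err=0.001, dt=0.1))
```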

  4. Integrals for IBS and beam cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A.; /Fermilab

    Simulation of beam cooling usually requires performing certain integral transformations every time step or so, which is a significant burden on the CPU. Examples are the dispersion integrals (Hilbert transforms) in the stochastic cooling, wake fields and IBS integrals. An original method is suggested for fast and sufficiently accurate computation of the integrals. This method is applied for the dispersion integral. Some methodical aspects of the IBS analysis are discussed.
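
    The paper's own fast method is not reproduced here; as a baseline illustration of the kind of dispersion integral (discrete Hilbert transform) being referred to, an FFT-based transform can be sketched as follows.

```python
# Baseline FFT-based discrete Hilbert transform of a sampled profile;
# the profile is a placeholder, not an actual beam distribution.
import numpy as np
from scipy.signal import hilbert

x = np.linspace(-10, 10, 2048)
f = np.exp(-x**2)                      # sample profile (Gaussian)
h = np.imag(hilbert(f))                # discrete Hilbert transform of f

# For an even input the transform is (approximately) odd about the center.
print(h[len(h) // 2])
```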

  5. Integrals for IBS and Beam Cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A.

    Simulation of beam cooling usually requires performing certain integral transformations every time step or so, which is a significant burden on the CPU. Examples are the dispersion integrals (Hilbert transforms) in the stochastic cooling, wake fields and IBS integrals. An original method is suggested for fast and sufficiently accurate computation of the integrals. This method is applied for the dispersion integral. Some methodical aspects of the IBS analysis are discussed.

  6. Monte Carlo analysis of the Titan III/Transfer Orbit Stage guidance system for the Mars Observer mission

    NASA Astrophysics Data System (ADS)

    Bell, Stephen C.; Ginsburg, Marc A.; Rao, Prabhakara P.

    An important part of space launch vehicle mission planning for a planetary mission is the integrated analysis of guidance and performance dispersions for both booster and upper stage vehicles. For the Mars Observer mission, an integrated trajectory analysis was used to maximize the scientific payload and to minimize injection errors by optimizing the energy management of both vehicles. This was accomplished by designing the Titan III booster vehicle to inject into a hyperbolic departure plane, and the Transfer Orbit Stage (TOS) to correct any booster dispersions. An integrated Monte Carlo analysis of the performance and guidance dispersions of both vehicles provided sensitivities, an evaluation of their guidance schemes and an injection error covariance matrix. The polynomial guidance schemes used for the Titan III variable flight azimuth computations and the TOS solid rocket motor ignition time and burn direction derivations accounted for a wide variation of launch times, performance dispersions, and target conditions. The Mars Observer spacecraft was launched on 25 September 1992 on the Titan III/TOS vehicle. The post-flight analysis indicated that a near-perfect park orbit injection was achieved, followed by a trans-Mars injection with less than 2-sigma errors.
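
    A toy Monte Carlo dispersion sketch in the same spirit: perturb a few performance parameters, propagate them through a placeholder linear injection-error model, and summarize the resulting injection error covariance. All dispersions and sensitivities below are invented for illustration.

```python
# Minimal Monte Carlo dispersion analysis producing an injection error covariance.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 5000

# Assumed 1-sigma dispersions: thrust (%), Isp (%), ignition time (s)
thrust = rng.normal(0.0, 0.5, n_trials)
isp = rng.normal(0.0, 0.3, n_trials)
t_ign = rng.normal(0.0, 0.2, n_trials)

# Placeholder linear sensitivity model mapping dispersions to injection
# errors in (apogee km, inclination deg).
apogee_err = 12.0 * thrust + 8.0 * isp + 3.0 * t_ign
incl_err = 0.002 * thrust - 0.001 * isp + 0.004 * t_ign

errors = np.column_stack([apogee_err, incl_err])
cov = np.cov(errors, rowvar=False)          # injection error covariance matrix
print("2-sigma apogee error      [km]:", 2 * np.sqrt(cov[0, 0]))
print("2-sigma inclination error [deg]:", 2 * np.sqrt(cov[1, 1]))
```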

  7. Systems Analysis Of Advanced Coal-Based Power Plants

    NASA Technical Reports Server (NTRS)

    Ferrall, Joseph F.; Jennings, Charles N.; Pappano, Alfred W.

    1988-01-01

    Report presents an appraisal of integrated coal-gasification/fuel-cell power plants. It is based on a study comparing fuel-cell technologies with each other and with coal-based alternatives, and recommends the most promising ones for research and development. Evaluates capital cost, cost of electricity, fuel consumption, and conformance with environmental standards. Analyzes sensitivity of cost of electricity to changes in fuel cost, to economic assumptions, and to level of technology. Recommends further evaluation of integrated coal-gasification/fuel-cell, integrated coal-gasification/combined-cycle, and pulverized-coal-fired plants. Concludes with appendixes detailing plant-performance models, subsystem-performance parameters, performance goals, cost bases, plant-cost data sheets, and plant sensitivity to fuel-cell performance.

  8. Flat Plate Solar Array Project: Proceedings of the 20th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1982-01-01

    Progress made by the Flat-Plate Solar Array Project during the period November 1981 to April 1982 is reported. Project analysis and integration, technology research in silicon material, large-area silicon sheet and environmental isolation, cell and module formation, engineering sciences, and module performance and failure analysis are covered.

  9. Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II

    2010-01-01

    The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2 used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.

  10. Integrated dynamic analysis simulation of space stations with controllable solar arrays (supplemental data and analyses)

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    Space station and solar array data are presented, together with the analyses performed in support of the integrated dynamic analysis study. The analysis methods and the formulated digital simulation are described. Control systems for space station attitude control and solar array orientation control include generic-type control systems. These systems have been digitally coded and included in the simulation.

  11. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in an analyzer called LIFTS (Langley Integrated Fluid-Thermal-Structural) analyzer. The evolution and status of LIFTS is reviewed and illustrated through applications.

  12. Integrated modeling analysis of a novel hexapod and its application in active surface

    NASA Astrophysics Data System (ADS)

    Yang, Dehua; Zago, Lorenzo; Li, Hui; Lambert, Gregory; Zhou, Guohua; Li, Guoping

    2011-09-01

    This paper presents the concept and integrated modeling analysis of a novel mechanism, a 3-CPS/RPPS hexapod, for supporting segmented reflectors for radio telescopes and eventually segmented mirrors of optical telescopes. The concept comprises a novel type of hexapod with an original organization of actuators, hence degrees of freedom, based on a swaying-arm design concept. With specially designed connecting joints between panels/segments, an iso-static master-slave active surface concept can then be achieved for any triangular and/or hexagonal panel/segment pattern. The integrated modeling comprises all the multifold sizing and performance aspects which must be evaluated concurrently in order to optimize and validate the design and the configuration. In particular, comprehensive investigation of kinematic behavior, dynamic analysis, wave-front error, and sensitivity analysis are carried out, for which frequently used tools such as MATLAB/SimMechanics, CALFEM, and ANSYS are employed. Notably, the finite element method is introduced as a competent approach for analyses of the multi-degree-of-freedom mechanism. Experimental verifications already performed to validate individual aspects of the integrated concept are also presented, together with the results obtained.

  13. Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2003-01-01

    This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.

  14. NASA Lighting Research, Test, & Analysis

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2015-01-01

    The Habitability and Human Factors Branch, at Johnson Space Center, in Houston, TX, provides technical guidance for the development of spaceflight lighting requirements, verification of light system performance, analysis of integrated environmental lighting systems, and research of lighting-related human performance issues. The Habitability & Human Factors Lighting Team maintains two physical facilities that are integrated to provide support. The Lighting Environment Test Facility (LETF) provides a controlled darkroom environment for physical verification of lighting systems with photometric and spectrographic measurement systems. The Graphics Research & Analysis Facility (GRAF) maintains the capability for computer-based analysis of operational lighting environments. The combined capabilities of the Lighting Team at Johnson Space Center have been used for a wide range of lighting-related issues.

  15. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
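
    The fast probability integration algorithm itself is not reproduced here; the sketch below only illustrates the underlying idea of a probabilistic limit-state evaluation, using crude Monte Carlo sampling on an assumed load/strength model.

```python
# Crude Monte Carlo stand-in for probabilistic structural analysis: sample
# uncertain load and strength, evaluate a limit state, estimate failure probability.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
load = rng.normal(100.0, 15.0, n)                                   # assumed applied stress, MPa
strength = rng.lognormal(mean=np.log(160.0), sigma=0.08, size=n)    # assumed capacity, MPa

g = strength - load                       # limit state: failure when g < 0
p_fail = np.mean(g < 0.0)
print(f"estimated probability of failure: {p_fail:.2e}")
```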

  16. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed environment using the Adaptive Modeling Language (AML) as the underlying framework. This report will focus on describing the work done in the area of extending the aerodynamics and aerothermodynamics modules using S/HABP, CBAERO, PREMIN and LANMIN. It will also detail the work done integrating EXITS as the TPS sizing tool.

  17. Knowledge integration, teamwork and performance in health care.

    PubMed

    Körner, Mirjam; Lippenberger, Corinna; Becker, Sonja; Reichler, Lars; Müller, Christian; Zimmermann, Linda; Rundel, Manfred; Baumeister, Harald

    2016-01-01

    Knowledge integration is the process of building shared mental models. The integration of the diverse knowledge of the health professions in shared mental models is a precondition for effective teamwork and team performance. Because different groups of health care professionals often tend to work in isolation, it can be expected that knowledge integration is assessed differently; the authors therefore compared perceptions of knowledge integration across professional groups. The purpose of this paper is to test these differences in the perception of knowledge integration between the professional groups, to identify to what extent knowledge integration predicts perceptions of teamwork and team performance, and to determine if teamwork has a mediating effect. The study is a multi-center cross-sectional study with a descriptive-explorative design. Data were collected by means of a staff questionnaire for all health care professionals working in the rehabilitation clinics. The results showed that there are significant differences in knowledge integration within interprofessional health care teams. Furthermore, it could be shown that knowledge integration is significantly related to patient-centered teamwork as well as to team performance. Mediation analysis revealed partial mediation of the effect of knowledge integration on team performance through teamwork. Practical implications: In practice, the results of the study provide a valuable starting point for team development interventions. This is the first study that explored knowledge integration in medical rehabilitation teams and its relation to patient-centered teamwork and team performance.

  18. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    PubMed

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.
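
    NITPIC's automated peak-shape analysis is considerably more sophisticated than can be shown here; the sketch below only illustrates the basic step of baseline-subtracted integration of injection peaks into an isotherm, on synthetic data.

```python
# Simplified integration of ITC injection peaks (synthetic thermogram).
import numpy as np

def integrate_injections(time_s, power_ucal_per_s, injection_times_s, window_s=180.0):
    """Integrate differential heating power over each injection window."""
    heats = []
    for t0 in injection_times_s:
        mask = (time_s >= t0) & (time_s < t0 + window_s)
        baseline = power_ucal_per_s[mask][[0, -1]].mean()   # crude two-point baseline
        heats.append(np.trapz(power_ucal_per_s[mask] - baseline, time_s[mask]))
    return np.array(heats)                                   # microcalories per injection

# Synthetic thermogram: exponential return-to-baseline peaks every 180 s.
t = np.arange(0.0, 1800.0, 0.5)
inj = np.arange(60.0, 1800.0, 180.0)
power = 0.02 + sum(2.0 * np.exp(-(t - ti) / 20.0) * (t >= ti) for ti in inj)
print(integrate_injections(t, power, inj))
```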

  19. Better Assessment Science Integrating Point and Non-point Sources (BASINS)

    EPA Pesticide Factsheets

    Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) is a multipurpose environmental analysis system designed to help regional, state, and local agencies perform watershed- and water quality-based studies.

  20. Performance assessment of human resource by integration of HSE and ergonomics and EFQM management system.

    PubMed

    Sadegh Amalnick, Mohsen; Zarrin, Mansour

    2017-03-13

    Purpose: The purpose of this paper is to present an integrated framework for performance evaluation and analysis of human resource (HR) with respect to the factors of the health, safety, environment and ergonomics (HSEE) management system, and also the criteria of the European Foundation for Quality Management (EFQM), one of the well-known business excellence models. Design/methodology/approach: In this study, an intelligent algorithm based on an adaptive neuro-fuzzy inference system (ANFIS) together with fuzzy data envelopment analysis (FDEA) is developed and employed to assess the performance of the company. Furthermore, the impact of the factors on the company's performance as well as their strengths and weaknesses are identified by conducting a sensitivity analysis on the results. Similarly, a design of experiments is performed to prioritize the factors in order of importance. Findings: The results show that the EFQM model has a far greater impact upon the company's performance than the HSEE management system. According to the obtained results, it can be argued that integration of HSEE and EFQM leads to performance improvement in the company. Practical implications: In the current study, the required data for executing the proposed framework are collected via valid questionnaires filled in by the staff of an aviation industry located in Tehran, Iran. Originality/value: Managing HR performance results in improving usability, maintainability and reliability, and finally in a significant reduction in the commercial aviation accident rate. Studying the factors affecting HR performance also helps authorities who participate in developing systems to help operators better manage human error. This paper for the first time presents an intelligent framework based on ANFIS, FDEA and statistical tests for HR performance assessment and analysis, with the ability to handle the uncertainty and vagueness existing in real-world environments.

  1. Overview of MSFC AMSD Integrated Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Russell, Kevin (Technical Monitor)

    2002-01-01

    Structural, thermal, dynamic, and optical models of the NGST AMSD mirror assemblies are being finalized and integrated for predicting cryogenic vacuum test performance of the developing designs. Analyzers in use by the MSFC Modeling and Analysis Team are identified, with overview of approach to integrate simulated effects. Guidelines to verify the individual models and calibration cases for comparison with the vendors' analyses are presented. In addition, baseline and proposed additional scenarios for the cryogenic vacuum testing are briefly described.

  2. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multi-disciplinary applications of the Comet Workspace is for the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multi-disciplinary space systems containing Electro-Optical (EO) sensors such as those which are designed and developed by and for NASA and the Department of Defense. The CometTM software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE VTM from Optical Research Associates and SigFitTM from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratories. This funded effort is a "risk reduction effort" to help determine whether the combination of Comet and WaveTrainTM, a wave optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  3. MinOmics, an Integrative and Immersive Tool for Multi-Omics Analysis.

    PubMed

    Maes, Alexandre; Martinez, Xavier; Druart, Karen; Laurent, Benoist; Guégan, Sean; Marchand, Christophe H; Lemaire, Stéphane D; Baaden, Marc

    2018-06-21

    Proteomic and transcriptomic technologies resulted in massive biological datasets, their interpretation requiring sophisticated computational strategies. Efficient and intuitive real-time analysis remains challenging. We use proteomic data on 1417 proteins of the green microalga Chlamydomonas reinhardtii to investigate physicochemical parameters governing selectivity of three cysteine-based redox post-translational modifications (PTMs): glutathionylation (SSG), nitrosylation (SNO) and disulphide bonds (SS) reduced by thioredoxins. We aim to understand underlying molecular mechanisms and structural determinants through integration of redox proteome data from gene- to structural level. Our interactive visual analytics approach on an 8.3 m² display wall of 25 MPixel resolution features stereoscopic three-dimensional (3D) representation performed by UnityMol WebGL. Virtual reality headsets complement the range of usage configurations for fully immersive tasks. Our experiments confirm that fast access to a rich cross-linked database is necessary for immersive analysis of structural data. We emphasize the possibility to display complex data structures and relationships in 3D, intrinsic to molecular structure visualization, but less common for omics-network analysis. Our setup is powered by MinOmics, an integrated analysis pipeline and visualization framework dedicated to multi-omics analysis. MinOmics integrates data from various sources into a materialized physical repository. We evaluate its performance, a design criterion for the framework.

  4. Development of an Integrated Nozzle for a Symmetric, RBCC Launch Vehicle Configuration

    NASA Technical Reports Server (NTRS)

    Smith, Timothy D.; Canabal, Francisco, III; Rice, Tharen; Blaha, Bernard

    2000-01-01

    The development of rocket based combined cycle (RBCC) engines is highly dependent upon integrating several different modes of operation into a single system. One of the key components to develop acceptable performance levels through each mode of operation is the nozzle. It must be highly integrated to serve the expansion processes of both rocket and air-breathing modes without undue weight, drag, or complexity. The NASA GTX configuration requires a fixed geometry, altitude-compensating nozzle configuration. The initial configuration, used mainly to estimate weight and cooling requirements, was a 15° half-angle cone, which cuts a concave surface from a point within the flowpath to the vehicle trailing edge. Results of 3-D CFD calculations on this geometry are presented. To address the critical issues associated with integrated, fixed geometry, multimode nozzle development, the GTX team has initiated a series of tasks to evolve the nozzle design, and validate performance levels. An overview of these tasks is given. The first element is a design activity to develop tools for integration of efficient expansion surfaces with the existing flowpath and vehicle aft-body, and to develop a second-generation nozzle design. A preliminary result using a "streamline-tracing" technique is presented. As the nozzle design evolves, a combination of 3-D CFD analysis and experimental evaluation will be used to validate the design procedure and determine the installed performance for propulsion cycle modeling. The initial experimental effort will consist of cold-flow experiments designed to validate the general trends of the streamline-tracing methodology and anchor the CFD analysis. Experiments will also be conducted to simulate nozzle performance during each mode of operation. As the design matures, hot-fire tests will be conducted to refine performance estimates and anchor more sophisticated reacting-flow analysis.

  5. Reliability studies of Integrated Modular Engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  6. Reliability studies of integrated modular engine system designs

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-01-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  7. Reliability studies of integrated modular engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  8. Reliability studies of Integrated Modular Engine system designs

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.; Rapp, Douglas C.

    1993-06-01

    A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.

  9. Grid-Integrated Electric Drive Analysis for The Ohio State University

    Science.gov Websites

    Thermal management analysis and simulations on a high-performance, high-speed drive developed by The Ohio State University serve as a pilot study for the future generation of energy-efficient, high-power-density, high-speed integrated medium/high-voltage drive systems. If successful, the proposed project will significantly advance

  10. Analysis of airframe/engine interactions in integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Schmidt, David K.

    1991-01-01

    An analysis framework for the assessment of dynamic cross-coupling between airframe and engine systems from the perspective of integrated flight/propulsion control is presented. This analysis involves determining the significance of the interactions with respect to deterioration in stability robustness and performance, as well as critical frequency ranges where problems may occur due to these interactions. The analysis illustrated here investigates both the airframe's effects on the engine control loops and the engine's effects on the airframe control loops in two case studies. The second case study involves a multi-input/multi-output analysis of the airframe. Sensitivity studies are performed on critical interactions to examine the degradations in the system's stability robustness and performance. Magnitudes of the interactions required to cause instabilities, as well as the frequencies at which the instabilities occur are recorded. Finally, the analysis framework is expanded to include control laws which contain cross-feeds between the airframe and engine systems.

  11. Performance Analysis of Constrained Loosely Coupled GPS/INS Integration Solutions

    PubMed Central

    Falco, Gianluca; Einicke, Garry A.; Malos, John T.; Dovis, Fabio

    2012-01-01

    The paper investigates approaches for loosely coupled GPS/INS integration. Error performance is calculated using a reference trajectory. A performance improvement can be obtained by exploiting additional map information (for example, a road boundary). A constrained solution has been developed and its performance compared with an unconstrained one. The case of GPS outages is also investigated showing how a Kalman filter that operates on the last received GPS position and velocity measurements provides a performance benefit. Results are obtained by means of simulation studies and real data. PMID:23202241
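
    A toy, one-dimensional version of loosely coupled fusion may help fix ideas: an INS-style prediction is corrected by GPS position fixes in a Kalman filter, and measurement updates are simply skipped during an outage. The states, noise levels, and outage handling are assumptions for illustration; the paper's map-constrained solution is not shown.

```python
# Toy 1-D loosely coupled GPS/INS fusion with a Kalman filter.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])          # state: [position, velocity]
Q = np.diag([0.05, 0.01])                      # process noise (INS drift), assumed
H = np.array([[1.0, 0.0]])                     # GPS measures position only
R = np.array([[4.0]])                          # GPS position noise variance, assumed

x = np.array([0.0, 1.0])                       # initial state estimate
P = np.eye(2)

def step(x, P, accel, gps_pos=None):
    # Predict with the INS acceleration input.
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + Q
    if gps_pos is not None:                    # GPS available: measurement update
        y = gps_pos - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(10):
    # Crude stand-in for a GPS fix; epochs 3-5 simulate an outage (no update).
    gps = None if 3 <= k <= 5 else x[0] + np.random.normal(0, 2.0)
    x, P = step(x, P, accel=0.1, gps_pos=np.array([gps]) if gps is not None else None)
print("final state:", x, "position variance:", P[0, 0])
```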

  12. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  13. Industry structures in private dental markets in Finland.

    PubMed

    Widström, E; Mikkola, H

    2012-12-01

    To use industrial organisation and organisational ecology research methods to survey industry structures and performance in the markets for private dental services and the effect of competition. Data on practice characteristics, performance, and perceived competition were collected from full-time private dentists (n = 1,121) using a questionnaire. The response rate was 59.6%. Cluster analysis was used to identify practice type based on service differentiation and process integration variables formulated from the questionnaire. Four strategic groups were identified in the Finnish markets: Solo practices formed one distinct group and group practices were classified into three clusters Integrated practices, Small practices, and Loosely integrated practices. Statistically significant differences were found in performance and perceived competitiveness between the groups. Integrated practices with the highest level of process integration and service differentiation performed better than solo and small practices. Moreover, loosely integrated and small practices outperformed solo practices. Competitive intensity was highest among small practices which had a low level of service differentiation and was above average among solo practices. Private dental care providers that had differentiated their services from public services and that had a high number of integrated service production processes enjoyed higher performance and less competitive pressures than those who had not.

  14. Warfighter Integrated Physical Ergonomics Tool Development: Needs Analysis and State of the Art Review

    DTIC Science & Technology

    2011-03-01

    Forces: Griffon seat design assessments include questions of vibration ... the suitability of alternative designs. Performance Measures ... configurations to assess design and acquisition decisions, and more

  15. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
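
    The quoted agreement could be quantified as in the short sketch below, which computes an overall correlation coefficient between predicted and measured values; the numbers are synthetic, not the plant's data.

```python
# Correlation between software-predicted and measured effluent values (synthetic).
import numpy as np

predicted = np.array([120.0, 95.0, 70.0, 52.0, 40.0, 31.0])   # e.g. COD, mg/L
measured = np.array([118.0, 97.5, 68.0, 53.5, 41.0, 30.0])

r = np.corrcoef(predicted, measured)[0, 1]
print(f"correlation coefficient: {r:.3f}")      # values near 0.99 indicate close agreement
```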

  16. The Study of an Integrated Rating System for Supplier Quality Performance in the Semiconductor Industry

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Cheng; Yen, Tieh-Min; Tsai, Chih-Hung

    This study provides an integrated model of Supplier Quality Performance Assessment (SQPA) activity for the semiconductor industry through introducing the ISO 9001 management framework, Importance-Performance Analysis (IPA), Supplier Quality Performance Assessment, and Taguchi's Signal-to-Noise Ratio (S/N) techniques. This integrated model provides an SQPA methodology to create value for all members under mutual cooperation and trust in the supply chain. This method helps organizations build a complete SQPA framework, linking organizational objectives and SQPA activities to optimize rating techniques to promote supplier quality improvement. The techniques used in SQPA activities are easily understood. A case involving a design house is illustrated to show our model.

  17. A Framework for Daylighting Optimization in Whole Buildings with OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-08-12

    We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.

  18. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
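
    A heavily simplified stand-in for the STOP chain, with placeholder geometry, CTE, and temperature changes (not JWST model values): nodal temperature changes are converted to surface deformation through a coefficient of thermal expansion and rolled up into an RMS wavefront error.

```python
# Toy temperature-to-wavefront-error roll-up illustrating the STOP idea.
import numpy as np

rng = np.random.default_rng(3)
n_nodes = 500
cte = 2.5e-8            # 1/K, assumed effective CTE at cryogenic temperature
thickness_m = 0.05      # assumed characteristic dimension driving the deformation

delta_T = rng.normal(0.0, 0.2, n_nodes)               # K, post-slew temperature changes
surface_def_m = cte * thickness_m * delta_T            # crude surface-normal deformation
wavefront_err_m = 2.0 * surface_def_m                  # reflective surface: factor of two

rms_nm = np.sqrt(np.mean(wavefront_err_m**2)) * 1e9
print(f"RMS wavefront error: {rms_nm:.3f} nm")
```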

  19. Integrated propulsion for near-Earth space missions. Volume 2: Technical

    NASA Technical Reports Server (NTRS)

    Dailey, C. L.; Meissinger, H. F.; Lovberg, R. H.; Zafran, S.

    1981-01-01

    The calculation approach is described for parametric analysis of candidate electric propulsion systems employed in LEO to GEO missions. Occultation relations, atmospheric density effects, and natural radiation effects are presented. A solar cell cover glass tradeoff is performed to determine optimum glass thickness. Solar array and spacecraft pointing strategies are described for low altitude flight and for optimum array illumination during ascent. Mass ratio tradeoffs versus transfer time provide direction for thruster technology improvements. Integrated electric propulsion analysis is performed for orbit boosting, inclination change, attitude control, stationkeeping, repositioning, and disposal functions as well as power sharing with payload on orbit. Comparison with chemical auxiliary propulsion is made to quantify the advantages of integrated propulsion in terms of weight savings and concomitant launch cost savings.
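
    A worked example of the kind of mass-ratio tradeoff mentioned above, using the Tsiolkovsky rocket equation with rounded, assumed delta-v and specific impulse values for a LEO-to-GEO transfer.

```python
# Propellant mass fraction comparison: chemical vs. electric LEO-to-GEO transfer.
import math

def propellant_fraction(delta_v_mps, isp_s, g0=9.80665):
    """Propellant mass fraction from the Tsiolkovsky rocket equation."""
    return 1.0 - math.exp(-delta_v_mps / (isp_s * g0))

# Impulsive chemical transfer vs. low-thrust spiral (higher delta-v, far higher Isp);
# both delta-v and Isp figures are rounded assumptions for illustration.
chem = propellant_fraction(delta_v_mps=4200.0, isp_s=320.0)
elec = propellant_fraction(delta_v_mps=5900.0, isp_s=3000.0)

print(f"chemical transfer: {chem:.0%} of initial mass is propellant")
print(f"electric transfer: {elec:.0%} of initial mass is propellant")
```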

  20. Engineering and Economic Analysis of an Advanced Ultra-Supercritical Pulverized Coal Power Plant with and without Post-Combustion Carbon Capture Task 7. Design and Economic Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booras, George; Powers, J.; Riley, C.

    2015-09-01

    This report evaluates the economics and performance of two A-USC PC power plants; Case 1 is a conventionally configured A-USC PC power plant with superior emission controls, but without CO2 removal; and Case 2 adds a post-combustion carbon capture (PCC) system to the plant from Case 1, using the design and heat integration strategies from EPRI's 2015 report, "Best Integrated Coal Plant." The capture design basis for this case is "partial," to meet EPA's proposed New Source Performance Standard, which was initially proposed as 500 kg-CO2/MWh (gross) or 1100 lb-CO2/MWh (gross), but modified in August 2015 to 635 kg-CO2/MWh (gross) or 1400 lb-CO2/MWh (gross). This report draws upon the collective experience of consortium members, with EPRI and General Electric leading the study. General Electric provided the steam cycle analysis as well as the steam turbine design and cost estimating. EPRI performed integrated plant performance analysis using EPRI's PC Cost model.

  1. An Analysis of Students' Academic Performance when Integrating DVD Technology in Geography Teaching and Learning

    ERIC Educational Resources Information Center

    Van der Westhuizen, C. P.; Nel, Carisma; Richter, Barry W.

    2012-01-01

    This article discusses the effect of the integration of the Digital Versatile Disc (DVD) as an ICT-variant on the academic performance of full-time geography teacher students enrolled for a Bachelor of Education (B. Ed.) degree at a rural university in a developing country. Action research (which includes both quantitative and qualitative…

  2. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user-defined data structures, and a hierarchical database are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  3. Proceedings of the 22nd Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report describes progress made by the Flat-Plate Solar Array Project during the period January to September 1983. It includes reports on silicon sheet growth and characterization, module technology, silicon material, cell processing and high-efficiency cells, environmental isolation, engineering sciences, module performance and failure analysis and project analysis and integration. It includes a report on, and copies of visual presentations made at the 22nd Project Integration Meeting held at Pasadena, California, on September 28 and 29, 1983.

  4. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  5. Integrated restructurable flight control system demonstration results

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.; Hsu, John Y.

    1987-01-01

    The purpose of this study was to examine the complementary capabilities of several restructurable flight control system (RFCS) concepts through the integration of these technologies into a complete system. Performance issues were addressed through a re-examination of RFCS functional requirements, and through a qualitative analysis of the design issues that, if properly addressed during integration, will lead to the highest possible degree of fault-tolerant performance. Software developed under previous phases of this contract and under NAS1-18004 was modified and integrated into a complete RFCS subroutine for NASA's B-737 simulation. The integration of these modules involved the development of methods for dealing with the mismatch between the outputs of the failure detection module and the input requirements of the automatic control system redesign module. The performance of this demonstration system was examined through extensive simulation trials.

  6. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    NASA Technical Reports Server (NTRS)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  7. Identification of Tf1 integration events in S. pombe under nonselective conditions

    PubMed Central

    Cherry, Kristina E.; Hearn, Willis E.; Seshie, Osborne Y.; Singleton, Teresa L.

    2014-01-01

    Integration of retroviral elements into the host genome is a phenomenon observed among many classes of retroviruses. Much information concerning integration of retroviral elements has been documented based on in vitro analysis or expression of selectable markers. To identify possible Tf1 integration events within silent regions of the S. pombe genome, we focused on performing an in vivo genome-wide analysis of Tf1 integration events from the nonselective phase of the retrotransposition assay. We analyzed 1000 individual colonies streaked from four independent Tf1 transposed patches under nonselection conditions. Our analysis detected a population of G418S/neo+ Tf1 integration events that would have been overlooked during the selective phase of the assay. Further RNA analysis from the G418S/neo+ clones revealed 50% of clones expressing the neo selectable marker. Our data reveal Tf1’s ability to insert within silent regions of S. pombe’s genome. PMID:24680781

  8. Identification of Tf1 integration events in S. pombe under nonselective conditions.

    PubMed

    Cherry, Kristina E; Hearn, Willis E; Seshie, Osborne Y K; Singleton, Teresa L

    2014-06-01

    Integration of retroviral elements into the host genome is a phenomenon observed among many classes of retroviruses. Much information concerning the integration of retroviral elements has been documented based on in vitro analysis or expression of selectable markers. To identify possible Tf1 integration events within silent regions of the Schizosaccharomyces pombe genome, we focused on performing an in vivo genome-wide analysis of Tf1 integration events from the nonselective phase of the retrotransposition assay. We analyzed 1000 individual colonies streaked from four independent Tf1 transposed patches under nonselection conditions. Our analysis detected a population of G418(S)/neo(+) Tf1 integration events that would have been overlooked during the selective phase of the assay. Further RNA analysis from the G418(S)/neo(+) clones revealed 50% of clones expressing the neo selectable marker. Our data reveal Tf1's ability to insert within silent regions of S. pombe's genome. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration of an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS), and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in MATLAB/SIMULINK to evaluate the process response. Additionally, the process stability, capability, and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE), and Integral Square Error (ISE). Also, based on the statistical analysis, DS emerges as the best tuning method, as it exhibits the highest process stability and capability.
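
    As a rough illustration of the kind of comparison described above, the sketch below (Python, with placeholder process parameters rather than the study's CSTR model) computes IMC-style PI settings for an assumed first-order-plus-dead-time process and evaluates the ITAE and ISE error integrals from a sampled error signal.

      # Illustrative sketch only: IMC-style PI tuning for an assumed
      # first-order-plus-dead-time (FOPDT) process, plus the ITAE/ISE
      # integrals used to compare tuning rules. Parameters are hypothetical.
      import numpy as np

      K, tau, theta = 2.0, 5.0, 1.0        # hypothetical FOPDT gain, time constant, dead time
      lam = 2.0 * theta                    # IMC filter time constant (a common heuristic)

      Kc = tau / (K * (lam + theta))       # one common IMC-PI rule for an FOPDT model
      Ti = tau
      print(f"IMC-PI: Kc = {Kc:.3f}, Ti = {Ti:.3f}")

      def itae(t, e):                      # Integral Time Absolute Error
          return np.trapz(t * np.abs(e), t)

      def ise(t, e):                       # Integral Square Error
          return np.trapz(e ** 2, t)

      t = np.linspace(0.0, 50.0, 2001)
      e = np.exp(-t / 4.0)                 # placeholder closed-loop error trajectory
      print(f"ITAE = {itae(t, e):.2f}, ISE = {ise(t, e):.2f}")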

  10. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 1: Overall approach and data generation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.

  11. A novel integrated assessment methodology of urban water reuse.

    PubMed

    Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S

    2011-01-01

    Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, the question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; and (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to the performance of the microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at specific Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process contains analysis and necessary linking across important parametric functions, leading to reliable outcomes and results.

  12. Integrating Individual Learning Processes and Organizational Knowledge Formation: Foundational Determinants for Organizational Performance

    ERIC Educational Resources Information Center

    Song, Ji Hoon; Chermack, Thomas J.; Kim, Hong Min

    2008-01-01

    This research examined the link between learning processes and knowledge formation through an integrated literature review from both academic and practical viewpoints. Individuals' learning processes and organizational knowledge creation were reviewed by means of theoretical and integrative analysis based on a lack of empirical research on the…

  13. SysBioCube: A Data Warehouse and Integrative Data Analysis Platform Facilitating Systems Biology Studies of Disorders of Military Relevance

    DTIC Science & Technology

    2013-12-18

    include interactive gene and methylation profiles, interactive heatmaps, Cytoscape network views, Integrative Genomics Viewer (IGV), and protein-protein...single chart. The website also provides an option to include multiple genes. Integrative Genomics Viewer (IGV) is a high-performance desktop tool for

  14. Optical performance assessment under environmental and mechanical perturbations in large, deployable telescopes

    NASA Astrophysics Data System (ADS)

    Folley, Christopher; Bronowicki, Allen

    2005-09-01

    Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations on optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is the real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid body displacements to optical surfaces, however, is more general than for use solely in STOP analysis, such as the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality to date is described, verification results are presented, and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.
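
    A minimal sketch of the underlying idea, with hypothetical surface data and names (this is not the toolbox itself): because the optical surfaces live in the same global frame as the structural model, a rigid-body displacement predicted by finite element analysis can be applied directly to the optical-surface coordinates that a ray trace then uses.

      # Apply a rigid-body displacement (small rotation + translation) from a
      # structural analysis to optical-surface points held in global coordinates.
      # All values below are placeholders for illustration.
      import numpy as np

      def rigid_body_transform(points, translation, rotation_rad_xyz):
          rx, ry, rz = rotation_rad_xyz
          Rx = np.array([[1, 0, 0], [0, np.cos(rx), -np.sin(rx)], [0, np.sin(rx), np.cos(rx)]])
          Ry = np.array([[np.cos(ry), 0, np.sin(ry)], [0, 1, 0], [-np.sin(ry), 0, np.cos(ry)]])
          Rz = np.array([[np.cos(rz), -np.sin(rz), 0], [np.sin(rz), np.cos(rz), 0], [0, 0, 1]])
          return points @ (Rz @ Ry @ Rx).T + np.asarray(translation)

      mirror = np.array([[0.0, 0.0, 0.0],            # hypothetical mirror vertices, meters
                         [0.1, 0.0, 0.001],
                         [0.0, 0.1, 0.001]])
      moved = rigid_body_transform(mirror,
                                   translation=[0.0, 0.0, 5e-6],        # 5 micron piston
                                   rotation_rad_xyz=[1e-6, 0.0, 0.0])   # small tilt
      print(moved - mirror)   # surface motion the ray trace would then see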

  15. Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.

    1993-01-01

    In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering; the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map -- a highly effective visual indication of heat flow.

  16. Mixed time integration methods for transient thermal analysis of structures, appendix 5

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    Mixed time integration methods for transient thermal analysis of structures are studied. An efficient solution procedure for predicting the thermal behavior of aerospace vehicle structures was developed. A 2D finite element computer program incorporating these methodologies is being implemented. The performance of these mixed time finite element algorithms can then be evaluated employing the proposed example problem.

  17. Grid connected integrated community energy system. Phase II: final state 2 report. Cost benefit analysis, operating costs and computer simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  18. Associations Among Health Care Workplace Safety, Resident Satisfaction, and Quality of Care in Long-Term Care Facilities.

    PubMed

    Boakye-Dankwa, Ernest; Teeple, Erin; Gore, Rebecca; Punnett, Laura

    2017-11-01

    We performed an integrated cross-sectional analysis of relationships between long-term care work environments, employee and resident satisfaction, and quality of patient care. Facility-level data came from a network of 203 skilled nursing facilities in 13 states in the eastern United States owned or managed by one company. K-means cluster analysis was applied to investigate clustered associations between safe resident handling program (SRHP) performance, resident care outcomes, employee satisfaction, rates of workers' compensation claims, and resident satisfaction. Facilities in the better-performing cluster were found to have better patient care outcomes and resident satisfaction; lower rates of workers' compensation claims; better SRHP performance; higher employee retention; and greater worker job satisfaction and engagement. The observed clustered relationships support the utility of integrated performance assessment in long-term care facilities.
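
    A minimal sketch of the clustering step described above, using synthetic facility-level indicators and scikit-learn (the column names and data are placeholders, not the study's dataset):

      # Standardize facility-level indicators and group facilities with k-means,
      # then inspect per-cluster means. Data below are synthetic placeholders.
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # Hypothetical columns: SRHP performance, resident satisfaction,
      # workers' compensation claim rate, employee retention (203 facilities)
      X = rng.normal(size=(203, 4))

      Xz = StandardScaler().fit_transform(X)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xz)

      for k in range(2):
          members = Xz[labels == k]
          print(f"cluster {k}: n={len(members)}, means={members.mean(axis=0).round(2)}")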

  19. Electronic Performance Support Systems: Comparison of Types of Integration Levels on Performance Outcomes

    ERIC Educational Resources Information Center

    Phillips, Sharon A.

    2013-01-01

    Selecting appropriate performance improvement interventions is a critical component of a comprehensive model of performance improvement. Intervention selection is an interconnected process involving analysis of an organization's environment, definition of the performance problem, and identification of a performance gap and identification of causal…

  20. Performance Analysis of a NASA Integrated Network Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.

    2012-01-01

    The Space Communications and Navigation (SCaN) Program is planning to integrate its individual networks into a unified network that will function as a single entity to provide services to user missions. This integrated network architecture is expected to provide SCaN customers with the capabilities to seamlessly use any of the available SCaN assets to support their missions and to efficiently meet the collective needs of Agency missions. One potentially optimal application of these assets, based on this envisioned architecture, is arraying across existing networks to significantly enhance data rates and/or link availabilities. As such, this document provides an analysis of the transmit and receive performance of a proposed SCaN inter-network antenna array. From the study, it is determined that a fully integrated inter-network array does not provide any significant advantage over an intra-network array, one in which the assets of an individual network are arrayed for enhanced performance. Therefore, it is the recommendation of this study that NASA proceed with an arraying concept focused on network-centric arraying.
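
    As a rough point of reference for the arraying trade discussed above (idealized assumptions, not the study's link budget), coherently combining N identical receive apertures yields at most about 10 log10(N) dB of gain, which real combining losses then erode; a small sketch:

      # Ideal vs. loss-adjusted combining gain for an N-antenna receive array.
      # The per-antenna combining loss is an assumed placeholder value.
      import math

      def ideal_array_gain_db(n):
          return 10.0 * math.log10(n)

      def net_array_gain_db(n, combining_loss_db_per_antenna=0.3):
          return ideal_array_gain_db(n) - combining_loss_db_per_antenna * (n - 1)

      for n in (2, 3, 4):
          print(n, round(ideal_array_gain_db(n), 2), round(net_array_gain_db(n), 2))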

  1. A Microwave Photonic Interference Canceller: Architectures, Systems, and Integration

    NASA Astrophysics Data System (ADS)

    Chang, Matthew P.

    This thesis is a comprehensive portfolio of work on a Microwave Photonic Self-Interference Canceller (MPC), a specialized optical system designed to eliminate interference from radio-frequency (RF) receivers. The novelty and value of the microwave photonic system lie in its ability to operate over bandwidths and frequencies that are orders of magnitude larger than what is possible using existing RF technology. The work begins, in 2012, with a discrete fiber-optic microwave photonic canceller, which prior work had demonstrated as a proof-of-concept, and culminates, in 2017, with the first ever monolithically integrated microwave photonic canceller. With an eye towards practical implementation, the thesis establishes novelty through three major project thrusts (Fig. 1): (1) Extensive RF and system analysis to develop a full understanding of how, and through what mechanisms, MPCs affect an RF receiver. The first investigations of how a microwave photonic canceller performs in an actual wireless environment and a digital radio are also presented. (2) New architectures to improve the performance and functionality of MPCs, based on the analysis performed in Thrust 1. A novel balanced microwave photonic canceller architecture is developed and experimentally demonstrated. The balanced architecture shows significant improvements in link gain, noise figure, and dynamic range. Its main advantage is its ability to suppress common-mode noise and reduce noise figure by increasing the optical power. (3) Monolithic integration of the microwave photonic canceller into a photonic integrated circuit. This thrust presents the progression of integrating individual discrete devices into their semiconductor equivalent, as well as a full functional and RF analysis of the first ever integrated microwave photonic canceller.

  2. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identifying weaknesses of opposing teams or assessing the performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of the underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event, and player analysis in the case of soccer. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  3. 10 CFR 70.61 - Performance requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    Performance requirements. (a) Each applicant or licensee shall evaluate, in the integrated safety analysis performed in accordance with § 70.62, its compliance with the performance requirements in paragraphs (b), (c...

  4. 10 CFR 70.61 - Performance requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Performance requirements. (a) Each applicant or licensee shall evaluate, in the integrated safety analysis performed in accordance with § 70.62, its compliance with the performance requirements in paragraphs (b), (c...

  5. 10 CFR 70.61 - Performance requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Performance requirements. (a) Each applicant or licensee shall evaluate, in the integrated safety analysis performed in accordance with § 70.62, its compliance with the performance requirements in paragraphs (b), (c...

  6. 10 CFR 70.61 - Performance requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Performance requirements. (a) Each applicant or licensee shall evaluate, in the integrated safety analysis performed in accordance with § 70.62, its compliance with the performance requirements in paragraphs (b), (c...

  7. Integral abutment bridges under thermal loading : field monitoring and analysis.

    DOT National Transportation Integrated Search

    2017-08-01

    Integral abutment bridges (IABs) have gained popularity throughout the United States due to their low construction and maintenance costs. Previous research on IABs has been heavily focused on substructure performance, leaving a need for better unders...

  8. Integrated corridor management modeling results report : Dallas, Minneapolis, and San Diego.

    DOT National Transportation Integrated Search

    2012-02-01

    This executive summary documents the analysis methodologies, tools, and performance measures used to analyze Integrated Corridor Management (ICM) strategies; and presents high-level results for the successful implementation of ICM at three Stage 2 Pi...

  9. An Integrated Framework for Parameter-based Optimization of Scientific Workflows.

    PubMed

    Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2009-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
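
    A toy sketch of the kind of parameter-space search the framework addresses (the cost and quality models below are invented for illustration, not the framework's optimizer): enumerate candidate settings, discard those violating a quality floor, and keep the fastest remaining configuration.

      # Enumerate a small workflow-parameter space; some parameters only affect
      # runtime, others trade output quality for speed. Models are placeholders.
      from itertools import product

      chunk_sizes = [64, 128, 256]       # grouping/mapping-style parameter (no quality impact)
      resolutions = [0.25, 0.5, 1.0]     # quality-trading parameter

      def runtime(chunk, res):           # placeholder performance model
          return 100.0 * res / (chunk ** 0.5)

      def quality(res):                  # placeholder accuracy model
          return res

      feasible = [(runtime(c, r), c, r)
                  for c, r in product(chunk_sizes, resolutions)
                  if quality(r) >= 0.5]  # quality constraint
      best_time, best_chunk, best_res = min(feasible)
      print(best_chunk, best_res, round(best_time, 2))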

  10. High-Precision Image Aided Inertial Navigation with Known Features: Observability Analysis and Performance Evaluation

    PubMed Central

    Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun

    2014-01-01

    A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision, integrated position (centimeter-level) and attitude (half-degree-level) solutions can be achieved in a global reference frame. PMID:25330046
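
    A generic extended Kalman filter measurement update, of the kind iterated in a tightly-coupled image/INS filter, is sketched below; the two-state example, matrices, and noise values are placeholders, not the paper's navigation model.

      # One EKF measurement update: state x, covariance P, measurement z,
      # measurement function h(x) with Jacobian H, measurement noise R.
      import numpy as np

      def ekf_update(x, P, z, h, H, R):
          y = z - h(x)                           # innovation
          S = H @ P @ H.T + R                    # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
          x_new = x + K @ y
          P_new = (np.eye(len(x)) - K @ H) @ P
          return x_new, P_new

      # Toy example: a 2-state estimate corrected by one observation of state 0.
      x = np.array([0.0, 0.0]); P = np.eye(2)
      H = np.array([[1.0, 0.0]]); R = np.array([[0.01]])
      x, P = ekf_update(x, P, z=np.array([0.05]), h=lambda s: H @ s, H=H, R=R)
      print(x, np.diag(P))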

  11. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
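
    A minimal Monte Carlo sketch in the spirit of the method described above (the two-event tree, probabilities, downtimes, and the assumption that every customer is affected are all invented for illustration, not the studied system's values):

      # Monte Carlo over a tiny fault tree: top event = quantity failure OR
      # quality failure; CML-style measure = expected downtime per customer-year.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 100_000

      # Basic events per simulated year (placeholder probabilities and downtimes)
      quantity_fails = rng.random(N) < 0.02     # e.g., no water delivered
      quality_fails  = rng.random(N) < 0.05     # e.g., water delivered but out of spec
      downtime = np.where(quantity_fails, 600.0, 0.0) + np.where(quality_fails, 120.0, 0.0)

      top_event = quantity_fails | quality_fails          # OR gate
      cml = downtime.mean()                               # minutes lost per customer and year,
                                                          # assuming every customer is affected
      print(f"P(top event) ~ {top_event.mean():.4f}, CML ~ {cml:.1f} min/customer/year")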

  12. TRISO Fuel Performance: Modeling, Integration into Mainstream Design Studies, and Application to a Thorium-fueled Fusion-Fission Hybrid Blanket

    NASA Astrophysics Data System (ADS)

    Powers, Jeffrey J.

    2011-12-01

    This study focused on creating a new tristructural isotropic (TRISO) coated particle fuel performance model and demonstrating the integration of this model into an existing system of neutronics and heat transfer codes, creating a user-friendly option for including fuel performance analysis within system design optimization and system-level trade-off studies. The end product enables both a deeper understanding and better overall system performance of nuclear energy systems limited or greatly impacted by TRISO fuel performance. A thorium-fueled hybrid fusion-fission Laser Inertial Fusion Energy (LIFE) blanket design was used for illustrating the application of this new capability and demonstrated both the importance of integrating fuel performance calculations into mainstream design studies and the impact that this new integrated analysis had on system-level design decisions. A new TRISO fuel performance model named TRIUNE was developed and verified and validated during this work with a novel methodology established for simulating the actual lifetime of a TRISO particle during repeated passes through a pebble bed. In addition, integrated self-consistent calculations were performed for neutronics depletion analysis, heat transfer calculations, and then fuel performance modeling for a full parametric study that encompassed over 80 different design options that went through all three phases of analysis. Lastly, side studies were performed that included a comparison of thorium and depleted uranium (DU) LIFE blankets as well as some uncertainty quantification work to help guide future experimental work by assessing what material properties in TRISO fuel performance modeling are most in need of improvement. A recommended thorium-fueled hybrid LIFE engine design was identified with an initial fuel load of 20MT of thorium, 15% TRISO packing within the graphite fuel pebbles, and a 20cm neutron multiplier layer with beryllium pebbles in flibe molten salt coolant. It operated at a system power level of 2000 MWth, took about 3.5 years to reach full plateau power, and was capable of an End of Plateau burnup of 38.7 %FIMA if considering just the neutronic constraints in the system design; however, fuel performance constraints led to a maximum credible burnup of 12.1 %FIMA due to a combination of internal gas pressure and irradiation effects on the TRISO materials (especially PyC) leading to SiC pressure vessel failures. The optimal neutron spectrum for the thorium-fueled blanket options evaluated seemed to favor a hard spectrum (low but non-zero neutron multiplier thicknesses and high TRISO packing fractions) in terms of neutronic performance but the fuel performance constraints demonstrated that a significantly softer spectrum would be needed to decrease the rate of accumulation of fast neutron fluence in order to improve the maximum credible burnup the system could achieve.

  13. Finite time step and spatial grid effects in δf simulation of warm plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturdevant, Benjamin J., E-mail: benjamin.j.sturdevant@gmail.com; Department of Applied Mathematics, University of Colorado at Boulder, Boulder, CO 80309; Parker, Scott E.

    2016-01-15

    This paper introduces a technique for analyzing time integration methods used with the particle weight equations in δf method particle-in-cell (PIC) schemes. The analysis applies to the simulation of warm, uniform, periodic or infinite plasmas in the linear regime and considers the collective behavior similar to the analysis performed by Langdon for full-f PIC schemes [1,2]. We perform both a time integration analysis and spatial grid analysis for a kinetic ion, adiabatic electron model of ion acoustic waves. An implicit time integration scheme is studied in detail for δf simulations using our weight equation analysis and for full-f simulations using the method of Langdon. It is found that the δf method exhibits a CFL-like stability condition for low temperature ions, which is independent of the parameter characterizing the implicitness of the scheme. The accuracy of the real frequency and damping rate due to the discrete time and spatial schemes is also derived using a perturbative method. The theoretical analysis of numerical error presented here may be useful for the verification of simulations and for providing intuition for the design of new implicit time integration schemes for the δf method, as well as understanding differences between δf and full-f approaches to plasma simulation.

  14. Positive is usually good, negative is not always bad: The effects of group affect on social integration and task performance.

    PubMed

    Knight, Andrew P; Eisenkraft, Noah

    2015-07-01

    Grounded in a social functional perspective, this article examines the conditions under which group affect influences group functioning. Using meta-analysis, the authors leverage heterogeneity across 39 independent studies of 2,799 groups to understand how contextual factors - group affect source (exogenous or endogenous to the group) and group life span (one-shot or ongoing) - moderate the influence of shared feelings on social integration and task performance. As predicted, results indicate that group positive affect has consistent positive effects on social integration and task performance regardless of contextual idiosyncrasies. The effects of group negative affect, on the other hand, are context-dependent. Shared negative feelings promote social integration and task performance when stemming from an exogenous source or experienced in a one-shot group, but undermine social integration and task performance when stemming from an endogenous source or experienced in an ongoing group. The authors discuss implications of their findings and highlight directions for future theory and research on group affect. (c) 2015 APA, all rights reserved.

  15. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
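
    For context on the error-budget bookkeeping mentioned above, a conventional root-sum-of-squares roll-up of wavefront-error contributors against an allocation looks like the sketch below; the contributor names and numbers are placeholders, not JWST's budget.

      # RSS roll-up of wavefront-error contributors and remaining margin.
      import math

      contributors_nm_rms = {        # nm RMS, placeholder values
          "mirror figure": 60.0,
          "alignment": 40.0,
          "thermal drift": 30.0,
          "dynamics/jitter": 20.0,
      }
      allocation_nm_rms = 100.0

      total = math.sqrt(sum(v ** 2 for v in contributors_nm_rms.values()))
      margin = math.sqrt(max(allocation_nm_rms ** 2 - total ** 2, 0.0))
      print(f"RSS total = {total:.1f} nm, allocation = {allocation_nm_rms:.1f} nm, "
            f"margin = {margin:.1f} nm")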

  16. Multi-Disciplinary Analysis for Future Launch Systems Using NASA's Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Monell, D.; Mathias, D.; Reuther, J.; Garn, M.

    2003-01-01

    A new engineering environment constructed for the purposes of analyzing and designing Reusable Launch Vehicles (RLVs) is presented. The new environment has been developed to allow NASA to perform independent analysis and design of emerging RLV architectures and technologies. The new Advanced Engineering Environment (AEE) is both collaborative and distributed. It facilitates integration of the analyses by both vehicle performance disciplines and life-cycle disciplines. Current performance disciplines supported include: weights and sizing, aerodynamics, trajectories, propulsion, structural loads, and CAD-based geometries. Current life-cycle disciplines supported include: DDT&E cost, production costs, operations costs, flight rates, safety and reliability, and system economics. Involving six NASA centers (ARC, LaRC, MSFC, KSC, GRC and JSC), AEE has been tailored to serve as a web-accessed agency-wide source for all of NASA's future launch vehicle systems engineering functions. Thus, it is configured to facilitate (a) data management, (b) automated tool/process integration and execution, and (c) data visualization and presentation. The core components of the integrated framework are a customized PTC Windchill product data management server, a set of RLV analysis and design tools integrated using Phoenix Integration's Model Center, and an XML-based data capture and transfer protocol. The AEE system has seen production use during the Initial Architecture and Technology Review for the NASA 2nd Generation RLV program, and it continues to undergo development and enhancements in support of its current main customer, the NASA Next Generation Launch Technology (NGLT) program.

  17. Development and Application of an Integrated Approach toward NASA Airspace Systems Research

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Fong, Robert K.; Abramson, Paul D.; Koenke, Ed

    2008-01-01

    The National Aeronautics and Space Administration's (NASA) Airspace Systems Program is contributing air traffic management research in support of the 2025 Next Generation Air Transportation System (NextGen). Contributions support research and development needs provided by the interagency Joint Planning and Development Office (JPDO). These needs generally call for integrated technical solutions that improve system-level performance and work effectively across multiple domains and planning time horizons. In response, the Airspace Systems Program is pursuing an integrated research approach and has adapted systems engineering best practices for application in a research environment. Systems engineering methods aim to enable researchers to methodically compare different technical approaches, consider system-level performance, and develop compatible solutions. Systems engineering activities are performed iteratively as the research matures. Products of this approach include a demand and needs analysis, system-level descriptions focusing on NASA research contributions, system assessment and design studies, and common system-level metrics, scenarios, and assumptions. Results from the first systems engineering iteration include a preliminary demand and needs analysis; a functional modeling tool; and initial system-level metrics, scenario characteristics, and assumptions. Demand and needs analysis results suggest that several advanced concepts can mitigate demand/capacity imbalances for NextGen, but fall short of enabling three-times current-day capacity at the nation's busiest airports and airspace. Current activities are focusing on standardizing metrics, scenarios, and assumptions, conducting system-level performance assessments of integrated research solutions, and exploring key system design interfaces.

  18. Ceramic Integration Technologies for Energy and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Singh, Mrityunjay; Asthana, Ralph N.

    2007-01-01

    Robust and affordable integration technologies for advanced ceramics are required to improve the performance, reliability, efficiency, and durability of components, devices, and systems based on them in a wide variety of energy, aerospace, and environmental applications. Many thermochemical and thermomechanical factors including joint design, analysis, and optimization must be considered in integration of similar and dissimilar material systems.

  19. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  20. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO, and fast sensitivity simulations. A second-order response surface model of the form given below has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
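
    The second-order polynomial commonly used in RSM (a standard reconstruction, which may differ in notation from the source) is, for k design variables x_1, ..., x_k:

      y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \epsilon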

  1. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-06-01

    In this research, we examine the Naval Sea Logistics Command's Continuous Integrated Logistics Support Targeted Allowancing Technique (CILS TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method's effectiveness onboard U.S. Navy Ballistic Missile...Defense (BMD) ships, measure the costs associated with performing a CILS TAT, and provide recommendations concerning possible improvements to the

  2. Proceedings of the 21st Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Progress made by the Flat Plate Solar Array Project during the period April 1982 to January 1983 is described. Reports on polysilicon refining, thin film solar cell and module technology development, central station electric utility activities, silicon sheet growth and characteristics, advanced photovoltaic materials, cell and processes research, module technology, environmental isolation, engineering sciences, module performance and failure analysis and project analysis and integration are included.

  3. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  4. Documents of the JPL Photovoltaics Program Analysis and Integration Center: An annotated bibliography

    NASA Technical Reports Server (NTRS)

    Pearson, A. M.

    1985-01-01

    A bibliography of internal and external documents produced by the Jet Propulsion Laboratory, based on the work performed by the Photovoltaics Program Analysis and Integration Center, is presented with annotations. As shown in the Table of Contents, the bibliography is divided into three subject areas: (1) Assessments, (2) Methodological Studies, and (3) Supporting Studies. Annotated abstracts are presented for 20 papers.

  5. Integration of fracturing dynamics and pressure transient analysis for hydraulic fracture evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arihara, N.; Abbaszadeh, M.; Wright, C.A.

    This paper presents pre- and post-fracture pressure transient analysis, combined with net fracture pressure interpretation, for a well in a naturally fractured geothermal reservoir. Integrated analysis was performed to achieve a consistent interpretation of the created fracture geometry, propagation, conductivity, shrinkage, reservoir flow behavior, and formation permeability characteristics. The interpreted data includes two-rate pre-frac injection tests, step-rate injection tests, a series of pressure falloff tests, and the net fracturing pressure from a massive fracture treatment. Pressure transient analyses were performed utilizing advanced well test interpretation techniques and a thermal reservoir simulator with a fracture propagation option. Hydraulic fracture propagation analysis was also performed with a generalized 3-D dynamic fracture growth model simulator. Three major conclusions resulted from the combined analysis: (1) that an increasing number of hydraulic fractures were being simultaneously propagated during the fracture treatment; (2) that the reservoir behaved as a composite reservoir, with the outer region permeability being greater than the permeability of the region immediately surrounding the wellbore; and (3) that the created fractures extended into the outer region during the fracture treatment but retreated to the inner region several days after stimulation had ceased. These conclusions were apparent from independent pressure transient analysis and from independent hydraulic fracture propagation analysis. Integrated interpretation, however, increased the confidence in these conclusions and greatly aided the quantification of the created hydraulic fracture geometry and characterization of the reservoir permeability.

  6. Integration Test of the High Voltage Hall Accelerator System Components

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hani; Haag, Thomas; Huang, Wensheng; Pinero, Luis; Peterson, Todd; Dankanich, John

    2013-01-01

    NASA Glenn Research Center is developing a 4 kilowatt-class Hall propulsion system for implementation in NASA science missions. NASA science mission performance analysis was completed using the latest high voltage Hall accelerator (HiVHAc) and Aerojet-Rocketdyne's state-of-the-art BPT-4000 Hall thruster performance curves. Mission analysis results indicated that the HiVHAc thruster outperforms the BPT-4000 thruster for all but one of the missions studied. Tests of the HiVHAc system major components were performed. Performance evaluation of the HiVHAc thruster at NASA Glenn's vacuum facility 5 indicated that thruster performance was lower than performance levels attained during tests in vacuum facility 12, due to the lower background pressures attained during vacuum facility 5 tests when compared to vacuum facility 12. Voltage-current characterization of the HiVHAc thruster in vacuum facility 5 showed that the HiVHAc thruster can operate stably for a wide range of anode flow rates for discharge voltages between 250 and 600 volts. A Colorado Power Electronics enhanced brassboard power processing unit was tested in vacuum for 1,500 hours, and the unit demonstrated a discharge module efficiency of 96.3% at 3.9 kilowatts and 650 volts. Stand-alone open- and closed-loop tests of a VACCO TRL 6 xenon flow control module were also performed. An integrated test of the HiVHAc thruster, brassboard power processing unit, and xenon flow control module was performed and confirmed the integrated operation of the HiVHAc system's major components. Future plans include continuing the maturation of the HiVHAc system major components and the performance of a single-string integration test.

  7. Analysis of a Real-Time Separation Assurance System with Integrated Time-in-Trail Spacing

    NASA Technical Reports Server (NTRS)

    Aweiss, Arwa S.; Farrahi, Amir H.; Lauderdale, Todd A.; Thipphavong, Adam S.; Lee, Chu H.

    2010-01-01

    This paper describes the implementation and analysis of an integrated ground-based separation assurance and time-based metering prototype system into the Center-TRACON Automation System. The integration of this new capability accommodates constraints in four-dimensions: position (x-y), altitude, and meter-fix crossing time. Experiments were conducted to evaluate the performance of the integrated system and its ability to handle traffic levels up to twice that of today. Results suggest that the integrated system reduces the number and magnitude of time-in-trail spacing violations. This benefit was achieved without adversely affecting the resolution success rate of the system. Also, the data suggest that the integrated system is relatively insensitive to an increase in traffic of twice the current levels.

  8. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

    The requirements are defined for a software system entitled the Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interface with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. This system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.

  9. Integrated modeling environment for systems-level performance analysis of the Next-Generation Space Telescope

    NASA Astrophysics Data System (ADS)

    Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry

    1998-08-01

    All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular, tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment, which allows sub-system performance specifications to be analyzed parametrically and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and nonlinear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which directly link to science requirements.

  10. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take into account lifecycle uncertainties. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty, and to combine these with the analytical tools used with integrated modeling. In this manner, system uncertainty analysis becomes part of the design process, and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation, and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. The results from the second program objective are described. This report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.

  11. An integrated environmental and health performance quantification model for pre-occupancy phase of buildings in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaodong, E-mail: eastdawn@tsinghua.edu.cn; Su, Shu, E-mail: sushuqh@163.com; Zhang, Zhihui, E-mail: zhzhg@tsinghua.edu.cn

    To comprehensively pre-evaluate the damages to both the environment and human health due to construction activities in China, this paper presents an integrated building environmental and health performance (EHP) assessment model based on the Building Environmental Performance Analysis System (BEPAS) and the Building Health Impact Analysis System (BHIAS) models and offers a new inventory data estimation method. The new model follows the life cycle assessment (LCA) framework, and the inventory analysis step involves bill of quantity (BOQ) data collection, consumption data formation, and environmental profile transformation. The consumption data are derived from engineering drawings and quotas to conduct the assessment before construction for pre-evaluation. The new model classifies building impacts into three safeguard areas: ecosystems, natural resources, and human health. Thus, this model considers environmental impacts as well as damage to human wellbeing. The monetization approach, distance-to-target method, and panel method are considered as optional weighting approaches. Finally, nine residential buildings of different structural types are taken as case studies to test the operability of the integrated model through application. The results indicate that the new model can effectively pre-evaluate building EHP and that the structure type significantly affects the performance of residential buildings.

  12. Evaluation of the Aurora Application Shade Measurement Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-12-01

    Aurora is an integrated, Web-based application that helps solar installers perform sales, engineering design, and financial analysis. One of Aurora's key features is its high-resolution remote shading analysis.

  13. CFD in the context of IHPTET: The Integrated High Performance Turbine Engine Technology Program

    NASA Technical Reports Server (NTRS)

    Simoneau, Robert J.; Hudson, Dale A.

    1989-01-01

    The Integrated High Performance Turbine Engine Technology (IHPTET) Program is an integrated DOD/NASA technology program designed to double the performance capability of today's most advanced military turbine engines as we enter the twenty-first century. Computational Fluid Dynamics (CFD) is expected to play an important role in the design/analysis of specific configurations within this complex machine. In order to do this, a plan is being developed to ensure the timely impact of CFD on IHPTET. The developing philosophy of CFD in the context of IHPTET is discussed, along with the key elements of the developing plan and specific examples of state-of-the-art CFD efforts relevant to IHPTET turbine engines.

  14. Integration of heterogeneous data for classification in hyperspectral satellite imagery

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Czaja, W.; Dobrosotskaya, J.; Doster, T.; Duke, K.; Gillis, D.

    2012-06-01

    As new remote sensing modalities emerge, it becomes increasingly important to find more suitable algorithms for fusion and integration of different data types for the purposes of target/anomaly detection and classification. Typical techniques that deal with this problem are based on performing detection/classification/segmentation separately in chosen modalities, and then integrating the resulting outcomes into a more complete picture. In this paper we provide a broad analysis of a new approach, based on creating fused representations of the multi-modal data, which then can be subjected to analysis by means of the state-of-the-art classifiers or detectors. In this scenario we shall consider the hyperspectral imagery combined with spatial information. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. Then, the significant eigenvectors of the derived fused graph Laplace operator form the new representation, which provides integrated features from the heterogeneous input data. We compare these fused approaches with analysis of integrated outputs of spatial and spectral graph methods.
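
    Illustration only: a minimal Python sketch of the fused-representation idea, assuming random stand-in data, a simple k-nearest-neighbour joint graph and unit edge weights rather than the diffusion-kernel construction of the paper; the leading eigenvectors of the graph Laplacian serve as integrated spectral-spatial features for a downstream classifier.

        import numpy as np
        from scipy.sparse.csgraph import laplacian
        from sklearn.neighbors import kneighbors_graph

        rng = np.random.default_rng(0)
        n_pixels, n_bands = 500, 30
        spectra = rng.random((n_pixels, n_bands))   # stand-in hyperspectral pixels
        coords = rng.random((n_pixels, 2))          # stand-in spatial positions

        def standardize(x):
            return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)

        # Fuse modalities by concatenating standardized spectral and spatial features,
        # then build a joint k-nearest-neighbour affinity graph over all pixels.
        fused = np.hstack([standardize(spectra), standardize(coords)])
        affinity = kneighbors_graph(fused, n_neighbors=10, mode="connectivity", include_self=False)
        affinity = 0.5 * (affinity + affinity.T)    # symmetrize

        # Low-order eigenvectors of the normalized graph Laplacian give the
        # integrated low-dimensional representation fed to a classifier.
        lap = laplacian(affinity, normed=True).toarray()
        eigvals, eigvecs = np.linalg.eigh(lap)
        embedding = eigvecs[:, 1:11]                # drop the first (trivial) eigenvector
        print("fused representation shape:", embedding.shape)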

  15. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.
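
    Illustration only (not TTECTrA itself): a toy Python sketch, assuming a first-order engine thrust response and a simple PI setpoint controller at one flight condition, of the kind of closed-loop transient estimate such a tool provides ahead of detailed design; all numbers are placeholders.

        import numpy as np

        # Hypothetical first-order engine response: tau * d(thrust)/dt = -thrust + gain * fuel_flow
        tau, gain = 1.5, 2.0e4           # s, N per (kg/s); illustrative values only
        kp, ki = 4.0e-5, 8.0e-5          # PI gains on thrust error (illustrative)

        dt, t_end = 0.01, 10.0
        setpoint = 60_000.0              # commanded thrust step, N
        thrust, integ = 0.0, 0.0
        history = []
        for _ in range(int(t_end / dt)):
            error = setpoint - thrust
            integ += error * dt
            fuel_flow = kp * error + ki * integ          # controller output, kg/s
            thrust += dt * (-thrust + gain * fuel_flow) / tau
            history.append(thrust)

        history = np.array(history)
        t_within_2pct = np.argmax(np.abs(history - setpoint) < 0.02 * setpoint) * dt
        print(f"final thrust {history[-1]:.0f} N, first time within 2% of setpoint {t_within_2pct:.2f} s")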

  16. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey Thomas; Zinnecker, Alicia Mae

    2014-01-01

    Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single flight condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.

  17. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into the operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management since management-oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
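
    Illustration only: a small Python sketch of mapping raw measurements onto a common index range, using percentile-based corner points and piecewise-linear interpolation as a simplified stand-in for the generalized-norm-based nonlinear scaling described above.

        import numpy as np

        rng = np.random.default_rng(1)
        measurements = rng.gamma(shape=2.0, scale=3.0, size=5000)   # stand-in stress data

        # Corner points of the scaling function: taken here from percentiles as a
        # simple placeholder for corner points derived from generalized norms.
        corners = np.percentile(measurements, [1, 25, 50, 75, 99])
        index_levels = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

        def scale(x):
            # Monotone piecewise-linear map from raw values to the [-2, 2] index.
            return np.clip(np.interp(x, corners, index_levels), -2.0, 2.0)

        print("scaled example values:", scale(np.array([1.0, 6.0, 20.0])).round(2))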

  18. Analysis of airfoil leading edge separation bubbles

    NASA Technical Reports Server (NTRS)

    Carter, J. E.; Vatsa, V. N.

    1982-01-01

    A local inviscid-viscous interaction technique was developed for the analysis of low speed airfoil leading edge transitional separation bubbles. In this analysis an inverse boundary layer finite difference analysis is solved iteratively with a Cauchy integral representation of the inviscid flow which is assumed to be a linear perturbation to a known global viscous airfoil analysis. Favorable comparisons with data indicate the overall validity of the present localized interaction approach. In addition, numerical tests were performed to assess the sensitivity of the computed results to the mesh size, limits on the Cauchy integral, and the location of the transition region.

  19. Introducing WISDEM: An Integrated System Modeling for Wind Turbines and Plant (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykes, K.; Graf, P.; Scott, G.

    2015-01-01

    The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.

  20. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  1. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way for storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies, used by a range of engineers with different levels of background and expertise. Based on the feedback obtained by such users, conclusions were developed and are provided.

  2. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Astrophysics Data System (ADS)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way for storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies, used by a range of engineers with different levels of background and expertise. Based on the feedback obtained by such users, conclusions were developed and are provided.

  3. Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS

    NASA Astrophysics Data System (ADS)

    Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.

    2017-04-01

    The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
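
    Illustration only (not the BISON/TRANSURANUS implementation): a short Python sketch showing how a diffusion-based release estimate, here the classical Booth equivalent-sphere short-time approximation for initially retained gas, can be extended with a hypothetical burst-release fraction triggered by a rapid transient; the diffusion coefficient, sphere radius, trigger criterion and burst fraction are placeholders.

        import numpy as np

        def booth_fractional_release(D, a, t):
            # Booth equivalent-sphere, short-time approximation for the fractional
            # diffusional release of initially retained gas (valid for small D*t/a^2).
            x = D * t / a**2
            return np.minimum(6.0 * np.sqrt(x / np.pi) - 3.0 * x, 1.0)

        # Placeholder values, chosen only to produce plausible-looking numbers.
        D = 1.0e-20                            # effective gas diffusion coefficient, m^2/s
        a = 5.0e-6                             # equivalent sphere radius, m
        time = np.linspace(0.0, 2.0e7, 200)    # base irradiation interval, s

        fgr = booth_fractional_release(D, a, time)

        # Hypothetical burst-release extension: if a transient heating rate exceeds a
        # threshold, an extra fraction of the gas still retained is released at once.
        ramp_rate = 5.0        # K/s during the transient (placeholder)
        burst_threshold = 1.0  # K/s (placeholder)
        burst_fraction = 0.2   # fraction of retained gas released by micro-cracking (placeholder)
        fgr_end = fgr[-1]
        if ramp_rate > burst_threshold:
            fgr_end = fgr_end + burst_fraction * (1.0 - fgr_end)

        print(f"diffusional FGR at end of base irradiation: {fgr[-1]:.1%}")
        print(f"FGR after transient with burst term:        {fgr_end:.1%}")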

  4. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  5. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    PubMed

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
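
    Illustration only (not the BIGDAWG code): a minimal Python sketch of the kind of 2x2 comparison performed per allele, computing a chi-squared statistic, odds ratio and 95% confidence interval from invented case/control counts.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical counts of one allele vs. all other alleles in cases and controls.
        #                  allele  other
        table = np.array([[42,    158],    # cases
                          [21,    179]])   # controls

        chi2, p_value, dof, _ = chi2_contingency(table, correction=False)

        # Odds ratio and Woolf (log-scale) 95% confidence interval.
        a, b, c, d = table.ravel().astype(float)
        odds_ratio = (a * d) / (b * c)
        se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
        ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

        print(f"chi2 = {chi2:.2f} (dof={dof}), p = {p_value:.4f}")
        print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")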

  6. High Fidelity Thermal Simulators for Non-Nuclear Testing: Analysis and Initial Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David

    2007-01-01

    Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat transfer, and stress related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing, providing a better assessment of system integration issues, characterization of integrated system response times and response characteristics, and assessment of potential design improvements at a relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. Through a series of iterative analyses, a conceptual high fidelity design can be developed. Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
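
    Illustration only: a tiny Python sketch of the kind of one-dimensional, single-axial-location, steady-state check mentioned above for screening simulator concepts, estimating the radial temperature rise across two cylindrical layers from a linear heat rate; geometry and material properties are placeholders, and the detailed analysis described in the record was performed with SINDA/FLUINT.

        import numpy as np

        def radial_temperature_rise(q_linear, r_inner, r_outer, conductivity):
            # Steady 1-D conduction through a cylindrical shell with no internal
            # generation: delta_T = q' * ln(r_outer/r_inner) / (2*pi*k).
            return q_linear * np.log(r_outer / r_inner) / (2.0 * np.pi * conductivity)

        # Placeholder numbers for one axial location of a heater pin.
        q_linear = 15_000.0   # W/m linear heat rate
        dT_sheath = radial_temperature_rise(q_linear, r_inner=3.0e-3, r_outer=4.0e-3, conductivity=20.0)
        dT_gap = radial_temperature_rise(q_linear, r_inner=4.0e-3, r_outer=4.1e-3, conductivity=0.3)

        print(f"sheath delta-T ~ {dT_sheath:.0f} K, gap delta-T ~ {dT_gap:.0f} K at this axial location")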

  7. Analysis of Logistics in Support of a Human Lunar Outpost

    NASA Technical Reports Server (NTRS)

    Cirillo, William; Earle, Kevin; Goodliff, Kandyce; Reeves, J. D.; Andrashko, Mark; Merrill, R. Gabe; Stromgren, Chel

    2008-01-01

    Strategic level analysis of the integrated behavior of lunar transportation system and lunar surface system architecture options is performed to inform NASA Constellation Program senior management on the benefit, viability, affordability, and robustness of system design choices. This paper presents an overview of the approach used to perform the campaign (strategic) analysis, with an emphasis on the logistics modeling and the impacts of logistics resupply on campaign behavior. An overview of deterministic and probabilistic analysis approaches is provided, with a discussion of the importance of each approach to understanding the integrated system behavior. The logistics required to support lunar surface habitation are analyzed from both 'macro-logistics' and 'micro-logistics' perspectives, where macro-logistics focuses on the delivery of goods to a destination and micro-logistics focuses on local handling of re-supply goods at a destination. An example campaign is provided to tie the theories of campaign analysis to results generation capabilities.

  8. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-05-30

    In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  9. Detectable states, cycle fluxes, and motility scaling of molecular motor kinesin: An integrative kinetic graph theory analysis

    NASA Astrophysics Data System (ADS)

    Ren, Jie

    2017-12-01

    The process by which a kinesin motor couples its ATPase activity with concerted mechanical hand-over-hand steps is a foremost topic of molecular motor physics. Two major routes toward elucidating kinesin mechanisms are the motility performance characterization of velocity and run length, and single-molecular state detection experiments. However, these two sets of experimental approaches are largely uncoupled to date. Here, we introduce an integrative motility state analysis based on a theorized kinetic graph theory for kinesin, which, on one hand, is validated by a wealth of accumulated motility data, and, on the other hand, allows for rigorous quantification of state occurrences and chemomechanical cycling probabilities. An interesting linear scaling for kinesin motility performance across species is discussed as well. An integrative kinetic graph theory analysis provides a powerful tool to bridge motility and state characterization experiments, so as to forge a unified effort for the elucidation of the working mechanisms of molecular motors.
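
    Illustration only: a small Python sketch of the kinetic-graph viewpoint on a toy three-state cycle with invented rates (not kinesin's); the steady-state occupancies follow from the master-equation generator, and the net cycle flux from the difference of forward and backward edge fluxes.

        import numpy as np

        # Toy 3-state cycle with forward rates kf and backward rates kb (1/s).
        kf = np.array([100.0, 50.0, 20.0])   # 0->1, 1->2, 2->0
        kb = np.array([5.0, 10.0, 2.0])      # 1->0, 2->1, 0->2

        n = 3
        K = np.zeros((n, n))
        for i in range(n):
            j = (i + 1) % n
            K[j, i] += kf[i]      # transition i -> j
            K[i, j] += kb[i]      # transition j -> i
        K -= np.diag(K.sum(axis=0))          # column sums zero: master-equation generator

        # Steady state: K @ p = 0 with probabilities summing to 1.
        A = np.vstack([K, np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        p, *_ = np.linalg.lstsq(A, b, rcond=None)

        # Net cycle flux (equal across every edge of a single cycle at steady state).
        flux = kf[0] * p[0] - kb[0] * p[1]
        print("state occupancies:", p.round(4))
        print(f"net cycle flux: {flux:.2f} cycles/s")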

  10. Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time

    NASA Technical Reports Server (NTRS)

    Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan

    2012-01-01

    Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission-critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
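
    Illustration only: a minimal Python sketch, with an invented causal directed graph of a hypothetical test-stand subsystem, of the two traversals described above: upstream from an observed anomaly to collect candidate root causes, and downstream to predict impacted functions.

        from collections import defaultdict, deque

        # Hypothetical cause -> effect edges for a test-stand subsystem.
        edges = [
            ("valve_stuck", "low_lox_flow"),
            ("sensor_bias", "low_lox_flow"),
            ("low_lox_flow", "chamber_pressure_low"),
            ("chamber_pressure_low", "thrust_shortfall"),
        ]

        downstream = defaultdict(list)
        upstream = defaultdict(list)
        for cause, effect in edges:
            downstream[cause].append(effect)
            upstream[effect].append(cause)

        def reachable(start, graph):
            # Breadth-first traversal; returns every node reachable from start.
            seen, queue = set(), deque([start])
            while queue:
                node = queue.popleft()
                for nxt in graph[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        observed = "low_lox_flow"
        print("candidate root causes:", reachable(observed, upstream))
        print("predicted impacts:    ", reachable(observed, downstream))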

  11. Performance Analysis on Carrier Phase-Based Tightly-Coupled GPS/BDS/INS Integration in GNSS Degraded and Denied Environments

    PubMed Central

    Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong

    2015-01-01

    The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information; however, it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty-seven-state TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity-fixed double-difference (DD) carrier phase measurements with the INS-predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, and a two-step estimation procedure is applied with an adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability as compared to that of a GPS/BDS-only or single satellite navigation system integrated strategy, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with the adaptive covariance matrix in the presence of low elevation multipath, relative to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock of the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that the TC integration system achieves a few centimeters accuracy in positioning based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons); thus we can see the advantage of positioning at high cut-off elevations that the combined GPS/BDS brings. PMID:25875191
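
    Illustration only (far simpler than the twenty-seven-state tightly-coupled filter of the paper): a one-dimensional Python sketch of the Kalman filter predict-update structure used in GNSS/INS fusion, integrating a biased accelerometer between periodic GNSS-like position updates; all models and noise values are placeholders.

        import numpy as np

        rng = np.random.default_rng(2)
        dt, n_steps = 0.1, 300

        # Truth: 1-D trajectory with constant acceleration (illustrative).
        true_acc = 0.2
        true_pos = 0.5 * true_acc * (np.arange(n_steps) * dt) ** 2
        accel_bias = 0.05

        # State: [position, velocity, accelerometer bias]
        x = np.zeros(3)
        P = np.diag([10.0, 1.0, 0.1])
        Q = np.diag([1e-4, 1e-3, 1e-6])          # process noise (placeholder)
        R = np.array([[4.0]])                    # GNSS-like measurement noise, m^2
        H = np.array([[1.0, 0.0, 0.0]])          # the measurement observes position

        for k in range(n_steps):
            accel_meas = true_acc + accel_bias + rng.normal(0, 0.02)
            # Predict: integrate the bias-corrected accelerometer reading.
            F = np.array([[1, dt, -0.5 * dt**2],
                          [0, 1, -dt],
                          [0, 0, 1]])
            u = np.array([0.5 * dt**2, dt, 0.0]) * accel_meas
            x = F @ x + u
            P = F @ P @ F.T + Q
            # Update with a GNSS-like position measurement every 10 steps (1 Hz).
            if k % 10 == 0:
                z = true_pos[k] + rng.normal(0, 2.0)
                y = z - H @ x
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + K @ y
                P = (np.eye(3) - K @ H) @ P

        final_true = 0.5 * true_acc * (n_steps * dt) ** 2
        print(f"estimated accel bias {x[2]:.3f} (truth {accel_bias}); position error {x[0] - final_true:.2f} m")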

  12. Performance analysis on carrier phase-based tightly-coupled GPS/BDS/INS integration in GNSS degraded and denied environments.

    PubMed

    Han, Houzeng; Wang, Jian; Wang, Jinling; Tan, Xinglong

    2015-04-14

    The integration of Global Navigation Satellite Systems (GNSS) carrier phases with Inertial Navigation System (INS) measurements is essential to provide accurate and continuous position, velocity and attitude information; however, it is necessary to fix ambiguities rapidly and reliably to obtain high accuracy navigation solutions. In this paper, we present the notion of combining the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS) and low-cost micro-electro-mechanical sensors (MEMS) inertial systems for reliable navigation. An adaptive multipath factor-based tightly-coupled (TC) GPS/BDS/INS integration algorithm is presented and the overall performance of the integrated system is illustrated. A twenty-seven-state TC GPS/BDS/INS model is adopted with an extended Kalman filter (EKF), which is carried out by directly fusing ambiguity-fixed double-difference (DD) carrier phase measurements with the INS-predicted pseudoranges to estimate the error states. The INS-aided integer ambiguity resolution (AR) strategy is developed by using a dynamic model, and a two-step estimation procedure is applied with an adaptively estimated covariance matrix to further improve the AR performance. A field vehicular test was carried out to demonstrate the positioning performance of the combined system. The results show the TC GPS/BDS/INS system significantly improves the single-epoch AR reliability as compared to that of a GPS/BDS-only or single satellite navigation system integrated strategy, especially for high cut-off elevations. The AR performance is also significantly improved for the combined system with the adaptive covariance matrix in the presence of low elevation multipath, relative to the GNSS-only case. A total of fifteen simulated outage tests also show that the time to relock of the GPS/BDS signals is shortened, which improves the system availability. The results also indicate that the TC integration system achieves a few centimeters accuracy in positioning based on the comparison analysis and covariance analysis, even in harsh environments (e.g., in urban canyons); thus we can see the advantage of positioning at high cut-off elevations that the combined GPS/BDS brings.

  13. Noise characteristics analysis of short wave infrared InGaAs focal plane arrays

    NASA Astrophysics Data System (ADS)

    Yu, Chunlei; Li, Xue; Yang, Bo; Huang, Songlei; Shao, Xiumei; Zhang, Yaguang; Gong, Haimei

    2017-09-01

    The increasing application of InGaAs short wave infrared (SWIR) focal plane arrays (FPAs) in low light level imaging requires ultra-low noise FPAs. This paper presents the theoretical analysis of FPA noise, and points out that both dark current and detector capacitance strongly affect the FPA noise. The impact of dark current and detector capacitance on FPA noise is compared in different situations. In order to obtain low noise performance FPAs, the demand for reducing detector capacitance is higher especially when pixel pitch is smaller, integration time is shorter, and integration capacitance is larger. Several InGaAs FPAs were measured and analyzed, and the experimental results fit the calculated results well. The study found that the major contributor to FPA noise at shorter integration times is coupled noise. The influence of detector capacitance on FPA noise is more significant than that of dark current. To investigate the effect of detector performance on FPA noise, two kinds of photodiodes with different concentrations in the absorption layer were fabricated. The detectors' performance and noise characteristics were measured and analyzed, and the results are consistent with the theoretical analysis.
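
    Illustration only: a simplified Python sketch of a per-pixel noise budget with two of the contributions discussed above, dark-current shot noise and a kTC-type capacitance term, showing how the capacitance term dominates at short integration times; device values are placeholders and the real coupled-noise behaviour is more involved.

        import numpy as np

        q = 1.602e-19      # elementary charge, C
        kB = 1.381e-23     # Boltzmann constant, J/K

        def noise_electrons(dark_current_A, detector_cap_F, t_int_s, temperature_K=300.0):
            # Very simplified per-pixel noise budget (illustrative terms only).
            shot = np.sqrt(dark_current_A * t_int_s / q)             # dark-current shot noise, e- rms
            ktc = np.sqrt(kB * temperature_K * detector_cap_F) / q   # capacitance (kTC-type) noise, e- rms
            return shot, ktc, np.sqrt(shot**2 + ktc**2)

        # Placeholder device parameters for a small-pitch InGaAs pixel.
        for t_int in (0.1e-3, 1e-3, 10e-3):
            shot, ktc, total = noise_electrons(dark_current_A=5e-15, detector_cap_F=30e-15, t_int_s=t_int)
            print(f"t_int={t_int*1e3:4.1f} ms  shot={shot:6.1f} e-  kTC={ktc:6.1f} e-  total={total:6.1f} e-")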

  14. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
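
    Illustration only: a toy Python sketch of the compute/storage-affinity idea, co-locating records from two hypothetical datasets by a shared spatiotemporal bin key so that co-located, conditional analysis touches only matching chunks; the rectangular latitude/longitude/day binning is a simplification of hierarchical-triangular-mesh indexing, and all data are invented.

        from collections import defaultdict
        from datetime import datetime

        def bin_key(lat, lon, when, cell_deg=1.0):
            # Rectangular spatiotemporal bin: (lat cell, lon cell, day).
            return (int(lat // cell_deg), int(lon // cell_deg), when.strftime("%Y-%m-%d"))

        # Two hypothetical "diverse" datasets: satellite retrievals and model output.
        satellite = [(38.9, -77.0, datetime(2016, 7, 1, 13), 301.2),
                     (39.4, -76.6, datetime(2016, 7, 1, 14), 299.8)]
        model =     [(38.7, -76.9, datetime(2016, 7, 1, 12), 300.1),
                     (41.0, -70.0, datetime(2016, 7, 2, 12), 295.4)]

        chunks = defaultdict(lambda: {"satellite": [], "model": []})
        for lat, lon, when, value in satellite:
            chunks[bin_key(lat, lon, when)]["satellite"].append(value)
        for lat, lon, when, value in model:
            chunks[bin_key(lat, lon, when)]["model"].append(value)

        # Co-located comparison happens only where both datasets share a bin.
        for key, data in chunks.items():
            if data["satellite"] and data["model"]:
                print(key, "satellite mean:", sum(data["satellite"]) / len(data["satellite"]),
                      "model mean:", sum(data["model"]) / len(data["model"]))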

  15. Performance Effects of Measurement and Analysis: Perspectives from CMMI High Maturity Organizations and Appraisers

    DTIC Science & Technology

    2010-06-01

    The Chi-Square test fails to reject the null hypothesis that there is no difference between 2008 and 2009 data (p-value = 0.601). This ... attributed to process performance modeling. Table 4: Relationships between data quality and integrity activities and overall value attributed to ... data quality and integrity; staffing and resources devoted to the work; pertinent training and coaching; and the alignment of the models with

  16. Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-81

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Lin, Jill D.

    1997-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-81. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-81 and the resulting effect on the Space Shuttle Program.

  17. Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-83

    NASA Technical Reports Server (NTRS)

    Lin, Jill D.; Katnik, Gregory N.

    1997-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-83. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-83 and the resulting effect on the Space Shuttle Program.

  18. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-71

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-71. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-71 and the resulting effect on the Space Shuttle Program.

  19. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-102

    NASA Technical Reports Server (NTRS)

    Rivera, Jorge E.; Kelly, J. David (Technical Monitor)

    2001-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-102. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-102 and the resulting effect on the Space Shuttle Program.

  20. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-94

    NASA Technical Reports Server (NTRS)

    Bowen, Barry C.; Lin, Jill D.

    1997-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-94. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-94 and the resulting effect on the Space Shuttle Program.

  1. Debris/ice/tps Assessment and Integrated Photographic Analysis of Shuttle Mission STS-79

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Lin, Jill D.

    1996-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-79. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-79 and the resulting effect on the Space Shuttle Program.

  2. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-73

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-73. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle Mission STS-73 and the resulting effect on the Space Shuttle Program.

  3. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-50

    NASA Technical Reports Server (NTRS)

    Higginbotham, Scott A.; Davis, J. Bradley; Katnik, Gregory N.

    1992-01-01

    A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-50. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-50, and the resulting effect on the Space Shuttle Program are documented.

  4. Debris/Ice/TPS Assessment and Integrated Photographic Analysis for Shuttle Mission STS-49

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1992-01-01

    A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-49. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. Debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-49, and the resulting effect on the Space Shuttle Program are discussed.

  5. Debris/Ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-77

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Lin, Jill D. (Compiler)

    1996-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-77. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-77 and the resulting effect on the Space Shuttle Program.

  6. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-70

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-70. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-70 and the resulting effect on the Space Shuttle Program.

  7. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-51

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1993-01-01

    A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-51. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle mission STS-51 and the resulting effect on the Space Shuttle Program.

  8. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-55

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1993-01-01

    A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle mission STS-55. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/Frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle mission STS-55, and the resulting effect on the Space Shuttle Program.

  9. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-69

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-69. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in flight anomalies. This report documents the ice/debris/thermal protection system condition and integrated photographic analysis of Shuttle Mission STS-69 and the resulting effect on the Space Shuttle Program.

  10. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-52

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1992-01-01

    A debris/ice/Thermal Protection System (TPS) assessment and integrated photographic analysis was conducted for Shuttle Mission STS-52. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-52, and the resulting effect on the Space Shuttle Program.

  11. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-106

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Kelley, J. David (Technical Monitor)

    2000-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-106. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-106 and the resulting effect on the Space Shuttle Program.

  12. Debris/Ice/TPS assessment and integrated photographic analysis of shuttle mission STS-76

    NASA Technical Reports Server (NTRS)

    Lin, Jill D.

    1996-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-76. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-76 and the resulting effect on the Space Shuttle Program.

  13. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-53

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1993-01-01

    A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-53. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/Frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-53, and the resulting effect on the Space Shuttle Program.

  14. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-54

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1993-01-01

    A Debris/Ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-54. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-54, and the resulting effect on the Space Shuttle Program.

  15. Debris/Ice/TPS assessment and integrated photographic analysis for Shuttle Mission STS-61

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1994-01-01

    A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-61. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/TPS conditions and integrated photographic analysis of shuttle mission STS-61, and the resulting effect on the space shuttle program.

  16. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-72

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.

    1996-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-72. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-72 and the resulting effect on the Space Shuttle Program.

  17. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle mission STS-58

    NASA Technical Reports Server (NTRS)

    Davis, J. Bradley; Rivera, Jorge E.; Katnik, Gregory N.; Bowen, Barry C.; Speece, Robert F.; Rosado, Pedro J.

    1994-01-01

    A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for Shuttle mission STS-58. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The ice/debris/TPS conditions and integrated photographic analysis of Shuttle mission STS-58, and the resulting effect on the Space Shuttle Program are documented.

  18. Debris/ice/TPS assessment and integrated photographic analysis for Shuttle mission STS-47

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Higginbotham, Scott A.; Davis, J. Bradley

    1992-01-01

    A debris/ice/TPS assessment and integrated photographic analysis was conducted for Shuttle Mission STS-47. Debris inspections of the flight elements and launch pad were performed before and after launch. Ice/frost conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography was analyzed after launch to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/TPS conditions and integrated photographic analysis of Shuttle Mission STS-47, and the resulting effect on the Space Shuttle Program.

  19. An Integrated Approach for Conducting a Behavioral Systems Analysis

    ERIC Educational Resources Information Center

    Diener, Lori H.; McGee, Heather M.; Miguel, Caio F.

    2009-01-01

    The aim of this paper is to illustrate how to conduct a Behavioral Systems Analysis (BSA) to aid in the design of targeted performance improvement interventions. BSA is a continuous process of analyzing the right variables to the right extent to aid in planning and managing performance at the organization, process, and job levels. BSA helps to…

  20. Pre-irradiation testing and analysis to support the LWRS Hybrid SiC-CMC-Zircaloy-04 unfueled rodlet irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isabella J van Rooyen

    2012-09-01

    Nuclear fuel performance is a significant driver of nuclear power plant operational performance, safety, economics and waste disposal requirements. The Advanced Light Water Reactor (LWR) Nuclear Fuel Development Pathway focuses on improving the scientific knowledge basis to enable the development of high-performance, high burn-up fuels with improved safety and cladding integrity and improved nuclear fuel cycle economics. To achieve significant improvements, fundamental changes are required in the areas of nuclear fuel composition, cladding integrity, and fuel/cladding interaction.

  1. Pre-irradiation testing and analysis to support the LWRS Hybrid SiC-CMC-Zircaloy-04 unfueled rodlet irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isabella J van Rooyen

    2013-01-01

    Nuclear fuel performance is a significant driver of nuclear power plant operational performance, safety, economics and waste disposal requirements. The Advanced Light Water Reactor (LWR) Nuclear Fuel Development Pathway focuses on improving the scientific knowledge basis to enable the development of high-performance, high burn-up fuels with improved safety and cladding integrity and improved nuclear fuel cycle economics. To achieve significant improvements, fundamental changes are required in the areas of nuclear fuel composition, cladding integrity, and fuel/cladding interaction.

  2. Meta-Analysis of Integrity Tests: A Critical Examination of Validity Generalization and Moderator Variables

    DTIC Science & Technology

    1992-06-01

    Performing organization: University of Iowa; monitoring organization: Defense Personnel… Results indicate that integrity test validities are positive and in many cases substantial for predicting both job performance and…

  3. Multi-ingredients determination and fingerprint analysis of leaves from Ilex latifolia using ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Fan, Chunlin; Deng, Jiewei; Yang, Yunyun; Liu, Junshan; Wang, Ying; Zhang, Xiaoqi; Fai, Kuokchiu; Zhang, Qingwen; Ye, Wencai

    2013-10-01

    An ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) method integrating multi-ingredient determination and fingerprint analysis has been established for quality assessment and control of leaves from Ilex latifolia. The method offers speed, efficiency, and accuracy, and allows multi-ingredient determination and fingerprint analysis in one chromatographic run within 13 min. Multi-ingredient determination was performed on the extracted ion chromatograms of the exact pseudo-molecular ions (with a 0.01 Da window), and fingerprint analysis was performed on the base peak chromatograms, both obtained by negative-ion electrospray ionization QTOF-MS. Method validation demonstrated that the developed method possesses desirable specificity, linearity, precision, and accuracy. The method was utilized to analyze 22 I. latifolia samples from different origins. Quality assessment was achieved by using both similarity analysis (SA) and principal component analysis (PCA), and the results from SA were consistent with those from PCA. Our experimental results demonstrate that the strategy of integrating multi-ingredient determination and fingerprint analysis using the UPLC-QTOF-MS technique is a useful approach for rapid pharmaceutical analysis, with promising prospects for the differentiation of origin, the determination of authenticity, and the overall quality assessment of herbal medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
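
    As a minimal sketch of how the quoted SA/PCA quality assessment could be reproduced on binned fingerprint data (not the authors' code; the sample matrix and bin count below are hypothetical placeholders):

```python
# Hedged sketch: compare chromatographic fingerprints by cosine similarity against a
# consensus fingerprint, and project them with PCA for grouping by origin.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two binned fingerprint intensity vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def pca_scores(X, n_components=2):
    """Scores of mean-centred fingerprints on their leading principal components."""
    Xc = X - X.mean(axis=0)
    Vt = np.linalg.svd(Xc, full_matrices=False)[2]
    return Xc @ Vt[:n_components].T

# Hypothetical data: 22 samples x 500 retention-time bins of base-peak intensities.
rng = np.random.default_rng(0)
fingerprints = rng.random((22, 500))
reference = fingerprints.mean(axis=0)                 # consensus fingerprint
similarities = [cosine_similarity(f, reference) for f in fingerprints]
scores = pca_scores(fingerprints)                     # 2-D scores for visual grouping
print(np.round(similarities[:5], 3), scores.shape)
```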

  4. The Joint Distribution Process Analysis Center (JDPAC): Background and Current Capability

    DTIC Science & Technology

    2007-06-12

    Briefing topics include: Systems Integration and Data Management; JDDE Analysis/Global Distribution Performance Assessment; Futures/Transformation Analysis; Balancing Operational Art and Science; JDPAC "101"; USTRANSCOM Future Operations Center; SDDC-TEA (Army SES, dual hat); Transportability Engineering; other Title 10…

  5. TRISO Fuel Performance: Modeling, Integration into Mainstream Design Studies, and Application to a Thorium-fueled Fusion-Fission Hybrid Blanket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powers, Jeffrey James

    2011-11-30

    This study focused on creating a new tristructural isotropic (TRISO) coated particle fuel performance model and demonstrating the integration of this model into an existing system of neutronics and heat transfer codes, creating a user-friendly option for including fuel performance analysis within system design optimization and system-level trade-off studies. The end product enables both a deeper understanding and better overall system performance of nuclear energy systems limited or greatly impacted by TRISO fuel performance. A thorium-fueled hybrid fusion-fission Laser Inertial Fusion Energy (LIFE) blanket design was used for illustrating the application of this new capability and demonstrated both the importance of integrating fuel performance calculations into mainstream design studies and the impact that this new integrated analysis had on system-level design decisions. A new TRISO fuel performance model named TRIUNE was developed and verified and validated during this work with a novel methodology established for simulating the actual lifetime of a TRISO particle during repeated passes through a pebble bed. In addition, integrated self-consistent calculations were performed for neutronics depletion analysis, heat transfer calculations, and then fuel performance modeling for a full parametric study that encompassed over 80 different design options that went through all three phases of analysis. Lastly, side studies were performed that included a comparison of thorium and depleted uranium (DU) LIFE blankets as well as some uncertainty quantification work to help guide future experimental work by assessing what material properties in TRISO fuel performance modeling are most in need of improvement. A recommended thorium-fueled hybrid LIFE engine design was identified with an initial fuel load of 20 MT of thorium, 15% TRISO packing within the graphite fuel pebbles, and a 20 cm neutron multiplier layer with beryllium pebbles in flibe molten salt coolant. It operated at a system power level of 2000 MWth, took about 3.5 years to reach full plateau power, and was capable of an End of Plateau burnup of 38.7 %FIMA if considering just the neutronic constraints in the system design; however, fuel performance constraints led to a maximum credible burnup of 12.1 %FIMA due to a combination of internal gas pressure and irradiation effects on the TRISO materials (especially PyC) leading to SiC pressure vessel failures. The optimal neutron spectrum for the thorium-fueled blanket options evaluated seemed to favor a hard spectrum (low but non-zero neutron multiplier thicknesses and high TRISO packing fractions) in terms of neutronic performance, but the fuel performance constraints demonstrated that a significantly softer spectrum would be needed to decrease the rate of accumulation of fast neutron fluence in order to improve the maximum credible burnup the system could achieve.

  6. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach to land subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. The world today is a multi-sensor platform, and measurement integration is strictly necessary. Combining the different data sources should be done carefully, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous time series of data from different sources and attempts to separate subsidence contributions. For this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field, where the comparison between SAR and production volumes reveals a correlation between the two measurements in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers, and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is the only method for interpreting results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offers several advantages in monitoring land subsidence: it permits a first qualitative differentiation of the natural and anthropic components of subsidence, and also gives more reliability and coverage to each measurement, drawing on the strong points of each technique.
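
    A minimal sketch of the kind of semi-quantitative comparison described above, correlating an injection time series with SAR-derived displacement; the series below are hypothetical placeholders, not Eni or PISAV data:

```python
# Hedged sketch: correlate a monthly injected-volume series with SAR-derived
# vertical displacement as a first, semi-quantitative indicator of coupling.
import numpy as np

months = np.arange(24)
injected_volume = 1e5 + 2e4 * np.sin(2 * np.pi * months / 12)          # m^3/month (synthetic)
displacement_mm = 0.8 * (injected_volume - injected_volume.mean()) / 1e4 \
                  + np.random.default_rng(1).normal(0.0, 0.5, months.size)

r = np.corrcoef(injected_volume, displacement_mm)[0, 1]                 # Pearson correlation
print(f"volume vs. vertical displacement correlation: r = {r:.2f}")
```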

  7. Sparse representation based biomarker selection for schizophrenia with integrated analysis of fMRI and SNPs.

    PubMed

    Cao, Hongbao; Duan, Junbo; Lin, Dongdong; Shugart, Yin Yao; Calhoun, Vince; Wang, Yu-Ping

    2014-11-15

    Integrative analysis of multiple data types can take advantage of their complementary information and therefore may provide higher power to identify potential biomarkers that would be missed using individual data analysis. Due to different natures of diverse data modality, data integration is challenging. Here we address the data integration problem by developing a generalized sparse model (GSM) using weighting factors to integrate multi-modality data for biomarker selection. As an example, we applied the GSM model to a joint analysis of two types of schizophrenia data sets: 759,075 SNPs and 153,594 functional magnetic resonance imaging (fMRI) voxels in 208 subjects (92 cases/116 controls). To solve this small-sample-large-variable problem, we developed a novel sparse representation based variable selection (SRVS) algorithm, with the primary aim to identify biomarkers associated with schizophrenia. To validate the effectiveness of the selected variables, we performed multivariate classification followed by a ten-fold cross validation. We compared our proposed SRVS algorithm with an earlier sparse model based variable selection algorithm for integrated analysis. In addition, we compared with the traditional statistics method for uni-variant data analysis (Chi-squared test for SNP data and ANOVA for fMRI data). Results showed that our proposed SRVS method can identify novel biomarkers that show stronger capability in distinguishing schizophrenia patients from healthy controls. Moreover, better classification ratios were achieved using biomarkers from both types of data, suggesting the importance of integrative analysis. Copyright © 2014 Elsevier Inc. All rights reserved.
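
    A hedged, simplified sketch of the weighted multi-modality selection idea (not the published GSM/SRVS code): per-modality weighting factors scale each data block before a lasso-style selection, and the selected variables are then checked with ten-fold cross-validation. All data, weights, and the regularization strength below are synthetic assumptions.

```python
# Hedged sketch: weight two modalities, select variables with a lasso, then evaluate
# the selected set with a cross-validated classifier.
import numpy as np
from sklearn.linear_model import Lasso, LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 208
snps = rng.integers(0, 3, size=(n, 1000)).astype(float)    # stand-in for SNP genotypes
fmri = rng.normal(size=(n, 2000))                           # stand-in for fMRI voxel values

# Synthetic phenotype driven by one SNP and one voxel.
signal = 0.8 * snps[:, 0] + 0.6 * fmri[:, 0] + rng.normal(0.0, 1.0, n)
y = (signal > np.median(signal)).astype(int)                # case/control labels

w_snp, w_fmri = 0.7, 0.3                                     # modality weighting factors
X = np.hstack([w_snp * snps, w_fmri * fmri])

lasso = Lasso(alpha=0.01).fit(X, y)
selected = np.flatnonzero(lasso.coef_)                       # candidate biomarkers
if selected.size:
    acc = cross_val_score(LogisticRegression(max_iter=1000),
                          X[:, selected], y, cv=10).mean()
    print(f"{selected.size} variables selected; 10-fold CV accuracy = {acc:.2f}")
```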

  8. Study to evaluate the integration of a mass spectrometer with a wet chemistry instrument. [for amino acid analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The characteristics and performance capability of the current Viking '75 Gas Chromatograph/Mass Spectrometer Instrument are reviewed and documented for the purpose of possible integration with a wet chemistry instrument. Interface, high mass discrimination, and vacuum requirements were determined in a simulated flight investigation. Suggestions for future investigations, tradeoff studies, and design modifications are presented, along with the results of column bleed measurements. A preliminary design of an integrated Wet Chemistry/Mass Spectrometer instrument for amino acid analysis is shown, including estimates of additional weight, volume, and power requirements.

  9. Study of component technologies for fuel cell on-site integrated energy system. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Lee, W. D.; Mathias, S.

    1980-01-01

    This database catalogue was compiled to facilitate the analysis of various on-site integrated energy systems with fuel cell power plants. The catalogue is divided into two sections. The first characterizes individual components in terms of their performance profiles as a function of design parameters. The second characterizes total heating and cooling systems in terms of energy output as a function of input and control variables. Diagrams of the integrated fuel cell systems and the computer analysis of the systems are included, as well as the cash flow series for the baseline systems.

  10. Data Integration Framework Data Management Plan Remote Sensing Dataset

    DTIC Science & Technology

    2016-07-01

    The work was performed by the Coastal Observations and Analysis Branch (CEERD-HFA) of the Flood and Storm Protection Division (CEERD-HF), U.S. Army Engineer Research…; U.S. Army Corps of Engineers, Mobile District (CESAM; CESAM-OP-J). Report ERDC/CHL SR-16-2: Coastal Ocean Data Systems Program, Data Integration Framework Data Management Plan, Remote Sensing Dataset.

  11. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. In doing so, IDA addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of results from different diagnostics. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects, and the comparison of different physics models. Applications include outlier detection, background discrimination, model assessment, and the design of diagnostics. In order to cope with next-step fusion device requirements, appropriate techniques are explored for fast analysis applications.
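
    A minimal worked illustration of the combination step (the full IDA framework is far richer; the numbers below are purely illustrative): two diagnostics measuring the same quantity with Gaussian uncertainties combine, under a flat prior, into a precision-weighted posterior.

```python
# Minimal sketch of Bayesian combination of two diagnostics' measurements of the
# same quantity: the joint posterior mean is the precision-weighted average.
import numpy as np

measurements = np.array([3.2, 2.9])     # e.g. density from two diagnostics (1e19 m^-3)
sigmas = np.array([0.3, 0.15])          # their reported 1-sigma uncertainties

precisions = 1.0 / sigmas**2
posterior_mean = np.sum(precisions * measurements) / np.sum(precisions)
posterior_sigma = np.sqrt(1.0 / np.sum(precisions))
print(f"combined estimate: {posterior_mean:.2f} +/- {posterior_sigma:.2f}")
```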

  12. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380
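
    A purely illustrative sketch of the connector idea described above: a connector wraps access to one data source and applies a transformation rule so that exchanged records share a common meaning. The class, field names, and transformation below are hypothetical, not the paper's API.

```python
# Hedged sketch: connectors fetch from heterogeneous sources and map source-specific
# terms onto a shared term before integrated analysis.
import math
from dataclasses import dataclass
from typing import Callable, Dict, List

Record = Dict[str, float]

@dataclass
class Connector:
    fetch: Callable[[], List[Record]]        # access to one data source
    transform: Callable[[Record], Record]    # rule mapping source terms to the reference term

    def records(self) -> List[Record]:
        return [self.transform(r) for r in self.fetch()]

# Source A already reports log2 expression ratios; source B reports raw intensities.
source_a = Connector(fetch=lambda: [{"log_ratio": 1.3}],
                     transform=lambda r: {"expression_log2": r["log_ratio"]})
source_b = Connector(fetch=lambda: [{"intensity": 512.0}],
                     transform=lambda r: {"expression_log2": math.log2(r["intensity"])})

integrated = source_a.records() + source_b.records()
print(integrated)   # [{'expression_log2': 1.3}, {'expression_log2': 9.0}]
```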

  13. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.

  14. CFD in the context of IHPTET - The Integrated High Performance Turbine Engine Technology Program

    NASA Technical Reports Server (NTRS)

    Simoneau, Robert J.; Hudson, Dale A.

    1989-01-01

    The Integrated High Performance Turbine Engine Technology (IHPTET) Program is an integrated DOD/NASA technology program designed to double the performance capability of today's most advanced military turbine engines as we enter the twenty-first century. Computational Fluid Dynamics (CFD) is expected to play an important role in the design/analysis of specific configurations within this complex machine. In order to do this, a plan is being developed to ensure the timely impact of CFD on IHPTET. The developing philosophy of CFD in the context of IHPTET is discussed. The key elements in the developing plan and specific examples of state-of-the-art CFD efforts which are IHPTET turbine engine relevant are discussed.

  15. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
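
    As a hedged illustration of the first-order reliability idea behind fast probability integration (NESSUS/FPI itself uses more advanced algorithms; the limit state and values below are illustrative, not SSME data):

```python
# Sketch: for a linear limit state g = R - S with independent normal strength R and
# load effect S, the failure probability follows from the reliability index beta.
from math import sqrt
from scipy.stats import norm

mu_R, sigma_R = 480.0, 30.0      # strength mean / std (e.g. MPa)
mu_S, sigma_S = 350.0, 40.0      # load-effect mean / std (e.g. MPa)

beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
p_fail = norm.cdf(-beta)                                # first-order failure probability
print(f"beta = {beta:.2f}, P_f = {p_fail:.2e}")
```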

  16. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  17. Integrated water system simulation by considering hydrological and biogeochemical processes: model development, with parameter sensitivity and autocalibration

    NASA Astrophysics Data System (ADS)

    Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.

    2016-02-01

    Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve its simulation performance as more model functionalities are added, and to provide a scientific basis for implementation in integrated river basin management.
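
    The two skill scores quoted above are standard and easy to reproduce; a short sketch follows, with placeholder series rather than Shaying River data:

```python
# Sketch: Pearson correlation and Nash-Sutcliffe efficiency for simulated vs.
# observed daily runoff.
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(2)
observed = np.abs(rng.gamma(2.0, 5.0, 365))            # daily runoff, m^3/s (synthetic)
simulated = observed * 0.9 + rng.normal(0.0, 2.0, 365) # imperfect model output

r = np.corrcoef(observed, simulated)[0, 1]
nse = nash_sutcliffe(observed, simulated)
print(f"r = {r:.2f}, NSE = {nse:.2f}")
```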

  18. Integrative sparse principal component analysis of gene expression data.

    PubMed

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
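
    A hedged sketch of the sparse-PCA ingredient only: a thresholded power iteration produces a sparse leading loading for each dataset, with plain soft-thresholding standing in for the group and contrasted penalties of the actual iSPCA method. Data, threshold, and iteration count are synthetic assumptions.

```python
# Sketch: sparse leading loading per dataset via a soft-thresholded power iteration,
# then compare which variables are selected across datasets.
import numpy as np

def sparse_leading_loading(X, lam=0.3, n_iter=100):
    """Sparse, unit-norm leading loading of column-centred data X."""
    Xc = X - X.mean(axis=0)
    v = np.linalg.svd(Xc, full_matrices=False)[2][0]      # ordinary PCA start
    for _ in range(n_iter):
        w = Xc.T @ (Xc @ v)                               # power step on the covariance
        w = np.sign(w) * np.maximum(np.abs(w) - lam * np.abs(w).max(), 0.0)  # soft-threshold
        norm = np.linalg.norm(w)
        if norm == 0.0:
            break
        v = w / norm
    return v

rng = np.random.default_rng(3)
study1 = rng.normal(size=(60, 200))     # dataset 1: 60 samples x 200 genes
study2 = rng.normal(size=(80, 200))     # dataset 2: 80 samples, same genes
loadings = [sparse_leading_loading(X) for X in (study1, study2)]
shared = np.intersect1d(np.flatnonzero(loadings[0]), np.flatnonzero(loadings[1]))
print(f"{shared.size} genes selected in both studies")
```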

  19. Performance of third-year primary-care-track students in an integrated curriculum at Case Western Reserve University.

    PubMed

    Lewin, L O; Papp, K K; Hodder, S L; Workings, M G; Wolfe, L; Glover, P; Headrick, L A

    1999-01-01

    In 1994, Case Western Reserve University School of Medicine established a Primary Care Track (PCT) with an integrated curriculum as part of The Robert Wood Johnson Foundation's Generalist Physician Initiative. This study compared the performance of the first cohort of students to participate in the PCT third year with that of their classmates and determined student attitudes toward their experiences. The performances of 24 PCT and 81 traditional students on the Medical College Admission Test (MCAT) and the United States Medical Licensing Examination (USMLE) Step 1 and 2 were compared using analysis of variance. Grades on the six core clerkships were compared using chi-square analysis. Performances of the PCT students and a subset of traditional students on the generalist school's objective structured clinical exam (OSCE) were compared using multivariate analysis. The students reported their perceptions on a questionnaire. The traditional students had significantly higher scores on the physical science section of the MCAT and on the USMLE Step 1, but at the end of year three, their USMLE Step 2 scores did not differ. Grade distributions in the core clerkships did not differ, except in psychiatry, where the PCT students received honors significantly more often. The PCT students had a lower mean score on the internal medicine National Board of Medical Examiners shelf exam but performed better on the generalist OSCE exam. A majority of PCT students reported that they would choose the integrated third year again and recommend it to others.

  20. Aiding planning in air traffic control: an experimental investigation of the effects of perceptual information integration.

    PubMed

    Moertl, Peter M; Canning, John M; Gronlund, Scott D; Dougherty, Michael R P; Johansson, Joakim; Mills, Scott H

    2002-01-01

    Prior research examined how controllers plan in their traditional environment and identified various information uncertainties as detriments to planning. A planning aid was designed to reduce this uncertainty by perceptually representing important constraints. This included integrating spatial information on the radar screen with discrete information (planned sequences of air traffic). Previous research reported improved planning performance and decreased workload in the planning aid condition. The purpose of this paper was to determine the source of these performance improvements. Analysis of computer interactions using log-linear modeling showed that the planning interface led to less repetitive--but more integrated--information retrieval compared with the traditional planning environment. Ecological interface design principles helped explain how the integrated information retrieval gave rise to the performance improvements. Actual or potential applications of this research include the design and evaluation of interface automation that keeps users in active control by modification of perceptual task characteristics.

  1. 3D noise-resistant segmentation and tracking of unknown and occluded objects using integral imaging

    NASA Astrophysics Data System (ADS)

    Aloni, Doron; Jung, Jae-Hyun; Yitzhaky, Yitzhak

    2017-10-01

    Three dimensional (3D) object segmentation and tracking can be useful in various computer vision applications, such as object surveillance for security purposes and robot navigation. We present a method for 3D multiple-object tracking using computational integral imaging, based on accurate 3D object segmentation. The method does not employ object detection by motion analysis in a video as conventionally performed (such as background subtraction or block matching). This means that the movement properties do not significantly affect the detection quality. The object detection is performed by analyzing static 3D image data obtained through computational integral imaging. With regard to previous works that used integral imaging data in such a scenario, the proposed method performs the 3D tracking of objects without prior information about the objects in the scene, and it is found to be efficient under severe noise conditions.

  2. Verbal Neuropsychological Functions in Aphasia: An Integrative Model

    ERIC Educational Resources Information Center

    Vigliecca, Nora Silvana; Báez, Sandra

    2015-01-01

    A theoretical framework which considers the verbal functions of the brain under a multivariate and comprehensive cognitive model was statistically analyzed. A confirmatory factor analysis was performed to verify whether some recognized aphasia constructs can be hierarchically integrated as latent factors from a homogenously verbal test. The Brief…

  3. Task Values, Achievement Goals, and Interest: An Integrative Analysis

    ERIC Educational Resources Information Center

    Hulleman, Chris S.; Durik, Amanda M.; Schweigert, Shaun B.; Harackiewicz, Judith M.

    2008-01-01

    The research presented in this article integrates 3 theoretical perspectives in the field of motivation: expectancy-value, achievement goals, and interest. The authors examined the antecedents (initial interest, achievement goals) and consequences (interest, performance) of task value judgments in 2 learning contexts: a college classroom and a…

  4. Energy efficient engine: Propulsion system-aircraft integration evaluation

    NASA Technical Reports Server (NTRS)

    Owens, R. E.

    1979-01-01

    Flight performance and operating economics of future commercial transports utilizing the energy efficient engine were assessed as well as the probability of meeting NASA's goals for TSFC, DOC, noise, and emissions. Results of the initial propulsion systems aircraft integration evaluation presented include estimates of engine performance, predictions of fuel burns, operating costs of the flight propulsion system installed in seven selected advanced study commercial transports, estimates of noise and emissions, considerations of thrust growth, and the achievement-probability analysis.

  5. Clonal Integration Enhances the Performance of a Clonal Plant Species under Soil Alkalinity Stress

    PubMed Central

    Sun, Juanjuan; Chen, Jishan; Zhang, Yingjun

    2015-01-01

    Clonal plants have been shown to successfully survive in stressful environments, including salinity stress, drought and depleted nutrients, through clonal integration between original and subsequent ramets. However, relatively little is known about whether clonal integration can enhance the performance of clonal plants under alkalinity stress. We investigated the effect of clonal integration on the performance of a typical rhizomatous clonal plant, Leymus chinensis, using a factorial experimental design with four levels of alkalinity and two levels of rhizome connection treatments, connected (allowing integration) and severed (preventing integration). Clonal integration was estimated by comparing physiological and biomass features between the rhizome-connected and rhizome-severed treatments. We found that the rhizome-connected treatment increased the biomass, height and leaf water potential of subsequent ramets under high-alkalinity treatments but did not affect them under low-alkalinity treatments. However, the rhizome-connected treatment decreased the root biomass of subsequent ramets and did not influence the photosynthetic rates of subsequent ramets. The biomass of original ramets was reduced by the rhizome-connected treatment at the highest alkalinity level. These results suggest that clonal integration can increase the performance of clonal plants under alkalinity stress. Rhizome-connected plants showed dramatically increased survival of buds with negative effects on root weight, indicating that clonal integration influenced the resource allocation pattern of clonal plants. A cost-benefit analysis based on biomass measures showed that original and subsequent ramets significantly benefited from clonal integration under high alkalinity stress, indicating that clonal integration is an important adaptive strategy by which clonal plants can survive in locally alkaline soils. PMID:25790352

  6. TRU Waste Management Program. Cost/schedule optimization analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office Rockwell International (JIO/RI) during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions.

  7. Use of Controller Area Network (CAN) Data to Support Performance Testing

    DTIC Science & Technology

    2015-07-16

    examples below highlight some common CAN data that have been recorded and utilized for vehicle analysis. This is not an exhaustive list. 3.1 Vehicle...sensor integrated into the data acquisition system. The acceptable error for engine speed data used in a system performance analysis is typically...data the test engineer was able to determine that the system was not functioning properly, and which test runs were invalid for analysis purposes

  8. Operationally Efficient Propulsion System Study (OEPSS) Data Book. Volume 8; Integrated Booster Propulsion Module (BPM) Engine Start Dynamics

    NASA Technical Reports Server (NTRS)

    Kemp, Victoria R.

    1992-01-01

    A fluid-dynamic, digital-transient computer model of an integrated, parallel propulsion system was developed for the CDC mainframe and the SUN workstation computers. Since all STME component designs were used for the integrated system, computer subroutines were written characterizing the performance and geometry of all the components used in the system, including the manifolds. Three transient analysis reports were completed. The first report evaluated the feasibility of integrated engine systems in regards to the start and cutoff transient behavior. The second report evaluated turbopump out and combined thrust chamber/turbopump out conditions. The third report presented sensitivity study results in staggered gas generator spin start and in pump performance characteristics.

  9. Evaluation of hierarchical models for integrative genomic analyses.

    PubMed

    Denis, Marie; Tadesse, Mahlet G

    2016-03-01

    Advances in high-throughput technologies have led to the acquisition of various types of -omic data on the same biological samples. Each data type gives independent and complementary information that can explain the biological mechanisms of interest. While several studies performing independent analyses of each dataset have led to significant results, a better understanding of complex biological mechanisms requires an integrative analysis of different sources of data. Flexible modeling approaches, based on penalized likelihood methods and expectation-maximization (EM) algorithms, are studied and tested under various biological relationship scenarios between the different molecular features and their effects on a clinical outcome. The models are applied to genomic datasets from two cancer types in the Cancer Genome Atlas project: glioblastoma multiforme and ovarian serous cystadenocarcinoma. The integrative models lead to improved model fit and predictive performance. They also provide a better understanding of the biological mechanisms underlying patients' survival. Source code implementing the integrative models is freely available at https://github.com/mgt000/IntegrativeAnalysis along with example datasets and a sample R script applying the models to these data. The TCGA datasets used for analysis are publicly available at https://tcga-data.nci.nih.gov/tcga/tcgaDownload.jsp. Contact: marie.denis@cirad.fr or mgt26@georgetown.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Integrated Model for Performance Analysis of All-Optical Multihop Packet Switches

    NASA Astrophysics Data System (ADS)

    Jeong, Han-You; Seo, Seung-Woo

    2000-09-01

    The overall performance of an all-optical packet switching system is usually determined by two criteria, i.e., switching latency and packet loss rate. In some real-time applications, however, in which packets arriving later than a timeout period are discarded as loss, the packet loss rate becomes the most dominant criterion for system performance. Here we focus on evaluating the performance of all-optical packet switches in terms of the packet loss rate, which normally arises from the insufficient hardware or the degradation of an optical signal. Considering both aspects, we propose what we believe is a new analysis model for the packet loss rate that reflects the complicated interactions between physical impairments and system-level parameters. On the basis of the estimation model for signal quality degradation in a multihop path we construct an equivalent analysis model of a switching network for evaluating an average bit error rate. With the model constructed we then propose an integrated model for estimating the packet loss rate in three architectural examples of multihop packet switches, each of which is based on a different switching concept. We also derive the bounds on the packet loss rate induced by bit errors. Finally, it is verified through simulation studies that our analysis model accurately predicts system performance.
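
    One of the two loss sources above, bit errors from signal degradation, maps onto packet loss through a standard relation (not necessarily the paper's exact model); a short sketch:

```python
# A packet of L bits is lost if any of its bits is in error; for an average
# end-to-end bit error rate BER, the bit-error-induced loss rate is 1 - (1 - BER)^L.
ber = 1e-9               # average end-to-end bit error rate (illustrative)
packet_bits = 53 * 8     # e.g. a 53-byte cell
p_loss_bits = 1.0 - (1.0 - ber) ** packet_bits
print(f"bit-error-induced packet loss rate = {p_loss_bits:.2e}")
```

    The overall loss rate would additionally include the contention/hardware-limited component, which depends on the switch architecture.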

  11. A Composite Model for Employees' Performance Appraisal and Improvement

    ERIC Educational Resources Information Center

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  12. Edge Preserved Speckle Noise Reduction Using Integrated Fuzzy Filters

    PubMed Central

    Dewal, M. L.; Rohit, Manoj Kumar

    2014-01-01

    Echocardiographic images are inherently corrupted by speckle noise, which makes visual reading and analysis quite difficult. The multiplicative speckle noise masks the finer details necessary for diagnosis of abnormalities. A novel speckle reduction technique based on the integration of geometric, Wiener, and fuzzy filters is proposed and analyzed in this paper. The denoising applications of fuzzy filters are studied and analyzed along with 26 denoising techniques. It is observed that the geometric filter retains noise and, to address this issue, a Wiener filter is embedded into the geometric filter during the iteration process. The performance of the geometric-Wiener filter is further enhanced using fuzzy filters, and the proposed despeckling techniques are called integrated fuzzy filters. Fuzzy filters based on the moving average and median value are employed in the integrated fuzzy filters. The performance of the integrated fuzzy filters is tested on echocardiographic and synthetic images in terms of image quality metrics. It is observed that the performance parameters are highest for the integrated fuzzy filters in comparison to the fuzzy and geometric-fuzzy filters. Clinical validation reveals that the output images obtained using the geometric-Wiener, integrated fuzzy, nonlocal means, and detail-preserving anisotropic diffusion filters are acceptable. The necessary finer details are retained in the denoised echocardiographic images. PMID:27437499
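
    A hedged sketch of the filter-chaining idea only (not the authors' implementation): a Wiener stage followed by median and moving-average passes, which here stand in for the fuzzy median/moving-average filters; the test image and noise model are synthetic.

```python
# Sketch: despeckle a synthetic image with a Wiener filter, then median and
# moving-average smoothing stages, and report the mean squared error.
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, uniform_filter

rng = np.random.default_rng(4)
clean = np.tile(np.linspace(0.2, 0.8, 128), (128, 1))                    # synthetic test image
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)    # multiplicative noise

step1 = wiener(speckled, mysize=5)        # adaptive Wiener stage
step2 = median_filter(step1, size=3)      # median-based stage
denoised = uniform_filter(step2, size=3)  # moving-average stage

mse = np.mean((denoised - clean) ** 2)
print(f"MSE after chained filtering: {mse:.4f}")
```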

  13. Spiral Bevel Gear Damage Detection Using Decision Fusion Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Handschuh, Robert F.; Afjeh, Abdollah A.

    2002-01-01

    A diagnostic tool for detecting damage to spiral bevel gears was developed. Two different monitoring technologies, oil debris analysis and vibration, were integrated using data fusion into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual monitoring technologies. This diagnostic tool was evaluated by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spiral Bevel Gear Fatigue Rigs. Data was collected during experiments performed in this test rig when pitting damage occurred. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spiral bevel gears.
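
    A heavily simplified, purely illustrative sketch of the fusion idea: normalize an oil-debris indicator and a vibration indicator to a common scale and combine them into one damage index before thresholding. The weights, scales, threshold, and fusion rule below are assumptions, not the NASA tool's logic.

```python
# Sketch: weighted-average fusion of two normalized health indicators.
import numpy as np

oil_debris_mg = np.array([2.0, 8.0, 25.0, 60.0])     # cumulative debris mass per check (synthetic)
vibration_metric = np.array([3.1, 3.3, 4.5, 7.2])    # e.g. an FM4-like vibration feature (synthetic)

def normalize(x, lo, hi):
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

oil_idx = normalize(oil_debris_mg, lo=0.0, hi=50.0)
vib_idx = normalize(vibration_metric, lo=3.0, hi=7.0)

fused = 0.6 * oil_idx + 0.4 * vib_idx                  # weighted-average fusion rule
alerts = fused > 0.5                                   # declare pitting damage likely
print(fused.round(2), alerts)
```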

  14. Overview of the systems special investigation. [long duration exposure facility

    NASA Technical Reports Server (NTRS)

    Mason, James B.; Dursch, Harry; Edelman, Joel

    1992-01-01

    The Systems Special Investigation Group (SIG), formed by the Long Duration Exposure Facility (LDEF) Project Office to perform post flight analysis of systems hardware, was chartered to investigate the effects of the extended LDEF mission on both satellite and experiment systems and to coordinate and integrate all systems analysis performed in post flight investigations. Almost all of the top level functional testing of the active experiments has been completed, but many components are still under investigation by either the Systems SIG or individual experimenters. Results reported to date have been collected and integrated by the Systems SIG and an overview of the current results and the status of the Systems Investigation are presented in this paper.

  15. Integrated Analysis of Climate, Soil, Topography and Vegetative Growth in Iberian Viticultural Regions

    PubMed Central

    Fraga, Helder; Malheiro, Aureliano C.; Moutinho-Pereira, José; Cardoso, Rita M.; Soares, Pedro M. M.; Cancela, Javier J.; Pinto, Joaquim G.; Santos, João A.

    2014-01-01

    The Iberian viticultural regions are convened according to the Denomination of Origin (DO) and present different climates, soils, topography and management practices. All these elements influence the vegetative growth of different varieties throughout the peninsula, and are tied to grape quality and wine type. In the current study, an integrated analysis of climate, soil, topography and vegetative growth was performed for the Iberian DO regions, using state-of-the-art datasets. For climatic assessment, a categorized index, accounting for phenological/thermal development, water availability and grape ripening conditions was computed. Soil textural classes were established to distinguish soil types. Elevation and aspect (orientation) were also taken into account, as the leading topographic elements. A spectral vegetation index was used to assess grapevine vegetative growth and an integrated analysis of all variables was performed. The results showed that the integrated climate-soil-topography influence on vine performance is evident. Most Iberian vineyards are grown in temperate dry climates with loamy soils, presenting low vegetative growth. Vineyards in temperate humid conditions tend to show higher vegetative growth. Conversely, in cooler/warmer climates, lower vigour vineyards prevail and other factors, such as soil type and precipitation acquire more important roles in driving vigour. Vines in prevailing loamy soils are grown over a wide climatic diversity, suggesting that precipitation is the primary factor influencing vigour. The present assessment of terroir characteristics allows direct comparison among wine regions and may have great value to viticulturists, particularly under a changing climate. PMID:25251495

  16. Integrated analysis of climate, soil, topography and vegetative growth in Iberian viticultural regions.

    PubMed

    Fraga, Helder; Malheiro, Aureliano C; Moutinho-Pereira, José; Cardoso, Rita M; Soares, Pedro M M; Cancela, Javier J; Pinto, Joaquim G; Santos, João A

    2014-01-01

    The Iberian viticultural regions are convened according to the Denomination of Origin (DO) and present different climates, soils, topography and management practices. All these elements influence the vegetative growth of different varieties throughout the peninsula, and are tied to grape quality and wine type. In the current study, an integrated analysis of climate, soil, topography and vegetative growth was performed for the Iberian DO regions, using state-of-the-art datasets. For climatic assessment, a categorized index, accounting for phenological/thermal development, water availability and grape ripening conditions was computed. Soil textural classes were established to distinguish soil types. Elevation and aspect (orientation) were also taken into account, as the leading topographic elements. A spectral vegetation index was used to assess grapevine vegetative growth and an integrated analysis of all variables was performed. The results showed that the integrated climate-soil-topography influence on vine performance is evident. Most Iberian vineyards are grown in temperate dry climates with loamy soils, presenting low vegetative growth. Vineyards in temperate humid conditions tend to show higher vegetative growth. Conversely, in cooler/warmer climates, lower vigour vineyards prevail and other factors, such as soil type and precipitation acquire more important roles in driving vigour. Vines in prevailing loamy soils are grown over a wide climatic diversity, suggesting that precipitation is the primary factor influencing vigour. The present assessment of terroir characteristics allows direct comparison among wine regions and may have great value to viticulturists, particularly under a changing climate.

  17. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605
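
    "Integrated" nerve activity is typically obtained by full-wave rectifying the raw signal and smoothing it with a moving-time average; the sketch below illustrates that step and the left-to-right ratio mentioned above, using synthetic signals and an assumed window length rather than the study's recordings.

```python
# Sketch: rectify-and-integrate a raw nerve signal and compare sides as a ratio.
import numpy as np

fs = 2000                                    # sampling rate, Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(5)
burst_envelope = 1.0 + np.sin(2 * np.pi * 1.0 * t).clip(0)   # ~1 Hz respiratory bursts
raw_left = rng.normal(0.0, 1.0, t.size) * burst_envelope
raw_right = raw_left * 1.05                  # nearly symmetric activity

def integrate(raw, window_s=0.05):
    n = int(window_s * fs)
    kernel = np.ones(n) / n
    return np.convolve(np.abs(raw), kernel, mode="same")     # rectify + moving average

ratio = integrate(raw_left).mean() / integrate(raw_right).mean()
print(f"left/right integrated activity ratio = {ratio:.2f}")
```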

  18. Fully Integrated Microfluidic Device for Direct Sample-to-Answer Genetic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Grodzinski, Piotr

    Integration of microfluidics technology with DNA microarrays enables building complete sample-to-answer systems that are useful in many applications such as clinical diagnostics. In this chapter, a fully integrated microfluidic device [1] that consists of microfluidic mixers, valves, pumps, channels, chambers, heaters, and a DNA microarray sensor to perform DNA analysis of complex biological sample solutions is presented. This device can perform on-chip sample preparation (including magnetic bead-based cell capture, cell preconcentration and purification, and cell lysis) of complex biological sample solutions (such as whole blood), polymerase chain reaction, DNA hybridization, and electrochemical detection. A few novel microfluidic techniques were developed and employed. A micromixing technique based on a cavitation microstreaming principle was implemented to enhance target cell capture from whole blood samples using immunomagnetic beads. This technique was also employed to accelerate the DNA hybridization reaction. Thermally actuated paraffin-based microvalves were developed to regulate flows. Electrochemical pumps and thermopneumatic pumps were integrated on the chip to provide pumping of liquid solutions. The device is completely self-contained: no external pressure sources, fluid storage, mechanical pumps, or valves are necessary for fluid manipulation, thus eliminating possible sample contamination and simplifying device operation. Pathogenic bacteria detection from ~mL whole blood samples and single-nucleotide polymorphism analysis directly from diluted blood were demonstrated. The device provides a cost-effective solution to direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.

  19. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-103

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    2000-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-103. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-103 and the resulting effect on the Space Shuttle Program.

  20. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-91

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1998-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-91. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-91 and the resulting effect on the Space Shuttle Program.

  1. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-93

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1999-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-93. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis findings of Space Shuttle mission STS-93 and the resulting effect on the Space Shuttle Program.

  2. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-95

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1999-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-95. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-95 and the resulting effect on the Space Shuttle Program.

  3. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-90

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1998-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-90. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-90 and the resulting effect on the Space Shuttle Program.

  4. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-80

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Lin, Jill D.

    1997-01-01

    A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for Shuttle mission STS-80. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-80 and the resulting effect on the Space Shuttle Program.

  5. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-89

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1998-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-89. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-89 and the resulting effect on the Space Shuttle Program.

  6. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-112

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2002-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-112. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-112 and the resulting effect on the Space Shuttle Program.

  7. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-74

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Lin, Jill D.

    1996-01-01

    A debris/ice/thermal protection system (TPS) assessment and integrated photographic analysis was conducted for shuttle mission STS-74. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanner data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-74 and the resulting effect on the Space Shuttle Program.

  8. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-87

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1998-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-87. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-87 and the resulting effect on the Space Shuttle Program.

  9. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-96

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1999-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-96. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-96 and the resulting effect on the Space Shuttle Program.

  10. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-101

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    2000-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle Mission STS-101. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-101 and the resulting effect on the Space Shuttle Program.

  11. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-88

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    1999-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-88. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-88 and the resulting effect on the Space Shuttle Program.

  12. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-64 on 9 August 1994

    NASA Technical Reports Server (NTRS)

    Davis, J. Bradley; Bowen, Barry C.; Rivera, Jorge E.; Speece, Robert F.; Katnik, Gregory N.

    1994-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-64. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-64, and the resulting effect on the Space Shuttle Program.

  13. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-68

    NASA Technical Reports Server (NTRS)

    Rivera, Jorge E.; Bowen, Barry C.; Davis, J. Bradley; Speece, Robert F.

    1994-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-68. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-68, and the resulting effect on the Space Shuttle Program.

  14. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-111

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-111. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-111 and the resulting effect on the Space Shuttle Program.

  15. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-99

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    2000-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-99. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-99 and the resulting effect on the Space Shuttle Program.

  16. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-98

    NASA Technical Reports Server (NTRS)

    Speece, Robert F.

    2004-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle Mission STS-98. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-98 and the resulting effect on the Space Shuttle Program.

  17. Debris/ice/TPS assessment and integrated photographic analysis of shuttle mission STS-63

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for shuttle mission STS-63. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-63, and the resulting effect on the space shuttle program.

  18. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle mission STS-66

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1995-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-66. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Shuttle mission STS-66, and the resulting effect on the Space Shuttle Program.

  19. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-97

    NASA Technical Reports Server (NTRS)

    Rivera, Jorge E.; Kelly, J. David (Technical Monitor)

    2001-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-97. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-97 and the resulting effect on the Space Shuttle Program.

  20. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-86

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Lin, Jill D.

    1997-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-86. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-86 and the resulting effect on the Space Shuttle Program.

  1. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-100

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2004-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-100. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. The report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-100 and the resulting effect on the Space Shuttle Program.

  2. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-92

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.

    2000-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for Shuttle mission STS-92. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the External Tank were assessed by the use of computer programs and infrared scanned data during cryogenic loading of the vehicle, followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the debris/ice/thermal protection system conditions and integrated photographic analysis of Space Shuttle mission STS-92 and the resulting effect, if any, on the Space Shuttle Program.

  3. Debris/ice/TPS assessment and integrated photographic analysis of Shuttle Mission STS-65

    NASA Technical Reports Server (NTRS)

    Katnik, Gregory N.; Bowen, Barry C.; Davis, J. Bradley

    1994-01-01

    A debris/ice/thermal protection system assessment and integrated photographic analysis was conducted for shuttle mission STS-65. Debris inspections of the flight elements and launch pad were performed before and after launch. Icing conditions on the external tank were assessed by the use of computer programs, nomographs, and infrared scanner data during cryogenic loading of the vehicle followed by on-pad visual inspection. High speed photography of the launch was analyzed to identify ice/debris sources and evaluate potential vehicle damage and/or in-flight anomalies. This report documents the ice/debris/thermal protection system conditions and integrated photographic analysis of shuttle mission STS-65, and the resulting effect on the Space Shuttle Program.

  4. Performance analysis of three dimensional integral equation computations on a massively parallel computer. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Logan, Terry G.

    1994-01-01

    The purpose of this study is to investigate the performance of integral equation computations using a numerical source field-panel method in a massively parallel processing (MPP) environment. A comparative study of the computational performance of the MPP CM-5 computer and a conventional Cray-YMP supercomputer for a three-dimensional flow problem is made. A serial FORTRAN code is converted into a parallel CM-FORTRAN code. Performance results are obtained on the CM-5 with 32, 64, and 128 nodes along with those on the Cray-YMP with a single processor. The comparison indicates that the parallel CM-FORTRAN code nearly matches or outperforms the equivalent serial FORTRAN code in some cases.
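
    A trivial sketch of the kind of comparison reported: wall-clock times from several CM-5 partitions versus a single Cray processor, reduced to speedup factors. All timing values below are invented for illustration, not results from the thesis.

      # Hypothetical wall-clock times (seconds) for the panel-method code.
      cray_time = 120.0                                # serial FORTRAN, single Cray processor
      cm5_times = {32: 95.0, 64: 52.0, 128: 30.0}      # parallel CM-FORTRAN on CM-5 partitions

      for nodes, t in sorted(cm5_times.items()):
          # Speedup relative to the serial run; >1.0 means the parallel code outperforms it.
          print(f"CM-5, {nodes:3d} nodes: {t:6.1f} s, speedup vs. serial = {cray_time / t:.2f}x")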

  5. Development of an integrated configuration management/flight director system for piloted STOL approaches

    NASA Technical Reports Server (NTRS)

    Hoh, R. H.; Klein, R. H.; Johnson, W. A.

    1977-01-01

    A system analysis method for the development of an integrated configuration management/flight director system for IFR STOL approaches is presented. Curved descending decelerating approach trajectories are considered. Considerable emphasis is placed on satisfying the pilot centered requirements (acceptable workload) as well as the usual guidance and control requirements (acceptable performance). The Augmentor Wing Jet STOL Research Aircraft was utilized to allow illustration by example, and to validate the analysis procedure via manned simulation.

  6. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
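
    The generic dimension-reduction and clustering workflow described above can be sketched as follows. This uses scikit-learn on synthetic profile data and is not the PerfExplorer scripting API; the matrix shape, cluster count, and component count are assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # Hypothetical profile matrix: one row per MPI rank, one column per timed event (seconds).
      profiles = rng.gamma(shape=2.0, scale=1.0, size=(1024, 50))

      # Reduce the event dimensions, then cluster ranks with similar performance behavior.
      reduced = PCA(n_components=3).fit_transform(profiles)
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(reduced)

      for k in range(4):
          members = profiles[labels == k]
          print(f"cluster {k}: {len(members)} ranks, "
                f"mean total time {members.sum(axis=1).mean():.1f} s")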

  7. Integrated sudomotor axon reflex sweat stimulation for continuous sweat analyte analysis with individuals at rest.

    PubMed

    Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason

    2017-07-25

    Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants & sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those interested in on-skin sensing integrated with drug-delivery.

  8. CHESS (CgHExpreSS): a comprehensive analysis tool for the analysis of genomic alterations and their effects on the expression profile of the genome.

    PubMed

    Lee, Mikyung; Kim, Yangseok

    2009-12-16

    Genomic alterations frequently occur in many cancer patients and play important mechanistic roles in the pathogenesis of cancer. Furthermore, they can modify the expression level of genes due to altered copy number in the corresponding region of the chromosome. An accumulating body of evidence supports the possibility that strong genome-wide correlation exists between DNA content and gene expression. Therefore, more comprehensive analysis is needed to quantify the relationship between genomic alteration and gene expression. A well-designed bioinformatics tool is essential to perform this kind of integrative analysis. A few programs have already been introduced for integrative analysis. However, the published software has many limitations for comprehensive integrated analysis because of the algorithms and visualization modules it implements. To address this issue, we have implemented the Java-based program CHESS to allow integrative analysis of two experimental data sets: genomic alteration and genome-wide expression profile. CHESS is composed of a genomic alteration analysis module and an integrative analysis module. The genomic alteration analysis module detects genomic alteration by applying a threshold-based method or the SW-ARRAY algorithm and investigates whether the detected alteration is phenotype specific or not. On the other hand, the integrative analysis module measures the genomic alteration's influence on gene expression. It is divided into two separate parts. The first part calculates overall correlation between comparative genomic hybridization ratio and gene expression level by applying the following three statistical methods: simple linear regression, Spearman rank correlation and Pearson's correlation. In the second part, CHESS detects the genes that are differentially expressed according to the genomic alteration pattern with three alternative statistical approaches: Student's t-test, Fisher's exact test and the chi-square test. By successive operation of the two modules, users can clarify how gene expression levels are affected by phenotype-specific genomic alterations. As CHESS was developed in both Java application and web environments, it can be run on a web browser or a local machine. It also supports all experimental platforms if a properly formatted text file is provided to include the chromosomal position of probes and their gene identifiers. CHESS is a user-friendly tool for investigating disease specific genomic alterations and quantitative relationships between those genomic alterations and genome-wide gene expression profiling.
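
    The statistics listed above are standard; a hedged sketch of the same integrative steps on synthetic data (not the CHESS implementation itself) might look like the following, where the sample size, effect sizes, and the median split used to define alteration groups are all illustrative.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n_samples = 40
      # Hypothetical aCGH log2 ratios and a dosage-driven expression signal for one gene.
      cgh_log_ratio = rng.normal(0.0, 0.4, n_samples)
      expression = 2.0 + 1.5 * cgh_log_ratio + rng.normal(0.0, 0.5, n_samples)

      # Overall correlation between copy-number ratio and expression level.
      pearson_r, pearson_p = stats.pearsonr(cgh_log_ratio, expression)
      spearman_r, spearman_p = stats.spearmanr(cgh_log_ratio, expression)
      slope, intercept, r_value, p_value, stderr = stats.linregress(cgh_log_ratio, expression)

      # Differential expression between samples above and below the median ratio (split is illustrative).
      high = expression[cgh_log_ratio > np.median(cgh_log_ratio)]
      low = expression[cgh_log_ratio <= np.median(cgh_log_ratio)]
      t_stat, t_p = stats.ttest_ind(high, low, equal_var=False)

      print(f"Pearson r={pearson_r:.2f} (p={pearson_p:.3g}), Spearman rho={spearman_r:.2f}")
      print(f"regression slope={slope:.2f}, t-test p={t_p:.3g}")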

  9. Analysis of experimental results of the inlet for the NASA hypersonic research engine aerothermodynamic integration model. [wind tunnel tests of ramjet engine hypersonic inlets

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Mackley, E. A.

    1976-01-01

    An aerodynamic engine inlet analysis was performed on the experimental results obtained at nominal Mach numbers of 5, 6, and 7 from the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM). Incorporation of the mixed-compression inlet design on the AIM represented the final phase of the inlet development program of the HRE Project. The purpose of this analysis was to compare the AIM inlet experimental results with theoretical results. Experimental performance was based on measured surface pressures used in a one-dimensional force-momentum theorem. Results of the analysis indicate that surface static-pressure measurements agree reasonably well with theoretical predictions except in the regions where the theory predicts large pressure discontinuities. Experimental and theoretical results, both based on the one-dimensional force-momentum theorem, yielded inlet performance parameters as functions of Mach number that exhibited reasonable agreement. Previous predictions of inlet unstart resulting from pressure disturbances created by fuel injection and combustion appeared to be pessimistic.
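
    A minimal sketch of the one-dimensional force-momentum bookkeeping of the kind described: the change in stream thrust between two stations is compared with the axial force obtained by integrating measured wall pressures. All pressures, areas, and flow quantities below are assumed for illustration only, not AIM data.

      import numpy as np

      # Hypothetical wall static pressures (Pa) and axial projected areas (m^2) between taps.
      wall_pressure = np.array([2.0e3, 3.5e3, 5.0e3, 7.5e3, 9.0e3])
      projected_area = np.array([0.010, 0.012, 0.008, 0.006, 0.004])
      axial_pressure_force = np.sum(wall_pressure * projected_area)   # N, wall force on the flow

      # One-dimensional stream thrust F = p*A + m_dot*V at the entrance and throat stations.
      m_dot = 2.5                              # kg/s, captured mass flow (assumed)
      p1, A1, V1 = 1.5e3, 0.060, 1800.0        # entrance station (assumed)
      p2, A2, V2 = 9.5e3, 0.015, 1200.0        # throat station (assumed)
      stream_thrust_change = (p2 * A2 + m_dot * V2) - (p1 * A1 + m_dot * V1)

      # Ignoring friction, the stream thrust change should roughly balance the wall pressure force.
      print(f"integrated wall pressure force: {axial_pressure_force:.1f} N")
      print(f"stream thrust change (station 1 -> 2): {stream_thrust_change:.1f} N")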

  10. FY 1987 current fiscal year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This Current Year Work Plan presents a detailed description of the activities to be performed by the Joint Integration Office during FY87. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance, task monitoring, information gathering and task reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of program status reports for DOE. Program Analysis is performed by the JIO to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. This work plan includes: system analyses, requirements analyses, interim and procedure development, legislative and regulatory analyses, dispatch and traffic analyses, and data bases.

  11. Halbach array-based design and simulation of disc coreless permanent-magnet integrated starter generator

    NASA Astrophysics Data System (ADS)

    Li, Y. B.; Yang, Z. X.; Chen, W.; He, Q. Y.

    2017-11-01

    Functional performance, such as magnetic flux leakage, power density and efficiency, is related to the structural characteristics and design technique of disc permanent magnet synchronous generators (PMSGs). In this paper, a Halbach array theory-based magnetic circuit structure is developed, and a Maxwell3D simulation analysis approach is proposed for a PMSG used as an integrated starter generator (ISG). The magnetization directions of adjacent permanent magnets differ by 45 degrees so that the flux is focused toward the air-gap side, improving the performance of the generator. The magnetic field distribution and functional performance under load and no-load conditions are simulated with the Maxwell3D module. The proposed approach is verified by simulation analysis: the air-gap flux density is 0.66 T, and at rated speed the phase voltage of the disc coreless PMSG is nearly sinusoidal with an amplitude of 335 V, which meets the design requirements. The developed magnetic circuit structure can therefore be used for the engineering design of disc coreless PMSGs for integrated starter generators.

  12. Analysis of thermodynamics of two-fuel power unit integrated with a carbon dioxide separation plant

    NASA Astrophysics Data System (ADS)

    Kotowicz, Janusz; Bartela, Łukasz; Mikosz, Dorota

    2014-12-01

    The article presents the results of a thermodynamic analysis of a supercritical coal-fired power plant with a gross electrical output of 900 MW and a pulverized coal boiler. This unit is integrated with an absorption-based CO2 separation installation. The heat required for the desorption process is supplied by a gas turbine system. Analyses were performed for two variants of the system. In the first variant, the gas turbine is supplemented by an evaporator powered by the exhaust gases leaving the gas turbine expander. The second, expanded variant assumes a gas turbine combined cycle with a heat recovery steam generator and a backpressure steam turbine. The method of determining the efficiency of electricity generation and the other indicators defined to assess the energy performance of the analyzed unit is presented. The gas turbine system was sized to meet the heat demand of the desorption unit, taken as 4 MJ/kg CO2. The results obtained for both variants of the installation integrated with the CO2 separation plant were compared with the results for the unit without separation.
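
    As a back-of-the-envelope illustration of how the stated 4 MJ/kg CO2 heat demand sizes the gas turbine system, the sketch below combines that figure with an emission factor and capture rate that are assumptions, not values from the article.

      gross_power_mw = 900.0        # MW, gross output of the coal unit (as stated)
      specific_emissions = 0.76     # kg CO2 per kWh, assumed for a supercritical coal unit
      capture_rate = 0.90           # assumed capture fraction
      heat_demand = 4.0             # MJ per kg CO2 captured (as stated)

      # Captured CO2 mass flow and the resulting desorption heat duty.
      co2_flow = gross_power_mw * 1000.0 * specific_emissions * capture_rate / 3600.0  # kg/s
      heat_duty_mw = co2_flow * heat_demand                                            # MJ/s = MW_th

      print(f"captured CO2 flow: {co2_flow:.1f} kg/s")
      print(f"desorption heat duty: {heat_duty_mw:.0f} MW_th")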

  13. QQACCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, Douglas

    2015-01-01

    batchacct provides convenient library and command-line access to batch system accounting data for GridEngine and SLURM schedulers. It can be used to perform queries useful for data analysis of the accounting data alone or for integrative analysis in the context of a larger query.
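
    A sketch of the kind of accounting query such a tool performs, shown here directly against SLURM's sacct command rather than the batchacct API; the field list, start date, and aggregation are illustrative choices, not part of the package description.

      import subprocess
      from collections import Counter

      # Pull pipe-delimited accounting records for all users since an (arbitrary) start date.
      cmd = [
          "sacct", "--allusers", "--noheader", "--parsable2",
          "--starttime", "2015-01-01",
          "--format", "JobID,User,Partition,Elapsed,State",
      ]
      out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

      # Simple integrative summary: count jobs by final state.
      state_counts = Counter()
      for line in out.splitlines():
          if not line:
              continue
          jobid, user, partition, elapsed, state = line.split("|")[:5]
          key = state.split()[0] if state.strip() else "UNKNOWN"   # normalize "CANCELLED by <uid>"
          state_counts[key] += 1

      for state, n in state_counts.most_common():
          print(f"{state:12s} {n}")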

  14. Statewide crash analysis and forecasting.

    DOT National Transportation Integrated Search

    2008-11-20

    There is a need for the development of safety analysis tools to allow Penn DOT to better assess the safety performance of road segments in the Commonwealth. The project utilized a safety management system database at Penn DOT that integrates crash,...

  15. A CAD Approach to Integrating NDE With Finite Element

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Downey, James; Ghosn, Louis J.; Baaklini, George Y.

    2004-01-01

    Nondestructive evaluation (NDE) is one of several technologies applied at NASA Glenn Research Center to determine atypical deformities, cracks, and other anomalies experienced by structural components. NDE consists of applying high-quality imaging techniques (such as x-ray imaging and computed tomography (CT)) to discover hidden manufactured flaws in a structure. Efforts are in progress to integrate NDE with the finite element (FE) computational method to perform detailed structural analysis of a given component. This report presents the core outlines for an in-house technical procedure that incorporates this combined NDE-FE interrelation. An example is presented to demonstrate the applicability of this analytical procedure. FE analysis of a test specimen is performed, and the resulting von Mises stresses and the stress concentrations near the anomalies are observed, which indicates the fidelity of the procedure. Additional information elaborating on the steps needed to perform such an analysis is clearly presented in the form of mini step-by-step guidelines.

  16. Energy Analysis Publications | Energy Analysis | NREL

    Science.gov Websites

    NREL web page describing its energy analysis publications. The group performs impact analysis to evaluate and understand impacts, and complex systems analysis that integrates the effects of markets, policies, and financing on technology uptake and the impact of new technologies on markets and policy.

  17. Design integration and noise studies for jet STOL aircraft. Volume 1: Program summary

    NASA Technical Reports Server (NTRS)

    Okeefe, V. O.; Kelley, G. S.

    1972-01-01

    This program was undertaken to develop, through analysis, design, experimental static testing, wind tunnel testing, and design integration studies, an augmentor wing jet flap configuration for a jet STOL transport aircraft having maximum propulsion and aerodynamic performance with minimum noise generation. The program had three basic elements: (1) static testing of a scale wing section to demonstrate augmentor performance and noise characteristics; (2) two-dimensional wind tunnel testing to determine flight speed effects on performance; and (3) system design and evaluation which integrated the augmentor information obtained into a complete system and ensured that the design was compatible with the requirements for a large STOL transport having a 500-ft sideline noise of 95 PNdB or less. This objective has been achieved.

  18. Building integrated semi-transparent photovoltaics: energy and daylighting performance

    NASA Astrophysics Data System (ADS)

    Kapsis, Konstantinos; Athienitis, Andreas K.

    2011-08-01

    This paper focuses on modeling and evaluation of semi-transparent photovoltaic technologies integrated into a cooling-dominated office building façade by employing the concept of a three-section façade. An energy simulation model is developed, using building simulation software, to investigate the effect of semi-transparent photovoltaic transmittance on the energy performance of an office in a typical office building in Montreal. The analysis is performed for five major façade orientations and two façade configurations. Using semi-transparent photovoltaics integrated into the office façade, electricity savings of up to 53.1% can be achieved compared to a typical office equipped with double glazing with argon filling and a low-emissivity coating, and lighting controlled based on occupancy and daylight levels.

  19. Principles of Design for High Performing Organizations: An Assessment of the State of the Field of Organizational Design Research

    DTIC Science & Technology

    1994-03-01

    asked whether the planned structure considered (a) all objectives, (b) all functions, (c) all relevant units of analysis such as the plant, the ... literature and provides an integrative model of design for high performing organizations. The model is based on an analysis of current theories of ... important midrange theories underlie much of the work on organizational analysis. Systems Approaches: these approaches emphasize the rational, goal

  20. In situ visualization and data analysis for turbidity currents simulation

    NASA Astrophysics Data System (ADS)

    Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.

    2018-01-01

    Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a provenance-based solution that extracts and relates strategic simulation data in transit from multiple data sources for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the sediment appearance at runtime and steering the simulation based on solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

  1. Identification of candidate genes in osteoporosis by integrated microarray analysis.

    PubMed

    Li, J J; Wang, B Q; Fei, Q; Yang, Y; Li, D

    2016-12-01

    In order to screen the altered gene expression profile in peripheral blood mononuclear cells of patients with osteoporosis, we performed an integrated analysis of the online microarray studies of osteoporosis. We searched the Gene Expression Omnibus (GEO) database for microarray studies of peripheral blood mononuclear cells in patients with osteoporosis. Subsequently, we integrated gene expression data sets from multiple microarray studies to obtain differentially expressed genes (DEGs) between patients with osteoporosis and normal controls. Gene function analysis was performed to uncover the functions of identified DEGs. A total of three microarray studies were selected for integrated analysis. In all, 1125 genes were found to be significantly differentially expressed between osteoporosis patients and normal controls, with 373 upregulated and 752 downregulated genes. Positive regulation of the cellular amine metabolic process (gene ontology (GO): 0033240, false discovery rate (FDR) = 1.00E+00) was significantly enriched under the GO category for biological processes, while for molecular functions, flavin adenine dinucleotide binding (GO: 0050660, FDR = 3.66E-01) and androgen receptor binding (GO: 0050681, FDR = 6.35E-01) were significantly enriched. DEGs were enriched in many osteoporosis-related signalling pathways, including those of mitogen-activated protein kinase (MAPK) and calcium. Protein-protein interaction (PPI) network analysis showed that the significant hub proteins included ubiquitin specific peptidase 9, X-linked (Degree = 99), ubiquitin specific peptidase 19 (Degree = 57) and ubiquitin conjugating enzyme E2 B (Degree = 57). Analysis of the functions of the identified differentially expressed genes may expand our understanding of fundamental mechanisms leading to osteoporosis. Moreover, significantly enriched pathways, such as MAPK and calcium, may be involved in osteoporosis through osteoblastic differentiation and bone formation.Cite this article: J. J. Li, B. Q. Wang, Q. Fei, Y. Yang, D. Li. Identification of candidate genes in osteoporosis by integrated microarray analysis. Bone Joint Res 2016;5:594-601. DOI: 10.1302/2046-3758.512.BJR-2016-0073.R1. © 2016 Fei et al.
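
    The per-gene differential-expression step behind such an integrated analysis can be sketched on synthetic data as follows (Welch t-tests followed by Benjamini-Hochberg FDR control). Gene counts, group sizes, and effect sizes are invented; this is not the pipeline used in the study.

      import numpy as np
      from scipy import stats
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(2)
      n_genes, n_cases, n_controls = 5000, 12, 12
      controls = rng.normal(0.0, 1.0, size=(n_genes, n_controls))
      cases = rng.normal(0.0, 1.0, size=(n_genes, n_cases))
      cases[:200] += 1.0                      # plant 200 truly shifted genes for illustration

      # Per-gene Welch t-test across samples, then multiple-testing correction.
      t_stat, p_val = stats.ttest_ind(cases, controls, axis=1, equal_var=False)
      reject, q_val, _, _ = multipletests(p_val, alpha=0.05, method="fdr_bh")

      print(f"genes called differentially expressed at 5% FDR: {reject.sum()}")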

  2. SMAUMAT_ITI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannetti, C.; Becker, R.

    The software is an ABAQUS/Standard UMAT (user-defined material behavior subroutine) that implements the constitutive model for shape-memory alloy materials developed by Jannetti et al. (2003a), using a fully implicit time integration scheme to integrate the constitutive equations. The UMAT is used in conjunction with ABAQUS/Standard to perform finite-element analysis of SMA materials.

  3. A Study on Aircraft Engine Control Systems for Integrated Flight and Propulsion Control

    NASA Astrophysics Data System (ADS)

    Yamane, Hideaki; Matsunaga, Yasushi; Kusakawa, Takeshi

    A flyable FADEC system engineering model incorporating the Integrated Flight and Propulsion Control (IFPC) concept is developed for a highly maneuverable aircraft and a fighter-class engine. An overview of the FADEC system and the functional assignments for its components, such as the Engine Control Unit (ECU) and the Integrated Control Unit (ICU), is given. Overall system reliability analysis, convex analysis and multivariable controller design for the engine, fault detection/redundancy management, and response characteristics of a fuel system are addressed. The engine control performance of the FADEC is demonstrated by hardware-in-the-loop simulation for fast acceleration and thrust transient characteristics.

  4. Alterations in White Matter Integrity in Young Adults with Smartphone Dependence

    PubMed Central

    Hu, Yuanming; Long, Xiaojing; Lyu, Hanqing; Zhou, Yangyang; Chen, Jianxiang

    2017-01-01

    Smartphone dependence (SPD) is increasingly regarded as a psychological problem; however, the underlying neural substrates of SPD are still not clear. High resolution magnetic resonance imaging provides a useful tool to help understand and manage the disorder. In this study, a tract-based spatial statistics (TBSS) analysis of diffusion tensor imaging (DTI) was used to measure white matter integrity in young adults with SPD. A total of 49 subjects were recruited and categorized into SPD and control groups based on their clinical behavioral tests. To localize regions with abnormal white matter integrity in SPD, voxel-wise analysis of fractional anisotropy (FA) and mean diffusivity (MD) over the whole brain was performed by TBSS. Correlations between the quantitative measures of brain structure and the behavioral measures were also computed. Our results demonstrated that the SPD group had significantly lower white matter integrity than controls in the superior longitudinal fasciculus (SLF), superior corona radiata (SCR), internal capsule, external capsule, sagittal stratum, fornix/stria terminalis and midbrain structures. Correlation analysis showed that the observed abnormalities in the internal capsule and stria terminalis were correlated with the severity of dependence and the behavioral assessments. Our findings provide a preliminary understanding of white matter characteristics in SPD and indicate that the structural deficits might be linked to behavioral impairments. PMID:29163108
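
    A hedged sketch of the ROI-level correlation step described above: relating mean FA in a tract of interest to a dependence-severity score across subjects. The FA values, score range, and sample relationship are invented for illustration only.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n_subjects = 49
      # Hypothetical behavioral dependence scores and mean FA values in one ROI.
      dependence_score = rng.uniform(20, 60, n_subjects)
      mean_fa_internal_capsule = (0.55 - 0.002 * dependence_score
                                  + rng.normal(0.0, 0.01, n_subjects))

      # Correlate tract integrity with symptom severity across subjects.
      r, p = stats.pearsonr(dependence_score, mean_fa_internal_capsule)
      print(f"FA vs dependence severity: r = {r:.2f}, p = {p:.3g}")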

  5. Analysis of the clonal repertoire of gene-corrected cells in gene therapy.

    PubMed

    Paruzynski, Anna; Glimm, Hanno; Schmidt, Manfred; Kalle, Christof von

    2012-01-01

    Gene therapy-based clinical phase I/II studies using integrating retroviral vectors could successfully treat different monogenic inherited diseases. However, with the increased efficiency of this therapy, severe side effects occurred in various gene therapy trials. In all cases, integration of the vector close to or within a proto-oncogene contributed substantially to the development of the malignancies. Thus, in-depth analysis of integration site patterns is of high importance to uncover potential clonal outgrowth and to assess the safety of gene transfer vectors and gene therapy protocols. Standard linear amplification-mediated PCR (LAM-PCR) and nonrestrictive LAM-PCR (nrLAM-PCR), in combination with high-throughput sequencing, make it possible to comprehensively analyze the clonal repertoire of gene-corrected cells and to assess the safety of the vector system at an early stage on the molecular level. This enables clarifying the biological consequences of the vector system for the fate of the transduced cell. Furthermore, downstream real-time PCR allows a quantitative estimation of the clonality of individual cells and their clonal progeny. Here, we present a guideline that should allow researchers to perform comprehensive integration site analysis in preclinical and clinical studies. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Systematic analysis of signaling pathways using an integrative environment.

    PubMed

    Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard

    2007-01-01

    Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment with comprehensive capabilities. The environment should include tools for pathway design, visualization, and simulation, together with a knowledge base concerning signaling pathways. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization, and simulation, and a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure, and its functioning. The system is designed with a client-server architecture. It contains a pathway design environment and a simulation environment as upper layers, with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework and was also useful for defining the structure of the knowledge base. Sensitivity analysis of this specific pathway was performed, providing simulation data. The model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us to perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.
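
    A minimal sketch of the kind of sensitivity analysis mentioned above, using a toy two-state activation model as a stand-in for the TNFa/NF-kB model: a rate constant is perturbed and the change in a model output is estimated by central differences. All rate constants, initial conditions, and the chosen output are illustrative.

      import numpy as np
      from scipy.integrate import solve_ivp

      def model(t, y, k_act, k_deg):
          # Toy activation/deactivation kinetics between an inactive and an active species.
          inactive, active = y
          d_inactive = -k_act * inactive + k_deg * active
          d_active = k_act * inactive - k_deg * active
          return [d_inactive, d_active]

      def peak_active(k_act, k_deg=0.1):
          sol = solve_ivp(model, (0.0, 50.0), [1.0, 0.0], args=(k_act, k_deg))
          return sol.y[1].max()

      # Central-difference sensitivity of the peak active fraction to the activation rate.
      k0, dk = 0.5, 0.05
      sensitivity = (peak_active(k0 + dk) - peak_active(k0 - dk)) / (2 * dk)
      print(f"d(peak active)/d(k_act) ~ {sensitivity:.3f}")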

  7. An Integrated Solution for Performing Thermo-fluid Conjugate Analysis

    NASA Technical Reports Server (NTRS)

    Kornberg, Oren

    2009-01-01

    A method has been developed which integrates a fluid flow analyzer and a thermal analyzer to produce both steady state and transient results for 1-D, 2-D, and 3-D analysis models. The Generalized Fluid System Simulation Program (GFSSP) is a one-dimensional, general purpose fluid analysis code which computes pressures and flow distributions in complex fluid networks. The MSC Systems Improved Numerical Differencing Analyzer (MSC.SINDA) is a one-dimensional general purpose thermal analyzer that solves network representations of thermal systems. Both GFSSP and MSC.SINDA have graphical user interfaces which are used to build the respective models and prepare them for analysis. The SINDA/GFSSP Conjugate Integrator (SGCI) is a form-based graphical integration program used to set input parameters for the conjugate analyses and run the models. This paper describes SGCI and its thermo-fluid conjugate analysis techniques and capabilities by presenting results from example models, including the cryogenic chilldown of a copper pipe, a bar between two walls in a fluid stream, and a solid plate creating a phase change in a flowing fluid.
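
    A conceptual sketch of an explicit conjugate coupling loop like the chilldown example mentioned above, in which a fluid-side heat balance and a lumped thermal-node update exchange wall temperature and heat flux each time step. All properties and boundary conditions are assumed; this is not SGCI, GFSSP, or SINDA code.

      # Lumped copper pipe segment being chilled by a cold stream (all values assumed).
      m_wall, cp_wall = 0.5, 385.0          # kg, J/(kg K) for copper
      T_wall, T_fluid = 300.0, 90.0         # K, initial wall and inlet fluid temperatures
      h, area = 800.0, 0.05                 # W/(m^2 K), m^2 wetted area
      m_dot, cp_fluid = 0.02, 2000.0        # kg/s, J/(kg K) for the coolant
      dt = 0.5                              # s, explicit coupling time step

      for step in range(2400):              # 20 minutes of transient
          q = h * area * (T_wall - T_fluid)            # W, wall-to-fluid heat rate ("fluid" side)
          T_out = T_fluid + q / (m_dot * cp_fluid)     # segment outlet fluid temperature
          T_wall -= q * dt / (m_wall * cp_wall)        # lumped thermal-node update ("thermal" side)

      print(f"wall temperature after {2400 * dt:.0f} s: {T_wall:.1f} K")
      print(f"final segment outlet temperature: {T_out:.1f} K")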

  8. Geographic integration of hepatitis C virus: A global threat.

    PubMed

    Daw, Mohamed A; El-Bouzedi, Abdallah A; Ahmed, Mohamed O; Dau, Aghnyia A; Agnan, Mohamed M; Drah, Aisha M

    2016-11-12

    To assess hepatitis C virus (HCV) geographic integration, evaluate the spatial and temporal evolution of HCV worldwide, and propose how to diminish its burden, a literature search of published articles was performed using PubMed, MEDLINE and other related databases up to December 2015. A critical data assessment and analysis regarding the epidemiological integration of HCV was carried out using the meta-analysis method. The data indicate that HCV has integrated immensely over time and across various geographical regions worldwide. The history of HCV goes back to 1535, but between 1935 and 1965 it exhibited a rapid, exponential spread. This integration is clearly seen in the geo-epidemiology and phylogeography of HCV. HCV integration can be viewed as either intra-continental or trans-continental. Migration, drug trafficking and HCV co-infection, together with other potential risk factors, have acted as vehicles for this integration. Evidence shows that the geographic integration of HCV has been important in the global and regional distribution of HCV. HCV geographic integration is clearly evident and this should be reflected in the prevention and treatment of this ongoing pandemic.

  9. Training Needs Analysis: Weaknesses in the Conventional Approach.

    ERIC Educational Resources Information Center

    Leat, Michael James; Lovel, Murray Jack

    1997-01-01

    Identification of the training and development needs of administrative support staff is not aided by conventional performance appraisal, which measures summary or comparative effectiveness. Meaningful diagnostic evaluation integrates three levels of analysis (organization, task, and individual), using behavioral expectation scales. (SK)

  10. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    PubMed

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

    Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide the concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components: independence, a sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected to the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation: Understanding the perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis for ensuring that rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself, to ensure all aspects are addressed in a manner that will enhance recovery and improve the level of integration into the community. The finding that community integration is a non-linear process also highlights the need for rehabilitation professionals to review and revise plans over time in response to a person's changing circumstances and recovery journey. This analysis provides the groundwork for an operational model of community integration and for the development of a measure of community integration that assesses all six attributes revealed in this review, including those not recognized in previous frameworks.

  11. High Fidelity, Fuel-Like Thermal Simulators for Non-Nuclear Testing: Analysis and Initial Test Results

    NASA Technical Reports Server (NTRS)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Kapernick, Richard

    2007-01-01

    Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat transfer, and stress-related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing. By implementing a neutronic response model to simulate the dynamic response that would be expected in a fueled reactor system, one can better understand system integration issues, characterize integrated system response times and response characteristics, and assess potential design improvements at relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. Through a series of iterative analyses, a conceptual high fidelity design is developed; this is followed by engineering design, fabrication, and testing to validate the overall design process. Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
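
    A back-of-the-envelope illustration of the kind of one-dimensional, single-axial-location, steady-state check described above is sketched below; all numbers are assumed for illustration and are not the reactor design discussed in the record, which used SINDA/FLUINT for the detailed analysis.

      # Minimal 1-D steady-state check (illustrative values only): radial temperature
      # rise in a uniformly heating cylindrical pin, T(r) = T_surface + q'''(R^2 - r^2)/(4k),
      # compared for a low-conductivity fuel pin and a higher-conductivity heater element.
      import numpy as np

      q_linear = 15.0e3          # linear heat rate, W/m (assumed)
      R = 0.004                  # pin radius, m (assumed)
      k_fuel = 3.0               # fuel thermal conductivity, W/m-K (assumed, UO2-like)
      k_heater = 20.0            # heater-element conductivity, W/m-K (assumed)
      T_surface = 850.0          # coolant-side surface temperature, K (assumed)

      q_vol = q_linear / (np.pi * R**2)      # volumetric heat generation, W/m^3

      def centerline_temp(k):
          """Centerline temperature of a solid cylinder with uniform heat generation."""
          return T_surface + q_vol * R**2 / (4.0 * k)

      print("fuel pin centerline, K:          %.0f" % centerline_temp(k_fuel))
      print("thermal simulator centerline, K: %.0f" % centerline_temp(k_heater))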

  12. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  13. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  14. Depth Cue Integration in an Active Control Paradigm

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Sweet, Barbara T.; Shafto, Meredith; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    Numerous models of depth cue integration have been proposed. Of particular interest is how the visual system processes discrepant cues, as might arise when viewing synthetic displays. A powerful paradigm for examining this integration process can be adapted from manual control research. This methodology introduces independent disturbances in the candidate cues, then performs spectral analysis of subjects' resulting motoric responses (e.g., depth matching). We will describe this technique and present initial findings.
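
    The disturbance-injection idea can be illustrated with a small, self-contained sketch. This is not the study's code: the cue weights, disturbance frequencies, and the simple open-loop readout (response amplitude at each cue's disturbance frequencies) are assumptions standing in for the closed-loop describing-function analysis used in manual control research.

      # Illustrative sketch: estimate how strongly a simulated observer weights each of
      # two depth cues by injecting independent sum-of-sines disturbances and reading
      # out response amplitude at each cue's disturbance frequencies.
      import numpy as np

      fs, T = 60.0, 120.0                       # sample rate (Hz) and trial length (s)
      t = np.arange(0, T, 1.0 / fs)
      f_cue1 = np.array([0.10, 0.25, 0.40])     # cue 1 disturbance frequencies (Hz), exact FFT bins
      f_cue2 = np.array([0.15, 0.30, 0.55])     # non-overlapping frequencies for cue 2

      d1 = sum(np.sin(2 * np.pi * f * t) for f in f_cue1)
      d2 = sum(np.sin(2 * np.pi * f * t) for f in f_cue2)

      # Hypothetical observer: weights the two cues 0.7 / 0.3 and adds motor noise.
      response = 0.7 * d1 + 0.3 * d2 + 0.1 * np.random.randn(t.size)

      spec = np.fft.rfft(response) / (t.size / 2)        # single-sided amplitude spectrum
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

      def amplitude_at(f_targets):
          idx = [np.argmin(np.abs(freqs - f)) for f in f_targets]
          return np.mean(np.abs(spec[idx]))

      print("estimated weight, cue 1:", amplitude_at(f_cue1))
      print("estimated weight, cue 2:", amplitude_at(f_cue2))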

  15. Exploring the Black Box: An Analysis of Work Group Diversity, Conflict, and Performance.

    ERIC Educational Resources Information Center

    Pelled, Lisa Hope; Eisenhardt, Kathleen M.; Xin, Katherine R.

    1999-01-01

    Tests an integrative model of the relationships among diversity, conflict, and performance, using a sample of 45 electronics-industry worker teams. Functional background diversity drives task conflict; multiple types of diversity drive emotional conflict. Task conflict affects task performance more favorably than does emotional conflict. (102…

  16. ASIC For Complex Fixed-Point Arithmetic

    NASA Technical Reports Server (NTRS)

    Petilli, Stephen G.; Grimm, Michael J.; Olson, Erlend M.

    1995-01-01

    Application-specific integrated circuit (ASIC) performs 24-bit, fixed-point arithmetic operations on arrays of complex-valued input data. High-performance, wide-band arithmetic logic unit (ALU) designed for use in computing fast Fourier transforms (FFTs) and for performing digital filtering functions. Other applications include general computations involved in analysis of spectra and digital signal processing.

  17. A Study of Performance Support in Higher Education

    ERIC Educational Resources Information Center

    Lion, Robert W.

    2011-01-01

    Successful performance improvement efforts are closely tied to the strength and integrity of the performance analysis process. During a time when higher education institutions are facing increasing budget cuts, the ability to recruit and retain students is extremely important. For some institutions, web-based courses have been viewed as a way to…

  18. APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System

    NASA Technical Reports Server (NTRS)

    Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren

    2004-01-01

    The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: a) Flight Data Importation and b) Flight Data Analysis.

  19. Application of a high-energy-density permanent magnet material in underwater systems

    NASA Astrophysics Data System (ADS)

    Cho, C. P.; Egan, C.; Krol, W. P.

    1996-06-01

    This paper addresses the application of high-energy-density permanent magnet (PM) technology to (1) the brushless, axial-field PM motor and (2) the integrated electric motor/pump system for under-water applications. Finite-element analysis and lumped parameter magnetic circuit analysis were used to calculate motor parameters and performance characteristics and to conduct tradeoff studies. Compact, efficient, reliable, and quiet underwater systems are attainable with the development of high-energy-density PM material, power electronic devices, and power integrated-circuit technology.

  20. Integrated multidisciplinary analysis of segmented reflector telescopes

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Needels, Laura

    1992-01-01

    The present multidisciplinary telescope-analysis approach, which encompasses thermal, structural, control and optical considerations, is illustrated for the case of an IR telescope in LEO; attention is given to end-to-end evaluations of the effects of mechanical disturbances and thermal gradients on measures of optical performance. Both geometric ray-tracing and surface-to-surface diffraction approximations are used in the telescope's optical model. Also noted is the role played by NASA-JPL's Integrated Modeling of Advanced Optical Systems computation tool, in view of numerical examples.

  1. Taverna: a tool for building and running workflows of services

    PubMed Central

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL) from . PMID:16845108

  2. The EUCLID/V1 Integrated Code for Safety Assessment of Liquid Metal Cooled Fast Reactors. Part 1: Basic Models

    NASA Astrophysics Data System (ADS)

    Mosunova, N. A.

    2018-05-01

    The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, fuel rod analysis module, and the burnup and decay heat calculation modules), the main calculated quantities, and also the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, using which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of the reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.

  3. Evaluation Framework and Analyses for Thermal Energy Storage Integrated with Packaged Air Conditioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kung, F.; Deru, M.; Bonnema, E.

    2013-10-01

    Few third-party guidance documents or tools are available for evaluating thermal energy storage (TES) integrated with packaged air conditioning (AC), as this type of TES is relatively new compared to TES integrated with chillers or hot water systems. To address this gap, researchers at the National Renewable Energy Laboratory conducted a project to improve the ability of potential technology adopters to evaluate TES technologies. Major project outcomes included: development of an evaluation framework to describe key metrics, methodologies, and issues to consider when assessing the performance of TES systems integrated with packaged AC; application of multiple concepts from the evaluation framework to analyze performance data from four demonstration sites; and production of a new simulation capability that enables modeling of TES integrated with packaged AC in EnergyPlus. This report includes the evaluation framework and analysis results from the project.

  4. Professionalism in the Air Force: A Comparative Analysis of Commissioned Officers with Non-Commissioned Officers

    DTIC Science & Technology

    2007-03-01

    ...can be described in ideal terms such as character, integrity, and commitment, the desired output of professionalism is performance. Performance provides the incentives to do well and fuels the professional's drive toward excellence (Sorley, 1998). The values necessary to consistently perform...

  5. Towards a Grand Unified Theory of sports performance.

    PubMed

    Glazier, Paul S

    2017-12-01

    Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has predominantly been monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance (and, by elaboration, sports science) based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase explanatory power of applied research work; provide stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Relationships between academic performance, SES school type and perceptual-motor skills in first grade South African learners: NW-CHILD study.

    PubMed

    Pienaar, A E; Barhorst, R; Twisk, J W R

    2014-05-01

    Perceptual-motor skills contribute to a variety of basic learning skills associated with normal academic success. This study aimed to determine the relationship between academic performance and perceptual-motor skills in first grade South African learners and whether low SES (socio-economic status) school type plays a role in such a relationship. This cross-sectional study of the baseline measurements of the NW-CHILD longitudinal study included a stratified random sample of first grade learners (n = 812; 418 boys and 394 girls), with a mean age of 6.78 ± 0.49 years, living in the North West Province (NW) of South Africa. The Beery-Buktenica Developmental Test of Visual-Motor Integration-4 (VMI) was used to assess visual-motor integration, visual perception and hand control, while the Bruininks Oseretsky Test of Motor Proficiency, short form (BOT2-SF) assessed overall motor proficiency. Academic performance in math, reading and writing was assessed with the Mastery of Basic Learning Areas Questionnaire. Linear mixed models analysis was performed with SPSS to determine possible differences between the different VMI and BOT2-SF standard scores in different math, reading and writing mastery categories ranging from no mastery to outstanding mastery. A multinomial multilevel logistic regression analysis was performed to assess the relationship between a clustered score of academic performance and the different determinants. A strong relationship was established between academic performance and VMI, visual perception, hand control and motor proficiency, with a significant relationship between a clustered academic performance score, visual-motor integration and visual perception. A negative association was established between low SES school type and academic performance, with a common perceptual motor foundation shared by all basic learning areas. Visual-motor integration, visual perception, hand control and motor proficiency are closely related to basic academic skills required in the first formal school year, especially among learners in low SES type schools. © 2013 John Wiley & Sons Ltd.

  7. Fuel cell on-site integrated energy system parametric analysis of a residential complex

    NASA Technical Reports Server (NTRS)

    Simons, S. N.

    1977-01-01

    A parametric energy-use analysis was performed for a large apartment complex served by a fuel cell on-site integrated energy system (OS/IES). The variables parameterized include operating characteristics for four phosphoric acid fuel cells, eight OS/IES energy recovery systems, and four climatic locations. The annual fuel consumption for selected parametric combinations are presented and a breakeven economic analysis is presented for one parametric combination. The results show fuel cell electrical efficiency and system component choice have the greatest effect on annual fuel consumption; fuel cell thermal efficiency and geographic location have less of an effect.

  8. The integrated analysis capability (IAC Level 2.0)

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.; Vos, Robert G.

    1988-01-01

    The critical data management issues involved in the development of the integrated analysis capability (IAC), Level 2, to support the design analysis and performance evaluation of large space structures, are examined. In particular, attention is given to the advantages and disadvantages of the formalized data base; merging of the matrix and relational data concepts; data types, query operators, and data handling; sequential versus direct-access files; local versus global data access; programming languages and host machines; and data flow techniques. The discussion also covers system architecture, recent system level enhancements, executive/user interface capabilities, and technology applications.

  9. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    PubMed Central

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
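
    NeuroMatic itself runs inside Igor Pro, so the sketch below is only a stand-alone Python illustration of one of the model classes the toolkit simulates: a leaky integrate-and-fire neuron driven by a current step, integrated with forward Euler. All parameter values are assumed and nothing here reflects NeuroMatic's own API.

      # Leaky integrate-and-fire neuron with a current step (illustrative parameters).
      import numpy as np

      dt, t_end = 1e-4, 0.5                              # time step and duration, s
      C, R = 200e-12, 100e6                              # membrane capacitance (F), resistance (Ohm)
      V_rest, V_thresh, V_reset = -0.070, -0.050, -0.065 # rest, threshold, reset potentials (V)
      I_step = 250e-12                                   # injected current, A

      t = np.arange(0.0, t_end, dt)
      V = np.full(t.size, V_rest)
      spikes = []

      for i in range(1, t.size):
          I = I_step if 0.1 <= t[i] < 0.4 else 0.0       # current step between 0.1 s and 0.4 s
          V[i] = V[i - 1] + (-(V[i - 1] - V_rest) / R + I) / C * dt
          if V[i] >= V_thresh:                           # threshold crossing: spike and reset
              spikes.append(t[i])
              V[i] = V_reset

      print("spike count:", len(spikes))
      print("mean firing rate during step (Hz): %.1f" % (len(spikes) / 0.3))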

  10. Development and Integration of Control System Models

    NASA Technical Reports Server (NTRS)

    Kim, Young K.

    1998-01-01

    The computer simulation tool, TREETOPS, has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform their dynamics and control analysis with pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance for Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed and its optical performance analysis was done using the MACOS software.

  11. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  12. 1991 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1991 Life Support Systems Analysis Workshop was sponsored by NASA Headquarters' Office of Aeronautics and Space Technology (OAST) to foster communication among NASA, industrial, and academic specialists, and to integrate their inputs and disseminate information to them. The overall objective of systems analysis within the Life Support Technology Program of OAST is to identify, guide the development of, and verify designs which will increase the performance of the life support systems on component, subsystem, and system levels for future human space missions. The specific goals of this workshop were to report on the status of systems analysis capabilities, to integrate the chemical processing industry technologies, and to integrate recommendations for future technology developments related to systems analysis for life support systems. The workshop included technical presentations, discussions, and interactive planning, with time allocated for discussion of both technology status and time-phased technology development recommendations. Key personnel from NASA, industry, and academia delivered inputs and presentations on the status and priorities of current and future systems analysis methods and requirements.

  13. Integrating Oil Debris and Vibration Gear Damage Detection Technologies Using Fuzzy Logic

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Afjeh, Abdollah A.

    2002-01-01

    A diagnostic tool for detecting damage to spur gears was developed. Two different measurement technologies, wear debris analysis and vibration, were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual measurement technologies. This diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Test Rig. Experimental data were collected during experiments performed in this test rig with and without pitting. Results show combining the two measurement technologies improves the detection of pitting damage on spur gears.
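
    The following sketch illustrates the general idea of fuzzy-logic fusion of two damage indicators; it is not the NASA diagnostic tool, and the membership thresholds and rules are assumed for illustration only.

      # Fuzzy fusion of an oil-debris indicator and a vibration indicator (illustrative).
      import numpy as np

      def ramp(x, lo, hi):
          """Piecewise-linear membership rising from 0 at lo to 1 at hi."""
          return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

      def damage_level(debris_mg, vib_rms):
          # Memberships in the "damaged" fuzzy set (thresholds are assumed values).
          m_debris = ramp(debris_mg, 20.0, 80.0)   # accumulated debris mass, mg
          m_vib = ramp(vib_rms, 1.0, 4.0)          # vibration feature, arbitrary units

          both = min(m_debris, m_vib)              # rule: debris AND vibration -> damage
          either = max(m_debris, m_vib)            # rule: debris OR vibration -> warning

          if both > 0.6:
              return "damage", both
          if either > 0.6:
              return "warning", either
          return "healthy", either

      for debris, vib in [(5.0, 0.8), (60.0, 1.5), (90.0, 4.5)]:
          print(debris, vib, "->", damage_level(debris, vib))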

  14. Integration of Multi-Modal Biomedical Data to Predict Cancer Grade and Patient Survival.

    PubMed

    Phan, John H; Hoffman, Ryan; Kothari, Sonal; Wu, Po-Yen; Wang, May D

    2016-02-01

    The Big Data era in Biomedical research has resulted in large-cohort data repositories such as The Cancer Genome Atlas (TCGA). These repositories routinely contain hundreds of matched patient samples for genomic, proteomic, imaging, and clinical data modalities, enabling holistic and multi-modal integrative analysis of human disease. Using TCGA renal and ovarian cancer data, we conducted a novel investigation of multi-modal data integration by combining histopathological image and RNA-seq data. We compared the performances of two integrative prediction methods: majority vote and stacked generalization. Results indicate that integration of multiple data modalities improves prediction of cancer grade and outcome. Specifically, stacked generalization, a method that integrates multiple data modalities to produce a single prediction result, outperforms both single-data-modality prediction and majority vote. Moreover, stacked generalization reveals the contribution of each data modality (and specific features within each data modality) to the final prediction result and may provide biological insights to explain prediction performance.
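
    The two combiners can be contrasted on synthetic two-class, multi-modality data as in the sketch below. This is only an illustration of majority voting versus stacked generalization, not the authors' TCGA pipeline; a production stack would train the meta-learner on out-of-fold base-model predictions rather than in-sample ones.

      # Majority vote vs. stacked generalization on synthetic multi-modal data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.RandomState(0)
      n = 600
      y = rng.randint(0, 2, n)
      modalities = {
          "image":    y[:, None] * 0.8 + rng.randn(n, 5),   # weak signal
          "rnaseq":   y[:, None] * 1.5 + rng.randn(n, 8),   # stronger signal
          "clinical": y[:, None] * 0.5 + rng.randn(n, 3),   # weakest signal
      }

      idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

      base = {m: LogisticRegression().fit(X[idx_tr], y[idx_tr]) for m, X in modalities.items()}
      p_tr = np.column_stack([base[m].predict_proba(modalities[m][idx_tr])[:, 1] for m in base])
      p_te = np.column_stack([base[m].predict_proba(modalities[m][idx_te])[:, 1] for m in base])

      pred_vote = ((p_te > 0.5).sum(axis=1) >= 2).astype(int)   # majority of 3 modalities
      meta = LogisticRegression().fit(p_tr, y[idx_tr])           # stacking meta-learner
      pred_stack = meta.predict(p_te)

      print("majority vote accuracy:", np.mean(pred_vote == y[idx_te]))
      print("stacking accuracy     :", np.mean(pred_stack == y[idx_te]))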

  15. Using EIGER for Antenna Design and Analysis

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.

    2007-01-01

    EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package that is built upon a flexible framework, designed using object-oriented techniques. The analysis methods used include moment method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with a sensible effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms from Intel-based PCs and Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required for facilitating the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than when using linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.

  16. Improvements in analysis techniques for segmented mirror arrays

    NASA Astrophysics Data System (ADS)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  17. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (> or = .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.

  18. Integrating Human Factors into Space Vehicle Processing for Risk Management

    NASA Technical Reports Server (NTRS)

    Woodbury, Sarah; Richards, Kimberly J.

    2008-01-01

    This presentation will discuss the multiple projects performed in United Space Alliance's Human Engineering Modeling and Performance (HEMAP) Lab, improvements that resulted from analysis, and the future applications of the HEMAP Lab for risk assessment by evaluating human/machine interaction and ergonomic designs.

  19. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  20. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  1. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  2. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-16

    Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.

  3. The Effects of Discrete-Trial Training Commission Errors on Learner Outcomes: An Extension

    ERIC Educational Resources Information Center

    Jenkins, Sarah R.; Hirst, Jason M.; DiGennaro Reed, Florence D.

    2015-01-01

    We conducted a parametric analysis of treatment integrity errors during discrete-trial training and investigated the effects of three integrity conditions (0, 50, or 100 % errors of commission) on performance in the presence and absence of programmed errors. The presence of commission errors impaired acquisition for three of four participants.…

  4. Shuttle program. STS-7 feasibility assessment: IUS/TDRS-A

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This Space Transportation System 7 (STS-7) Flight Feasibility Assessment (FFA) provides a base from which the various design, operation, and integration elements associated with Tracking and Data Relay Satellite-A can perform mission planning and analysis. The STS-7 FFA identifies conflicts, issues, and concerns associated with the integrated flight design requirements and constraints.

  5. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    A viewgraph presentation to develop models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 10) Background: Titan Model-based Executive; 11) Model-based Execution Architecture; 12) Compatibility Analysis of MDS and Titan Architectures; 13) Integrating Model-based Programming and Execution into the Architecture; 14) State Analysis and Modeling; 15) IMU Subsystem State Effects Diagram; 16) Titan Subsystem Model: IMU Health; 17) Integrating Model-based Programming and Execution into the Software IMU; 18) Testing Program; 19) Computationally Tractable State Estimation & Fault Diagnosis; 20) Diagnostic Algorithm Performance; 21) Integration and Test Issues; 22) Demonstrated Benefits; and 23) Next Steps

  6. Multimodal integration of micro-Doppler sonar and auditory signals for behavior classification with convolutional networks.

    PubMed

    Dura-Bernal, Salvador; Garreau, Guillaume; Georgiou, Julius; Andreou, Andreas G; Denham, Susan L; Wennekers, Thomas

    2013-10-01

    The ability to recognize the behavior of individuals is of great interest in the general field of safety (e.g. building security, crowd control, transport analysis, independent living for the elderly). Here we report a new real-time acoustic system for human action and behavior recognition that integrates passive audio and active micro-Doppler sonar signatures over multiple time scales. The system architecture is based on a six-layer convolutional neural network, trained and evaluated using a dataset of 10 subjects performing seven different behaviors. Probabilistic combination of system output through time for each modality separately yields 94% (passive audio) and 91% (micro-Doppler sonar) correct behavior classification; probabilistic multimodal integration increases classification performance to 98%. This study supports the efficacy of micro-Doppler sonar systems in characterizing human actions, which can then be efficiently classified using ConvNets. It also demonstrates that the integration of multiple sources of acoustic information can significantly improve the system's performance.
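
    A minimal sketch of the probabilistic combination step (not the authors' convolutional network): per-frame class posteriors are accumulated over time as sums of log-probabilities for each modality and then added across modalities before taking the argmax. The simulated posteriors below are assumptions standing in for ConvNet outputs.

      # Temporal and multimodal probabilistic combination of per-frame class posteriors.
      import numpy as np

      rng = np.random.RandomState(1)
      n_frames, n_classes = 40, 7
      true_class = 3

      def noisy_posteriors(strength):
          """Simulate per-frame softmax outputs that weakly favour the true class."""
          logits = rng.randn(n_frames, n_classes)
          logits[:, true_class] += strength
          e = np.exp(logits - logits.max(axis=1, keepdims=True))
          return e / e.sum(axis=1, keepdims=True)

      audio = noisy_posteriors(0.8)           # passive audio stream
      sonar = noisy_posteriors(0.6)           # micro-Doppler sonar stream

      log_audio = np.log(audio).sum(axis=0)   # temporal integration per modality
      log_sonar = np.log(sonar).sum(axis=0)

      print("audio only :", np.argmax(log_audio))
      print("sonar only :", np.argmax(log_sonar))
      print("multimodal :", np.argmax(log_audio + log_sonar))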

  7. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  8. Evolving Postmortems as Teams Evolve Through TxP

    DTIC Science & Technology

    2014-12-01

    Instead of waiting for SEI to compile enough data to repeat this kind of analysis for the system integration test domain, a system integration test team...and stand up their Team Test Process (TTP). Some abilities, like planning on how many mistakes will be made by the team in producing a test procedure...can only be performed after the team has determined a) which mistakes count in the domain of system integration testing, b) what units to use to...

  9. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. [Integrated health care organizations: guideline for analysis].

    PubMed

    Vázquez Navarrete, M Luisa; Vargas Lorenzo, Ingrid; Farré Calpe, Joan; Terraza Núñez, Rebeca

    2005-01-01

    There has been a tendency recently to abandon competition and to introduce policies that promote collaboration between health providers as a means of improving the efficiency of the system and the continuity of care. A number of countries, most notably the United States, have experienced the integration of health care providers to cover the continuum of care of a defined population. Catalonia has witnessed the steady emergence of increasing numbers of integrated health organisations (IHO) but, unlike the United States, studies on health providers' integration are scarce. As part of a research project currently underway, a guide was developed to study Catalan IHOs, based on a classical literature review and the development of a theoretical framework. The guide proposes analysing the IHO's performance in relation to their final objectives of improving the efficiency and continuity of health care by an analysis of the integration type (based on key characteristics); external elements (existence of other suppliers, type of services' payment mechanisms); and internal elements (model of government, organization and management) that influence integration. Evaluation of the IHO's performance focuses on global strategies and results on coordination of care and efficiency. Two types of coordination are evaluated: information coordination and coordination of care management. Evaluation of the efficiency of the IHO refers to technical and allocative efficiency. This guide may have to be modified for use in the Catalan context.

  11. Computer codes for thermal analysis of a solid rocket motor nozzle

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1988-01-01

    A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate ablation or decomposition response of the nozzle material. A code called Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. A part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC. These results will then be compared for verification of FANTASTIC.

  12. Analysis of physical layer performance of data center with optical wavelength switches based on advanced modulation formats

    NASA Astrophysics Data System (ADS)

    Ahmad, Iftikhar; Chughtai, Mohsan Niaz

    2018-05-01

    In this paper, the IRIS (Integrated Router Interconnected Spectrally) optical-domain architecture for datacenter networks is analyzed. The IRIS architecture integrated with advanced modulation formats (M-QAM) and a coherent optical receiver is evaluated; the channel impairments are compensated using DSP algorithms following the coherent receiver. The proposed scheme allows N^2 multiplexed wavelengths for an N×N switch size. The performance of the N×N IRIS switch with and without wavelength conversion is analyzed for different baud rates over M-QAM modulation formats. The performance of the system is analyzed in terms of bit error rate (BER) vs. OSNR curves.
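
    For reference, the kind of theoretical curve such a BER analysis is compared against can be sketched as below for Gray-coded square M-QAM. The dominant-term BER approximation is standard; converting Es/N0 to OSNR additionally requires the symbol rate, the 0.1 nm reference bandwidth, and the polarization count, which is deliberately left out of this sketch.

      # Theoretical BER for Gray-coded square M-QAM versus SNR per symbol (Es/N0).
      import numpy as np
      from scipy.special import erfc

      def qfunc(x):
          return 0.5 * erfc(x / np.sqrt(2.0))

      def ber_mqam(snr_db, M):
          """Dominant-term BER approximation for Gray-coded square M-QAM."""
          k = np.log2(M)
          es_n0 = 10.0 ** (np.asarray(snr_db) / 10.0)
          return (4.0 / k) * (1.0 - 1.0 / np.sqrt(M)) * qfunc(np.sqrt(3.0 * es_n0 / (M - 1.0)))

      snr_db = np.arange(8, 26, 2)
      for M in (4, 16, 64):
          print("M =", M, ["%.1e" % b for b in ber_mqam(snr_db, M)])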

  13. Sensory feedback in a bump attractor model of path integration.

    PubMed

    Poll, Daniel B; Nguyen, Khanh; Kilpatrick, Zachary P

    2016-04-01

    Mammalian spatial navigation systems utilize several different sensory information channels. This information is converted into a neural code that represents the animal's current position in space by engaging place cell, grid cell, and head direction cell networks. In particular, sensory landmark (allothetic) cues can be utilized in concert with an animal's knowledge of its own velocity (idiothetic) cues to generate a more accurate representation of position than path integration provides on its own (Battaglia et al. The Journal of Neuroscience 24(19):4541-4550 (2004)). We develop a computational model that merges path integration with feedback from external sensory cues that provide a reliable representation of spatial position along an annular track. Starting with a continuous bump attractor model, we explore the impact of synaptic spatial asymmetry and heterogeneity, which disrupt the position code of the path integration process. We use asymptotic analysis to reduce the bump attractor model to a single scalar equation whose potential represents the impact of asymmetry and heterogeneity. Such imperfections cause errors to build up when the network performs path integration, but these errors can be corrected by an external control signal representing the effects of sensory cues. We demonstrate that there is an optimal strength and decay rate of the control signal when cues appear either periodically or randomly. A similar analysis is performed when errors in path integration arise from dynamic noise fluctuations. Again, there is an optimal strength and decay of discrete control that minimizes the path integration error.
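
    The idea can be caricatured with a toy simulation, sketched below; it is not the paper's reduced scalar equation, and the drift bias, cue schedule, and exponentially decaying control gain are all assumed values.

      # Path integration on an annular track with a drift bias (standing in for
      # synaptic asymmetry/heterogeneity) and a decaying corrective drive at cues.
      import numpy as np

      dt, T = 0.01, 200.0
      t = np.arange(0.0, T, dt)
      v = 0.05                                # true angular velocity, rad/s
      bias = 0.004                            # drift from network imperfections, rad/s (assumed)
      cue_period, k0, tau = 20.0, 0.5, 2.0    # cue interval (s), control gain, decay (s)

      def wrap(a):
          return (a + np.pi) % (2 * np.pi) - np.pi

      def simulate(with_control):
          x_true, x_hat, last_cue = 0.0, 0.0, -np.inf
          err = np.zeros(t.size)
          for i, ti in enumerate(t):
              x_true = wrap(x_true + v * dt)
              drive = v + bias
              if with_control:
                  if ti - last_cue >= cue_period:
                      last_cue = ti                              # landmark cue appears
                  gain = k0 * np.exp(-(ti - last_cue) / tau)     # decaying control signal
                  drive += gain * wrap(x_true - x_hat)
              x_hat = wrap(x_hat + drive * dt)
              err[i] = abs(wrap(x_hat - x_true))
          return err.mean()

      print("mean error, no control  :", simulate(False))
      print("mean error, with control:", simulate(True))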

  14. Dual Standards of School Performance and Funding? Empirical Searches of School Funding Adequacy in Kentucky and Maine

    ERIC Educational Resources Information Center

    Lee, Jaekyung

    2010-01-01

    This study examines potential consequences of the discrepancies between national and state performance standards for school funding in Kentucky and Maine. Applying the successful schools observation method and cost function analysis method to integrated data-sets that match schools' eighth-grade mathematics test performance measures to district…

  15. Histogram analysis of diffusion kurtosis imaging estimates for in vivo assessment of 2016 WHO glioma grades: A cross-sectional observational study.

    PubMed

    Hempel, Johann-Martin; Schittenhelm, Jens; Brendle, Cornelia; Bender, Benjamin; Bier, Georg; Skardelly, Marco; Tabatabai, Ghazaleh; Castaneda Vega, Salvador; Ernemann, Ulrike; Klose, Uwe

    2017-10-01

    To assess the diagnostic performance of histogram analysis of diffusion kurtosis imaging (DKI) maps for in vivo assessment of the 2016 World Health Organization Classification of Tumors of the Central Nervous System (2016 CNS WHO) integrated glioma grades. Seventy-seven patients with histopathologically-confirmed glioma who provided written informed consent were retrospectively assessed between 01/2014 and 03/2017 from a prospective trial approved by the local institutional review board. Ten histogram parameters of mean kurtosis (MK) and mean diffusivity (MD) metrics from DKI were independently assessed by two blinded physicians from a volume of interest around the entire solid tumor. One-way ANOVA was used to compare MK and MD histogram parameter values between 2016 CNS WHO-based tumor grades. Receiver operating characteristic analysis was performed on MK and MD histogram parameters for significant results. The 25th, 50th, 75th, and 90th percentiles of MK and average MK showed significant differences between IDH1/2 wild-type gliomas, IDH1/2 mutated gliomas, and oligodendrogliomas with chromosome 1p/19q loss of heterozygosity and IDH1/2 mutation (p<0.001). The 50th, 75th, and 90th percentiles showed a slightly higher diagnostic performance (area under the curve (AUC) range: 0.868-0.991) than average MK (AUC range: 0.855-0.988) in classifying glioma according to the integrated approach of 2016 CNS WHO. Histogram analysis of DKI can stratify gliomas according to the integrated approach of 2016 CNS WHO. The 50th (median), 75th, and 90th percentiles showed the highest diagnostic performance. However, the average MK is also robust and feasible in routine clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.
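
    The generic form of the analysis step (per-patient VOI percentiles followed by ROC analysis) is sketched below with synthetic numbers; the group shift, VOI size, and MK distribution are assumptions, not the study data.

      # Per-patient MK percentiles inside a tumour VOI, then ROC AUC per feature.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.RandomState(0)
      percentiles = [25, 50, 75, 90]

      def patient_features(mk_shift, n_patients):
          feats = []
          for _ in range(n_patients):
              voxels = rng.normal(0.6 + mk_shift, 0.15, size=5000)   # MK values in the VOI
              feats.append([np.percentile(voxels, p) for p in percentiles] + [voxels.mean()])
          return np.array(feats)

      group_a = patient_features(0.00, 30)    # e.g. lower-grade group (assumed shift)
      group_b = patient_features(0.10, 30)    # e.g. higher-grade group (assumed shift)

      X = np.vstack([group_a, group_b])
      y = np.r_[np.zeros(30), np.ones(30)]

      for j, name in enumerate(["p25", "p50", "p75", "p90", "mean"]):
          print(name, "AUC = %.3f" % roc_auc_score(y, X[:, j]))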

  16. Image analysis by integration of disparate information

    NASA Technical Reports Server (NTRS)

    Lemoigne, Jacqueline

    1993-01-01

    Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on some remotely sensed imagery from the Landsat-Thematic Mapper (TM) sensor.
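
    A toy, serial version of the pixel/edge integration idea is sketched below (the paper used a massively parallel implementation): class probabilities are repeatedly blended with their neighbours except across strong edges. The image, noise level, edge map, and blending weights are all assumed.

      # Iterative relaxation combining pixel-based class probabilities with edge evidence.
      import numpy as np

      rng = np.random.RandomState(0)
      H, W = 32, 32
      truth = np.zeros((H, W), dtype=int)
      truth[:, W // 2:] = 1                              # two regions, vertical boundary

      # Noisy pixel-based probability of class 1 and an (assumed) edge-strength map.
      p1 = np.clip(truth + 0.35 * rng.randn(H, W), 0.0, 1.0)
      edges = np.zeros((H, W))
      edges[:, W // 2 - 1: W // 2 + 1] = 1.0
      edges[:, 0] = edges[:, -1] = 1.0                   # treat the periodic roll seam as a boundary

      for _ in range(25):                                # relaxation iterations
          blended = np.zeros_like(p1)
          weight = np.zeros_like(p1)
          for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
              nb = np.roll(np.roll(p1, dy, axis=0), dx, axis=1)
              nb_edge = np.roll(np.roll(edges, dy, axis=0), dx, axis=1)
              w = 1.0 - np.maximum(edges, nb_edge)       # no blending across strong edges
              blended += w * nb
              weight += w
          p1 = np.where(weight > 0, 0.5 * p1 + 0.5 * blended / np.maximum(weight, 1e-9), p1)

      labels = (p1 > 0.5).astype(int)
      print("pixel agreement with ground truth: %.3f" % (labels == truth).mean())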

  17. Integrating Multiple Data Views for Improved Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Blake H.

    2014-01-31

    Exploiting multiple views of a program makes obfuscating the intended behavior of a program more difficult allowing for better performance in classification, clustering, and phylogenetic reconstruction.

  18. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  19. A comparative study of biomass integrated gasification combined cycle power systems: Performance analysis.

    PubMed

    Zang, Guiyan; Tejasvi, Sharma; Ratner, Albert; Lora, Electo Silva

    2018-05-01

    The Biomass Integrated Gasification Combined Cycle (BIGCC) power system is believed to potentially be a highly efficient way to utilize biomass to generate power. However, there is no comparative study of BIGCC systems that examines all the latest improvements for gasification agents, gas turbine combustion methods, and CO2 Capture and Storage options. This study examines the impact of recent advancements on BIGCC performance through exergy analysis using Aspen Plus. Results show that the exergy efficiency of these systems ranges from 22.3% to 37.1%. Furthermore, exergy analysis indicates that the gas turbine with external combustion has relatively high exergy efficiency, and the Selexol CO2 removal method has low exergy destruction. Moreover, the sensitivity analysis shows that the system exergy efficiency is more sensitive to the initial temperature and pressure ratio of the gas turbine, whereas it has a relatively weak dependence on the initial temperature and initial pressure of the steam turbine. Copyright © 2018 Elsevier Ltd. All rights reserved.
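
    The system-level figure of merit in such a study is the exergy efficiency, i.e., the net electric output divided by the chemical exergy of the biomass feed. A minimal illustration with assumed numbers (not the paper's cases) follows.

      # Exergy efficiency of a biomass-fired power system (assumed illustrative values).
      m_biomass = 10.0          # biomass feed rate, kg/s
      LHV = 18.0e6              # lower heating value, J/kg
      beta = 1.15               # exergy-to-LHV ratio of the biomass (typical assumption)
      W_net = 55.0e6            # net electric power, W

      ex_fuel = m_biomass * LHV * beta
      eta_ex = W_net / ex_fuel
      print("fuel chemical exergy: %.1f MW" % (ex_fuel / 1e6))
      print("exergy efficiency   : %.1f %%" % (100 * eta_ex))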

  20. Steady and unsteady three-dimensional transonic flow computations by integral equation method

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1994-01-01

    This is the final technical report of the research performed under the grant: NAG1-1170, from the National Aeronautics and Space Administration. The report consists of three parts. The first part presents the work on unsteady flows around a zero-thickness wing. The second part presents the work on steady flows around non-zero thickness wings. The third part presents the massively parallel processing implementation and performance analysis of integral equation computations. At the end of the report, publications resulting from this grant are listed and attached.

  1. Effects of an integrated physical education/music program in changing early childhood perceptual-motor performance.

    PubMed

    Brown, J; Sherrill, C; Gench, B

    1981-08-01

    Two approaches to facilitating perceptual-motor development in children, ages 4 to 6 yr., were investigated. The experimental group (n = 15) received 24 sessions of integrated physical education/music instruction based upon concepts of Kodaly and Dalcroze. The control group (n = 15) received 24 sessions of movement exploration and self-testing instruction. Analysis of covariance indicated that significant improvement occurred only in the experimental group, with changes in the motor, auditory, and language aspects of perceptual-motor performance as well as total score.

  2. Geodiametris: an integrated geoinformatic approach for monitoring land pollution from the disposal of olive oil mill wastes

    NASA Astrophysics Data System (ADS)

    Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor

    2014-08-01

    The olive-oil industry is one of the most important sectors of agricultural production in Greece, which is the third-largest olive-oil producing country worldwide. Olive oil mill wastes (OOMW) constitute a major factor in pollution in olive-growing regions and an important problem to be solved for the agricultural industry. The olive-oil mill wastes are normally deposited in tanks, directly in the soil, or even in adjacent torrents, rivers and lakes, posing a high risk of environmental pollution and harm to community health. The GEODIAMETRIS project aspires to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, application of in situ and laboratory geophysical methodologies, as well as soil and water physicochemical analysis. Concerning the project's preliminary results, all the operating OOMW areas located in Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in Alikianos village (Chania, Crete) has been selected as one of the main case study areas. Various geophysical methodologies, such as Electrical Resistivity Tomography, Induced Polarization, multifrequency electromagnetics, Self Potential measurements and Ground Penetrating Radar, have already been implemented. Soil as well as liquid samples have been collected for performing physico-chemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.

  3. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response of structural quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis, with fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
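
    As a rough illustration of why coupling a structural solver with fast probability integration beats brute-force sampling, the sketch below compares a Monte Carlo failure-probability estimate with the first-order result for a simple linear limit state with normal variables; the cantilever-strip geometry, load, and strength statistics are invented for illustration and are not the plate examples from the paper.

      # Illustrative sketch (not the MSC/NASTRAN coupling described above): contrast a
      # Monte Carlo failure-probability estimate with the analytic first-order result
      # for a linear limit state in normal variables, the regime where fast probability
      # integration needs only a handful of limit-state evaluations.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)

      # Hypothetical cantilever strip: bending stress = 6*F*L / (b*h**2)
      L, b, h = 1.0, 0.05, 0.01                  # geometry [m] (assumed)
      c = 6.0 * L / (b * h**2)                   # stress per unit tip load [Pa/N]
      mu_F, sd_F = 120.0, 15.0                   # tip load [N] (assumed)
      mu_S, sd_S = 250e6, 20e6                   # yield strength [Pa] (assumed)

      # Limit state g = S - c*F; failure when g < 0.
      # First-order result (exact here, since g is linear in normal variables).
      beta = (mu_S - c * mu_F) / np.hypot(sd_S, c * sd_F)
      pf_form = norm.cdf(-beta)

      # Brute-force Monte Carlo for comparison
      n = 2_000_000
      F = rng.normal(mu_F, sd_F, n)
      S = rng.normal(mu_S, sd_S, n)
      pf_mc = np.mean(S - c * F < 0.0)

      print(f"reliability index beta = {beta:.3f}")
      print(f"P_f (first order)      = {pf_form:.3e}")
      print(f"P_f (Monte Carlo)      = {pf_mc:.3e}")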

  4. Performance analysis of a fault inferring nonlinear detection system algorithm with integrated avionics flight data

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Godiwala, P. M.; Morrell, F. R.

    1985-01-01

    This paper presents the performance analysis results of a fault inferring nonlinear detection system (FINDS) using integrated avionics sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment. First, an overview of the FINDS algorithm structure is given. Then, aircraft state estimate time histories and statistics for the flight data sensors are discussed. This is followed by an explanation of modifications made to the detection and decision functions in FINDS to improve false alarm and failure detection performance. Next, the failure detection and false alarm performance of the FINDS algorithm are analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five minutes of flight data. Results indicate that the detection speed, failure level estimation, and false alarm performance show a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed is faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers. Finally, the progress in modifications of the FINDS algorithm design to accommodate flight computer constraints is discussed.
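
    The failure-injection evaluation described above can be pictured with a very small generic sketch: add a bias to one sensor channel and declare a failure when a windowed residual statistic crosses a k-sigma gate. This is not the FINDS algorithm; the sample rate, bias size, window, and threshold below are all assumptions.

      # Minimal generic illustration of the evaluation idea above: inject a bias failure
      # into a sensor signal and declare it when a windowed residual statistic crosses a
      # k-sigma gate. This is not the FINDS algorithm; sample rate, bias size, window,
      # and threshold are assumptions.
      import numpy as np

      rng = np.random.default_rng(1)

      fs = 20.0                                  # sample rate [Hz] (assumed)
      t = np.arange(0.0, 300.0, 1.0 / fs)        # five minutes of data
      truth = np.sin(2 * np.pi * 0.01 * t)       # slowly varying reference (e.g. filter estimate)
      noise_sd = 0.05
      meas = truth + rng.normal(0.0, noise_sd, t.size)

      t_fail = 150.0
      meas[t >= t_fail] += 0.3                   # inject a 0.3-unit bias failure

      window = int(2 * fs)                       # 2-second sliding window
      gate = 5.0 * noise_sd / np.sqrt(window)    # 5-sigma gate on the window mean
      residual = meas - truth

      detected_at = None
      for i in range(window, t.size):
          if abs(residual[i - window:i].mean()) > gate:
              detected_at = t[i]
              break

      status = f"declared at {detected_at:.2f} s" if detected_at is not None else "not detected"
      print(f"bias injected at {t_fail:.1f} s, {status}")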

  5. A Prototyping Effort for the Integrated Spacecraft Analysis System

    NASA Technical Reports Server (NTRS)

    Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre

    2011-01-01

    Computer modeling and simulation have recently become essential techniques for predicting and validating spacecraft performance. However, most computer models examine only individual spacecraft subsystems, and the independent nature of these models creates integration problems, limiting the ability to simulate a spacecraft as an integrated unit despite the demand for this type of analysis. A new project called Integrated Spacecraft Analysis was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype would help future developers assess design issues. The prototype explores a service-oriented design paradigm that, in principle, allows programs written in different languages to communicate with one another. It includes creating a uniform interface to the SPICE libraries such that different in-house tools like APGEN or SEQGEN can exchange information with it without much change. Service orientation may result in a slower system compared with a single application, and more research needs to be done on the different available technologies, but a service-oriented approach could increase long-term maintainability and extensibility.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cutler, Dylan; Frank, Stephen; Slovensky, Michelle

    Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.

  7. Integrated optical detection of autonomous capillary microfluidic immunoassays: a hand-held point-of-care prototype.

    PubMed

    Novo, P; Chu, V; Conde, J P

    2014-07-15

    The miniaturization of biosensors using microfluidics has the potential to enable point-of-care devices, with the added advantages of reduced time and cost of analysis and limits of detection comparable to those obtained through traditional laboratory techniques. Interfacing microfluidic devices with the external world can be difficult, especially for fluid handling and simple sample insertion that avoids special equipment or trained personnel. In this work we present a point-of-care prototype system that integrates capillary microfluidics with a microfabricated photodiode array and electronic instrumentation into a hand-held unit. The capillary microfluidic device is capable of autonomous and sequential fluid flow, including control of the average fluid velocity at any given point of the analysis. To demonstrate the functionality of the prototype, a model chemiluminescence ELISA was performed. The performance of the integrated optical detection in the point-of-care prototype is equal to that obtained with traditional bench-top instrumentation. The photodiode signals were acquired, displayed and processed by a simple graphical user interface on a computer connected to the microcontroller through USB. The prototype performed integrated chemiluminescence ELISA detection in about 15 min with a limit of detection of ≈2 nM for an antibody-antigen affinity constant of ≈2×10⁷ M⁻¹. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Navigation integrity monitoring and obstacle detection for enhanced-vision systems

    NASA Astrophysics Data System (ADS)

    Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter

    2001-08-01

    Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g., provided by differential GPS (DGPS). The reliability of the synthetic vision depends highly on both the accuracy of the underlying database and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible, and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar from EADS, Ulm, as the primary EV sensor. For integrity monitoring, objects extracted from radar images are registered against both database objects and objects (e.g., other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of the database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear; the outcome of this runway check also contributes to the integrity analysis. Concurrently, a radar-image-based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway. The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.

  9. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays the system performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
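
    For reference, the two agreement statistics quoted above can be computed as in the short sketch below; the observed/predicted arrays are invented placeholders, not data from the PWWT.VB study.

      # Sketch of the two agreement statistics quoted above: mean relative error and
      # Willmott's index of agreement (d). The observed/predicted arrays are invented
      # placeholders, not data from the PWWT.VB study.
      import numpy as np

      def willmott_d(obs, pred):
          """Willmott's index of agreement: 1 is perfect agreement, 0 is none."""
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          o_bar = obs.mean()
          num = np.sum((pred - obs) ** 2)
          den = np.sum((np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2)
          return 1.0 - num / den

      def mean_relative_error(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          return np.mean(np.abs(pred - obs) / np.abs(obs))

      observed  = np.array([12.1, 14.8, 18.2, 21.5, 25.3, 28.9])   # e.g. measured output
      predicted = np.array([11.6, 15.3, 17.8, 22.4, 24.7, 29.6])   # model-predicted output

      print(f"Willmott d           = {willmott_d(observed, predicted):.3f}")
      print(f"mean relative error  = {mean_relative_error(observed, predicted):.3f}")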

  10. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  11. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  12. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  13. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  14. 48 CFR 873.116 - Source selection decision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Source selection decision. (a) An integrated comparative assessment of proposals should be performed... source selection team, or advisory boards or panels, may conduct comparative analysis(es) of proposals...

  15. Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization

    PubMed Central

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2013-01-01

    Summary In cancer diagnosis studies, high-throughput gene profiling has been extensively conducted, searching for genes whose expressions may serve as markers. Data generated from such studies have the “large d, small n” feature, with the number of genes profiled much larger than the sample size. Penalization has been extensively adopted for simultaneous estimation and marker selection. Because of small sample sizes, markers identified from the analysis of single datasets can be unsatisfactory. A cost-effective remedy is to conduct integrative analysis of multiple heterogeneous datasets. In this article, we investigate composite penalization methods for estimation and marker selection in integrative analysis. The proposed methods use the minimax concave penalty (MCP) as the outer penalty. Under the homogeneity model, the ridge penalty is adopted as the inner penalty. Under the heterogeneity model, the Lasso penalty and MCP are adopted as the inner penalty. Effective computational algorithms based on coordinate descent are developed. Numerical studies, including simulation and analysis of practical cancer datasets, show satisfactory performance of the proposed methods. PMID:24578589
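
    The sketch below illustrates the minimax concave penalty (MCP) used as the outer penalty, together with its univariate thresholding rule under an orthonormal design; it shows the penalty only and is not the authors' composite coordinate-descent algorithm for multiple datasets.

      # Hedged sketch of the minimax concave penalty (MCP) and its univariate
      # thresholding rule (orthonormal design). Illustrates the building block only,
      # not the composite penalization algorithm from the article.
      import numpy as np

      def mcp_penalty(t, lam, gamma):
          """MCP value: lam*|t| - t^2/(2*gamma) for |t| <= gamma*lam, else gamma*lam^2/2."""
          t = np.abs(np.asarray(t, float))
          inner = lam * t - t**2 / (2.0 * gamma)
          flat = 0.5 * gamma * lam**2
          return np.where(t <= gamma * lam, inner, flat)

      def mcp_threshold(z, lam, gamma):
          """Univariate MCP solution of min_b 0.5*(z-b)^2 + MCP(b); requires gamma > 1."""
          z = np.asarray(z, float)
          soft = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
          shrunk = soft / (1.0 - 1.0 / gamma)
          return np.where(np.abs(z) <= gamma * lam, shrunk, z)

      z = np.linspace(-3, 3, 7)
      print(mcp_threshold(z, lam=1.0, gamma=3.0))   # large |z| is left unshrunk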

  16. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC): gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. Accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast, robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
Based on the gap analysis results, we have made the following recommendations for code selection and code development for the NEAMS Waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and (3) build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  17. What Performance Analysts Need to Know About Research Trends in Association Football (2012-2016): A Systematic Review.

    PubMed

    Sarmento, Hugo; Clemente, Filipe Manuel; Araújo, Duarte; Davids, Keith; McRobert, Allistair; Figueiredo, António

    2018-04-01

    Evolving patterns of match analysis research need to be systematically reviewed regularly since this area of work is burgeoning rapidly and studies can offer new insights to performance analysts if theoretically and coherently organized. The purpose of this paper was to conduct a systematic review of published articles on match analysis in adult male football, identify and organize common research topics, and synthesize the emerging patterns of work between 2012 and 2016, according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. The Web of Science database was searched for relevant published studies using the following keywords: 'football' and 'soccer', each one associated with the terms 'match analysis', 'performance analysis', 'notational analysis', 'game analysis', 'tactical analysis' and 'patterns of play'. Of 483 studies initially identified, 77 were fully reviewed and their outcome measures extracted and analyzed. Results showed that research mainly focused on (1) performance at set pieces, i.e. corner kicks, free kicks, penalty kicks; (2) collective system behaviours, captured by established variables such as team centroid (geometrical centre of a set of players) and team dispersion (quantification of how far players are apart), as well as tendencies for team communication (establishing networks based on passing sequences), sequential patterns (predicting future passing sequences), and group outcomes (relationships between match-related statistics and final match scores); and (3) activity profile of players, i.e. playing roles, effects of fatigue, substitutions during matches, and the effects of environmental constraints on performance, such as heat and altitude. From the previous review, novel variables were identified that require new measurement techniques. It is evident that the complexity engendered during performance in competitive soccer requires an integrated approach that considers multiple aspects. A challenge for researchers is to align these new measures with the needs of the coaches through a more integrated relationship between coaches and researchers, to produce practical and usable information that improves player performance and coach activity.
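
    The two collective-behaviour variables highlighted above, team centroid and team dispersion, reduce to a few lines of computation on tracking data; the player coordinates in the sketch below are hypothetical.

      # Quick sketch of two collective variables named in the review: the team centroid
      # (geometric centre of the outfield players) and team dispersion (here, the mean
      # distance of players from that centroid). Coordinates are hypothetical tracking
      # data in metres for a single frame.
      import numpy as np

      positions = np.array([          # (x, y) for ten outfield players
          [12.0, 30.1], [18.5, 22.4], [25.3, 35.0], [30.2, 28.7], [22.1, 40.3],
          [35.6, 33.2], [40.0, 25.5], [28.4, 18.9], [45.2, 30.0], [33.3, 44.1],
      ])

      centroid = positions.mean(axis=0)
      dispersion = np.linalg.norm(positions - centroid, axis=1).mean()

      print(f"team centroid   = ({centroid[0]:.1f}, {centroid[1]:.1f}) m")
      print(f"team dispersion = {dispersion:.1f} m")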

  18. Science Data Report for the Optical Properties Monitor (OPM) Experiment

    NASA Technical Reports Server (NTRS)

    Wilkes, Donald R.; Zwiener, James M.

    1999-01-01

    Long term stability of spacecraft materials when exposed to the space environment continues to be a major area of investigation. The natural and induced environment surrounding a spacecraft can decrease material performance and limit useful lifetimes. The Optical Properties Monitor (OPM) experiment provided the capability to perform the important flight testing of materials and was flown on the Russian Mir Station to study the long term effects of the natural and induced space environment on materials. The core of the OPM in-flight analysis was three independent optical instruments. These instruments included an integrating sphere spectral reflectometer, a vacuum ultraviolet spectrometer, and a Total Integrated Scatter instrument. The OPM also monitored selected components of the environment including molecular contamination. The OPM was exposed on the exterior of the Mir Docking Module for approximately 8-1/2 months. This report describes the OPM experiment, a brief background of its development, program organization, experiment description, mission overview including space environment definition, performance overview, materials data including flight and ground data, in-depth post flight analysis including ground analysis measurements and a summary discussion of the findings and results.

  19. Performance analysis of smart laminated composite plate integrated with distributed AFC material undergoing geometrically nonlinear transient vibrations

    NASA Astrophysics Data System (ADS)

    Shivakumar, J.; Ashok, M. H.; Khadakbhavi, Vishwanath; Pujari, Sanjay; Nandurkar, Santosh

    2018-02-01

    The present work focuses on geometrically nonlinear transient analysis of laminated smart composite plates integrated with patches of Active Fiber Composite (AFC) material, using Active Constrained Layer Damping (ACLD) treatment as the distributed actuators. The analysis has been carried out using a generalised energy-based finite element model. The coupled electromechanical finite element model is derived using Von Karman type nonlinear strain-displacement relations and a first-order shear deformation theory (FSDT). Eight-node iso-parametric serendipity elements are used for discretization of the overall plate integrated with the AFC patch material. The viscoelastic constrained layer is modelled using the GHM method. The numerical results show the improvement in the active damping characteristics of the laminated composite plates over passive damping for suppressing the geometrically nonlinear transient vibrations of laminated composite plates with AFC as the patch material.

  20. Assessment of Material Solutions of Multi-level Garage Structure Within Integrated Life Cycle Design Process

    NASA Astrophysics Data System (ADS)

    Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena

    2017-10-01

    The paper presents an environmental and economic analysis of the material solutions for a multi-level garage. The construction project approach considered a reinforced concrete structure built with either ordinary concrete or high-performance concrete (HPC). Using HPC allowed a significant reduction of reinforcement steel, mainly in compression elements (columns) of the structure. The analysis includes elements of the methodology of integrated life cycle design (ILCD). Through a multi-criteria analysis based on established weights for the economic and environmental parameters, three solutions were evaluated and compared within the material production phase (information modules A1-A3).
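
    A weighted multi-criteria comparison of the kind described above can be sketched as follows; the variants, indicator values, and weights are illustrative placeholders, not results from the study.

      # Sketch of a weighted multi-criteria comparison: each material variant is scored
      # on normalized economic and environmental indicators and ranked by a weighted
      # sum. All values and weights are illustrative placeholders.
      import numpy as np

      variants = ["ordinary concrete", "HPC", "HPC + reduced steel"]
      # columns: cost [EUR/m2], global warming potential [kg CO2-eq/m2]  (lower is better)
      criteria = np.array([
          [210.0, 145.0],
          [245.0, 130.0],
          [235.0, 118.0],
      ])
      weights = np.array([0.5, 0.5])            # assumed economic vs environmental weighting

      # Normalize so the best (lowest) value in each column scores 1.0
      scores = criteria.min(axis=0) / criteria
      ranking = scores @ weights

      for name, s in sorted(zip(variants, ranking), key=lambda p: -p[1]):
          print(f"{name:22s} weighted score = {s:.3f}")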

  1. Asymptotic/numerical analysis of supersonic propeller noise

    NASA Technical Reports Server (NTRS)

    Myers, M. K.; Wydeven, R.

    1989-01-01

    An asymptotic analysis based on the Mach surface structure of the field of a supersonic helical source distribution is applied to predict thickness and loading noise radiated by high-speed propeller blades. The theory utilizes an integral representation of the Ffowcs Williams-Hawkings equation in a fully linearized form. The asymptotic results are used for chordwise strips of the blade, while the required spanwise integrations are performed numerically. The form of the analysis enables predicted waveforms to be interpreted in terms of Mach surface propagation. A computer code developed to implement the theory is described and found to yield results in close agreement with more exact computations.

  2. ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction

    NASA Astrophysics Data System (ADS)

    Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd

    2017-11-01

    The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, improving engine efficiency compared with conventional carburetor-based engines. The ETCS is regarded as a safety-critical component, since an ETCS malfunction can cause unintended acceleration or deceleration events, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment (HARA), an ISO 26262 functional safety standard analysis, has been applied to a motorbike's ETCS to determine the required automotive safety integrity level (ASIL). Based on the analysis, the established automotive safety integrity level can help to derive technical and functional safety measures for ETCS development.
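
    The ASIL determination step of a HARA can be sketched as a lookup over the severity (S), exposure (E), and controllability (C) classes. The function below reproduces the widely cited pattern of the ISO 26262-3 table, one ASIL step per unit of S + E + C with any S0/E0/C0 case rated QM; the normative table in the standard governs real assessments, and the S/E/C classes chosen for the example hazard are assumptions.

      # Sketch of ASIL determination from a HARA classification. Reproduces the commonly
      # cited sum pattern of the ISO 26262-3 table; consult the standard for normative
      # use. The example hazard's S/E/C classes are assumptions.
      def asil(severity: int, exposure: int, controllability: int) -> str:
          """severity S0-S3, exposure E0-E4, controllability C0-C3 -> 'QM' or 'ASIL A'..'ASIL D'."""
          if min(severity, exposure, controllability) == 0:
              return "QM"
          total = severity + exposure + controllability
          return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}.get(total, "QM")

      # Example: unintended wide-open throttle at urban speed (assumed classification)
      print(asil(severity=3, exposure=4, controllability=2))   # -> ASIL C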

  3. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  4. Restoring a smooth function from its noisy integrals

    NASA Astrophysics Data System (ADS)

    Goulko, Olga; Prokof'ev, Nikolay; Svistunov, Boris

    2018-05-01

    Numerical (and experimental) data analysis often requires the restoration of a smooth function from a set of sampled integrals over finite bins. We present the bin hierarchy method that efficiently computes the maximally smooth function from the sampled integrals using essentially all the information contained in the data. We perform extensive tests with different classes of functions and levels of data quality, including Monte Carlo data suffering from a severe sign problem and physical data for the Green's function of the Fröhlich polaron.
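
    To make the problem statement concrete, the generic sketch below reconstructs a function from noisy bin integrals with an ordinary least-squares polynomial whose exact bin integrals are matched to the data; this is deliberately not the authors' bin hierarchy method, only the setup it solves.

      # Generic illustration of the problem above (restoring a smooth function from noisy
      # bin integrals) using a least-squares polynomial whose exact bin integrals are
      # fitted to the data. NOT the bin hierarchy method; it only makes the setup concrete.
      import numpy as np
      from scipy.integrate import quad

      rng = np.random.default_rng(2)

      def true_f(x):
          return np.exp(-x) * np.sin(3 * x)

      # Noisy sampled integrals over 20 equal bins on [0, 2]
      edges = np.linspace(0.0, 2.0, 21)
      a, b = edges[:-1], edges[1:]
      exact = np.array([quad(true_f, lo, hi)[0] for lo, hi in zip(a, b)])
      data = exact + rng.normal(0.0, 0.002, exact.size)

      # Model f(x) = sum_k c_k x**k; its bin integrals are analytic in the coefficients
      deg = 6
      A = np.stack([(b**(k + 1) - a**(k + 1)) / (k + 1) for k in range(deg + 1)], axis=1)
      coef, *_ = np.linalg.lstsq(A, data, rcond=None)

      x = np.linspace(0.0, 2.0, 5)
      recon = sum(c * x**k for k, c in enumerate(coef))
      print(np.round(recon, 3))        # reconstructed values
      print(np.round(true_f(x), 3))    # true values for comparison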

  5. Systems and methods for integrating ion mobility and ion trap mass spectrometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Yehia M.; Garimella, Sandilya; Prost, Spencer A.

    Described herein are examples of systems and methods for integrating IMS and MS systems. In certain examples, systems and methods for decoding double multiplexed data are described. The systems and methods can also perform multiple refining procedures in order to minimize the demultiplexing artifacts. The systems and methods can be used, for example, for the analysis of proteomic and petroleum samples, where the integration of IMS and high mass resolution are used for accurate assignment of molecular formulae.

  6. Is vertical integration adding value to health systems?

    PubMed

    Weil, T P

    2000-04-01

    Vertical integration is a concept used by health systems when attempting to achieve economies of scale, greater coordination of services, and improved market penetration. This article focuses on the actual outcomes of utilizing vertical integration in the health field and then compares these findings with those reported in other industries. This analysis concludes that this organizational model does not work particularly well in the health industry, as illustrated by health alliances' poor fiscal performance when they acquire physician practices or when they start their own HMO plans.

  7. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    PubMed

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
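
    The core operation described above, correlating paired copy number and expression gene by gene, can be sketched as follows on synthetic data with a crude permutation cutoff; this mimics the idea only and is not the DR-Integrator package itself.

      # Small sketch of per-gene Pearson correlation between paired DNA copy number and
      # gene expression across samples, with a crude permutation cutoff. Synthetic data;
      # not the DR-Integrator package.
      import numpy as np

      rng = np.random.default_rng(3)
      n_genes, n_samples = 500, 40

      copy_number = rng.normal(0.0, 0.5, (n_genes, n_samples))
      expression = rng.normal(0.0, 1.0, (n_genes, n_samples))
      expression[:25] += 1.5 * copy_number[:25]      # first 25 genes are dosage-driven

      def rowwise_pearson(a, b):
          a = a - a.mean(axis=1, keepdims=True)
          b = b - b.mean(axis=1, keepdims=True)
          return (a * b).sum(axis=1) / np.sqrt((a**2).sum(axis=1) * (b**2).sum(axis=1))

      r = rowwise_pearson(copy_number, expression)

      # Null distribution: correlations after permuting the expression sample labels
      perm = rowwise_pearson(copy_number, expression[:, rng.permutation(n_samples)])
      cutoff = np.quantile(np.abs(perm), 0.999)

      hits = np.flatnonzero(np.abs(r) > cutoff)
      print(f"cutoff |r| > {cutoff:.3f}; {hits.size} candidate dosage-driven genes")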

  8. Geographic integration of hepatitis C virus: A global threat

    PubMed Central

    Daw, Mohamed A; El-Bouzedi, Abdallah A; Ahmed, Mohamed O; Dau, Aghnyia A; Agnan, Mohamed M; Drah, Aisha M

    2016-01-01

    AIM To assess hepatitis C virus (HCV) geographic integration, evaluate the spatial and temporal evolution of HCV worldwide and propose how to diminish its burden. METHODS A literature search of published articles was performed using PubMed, MEDLINE and other related databases up to December 2015. A critical data assessment and analysis regarding the epidemiological integration of HCV was carried out using the meta-analysis method. RESULTS The data indicated that HCV has been integrated immensely over time and through various geographical regions worldwide. The history of HCV goes back to 1535 but between 1935 and 1965 it exhibited a rapid, exponential spread. This integration is clearly seen in the geo-epidemiology and phylogeography of HCV. HCV integration can be mirrored either as intra-continental or trans-continental. Migration, drug trafficking and HCV co-infection, together with other potential risk factors, have acted as a vehicle for this integration. Evidence shows that the geographic integration of HCV has been important in the global and regional distribution of HCV. CONCLUSION HCV geographic integration is clearly evident and this should be reflected in the prevention and treatment of this ongoing pandemic. PMID:27878104

  9. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  10. Space Station Environment Control and Life Support System Pressure Control Pump Assembly Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Schunk, R. Gregory

    2002-01-01

    This paper presents the Modeling and Analysis of the Space Station Environment Control and Life Support System Pressure Control Pump Assembly (PCPA). The contents include: 1) Integrated PCPA/Manifold Analyses; 2) Manifold Performance Analysis; 3) PCPA Motor Heat Leak Study; and 4) Future Plans. This paper is presented in viewgraph form.

  11. Functional integration of PCR amplification and capillary electrophoresis in a microfabricated DNA analysis device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolley, A.T.; deMello, A.J.; Mathies, R.A.

    Microfabricated silicon PCR reactors and glass capillary electrophoresis (CE) chips have been successfully coupled to form an integrated DNA analysis system. This construct combines the rapid thermal cycling capabilities of microfabricated PCR devices (10 °C/s heating, 2.5 °C/s cooling) with the high-speed (<120 s) DNA separations provided by microfabricated CE chips. The PCR chamber and the CE chip were directly linked through a photolithographically fabricated channel filled with hydroxyethylcellulose sieving matrix. Electrophoretic injection directly from the PCR chamber through the cross injection channel was used as an 'electrophoretic valve' to couple the PCR and CE devices on-chip. To demonstrate the functionality of this system, a 15 min PCR amplification of a β-globin target cloned in M13 was immediately followed by high-speed CE chip separation in under 120 s, providing a rapid PCR-CE analysis in under 20 min. A rapid assay for genomic Salmonella DNA was performed in under 45 min, demonstrating that challenging amplifications of diagnostically interesting targets can also be performed. Real-time monitoring of PCR target amplification in these integrated PCR-CE devices is also feasible. 33 refs., 6 figs.

  12. The Integration of Nutrition Education in the Basic Biomedical Sciences

    ERIC Educational Resources Information Center

    Raw, Isaias

    1977-01-01

    At the Center for Biomedical Education at the City University of New York, nutrition is integrated into the chemistry-biochemistry sequence of a six-year B.S.-M.D. program. Students perform an actual analysis of a sample of their own food, learning basic techniques and concepts, and also carry on experiments with rats on other diets. (Editor/LBH)

  13. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, R. E.

    1995-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements and crew rotation via spreadsheets provides adequate benchmarks to assess cargo vehicle design and performance characteristics.

  14. The Effects of Training and Performance Feedback during Behavioral Consultation on General Education Middle School Teachers' Integrity to Functional Analysis Procedures

    ERIC Educational Resources Information Center

    McKenney, Elizabeth L. W.; Waldron, Nancy; Conroy, Maureen

    2013-01-01

    This study describes the integrity with which 3 general education middle school teachers implemented functional analyses (FA) of appropriate behavior for students who typically engaged in disruption. A 4-step model consistent with behavioral consultation was used to support the assessment process. All analyses were conducted during ongoing…

  15. Students' Academic Performance and Various Cognitive Processes of Learning: An Integrative Framework and Empirical Analysis

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2010-01-01

    The main aim of this study is to test a conceptualised framework that involved the integration of achievement goals, self-efficacy and self-esteem beliefs, and study-processing strategies. Two hundred and ninety (178 females, 112 males) first-year university students were administered a number of Likert-scale inventories in tutorial classes. Data…

  16. Identification of pathogenic genes related to rheumatoid arthritis through integrated analysis of DNA methylation and gene expression profiling.

    PubMed

    Zhang, Lei; Ma, Shiyun; Wang, Huailiang; Su, Hang; Su, Ke; Li, Longjie

    2017-11-15

    The purpose of our study was to identify new pathogenic genes used for exploring the pathogenesis of rheumatoid arthritis (RA). To screen pathogenic genes of RA, an integrated analysis was performed by using the microarray datasets in RA derived from the Gene Expression Omnibus (GEO) database. The functional annotation and potential pathways of differentially expressed genes (DEGs) were further discovered by Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analysis. Afterwards, the integrated analysis of DNA methylation and gene expression profiling was used to screen crucial genes. In addition, we used RT-PCR and MSP to verify the expression levels and methylation status of these crucial genes in 20 synovial biopsy samples obtained from 10 RA model mice and 10 normal mice. BCL11B, CCDC88C, FCRLA and APOL6 were both up-regulated and hypomethylated in RA according to integrated analysis, RT-PCR and MSP verification. Four crucial genes (BCL11B, CCDC88C, FCRLA and APOL6) identified and analyzed in this study might be closely connected with the pathogenesis of RA. Copyright © 2017. Published by Elsevier B.V.
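
    The screening step described above amounts to intersecting the up-regulated genes with the hypomethylated genes; apart from the four genes named in the abstract, the lists in the sketch below are placeholders, not data from the study.

      # Sketch of the screening step: nominate genes that are both up-regulated and
      # hypomethylated. Apart from the four genes named in the abstract, the lists are
      # placeholders, not data from the study.
      up_regulated   = {"BCL11B", "CCDC88C", "FCRLA", "APOL6", "GENE_X", "GENE_Y"}
      hypomethylated = {"BCL11B", "CCDC88C", "FCRLA", "APOL6", "GENE_Z"}

      candidates = sorted(up_regulated & hypomethylated)
      print("candidate pathogenic genes:", ", ".join(candidates))
      # -> APOL6, BCL11B, CCDC88C, FCRLA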

  17. Integrated Safety Analysis Teams

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jonathan C.

    2008-01-01

    Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems, which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience. Safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a hazard of low consequence to one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch space vehicle case will also be discussed.

  18. Blog-Integrated Writing with Blog-Buddies: EAP Learners' Writing Performance

    ERIC Educational Resources Information Center

    Asoodar, Maryam; Atai, Mahmood Reza; Vaezi, Shahin

    2016-01-01

    This article reports a mixed-method research probing the effect of utilizing a blog-buddy system on English for academic purposes learners' writing performance. Sixty Iranian undergraduate engineering students at Iran University of Science and Technology Virtual Campus participated in this study. Our analysis of the students' writings indicated…

  19. Project Integration Architecture (PIA) and Computational Analysis Programming Interface (CAPRI) for Accessing Geometry Data from CAD Files

    NASA Technical Reports Server (NTRS)

    Benyo, Theresa L.

    2002-01-01

    Integration of a supersonic inlet simulation with a computer aided design (CAD) system is demonstrated. The integration is performed using the Project Integration Architecture (PIA). PIA provides a common environment for wrapping many types of applications. Accessing geometry data from CAD files is accomplished by incorporating appropriate function calls from the Computational Analysis Programming Interface (CAPRI). CAPRI is a CAD-vendor-neutral programming interface that aids in acquiring geometry data directly from CAD files. The benefits of wrapping a supersonic inlet simulation into PIA using CAPRI are: direct access to geometry data, accurate capture of geometry data, automatic conversion of data units, CAD-vendor-neutral operation, and on-line interactive history capture. This paper describes the PIA and the CAPRI wrapper and details the supersonic inlet simulation demonstration.

  20. Space processing applications payload equipment study. Volume 2B: Payload interface analysis (power/thermal/electromagnetic compatibility)

    NASA Technical Reports Server (NTRS)

    Hammel, R. L. (Editor); Smith, A. G. (Editor)

    1974-01-01

    As a part of the task of performing preliminary engineering analysis of modular payload subelement/host vehicle interfaces, a subsystem interface analysis was performed to establish the integrity of the modular approach to equipment design and integration. Salient areas selected for analysis were power and power conditioning, heat rejection, and electromagnetic compatibility (EMC). The equipment and load profiles for twelve representative experiments were identified. Two of the twelve experiments were chosen as being representative of the group and have been described in greater detail to illustrate the evaluations used in the analysis. The shuttle orbiter will provide electrical power from its three fuel cells in support of the orbiter and the Spacelab operations. One of the three shuttle orbiter fuel cells will be dedicated to the Spacelab electrical power requirements during normal shuttle operation. This power supplies the Spacelab subsystems, and the excess will be available to the payload. The current Spacelab subsystem requirements result in a payload allocation of 4.0 to 4.8 kW average (24 hour/day) and 9.0 kW peak for 15 minutes.

  1. The Integrated Radiation Mapper Assistant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlton, R.E.; Tripp, L.R.

    1995-03-01

    The Integrated Radiation Mapper Assistant (IRMA) system combines state-of-the-art radiation sensors and microprocessor-based analysis techniques to perform radiation surveys. Control of the survey function is from a control station located outside the radiation area, thus reducing the time spent in radiation areas performing surveys. The system consists of a directional radiation sensor, a laser range finder, two area radiation sensors, and a video camera mounted on a pan-and-tilt platform. This sensor package is deployable on a remotely operated vehicle. The outputs of the system are radiation intensity maps identifying both radiation source intensities and radiation levels throughout the room being surveyed. After completion of the survey, the data can be removed from the control station computer for further analysis or archiving.

  2. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O³) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O³ tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O³ tool. Instructions for preparing input data for the O³ tool are detailed in this user's manual.

  3. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  4. Nuclear Analysis

    NASA Technical Reports Server (NTRS)

    Clement, J. D.; Kirby, K. D.

    1973-01-01

    Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.

  5. Performance of an exhaled nitric oxide and carbon dioxide sensor using quantum cascade laser-based integrated cavity output spectroscopy.

    PubMed

    McCurdy, Matthew R; Bakhirkin, Yury; Wysocki, Gerard; Tittel, Frank K

    2007-01-01

    Exhaled nitric oxide (NO) is an important biomarker in asthma and other respiratory disorders. The optical performance of a NO/CO₂ sensor employing integrated cavity output spectroscopy (ICOS) with a quantum cascade laser operating at 5.22 μm, capable of real-time NO and CO₂ measurements in a single breath cycle, is reported. A NO noise-equivalent concentration of 0.4 ppb within a 1-sec integration time is achieved. The off-axis ICOS sensor performance is compared to a chemiluminescent NO analyzer and a nondispersive infrared (NDIR) CO₂ absorption capnograph. Differences between the gas analyzers are assessed by the Bland-Altman method to estimate the expected variability between the gas sensors. The off-axis ICOS sensor measurements are in good agreement with the data acquired with the two commercial gas analyzers. This work demonstrates the performance characteristics and merits of mid-infrared spectroscopy for exhaled breath analysis.
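
    The Bland-Altman comparison mentioned above reduces to the bias (mean difference) and 95% limits of agreement between paired readings; the NO values in the sketch below are invented for illustration.

      # Sketch of a Bland-Altman comparison: bias (mean difference) and 95% limits of
      # agreement between paired NO readings from two analyzers. The readings below are
      # invented, not data from the study.
      import numpy as np

      icos  = np.array([14.2, 22.5, 31.0, 18.7, 45.3, 27.9, 36.4, 50.1])  # ppb
      chemi = np.array([15.0, 21.8, 32.4, 19.5, 44.1, 29.0, 35.2, 51.6])  # ppb

      diff = icos - chemi
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)

      print(f"bias = {bias:+.2f} ppb")
      print(f"95% limits of agreement: {bias - loa:+.2f} to {bias + loa:+.2f} ppb")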

  6. Design and Analysis of Enhanced Modulation Response in Integrated Coupled Cavities DBR Lasers Using Photon-Photon Resonance

    DOE PAGES

    Bardella, Paolo; Chow, Weng; Montrosset, Ivo

    2016-01-08

    In recent decades, various solutions have been proposed to increase the modulation bandwidth, and consequently the transmission bit rate, of integrated semiconductor lasers. In this manuscript we discuss a design procedure for a recently proposed laser structure realized with the integration of two DBR lasers. Design guidelines are proposed, and dynamic small- and large-signal simulations, calculated using a Finite Difference Traveling Wave numerical simulator, are performed to confirm the design results and the effectiveness of the analyzed integrated configuration in achieving a direct modulation bandwidth of up to 80 GHz.

  7. Thermal integration of Spacelab experiments

    NASA Technical Reports Server (NTRS)

    Patterson, W. C.; Hopson, G. D.

    1978-01-01

    The method of thermally integrating the experiments for Spacelab is discussed. The scientific payload consists of a combination of European and United States sponsored experiments located in the module as well as on a single Spacelab pallet. The thermal integration must result in accommodating the individual experiment requirements as well as ensuring that the total payload is within the Spacelab Environmental Control System (ECS) resource capability. An integrated thermal/ECS analysis of the module and pallet is performed in concert with the mission timeline to ensure that the agreed-upon experiment requirements are accommodated and that the total payload is within the Spacelab ECS resources.

  8. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org .
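
    The parallelization idea, distributing replicate runs for several values of K across worker processes, can be sketched generically as below; this is not StrAuto's own code, and the command line is a placeholder for however STRUCTURE is invoked on a given system.

      # Generic sketch of distributing replicate runs (several values of K, several
      # replicates each, as the Evanno method requires) across worker processes. Not
      # StrAuto's code; the command, flags, and output paths are placeholders.
      import itertools
      import subprocess
      from multiprocessing import Pool

      K_VALUES = range(1, 6)       # candidate numbers of clusters
      REPLICATES = range(1, 4)     # replicate runs per K

      def run_one(job):
          k, rep = job
          out = f"results/K{k}_rep{rep}"                 # placeholder output path
          cmd = ["structure", "-K", str(k), "-o", out]   # placeholder invocation
          subprocess.run(cmd, check=True)
          return out

      if __name__ == "__main__":
          jobs = list(itertools.product(K_VALUES, REPLICATES))
          with Pool(processes=4) as pool:
              for finished in pool.imap_unordered(run_one, jobs):
                  print("done:", finished)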

  9. High throughput gene expression profiling: a molecular approach to integrative physiology

    PubMed Central

    Liang, Mingyu; Cowley, Allen W; Greene, Andrew S

    2004-01-01

    Integrative physiology emphasizes the importance of understanding multiple pathways with overlapping, complementary, or opposing effects and their interactions in the context of intact organisms. The DNA microarray technology, the most commonly used method for high-throughput gene expression profiling, has been touted as an integrative tool that provides insights into regulatory pathways. However, the physiology community has been slow in acceptance of these techniques because of early failure in generating useful data and the lack of a cohesive theoretical framework in which experiments can be analysed. With recent advances in both technology and analysis, we propose a concept of multidimensional integration of physiology that incorporates data generated by DNA microarray and other functional, genomic, and proteomic approaches to achieve a truly integrative understanding of physiology. Analysis of several studies performed in simpler organisms or in mammalian model animals supports the feasibility of such multidimensional integration and demonstrates the power of DNA microarray as an indispensable molecular tool for such integration. Evaluation of DNA microarray techniques indicates that these techniques, despite limitations, have advanced to a point where the question-driven profiling research has become a feasible complement to the conventional, hypothesis-driven research. With a keen sense of homeostasis, global regulation, and quantitative analysis, integrative physiologists are uniquely positioned to apply these techniques to enhance the understanding of complex physiological functions. PMID:14678487

  10. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models were applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consist of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  11. Human Factors Virtual Analysis Techniques for NASA's Space Launch System Ground Support using MSFC's Virtual Environments Lab (VEL)

    NASA Technical Reports Server (NTRS)

    Searcy, Brittani

    2017-01-01

    Using virtual environments to assess complex, large-scale human tasks provides timely and cost-effective results to evaluate designs and to reduce operational risks during assembly and integration of the Space Launch System (SLS). NASA's Marshall Space Flight Center (MSFC) uses a suite of tools to conduct integrated virtual analysis during the design phase of the SLS Program. Siemens Jack is a simulation tool that allows engineers to analyze human interaction with CAD designs by placing a digital human model into the environment to test different scenarios and assess the design's compliance with human factors requirements. Engineers at MSFC are using Jack in conjunction with motion capture and virtual reality systems in MSFC's Virtual Environments Lab (VEL). The VEL provides additional capability beyond standalone Jack to record and analyze a person performing a planned task to assemble the SLS at Kennedy Space Center (KSC). The VEL integrates the Vicon Blade motion capture system, Siemens Jack, Oculus Rift, and other virtual tools to perform human factors assessments. By using motion capture and virtual reality, a more accurate breakdown and understanding of how an operator will perform a task can be gained. Through virtual analysis, engineers are able to determine whether a specific task can be safely performed by both a 5th-percentile (approx. 5 ft) female and a 95th-percentile (approx. 6 ft 1 in) male. In addition, the analysis helps identify any tools or other accommodations that may help complete the task. These assessments are critical for the safety of ground support engineers and for keeping launch operations on schedule. Motion capture allows engineers to save and examine human movements on a frame-by-frame basis, while virtual reality gives the actor (the person performing a task in the VEL) an immersive view of the task environment. This presentation will discuss the need for human factors analysis for SLS and the benefits of analyzing tasks in NASA MSFC's VEL.

  12. A two-dimensional cascade solution using minimized surface singularity density distributions - with application to film cooled turbine blades

    NASA Technical Reports Server (NTRS)

    Mcfarland, E.; Tabakoff, W.; Hamed, A.

    1977-01-01

    An investigation of the effects of coolant injection on the aerodynamic performance of cooled turbine blades is presented. The coolant injection is modeled in the inviscid irrotational adiabatic flow analysis through the cascade using the distributed singularities approach. The resulting integral equations are solved using a minimized surface singularity density criterion. The aerodynamic performance was evaluated using this solution in conjunction with an existing mixing theory analysis. The results of the present analysis are compared with experimental measurements in cold flow tests.

  13. Study and Analyses on the Structural Performance of a Balance

    NASA Technical Reports Server (NTRS)

    Karkehabadi, R.; Rhew, R. D.; Hope, D. J.

    2004-01-01

    Strain-gauge balances for use in wind tunnels have been designed at Langley Research Center (LaRC) since its inception. Currently, Langley has more than 300 balances available for its researchers. A force balance is inherently a critically stressed component due to the requirements of measurement sensitivity. The strain-gauge balances have been used in Langley's wind tunnels for a wide variety of aerodynamic tests, and the designs encompass a large array of sizes, loads, and environmental effects. A balance must measure six degrees of freedom, and this measurement task has introduced challenging work in transducer development technology. As the emphasis increases on improving aerodynamic performance of all types of aircraft and spacecraft, the demand for improved balances is at the forefront. Force balance stress analysis and acceptance criteria are under review due to LaRC wind tunnel operational safety requirements. This paper presents some of the analyses and research done at LaRC that influence the structural integrity of the balances. The analyses are helpful in understanding the overall behavior of existing balances and can be used in the design of new balances to enhance performance. Initially, a maximum load combination was used for a linear structural analysis. When nonlinear effects were encountered, the analysis was extended to include nonlinearities using MSC.Nastran. Because most of the balances are designed using Pro/Mechanica, it is desirable and efficient to use Pro/Mechanica for stress analysis. However, Pro/Mechanica is limited to linear analysis. Both Pro/Mechanica and MSC.Nastran are used for analyses in the present work. The structural integrity of balances and the possibility of modifying existing balances to enhance structural integrity are investigated.

  14. The Role of Leadership and Peer Behaviors in the Performance and Well-Being of Women in Combat: Historical Perspectives, Unit Integration, and Family Issues.

    PubMed

    Segal, Mady Wechsler; Smith, David G; Segal, David R; Canuso, Amy A

    2016-01-01

    This article analyzes how the behaviors of leaders and peers affect the performance and well-being of military women. Locating our analysis within the conceptual model in this issue, we summarize the empirical literature and make practice and policy recommendations. We synthesize results about unit integration, such as research on the conditions for successful integration of previously excluded groups and on the relationship between cohesion and performance. We apply lessons learned from the history of diversity integration in military and civilian organizations, analyzing the treatment of military personnel by race, gender, and sexual orientation. The opening of ground combat specialties and units to women is the latest step in personnel policy changes broadening the recruitment base. We analyze research on gender integration in contemporary armed forces, focusing on positive and negative effects on women of leader and peer behaviors. We discuss conditions for successfully integrating women and those that tend to lead to failure. We analyze military women's family issues, including the effects of deployments and how leaders and peers can help ameliorate problems, or exacerbate them with inappropriate or unsupportive behavior. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.

  15. Management practices and performance of mergers and acquisitions in Pakistan: mediating role of psychological contract.

    PubMed

    Bari, Muhammad Waseem; Fanchen, Meng; Baloch, Muhammad Awais

    2016-01-01

    The objective of this study is to examine the direct and indirect effects of management practices (procedural justice, coordination approach, communication system, integration strategy, and coping programs) on merger and acquisition (M&A) performance in the Pakistan banking industry. Psychological contract (PC) acts as a mediator between management practices and M&A performance. The present study distributed a structured questionnaire to 700 bank employees of different management cadres. The useful response rate is 76% (536 employees). It uses the PLS-SEM technique for data analysis. (1) Procedural justice is a key strategy with highly significant direct and indirect effects on M&A performance, whereas integration strategy and the communication system have only direct effects. (2) PC partially mediates, at different levels, between management practices and M&A financial and non-financial performance. This study provides an effective solution to the soft issues during and after the M&A process. This is one of the few studies that effectively integrates the five constructs into a single framework to study their effects on M&A performance. Limitations and future research directions are presented in the last section of the study.
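
    The mediation finding above is obtained with PLS-SEM in the study itself; purely as an illustration of what testing an indirect (mediated) effect involves, the sketch below bootstraps the indirect effect with ordinary least squares on simulated data. All variable names and coefficients are invented, and this is not the estimation procedure used in the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 536
        # Simulated survey scores: practices -> psychological contract (mediator) -> performance
        practices = rng.normal(size=n)
        pc = 0.5 * practices + rng.normal(size=n)
        performance = 0.3 * practices + 0.4 * pc + rng.normal(size=n)

        def path_a(x, m):
            """Slope of mediator on predictor (the 'a' path)."""
            return np.polyfit(x, m, 1)[0]

        def path_b(m, x, y):
            """Slope of outcome on mediator, controlling for predictor (the 'b' path)."""
            X = np.column_stack([m, x, np.ones(len(y))])
            return np.linalg.lstsq(X, y, rcond=None)[0][0]

        # Nonparametric bootstrap of the indirect effect a*b
        boot = []
        for _ in range(2000):
            idx = rng.integers(0, n, n)
            boot.append(path_a(practices[idx], pc[idx]) *
                        path_b(pc[idx], practices[idx], performance[idx]))
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # CI excluding 0 suggests mediation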

  16. Integrating multiple immunogenetic data sources for feature extraction and mining somatic hypermutation patterns: the case of "towards analysis" in chronic lymphocytic leukaemia.

    PubMed

    Kavakiotis, Ioannis; Xochelli, Aliki; Agathangelidis, Andreas; Tsoumakas, Grigorios; Maglaveras, Nicos; Stamatopoulos, Kostas; Hadzidimitriou, Anastasia; Vlahavas, Ioannis; Chouvarda, Ioanna

    2016-06-06

    Somatic Hypermutation (SHM) refers to the introduction of mutations within rearranged V(D)J genes, a process that increases the diversity of Immunoglobulins (IGs). The analysis of SHM has offered critical insight into the physiology and pathology of B cells, leading to strong prognostication markers for clinical outcome in chronic lymphocytic leukaemia (CLL), the most frequent adult B-cell malignancy. In this paper we present a methodology for integrating multiple immunogenetic and clinicobiological data sources in order to extract features and create high-quality datasets for SHM analysis in IG receptors of CLL patients. This dataset is used as the basis for a higher-level integration procedure, inspired by social choice theory. This is applied in the Towards Analysis, our attempt to investigate the potential ontogenetic transformation of genes belonging to specific stereotyped CLL subsets towards other genes or gene families, through SHM. The data integration process, followed by feature extraction, resulted in the generation of a dataset containing information about mutations occurring through SHM. The Towards analysis, performed on the integrated dataset using voting techniques, revealed the distinct behaviour of subset #201 compared to other subsets with regard to SHM-related movements among gene clans, both in allele-conserved and non-conserved gene areas. With respect to movement between genes, a high percentage of movements towards pseudogenes was found in all CLL subsets. This data integration and feature extraction process can set the basis for exploratory analysis or a fully automated computational data mining approach on many as yet unanswered, clinically relevant biological questions.
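
    The abstract does not spell out which voting scheme from social choice theory is used; the Borda count below is one common choice and is shown only to make the idea of aggregating several rankings concrete. The clan labels and rankings are invented.

        from collections import Counter

        # Invented rankings of candidate target gene clans from three independent analyses
        rankings = [
            ["clan_I", "clan_III", "clan_II"],
            ["clan_I", "clan_II", "clan_III"],
            ["clan_III", "clan_I", "clan_II"],
        ]

        # Borda count: a candidate receives (n - position) points from each ranking
        scores = Counter()
        for ranking in rankings:
            n = len(ranking)
            for position, candidate in enumerate(ranking):
                scores[candidate] += n - position

        consensus = [candidate for candidate, _ in scores.most_common()]
        print("consensus ordering:", consensus)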

  17. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  18. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  19. Numerical integration and optimization of motions for multibody dynamic systems

    NASA Astrophysics Data System (ADS)

    Aguilar Mayans, Joan

    This thesis considers the optimization and simulation of motions involving rigid body systems. It does so in three distinct parts, with the following topics: optimization and analysis of human high-diving motions, efficient numerical integration of rigid body dynamics with contacts, and motion optimization of a two-link robot arm using Finite-Time Lyapunov Analysis. The first part introduces the concept of eigenpostures, which we use to simulate and analyze human high-diving motions. Eigenpostures are used in two different ways: first, to reduce the complexity of the optimal control problem that we solve to obtain such motions, and second, to generate an eigenposture space to which we map existing real-world motions to better analyze them. The benefits of using eigenpostures are showcased through different examples. The second part reviews an extensive list of integration algorithms used for the integration of rigid body dynamics. We analyze the accuracy and stability of the different integrators in three-dimensional space and in the rotation space SO(3). Integrators with an accuracy higher than first order perform more efficiently than integrators with first-order accuracy, even in the presence of contacts. The third part uses Finite-Time Lyapunov Analysis to optimize motions for a two-link robot arm. Finite-Time Lyapunov Analysis diagnoses the presence of time-scale separation in the dynamics of the optimized motion and provides the information and methodology for obtaining an accurate approximation to the optimal solution, avoiding the complications that timescale separation causes for alternative solution methods.
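
    As a loose illustration of the structure-preservation issue studied in the second part, the sketch below contrasts a first-order Euler update of a rotation matrix with a Lie-group update based on the exponential map; the angular velocity, step size, and error metric are chosen arbitrarily for the example and are not taken from the thesis.

        import numpy as np

        def hat(w):
            """Skew-symmetric (so(3)) matrix of an angular velocity vector."""
            return np.array([[0.0, -w[2], w[1]],
                             [w[2], 0.0, -w[0]],
                             [-w[1], w[0], 0.0]])

        def expm_so3(w, dt):
            """Rodrigues' formula: exact exponential map from so(3) to SO(3)."""
            theta = np.linalg.norm(w) * dt
            if theta < 1e-12:
                return np.eye(3)
            k = hat(w / np.linalg.norm(w))
            return np.eye(3) + np.sin(theta) * k + (1.0 - np.cos(theta)) * (k @ k)

        w = np.array([0.3, 1.0, -0.2])   # rad/s, constant body-frame rate (illustration only)
        dt, steps = 0.01, 1000
        R_euler, R_lie = np.eye(3), np.eye(3)
        for _ in range(steps):
            R_euler = R_euler + R_euler @ hat(w) * dt   # first-order update: drifts off SO(3)
            R_lie = R_lie @ expm_so3(w, dt)             # Lie-group update: stays on SO(3)

        # Orthogonality error measures how far each result has left the rotation group
        print("Euler orthogonality error:    ", np.linalg.norm(R_euler.T @ R_euler - np.eye(3)))
        print("Lie-group orthogonality error:", np.linalg.norm(R_lie.T @ R_lie - np.eye(3)))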

  20. Integrative analysis of gene expression and DNA methylation using unsupervised feature extraction for detecting candidate cancer biomarkers.

    PubMed

    Moon, Myungjin; Nakai, Kenta

    2018-04-01

    Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment of cancer. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and insufficient information in limited datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extraction to identify candidate biomarkers of cancer using renal cell carcinoma RNA-seq datasets. Gene expression and DNA methylation datasets are normalized by Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets by unsupervised feature extraction methods, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance as compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be expanded for applications involving various types of multi-omics datasets.
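
    A minimal sketch of the normalization-plus-integration idea described above, assuming Box-Cox normalization per gene and a plain principal-component projection as the unsupervised feature extraction step (the study's extraction method may differ); the matrices are random stand-ins, not the renal cell carcinoma data.

        import numpy as np
        from scipy.stats import boxcox
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        # Random stand-ins: genes x samples matrices for expression and methylation
        expression = rng.gamma(shape=2.0, scale=3.0, size=(500, 20)) + 1e-6
        methylation = rng.beta(2.0, 5.0, size=(500, 20)) + 1e-6

        def boxcox_rows(matrix):
            """Apply a Box-Cox transform to each gene (row) independently."""
            return np.vstack([boxcox(row)[0] for row in matrix])

        combined = np.hstack([boxcox_rows(expression), boxcox_rows(methylation)])

        # Project every gene onto the first principal component, giving a one-dimensional
        # integrated profile that retains the dominant shared variation of both datasets.
        integrated = PCA(n_components=1).fit_transform(combined).ravel()

        # Rank genes by the magnitude of the integrated score as a crude candidate screen
        top_genes = np.argsort(-np.abs(integrated))[:10]
        print("top-ranked gene indices:", top_genes)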

  1. Participation and integration from the perspective of persons with spinal cord injury from five European countries.

    PubMed

    Ruoranen, Kaisa; Post, Marcel W M; Juvalta, Sibylle; Reinhardt, Jan D

    2015-03-01

    To examine the subjective understanding of participation and integration of persons with spinal cord injuries from 5 European countries and to compare these findings with the International Classification of Functioning, Disability and Health (ICF)'s conceptualization of participation. Semi-structured interviews with 54 persons with acquired spinal cord injuries and 3 with spina bifida from 5 countries were examined using qualitative content analysis. Integration was most often associated with social acceptance and, furthermore, with ordinary performance, equality and freedom of choice. Participation was most often described as ordinary performance, with less emphasis on social acceptance and equality. However, participation and integration overlapped in people's narratives and were difficult to separate. The perception of participation and integration was largely similar across countries. In contrast to others, however, Finnish interviewees were more likely to associate participation with contributing to society. A variety of life domains was identified, of which recreation and leisure, work life, sports and going out were the most prevalent. While participation domains are well covered by the ICF, as is the notion of ordinary performance, interviewees also referred to a rights (e.g. acceptance) and duties (e.g. contribution) perspective.

  2. Training shelter volunteers to teach dog compliance.

    PubMed

    Howard, Veronica J; DiGennaro Reed, Florence D

    2014-01-01

    This study examined the degree to which training procedures influenced the integrity of behaviorally based dog training implemented by volunteers of an animal shelter. Volunteers were taught to implement discrete-trial obedience training to teach 2 skills (sit and wait) to dogs. Procedural integrity during the baseline and written instructions conditions was low across all participants. Although performance increased with use of a video model, integrity did not reach criterion levels until performance feedback and modeling were provided. Moreover, the integrity of the discrete-trial training procedure was significantly and positively correlated with dog compliance to instructions for all dyads. Correct implementation and compliance were observed when participants were paired with a novel dog and trainer, respectively, although generalization of procedural integrity from the discrete-trial sit procedure to the discrete-trial wait procedure was not observed. Shelter consumers rated the behavior change in dogs and trainers as socially significant. Implications of these findings and future directions for research are discussed. © Society for the Experimental Analysis of Behavior.

  3. Microfluidic magnetic fluidized bed for DNA analysis in continuous flow mode.

    PubMed

    Hernández-Neuta, Iván; Pereiro, Iago; Ahlford, Annika; Ferraro, Davide; Zhang, Qiongdi; Viovy, Jean-Louis; Descroix, Stéphanie; Nilsson, Mats

    2018-04-15

    Magnetic solid phase substrates for biomolecule manipulation have become a valuable tool for simplification and automation of molecular biology protocols. However, the handling of magnetic particles inside microfluidic chips for miniaturized assays is often challenging due to inefficient mixing, aggregation, and the advanced instrumentation required for effective actuation. Here, we describe the use of a microfluidic magnetic fluidized bed approach that enables dynamic, highly efficient and simplified magnetic bead actuation for DNA analysis in a continuous flow platform with minimal technical requirements. We evaluate the performance of this approach by testing the efficiency of individual steps of a DNA assay based on padlock probes and rolling circle amplification. This assay comprises common nucleic acid analysis principles, such as hybridization, ligation, amplification and restriction digestion. We obtained efficiencies of up to 90% for these reactions with high throughput processing up to 120μL of DNA dilution at flow rates ranging from 1 to 5μL/min without compromising performance. The fluidized bed was 20-50% more efficient than a commercially available solution for microfluidic manipulation of magnetic beads. Moreover, to demonstrate the potential of this approach for integration into micro-total analysis systems, we optimized the production of a low-cost polymer based microarray and tested its analytical performance for integrated single-molecule digital read-out. Finally, we provide the proof-of-concept for a single-chamber microfluidic chip that combines the fluidized bed with the polymer microarray for a highly simplified and integrated magnetic bead-based DNA analyzer, with potential applications in diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    PubMed

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
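
    The quantity being compared above is the NEMA integral uniformity, 100 x (max - min) / (max + min) of the counts in the analyzed field of view; the sketch below computes it for a simulated flood image and omits the smoothing and pixel-size conditioning steps of the full NEMA procedure, so it only approximates what either system actually reports.

        import numpy as np

        def integral_uniformity(counts):
            """Integral uniformity (%) = 100 * (max - min) / (max + min) over the region."""
            cmax, cmin = counts.max(), counts.min()
            return 100.0 * (cmax - cmin) / (cmax + cmin)

        rng = np.random.default_rng(1)
        flood = rng.poisson(lam=10000, size=(64, 64)).astype(float)   # simulated flood-field image

        ufov = flood                 # useful FOV: whole matrix here, for illustration
        cfov = flood[8:56, 8:56]     # central FOV: assumed central 75% of the UFOV

        print(f"UFOV integral uniformity: {integral_uniformity(ufov):.2f}%")
        print(f"CFOV integral uniformity: {integral_uniformity(cfov):.2f}%")

        # Comparing paired manufacturer and Web-based readings over several days, as in the study
        manufacturer = np.array([3.1, 2.9, 3.3, 3.0])    # invented daily values
        web_based = np.array([3.0, 3.0, 3.2, 3.1])
        differences = manufacturer - web_based
        print("mean difference:", differences.mean(), " SD:", differences.std(ddof=1))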

  5. Design/cost tradeoff studies. Appendix A. Supporting analyses and tradeoffs, book 2. Earth Observatory Satellite system definition study (EOS)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Attitude reference systems for use with the Earth Observatory Satellite (EOS) are described. The systems considered are fixed and gimbaled star trackers, star mappers, and digital sun sensors. Covariance analyses were performed to determine performance for the most promising candidate in low altitude and synchronous orbits. The performance of attitude estimators that employ gyroscopes which are periodically updated by a star sensor is established by a single axis covariance analysis. The other systems considered are: (1) the propulsion system design, (2) electric power and electrical integration, (3) thermal control, (4) ground data processing, and (5) the test plan and cost reduction aspects of observatory integration and test.
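
    A single-axis covariance analysis of a gyro periodically updated by a star sensor reduces to propagating a scalar error variance and applying a scalar Kalman update; the sketch below shows that structure, with noise values and an update interval that are assumptions for the example rather than numbers from the study.

        import numpy as np

        dt = 1.0                  # s, propagation step
        q = (1e-4) ** 2           # rad^2/s, gyro drift (process) noise density -- assumed
        r = (1e-3) ** 2           # rad^2, star-sensor measurement variance -- assumed
        update_period = 60        # star-sensor update every 60 steps (60 s)

        P = (1e-2) ** 2           # initial attitude-error variance, rad^2
        sigma_history = []
        for k in range(1, 601):
            P = P + q * dt                    # propagation: variance grows with gyro noise
            if k % update_period == 0:
                K = P / (P + r)               # scalar Kalman gain
                P = (1.0 - K) * P             # measurement update shrinks the variance
            sigma_history.append(np.sqrt(P))

        print("steady-state 1-sigma attitude error (rad):", sigma_history[-1])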

  6. Neural integrators for decision making: a favorable tradeoff between robustness and sensitivity

    PubMed Central

    Cain, Nicholas; Barreiro, Andrea K.; Shadlen, Michael

    2013-01-01

    A key step in many perceptual decision tasks is the integration of sensory inputs over time, but fundamental questions remain about how this is accomplished in neural circuits. One possibility is to balance decay modes of membranes and synapses with recurrent excitation. To allow integration over long timescales, however, this balance must be exceedingly precise. The need for fine tuning can be overcome via a “robust integrator” mechanism in which momentary inputs must be above a preset limit to be registered by the circuit. The degree of this limiting embodies a tradeoff between sensitivity to the input stream and robustness against parameter mistuning. Here, we analyze the consequences of this tradeoff for decision-making performance. For concreteness, we focus on the well-studied random dot motion discrimination task and constrain stimulus parameters by experimental data. We show that mistuning feedback in an integrator circuit decreases decision performance but that the robust integrator mechanism can limit this loss. Intriguingly, even for perfectly tuned circuits with no immediate need for a robustness mechanism, including one often does not impose a substantial penalty for decision-making performance. The implication is that robust integrators may be well suited to subserve the basic function of evidence integration in many cognitive tasks. We develop these ideas using simulations of coupled neural units and the mathematics of sequential analysis. PMID:23446688
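
    The paper's robust integrator is built from coupled neural units; the toy accumulator below is not that model, but it contains the two ingredients the abstract discusses: a leak term standing in for feedback mistuning and a registration limit below which momentary inputs are ignored, so the sensitivity/robustness tradeoff can be explored numerically. All parameter values are invented.

        import numpy as np

        def fraction_correct(drift=0.1, noise=1.0, bound=20.0, input_limit=0.0,
                             leak=0.0, n_trials=500, max_steps=2000, seed=0):
            """Fraction of trials in which a noisy accumulator reaches the correct (positive) bound.

            input_limit > 0: momentary samples smaller in magnitude than the limit are not registered.
            leak != 0: the accumulator decays toward zero, mimicking a mistuned circuit.
            Trials that never reach a bound are scored by the sign of the final state.
            """
            rng = np.random.default_rng(seed)
            correct = 0
            for _ in range(n_trials):
                x = 0.0
                for _ in range(max_steps):
                    sample = drift + noise * rng.standard_normal()
                    if abs(sample) >= input_limit:      # registration threshold
                        x += sample - leak * x          # leaky accumulation
                    if abs(x) >= bound:
                        break
                correct += x > 0
            return correct / n_trials

        print("tuned integrator       :", fraction_correct())
        print("leaky integrator       :", fraction_correct(leak=0.05))
        print("leaky, with input limit:", fraction_correct(leak=0.05, input_limit=0.5))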

  7. Gas Path On-line Fault Diagnostics Using a Nonlinear Integrated Model for Gas Turbine Engines

    NASA Astrophysics Data System (ADS)

    Lu, Feng; Huang, Jin-quan; Ji, Chun-sheng; Zhang, Dong-dong; Jiao, Hua-bin

    2014-08-01

    Gas turbine engine gas path fault diagnosis is a closely related technology that assists operators in managing the engine units. However, gradual performance degradation is inevitable with usage, and it results in model mismatch and then misdiagnosis by the popular model-based approaches. In this paper, an on-line integrated architecture based on a nonlinear model is developed for gas turbine engine anomaly detection and fault diagnosis over the course of the engine's life. These two engine models have different performance parameter update rates. One is the nonlinear real-time adaptive performance model with the spherical square-root unscented Kalman filter (SSR-UKF) producing performance estimates, and the other is a nonlinear baseline model for the measurement estimates. The fault detection and diagnosis logic is designed to discriminate between sensor faults and component faults. This integrated architecture is not only aware of long-term engine health degradation but is also effective at detecting gas path performance anomaly shifts while the engine continues to degrade. The benefits of the proposed approach over the existing architecture are investigated through experiment and analysis.

  8. Flight elements: Fault detection and fault management

    NASA Technical Reports Server (NTRS)

    Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.

    1990-01-01

    Fault management for an intelligent computational system must be developed using a top-down, integrated engineering approach. The proposed approach integrates the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished through several objectives: development of fault-tolerant/FDIR requirements and specifications at the systems level that carry through from conceptual design to implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels to provide fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and reduction of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.

  9. Compact DFB laser modules with integrated isolator at 935 nm

    NASA Astrophysics Data System (ADS)

    Reggentin, M.; Thiem, H.; Tsianos, G.; Malach, M.; Hofmann, J.; Plocke, T.; Kneier, M.; Richter, L.

    2018-02-01

    New developments in industrial applications and applications under rough environmental conditions within the field of spectroscopy and quantum technology in the 935 nm wavelength regime demand new compact, stable and robust laser systems. Besides a stable laser source, the integration of a compact optical isolator is necessary to reduce size and power consumption for the whole laser system. The integration of a suitable optical isolator suppresses back reflections from the following optical system efficiently. However, the miniaturization of the optics inside the package leads to high optical power density levels that make a more detailed analysis of the components and their laser damage threshold necessary. We present test results on compact, stable DFB laser sources (butterfly-style packages) with newly integrated optical isolators operating around 935 nm. The presented data includes performance and lifetime tests for the laser diodes as well as package components. Overall performance data of the packaged laser diodes will be shown as well.

  10. Common source cascode amplifiers for integrating IR-FPA applications

    NASA Technical Reports Server (NTRS)

    Woolaway, James T.; Young, Erick T.

    1989-01-01

    Space-based astronomical infrared measurements place stringent performance requirements on the infrared detector arrays and their associated readout circuitry. To evaluate the usefulness of commercial CMOS technology for astronomical readout applications, a theoretical and experimental evaluation was performed on source follower and common-source cascode integrating amplifiers. Theoretical analysis indicates that for conditions where the input amplifier integration capacitance is limited by the detector's capacitance, the input-referred rms noise electrons of each amplifier should be equivalent. For conditions of input-gate-limited capacitance, the source follower should provide lower noise. Measurements of test circuits containing both source follower and common-source cascode circuits showed substantially lower input-referred noise for the common-source cascode input circuits. Noise measurements yielded 4.8 input-referred rms noise electrons for an 8.5-minute integration. The signal and noise gain of the common-source cascode amplifier appears to offer substantial advantages in achieving predicted noise levels.

  11. A Synthetic Vision Preliminary Integrated Safety Analysis

    NASA Technical Reports Server (NTRS)

    Hemm, Robert; Houser, Scott

    2001-01-01

    This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.

  12. NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes

    NASA Technical Reports Server (NTRS)

    Smith, David A.; Smith, John V.

    2010-01-01

    The Ares I design and development program determined early in the System Design Review phase to utilize the DoD ILS and LSA approach for supportability engineering as an integral part of the system engineering process. This paper provides a review of the overall approach to designing Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses/trades performed, LSA tailoring for the NASA Ares Program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares I Project Elements and the Ground Operation Project. The LSA process provided a systems engineering approach to the development of the Ares I supportability requirements, influenced the design for supportability, and supported the development of alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I project and between programs, the LSA Report is updated and released quarterly. A System Requirements Analysis was performed to determine the supportability requirements and technical performance measurements (TPMs). Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordination of Logistics Support Analysis activities in support of the integrated Ares I vehicle design and development of the logistics support infrastructure. A joint Ares I - Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs, 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development, and 3) maintain the Constellation LSAR Style Guide.

  13. Design techniques for low-voltage analog integrated circuits

    NASA Astrophysics Data System (ADS)

    Rakús, Matej; Stopjaková, Viera; Arbet, Daniel

    2017-08-01

    In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (ICs) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors and MOS transistors operating in weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of the standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with a power supply voltage of 600 mV (or even lower), which is the key feature towards integrated systems for modern portable applications.

  14. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.

  15. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
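
    The point about multiple testing corrections is general; the sketch below implements one standard correction, the Benjamini-Hochberg false-discovery-rate procedure, as an illustration only and not necessarily the exact adjustment implemented in SwathXtend.

        import numpy as np

        def benjamini_hochberg(pvalues, alpha=0.05):
            """Return a boolean mask of hypotheses declared significant at FDR level alpha."""
            p = np.asarray(pvalues, dtype=float)
            m = len(p)
            order = np.argsort(p)
            thresholds = alpha * np.arange(1, m + 1) / m
            below = p[order] <= thresholds
            keep = np.zeros(m, dtype=bool)
            if below.any():
                cutoff = np.nonzero(below)[0].max()     # largest rank satisfying the BH condition
                keep[order[:cutoff + 1]] = True
            return keep

        pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
        print(benjamini_hochberg(pvals, alpha=0.05))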

  16. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    PubMed Central

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  17. Boeing's STAR-FODB test results

    NASA Astrophysics Data System (ADS)

    Fritz, Martin E.; de la Chapelle, Michael; Van Ausdal, Arthur W.

    1995-05-01

    Boeing has successfully concluded a 2 1/2 year, two-phase developmental contract for the STAR-Fiber Optic Data Bus (FODB) that is intended for future space-based applications. The first phase included system analysis, trade studies, behavior modeling, and architecture and protocol selection. During this phase we selected the AS4074 Linear Token Passing Bus (LTPB) protocol operating at 200 Mbps, along with the passive, star-coupled fiber media. The second phase involved design, build, integration, and performance and environmental test of brassboard hardware. The resulting brassboard hardware successfully passed performance testing, providing 200 Mbps operation with a 32 X 32 star-coupled medium. This hardware is suitable for a spaceflight experiment to validate ground testing and analysis and to demonstrate performance in the intended environment. The fiber bus interface unit (FBIU) is a multichip module containing transceiver, protocol, and data formatting chips, buffer memory, and a station management controller. The FBIU has been designed for low power, high reliability, and radiation tolerance. Nine FBIUs were built and integrated with the fiber optic physical layer consisting of the fiber cable plant (FCP) and star coupler assembly (SCA). Performance and environmental testing, including radiation exposure, was performed on selected FBIUs and the physical layer. The integrated system was demonstrated with a full-motion color video image transfer across the bus while simultaneously performing utility functions with a fiber bus control module (FBCM) over a telemetry and control (T&C) bus, in this case AS1773.

  18. MicroScope-an integrated resource for community expertise of gene functions and comparative analysis of microbial genomic and metabolic data.

    PubMed

    Médigue, Claudine; Calteau, Alexandra; Cruveiller, Stéphane; Gachet, Mathieu; Gautreau, Guillaume; Josso, Adrien; Lajus, Aurélie; Langlois, Jordan; Pereira, Hugo; Planel, Rémi; Roche, David; Rollin, Johan; Rouy, Zoe; Vallenet, David

    2017-09-12

    The overwhelming list of new bacterial genomes becoming available on a daily basis makes accurate genome annotation an essential step that ultimately determines the relevance of thousands of genomes stored in public databanks. The MicroScope platform (http://www.genoscope.cns.fr/agc/microscope) is an integrative resource that supports systematic and efficient revision of microbial genome annotation, data management and comparative analysis. Starting from the results of our syntactic, functional and relational annotation pipelines, MicroScope provides an integrated environment for the expert annotation and comparative analysis of prokaryotic genomes. It combines tools and graphical interfaces to analyze genomes and to perform the manual curation of gene function in a comparative genomics and metabolic context. In this article, we describe the free-of-charge MicroScope services for the annotation and analysis of microbial (meta)genomes, transcriptomic and re-sequencing data. Then, the functionalities of the platform are presented in a way that provides practical guidance and help to nonspecialists in bioinformatics. Newly integrated analysis tools (i.e. prediction of virulence and resistance genes in bacterial genomes) and a recently developed original method (the pan-genome graph representation) are also described. Integrated environments such as MicroScope clearly contribute, through the user community, to help maintain accurate resources. © The Author 2017. Published by Oxford University Press.

  19. Toward an integrative and predictive sperm quality analysis in Bos taurus.

    PubMed

    Yániz, J L; Soler, C; Alquézar-Baeta, C; Santolaria, P

    2017-06-01

    There is a need to develop more integrative sperm quality analysis methods, enabling researchers to evaluate different parameters simultaneously cell by cell. In this work, we present a new multi-parametric fluorescent test able to discriminate different sperm subpopulations based on their labeling pattern and motility characteristics. Cryopreserved semen samples from 20 Holstein bulls were used in the study. Analyses of sperm motility using computer-assisted sperm analysis (CASA-mot), of membrane integrity using an acridine orange-propidium iodide combination, and of multiple parameters using the ISAS® 3Fun kit were performed. The new method allows a clear discrimination of sperm subpopulations based on membrane and acrosomal integrity, motility and morphology. It was also possible to observe live spermatozoa showing signs of capacitation such as hyperactivated motility and changes in acrosomal structure. The sperm subpopulation with an intact plasma membrane and acrosome showed a higher proportion of motile sperm than those with a damaged acrosome or increased fluorescence intensity. Spermatozoa with an intact plasmalemma and damaged acrosome were static or exhibited weak movement. Significant correlations among the different sperm quality parameters evaluated were also described. We concluded that the ISAS® 3Fun is an integrated method that represents an advance in sperm quality analysis with the potential to improve fertility predictions. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. NPSS Multidisciplinary Integration and Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Rasche, Joseph; Simons, Todd A.; Hoyniak, Daniel

    2006-01-01

    The objective of this task was to enhance the capability of the Numerical Propulsion System Simulation (NPSS) by expanding its reach into the high-fidelity multidisciplinary analysis area. The task investigated numerical techniques to convert from the cold static to the hot running geometry of compressor blades. Numerical calculations of blade deformations were performed iteratively, coupling high-fidelity flow simulations with high-fidelity structural analysis of the compressor blade. The flow simulations were performed with the Advanced Ducted Propfan Analysis (ADPAC) code, while structural analyses were performed with the ANSYS code. High-fidelity analyses were used to evaluate the effects on performance of variations in tip clearance, uncertainty in manufacturing tolerances, and variable inlet guide vane scheduling, as well as the effect of rotational speed on the hot running geometry of the compressor blades.
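
    The cold-to-hot geometry conversion described above is, at its core, a fixed-point iteration between a flow solver and a structural solver; the sketch below shows that loop with trivial one-variable stand-ins in place of the actual ADPAC and ANSYS calls, purely to make the iteration structure explicit.

        def flow_loads(deflection):
            """Stand-in for a CFD solution returning aerodynamic loads on the deflected blade."""
            return 0.8 * deflection + 1.0

        def structural_deflection(loads):
            """Stand-in for a structural solution returning blade deflection under the given loads."""
            return 0.05 * loads

        deflection = 0.0                 # cold static shape, reduced to one scalar for the sketch
        for iteration in range(1, 51):
            loads = flow_loads(deflection)
            new_deflection = structural_deflection(loads)
            if abs(new_deflection - deflection) < 1e-10:
                break
            deflection = new_deflection

        print(f"hot running deflection {deflection:.6f} after {iteration} iterations")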

  1. Perceptual learning in Williams syndrome: looking beyond averages.

    PubMed

    Gervan, Patricia; Gombos, Ferenc; Kovacs, Ilona

    2012-01-01

    Williams Syndrome is a genetically determined neurodevelopmental disorder characterized by an uneven cognitive profile and surprisingly large neurobehavioral differences among individuals. Previous studies have already shown different forms of memory deficiencies and learning difficulties in WS. Here we studied the capacity of WS subjects to improve their performance in a basic visual task. We employed a contour integration paradigm that addresses occipital visual function, and analyzed the initial (i.e. baseline) and after-learning performance of WS individuals. Instead of pooling the very inhomogeneous results of WS subjects together, we evaluated individual performance by expressing it in terms of the deviation from the average performance of the group of typically developing subjects of similar age. This approach helped us to reveal information about the possible origins of poor performance of WS subjects in contour integration. Although the majority of WS individuals showed both reduced baseline and reduced learning performance, individual analysis also revealed a dissociation between baseline and learning capacity in several WS subjects. In spite of impaired initial contour integration performance, some WS individuals presented learning capacity comparable to learning in the typically developing population, and vice versa, poor learning was also observed in subjects with high initial performance levels. These data indicate a dissociation between factors determining initial performance and perceptual learning.
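
    Expressing an individual's score as a deviation from the mean of an age-matched typically developing group is simply a z-score; the sketch below shows this for one hypothetical WS subject whose baseline is poor but whose learning falls within the typical range (all numbers invented).

        import numpy as np

        # Invented scores for a typically developing (TD) comparison group
        td_baseline = np.array([0.78, 0.82, 0.75, 0.80, 0.84, 0.79, 0.81])
        td_learning = np.array([0.10, 0.08, 0.12, 0.09, 0.11, 0.07, 0.10])

        # Invented scores for one WS subject
        ws_baseline, ws_learning = 0.62, 0.09

        def deviation(score, reference):
            """Individual score expressed as a z-score relative to the reference group."""
            return (score - reference.mean()) / reference.std(ddof=1)

        print("baseline z-score:", round(deviation(ws_baseline, td_baseline), 2))  # far below the TD mean
        print("learning z-score:", round(deviation(ws_learning, td_learning), 2))  # within the TD range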

  2. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  3. FY 1986 current fiscal year work plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office/RI during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. System models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions.

  4. An Integrated Gate Turnaround Management Concept Leveraging Big Data/Analytics for NAS Performance Improvements

    NASA Technical Reports Server (NTRS)

    Chung, William; Chachad, Girish; Hochstetler, Ronald

    2016-01-01

    The Integrated Gate Turnaround Management (IGTM) concept was developed to improve gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include taxi to the gate, gate services, pushback, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing gate operation events, primary performance-dependent attributes of these events were identified for the historical data analysis, so that performance models accounting for uncertainties could be developed to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of such a concept. An IGTM prototype was developed to demonstrate the concept using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainties.

  5. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, Robert E.

    1994-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, which is a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation via spreadsheets provides adequate benchmarks to assess cargo vehicle design and performance characteristics.

  6. Preliminary candidate advanced avionics system for general aviation

    NASA Technical Reports Server (NTRS)

    Mccalla, T. M.; Grismore, F. L.; Greatline, S. E.; Birkhead, L. M.

    1977-01-01

    An integrated avionics system design was carried out to a level that indicates subsystem function and the methods of overall system integration. Sufficient detail was included to allow identification of possible system component technologies and to perform reliability, modularity, maintainability, cost, and risk analysis of the system design. Retrofit to older aircraft and availability of this system to single-engine, two-place aircraft were considered.

  7. A Design Architecture for an Integrated Training System Decision Support System

    DTIC Science & Technology

    1990-07-01

    Sensory modes include visual, auditory, tactile, or kinesthetic; performance categories include time to complete, speed of response, or correct action ... procedures, and finally application and examples from the aviation proponency with emphasis on the LHX program. Appendix B is a complete bibliography ... integrated analysis of ITS development. The approach was designed to provide an accurate and complete representation of the ITS development process and

  8. Integrative Approaches of Native and Foreign Scholars to Pedology in the Context of Views of the Third Millennium

    ERIC Educational Resources Information Center

    Kuzminsky, Anatoliy

    2016-01-01

    Problems of appearing and functioning of human study science, i.e. pedology, have been studied in the paper. Theoretical analysis of integrative approaches of native and foreign scholars to pedology in the context of views of the third millennium has been performed. Useful and positive achievements of this science as well as wrong ones determined…

  9. Role of premission testing in the National Missile Defense system

    NASA Astrophysics Data System (ADS)

    Tillman, Janice V.; Atkinson, Beverly

    2001-09-01

    The purpose of the National Missile Defense (NMD) system is to provide detection, discrimination, engagement, interception, and negation of ballistic missile attacks targeted at the United States (U.S.), including Alaska and Hawaii. This capability is achieved through the integration of weapons, sensors, and a battle management, command, control and communications (BMC3) system. The NMD mission includes surveillance, warning, cueing, and engagement of threat objects prior to potential impact on U.S. targets. The NMD Acquisition Strategy encompasses an integrated test program using Integrated Ground Tests (IGTs), Integrated Flight Tests (IFTs), Risk Reduction Flights (RRFs), Pre Mission Tests (PMTs), Command and Control (C2) Simulations, and other Specialty Tests. The IGTs utilize software-in-the-loop/hardware-in-the-loop (SWIL / HWIL) and digital simulations. The IFTs are conducted with targets launched from Vandenberg Air Force Base (VAFB) and interceptors launched from Kwajalein Missile Range (KMR). The RRFs evaluate NMD BMC3 and NMD sensor functional performance and integration by leveraging planned Peacekeeper and Minuteman III operational test flights and other opportunities without employing the NMD interceptor. The PMTs are nondestructive System-level tests representing the use of NMD Element Test Assets in their IFT configuration and are conducted to reduce risks in achieving the IFT objectives. Specifically, PMTs are used to reduce integration, interface, and performance risks associated with Flight Tests to ensure that as much as possible, the System is tested without expending a target or an interceptor. This paper examines several critical test planning and analysis functions as they relate to the NMD Integrated Flight Test program and, in particular, to Pre-Mission Testing. Topics to be discussed include: - Flight-test program planning; - Pre-Test Integration activities; and - Test Execution, Analysis, and Post-Flight Reconstruction.

  10. C3 generic workstation: Performance metrics and applications

    NASA Technical Reports Server (NTRS)

    Eddy, Douglas R.

    1988-01-01

    The large number of integrated dependent measures available on a command, control, and communications (C3) generic workstation under development are described. In this system, embedded communications tasks will manipulate workload to assess the effects of performance-enhancing drugs (sleep aids and decongestants), work/rest cycles, biocybernetics, and decision support systems on performance. Task performance accuracy and latency will be event coded for correlation with other measures of voice stress and physiological functioning. Sessions will be videotaped to score non-verbal communications. Physiological recordings include spectral analysis of EEG, ECG, vagal tone, and EOG. Subjective measurements include SWAT, fatigue, POMS and specialized self-report scales. The system will be used primarily to evaluate the effects on performance of drugs, work/rest cycles, and biocybernetic concepts. Performance assessment algorithms will also be developed, including those used with small teams. This system provides a tool for integrating and synchronizing behavioral and psychophysiological measures in a complex decision-making environment.

  11. Integrated system for automated financial document processing

    NASA Astrophysics Data System (ADS)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
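
    The blackboard architecture described here, in which several recognition knowledge sources contribute incrementally and opportunistically to a shared solution, can be sketched as below. The knowledge sources, field name, and confidence values are hypothetical stand-ins for the courtesy- and legal-amount recognition engines in the paper.

```python
class Blackboard:
    """Shared workspace where knowledge sources post partial hypotheses."""
    def __init__(self):
        self.hypotheses = {}  # field name -> (value, confidence)

    def post(self, field, value, confidence):
        # Keep the highest-confidence hypothesis seen so far for each field.
        current = self.hypotheses.get(field)
        if current is None or confidence > current[1]:
            self.hypotheses[field] = (value, confidence)

def courtesy_amount_engine(image, board):
    # Hypothetical numeral recognizer for the courtesy-amount zone.
    board.post("amount", "125.40", confidence=0.92)

def legal_amount_engine(image, board):
    # Hypothetical handwriting recognizer for the legal-amount zone.
    board.post("amount", "125.40", confidence=0.85)

if __name__ == "__main__":
    board = Blackboard()
    for knowledge_source in (courtesy_amount_engine, legal_amount_engine):
        knowledge_source(image=None, board=board)
    print(board.hypotheses)  # {'amount': ('125.40', 0.92)}
```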

  12. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration

    PubMed Central

    Thorvaldsdóttir, Helga; Mesirov, Jill P.

    2013-01-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today’s sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license. PMID:22517427

  13. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.

    PubMed

    Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P

    2013-03-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.

  14. An integrated aerobic-anaerobic strategy for performance enhancement of Pseudomonas aeruginosa-inoculated microbial fuel cell.

    PubMed

    Yong, Xiao-Yu; Yan, Zhi-Ying; Shen, Hai-Bo; Zhou, Jun; Wu, Xia-Yuan; Zhang, Li-Juan; Zheng, Tao; Jiang, Min; Wei, Ping; Jia, Hong-Hua; Yong, Yang-Chun

    2017-10-01

    Microbial fuel cell (MFC) is a promising device for energy generation and organic waste treatment simultaneously by electrochemically active bacteria (EAB). In this study, an integrated aerobic-anaerobic strategy was developed to improve the performance of P. aeruginosa-inoculated MFC. With an aerobic start-up and following an anaerobic discharge process, the current density of MFC reached a maximum of 99.80µA/cm 2 , which was 91.6% higher than the MFC with conventional constant-anaerobic operation. Cyclic voltammetry and HPLC analysis showed that aerobic start-up significantly increased electron shuttle (pyocyanin) production (76% higher than the constant-anaerobic MFC). Additionally, enhanced anode biofilm formation was also observed in the integrated aerobic-anaerobic MFC. The increased pyocyanin production and biofilm formation promoted extracellular electron transfer from EAB to the anode and were the underlying mechanism for the MFC performance enhancement. This work demonstrated the integrated aerobic-anaerobic strategy would be a practical strategy to enhance the electricity generation of MFC. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Recurrence Quantification Analysis of Processes and Products of Discourse: A Tutorial in R

    ERIC Educational Resources Information Center

    Wallot, Sebastian

    2017-01-01

    Processes of naturalistic reading and writing are based on complex linguistic input, stretch-out over time, and rely on an integrated performance of multiple perceptual, cognitive, and motor processes. Hence, naturalistic reading and writing performance is nonstationary and exhibits fluctuations and transitions. However, instead of being just…

  16. Methotrexate Reduces DNA Integrity in Sperm From Men With Inflammatory Bowel Disease.

    PubMed

    Ley, Dana; Jones, Jeffrey; Parrish, John; Salih, Sana; Caldera, Freddy; Tirado, Edna; Leader, Benjamin; Saha, Sumona

    2018-06-01

    There are few data on the effects of methotrexate on reproductive capacity in men with inflammatory bowel diseases (IBDs). We performed a case-control study to determine the effects of methotrexate on sperm quality and genetic integrity. We compared sperm samples from 7 men with IBD who had been exposed to methotrexate for at least 3 months with sperm samples collected from 1912 age-matched men at fertility centers (controls) where sperm parameters would be expected to be worse than those of the general population. Sperm were evaluated by basic semen analysis and advanced sperm integrity testing. In samples from men with IBD, all basic semen analysis parameters were within normal limits. However, these samples had reduced sperm integrity, based on significant increases in levels of DNA fragmentation and damage from oxidative stress compared with controls. Our findings indicate that methotrexate can reduce DNA integrity in sperm and cause damage via oxidative stress. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.

  17. Booly: a new data integration platform.

    PubMed

    Do, Long H; Esteves, Francisco F; Karten, Harvey J; Bier, Ethan

    2010-10-13

    Data integration is an escalating problem in bioinformatics. We have developed a web tool and warehousing system, Booly, that features a simple yet flexible data model coupled with the ability to perform powerful comparative analysis, including the use of Boolean logic to merge datasets together, and an integrated aliasing system to decipher differing names of the same gene or protein. Furthermore, Booly features a collaborative sharing system and a public repository so that users can retrieve new datasets while contributors can easily disseminate new content. We illustrate the uses of Booly with several examples including: the versatile creation of homebrew datasets, the integration of heterogeneous data to identify genes useful for comparing avian and mammalian brain architecture, and generation of a list of Food and Drug Administration (FDA) approved drugs with possible alternative disease targets. The Booly paradigm for data storage and analysis should facilitate integration between disparate biological and medical fields and result in novel discoveries that can then be validated experimentally. Booly can be accessed at http://booly.ucsd.edu.
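
    The core operations described, merging datasets with Boolean logic after resolving differing names through an aliasing system, amount to set algebra. A minimal sketch follows; the gene symbols and alias table are invented for illustration and are not Booly's actual data model.

```python
# Hypothetical alias table mapping synonyms to a canonical gene symbol.
aliases = {"p53": "TP53", "TRP53": "TP53", "HER2": "ERBB2"}

def canonical(names):
    """Resolve each name to its canonical symbol before comparison."""
    return {aliases.get(name, name) for name in names}

dataset_a = canonical({"p53", "BRCA1", "HER2"})
dataset_b = canonical({"TP53", "ERBB2", "MYC"})

# Boolean merges of the two datasets after alias resolution.
print("AND:", dataset_a & dataset_b)  # present in both
print("OR: ", dataset_a | dataset_b)  # present in either
print("NOT:", dataset_a - dataset_b)  # in A but not in B
```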

  18. Booly: a new data integration platform

    PubMed Central

    2010-01-01

    Background: Data integration is an escalating problem in bioinformatics. We have developed a web tool and warehousing system, Booly, that features a simple yet flexible data model coupled with the ability to perform powerful comparative analysis, including the use of Boolean logic to merge datasets together, and an integrated aliasing system to decipher differing names of the same gene or protein. Furthermore, Booly features a collaborative sharing system and a public repository so that users can retrieve new datasets while contributors can easily disseminate new content. Results: We illustrate the uses of Booly with several examples including: the versatile creation of homebrew datasets, the integration of heterogeneous data to identify genes useful for comparing avian and mammalian brain architecture, and generation of a list of Food and Drug Administration (FDA) approved drugs with possible alternative disease targets. Conclusions: The Booly paradigm for data storage and analysis should facilitate integration between disparate biological and medical fields and result in novel discoveries that can then be validated experimentally. Booly can be accessed at http://booly.ucsd.edu. PMID:20942966

  19. Finite element modeling simulation-assisted design of integrated microfluidic chips for heavy metal ion stripping analysis

    NASA Astrophysics Data System (ADS)

    Hong, Ying; Zou, Jianhua; Ge, Gang; Xiao, Wanyue; Gao, Ling; Shao, Jinjun; Dong, Xiaochen

    2017-10-01

    In this article, a transparent integrated microfluidic device composed of a 3D-printed thin-layer flow cell (3D-PTLFC) and an S-shaped screen-printed electrode (SPE) has been designed and fabricated for heavy metal ion stripping analysis. A finite element modeling (FEM) simulation is employed to optimize the shape of the electrode, the direction of the inlet pipeline, the thin-layer channel height, and the sample flow rate to enhance the electron-enrichment efficiency for stripping analysis. The results demonstrate that the S-shaped SPE configuration matches the channel in 3D-PTLFC perfectly for the anodic stripping behavior of the heavy metal ions. Under optimized conditions, a wide linear range of 1-80 µg l-1 is achieved for Pb2+ detection, with a detection limit of 0.3 µg l-1, for the microfluidic device. Thus, the integrated microfluidic device proves to be a promising approach for heavy metal ion stripping analysis with low cost and high performance.

  20. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  1. Computer-Aided Design Of Turbine Blades And Vanes

    NASA Technical Reports Server (NTRS)

    Hsu, Wayne Q.

    1988-01-01

    Quasi-three-dimensional method for determining aerothermodynamic configuration of turbine uses computer-interactive analysis and design and computer-interactive graphics. Design procedure executed rapidly so designer easily repeats it to arrive at best performance, size, structural integrity, and engine life. Sequence of events in aerothermodynamic analysis and design starts with engine-balance equations and ends with boundary-layer analysis and viscous-flow calculations. Analysis-and-design procedure interactive and iterative throughout.

  2. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.

  3. TAP 2: A finite element program for thermal analysis of convectively cooled structures

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.

    1980-01-01

    A finite element computer program (TAP 2) for steady-state and transient thermal analyses of convectively cooled structures is presented. The program has a finite element library of six elements: two conduction/convection elements to model heat transfer in a solid, two convection elements to model heat transfer in a fluid, and two integrated conduction/convection elements to represent combined heat transfer in tubular and plate/fin fluid passages. Nonlinear thermal analysis due to temperature-dependent thermal parameters is performed using the Newton-Raphson iteration method. Transient analyses are performed using an implicit Crank-Nicolson time integration scheme with consistent or lumped capacitance matrices as an option. Program output includes nodal temperatures and element heat fluxes. Pressure drops in fluid passages may be computed as an option. User instructions and sample problems are presented in appendixes.
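
    For context on the implicit Crank-Nicolson scheme mentioned above, the sketch below advances a small lumped thermal network, C dT/dt + K T = Q, by solving (C/dt + K/2) T_new = (C/dt - K/2) T_old + Q at each step. The two-node capacitance, conductance, and load values are hypothetical and are not taken from TAP 2.

```python
import numpy as np

# Hypothetical two-node lumped thermal model: C*dT/dt + K*T = Q.
C = np.diag([500.0, 300.0])         # capacitance matrix (J/K)
K = np.array([[ 2.0, -2.0],
              [-2.0,  5.0]])        # conductance matrix (W/K); node 2 tied to a sink
Q = np.array([100.0, 3.0 * 290.0])  # heat load on node 1; conduction from a 290 K sink into node 2
T = np.array([300.0, 300.0])        # initial temperatures (K)
dt = 10.0                           # time step (s)

# Crank-Nicolson: (C/dt + K/2) T_new = (C/dt - K/2) T_old + Q
A = C / dt + K / 2.0
B = C / dt - K / 2.0
for _ in range(100):
    T = np.linalg.solve(A, B @ T + Q)

print("Temperatures after 1000 s (K):", np.round(T, 1))
```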

  4. Transmission Bearing Damage Detection Using Decision Fusion Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Lewicki, David G.; Decker, Harry J.

    2004-01-01

    A diagnostic tool was developed for detecting fatigue damage to rolling element bearings in an OH-58 main rotor transmission. Two different monitoring technologies, oil debris analysis and vibration, were integrated using data fusion into a health monitoring system for detecting bearing surface fatigue pitting damage. This integrated system showed improved detection and decision-making capabilities as compared to using individual monitoring technologies. This diagnostic tool was evaluated by collecting vibration and oil debris data from tests performed in the NASA Glenn 500 hp Helicopter Transmission Test Stand. Data were collected during experiments performed in this test rig when two unanticipated bearing failures occurred. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spiral bevel gear duplex ball bearings and spiral bevel pinion triplex ball bearings in a main rotor transmission.
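
    A minimal sketch of the decision-fusion idea, combining independent damage indications from vibration and oil-debris monitoring into a single health decision, is given below. The condition-indicator values, thresholds, and equal weights are hypothetical, not those used in the study.

```python
def damage_level(value, warning, alarm):
    """Map a raw condition-indicator value to a 0-1 damage belief."""
    if value <= warning:
        return 0.0
    if value >= alarm:
        return 1.0
    return (value - warning) / (alarm - warning)

def fuse(vibration_ci, oil_debris_mass_mg):
    # Hypothetical warning/alarm thresholds for each monitoring technology.
    vib = damage_level(vibration_ci, warning=1.0, alarm=3.0)
    oil = damage_level(oil_debris_mass_mg, warning=20.0, alarm=100.0)
    # Equal-weight fusion; requiring agreement of both sources reduces false alarms.
    fused = 0.5 * vib + 0.5 * oil
    return fused, "damage" if fused > 0.5 else "healthy"

print(fuse(vibration_ci=2.4, oil_debris_mass_mg=85.0))
```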

  5. Development of a facility using robotics for testing automation of inertial instruments

    NASA Technical Reports Server (NTRS)

    Greig, Joy Y.; Lamont, Gary B.; Biezad, Daniel J.; Lewantowicz, Zdsislaw H.

    1987-01-01

    The Integrated Robotics System Simulation (ROBSIM) was used to evaluate the performance of the PUMA 560 arm as applied to testing of inertial sensors. Results of this effort were used in the design and development of a feasibility test environment using a PUMA 560 arm. The implemented facility demonstrated the ability to perform conventional static inertial instrument tests (rotation and tumble). The facility included an efficient data acquisition capability along with a precision test servomechanism function, resulting in various data presentations, which are included in the paper. Analyses of inertial instrument testing accuracy, repeatability, and noise characteristics are provided for the PUMA 560 as well as for other possible commercial arm configurations. Another integral aspect of the effort was an in-depth economic analysis and comparison of robot arm testing versus use of contemporary precision test equipment.

  6. Multispectral scanner system parameter study and analysis software system description, volume 2

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.

    1978-01-01

    The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, data correlation analyzer, scanner IFOV, and random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.

  7. Optimal integration strategies for a syngas fuelled SOFC and gas turbine hybrid

    NASA Astrophysics Data System (ADS)

    Zhao, Yingru; Sadhukhan, Jhuma; Lanzini, Andrea; Brandon, Nigel; Shah, Nilay

    This article aims to develop a thermodynamic modelling and optimization framework for a thorough understanding of the optimal integration of fuel cell, gas turbine and other components in an ambient pressure SOFC-GT hybrid power plant. This method is based on the coupling of a syngas-fed SOFC model and an associated irreversible GT model, with an optimization algorithm developed using MATLAB to efficiently explore the range of possible operating conditions. Energy and entropy balance analysis has been carried out for the entire system to observe the irreversibility distribution within the plant and the contribution of different components. Based on the methodology developed, a comprehensive parametric analysis has been performed to explore the optimum system behavior, and predict the sensitivity of system performance to the variations in major design and operating parameters. The current density, operating temperature, fuel utilization and temperature gradient of the fuel cell, as well as the isentropic efficiencies and temperature ratio of the gas turbine cycle, together with three parameters related to the heat transfer between subsystems are all set to be controllable variables. Other factors affecting the hybrid efficiency have been further simulated and analysed. The model developed is able to predict the performance characteristics of a wide range of hybrid systems potentially sized from 2000 to 2500 W m-2 with efficiencies varying between 50% and 60%. The analysis enables us to identify the system design tradeoffs, and therefore to determine better integration strategies for advanced SOFC-GT systems.

  8. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation was delivered, and details were planned for development of the level 1 operational system. The planned end product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  9. Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2003-01-01

    The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
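
    A probabilistic evaluation of this kind can be approximated with plain Monte Carlo sampling. The sketch below, using invented nominal values and uncertainty ranges rather than those of the study, builds an empirical cumulative distribution for overall thermal efficiency and crude sensitivity factors from input-output correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical uncertain performance parameters (normal distributions).
fuel_cell_eff = rng.normal(0.50, 0.02, n)   # fuel-cell stack efficiency
turbine_eff = rng.normal(0.30, 0.03, n)     # gas-turbine cycle efficiency
fuel_split = rng.normal(0.80, 0.02, n)      # fraction of fuel consumed in the stack

# Simplified hybrid-cycle model: fuel not used in the stack feeds the turbine.
overall_eff = fuel_split * fuel_cell_eff + (1 - fuel_split) * turbine_eff

# Empirical cumulative distribution at a few efficiency levels.
for level in (0.40, 0.43, 0.46):
    print(f"P(efficiency <= {level:.2f}) = {np.mean(overall_eff <= level):.3f}")

# Crude sensitivity factors: correlation of each input with the output.
for name, x in [("fuel_cell_eff", fuel_cell_eff),
                ("turbine_eff", turbine_eff),
                ("fuel_split", fuel_split)]:
    print(name, "sensitivity ~", round(np.corrcoef(x, overall_eff)[0, 1], 2))
```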

  10. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background: Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results: We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions: Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  11. Structural-Thermal-Optical-Performance (STOP) Model Development and Analysis of a Field-widened Michelson Interferometer

    NASA Technical Reports Server (NTRS)

    Scola, Salvatore J.; Osmundsen, James F.; Murchison, Luke S.; Davis, Warren T.; Fody, Joshua M.; Boyer, Charles M.; Cook, Anthony L.; Hostetler, Chris A.; Seaman, Shane T.; Miller, Ian J.

    2014-01-01

    An integrated Structural-Thermal-Optical-Performance (STOP) model was developed for a field-widened Michelson interferometer which is being built and tested for the High Spectral Resolution Lidar (HSRL) project at NASA Langley Research Center (LaRC). The performance of the interferometer is highly sensitive to thermal expansion, changes in refractive index with temperature, temperature gradients, and deformation due to mounting stresses. Hand calculations can only predict system performance for uniform temperature changes, under the assumption that coefficient of thermal expansion (CTE) mismatch effects are negligible. An integrated STOP model was developed to investigate the effects of design modifications on the performance of the interferometer in detail, including CTE mismatch, and other three-dimensional effects. The model will be used to improve the design for a future spaceflight version of the interferometer. The STOP model was developed using the Comet SimApp™ Authoring Workspace, which performs automated integration between Pro-Engineer®, Thermal Desktop®, MSC Nastran™, SigFit™, Code V™, and MATLAB®. This is the first flight project for which LaRC has utilized Comet, and it allows a larger trade space to be studied in a shorter time than would be possible in a traditional STOP analysis. This paper describes the development of the STOP model, presents a comparison of STOP results for simple cases with hand calculations, and presents results of the correlation effort to bench-top testing of the interferometer. A trade study conducted with the STOP model which demonstrates a few simple design changes that can improve the performance seen in the lab is also presented.

  12. Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method

    DOE PAGES

    Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan

    2015-07-29

    Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical for human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process is described, and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. Furthermore, these challenges reflect the data needs specific to IDHEAS. More importantly, they also represent the general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.

  13. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    PubMed

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

    Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis (FMEA), to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
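
    To make the reliability-integration idea concrete, consider a sketch under invented numbers: if assessment data yield a failure probability for each major response component, and the components act in series (any failure degrades the response), end-to-end reliability is the product of the component survival probabilities.

```python
# Hypothetical failure probabilities inferred from assessment data
# (technical assistance reviews, functional drills); values are illustrative only.
failure_modes = {
    "request_and_approval": 0.02,
    "stockpile_delivery": 0.05,
    "local_distribution": 0.08,
    "dispensing_sites": 0.10,
}

reliability = 1.0
for component, p_fail in failure_modes.items():
    reliability *= 1.0 - p_fail

# 0.98 * 0.95 * 0.92 * 0.90 = approximately 0.771
print(f"Estimated end-to-end response reliability: {reliability:.3f}")
```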

  14. methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.

    PubMed

    Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia

    2015-09-29

    Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.

  15. Model reference tracking control of an aircraft: a robust adaptive approach

    NASA Astrophysics Data System (ADS)

    Tanyer, Ilker; Tatlicioglu, Enver; Zergeroglu, Erkan

    2017-05-01

    This work presents the design and the corresponding analysis of a nonlinear robust adaptive controller for model reference tracking of an aircraft that has parametric uncertainties in its system matrices and additive state- and/or time-dependent nonlinear disturbance-like terms in its dynamics. Specifically, a robust integral of the sign of the error feedback term and an adaptive term are fused with a proportional integral controller. Lyapunov-based stability analysis techniques are utilised to prove global asymptotic convergence of the output tracking error. Extensive numerical simulations are presented to illustrate the performance of the proposed robust adaptive controller.

  16. Network propagation in the cytoscape cyberinfrastructure.

    PubMed

    Carlin, Daniel E; Demchak, Barry; Pratt, Dexter; Sage, Eric; Ideker, Trey

    2017-10-01

    Network propagation is an important and widely used algorithm in systems biology, with applications in protein function prediction, disease gene prioritization, and patient stratification. However, up to this point it has required significant expertise to run. Here we extend the popular network analysis program Cytoscape to perform network propagation as an integrated function. Such integration greatly increases the access to network propagation by putting it in the hands of biologists and linking it to the many other types of network analysis and visualization available through Cytoscape. We demonstrate the power and utility of the algorithm by identifying mutations conferring resistance to Vemurafenib.
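
    Network propagation itself is a short algorithm, essentially a random walk with restart that iterates F <- alpha*W*F + (1 - alpha)*F0 on the interaction network until convergence. The toy network and seed node below are hypothetical, and the Cytoscape implementation may differ in normalization details.

```python
import numpy as np

# Toy undirected network (adjacency matrix) over five nodes.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

# Column-normalize so each node distributes its score among its neighbours.
W = A / A.sum(axis=0)

F0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # heat placed on a seed node
alpha, F = 0.7, F0.copy()

for _ in range(500):                       # iterate to (near) convergence
    F_next = alpha * W @ F + (1 - alpha) * F0
    if np.max(np.abs(F_next - F)) < 1e-9:
        break
    F = F_next

print(np.round(F, 3))  # propagated scores; nodes near the seed rank highest
```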

  17. Multifacet structure of observed reconstructed integral images.

    PubMed

    Martínez-Corral, Manuel; Javidi, Bahram; Martínez-Cuenca, Raúl; Saavedra, Genaro

    2005-04-01

    Three-dimensional images generated by an integral imaging system suffer from degradations in the form of grid of multiple facets. This multifacet structure breaks the continuity of the observed image and therefore reduces its visual quality. We perform an analysis of this effect and present the guidelines in the design of lenslet imaging parameters for optimization of viewing conditions with respect to the multifacet degradation. We consider the optimization of the system in terms of field of view, observer position and pupil function, lenslet parameters, and type of reconstruction. Numerical tests are presented to verify the theoretical analysis.

  18. Integration of Rotor Aerodynamic Optimization with the Conceptual Design of a Large Civil Tiltrotor

    NASA Technical Reports Server (NTRS)

    Acree, C. W., Jr.

    2010-01-01

    Coupling of aeromechanics analysis with vehicle sizing is demonstrated with the CAMRAD II aeromechanics code and NDARC sizing code. The example is optimization of cruise tip speed with rotor/wing interference for the Large Civil Tiltrotor (LCTR2) concept design. Free-wake models were used for both rotors and the wing. This report is part of a NASA effort to develop an integrated analytical capability combining rotorcraft aeromechanics, structures, propulsion, mission analysis, and vehicle sizing. The present paper extends previous efforts by including rotor/wing interference explicitly in the rotor performance optimization and implicitly in the sizing.

  19. VIPR III VADR SPIDER Structural Design and Analysis

    NASA Technical Reports Server (NTRS)

    Li, Wesley; Chen, Tony

    2016-01-01

    In support of the National Aeronautics and Space Administration (NASA) Vehicle Integrated Propulsion Research (VIPR) Phase III team to evaluate the volcanic ash environment effects on the Pratt & Whitney F117-PW-100 turbofan engine, NASA Armstrong Flight Research Center has successfully performed structural design and analysis on the Volcanic Ash Distribution Rig (VADR) and the Structural Particulate Integration Device for Engine Research (SPIDER) for the ash ingestion test. Static and dynamic load analyses were performed to ensure no structural failure would occur during the test. Modal analysis was conducted, and the results were used to develop engine power setting avoidance zones. These engine power setting avoidance zones were defined to minimize the dwell time when the natural frequencies of the VADR/SPIDER system coincided with the excitation frequencies of the engine which was operating at various revolutions per minute. Vortex-induced vibration due to engine suction air flow during the ingestion test was also evaluated, but was not a concern.

  20. Advanced GIS Exercise: Performing Error Analysis in ArcGIS ModelBuilder

    ERIC Educational Resources Information Center

    Hall, Steven T.; Post, Christopher J.

    2009-01-01

    Knowledge of Geographic Information Systems is quickly becoming an integral part of the natural resource professionals' skill set. With the growing need of professionals with these skills, we created an advanced geographic information systems (GIS) exercise for students at Clemson University to introduce them to the concept of error analysis,…

  1. The Integration of Psycholinguistic and Discourse Processing Theories of Reading Comprehension.

    ERIC Educational Resources Information Center

    Beebe, Mona J.

    To assess the compatibility of miscue analysis and recall analysis as independent elements in a theory of reading comprehension, a study was performed that operationalized each theory and separated its components into measurable units to allow empirical testing. A cueing strategy model was estimated, but the discourse processing model was broken…

  2. Efficiency in the Community College Sector: Stochastic Frontier Analysis

    ERIC Educational Resources Information Center

    Agasisti, Tommaso; Belfield, Clive

    2017-01-01

    This paper estimates technical efficiency scores across the community college sector in the United States. Using stochastic frontier analysis and data from the Integrated Postsecondary Education Data System for 2003-2010, we estimate efficiency scores for 950 community colleges and perform a series of sensitivity tests to check for robustness. We…

  3. Massively Parallel, Molecular Analysis Platform Developed Using a CMOS Integrated Circuit With Biological Nanopores

    PubMed Central

    Roever, Stefan

    2012-01-01

    A massively parallel, low cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30 fA (rms). With this noise performance, single base detection of DNA was demonstrated using α-hemolysin. The data shows that a single molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.

  4. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
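
    The graph analysis in step 3, searching for possible paths from hazard sources to vulnerable entities over the extracted architecture model, can be illustrated with a plain breadth-first path enumeration. The component names and connections below are hypothetical and are not drawn from the Orion documents.

```python
from collections import deque

# Hypothetical architecture model: component -> components it can affect.
connections = {
    "thruster_valve": ["propulsion_controller"],
    "propulsion_controller": ["flight_software"],
    "flight_software": ["guidance_function", "telemetry"],
    "telemetry": [],
    "guidance_function": [],
}

def find_paths(graph, source, target):
    """Breadth-first enumeration of simple paths from a hazard source
    to a vulnerable entity or function."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid revisiting nodes (no cycles)
                queue.append(path + [nxt])
    return paths

print(find_paths(connections, "thruster_valve", "guidance_function"))
```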

  5. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  6. VisRseq: R-based visual framework for analysis of sequencing data

    PubMed Central

    2015-01-01

    Background: Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results: We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. Conclusions: To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights. PMID:26328469

  7. VisRseq: R-based visual framework for analysis of sequencing data.

    PubMed

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  8. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  9. The “Emotional Side” of Entrepreneurship: A Meta-Analysis of the Relation between Positive and Negative Affect and Entrepreneurial Performance

    PubMed Central

    Fodor, Oana C.; Pintea, Sebastian

    2017-01-01

    The experience of work in an entrepreneurial context is saturated with emotional experiences. While the literature on the relation between affect and entrepreneurial performance (EP) is growing, there was no quantitative integration of the results so far. This study addresses this gap and meta-analytically integrates the results from 17 studies (N = 3810) in order to estimate the effect size for the relation between positive (PA) and negative affect (NA), on the one hand, and EP, on the other hand. The meta-analysis includes studies in English language, published until August 2016. The results indicate a significant positive relation between PA and EP, r = 0.18. The overall NA – EP relation was not significant, r = -0.12. Only state NA has a significant negative relation with EP (r = -0.16). The moderating role of several conceptual (i.e., emotion duration, integrality etc.), sample (i.e., gender, age, education) and methodological characteristics of the studies (i.e., type of measurements etc.) are explored and implications for future research are discussed. PMID:28348534
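
    The pooling step behind an overall effect such as r = 0.18 is commonly carried out on Fisher-z transformed correlations weighted by sample size. The sketch below uses fabricated per-study correlations and sample sizes and a simple fixed-effect weighting purely to show the arithmetic; it does not reproduce the 17 studies or the random-effects model the authors may have used.

```python
import math

# Hypothetical (r, n) pairs for individual studies; not the actual data.
studies = [(0.25, 120), (0.10, 300), (0.22, 85), (0.15, 410)]

def pooled_correlation(studies):
    """Fixed-effect pooling of correlations via Fisher z, weighted by n - 3."""
    num = den = 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z transform
        w = n - 3                              # inverse-variance weight
        num += w * z
        den += w
    z_bar = num / den
    return math.tanh(z_bar)                    # back-transform to r

print(f"Pooled r = {pooled_correlation(studies):.3f}")
```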

  10. Modeling methodology for supply chain synthesis and disruption analysis

    NASA Astrophysics Data System (ADS)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically (2) the performance of synthesized supply chain system can be evaluated quantitatively (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of effects of a disruption.
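
    Reachability analysis, used here to verify whether a particular (possibly disrupted) system state can be reached, can be sketched as a search over an explicit state-transition graph. The supply-chain states and transitions below are hypothetical and are not the authors' model.

```python
# Hypothetical state-transition graph; each state is the condition of
# (supplier, manufacturer) and each edge is a feasible transition.
transitions = {
    ("normal", "normal"): [("disrupted", "normal")],
    ("disrupted", "normal"): [("disrupted", "starved"), ("normal", "normal")],
    ("disrupted", "starved"): [("disrupted", "shutdown")],
    ("disrupted", "shutdown"): [],
}

def reachable(start):
    """All states reachable from `start` via depth-first search."""
    seen, stack = set(), [start]
    while stack:
        state = stack.pop()
        if state in seen:
            continue
        seen.add(state)
        stack.extend(transitions.get(state, []))
    return seen

# Is a full manufacturer shutdown reachable after a supplier disruption?
print(("disrupted", "shutdown") in reachable(("normal", "normal")))  # True
```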

  11. The "Emotional Side" of Entrepreneurship: A Meta-Analysis of the Relation between Positive and Negative Affect and Entrepreneurial Performance.

    PubMed

    Fodor, Oana C; Pintea, Sebastian

    2017-01-01

    The experience of work in an entrepreneurial context is saturated with emotional experiences. While the literature on the relation between affect and entrepreneurial performance (EP) is growing, there was no quantitative integration of the results so far. This study addresses this gap and meta-analytically integrates the results from 17 studies ( N = 3810) in order to estimate the effect size for the relation between positive (PA) and negative affect (NA), on the one hand, and EP, on the other hand. The meta-analysis includes studies in English language, published until August 2016. The results indicate a significant positive relation between PA and EP, r = 0.18. The overall NA - EP relation was not significant, r = -0.12. Only state NA has a significant negative relation with EP ( r = -0.16). The moderating role of several conceptual (i.e., emotion duration, integrality etc.), sample (i.e., gender, age, education) and methodological characteristics of the studies (i.e., type of measurements etc.) are explored and implications for future research are discussed.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Y.J.; Sohn, G.H.; Kim, Y.J.

    Typical LBB (Leak-Before-Break) analysis is performed for the highest stress location for each different type of material in the high energy pipe line. In most cases, the highest stress occurs at the nozzle and pipe interface location at the terminal end. The standard finite element analysis approach to calculate J-Integral values at the crack tip utilizes symmetry conditions when modeling near the nozzle as well as away from the nozzle region to minimize the model size and simplify the calculation of J-integral values at the crack tip. A factor of two is typically applied to the J-integral value to account for symmetric conditions. This simplified analysis can lead to conservative results especially for small diameter pipes where the asymmetry of the nozzle-pipe interface is ignored. The stiffness of the residual piping system and non-symmetries of geometry along with different material for the nozzle, safe end and pipe are usually omitted in current LBB methodology. In this paper, the effects of non-symmetries due to geometry and material at the pipe-nozzle interface are presented. Various LBB analyses are performed for a small diameter piping system to evaluate the effect a nozzle has on the J-integral calculation, crack opening area and crack stability. In addition, material differences between the nozzle and pipe are evaluated. Comparison is made between a pipe model and a nozzle-pipe interface model, and a LBB PED (Piping Evaluation Diagram) curve is developed to summarize the results for use by piping designers.

  13. Combined Use of Integral Experiments and Covariance Data

    NASA Astrophysics Data System (ADS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.; Herman, M.; Hoblit, S. D.; McKnight, R. D.; Obložinský, P.; Talou, P.; Hale, G. M.; Hiruta, H.; Kawano, T.; Mattoon, C. M.; Nobre, G. P. A.; Palumbo, A.; Pigni, M.; Rising, M. E.; Yang, W.-S.; Kahler, A. C.

    2014-04-01

    In the frame of a US-DOE sponsored project, ANL, BNL, INL and LANL have performed a joint multidisciplinary research activity to explore the combined use of integral experiments and covariance data, with the objective both to give quantitative indications of possible improvements to the ENDF evaluated data files and, at the same time, to reduce crucial reactor design parameter uncertainties. Methods developed over the last four decades for these purposes have been improved by new developments that also benefited from continuous exchanges with international groups working in similar areas. The major new developments that allowed significant progress are to be found in several specific domains: a) new science-based covariance data; b) integral experiment covariance data assessment and improved experiment analysis, e.g., of sample irradiation experiments; c) sensitivity analysis, where several improvements were necessary despite the generally good understanding of these techniques, e.g., to account for fission spectrum sensitivity; d) a critical approach to the analysis of statistical adjustment performance, both a priori and a posteriori; e) generalization of the assimilation method, now applied for the first time not only to multigroup cross section data but also to nuclear model parameters (the "consistent" method). This article describes the major results obtained in each of these areas; a large-scale nuclear data adjustment, based on the use of approximately one hundred high-accuracy integral experiments, will be reported along with a significant example of the application of the new "consistent" method of data assimilation.
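
    Adjustments of this kind are commonly based on a generalized linear least-squares (GLLS) update of prior nuclear data using integral experiments. The toy sketch below illustrates that update with made-up sensitivities and covariances; it is not the project's adjustment code.

        # Toy generalized linear least-squares (GLLS) adjustment of prior nuclear
        # data parameters p using integral experiment values E and calculations C.
        import numpy as np

        p = np.array([1.00, 1.00])                  # prior parameters (normalized)
        M = np.diag([0.05 ** 2, 0.08 ** 2])         # prior parameter covariance
        S = np.array([[0.8, 0.3],                   # sensitivities dC/dp, 2 experiments
                      [0.2, 0.9]])
        E = np.array([1.02, 0.97])                  # measured integral values
        C = np.array([1.00, 1.00])                  # values calculated with prior p
        V = np.diag([0.01 ** 2, 0.01 ** 2])         # experimental covariance

        G = S @ M @ S.T + V                         # combined covariance of E - C
        K = M @ S.T @ np.linalg.inv(G)              # adjustment "gain"
        p_adj = p + K @ (E - C)                     # adjusted parameters
        M_adj = M - K @ S @ M                       # reduced posterior covariance

        print("adjusted parameters:", p_adj)
        print("posterior std devs:", np.sqrt(np.diag(M_adj)))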

  14. Application of Collocated GPS and Seismic Sensors to Earthquake Monitoring and Early Warning

    PubMed Central

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival, observe small-scale features of the movement from the integrated results, and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements with high accuracy and used for reliable fault slip inversion and magnitude estimation. PMID:24284765
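
    As a simplified, loosely-coupled illustration of how GPS displacements and accelerometer records can be fused, the sketch below runs a small Kalman filter with a displacement/velocity state, acceleration as the control input, and GPS displacement as the measurement; the data and noise levels are synthetic, and the paper's tightly-coupled integration at the raw-observation level is more involved than this.

        # Simplified loosely-coupled fusion of accelerometer and GPS displacement
        # via a Kalman filter (synthetic data and assumed noise levels).
        import numpy as np

        dt = 0.01                                   # accelerometer sample interval, s
        F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition for [disp, vel]
        B = np.array([[0.5 * dt ** 2], [dt]])       # acceleration input matrix
        H = np.array([[1.0, 0.0]])                  # GPS observes displacement only
        Q = 1e-4 * np.eye(2)                        # process noise (assumed)
        R = np.array([[1e-2]])                      # GPS measurement noise (assumed)

        rng = np.random.default_rng(0)
        accel = rng.normal(0.0, 0.1, 500)           # stand-in accelerometer record
        gps = np.cumsum(np.cumsum(accel) * dt) * dt # crude double integration as "GPS"

        x = np.zeros((2, 1))
        P = np.eye(2)
        for k in range(500):
            x = F @ x + B * accel[k]                # predict with accelerometer input
            P = F @ P @ F.T + Q
            if k % 10 == 0:                         # GPS arrives at 1/10 the accel rate
                z = np.array([[gps[k]]])
                K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
                x = x + K @ (z - H @ x)
                P = (np.eye(2) - K @ H) @ P

        print("final displacement estimate:", float(x[0, 0]))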

  15. Propulsion system/flight control integration for supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Reukauf, P. J.; Burcham, F. W., Jr.

    1976-01-01

    Digital integrated control systems are studied. Such systems allow undesirable interactions to be minimized while performance is maximized at all flight conditions. One such program is the YF-12 cooperative control program, in which the existing analog air data computer, autothrottle, autopilot, and inlet control systems are converted to digital systems by using a general purpose airborne computer and interface unit. Existing control laws are programmed and tested in flight. Integrated control laws, derived using accurate mathematical models of the airplane and propulsion system in conjunction with modern control techniques, are tested in flight. Analysis indicates that an integrated autothrottle/autopilot gives good flight path control and that observers can be used to replace failed sensors.

  16. Development of a Time-Variant Figure-of-Merit for Use in Analysis of Air Combat Maneuvering Engagements

    DTIC Science & Technology

    1976-07-16

    [Figure-list and nomenclature excerpt from the report front matter: figures include "Range Performance Penalty Function," "Influence of Closing Velocity," and "Energy Influence Function"; nomenclature defines E (energy integral), K (energy influence function), K* (proportionality constant), MT (target Mach number), and Nz (normal acceleration, load factor).]

  17. Military engine computational structures technology

    NASA Technical Reports Server (NTRS)

    Thomson, Daniel E.

    1992-01-01

    Integrated High Performance Turbine Engine Technology Initiative (IHPTET) goals require a strong analytical base. Effective analysis of composite materials is critical to life analysis and structural optimization, and accurate life prediction for all material systems is critical. User-friendly systems are desirable, and post-processing of results is very important. The IHPTET goal is to double turbine engine propulsion capability by the year 2003; fifty percent of the goal will come from advanced materials and structures, and the other fifty percent will come from increased performance. Relevant computer programs are listed.

  18. Environmental science applications with Rapid Integrated Mapping and analysis System (RIMS)

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A.; Prusevich, A.; Gordov, E.; Okladnikov, I.; Titov, A.

    2016-11-01

    The Rapid Integrated Mapping and analysis System (RIMS) has been developed at the University of New Hampshire as an online instrument for multidisciplinary data visualization, analysis and manipulation with a focus on hydrological applications. Recently it was enriched with data and tools to allow more sophisticated analysis of interdisciplinary data. Three different examples of specific scientific applications with RIMS are demonstrated and discussed. Analysis of historical changes in major components of the Eurasian pan-Arctic water budget is based on historical discharge data, gridded observational meteorological fields, and remote sensing data for sea ice area. Express analysis of the extremely hot and dry summer of 2010 across European Russia is performed using a combination of near-real time and historical data to evaluate the intensity and spatial distribution of this event and its socioeconomic impacts. Integrative analysis of hydrological, water management, and population data for Central Asia over the last 30 years provides an assessment of regional water security due to changes in climate, water use and demography. The presented case studies demonstrate the capabilities of RIMS as a powerful instrument for hydrological and coupled human-natural systems research.

  19. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    NASA Technical Reports Server (NTRS)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in performing the development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of a framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed workstation-type computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.

  20. Integrated microfluidic technology for sub-lethal and behavioral marine ecotoxicity biotests

    NASA Astrophysics Data System (ADS)

    Huang, Yushi; Reyes Aldasoro, Constantino Carlos; Persoone, Guido; Wlodkowic, Donald

    2015-06-01

    Changes in behavioral traits exhibited by small aquatic invertebrates are increasingly postulated as ethically acceptable and more sensitive endpoints for detection of water-borne ecotoxicity than conventional mortality assays. Despite the importance of such behavioral biotests, their implementation is profoundly limited by the lack of appropriate biocompatible automation, integrated optoelectronic sensors, and the associated electronics and analysis algorithms. This work outlines development of a proof-of-concept miniaturized Lab-on-a-Chip (LOC) platform for rapid water toxicity tests based on changes in swimming patterns exhibited by Artemia franciscana (Artoxkit M™) nauplii. In contrast to conventionally performed end-point analysis based on counting the numbers of dead/immobile specimens, we performed a time-resolved video data analysis to dynamically assess the impact of a reference toxicant on the swimming pattern of A. franciscana. Our system design combined: (i) an innovative microfluidic device keeping free-swimming Artemia sp. nauplii under continuous microperfusion as a means of toxin delivery; (ii) a mechatronic interface for user-friendly fluidic actuation of the chip; and (iii) miniaturized video acquisition for movement analysis of test specimens. The system was capable of performing fully programmable time-lapse and video-microscopy of multiple samples for rapid ecotoxicity analysis. It enabled development of a user-friendly and inexpensive test protocol to dynamically detect sub-lethal behavioral end-points such as changes in speed of movement or distance traveled by each animal.
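
    As a rough sketch of the movement end-points mentioned last (speed and distance traveled), the code below derives them from a tracked centroid trajectory; the trajectory, frame rate, and units are synthetic placeholders, and this is not the authors' video-analysis pipeline.

        # Per-frame speed and total distance traveled from a tracked centroid
        # trajectory (synthetic positions; units and frame rate are placeholders).
        import numpy as np

        fps = 15.0                                            # assumed frame rate
        rng = np.random.default_rng(0)
        xy = np.cumsum(rng.normal(0.0, 0.2, size=(300, 2)), axis=0)  # (x, y) in mm

        steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # per-frame displacement, mm
        distance = steps.sum()                                # total distance traveled
        speed = steps * fps                                   # instantaneous speed, mm/s

        print(f"distance = {distance:.1f} mm, mean speed = {speed.mean():.2f} mm/s")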

  1. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian networks, pharmacophore modeling, and a structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprising comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. Ligand pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.

  2. Liquid chromatography coupled with time-of-flight and ion trap mass spectrometry for qualitative analysis of herbal medicines.

    PubMed

    Chen, Xiao-Fei; Wu, Hai-Tang; Tan, Guang-Guo; Zhu, Zhen-Yu; Chai, Yi-Feng

    2011-11-01

    With the expansion of the herbal medicine (HM) market, the issue of how to apply up-to-date analytical tools to the qualitative analysis of HMs to assure their quality, safety and efficacy has been arousing great attention. Due to its inherent characteristics of accurate mass measurement and multiple-stage analysis, the integrated strategy of liquid chromatography (LC) coupled with time-of-flight mass spectrometry (TOF-MS) and ion trap mass spectrometry (IT-MS) is well suited to serve as a qualitative analysis tool in this field. The purpose of this review is to provide an overview of the potential of this integrated strategy, including a review of the general features of LC-IT-MS and LC-TOF-MS, the advantages of their combination, the common procedures for structure elucidation, the potential of LC-hybrid-IT-TOF/MS, and a summary and discussion of the applications of the integrated strategy for HM qualitative analysis (2006-2011). The advantages and future developments of LC coupled with IT and TOF-MS are highlighted.

  3. The Necessity of Functional Analysis for Space Exploration Programs

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Breidenthal, Julian C.

    2011-01-01

    As NASA moves toward expanded commercial spaceflight within its human exploration capability, there is increased emphasis on how to allocate responsibilities between government and commercial organizations to achieve coordinated program objectives. The practice of program-level functional analysis offers an opportunity for improved understanding of collaborative functions among heterogeneous partners. Functional analysis is contrasted with the physical analysis more commonly done at the program level, and is shown to provide theoretical performance, risk, and safety advantages beneficial to a government-commercial partnership. Performance advantages include faster convergence to acceptable system solutions; discovery of superior solutions with higher commonality, greater simplicity and greater parallelism by substituting functional for physical redundancy to achieve robustness and safety goals; and greater organizational cohesion around program objectives. Risk advantages include avoidance of rework by revelation of some kinds of architectural and contractual mismatches before systems are specified, designed, constructed, or integrated; avoidance of cost and schedule growth by more complete and precise specifications of cost and schedule estimates; and higher likelihood of successful integration on the first try. Safety advantages include effective delineation of must-work and must-not-work functions for integrated hazard analysis, the ability to formally demonstrate completeness of safety analyses, and provably correct logic for certification of flight readiness. The key mechanism for realizing these benefits is the development of an inter-functional architecture at the program level, which reveals relationships between top-level system requirements that would otherwise be invisible using only a physical architecture. This paper describes the advantages and pitfalls of functional analysis as a means of coordinating the actions of large heterogeneous organizations for space exploration programs.

  4. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  5. Air Vehicle Technology Integration Program (AVTIP). Delivery Order 0054: Opportune Landing Site (OLS) Critical Experiment

    DTIC Science & Technology

    2008-04-01

    suitability would result in safer landings and reduced maintenance costs associated with an intended area of operations 2.1.2. Concept of ... cost, integration, logistics, ownership, performance, schedule, and user perception. Criteria were developed for three timeframes reflecting the end ... analysis. Changed runway finder back to six cardinal headings or user-specified headings. Added NASA ACCA cloud recognition filter. Added switches for ...

  6. Philosophy of ATHEANA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bley, D.C.; Cooper, S.E.; Forester, J.A.

    ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents, identified through retrospective analysis of serious operational events, to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.

  7. Failure analysis of electronic parts: Laboratory methods. [for destructive and nondestructive testing

    NASA Technical Reports Server (NTRS)

    Anstead, R. J. (Editor); Goldberg, E. (Editor)

    1975-01-01

    Failure analysis test methods are presented for use in analyzing candidate electronic parts and in improving future design reliability. Each test is classified as nondestructive, semidestructive, or destructive. The effects upon applicable part types (e.g., integrated circuit, transistor) are discussed. Methodology is given for performing the following: immersion tests, radiographic tests, dewpoint tests, gas ambient analysis, cross sectioning, and ultraviolet examination.

  8. Task 2 Report: Algorithm Development and Performance Analysis

    DTIC Science & Technology

    1993-07-01

    [Figure-list excerpt: example GC data for Schedule 3 phosphites showing analysis methods that integrate separated peaks, follow the baseline more closely, or result in unwanted integration (Figures 7-16 through 7-18).] ... much of the ambiguity that can arise in GC/MS with trace environmental samples, for example. Correlated chromatography, on the other hand, separates the ...

  9. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  10. Integrated Microfluidic Lectin Barcode Platform for High-Performance Focused Glycomic Profiling

    NASA Astrophysics Data System (ADS)

    Shang, Yuqin; Zeng, Yun; Zeng, Yong

    2016-02-01

    Protein glycosylation is one of the key processes that play essential roles in biological functions and dysfunctions. However, progress in glycomics has considerably lagged behind genomics and proteomics, due in part to the enormous challenges in analysis of glycans. Here we present a new integrated and automated microfluidic lectin barcode platform to substantially improve the performance of lectin array for focused glycomic profiling. The chip design and flow control were optimized to promote the lectin-glycan binding kinetics and speed of lectin microarray. Moreover, we established an on-chip lectin assay which employs a very simple blocking method to effectively suppress the undesired background due to lectin binding of antibodies. Using this technology, we demonstrated focused differential profiling of tissue-specific glycosylation changes of a biomarker, CA125 protein purified from ovarian cancer cell line and different tissues from ovarian cancer patients in a fast, reproducible, and high-throughput fashion. Highly sensitive CA125 detection was also demonstrated with a detection limit much lower than the clinical cutoff value for cancer diagnosis. This microfluidic platform holds the potential to integrate with sample preparation functions to construct a fully integrated “sample-to-answer” microsystem for focused differential glycomic analysis. Thus, our technology should present a powerful tool in support of rapid advance in glycobiology and glyco-biomarker development.

  11. Analysis of integrated photovoltaic-thermal systems using solar concentrators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yusoff, M.B.

    1983-01-01

    An integrated photovoltaic-thermal system using solar concentrators utilizes the solar radiation spectrum in the production of electrical and thermal energy. The electrical conversion efficiency of this system decreases with increasing solar cell temperature. Since a high operating temperature is desirable to maximize the quality of the thermal output of the planned integrated system, a proper choice of the operating temperature for the unit cell is of vital importance. The analysis predicts performance characteristics of the unit cell by considering the dependence of the heat generation, the heat absorption and the heat transmission on the material properties of the unit cell structure. An analytical model has been developed to describe the heat transport phenomena occurring in the unit cell structure. The range of applicability of the one-dimensional and the two-dimensional models, which have closed-form solutions, has been demonstrated. Parametric and design studies point out the requirements for good electrical and thermal performance. A procedure utilizing functional forms of component characteristics, in the form of partial coefficients of the dependent variable, has been developed to design and operate the integrated system so that it has a desirable value of the thermal to electrical output ratio in both design and operating modes.
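
    The temperature sensitivity noted at the start is commonly captured by a linear efficiency-temperature relation; the sketch below uses typical crystalline-silicon reference values (not the study's) purely to illustrate the trade-off between cell temperature and electrical output.

        # Linear temperature dependence of PV electrical efficiency
        # (typical crystalline-silicon values, not those of the analyzed system).
        def cell_efficiency(t_cell_c, eta_ref=0.15, beta=0.0045, t_ref_c=25.0):
            """eta(T) = eta_ref * (1 - beta * (T - T_ref)), temperatures in deg C."""
            return eta_ref * (1.0 - beta * (t_cell_c - t_ref_c))

        for t in (25, 60, 100):
            print(t, "deg C ->", round(cell_efficiency(t), 4))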

  12. Integrated Microfluidic Lectin Barcode Platform for High-Performance Focused Glycomic Profiling

    PubMed Central

    Shang, Yuqin; Zeng, Yun; Zeng, Yong

    2016-01-01

    Protein glycosylation is one of the key processes that play essential roles in biological functions and dysfunctions. However, progress in glycomics has considerably lagged behind genomics and proteomics, due in part to the enormous challenges in analysis of glycans. Here we present a new integrated and automated microfluidic lectin barcode platform to substantially improve the performance of lectin array for focused glycomic profiling. The chip design and flow control were optimized to promote the lectin-glycan binding kinetics and speed of lectin microarray. Moreover, we established an on-chip lectin assay which employs a very simple blocking method to effectively suppress the undesired background due to lectin binding of antibodies. Using this technology, we demonstrated focused differential profiling of tissue-specific glycosylation changes of a biomarker, CA125 protein purified from ovarian cancer cell line and different tissues from ovarian cancer patients in a fast, reproducible, and high-throughput fashion. Highly sensitive CA125 detection was also demonstrated with a detection limit much lower than the clinical cutoff value for cancer diagnosis. This microfluidic platform holds the potential to integrate with sample preparation functions to construct a fully integrated “sample-to-answer” microsystem for focused differential glycomic analysis. Thus, our technology should present a powerful tool in support of rapid advance in glycobiology and glyco-biomarker development. PMID:26831207

  13. Numerical simulation of actuation behavior of active fiber composites in helicopter rotor blade application

    NASA Astrophysics Data System (ADS)

    Paik, Seung Hoon; Kim, Ji Yeon; Shin, Sang Joon; Kim, Seung Jo

    2004-07-01

    Smart structures incorporating active materials have been designed and analyzed to improve aerospace vehicle performance and its vibration/noise characteristics. Helicopter integral blade actuation is one example of those efforts, using embedded anisotropic piezoelectric actuators. To design and analyze such integrally-actuated blades, a beam approach based on homogenization methodology has traditionally been used. Using this approach, the global behavior of the structure is predicted in an averaged sense. However, this approach has intrinsic limitations in describing local behaviors at the level of the constituents. For example, failure analysis of the individual active fibers requires knowledge of the local behaviors. A microscopic approach for the analysis of integrally-actuated structures is established in this paper. Piezoelectric fibers and matrices are modeled individually, and the finite element method using three-dimensional solid elements is adopted. Due to the huge size of the resulting finite element meshes, high performance computing technology is required in the solution process. The present methodology is referred to as Direct Numerical Simulation (DNS) of the smart structure. As an initial validation effort, the present analytical results are correlated with experiments on a small-scaled integrally-actuated blade, the Active Twist Rotor (ATR). Through DNS, the local stress distribution around the interface of fiber and matrix can be analyzed.

  14. Graph theory network function in Parkinson's disease assessed with electroencephalography.

    PubMed

    Utianski, Rene L; Caviness, John N; van Straaten, Elisabeth C W; Beach, Thomas G; Dugger, Brittany N; Shill, Holly A; Driver-Dunckley, Erika D; Sabbagh, Marwan N; Mehta, Shyamal; Adler, Charles H; Hentz, Joseph G

    2016-05-01

    To determine what differences exist in graph theory network measures derived from electroencephalography (EEG) between Parkinson's disease (PD) patients who are cognitively normal (PD-CN) and matched healthy controls, and between PD-CN and PD dementia (PD-D). EEG recordings were analyzed via graph theory network analysis to quantify changes in global efficiency and local integration, including minimal spanning tree analysis. T-tests and correlations were used to assess differences between groups and the relationship with cognitive performance. Network measures showed increased local integration across all frequency bands between controls and PD-CN; in contrast, decreased local integration occurred in PD-D when compared to PD-CN in the alpha1 frequency band. Differences found in PD with mild cognitive impairment (PD-MCI) mirrored PD-D. Correlations were found between network measures and assessments of global cognitive performance in PD. Our results reveal distinct patterns of band and network measure type alteration and breakdown for PD, as well as with cognitive decline in PD. These patterns suggest specific ways that interaction between cortical areas becomes abnormal and contributes to PD symptoms at various stages. Graph theory analysis by EEG suggests that network alteration and breakdown are robust attributes of PD cortical dysfunction pathophysiology. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
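
    For readers unfamiliar with the named graph measures, the sketch below computes a global efficiency value and a minimum spanning tree with networkx on a toy connectivity matrix; the random weights stand in for EEG functional connectivity, and band-specific preprocessing and thresholding are omitted.

        # Global efficiency and a minimum spanning tree on a toy connectivity matrix
        # (random symmetric weights stand in for EEG functional connectivity).
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(1)
        w = rng.uniform(0.1, 1.0, size=(8, 8))
        w = (w + w.T) / 2.0                          # symmetric "connectivity"
        np.fill_diagonal(w, 0.0)

        G = nx.from_numpy_array(w)                   # weighted, undirected graph
        # networkx's global_efficiency uses hop-count shortest paths (unweighted)
        print("global efficiency:", nx.global_efficiency(G))

        dist = 1.0 - w                               # convert connectivity to "distance"
        np.fill_diagonal(dist, 0.0)
        mst = nx.minimum_spanning_tree(nx.from_numpy_array(dist))
        print("MST edges:", mst.number_of_edges())   # n - 1 = 7 edges for 8 nodes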

  15. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment: a 1.2 PFlop supercomputer (Raijin), an HPC-class, 3000-core OpenStack cloud system, and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental states.

  16. Mutual information optimization for mass spectra data alignment.

    PubMed

    Zoppis, Italo; Gianazza, Erica; Borsani, Massimiliano; Chinello, Clizia; Mainini, Veronica; Galbusera, Carmen; Ferrarese, Carlo; Galimberti, Gloria; Sorbi, Sandro; Borroni, Barbara; Magni, Fulvio; Antoniotti, Marco; Mauri, Giancarlo

    2012-01-01

    "Signal" alignments play critical roles in many clinical setting. This is the case of mass spectrometry data, an important component of many types of proteomic analysis. A central problem occurs when one needs to integrate (mass spectrometry) data produced by different sources, e.g., different equipment and/or laboratories. In these cases some form of "data integration'" or "data fusion'" may be necessary in order to discard some source specific aspects and improve the ability to perform a classification task such as inferring the "disease classes'" of patients. The need for new high performance data alignments methods is therefore particularly important in these contexts. In this paper we propose an approach based both on an information theory perspective, generally used in a feature construction problem, and on the application of a mathematical programming task (i.e. the weighted bipartite matching problem). We present the results of a competitive analysis of our method against other approaches. The analysis was conducted on data from plasma/ethylenediaminetetraacetic acid (EDTA) of "control" and Alzheimer patients collected from three different hospitals. The results point to a significant performance advantage of our method with respect to the competing ones tested.

  17. Initial Multidisciplinary Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; hide

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  18. Shuttle: Reaction control system. Cryogenic liquid distribution system: Study

    NASA Technical Reports Server (NTRS)

    Akkerman, J. W.

    1972-01-01

    A cryogenic liquid distribution system suitable for the reaction control system on space shuttles is described. The system thermodynamics, operation, performance and weight analysis are discussed along with the design, maintenance and integration concepts.

  19. Engineering Analysis of Stresses in Railroad Rails.

    DOT National Transportation Integrated Search

    1981-10-01

    One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...

  20. Statistical Test of Expression Pattern (STEPath): a new strategy to integrate gene expression data with genomic information in individual and meta-analysis studies.

    PubMed

    Martini, Paolo; Risso, Davide; Sales, Gabriele; Romualdi, Chiara; Lanfranchi, Gerolamo; Cagnin, Stefano

    2011-04-11

    In the last decades, microarray technology has spread, leading to a dramatic increase in publicly available datasets. The first statistical tools developed were focused on the identification of significantly differentially expressed genes. Later, researchers moved toward the systematic integration of gene expression profiles with additional biological information, such as chromosomal location, ontological annotations or sequence features. The analysis of gene expression linked to the physical location of genes on chromosomes allows the identification of transcriptionally imbalanced regions, while Gene Set Analysis focuses on the detection of coordinated changes in transcriptional levels among sets of biologically related genes. In this field, meta-analysis offers the possibility to compare different studies addressing the same biological question, to fully exploit public gene expression datasets. We describe STEPath, a method that starts from gene expression profiles and integrates the analysis of imbalanced regions as an a priori step before performing gene set analysis. The application of STEPath in individual studies produced gene set scores weighted by chromosomal activation. As a final step, we propose a way to compare these scores across different studies (meta-analysis) on related biological issues. One complication with meta-analysis is batch effects, which occur because molecular measurements are affected by laboratory conditions, reagent lots and personnel differences. Major problems occur when batch effects are correlated with an outcome of interest and lead to incorrect conclusions. We evaluated the power of combining chromosome mapping and gene set enrichment analysis, performing the analysis on a dataset of leukaemia (an example of an individual study) and on a dataset of skeletal muscle diseases (the meta-analysis approach). In leukaemia, we identified the Hox gene set, a gene set closely related to the pathology that other algorithms of gene set analysis do not identify, while the meta-analysis approach on muscular diseases discriminates between related pathologies and correlates similar ones from different studies. STEPath is a new method that integrates gene expression profiles, genomic co-expressed regions and information about the biological function of genes. The use of the STEPath-computed gene set scores overcomes batch effects in meta-analysis approaches, allowing the direct comparison of different pathologies and different studies at the gene set activation level.
