Sample records for structural analysis tools

  1. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  2. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  3. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
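The hierarchical, object-oriented data structure described above, assemblies of components carrying their own data and algorithms, can be sketched in a few lines of Python; the class, attribute, and component names below are illustrative assumptions, not taken from the JPL tool.

```python
class Component:
    """A node in a hierarchical assembly; carries per-discipline data."""

    def __init__(self, name, **data):
        self.name = name
        self.data = data          # e.g. mass, stiffness, optical parameters
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def total(self, key):
        """Roll a numeric property up through the assembly tree."""
        return self.data.get(key, 0.0) + sum(c.total(key) for c in self.children)

sc = Component("spacecraft")
bus = sc.add(Component("bus", mass=150.0))
optics = sc.add(Component("optics"))
optics.add(Component("primary_mirror", mass=40.0))
optics.add(Component("secondary_mirror", mass=5.0))
print(sc.total("mass"))  # 195.0
```

Analytical methods would attach their own data and algorithms to such nodes; the roll-up shows how a system-level quantity can be derived from the assembly.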

  4. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  5. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU General Public License (GPL) and is available to download from http://strauto.popgen.org.
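The Evanno ΔK statistic mentioned above is simple enough to sketch: for each K it divides the mean absolute second difference of the replicate log-likelihoods by their standard deviation, and the K with the largest ΔK is taken as best supported. A minimal illustration with made-up likelihood values (not StrAuto's actual code):

```python
from statistics import mean, stdev

def evanno_delta_k(loglik):
    """loglik maps K -> list of ln Pr(X|K) values, one per replicate run."""
    out = {}
    for k in sorted(loglik)[1:-1]:
        # |L(K-1) - 2L(K) + L(K+1)| per replicate, normalized by the
        # spread of the replicate likelihoods at K
        second = [abs(a - 2 * b + c)
                  for a, b, c in zip(loglik[k - 1], loglik[k], loglik[k + 1])]
        out[k] = mean(second) / stdev(loglik[k])
    return out

runs = {1: [-300.0, -310.0], 2: [-200.0, -210.0],
        3: [-190.0, -195.0], 4: [-185.0, -188.0]}
dk = evanno_delta_k(runs)
print(max(dk, key=dk.get))  # K with the strongest support: 2
```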

  6. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  7. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  8. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance; it can also yield reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors; however, they cannot detect errors in specifications or poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.
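One class of specification error that structured-analysis tools flag can be sketched directly: in data-flow-diagram terms, a process with no inputs is a "miracle" and one with no outputs a "black hole". A minimal, illustrative consistency check (not TEAMWORK's actual implementation; the process names are hypothetical):

```python
def dfd_problems(processes, flows):
    """processes: process names; flows: (source, target) data-flow pairs."""
    ins = {p: 0 for p in processes}
    outs = {p: 0 for p in processes}
    for src, dst in flows:
        if dst in ins:
            ins[dst] += 1
        if src in outs:
            outs[src] += 1
    problems = []
    for p in processes:
        if ins[p] == 0:
            problems.append((p, "miracle: no inputs"))
        if outs[p] == 0:
            problems.append((p, "black hole: no outputs"))
    return problems

flows = [("user", "validate_order"), ("validate_order", "ship_order")]
print(dfd_problems(["validate_order", "ship_order"], flows))
```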

  9. Design of a Syntax Validation Tool for Requirements Analysis Using Structured Analysis and Design Technique (SADT)

    DTIC Science & Technology

    1988-09-01

    analysis phase of the software life cycle (16:1-1). While editing a SADT diagram, the tool should be able to check whether or not structured analysis...diagrams are valid for the SADT’s syntax, produce error messages, do error recovery, and offer editing suggestions. Thus, this tool must have the...directed editors are editors which use the syntax of the programming language while editing a program. While text editors treat programs as text, syntax

  10. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
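The Monte Carlo idea, scatter the input variables, rerun the deterministic model, and rank which input drives the output, can be illustrated with a hand-built example (a cantilever tip deflection rather than a shell model; the values and the use of Pearson correlation as the influence measure are assumptions, not MSC.Robust Design's method):

```python
import random
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def monte_carlo(runs=2000, seed=1):
    """Scatter E and t, propagate through d = P*L^3 / (3*E*I)."""
    rng = random.Random(seed)
    E0, t0, P, L, b = 70e9, 0.004, 500.0, 1.0, 0.05
    Es, ts, ds = [], [], []
    for _ in range(runs):
        E = rng.gauss(E0, 0.02 * E0)   # ~2% scatter in modulus
        t = rng.gauss(t0, 0.05 * t0)   # ~5% scatter in thickness
        I = b * t ** 3 / 12            # rectangular-section inertia
        ds.append(P * L ** 3 / (3 * E * I))
        Es.append(E)
        ts.append(t)
    return pearson(Es, ds), pearson(ts, ds)

r_E, r_t = monte_carlo()
print(r_E, r_t)
```

Because thickness enters the response cubed, its 5% scatter dominates the 2% modulus scatter, exactly the kind of cause-and-effect ranking the abstract describes.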

  11. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    PubMed

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
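The Nussinov algorithm the abstract cites maximizes base pairings by dynamic programming; a compact, self-contained sketch (DNA alphabet, Watson-Crick pairs only, minimum hairpin loop of three, all parameter choices illustrative rather than RDNAnalyzer's), together with the reverse-complement operation the tool also offers:

```python
def nussinov(seq, min_loop=3):
    """Maximum number of nested Watson-Crick pairs in a DNA sequence."""
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]               # j left unpaired
            for k in range(i, j - min_loop):  # pair j with some k
                if (seq[k], seq[j]) in pairs:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

def reverse_complement(seq):
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

print(nussinov("GGGAAACCC"))        # 3 stacked G-C pairs around an AAA loop
print(reverse_complement("ATGC"))   # GCAT
```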

  12. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was developed originally at Stanford University and later further developed by the Air Force; and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM model for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete. This package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  13. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis

    PubMed Central

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. Availability: http://www.cemb.edu.pk/sw.html Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language. PMID:23055611

  14. Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2015-01-01

    HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
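Nastran bulk data files like those HCDstruct emits are fixed-format text; in the small-field form each card is a sequence of eight-character fields. A minimal illustrative writer (the IDs, coordinates, and connectivity below are made up, and real decks also require property, material, and case-control entries):

```python
def bulk_card(fields):
    """Format one NASTRAN small-field bulk data card (8-char columns)."""
    return "".join(str(f)[:8].ljust(8) for f in fields).rstrip()

# a grid point and a quad shell element referencing four such points
print(bulk_card(["GRID", 101, "", 1.5, 0.0, 0.25]))
print(bulk_card(["CQUAD4", 7, 1, 101, 102, 103, 104]))
```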

  15. ULg Spectra: An Interactive Software Tool to Improve Undergraduate Students' Structural Analysis Skills

    ERIC Educational Resources Information Center

    Agnello, Armelinda; Carre, Cyril; Billen, Roland; Leyh, Bernard; De Pauw, Edwin; Damblon, Christian

    2018-01-01

    The analysis of spectroscopic data to solve chemical structures requires practical skills and drills. In this context, we have developed ULg Spectra, a computer-based tool designed to improve the ability of learners to perform complex reasoning. The identification of organic chemical compounds involves gathering and interpreting complementary…

  16. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
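The stress-intensity-factor approach mentioned above reduces, in its simplest textbook form, to K = Y·σ·√(πa); comparing K against the material's fracture toughness K_Ic gives a residual-strength criterion. A small sketch (the Y = 1 geometry factor and the numbers are illustrative, not taken from the NASA codes):

```python
import math

def stress_intensity(sigma, a, Y=1.0):
    """K = Y * sigma * sqrt(pi * a); sigma in MPa, crack length a in metres."""
    return Y * sigma * math.sqrt(math.pi * a)

def critical_crack_length(K_Ic, sigma, Y=1.0):
    """Invert K = K_Ic: crack length at which fracture is predicted."""
    return (K_Ic / (Y * sigma)) ** 2 / math.pi

K = stress_intensity(100.0, 0.01)       # 100 MPa stress, 10 mm crack
print(round(K, 2))                      # 17.72 MPa*sqrt(m)
print(critical_crack_length(K, 100.0))  # round-trips to ~0.01 m
```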

  17. Design and application of a tool for structuring, capitalizing and making more accessible information and lessons learned from accidents involving machinery.

    PubMed

    Sadeghi, Samira; Sadeghi, Leyla; Tricot, Nicolas; Mathieu, Luc

    2017-12-01

    Accident reports are published in order to communicate the information and lessons learned from accidents. An efficient accident recording and analysis system is a necessary step towards improvement of safety. However, currently there is a shortage of efficient tools to support such recording and analysis. In this study we introduce a flexible and customizable tool that allows structuring and analysis of this information. This tool has been implemented under TEEXMA®. We named our prototype TEEXMA®SAFETY. This tool provides an information management system to facilitate data collection, organization, query, analysis and reporting of accidents. A predefined information retrieval module provides ready access to data which allows the user to quickly identify the possible hazards for specific machines and provides information on the source of hazards. The main target audience for this tool includes safety personnel, accident reporters and designers. The proposed data model has been developed by analyzing different accident reports.

  18. R-based Tool for a Pairwise Structure-activity Relationship Analysis.

    PubMed

    Klimenko, Kyrylo

    2018-04-01

    Structure-Activity Relationship (SAR) analysis is a complex process that can be enhanced by computational techniques. This article describes a simple tool for SAR analysis that has a graphical user interface and a flexible approach towards the input of molecular data. The application allows calculating molecular similarity, represented by the Tanimoto index and Euclidean distance, as well as determining activity cliffs by means of the Structure-Activity Landscape Index. The calculation is performed in a pairwise manner, either for the reference compound against the other compounds or for all possible pairs in the data set. The results of SAR analysis are visualized using two types of plot. The application's capability is demonstrated by the analysis of a set of COX2 inhibitors with respect to Isoxicam. This tool is available online; it includes a manual and input file examples. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
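The two quantities named in the abstract are short formulas. A sketch using bit-set fingerprints (the fingerprints and activity values are invented; the article's own tool is R-based, and Python is used here only for consistency with the other examples in this listing):

```python
def tanimoto(fp1, fp2):
    """Tanimoto index of two fingerprints given as sets of 'on' bits."""
    union = len(fp1 | fp2)
    return len(fp1 & fp2) / union if union else 1.0

def sali(act1, act2, similarity):
    """Structure-Activity Landscape Index: large for activity cliffs,
    i.e. similar structures with very different activities."""
    if similarity >= 1.0:
        return float("inf")
    return abs(act1 - act2) / (1.0 - similarity)

print(tanimoto({1, 2, 3}, {2, 3, 4}))  # 0.5
print(sali(7.2, 5.2, 0.5))             # ~4.0
```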

  19. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structure element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis, and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in the computational cost and without significant modifications to the analysis tools.

  20. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated generally good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching.
CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
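The core spectral idea, estimate the 2D power spectrum and read the dominant orientation from the location of its strongest peak, fits in a toy pure-Python version (CytoSpectre itself is far more sophisticated; this brute-force DFT is illustrative only and practical only for tiny images):

```python
import cmath
import math

def dft2_power(img):
    """Brute-force 2D DFT power spectrum of a small grayscale image."""
    N, M = len(img), len(img[0])
    P = [[0.0] * M for _ in range(N)]
    for ky in range(N):
        for kx in range(M):
            s = 0j
            for y in range(N):
                for x in range(M):
                    s += img[y][x] * cmath.exp(-2j * math.pi * (ky * y / N + kx * x / M))
            P[ky][kx] = abs(s) ** 2
    return P

def dominant_orientation(img):
    """Orientation (degrees, 0-180) of the strongest periodic structure."""
    N, M = len(img), len(img[0])
    P = dft2_power(img)
    best, peak = -1.0, (0, 1)
    for ky in range(N):
        for kx in range(M):
            if (ky, kx) != (0, 0) and P[ky][kx] > best:  # skip the DC term
                best, peak = P[ky][kx], (ky, kx)
    ky, kx = peak
    fy = ky if ky <= N // 2 else ky - N    # signed spatial frequencies
    fx = kx if kx <= M // 2 else kx - M
    normal = math.degrees(math.atan2(fy, fx))  # direction of the wave normal
    return (normal + 90.0) % 180.0             # stripes run perpendicular to it

# vertical stripes (period 16/3 pixels along x) -> structures oriented at 90 deg
img = [[math.cos(2 * math.pi * 3 * x / 16) for x in range(16)] for y in range(16)]
print(dominant_orientation(img))  # 90.0
```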

  1. POLYVIEW-MM: web-based platform for animation and analysis of molecular simulations

    PubMed Central

    Porollo, Aleksey; Meller, Jaroslaw

    2010-01-01

    Molecular simulations offer important mechanistic and functional clues in studies of proteins and other macromolecules. However, interpreting the results of such simulations increasingly requires tools that can combine information from multiple structural databases and other web resources, and provide highly integrated and versatile analysis tools. Here, we present a new web server that integrates high-quality animation of molecular motion (MM) with structural and functional analysis of macromolecules. The new tool, dubbed POLYVIEW-MM, enables animation of trajectories generated by molecular dynamics and related simulation techniques, as well as visualization of alternative conformers, e.g. obtained as a result of protein structure prediction methods or small molecule docking. To facilitate structural analysis, POLYVIEW-MM combines interactive view and analysis of conformational changes using Jmol and its tailored extensions, publication quality animation using PyMol, and customizable 2D summary plots that provide an overview of MM, e.g. in terms of changes in secondary structure states and relative solvent accessibility of individual residues in proteins. Furthermore, POLYVIEW-MM integrates visualization with various structural annotations, including automated mapping of known interaction sites from structural homologs, mapping of cavities and ligand binding sites, transmembrane regions and protein domains. URL: http://polyview.cchmc.org/conform.html. PMID:20504857

  2. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  3. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  4. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  5. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments affect both patients and nurses negatively. The aim was to validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and a structural equation model to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r = 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emerged factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as with the original tools. © 2016 John Wiley & Sons, Ltd.
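The reported common-variance range follows directly from the inter-tool correlations, since two correlated measures share r² of their variance; a one-line check of the abstract's figures:

```python
def shared_variance(r):
    """Proportion of variance two correlated measures have in common."""
    return r * r

print(round(shared_variance(0.428) * 100, 1))  # 18.3 (% common variance)
print(round(shared_variance(0.867) * 100, 1))  # ~75.2 from the rounded r; the study reports 75.1%
```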

  6. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It can be a challenging task for users, especially newcomers, to develop analysis jobs managing the configuration of the many required modules. For this reason a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing the effects of their actions in real time. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules and check the data flow. They can see the values at which parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
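The closing sentence describes the key design choice: every tool derives from a common abstraction and edits the Python job configuration through one interface, which is what lets a GUI steer tools generically. A schematic sketch of that pattern (class and parameter names are invented; this is not the actual PAT or Config Editor API):

```python
class ConfigTool:
    """Abstraction layer: every tool applies itself to a job configuration."""
    def __call__(self, process):
        raise NotImplementedError

class AddModule(ConfigTool):
    """Register a module and append it to the processing sequence."""
    def __init__(self, name, **params):
        self.name, self.params = name, params

    def __call__(self, process):
        process["modules"][self.name] = dict(self.params)
        process["sequence"].append(self.name)

# a GUI can list, configure and apply tools without knowing their internals
process = {"modules": {}, "sequence": []}
for tool in (AddModule("jetSelector", ptMin=25.0),
             AddModule("muonFilter", etaMax=2.4)):
    tool(process)
print(process["sequence"])  # ['jetSelector', 'muonFilter']
```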

  7. Army Sustainability Modelling Analysis and Reporting Tool Phase 1: User Manual and Results Interpretation Guide

    DTIC Science & Technology

    2009-11-01

    force structure liability analysis tool, designed to forecast the dynamics of personnel and equipment populations over time for a particular scenario...it is intended that it will support analysis of the sustainability of planned Army force structures against a range of possible scenarios, as well as...the force options testing process. A-SMART Phase 1 has been limited to the development of personnel, major equipment and supplies/strategic lift

  8. Multidisciplinary Analysis of a Hypersonic Engine

    NASA Technical Reports Server (NTRS)

    Suresh, Ambady; Stewart, Mark

    2003-01-01

    The objective is to develop high-fidelity tools that can influence the ISTAR design, in particular tools for coupling fluid-thermal-structural simulations. RBCC/TBCC designers carefully balance aerodynamic, thermal, weight, and structural considerations; consistent multidisciplinary solutions reveal details at modest cost. At the scram-mode design point, simulations give details of inlet and combustor performance, thermal loads, and structural deflections.

  9. DWARF – a data warehouse system for analyzing protein families

    PubMed Central

    Fischer, Markus; Thai, Quan K; Grieb, Melanie; Pleiss, Jürgen

    2006-01-01

    Background The emerging field of integrative bioinformatics provides the tools to organize and systematically analyze vast amounts of highly diverse biological data and thus makes it possible to gain a novel understanding of complex biological systems. The data warehouse DWARF applies integrative bioinformatics approaches to the analysis of large protein families. Description The data warehouse system DWARF integrates data on sequence, structure, and functional annotation for protein fold families. The underlying relational data model consists of three major sections representing entities related to the protein (biochemical function, source organism, classification to homologous families and superfamilies), the protein sequence (position-specific annotation, mutant information), and the protein structure (secondary structure information, superimposed tertiary structure). Tools for extracting, transforming and loading data from publicly available resources (ExPDB, GenBank, DSSP) are provided to populate the database. The data can be accessed by an interface for searching and browsing, and by analysis tools that operate on annotation, sequence, or structure. We applied DWARF to the family of α/β-hydrolases to host the Lipase Engineering Database. Release 2.3 contains 6138 sequences and 167 experimentally determined protein structures, which are assigned to 37 superfamilies and 103 homologous families. Conclusion DWARF has been designed for constructing databases of large structurally related protein families and for evaluating their sequence-structure-function relationships by a systematic analysis of sequence, structure and functional annotation. It has been applied to predict biochemical properties from sequence, and serves as a valuable tool for protein engineering. PMID:17094801

  10. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping and large space structure model reduction.

  11. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
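
    The executive pattern described above, a central module that invokes disparate disciplinary codes as external commands and gathers their performance indices, can be sketched in Python. This is an illustrative sketch only, not the actual FORTRAN CEM; the discipline names and commands below are hypothetical stand-ins for real structural and flutter solvers.

    ```python
    import subprocess

    def run_discipline(command):
        """Run an external analysis code and return its stdout as a float."""
        result = subprocess.run(command, shell=True, capture_output=True,
                                text=True, check=True)
        return float(result.stdout.strip())

    def central_executive(disciplines):
        """Evaluate every registered discipline and collect its performance index."""
        return {name: run_discipline(cmd) for name, cmd in disciplines.items()}

    # Hypothetical stand-ins for real solver invocations:
    indices = central_executive({
        "weight": "echo 1520.0",
        "flutter_speed": "echo 910.5",
    })
    print(indices)
    ```

    An optimizer could then treat `central_executive` as a black-box evaluation of objective and constraint functions, which is the role the CEM plays in the O3 tool.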

  12. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  13. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784
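
    The "dynamic atom selection language" mentioned above can be illustrated with a toy filter. This sketch is not LOOS's actual grammar or API; it only conveys the general idea of selecting atoms with a C-style boolean expression over per-atom fields.

    ```python
    # Toy illustration only: each atom is a dict of fields, and a selection
    # is a boolean expression evaluated with those fields in scope.
    atoms = [
        {"name": "CA", "resid": 1}, {"name": "CB", "resid": 1},
        {"name": "CA", "resid": 2}, {"name": "OW", "resid": 3},
    ]

    def select(atoms, expr):
        """Return the atoms for which the expression evaluates to True."""
        # eval() keeps the toy short; a real selection language would parse
        # the expression into an AST rather than trust arbitrary input.
        return [a for a in atoms if eval(expr, {}, a)]

    calphas = select(atoms, 'name == "CA" and resid <= 2')
    print(len(calphas))
    ```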

  14. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  15. IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  16. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  17. WONKA: objective novel complex analysis for ensembles of protein-ligand structures.

    PubMed

    Bradley, A R; Wall, I D; von Delft, F; Green, D V S; Deane, C M; Marsden, B D

    2015-10-01

    WONKA is a tool for the systematic analysis of an ensemble of protein-ligand structures. It makes the identification of conserved and unusual features within such an ensemble straightforward. WONKA uses an intuitive workflow to process structural co-ordinates. Ligand and protein features are summarised and then presented within an interactive web application. WONKA's power in consolidating and summarising large amounts of data is described through the analysis of three bromodomain datasets. Furthermore, and in contrast to many current methods, WONKA relates analysis to individual ligands, from which we find unusual and erroneous binding modes. Finally the use of WONKA as an annotation tool to share observations about structures is demonstrated. WONKA is freely available to download and install locally or can be used online at http://wonka.sgc.ox.ac.uk.

  18. Control of flexible structures

    NASA Technical Reports Server (NTRS)

    Russell, R. A.

    1985-01-01

    The requirements for future space missions indicate that many of these spacecraft will be large, flexible, and in some applications, require precision geometries. A technology program that addresses the issues associated with the structure/control interactions for these classes of spacecraft is discussed. The goal of the NASA control of flexible structures technology program is to generate a technology data base that will provide the designer with options and approaches to achieve spacecraft performance such as maintaining geometry and/or suppressing undesired spacecraft dynamics. This technology program will define the appropriate combination of analysis, ground testing, and flight testing required to validate the structural/controls analysis and design tools. This work was motivated by a recognition that large minimum weight space structures will be required for many future missions. The tools necessary to support such design included: (1) improved structural analysis; (2) modern control theory; (3) advanced modeling techniques; (4) system identification; and (5) the integration of structures and controls.

  19. A Confirmatory Factor Analysis of the Student Evidence-Based Practice Questionnaire (S-EBPQ) in an Australian sample.

    PubMed

    Beccaria, Lisa; Beccaria, Gavin; McCosker, Catherine

    2018-03-01

    It is crucial that nursing students develop skills and confidence in using Evidence-Based Practice principles early in their education. This should be assessed with valid tools; however, to date, few measures have been developed and applied to the student population. The aim was to examine the structural validity of the Student Evidence-Based Practice Questionnaire (S-EBPQ) with an Australian online nursing student cohort. A cross-sectional study of construct validity was conducted. Three hundred and forty-five undergraduate nursing students from an Australian regional university were recruited across two semesters. Confirmatory Factor Analysis was used to examine the structural validity and resulted in a good-fitting model, based on a revised 20-item tool. The S-EBPQ remains a psychometrically robust measure of evidence-based practice use, attitudes, and knowledge and skills, and can be applied in an online Australian student context. The findings of this study provide further evidence of the reliability and four-factor structure of the S-EBPQ. Further refinement of the tool may improve its structural validity. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing no damage or only minimal damage, and high-speed impact causing severe laminate damage and possible penetration of the structure, were both considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The tools were assessed with regard to accuracy, modeling capability, and damage modeling, as well as robustness, efficiency, and usability in a wing design environment. Following the qualitative assessment, selected quantitative evaluations were performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  1. The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures

    ERIC Educational Resources Information Center

    O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.

    2007-01-01

    The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…

  2. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  3. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos

    Application analysis is facilitated through a number of program profiling tools. The tools vary in their complexity, ease of deployment, design, and profiling detail. Specifically, understanding, analyzing, and optimizing are of particular importance for scientific applications, where minor changes in code paths and data-structure layout can have profound effects. Understanding how intricate data-structures are accessed and how a given memory system responds is a complex task. In this paper we describe a trace profiling tool, Glprof, specifically aimed at lessening the burden on the programmer to pin-point heavily involved data-structures during an application's run-time and to understand data-structure run-time usage. Moreover, we showcase the tool's modularity using additional cache simulation components. We elaborate on the tool's design and features. Finally we demonstrate the application of our tool in the context of Spec benchmarks using the Glprof profiler and two concurrently running cache simulators, PPC440 and AMD Interlagos.

  5. RNA-SSPT: RNA Secondary Structure Prediction Tools.

    PubMed

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; Din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult; computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction from a single RNA sequence is challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study requires only the energetically most favorable secondary structure, and a modification of the algorithm is also available that produces base pairs to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, NAVIEW in C language is used, modified in C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool that serves both secondary structure prediction and secondary structure visualization purposes.
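
    The Nussinov recursion the abstract refers to maximizes the number of complementary base pairs in a pseudoknot-free structure. A minimal sketch follows; it counts base pairs only (no free-energy model, as in the basic algorithm) and assumes hairpin loops of at least three unpaired bases.

    ```python
    def can_pair(a, b):
        """Watson-Crick pairs plus the G-U wobble pair."""
        return {a, b} in ({"A", "U"}, {"G", "C"}, {"G", "U"})

    def nussinov(seq, min_loop=3):
        """Return the maximum number of nested base pairs in seq."""
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        # Fill by increasing subsequence length; dp[i][j] = max pairs in seq[i..j].
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]  # case: j left unpaired
                for k in range(i, j - min_loop):
                    if can_pair(seq[k], seq[j]):  # case: pair k with j
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + dp[k + 1][j - 1] + 1)
                dp[i][j] = best
        return dp[0][n - 1]

    print(nussinov("GGGAAAUCC"))  # a small hairpin: 3 pairs
    ```

    Recovering the actual structure (rather than just the pair count) requires a standard traceback over the same table.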

  6. RNA-SSPT: RNA Secondary Structure Prediction Tools

    PubMed Central

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult; computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction from a single RNA sequence is challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study requires only the energetically most favorable secondary structure, and a modification of the algorithm is also available that produces base pairs to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, NAVIEW in C language is used, modified in C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool that serves both secondary structure prediction and secondary structure visualization purposes. PMID:24250115

  7. Stress analysis and design considerations for Shuttle pointed autonomous research tool for astronomy /SPARTAN/

    NASA Technical Reports Server (NTRS)

    Ferragut, N. J.

    1982-01-01

    The Shuttle Pointed Autonomous Research Tool for Astronomy (SPARTAN) family of spacecraft is intended to operate with minimal interfaces to the U.S. Space Shuttle in order to increase flight opportunities. The SPARTAN I spacecraft was designed to enhance structural capability and increase reliability. The approach followed results from work experience that evolved from sounding rocket projects. Structural models were developed to perform the analyses necessary to satisfy safety requirements for Shuttle hardware; a loads analysis must also be performed, and stress analysis calculations are carried out on the main structural elements and subcomponents. Attention is given to design considerations and program definition, the schematic representation of a finite element model used for the SPARTAN I spacecraft, details of the loads analysis, the stress analysis, and implications of the fracture mechanics plan.

  8. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  9. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  10. Research resource: Update and extension of a glycoprotein hormone receptors web application.

    PubMed

    Kreuchwig, Annika; Kleinau, Gunnar; Kreuchwig, Franziska; Worth, Catherine L; Krause, Gerd

    2011-04-01

    The SSFA-GPHR (Sequence-Structure-Function-Analysis of Glycoprotein Hormone Receptors) database provides a comprehensive set of mutation data for the glycoprotein hormone receptors (covering the lutropin, the FSH, and the TSH receptors). Moreover, it provides a platform for comparison and investigation of these homologous receptors and helps in understanding protein malfunctions associated with several diseases. Besides extending the data set (> 1100 mutations), the database has been completely redesigned and several novel features and analysis tools have been added to the web site. These tools allow the focused extraction of semiquantitative mutant data from the GPHR subtypes and different experimental approaches. Functional and structural data of the GPHRs are now linked interactively at the web interface, and new tools for data visualization (on three-dimensional protein structures) are provided. The interpretation of functional findings is supported by receptor morphings simulating intramolecular changes during the activation process, which thus help to trace the potential function of each amino acid and provide clues to the local structural environment, including potentially relocated spatial counterpart residues. Furthermore, double and triple mutations are newly included to allow the analysis of their functional effects related to their spatial interrelationship in structures or homology models. A new important feature is the search option and data visualization by interactive and user-defined snake-plots. These new tools allow fast and easy searches for specific functional data and thereby give deeper insights into the mechanisms of hormone binding, signal transduction, and signaling regulation. The web application "Sequence-Structure-Function-Analysis of GPHRs" is accessible on the internet at http://www.ssfa-gphr.de/.

  11. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification and real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  12. Eulerian frequency analysis of structural vibrations from high-speed video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venanzoni, Andrea (Siemens Industry Software NV, Interleuvenlaan 68, B-3001 Leuven); De Ryck, Laurent

    An approach for the analysis of the frequency content of structural vibrations from high-speed video recordings is proposed. The techniques and tools proposed rely on an Eulerian approach, that is, using the time history of pixels independently to analyse structural motion, as opposed to Lagrangian approaches, where the motion of the structure is tracked in time. The starting point is an existing Eulerian motion magnification method, which consists in decomposing the video frames into a set of spatial scales through a so-called Laplacian pyramid [1]. Each scale, or level, can be amplified independently to reconstruct a magnified motion of the observed structure. The approach proposed here provides two analysis tools, or pre-amplification steps. The first tool provides a representation of the global frequency content of a video per pyramid level. This may be further enhanced by applying an angular filter in the spatial frequency domain to each frame of the video before the Laplacian pyramid decomposition, which allows for the identification of the frequency content of the structural vibrations in a particular direction of space. This proposed tool complements the existing Eulerian magnification method by selectively amplifying the levels containing relevant motion information with respect to their frequency content, which magnifies the displacement while limiting the noise contribution. The second tool is a holographic representation of the frequency content of a vibrating structure, yielding a map of the predominant frequency components across the structure. In contrast to the global frequency content representation of the video, this tool provides a local analysis of the periodic gray-scale intensity changes of each frame in order to identify the vibrating parts of the structure and their main frequencies. Validation cases are provided and the advantages and limits of the approaches are discussed. The first validation case consists of the frequency content retrieval of the tip of a shaker, excited at selected fixed frequencies. The goal of this setup is to retrieve the frequencies at which the tip is excited. The second validation case consists of two thin metal beams connected to a randomly excited bar. It is shown that the holographic representation visually highlights the predominant frequency content of each pixel and locates the global frequencies of the motion, thus retrieving the natural frequencies of each beam.
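
    The core Eulerian idea, treating each pixel's gray-level time history independently and reading off its dominant temporal frequency, can be sketched with a per-pixel FFT. The tiny synthetic "video" below is a stand-in for real high-speed footage, and the sketch omits the Laplacian pyramid and angular filtering steps of the full method.

    ```python
    import numpy as np

    fps = 60.0
    t = np.arange(120) / fps  # 120 frames at 60 fps (two seconds)

    # Synthetic 4x4 video: the left half flickers at 5 Hz, the right at 12 Hz.
    video = np.zeros((120, 4, 4))
    video[:, :, :2] = np.sin(2 * np.pi * 5.0 * t)[:, None, None]
    video[:, :, 2:] = np.sin(2 * np.pi * 12.0 * t)[:, None, None]

    # Per-pixel amplitude spectrum along the time axis (no motion tracking).
    spectrum = np.abs(np.fft.rfft(video, axis=0))
    freqs = np.fft.rfftfreq(video.shape[0], d=1.0 / fps)

    # Map of the predominant frequency at every pixel, akin to the
    # "holographic representation" described above.
    dominant = freqs[spectrum.argmax(axis=0)]
    print(dominant[0, 0], dominant[0, 3])  # left pixel vs. right pixel
    ```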

  13. Multi-body Dynamic Contact Analysis Tool for Transmission Design

    DTIC Science & Technology

    2003-04-01

    frequencies were computed in COSMIC NASTRAN, and were validated against the published experimental modal analysis [17]. • Using assumed time domain... modal superposition. • Results from the structural analysis (mode shapes or forced response) were converted into IDEAS universal format (dataset 55...ARMY RESEARCH LABORATORY Multi-body Dynamic Contact Analysis Tool for Transmission Design SBIR Phase II Final Report by

  14. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined to offer an advanced tool for assessment of realistic behaviour, failure, and safety of transport structures. The utilized approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. An inverse analysis approach using artificial neural networks and virtual stochastic simulations is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability, and sustainability have been investigated on different types of transport structures made from various materials using the above-mentioned methodology and tools.
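The randomization idea, material parameters as random variables fed through a structural model to estimate a failure probability, can be sketched as follows. This is a hedged Python illustration: the lognormal strength, normal load, and the trivial "response surface" are invented stand-ins for the paper's nonlinear finite element model:

```python
import numpy as np
from statistics import NormalDist

# Monte Carlo sketch of randomized structural analysis (all numbers assumed).
rng = np.random.default_rng(0)
n = 200_000
fc = rng.lognormal(mean=np.log(38.0), sigma=0.12, size=n)  # strength [MPa]
load = rng.normal(25.0, 4.0, size=n)                       # load effect [MPa]

resistance = 0.85 * fc                  # placeholder structural response
pf = np.mean(resistance <= load)        # estimated failure probability
beta = -NormalDist().inv_cdf(pf)        # corresponding reliability index
print(pf, beta)
```

In the actual methodology each sample would trigger a full nonlinear FE run, and correlated random fields would replace the independent variables used here.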

  15. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  16. Direct Duplex Detection: An Emerging Tool in the RNA Structure Analysis Toolbox.

    PubMed

    Weidmann, Chase A; Mustoe, Anthony M; Weeks, Kevin M

    2016-09-01

    While a variety of powerful tools exists for analyzing RNA structure, identifying long-range and intermolecular base-pairing interactions has remained challenging. Recently, three groups introduced a high-throughput strategy that uses psoralen-mediated crosslinking to directly identify RNA-RNA duplexes in cells. Initial application of these methods highlights the preponderance of long-range structures within and between RNA molecules and their widespread structural dynamics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  18. Multi-Body Dynamic Contact Analysis Tool for Transmission Design SBIR Phase II Final Report

    DTIC Science & Technology

    2003-04-01

    shapes and natural frequencies were computed in COSMIC NASTRAN, and were validated against the published experimental modal analysis [17]. • Using...COSMIC NASTRAN via modal superposition. • Results from the structural analysis (mode shapes or forced response) were converted into IDEAS universal...ARMY RESEARCH LABORATORY Multi-body Dynamic Contact Analysis Tool for Transmission Design SBIR Phase II Final Report by

  19. Mapping healthcare systems: a policy relevant analytic tool

    PubMed Central

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L.V.

    2017-01-01

    Abstract Background In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Methods Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and the amount spent through each source, purchasers, populations covered, provider categories, and the relationships between these entities. Results We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. Conclusions As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. PMID:28541518

  20. p3d--Python module for structural bioinformatics.

    PubMed

    Fufezan, Christian; Specht, Michael

    2009-08-21

    High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this, the Python scripting language is an optimal choice, since its philosophy encourages understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that combine (a) and (b) and use human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is the perfect tool to quickly develop tools for structural bioinformatics using the Python scripting language.
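The speed-up from spatial partitioning can be illustrated with a generic sketch. The k-d tree below (a binary space partition over point coordinates) answers "all atoms within r of a query point" without scanning every atom; the atom records and function names are invented for illustration and do not reflect p3d's actual classes or query language:

```python
import math

# Build a k-d tree over 3-D points (a simple binary space partition).
def build(points, depth=0):
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid], "axis": axis,
            "left": build(points[:mid], depth + 1),
            "right": build(points[mid + 1:], depth + 1)}

# Collect all points within radius r of query q, pruning far branches.
def within(node, q, r, out):
    if node is None:
        return
    p, axis = node["point"], node["axis"]
    if math.dist(p, q) <= r:
        out.append(p)
    d = q[axis] - p[axis]
    near, far = (node["left"], node["right"]) if d <= 0 else (node["right"], node["left"])
    within(near, q, r, out)
    if abs(d) <= r:                 # search sphere crosses the split plane
        within(far, q, r, out)

atoms = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (5.0, 5.0, 5.0)]
tree = build(atoms)
hits = []
within(tree, (0.5, 0.5, 0.5), 2.0, hits)
print(sorted(hits))   # the two nearby atoms; the distant one is pruned
```

Combining such spatial queries with set operations (union, intersection of atom selections) is the essence of the design the abstract describes.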

  1. Changes among Israeli Youth Movements: A Structural Analysis Based on Kahane's Code of Informality

    ERIC Educational Resources Information Center

    Cohen, Erik H.

    2015-01-01

    Multi-dimensional data analysis tools are applied to Reuven Kahane's data on the informality of youth organizations, yielding a graphic portrayal of Kahane's code of informality. This structure helps address questions of whether the eight structural components exhaustively cover the field without redundancy. Further, the structure is used to…

  2. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
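The two methods named in the abstract, the DerSimonian-Laird estimator with the Knapp-Hartung adjustment, can be sketched in a few lines. This is a hedged Python illustration with invented effect sizes and variances, not data from the paper:

```python
import math

# Random-effects meta-analysis: DerSimonian-Laird tau^2 plus the
# Knapp-Hartung standard-error adjustment. All study data are assumed.
yi = [0.30, 0.45, 0.12, 0.60, 0.25]      # study effect sizes (invented)
vi = [0.04, 0.09, 0.05, 0.16, 0.06]      # within-study variances (invented)
k = len(yi)

w = [1 / v for v in vi]                  # fixed-effect weights
mu_fe = sum(wi * y for wi, y in zip(w, yi)) / sum(w)
Q = sum(wi * (y - mu_fe) ** 2 for wi, y in zip(w, yi))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)       # DerSimonian-Laird estimate

ws = [1 / (v + tau2) for v in vi]        # random-effects weights
mu = sum(wi * y for wi, y in zip(ws, yi)) / sum(ws)

# Knapp-Hartung: a data-scaled standard error used with a t quantile
se_hk = math.sqrt(sum(wi * (y - mu) ** 2 for wi, y in zip(ws, yi))
                  / ((k - 1) * sum(ws)))
t_crit = 2.776                           # t(0.975, df = k - 1 = 4)
print(mu, mu - t_crit * se_hk, mu + t_crit * se_hk)
```

Meta-Essentials wraps this machinery in a spreadsheet; the point of the sketch is only to show what the abstract's "automatically based on" means.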

  3. Study on Web-Based Tool for Regional Agriculture Industry Structure Optimization Using Ajax

    NASA Astrophysics Data System (ADS)

    Huang, Xiaodong; Zhu, Yeping

    Given the state of research on regional agriculture industry structure adjustment information systems and current developments in information technology, this paper takes a web-based regional agriculture industry structure optimization tool as its research target. It introduces Ajax technology and related application frameworks to build an auxiliary toolkit of a decision support system for agricultural policy makers and economic researchers. The toolkit includes a "one page" style component for regional agriculture industry structure optimization, which provides an agile argument-setting method enabling sensitivity analysis and the use of data and comparative advantage analysis results, and a component that solves the linear programming model and its dual problem by the simplex method.
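The simplex/dual component described can be sketched with a minimal tableau implementation. The crop-planning numbers below are invented; the sketch solves "maximize c·x subject to Ax ≤ b, x ≥ 0" (with b ≥ 0) and reads the dual solution off the slack columns of the objective row, which is how the primal and its dual problem are linked:

```python
# Minimal tableau simplex for: maximize c.x s.t. A x <= b, x >= 0, b >= 0.
def simplex(c, A, b):
    m, n = len(A), len(c)
    T = [row[:] + [1.0 if i == j else 0.0 for j in range(m)] + [b[i]]
         for i, row in enumerate(A)]
    T.append([-cj for cj in c] + [0.0] * (m + 1))    # objective row
    basis = list(range(n, n + m))
    while True:
        col = min(range(n + m), key=lambda j: T[-1][j])  # entering variable
        if T[-1][col] >= -1e-9:
            break                                        # optimal
        ratios = [(T[i][-1] / T[i][col], i) for i in range(m) if T[i][col] > 1e-9]
        _, row = min(ratios)                             # leaving variable
        piv = T[row][col]
        T[row] = [v / piv for v in T[row]]
        for i in range(m + 1):
            if i != row and T[i][col] != 0.0:
                f = T[i][col]
                T[i] = [a - f * p for a, p in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = T[i][-1]
    return x, T[-1][-1], T[-1][n:n + m]   # solution, optimum, dual prices

# Invented example: maximize income 3*x1 + 5*x2 (two crops) subject to
# land x1 + x2 <= 100 ha and labor 2*x1 + 4*x2 <= 300.
x, opt, duals = simplex([3.0, 5.0], [[1.0, 1.0], [2.0, 4.0]], [100.0, 300.0])
print(x, opt, duals)   # optimal plan, income, and the shadow prices
```

The dual (shadow) prices tell the policy maker the marginal value of one more hectare or labor unit, which is exactly the kind of comparative-advantage output the abstract mentions.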

  4. GAP Final Technical Report 12-14-04

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrew J. Bordner, PhD, Senior Research Scientist

    2004-12-14

    The Genomics Annotation Platform (GAP) was designed to develop new tools for high-throughput functional annotation and characterization of protein sequences and structures resulting from genomics and structural proteomics, and to benchmark and apply those tools. Furthermore, this platform integrated the genomic-scale sequence and structural analysis and prediction tools with the advanced structure prediction and bioinformatics environment of ICM. The development of GAP was primarily oriented towards the annotation of new biomolecular structures using both structural and sequence data. Even though the amount of protein X-ray crystal data is growing exponentially, the volume of sequence data is growing even more rapidly. This trend was exploited by leveraging the wealth of sequence data to provide functional annotation for protein structures. The additional information provided by GAP is expected to assist the majority of the commercial users of ICM, who are involved in drug discovery, in identifying promising drug targets as well as in devising strategies for the rational design of therapeutics directed at the protein of interest. GAP also provided valuable tools for biochemistry education and structural genomics centers. In addition, GAP incorporates many novel prediction and analysis methods not available in other molecular modeling packages. This development led to signing the first Molsoft agreement in the structural genomics annotation area, with the University of Oxford Structural Genomics Center. This commercial agreement validated the Molsoft efforts under the GAP project and provided the basis for further development of the large-scale functional annotation platform.

  5. XLinkDB 2.0: integrated, large-scale structural analysis of protein crosslinking data

    PubMed Central

    Schweppe, Devin K.; Zheng, Chunxiang; Chavez, Juan D.; Navare, Arti T.; Wu, Xia; Eng, Jimmy K.; Bruce, James E.

    2016-01-01

    Motivation: Large-scale chemical cross-linking with mass spectrometry (XL-MS) analyses are quickly becoming a powerful means for high-throughput determination of protein structural information and protein–protein interactions. Recent studies have garnered thousands of cross-linked interactions, yet the field lacks an effective tool to compile experimental data or access the network and structural knowledge for these large scale analyses. We present XLinkDB 2.0 which integrates tools for network analysis, Protein Databank queries, modeling of predicted protein structures and modeling of docked protein structures. The novel, integrated approach of XLinkDB 2.0 enables the holistic analysis of XL-MS protein interaction data without limitation to the cross-linker or analytical system used for the analysis. Availability and Implementation: XLinkDB 2.0 can be found here, including documentation and help: http://xlinkdb.gs.washington.edu/. Contact: jimbruce@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153666

  6. Development of Multidisciplinary, Multifidelity Analysis, Integration, and Optimization of Aerospace Vehicles

    DTIC Science & Technology

    2010-02-27

    investigated in more detail. The intermediate level of fidelity, though more expensive, is then used to refine the analysis, add geometric detail, and...design stage is used to further refine the analysis, narrowing the design to a handful of options. Figure 1. Integrated Hierarchical Framework. In...computational structural and computational fluid modeling. For the structural analysis tool we used McIntosh Structural Dynamics' finite element code CNEVAL

  7. Final Report for Geometric Analysis for Data Reduction and Structure Discovery DE-FG02-10ER25983, STRIPES award # DE-SC0004096

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vixie, Kevin R.

    This is the final report for the project "Geometric Analysis for Data Reduction and Structure Discovery," in which insights and tools from geometric analysis were developed and exploited for their potential to address large-scale data challenges.

  8. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating the access to knowledge and hence speed up the implementation in product lines.

  9. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report. Version 1.0

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie; Labbe, Steve G.; Rotter, Hank A.

    2005-01-01

    In the spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis, and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments, and real-time on-orbit assessments. The tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  10. Structural Model Tuning Capability in an Object-Oriented Multidisciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2008-01-01

    Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data while the mass matrix is orthogonalized.
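The model-updating-as-optimization idea can be sketched on a toy system. In this hedged Python illustration, the stiffnesses of a two-DOF spring-mass chain (unit masses) are tuned until its natural frequencies match target values; the simple coordinate search stands in for the MDAO optimizer, and all numbers are invented:

```python
import numpy as np

target = np.array([1.0, 2.5])               # target frequencies [rad/s]

def freqs(k1, k2):
    K = np.array([[k1 + k2, -k2], [-k2, k2]])          # chain stiffness
    return np.sqrt(np.clip(np.linalg.eigvalsh(K), 0.0, None))

def objective(p):                            # squared frequency mismatch
    return float(np.sum((freqs(*p) - target) ** 2))

p = np.array([1.0, 1.0])                    # initial stiffness guess
step = 0.5
while step > 1e-6:                          # coordinate (compass) search
    improved = False
    for i in range(2):
        for s in (step, -step):
            trial = p.copy()
            trial[i] += s
            if trial[i] > 0 and objective(trial) < objective(p):
                p, improved = trial, True
    if not improved:
        step /= 2

print(p, freqs(*p))   # tuned stiffnesses; frequencies approach the targets
```

A real updating problem adds mass properties, mode-shape correlation, and an orthogonality constraint on the mass matrix, as the abstract describes.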

  12. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are presented for the evaluation of different aerodynamic tools, including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel were used as a reference data set to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite-element-based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  13. Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.

    2005-01-01

    This document is the final report for the project entitled "Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components," funded under the NRA entitled "Cross-Enterprise Technology Development Program" issued by the NASA Office of Space Science in 2000. The project was funded in 2001 and spanned the four-year period from March 2001 to February 2005. Through enhancements to and synthesis of unique, state-of-the-art structural mechanics and micromechanics analysis software, a new multi-scale tool has been developed that enables design, analysis, and sizing of advanced lightweight composite and smart materials and structures from the full vehicle, to the stiffened structure, to the micro (fiber and matrix) scales. The new software tool has broad, cross-cutting value to current and future NASA missions that will rely on advanced composite and smart materials and structures.

  14. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  15. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.
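The forward-mode automatic differentiation idea behind tools like ADIFOR can be illustrated with dual numbers: propagating (value, derivative) pairs through the computation yields exact sensitivities, unlike finite differences. The "stress" function below is an invented toy, not the paper's finite element code:

```python
# Minimal dual-number forward-mode AD sketch (illustrative only).
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val / o.val,
                    (self.der * o.val - self.val * o.der) / o.val ** 2)

def stress(area):            # toy structural response: sigma = P / A
    P = 1000.0               # applied load [N] (assumed)
    return Dual(P) / area

a = Dual(2.0, 1.0)           # seed derivative d(area)/d(area) = 1
s = stress(a)
print(s.val, s.der)          # sigma = 500.0 and d(sigma)/d(area) = -250.0
```

Source-transformation tools like ADIFOR achieve the same effect by generating derivative code for whole Fortran programs rather than overloading operators.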

  16. Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft

    NASA Astrophysics Data System (ADS)

    Boozer, Charles Maxwell

    A multidisciplinary shape optimization tool coupling aerodynamics, structures, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled under the multidisciplinary feasible optimization architecture, aircraft geometry is modified to optimize the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.

  17. Web-based visualisation and analysis of 3D electron-microscopy data from EMDB and PDB.

    PubMed

    Lagerstedt, Ingvar; Moore, William J; Patwardhan, Ardan; Sanz-García, Eduardo; Best, Christoph; Swedlow, Jason R; Kleywegt, Gerard J

    2013-11-01

    The Protein Data Bank in Europe (PDBe) has developed web-based tools for the visualisation and analysis of 3D electron microscopy (3DEM) structures in the Electron Microscopy Data Bank (EMDB) and Protein Data Bank (PDB). The tools include: (1) a volume viewer for 3D visualisation of maps, tomograms and models, (2) a slice viewer for inspecting 2D slices of tomographic reconstructions, and (3) visual analysis pages to facilitate analysis and validation of maps, tomograms and models. These tools were designed to help non-experts and experts alike to get some insight into the content and assess the quality of 3DEM structures in EMDB and PDB without the need to install specialised software or to download large amounts of data from these archives. The technical challenges encountered in developing these tools, as well as the more general considerations when making archived data available to the user community through a web interface, are discussed. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
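The NURBS building block that isogeometric analysis rests on is the B-spline basis, evaluated with the Cox-de Boor recursion. The following hedged Python sketch uses an illustrative open knot vector and degree; a real isogeometric code additionally carries weights (for rational splines) and control points:

```python
# Cox-de Boor recursion for B-spline basis function N_{i,p}(u).
def bspline_basis(i, p, u, knots):
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (u - knots[i]) / d1 * bspline_basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = ((knots[i + p + 1] - u) / d2
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

knots = [0, 0, 0, 1, 2, 3, 3, 3]      # open knot vector, degree 2
vals = [bspline_basis(i, 2, 1.5, knots) for i in range(5)]
print(vals, sum(vals))                 # partition of unity: basis sums to 1
```

In isogeometric analysis these same basis functions describe both the arch geometry and the displacement field, which is what lets the method match complex masonry geometries exactly.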

  19. A coarse-grained model for DNA origami.

    PubMed

    Reshetnikov, Roman V; Stolyarova, Anastasia V; Zalevsky, Arthur O; Panteleev, Dmitry Y; Pavlova, Galina V; Klinov, Dmitry V; Golovin, Andrey V; Protopopova, Anna D

    2018-02-16

    Modeling tools provide a valuable support for DNA origami design. However, current solutions have limited application for conformational analysis of the designs. In this work we present a tool for a thorough study of DNA origami structure and dynamics. The tool is based on a novel coarse-grained model dedicated to geometry optimization and conformational analysis of DNA origami. We explored the ability of the model to predict dynamic behavior, global shapes, and fine details of two single-layer systems designed in hexagonal and square lattices using atomic force microscopy, Förster resonance energy transfer spectroscopy, and all-atom molecular dynamic simulations for validation of the results. We also examined the performance of the model for multilayer systems by simulation of DNA origami with published cryo-electron microscopy and atomic force microscopy structures. A good agreement between the simulated and experimental data makes the model suitable for conformational analysis of DNA origami objects. The tool is available at http://vsb.fbb.msu.ru/cosm as a web-service and as a standalone version.

  1. MOD Tool (Microwave Optics Design Tool)

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.

    1999-01-01

    The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. A large reengineering task is under way at JPL to automate and speed up the structural and thermal modeling disciplines, but it does not include MOD Tool. The focus of MOD Tool (and of this paper) is on the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer.
The MOD Tool client is being developed using Tcl/Tk, which allows the user to work on a choice of platforms (PC, Mac, or Unix) after downloading the Tcl/Tk binary, which is readily available on the web. The MOD Tool server is written using Expect, and it resides on a Sun workstation. Client/server communications are performed over a socket; upon a connection from a client, the server spawns a child process that is dedicated to communicating with that client. The server communicates with other machines, such as supercomputers, using Expect, with the username and password provided by the user on the client.
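The spawn-a-child-per-connection design described above can be sketched in Python as a stand-in for the actual Tcl/Tk client and Expect server (all message contents here are illustrative):

```python
import socket
import threading

def handle_client(conn):
    # Dedicated per-connection handler, mirroring the server's
    # spawn-a-child-per-client design; echoes requests with an "ack:" prefix.
    with conn:
        data = conn.recv(1024)
        if data:
            conn.sendall(b"ack: " + data)

def run_demo():
    # Server socket bound to an ephemeral loopback port.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen()
    port = srv.getsockname()[1]

    def accept_once():
        conn, _addr = srv.accept()
        threading.Thread(target=handle_client, args=(conn,)).start()

    threading.Thread(target=accept_once, daemon=True).start()

    # Client side: connect, send one request, collect the reply.
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(b"load design 42")
        cli.shutdown(socket.SHUT_WR)
        reply = b""
        while True:
            chunk = cli.recv(1024)
            if not chunk:
                break
            reply += chunk
    srv.close()
    return reply

if __name__ == "__main__":
    print(run_demo())
```

Threads stand in for the spawned child processes; the structure (one accept loop, one dedicated handler per client) is the same.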

  2. CORSS: Cylinder Optimization of Rings, Skin, and Stringers

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Rogers, P.; Otte, N.

    1994-01-01

Launch vehicle designs typically make extensive use of cylindrical skin stringer construction. Structural analysis methods are well developed for preliminary design of this type of construction. This report describes an automated, iterative method to obtain a minimum weight preliminary design. Structural optimization has been researched extensively, and various programs have been written for this purpose. Their complexity and ease of use depend on their generality, the failure modes considered, the methodology used, and the rigor of the analysis performed. This computer program employs closed-form solutions from a variety of well-known structural analysis references and joins them with a commercially available numerical optimizer called the 'Design Optimization Tool' (DOT). Any ring and stringer stiffened shell structure of isotropic materials that has beam type loading can be analyzed. Plasticity effects are not included. The program performs a more limited analysis than programs such as PANDA, but it provides an easy and useful preliminary design tool for a large class of structures. This report briefly describes the optimization theory, outlines the development and use of the program, and describes the analysis techniques that are used. Examples of program input and output, as well as the listing of the analysis routines, are included.
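The closed-form-analysis-plus-optimizer idea can be illustrated in miniature. The sketch below sizes the skin of an unstiffened cylinder (a stand-in for the ring/stringer shell) against axial strength and the classical axial buckling estimate sigma_cr = 0.6*E*t/R, with no knockdown factor; bisection on thickness stands in for the DOT optimizer, and none of this reproduces the CORSS analysis routines:

```python
import math

def min_weight_skin(N, R, L, E, sigma_allow, rho):
    # Find the minimum skin thickness t satisfying strength and classical
    # axial buckling for an axially loaded cylinder, then report weight.
    def feasible(t):
        sigma = N / (2 * math.pi * R * t)          # axial membrane stress
        return sigma <= sigma_allow and sigma <= 0.6 * E * t / R

    lo, hi = 1e-6, 1.0                              # infeasible / feasible bracket
    for _ in range(60):                             # bisection on thickness
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            hi = mid
        else:
            lo = mid
    t = hi
    weight = rho * 2 * math.pi * R * L * t          # skin weight only
    return t, weight
```

For a buckling-critical case the active constraint gives t = sqrt(N / (1.2 * pi * E)), which the bisection recovers.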

  3. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.
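Secondary structure from chemical shift data, mentioned above, is commonly inferred with a chemical shift index. A minimal single-nucleus sketch follows; the random-coil values are approximate, and real CSI implementations build a consensus over several nuclei:

```python
# Approximate random-coil Calpha shifts (ppm) for a few residue types.
RANDOM_COIL_CA = {"A": 52.5, "G": 45.1, "L": 55.1, "K": 56.3, "V": 62.2}

def csi(sequence, ca_shifts, threshold=0.7):
    # Per-residue chemical shift index from Calpha secondary shifts:
    # +1 (helix-like) if the observed shift is more than `threshold` ppm
    # above random coil, -1 (strand-like) if more than `threshold` below,
    # 0 otherwise.
    out = []
    for aa, obs in zip(sequence, ca_shifts):
        d = obs - RANDOM_COIL_CA[aa]
        out.append(1 if d > threshold else (-1 if d < -threshold else 0))
    return out
```

Runs of consecutive +1 or -1 values would then be read as helix or strand segments.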

  4. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
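The shedding-versus-structural-mode comparison at the heart of this problem can be sketched with the Strouhal relation; St ≈ 0.2 for circular cylinders and the ±20% lock-in band are illustrative assumptions, and the actual tool's methodology is far more involved:

```python
def shedding_frequency(U, D, St=0.2):
    # Vortex shedding frequency from the Strouhal relation f = St * U / D,
    # with U the wind speed and D the cylinder diameter.
    return St * U / D

def lock_in_risk(U, D, f_n, band=0.2):
    # Flag potential lock-in when the shedding frequency falls within a
    # fractional band of a structural natural frequency f_n.
    f_s = shedding_frequency(U, D)
    return abs(f_s - f_n) <= band * f_n
```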

  5. Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).

  6. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
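The effective-modulus subroutine described above can be sketched with classical lamination theory. This is a generic CLT reduction, not the original program: build the in-plane A-matrix terms from transformed ply stiffnesses and take Ex = (A11 - A12^2/A22)/h, which holds for symmetric, balanced layups:

```python
import math

def q_matrix(E1, E2, G12, nu12):
    # Reduced stiffnesses of a unidirectional ply under plane stress.
    nu21 = nu12 * E2 / E1
    d = 1 - nu12 * nu21
    return (E1 / d, nu12 * E2 / d, E2 / d, G12)   # Q11, Q12, Q22, Q66

def qbar(Q, theta_deg):
    # Transformed reduced stiffnesses for a ply rotated by theta.
    Q11, Q12, Q22, Q66 = Q
    m = math.cos(math.radians(theta_deg))
    n = math.sin(math.radians(theta_deg))
    Qb11 = Q11*m**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*n**4
    Qb12 = (Q11 + Q22 - 4*Q66)*m**2*n**2 + Q12*(m**4 + n**4)
    Qb22 = Q11*n**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*m**4
    return Qb11, Qb12, Qb22

def effective_Ex(plies, t_ply, E1, E2, G12, nu12):
    # Effective in-plane modulus of a symmetric, balanced laminate.
    Q = q_matrix(E1, E2, G12, nu12)
    A11 = A12 = A22 = 0.0
    for th in plies:
        Qb11, Qb12, Qb22 = qbar(Q, th)
        A11 += Qb11 * t_ply
        A12 += Qb12 * t_ply
        A22 += Qb22 * t_ply
    h = t_ply * len(plies)
    return (A11 - A12**2 / A22) / h
```

For an all-0-degree layup the result reduces exactly to E1, a quick sanity check on the transformation.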

  7. Validation of structural analysis methods using the in-house liner cyclic rigs

    NASA Technical Reports Server (NTRS)

    Thompson, R. L.

    1982-01-01

Test conditions and variables to be considered in each of the test rigs and test configurations, and also used in the validation of the structural predictive theories and tools, include: thermal and mechanical load histories (simulating an engine mission cycle); different boundary conditions; specimens and components of different dimensions and geometries; different materials; various cooling schemes and cooling hole configurations; several advanced burner liner structural design concepts; and the simulation of hot streaks. Based on these test conditions and test variables, the test matrices for each rig and configuration can be established to verify the predictive tools over as wide a range of test conditions as possible using the simplest possible tests. A flow chart for the thermal/structural analysis of a burner liner, and how the analysis relates to the tests, is shown schematically. The chart shows that several nonlinear constitutive theories are to be evaluated.

  8. Mapping healthcare systems: a policy relevant analytic tool.

    PubMed

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

In the past decade, an international consensus on the value of well-functioning health systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and the amount spent through each source; purchasers; populations covered; provider categories; and the relationships between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  9. Molecular Dynamics Visualization (MDV): Stereoscopic 3D Display of Biomolecular Structure and Interactions Using the Unity Game Engine.

    PubMed

    Wiebrands, Michael; Malajczuk, Chris J; Woods, Andrew J; Rohl, Andrew L; Mancera, Ricardo L

    2018-06-21

    Molecular graphics systems are visualization tools which, upon integration into a 3D immersive environment, provide a unique virtual reality experience for research and teaching of biomolecular structure, function and interactions. We have developed a molecular structure and dynamics application, the Molecular Dynamics Visualization tool, that uses the Unity game engine combined with large scale, multi-user, stereoscopic visualization systems to deliver an immersive display experience, particularly with a large cylindrical projection display. The application is structured to separate the biomolecular modeling and visualization systems. The biomolecular model loading and analysis system was developed as a stand-alone C# library and provides the foundation for the custom visualization system built in Unity. All visual models displayed within the tool are generated using Unity-based procedural mesh building routines. A 3D user interface was built to allow seamless dynamic interaction with the model while being viewed in 3D space. Biomolecular structure analysis and display capabilities are exemplified with a range of complex systems involving cell membranes, protein folding and lipid droplets.
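Procedural mesh building of the kind the tool performs in Unity ultimately reduces to emitting vertex and triangle-index buffers. A language-neutral sketch of that idea (Python here rather than the tool's C#; the buffers correspond to what a Unity Mesh consumes):

```python
def grid_mesh(nx, ny):
    # Build a flat (nx x ny)-quad grid as a vertex list plus a flat
    # triangle-index list (two triangles per quad, consistent winding).
    verts = [(float(x), float(y), 0.0)
             for y in range(ny + 1) for x in range(nx + 1)]
    tris = []
    w = nx + 1                       # vertices per row
    for y in range(ny):
        for x in range(nx):
            i = y * w + x            # lower-left vertex of this quad
            tris += [i, i + w, i + 1,
                     i + 1, i + w, i + w + 1]
    return verts, tris
```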

  10. Structural-Vibration-Response Data Analysis

    NASA Technical Reports Server (NTRS)

    Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.

    1983-01-01

    Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
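The idea of extracting frequencies and dampings from free-decay records can be illustrated with the classical logarithmic-decrement method. This single-mode, noise-free sketch is far simpler than the program's simultaneous least-squares fit over noisy multiple-response records:

```python
import math

def identify_free_decay(signal, dt):
    # Estimate modal frequency and damping ratio from a free-decay record:
    # frequency from the spacing of successive positive peaks, damping from
    # the logarithmic decrement delta = ln(x_i / x_{i+1}).
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
             and signal[i] > 0]
    periods = [(peaks[i + 1] - peaks[i]) * dt for i in range(len(peaks) - 1)]
    T = sum(periods) / len(periods)
    deltas = [math.log(signal[peaks[i]] / signal[peaks[i + 1]])
              for i in range(len(peaks) - 1)]
    delta = sum(deltas) / len(deltas)
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    return 1.0 / T, zeta
```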

  11. An analysis of burn-off impact on the structure microporous of activated carbons formation

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Mirosław; Kopac, Türkan

    2017-12-01

    The paper presents the results on the application of the LBET numerical method as a tool for analysis of the microporous structure of activated carbons obtained from a bituminous coal. The LBET method was employed particularly to evaluate the impact of the burn-off on the obtained microporous structure parameters of activated carbons.

  12. Using Molecular Visualization to Explore Protein Structure and Function and Enhance Student Facility with Computational Tools

    ERIC Educational Resources Information Center

    Terrell, Cassidy R.; Listenberger, Laura L.

    2017-01-01

    Recognizing that undergraduate students can benefit from analysis of 3D protein structure and function, we have developed a multiweek, inquiry-based molecular visualization project for Biochemistry I students. This project uses a virtual model of cyclooxygenase-1 (COX-1) to guide students through multiple levels of protein structure analysis. The…

  13. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  14. Integrated piezoelectric actuators in deep drawing tools

    NASA Astrophysics Data System (ADS)

    Neugebauer, R.; Mainda, P.; Drossel, W.-G.; Kerschner, M.; Wolf, K.

    2011-04-01

The production of car body panels is subject to process fluctuations; a given panel may therefore turn out accurate or defective. To reduce the error rate, an intelligent deep drawing tool was developed at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in cooperation with Audi and Volkswagen. Mechatronic components in a closed-loop control are the main differentiating factor between an intelligent and a conventional deep drawing tool. In combination with sensors for process monitoring, the intelligent tool uses piezoelectric actuators to actuate the deep drawing process. By integrating sensors and actuators into the die, the forming tool becomes a smart structure; the interface between the sensors and actuators is realized with a closed-loop control. This research presents experimental results with the piezoelectric actuators, obtained on a production-oriented forming tool meeting all automotive requirements. The actuators used are monolithic multilayer actuators from a piezo injector system. To achieve the required force, the actuators are combined in a cluster, which is both redundant and economical. In addition to the detailed assembly structures, this research highlights extensive analysis with the intelligent deep drawing tool.

  15. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
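The kinematic (upper-bound) side of the analysis can be illustrated on a fixed-base portal frame, where the collapse load factor is the minimum over candidate mechanisms. This hand-calculation analogue of the paper's linear programming formulation uses the three classical mechanisms for a vertical load V at midspan and a horizontal load H at beam level:

```python
def collapse_load_factor(Mp, V, H, span, height):
    # Each entry equates external work to plastic hinge work for one
    # classical mechanism; the kinematic theorem picks the minimum.
    mechanisms = {
        "beam": 8 * Mp / (V * span),                       # 4 hinges, beam only
        "sway": 4 * Mp / (H * height),                     # 4 hinges, columns
        "combined": 6 * Mp / (V * span / 2 + H * height),  # 6 effective hinges
    }
    name = min(mechanisms, key=mechanisms.get)
    return name, mechanisms[name]
```

With Mp = 100, V = 50, H = 20, span = 8, height = 4, the beam mechanism governs at a load factor of 2.0.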

  16. Damage Simulation in Composite Materials: Why It Matters and What Is Happening Currently at NASA in This Area

    NASA Technical Reports Server (NTRS)

    McElroy, Mack; de Carvalho, Nelson; Estes, Ashley; Lin, Shih-yung

    2017-01-01

    Use of lightweight composite materials in space and aircraft structure designs is often challenging due to high costs associated with structural certification. Of primary concern in the use of composite structures is durability and damage tolerance. This concern is due to the inherent susceptibility of composite materials to both fabrication and service induced flaws. Due to a lack of general industry accepted analysis tools applicable to composites damage simulation, a certification procedure relies almost entirely on testing. It is this reliance on testing, especially compared to structures comprised of legacy metallic materials where damage simulation tools are available, that can drive costs for using composite materials in aerospace structures. The observation that use of composites can be expensive due to testing requirements is not new and as such, research on analysis tools for simulating damage in composite structures has been occurring for several decades. A convenient approach many researchers/model-developers in this area have taken is to select a specific problem relevant to aerospace structural certification and develop a model that is accurate within that scope. Some examples are open hole tension tests, compression after impact tests, low-velocity impact, damage tolerance of an embedded flaw, and fatigue crack growth to name a few. Based on the premise that running analyses is cheaper than running tests, one motivation that many researchers in this area have is that if generally applicable and reliable damage simulation tools were available the dependence on certification testing could be lessened thereby reducing overall design cost. It is generally accepted that simulation tools if applied in this manner would still need to be thoroughly validated and that composite testing will never be completely replaced by analysis. 
Research and development is currently occurring at NASA to create numerical damage simulation tools applicable to damage in composites. The Advanced Composites Project (ACP) at NASA Langley has supported the development of composites damage simulation tools in a consortium of aerospace companies with a goal of reducing the certification time of a commercial aircraft by 30%. And while the scope of ACP does not include spacecraft, much of the methodology and simulation capabilities can apply to spacecraft certification in the Space Launch System and Orion programs as well. Some specific applications of composite damage simulation models in a certification program are (1) evaluation of damage during service when maintenance may be difficult or impossible, (2) a tool for early design iterations, (3) gaining insight into a particular damage process and applying this insight towards a test coupon or structural design, and (4) analysis of damage scenarios that are difficult or impossible to recreate in a test. As analysis capabilities improve, these applications and more will become realized resulting in a reduction in cost for use of composites in aerospace vehicles. NASA is engaged in this process from both research and application perspectives. In addition to the background information discussed previously, this presentation covers a look at recent research at NASA in this area and some current/potential applications in the Orion program.

  17. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
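The estimator named in the abstract is straightforward to sketch: a DerSimonian-Laird between-study variance, then a Knapp-Hartung adjusted standard error for the pooled effect. This is a generic implementation of those formulas, not Meta-Essentials itself, and it omits the t-based confidence interval:

```python
import math

def dersimonian_laird_kh(effects, variances):
    # Random-effects pooled estimate (DerSimonian-Laird tau^2) with the
    # Knapp-Hartung adjusted standard error of the overall effect.
    k = len(effects)
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)               # between-study variance
    ws = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    se_kh = math.sqrt(sum(wi * (yi - mu) ** 2 for wi, yi in zip(ws, effects))
                      / ((k - 1) * sum(ws)))         # Knapp-Hartung variance
    return mu, tau2, se_kh
```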

  18. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool by which a complete and detailed analysis of a complex structural system can be performed. This software system will allow for automated interfaces with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. The system will include modifications to the four functions developed for ISAS and the development of 25 new functions. The new functions are described.

  19. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling, and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  20. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  1. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST's capabilities have been expanded to allow use in a large variety of problems, including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and mesoscopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible, allowing incorporation of specialized user-developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.
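The core step of segmenting many small ROIs can be sketched as connected-component labeling on a binary mask; MIST's actual segmentation algorithm is more robust, so this only conveys the generic idea:

```python
def label_rois(mask):
    # 4-connected component labeling of a binary image via flood fill.
    # Returns a label grid (0 = background) and per-ROI pixel counts.
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                lab = len(sizes) + 1
                stack, n = [(r, c)], 0
                labels[r][c] = lab
                while stack:
                    y, x = stack.pop()
                    n += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = lab
                            stack.append((ny, nx))
                sizes.append(n)
    return labels, sizes
```

The per-ROI sizes are exactly the kind of statistic (pore area, grain size) such a tool aggregates.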

  2. Geocoded data structures and their applications to Earth science investigations

    NASA Technical Reports Server (NTRS)

    Goldberg, M.

    1984-01-01

    A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
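A minimal example of a cellular geocoded structure that saves storage is run-length encoding of raster rows of class codes (illustrative only; the structures surveyed in the report are richer):

```python
def rle_encode(row):
    # Run-length encode one raster row of geocodes (class IDs): large
    # homogeneous map regions compress to a few (value, count) runs.
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [tuple(r) for r in runs]

def rle_decode(runs):
    # Expand (value, count) runs back into the original row.
    out = []
    for v, n in runs:
        out.extend([v] * n)
    return out
```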

  3. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers; adaptivity for model and solution strategies; control processes for concurrent, distributed computing for uncertainty assessments; and immersive technology. Traditional finite element codes still require fast direct solvers, which, when coupled with current CPU power, enable new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character, while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. 
Pitfalls to be avoided and trends for emerging analysis tools will be described.

  4. Dynaflow User’s Guide

    DTIC Science & Technology

    1988-11-01

...stress soil model will provide a tool for such analysis of waterfront structures. To understand the significance of liquefaction, it is important to note... Implementing this effective stress soil model into a finite element computer program would allow analysis of soil and structure together.

  5. Quantitative Analysis Of Three-dimensional Branching Systems From X-ray Computed Microtomography Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, Adriana L.; Varga, Tamas

Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a free tool that approximates the surface area and volume of branching structures. Our methodology of noninvasive imaging, segmentation, and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen were used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface area. This methodology of processing 3D data is applicable to other branching structures, even when the structure of interest is of similar X-ray attenuation to its environment and difficulties arise with sample segmentation.
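The voxel-counting idea behind such volume and surface estimates can be sketched directly; this is a voxel-level stand-in for the isosurface-based calculation described above, not the authors' pipeline:

```python
def voxel_volume_surface(grid, voxel_size=1.0):
    # Approximate volume and surface area of a segmented structure from a
    # 3D occupancy grid: volume from occupied-voxel counts, surface area
    # from voxel faces exposed to empty space or the grid boundary.
    nz, ny, nx = len(grid), len(grid[0]), len(grid[0][0])

    def occ(z, y, x):
        return 0 <= z < nz and 0 <= y < ny and 0 <= x < nx and grid[z][y][x]

    vol = faces = 0
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if grid[z][y][x]:
                    vol += 1
                    for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                        if not occ(z + dz, y + dy, x + dx):
                            faces += 1
    return vol * voxel_size ** 3, faces * voxel_size ** 2
```

Voxel-face surface estimates overshoot a smooth isosurface's area, which is one reason isosurface extraction is preferred for the real measurement.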

  6. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    PubMed

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest, but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis, and show that the tools positively supported a number of well-structured query and analysis tasks. But for several of scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition during exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making the transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and reduce current mismatches with several of scientists' higher-order analytical practices. The implications of these results for tool designs are discussed.

  7. Analysis of real-time vibration data

    USGS Publications Warehouse

    Safak, E.

    2005-01-01

    In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
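    One of the running statistics mentioned above can be sketched in a few lines. This is an illustrative example with simulated ambient noise, not the paper's implementation, and the forgetting factor `alpha` is a hypothetical choice:

```python
import numpy as np

# Sketch: track running mean and mean-square of a vibration signal
# with an exponential forgetting factor, one simple real-time
# statistic of the kind the paper discusses.
def running_stats(samples, alpha=0.01):
    """Return a list of (mean, mean_square), updated sample by sample."""
    mean = 0.0
    mean_sq = 0.0
    out = []
    for x in samples:
        mean = (1 - alpha) * mean + alpha * x
        mean_sq = (1 - alpha) * mean_sq + alpha * x * x
        out.append((mean, mean_sq))
    return out

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 10_000)   # stand-in for low-amplitude ambient data
mean, mean_sq = running_stats(signal)[-1]
# For zero-mean unit-variance noise, mean stays near 0 and
# mean_sq settles near 1.
```

    Because each update uses only the previous estimate and the newest sample, the statistic can be computed while data are being acquired, which is the defining constraint of real-time analysis noted in the abstract.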

  8. Personal Constructions of Biological Concepts--The Repertory Grid Approach

    ERIC Educational Resources Information Center

    McCloughlin, Thomas J. J.; Matthews, Philip S. C.

    2017-01-01

    This work discusses repertory grid analysis as a tool for investigating the structures of students' representations of biological concepts. Repertory grid analysis provides the researcher with a variety of techniques that are not associated with standard methods of concept mapping for investigating conceptual structures. It can provide valuable…

  9. MODA A Framework for Memory Centric Performance Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Su, Chun-Yi; White, Amanda M.

    2012-06-29

    In the age of massive parallelism, the focus of performance analysis has switched from the processor and related structures to the memory and I/O resources. Adapting to this new reality, a performance analysis tool has to provide a way to analyze resource usage to pinpoint existing and potential problems in a given application. This paper provides an overview of the Memory Observant Data Analysis (MODA) tool, a memory-centric tool first implemented on the Cray XMT supercomputer. Throughout the paper, MODA's capabilities have been showcased with experiments done on matrix multiply and Graph-500 application codes.

  10. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  11. BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.

    PubMed

    Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph

    2015-02-21

    Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology and is additionally available as a standalone library.
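    The over/under-representation assessment that enrichment tools of this kind perform is typically a one-sided hypergeometric test. The following is a hedged sketch with made-up numbers, not BiNChE's actual code:

```python
from math import comb

# Sketch of the standard enrichment statistic: the probability of
# observing at least k annotated entities in a query set of size n,
# drawn from a universe of N entities of which K carry the annotation.
def enrichment_pvalue(N, K, n, k):
    """One-sided hypergeometric tail P(X >= k)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical numbers: universe of 1000 molecules, 50 annotated to
# an ontology class; a query set of 20 molecules contains 8 of them.
# The expected count is only 1, so the class is strongly enriched.
p = enrichment_pvalue(1000, 50, 20, 8)
```

    Real tools add multiple-testing correction across ontology classes and propagate annotations up the ontology hierarchy; this sketch shows only the per-class test.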

  12. ISAC - A tool for aeroservoelastic modeling and analysis. [Interaction of Structures, Aerodynamics, and Control

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood T.

    1993-01-01

    This paper discusses the capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  13. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  14. Principal component analysis as a tool for library design: a case study investigating natural products, brand-name drugs, natural product-like libraries, and drug-like libraries.

    PubMed

    Wenderski, Todd A; Stratton, Christopher F; Bauer, Renato A; Kopp, Felix; Tan, Derek S

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design.
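    As a minimal sketch of the PCA workflow the abstract describes, with random stand-in data rather than real molecular descriptors:

```python
import numpy as np

# Project compounds described by physicochemical descriptors onto
# their first two principal components, the 2D "chemical space" map
# used to compare compound classes. Data here are synthetic.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))            # 100 compounds x 6 descriptors

Xc = X - X.mean(axis=0)                  # center each descriptor
cov = np.cov(Xc, rowvar=False)           # 6 x 6 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # sort descending by variance

scores = Xc @ eigvecs[:, order[:2]]      # 2D coordinates for plotting
explained = eigvals[order[:2]].sum() / eigvals.sum()
```

    In library-design use, each point (row of `scores`) is one compound, and classes such as natural products or drug-like compounds can then be compared by where their points fall in the plot.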

  15. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary-layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.

  16. A Practical Tutorial on Modified Condition/Decision Coverage

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.

    2001-01-01

    This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
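    The core MC/DC idea, showing each condition's independent effect on a decision, can be illustrated mechanically. This sketch is not the tutorial's 5-step process, and the decision function is a hypothetical example:

```python
from itertools import product

# A decision with three conditions a, b, c.
def decision(a, b, c):
    return a and (b or c)

# For each condition, find an "independence pair": two test cases that
# differ only in that condition, yet produce different decision outcomes.
# MC/DC requires such a pair for every condition.
def mcdc_pairs(conditions=3):
    cases = list(product([False, True], repeat=conditions))
    pairs = {}
    for i in range(conditions):
        for x in cases:
            y = list(x)
            y[i] = not y[i]
            y = tuple(y)
            if decision(*x) != decision(*y):
                pairs.setdefault(i, (x, y))   # keep the first pair found
    return pairs

pairs = mcdc_pairs()
# All three conditions have an independence pair, so an MC/DC-adequate
# test set exists for this decision.
```

    A coverage tool automates exactly this bookkeeping over real source code; the tutorial's point is that the same reasoning can also be carried out by hand during review.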

  17. Principal Component Analysis as a Tool for Library Design: A Case Study Investigating Natural Products, Brand-Name Drugs, Natural Product-Like Libraries, and Drug-Like Libraries

    PubMed Central

    Wenderski, Todd A.; Stratton, Christopher F.; Bauer, Renato A.; Kopp, Felix; Tan, Derek S.

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design. PMID:25618349

  18. New Tool Released for Engine-Airframe Blade-Out Structural Simulations

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    2004-01-01

    Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to blade-out events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.

  19. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    PubMed

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  20. IMGT/3Dstructure-DB and IMGT/StructuralQuery, a database and a tool for immunoglobulin, T cell receptor and MHC structural data

    PubMed Central

    Kaas, Quentin; Ruiz, Manuel; Lefranc, Marie-Paule

    2004-01-01

    IMGT/3Dstructure-DB and IMGT/StructuralQuery are a novel 3D structure database and a new tool for immunological proteins. They are part of IMGT, the international ImMunoGenetics information system®, a high-quality integrated knowledge resource specializing in immunoglobulins (IG), T cell receptors (TR), major histocompatibility complex (MHC) and related proteins of the immune system (RPI) of human and other vertebrate species, which consists of databases, Web resources and interactive on-line tools. IMGT/3Dstructure-DB data are described according to the IMGT Scientific chart rules based on the IMGT-ONTOLOGY concepts. IMGT/3Dstructure-DB provides IMGT gene and allele identification of IG, TR and MHC proteins with known 3D structures, domain delimitations, amino acid positions according to the IMGT unique numbering and renumbered coordinate flat files. Moreover, IMGT/3Dstructure-DB provides 2D graphical representations (or Collier de Perles) and results of contact analysis. The IMGT/StructuralQuery tool allows search of this database based on specific structural characteristics. IMGT/3Dstructure-DB and IMGT/StructuralQuery are freely available at http://imgt.cines.fr. PMID:14681396

  1. NURBS-Based Geometry for Integrated Structural Analysis

    NASA Technical Reports Server (NTRS)

    Oliver, James H.

    1997-01-01

    This grant was initiated in April 1993 and completed in September 1996. The primary goal of the project was to exploit the emerging de facto CAD standard of Non-Uniform Rational B-spline (NURBS) based curve and surface geometry to integrate and streamline the process of turbomachinery structural analysis. We focused our efforts on critical geometric modeling challenges typically posed by the requirements of structural analysts. We developed a suite of software tools that facilitate pre- and post-processing of NURBS-based turbomachinery blade models for finite element structural analyses. We also developed tools to facilitate the modeling of blades in their manufactured (or cold) state based on nominal operating shape and conditions. All of the software developed in the course of this research is written in the C++ language using the Iris Inventor 3D graphical interface tool-kit from Silicon Graphics. In addition to enhanced modularity, improved maintainability, and efficient prototype development, this design facilitates the re-use of code developed for other NASA projects and provides a uniform and professional 'look and feel' for all applications developed by the Iowa State Team.

  2. Precession technique and electron diffractometry as new tools for crystal structure analysis and chemical bonding determination.

    PubMed

    Avilov, A; Kuligin, K; Nicolopoulos, S; Nickolskiy, M; Boulahya, K; Portillo, J; Lepeshov, G; Sobolev, B; Collette, J P; Martin, N; Robins, A C; Fischione, P

    2007-01-01

    We have developed a new fast electron diffractometer working with high dynamic range and linearity for crystal structure determinations. Electron diffraction (ED) patterns can be scanned serially in front of a Faraday cage detector; the total measurement time for several hundred ED reflections can be tens of seconds, with high statistical accuracy for all measured intensities (1-2%). This new tool can be installed on any type of TEM without any column modification and is linked to a specially developed electron beam precession "Spinning Star" system. Precession of the electron beam (Vincent-Midgley technique) reduces dynamical effects, also allowing use of accurate intensities for crystal structure analysis. We describe the technical characteristics of this new tool together with the first experimental results. Accurate measurement of electron diffraction intensities by electron diffractometer opens new possibilities not only for revealing unknown structures, but also for electrostatic potential determination and chemical bonding investigation. As an example, we present detailed atomic bonding information of CaF(2) as revealed for the first time by precise electron diffractometry.

  3. Elementary Mode Analysis: A Useful Metabolic Pathway Analysis Tool for Characterizing Cellular Metabolism

    PubMed Central

    Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich

    2010-01-01

    Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways, each consisting of a minimal set of enzymes that can support steady-state operation of cellular metabolism, represent independent cellular physiological states. Such pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility, facilitating understanding of cell physiology and implementation of metabolic engineering strategies. This mini-review aims to provide an overview of the development and application of elementary mode analysis as a metabolic pathway analysis tool for studying cell physiology and as a basis of metabolic engineering. PMID:19015845
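    The starting point of elementary mode analysis is the steady-state condition S·v = 0 on the stoichiometric matrix. As a hedged sketch with a toy two-metabolite network (not taken from the review), the admissible flux space can be computed as a null space; elementary modes are additionally the non-decomposable vectors in this space, with non-negative flux through irreversible reactions:

```python
import numpy as np

# Toy stoichiometric matrix S (metabolites x reactions):
#   A is produced by R1 and consumed by R2;
#   B is produced by R2 and consumed by R3 and R4.
S = np.array([
    [1, -1,  0,  0],
    [0,  1, -1, -1],
], dtype=float)

# Steady-state fluxes satisfy S @ v = 0, so they lie in the null
# space of S. Compute it via SVD: rows of Vt past the rank span it.
_, s, Vt = np.linalg.svd(S)
rank = int(np.sum(s > 1e-10))
null_dim = S.shape[1] - rank     # 4 reactions - rank 2 = 2 degrees of freedom
basis = Vt[rank:].T              # columns span the steady-state flux space
```

    Here the two elementary modes are the routes R1→R2→R3 and R1→R2→R4; a full implementation would extract them from this flux space by enforcing minimality and irreversibility constraints.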

  4. Matching spatial with ontological brain regions using Java tools for visualization, database access, and integrated data analysis.

    PubMed

    Bezgin, Gleb; Reid, Andrew T; Schubert, Dirk; Kötter, Rolf

    2009-01-01

    Brain atlases are widely used in experimental neuroscience as tools for locating and targeting specific brain structures. Delineated structures in a given atlas, however, are often difficult to interpret and to interface with database systems that supply additional information using hierarchically organized vocabularies (ontologies). Here we discuss the concept of volume-to-ontology mapping in the context of macroscopical brain structures. We present Java tools with which we have implemented this concept for retrieval of mapping and connectivity data on the macaque brain from the CoCoMac database in connection with an electronic version of "The Rhesus Monkey Brain in Stereotaxic Coordinates" authored by George Paxinos and colleagues. The software, including our manually drawn monkey brain template, can be downloaded freely under the GNU General Public License. It adds value to the printed atlas and has a wider (neuro-)informatics application since it can read appropriately annotated data from delineated sections of other species and organs, and turn them into 3D registered stacks. The tools provide additional features, including visualization and analysis of connectivity data, volume and centre-of-mass estimates, and graphical manipulation of entire structures, which are potentially useful for a range of research and teaching applications.

  5. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  6. RNA2DMut: a web tool for the design and analysis of RNA structure mutations.

    PubMed

    Moss, Walter N

    2018-03-01

    With the widespread application of high-throughput sequencing, novel RNA sequences are being discovered at an astonishing rate. The analysis of function, however, lags behind. In both the cis- and trans-regulatory functions of RNA, secondary structure (2D base-pairing) plays essential regulatory roles. In order to test RNA function, it is essential to be able to design and analyze mutations that can affect structure. This was the motivation for the creation of the RNA2DMut web tool. With RNA2DMut, users can enter RNA sequences to analyze, constrain mutations to specific residues, or limit changes to purines/pyrimidines. The sequence is analyzed at each base to determine the effect of every possible point mutation on 2D structure. The metrics used in RNA2DMut rely on the calculation of the Boltzmann structure ensemble and do not require a robust 2D model of RNA structure for designing mutations. This tool can facilitate a wide array of uses involving RNA: for example, in designing and evaluating mutants for biological assays, interrogating RNA-protein interactions, identifying key regions to alter in SELEX experiments, and improving RNA folding and crystallization properties for structural biology. Additional tools are available to help users introduce other mutations (e.g., indels and substitutions) and evaluate their effects on RNA structure. Example calculations are shown for five RNAs that require 2D structure for their function: the MALAT1 mascRNA, an influenza virus splicing regulatory motif, the EBER2 viral noncoding RNA, the Xist lncRNA repA region, and human Y RNA 5. RNA2DMut can be accessed at https://rna2dmut.bb.iastate.edu/. © 2018 Moss; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
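    The exhaustive point-mutation enumeration that precedes the ensemble scoring can be sketched as follows; `point_mutants` is a hypothetical helper, not part of the RNA2DMut codebase:

```python
# Enumerate every possible point mutation of an RNA sequence: the
# candidate set whose structural effects a tool like RNA2DMut would
# score against the Boltzmann structure ensemble.
def point_mutants(seq, alphabet="ACGU"):
    """Return (position, original_base, new_base, mutant_sequence) tuples."""
    muts = []
    for i, base in enumerate(seq):
        for alt in alphabet:
            if alt != base:
                muts.append((i, base, alt, seq[:i] + alt + seq[i + 1:]))
    return muts

muts = point_mutants("GCAU")
# 4 positions x 3 alternative bases = 12 candidate mutants.
```

    Constraining mutations to specific residues or to purines/pyrimidines, as the tool allows, amounts to filtering this list before scoring.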

  7. A computer program for cyclic plasticity and structural fatigue analysis

    NASA Technical Reports Server (NTRS)

    Kalev, I.

    1980-01-01

    A computerized tool for the analysis of time independent cyclic plasticity structural response, life to crack initiation prediction, and crack growth rate prediction for metallic materials is described. Three analytical items are combined: the finite element method with its associated numerical techniques for idealization of the structural component, cyclic plasticity models for idealization of the material behavior, and damage accumulation criteria for the fatigue failure.

  8. Structural analysis for a 40-story building

    NASA Technical Reports Server (NTRS)

    Hua, L.

    1972-01-01

    NASTRAN was chosen as the principal analytical tool for structural analysis of the Illinois Center Plaza Hotel Building in Chicago, Illinois. The building is a 40-story, reinforced concrete structure utilizing a monolithic slab-column system. The displacements, member stresses, and foundation loads due to wind load, live load, and dead load were obtained through a series of NASTRAN runs. These analyses and the input technique are described.

  9. G23D: Online tool for mapping and visualization of genomic variants on 3D protein structures.

    PubMed

    Solomon, Oz; Kunik, Vered; Simon, Amos; Kol, Nitzan; Barel, Ortal; Lev, Atar; Amariglio, Ninette; Somech, Raz; Rechavi, Gidi; Eyal, Eran

    2016-08-26

    Evaluation of the possible implications of genomic variants is an increasingly important task in the current high-throughput sequencing era. Structural information, however, is still not routinely exploited during this evaluation process. The main reasons can be attributed to the partial structural coverage of the human proteome and the lack of tools that conveniently convert genomic positions, the frequent output of genomic pipelines, to protein and structure coordinates. We present G23D, a tool for conversion of human genomic coordinates to protein coordinates and protein structures. G23D allows mapping of genomic positions/variants onto evolutionarily related (and not only identical) protein three-dimensional (3D) structures as well as onto theoretical models. By doing so, it significantly extends the space of variants for which structural insight is feasible. To facilitate interpretation of the variant consequence, pathogenic variants, functional sites and polymorphism sites are displayed on protein sequence and structure diagrams alongside the input variants. G23D also provides modeling of the mutant structure, analysis of intra-protein contacts, and instant access to functional predictions and predictions of thermostability changes. G23D is available at http://www.sheba-cancer.org.il/G23D. It extends the fraction of variants for which structural analysis is applicable and provides better and faster accessibility of structural data for biologists and geneticists who routinely work with genomic information.
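    The genomic-to-protein coordinate conversion at the heart of such a tool can be illustrated with a simplified sketch (forward strand only, hypothetical exon layout, no splice-site or reading-frame edge cases):

```python
# Map a genomic position to a protein residue number, given the CDS
# exon layout of a transcript. This is an illustrative simplification
# of the conversion G23D performs, not its actual implementation.
def genomic_to_protein(pos, exons):
    """exons: (start, end) genomic intervals of the CDS in translation
    order, 1-based inclusive, forward strand only."""
    cds_offset = 0
    for start, end in exons:
        if start <= pos <= end:
            cds_pos = cds_offset + (pos - start) + 1  # 1-based CDS coordinate
            return (cds_pos + 2) // 3                 # codon index -> residue
        cds_offset += end - start + 1
    return None  # position falls outside the coding sequence

# Hypothetical two-exon CDS: exon 1 covers 100-129 (30 nt), exon 2
# covers 200-259. Position 205 is CDS position 36, i.e. residue 12.
residue = genomic_to_protein(205, [(100, 129), (200, 259)])
```

    A real converter must also handle the reverse strand, UTRs, and transcript-specific annotation, which is why genomic pipelines rarely do this step in-house.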

  10. Understanding and Using the Fermi Science Tools

    NASA Astrophysics Data System (ADS)

    Asercion, Joseph

    2018-01-01

    The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large-Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and interpretation of the results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point-source analysis (generating maps, spectra, and light curves), pulsar timing analysis, source identification, and the use of Python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.

  11. Hybrid Wing Body Planform Design with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Olson, Erik D.

    2011-01-01

    The objective of this paper was to provide an update on NASA's current tools for design and analysis of hybrid wing body (HWB) aircraft, with an emphasis on Vehicle Sketch Pad (VSP). NASA started HWB analysis using the Flight Optimization System (FLOPS). That capability is enhanced using Phoenix Integration's ModelCenter®, which enables multifidelity analysis tools to be linked as an integrated structure. Two major components are linked to FLOPS as an example: a planform discretization tool and VSP. The planform discretization tool ensures the planform is smooth and continuous. VSP is used to display the output geometry. This example shows that a smooth and continuous HWB planform can be displayed as a three-dimensional model and rapidly sized and analyzed.

  12. RNA-Puzzles: A CASP-like evaluation of RNA three-dimensional structure prediction

    PubMed Central

    Cruz, José Almeida; Blanchet, Marc-Frédérick; Boniecki, Michal; Bujnicki, Janusz M.; Chen, Shi-Jie; Cao, Song; Das, Rhiju; Ding, Feng; Dokholyan, Nikolay V.; Flores, Samuel Coulbourn; Huang, Lili; Lavender, Christopher A.; Lisi, Véronique; Major, François; Mikolajczak, Katarzyna; Patel, Dinshaw J.; Philips, Anna; Puton, Tomasz; Santalucia, John; Sijenyi, Fredrick; Hermann, Thomas; Rother, Kristian; Rother, Magdalena; Serganov, Alexander; Skorupski, Marcin; Soltysinski, Tomasz; Sripakdeevong, Parin; Tuszynska, Irina; Weeks, Kevin M.; Waldsich, Christina; Wildauer, Michael; Leontis, Neocles B.; Westhof, Eric

    2012-01-01

    We report the results of a first, collective, blind experiment in RNA three-dimensional (3D) structure prediction, encompassing three prediction puzzles. The goals are to assess the leading edge of RNA structure prediction techniques; compare existing methods and tools; and evaluate their relative strengths, weaknesses, and limitations in terms of sequence length and structural complexity. The results should give potential users insight into the suitability of available methods for different applications and facilitate efforts in the RNA structure prediction community in ongoing efforts to improve prediction tools. We also report the creation of an automated evaluation pipeline to facilitate the analysis of future RNA structure prediction exercises. PMID:22361291

  13. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572
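
    Of the areas surveyed above, metabolic flux analysis lends itself most directly to a small worked example. The sketch below is not from the review; the network, reaction names, and bounds are invented for illustration. It poses flux balance analysis as a linear program with SciPy: maximize a "biomass" flux subject to the steady-state mass balance S·v = 0 and capacity bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (hypothetical, for illustration only):
#   R1: uptake -> A,  R2: A -> B,  R3: B -> biomass
# Rows of S are metabolites A and B; columns are reactions R1..R3.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

# Maximize the biomass flux v3 (linprog minimizes, hence the sign flip),
# subject to steady state S @ v = 0 and an uptake capacity of 5 on R1.
res = linprog(c=[0.0, 0.0, -1.0],
              A_eq=S, b_eq=[0.0, 0.0],
              bounds=[(0, 5), (0, 10), (0, 10)])
print(res.x)  # all fluxes pinned to the uptake limit: [5, 5, 5]
```

    Real reconstructions have thousands of reactions, but the structure of the problem is exactly this linear program at scale.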

  14. Visualization of RNA structure models within the Integrative Genomics Viewer.

    PubMed

    Busan, Steven; Weeks, Kevin M

    2017-07-01

    Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  15. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement. Part 2; Structural Analysis Technologies and Modeling Practices

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.

    2004-01-01

    A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short term and long term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.

  16. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
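
    Code-structure metrics of the kind described in the first section are straightforward to prototype. The sketch below is a generic McCabe-style estimate, not the Space Station analysis team's actual tool: it counts decision points in a Python function's abstract syntax tree, whereas the handbook's tools targeted other languages and a broader metric suite.

```python
import ast

# Node types that introduce an extra execution path (a simplifying choice;
# production complexity tools refine this list).
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """McCabe-style estimate: 1 plus the number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

code = '''
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            print(i)
    return "done"
'''
print(cyclomatic_complexity(code))  # 2 ifs + 1 for + 1 boolean op -> 5
```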

  17. Using enterprise architecture to analyse how organisational structure impact motivation and learning

    NASA Astrophysics Data System (ADS)

    Närman, Pia; Johnson, Pontus; Gingnell, Liv

    2016-06-01

    When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand the consequences of structural changes on performance. This article presents a model-based analysis framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.

  18. Voroprot: an interactive tool for the analysis and visualization of complex geometric features of protein structure.

    PubMed

    Olechnovic, Kliment; Margelevicius, Mindaugas; Venclovas, Ceslovas

    2011-03-01

    We present Voroprot, an interactive cross-platform software tool that provides a unique set of capabilities for exploring geometric features of protein structure. Voroprot allows the construction and visualization of the Apollonius diagram (also known as the additively weighted Voronoi diagram), the Apollonius graph, protein alpha shapes, interatomic contact surfaces, solvent accessible surfaces, pockets and cavities inside protein structure. Voroprot is available for Windows, Linux and Mac OS X operating systems and can be downloaded from http://www.ibt.lt/bioinformatics/voroprot/.
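
    The additively weighted Voronoi (Apollonius) partition that Voroprot constructs analytically can be illustrated crudely on a grid: assign each sample point to the atom minimizing Euclidean distance minus the atom's radius. The sketch below is a rough numerical stand-in, not Voroprot's algorithm, and the atom coordinates and radii are invented.

```python
import numpy as np

# Toy "atoms": positions and radii (the weights of the Apollonius diagram).
atoms = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0], [1.5, 2.5, 0.0]])
radii = np.array([1.0, 1.5, 1.2])

# Sample a grid and assign each point to the atom minimizing
# (Euclidean distance - radius): the additively weighted Voronoi rule.
grid = np.stack(np.meshgrid(*[np.linspace(-2, 5, 40)] * 3), axis=-1).reshape(-1, 3)
d = np.linalg.norm(grid[:, None, :] - atoms[None, :, :], axis=2) - radii
owner = d.argmin(axis=1)

# Approximate cell volumes: count of grid points times the voxel volume.
volumes = np.bincount(owner, minlength=len(atoms)) * (7 / 39) ** 3
print(volumes)
```

    A grid approximation like this is far too coarse for contact-surface or cavity work, which is precisely why tools such as Voroprot compute the diagram exactly.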

  19. Potential and Limitations of the Modal Characterization of a Spacecraft Bus Structure by Means of Active Structure Elements

    NASA Technical Reports Server (NTRS)

    Grillenbeck, Anton M.; Dillinger, Stephan A.; Elliott, Kenny B.

    1998-01-01

    Theoretical and experimental studies have been performed to investigate the potential and limitations of the modal characterization of a typical spacecraft bus structure by means of active structure elements. The aim of these studies has been to test and advance tools for performing an accurate on-orbit modal identification, which may be characterized by the availability of a generally very limited test instrumentation, autonomous excitation capabilities by active structure elements, and a zero-g environment. The NASA LaRC CSI Evolutionary Testbed provided an excellent object for the experimental part of this study program. The main subjects of investigation were: (1) the selection of optimum excitation and measurement to unambiguously identify modes of interest; (2) the applicability of different types of excitation means with focus on active structure elements; and (3) the assessment of the modal identification potential of different types of excitation functions and modal analysis tools. Conventional as well as dedicated modal analysis tools were applied to determine modal parameters and mode shapes. The results will be presented and discussed based on orthogonality checks as well as on suitable indicators for the quality of the acquired modes with respect to modal purity. In particular, the suitability for modal analysis of the acquired frequency response functions as obtained by excitation with active structure elements will be demonstrated with the help of reciprocity checks. Finally, the results will be summarized in a procedure to perform an on-orbit modal identification, including an indication of limitations to be observed.

  20. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    PubMed

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. 
By using illustrative examples that include published mathematical models describing lactation in cattle, we show how structural identifiability analysis can contribute to advancing mathematical modelling in animal science towards the production of useful models and, moreover, highly informative experiments via optimal experiment design. Rather than attempting to impose a systematic identifiability analysis to the modelling community during model developments, we wish to open a window towards the discovery of a powerful tool for model construction and experiment design.
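
    The identifiability notion can be made concrete with a minimal numerical check: a local, sensitivity-based surrogate for the symbolic structural analysis the paper advocates. The model and parameter values below are invented for illustration. If two parameters affect the observed output only through their sum, the output-sensitivity matrix is rank deficient and the parameters cannot be estimated individually from any data.

```python
import numpy as np

def simulate(params, x0=1.0, times=np.linspace(0.0, 2.0, 20)):
    """Toy one-compartment model dx/dt = -(k1 + k2) * x, observed y = x.
    Closed form, so no ODE solver is needed."""
    k1, k2 = params
    return x0 * np.exp(-(k1 + k2) * times)

def sensitivity_rank(params, eps=1e-6):
    """Rank of the finite-difference output-sensitivity matrix. Full rank
    suggests local identifiability; a deficiency shows its absence."""
    base = simulate(params)
    cols = []
    for i in range(len(params)):
        p = np.array(params, dtype=float)
        p[i] += eps
        cols.append((simulate(p) - base) / eps)
    S = np.column_stack(cols)
    return np.linalg.matrix_rank(S, tol=1e-8)

print(sensitivity_rank([0.3, 0.5]))  # 1 < 2: only the sum k1 + k2 is identifiable
```

    Dedicated software performs this analysis symbolically on the model structure itself, which is what makes the property "structural" rather than data dependent.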

  1. Feasibility study tool for semi-rigid joints design of high-rise buildings steel structures

    NASA Astrophysics Data System (ADS)

    Bagautdinov, Ruslan; Monastireva, Daria; Bodak, Irina; Potapova, Irina

    2018-03-01

    There are many ways to assess the final cost of high-rise building steel structures and to determine which design variations are the most effective from different points of view. Research by Jaakko Haapio, conducted at Tampere University of Technology, aims to develop a method for determining the manufacturing and installation costs of steel structures already at the tender phase, while taking their details into account. This paper analyses the Feature-Based Costing Method for skeletal steel structures proposed by Jaakko Haapio. The most appropriate ways to improve the tool and to implement it under Russian conditions for high-rise building design are derived. The presented tool can be useful not only for designers but also for steel structure manufacturing organizations, helping them utilize BIM technologies in process organization and factory control.

  2. Large Angle Transient Dynamics (LATDYN) user's manual

    NASA Technical Reports Server (NTRS)

    Abrahamson, A. Louis; Chang, Che-Wei; Powell, Michael G.; Wu, Shih-Chin; Bingel, Bradford D.; Theophilos, Paula M.

    1991-01-01

    A computer code for modeling the large angle transient dynamics (LATDYN) of structures was developed to investigate techniques for analyzing flexible deformation and control/structure interaction problems associated with large angular motions of spacecraft. This type of analysis is beyond the routine capability of conventional analytical tools without simplifying assumptions. In some instances, the motion may be sufficiently slow and the spacecraft (or component) sufficiently rigid to simplify analyses of dynamics and controls by making pseudo-static and/or rigid body assumptions. The LATDYN introduces a new approach to the problem by combining finite element structural analysis, multi-body dynamics, and control system analysis in a single tool. It includes a type of finite element that can deform and rotate through large angles at the same time, and which can be connected to other finite elements either rigidly or through mechanical joints. The LATDYN also provides symbolic capabilities for modeling control systems which are interfaced directly with the finite element structural model. Thus, the nonlinear equations representing the structural model are integrated along with the equations representing sensors, processing, and controls as a coupled system.
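
    The coupling idea, structural dynamics and control law integrated as one state system, can be sketched in a few lines. The sketch below is a one-degree-of-freedom stand-in, not LATDYN's finite element formulation; the modal constants and controller gains are illustrative.

```python
import numpy as np

# Coupled structure/control sketch: one flexible mode plus a PD controller,
# integrated together as a single system (illustrative values throughout).
m, k = 1.0, 4.0          # modal mass and stiffness
kp, kd = 2.0, 1.5        # proportional and derivative control gains
x, v = 1.0, 0.0          # initial deflection and rate
dt = 1e-3

for _ in range(20000):   # 20 s of simulated time
    u = -kp * x - kd * v            # control force from the sensed state
    a = (-k * x + u) / m            # structural dynamics with control input
    x, v = x + dt * v, v + dt * a   # explicit Euler step

print(abs(x))  # effectively zero: the closed loop damps the mode out
```

    LATDYN's contribution was doing this at full finite element fidelity with large rigid-body rotations, where the structure and controller equations cannot be decoupled this cleanly.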

  3. Structured Analysis of the Logistic Support Analysis (LSA) Task, ’Integrated Logistic Support (ILS) Assessment Maintenance Planning E-1 Element’ (APJ 966-204)

    DTIC Science & Technology

    1988-10-01

    Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for...in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.

  4. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models with different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. As one result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
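
    The final step mentioned, Zernike decomposition of FE deformations, reduces to a least-squares projection onto the Zernike basis. The sketch below is generic, not ESO's ANSYS toolbox; the sample points stand in for FE nodes on the pupil, and the synthetic deformation coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample points on the unit pupil (standing in for FE nodes on the aperture).
r = np.sqrt(rng.uniform(0, 1, 2000))
t = rng.uniform(0, 2 * np.pi, 2000)
x, y = r * np.cos(t), r * np.sin(t)

# First four Zernike modes (unnormalized, for simplicity):
# piston, x-tilt, y-tilt, defocus.
basis = np.column_stack([np.ones_like(r), x, y, 2 * r**2 - 1])

# Synthetic mirror deformation: 0.2 waves of x-tilt + 0.5 waves of defocus.
w = 0.2 * basis[:, 1] + 0.5 * basis[:, 3]

# Best-fit Zernike coefficients by linear least squares.
coeffs, *_ = np.linalg.lstsq(basis, w, rcond=None)
print(np.round(coeffs, 6))  # recovers ~[0, 0.2, 0, 0.5]
```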

  5. CometBoards Users Manual Release 1.0

    NASA Technical Reports Server (NTRS)

    Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo

    1996-01-01

    Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concept of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems have been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this users manual. 
CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.
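
    The minimum-weight formulation that CometBoards automates can be illustrated on a deliberately tiny problem: a hypothetical symmetric two-bar truss with a single stress constraint, solved here with SciPy's SLSQP rather than any of CometBoards' own optimizers. All loads and allowables are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-bar truss: load P at the apex, bars inclined at theta,
# design variable is the common cross-sectional area A (consistent units).
P, theta, sigma_allow, rho_L = 1000.0, np.pi / 4, 250.0, 1.0

weight = lambda A: rho_L * 2 * A[0]                         # objective: member weight
stress_margin = lambda A: sigma_allow - P / (2 * A[0] * np.cos(theta))

res = minimize(weight, x0=[10.0], method="SLSQP",
               bounds=[(1e-3, None)],
               constraints=[{"type": "ineq", "fun": stress_margin}])
A_opt = res.x[0]
print(A_opt)  # matches the analytic optimum P / (2 * sigma_allow * cos(theta))
```

    CometBoards solves the same kind of problem with full finite element analysis supplying the constraints, and lets the user swap both the optimizer and the analysis method.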

  6. Establishing a Measurement Tool for a Nursing Work Environment in Taiwan.

    PubMed

    Lin, Li-Chiu; Lee, Huan-Fang; Yen, Miaofen

    2017-02-01

    The nursing work environment is a critical global health care problem. Many health care providers are concerned about the associations between the nursing work environment and the outcomes of organizations, nurses, and patients. Nursing work environment instruments have been assessed in the West but have not been considered in Asia. However, different cultures will affect the factorial structure of the tool. Using a stratified nationwide random sample, we created a measurement tool for the nursing work environment in Taiwan. The Nursing Work Environment Index-Revised Scale and the Essentials of Magnetism scale were used to examine the factorial structure. Item analysis, exploratory factor analysis, and confirmatory factor analysis were used to examine the hypothesized model and generate a new factorial structure. The Taiwan Nursing Work Environment Index (TNWEI) was established to evaluate the nursing work environment in Taiwan. The four factors were labeled "Organizational Support" (7 items), "Nurse Staffing and Resources" (4 items), "Nurse-Physician Collaboration" (4 items), and "Support for Continuing Education" (4 items). The 19 items explained 58.5% of the variance. Confirmatory factor analysis showed a good fit to the model (χ²/df = 5.99, p < .05; goodness-of-fit index [GFI] = .90; RMSEA = .07). The TNWEI provides a comprehensive and efficient method for measuring the nurses' work environment in Taiwan.

  7. Computer program to assess impact of fatigue and fracture criteria on weight and cost of transport aircraft

    NASA Technical Reports Server (NTRS)

    Tanner, C. J.; Kruse, G. S.; Oman, B. H.

    1975-01-01

    A preliminary design analysis tool for rapidly performing trade-off studies involving fatigue, fracture, static strength, weight, and cost is presented. Analysis subprograms were developed for fatigue life, crack growth life, and residual strength; and linked to a structural synthesis module which in turn was integrated into a computer program. The part definition module of a cost and weight analysis program was expanded to be compatible with the upgraded structural synthesis capability. The resultant vehicle design and evaluation program is named VDEP-2. It is an accurate and useful tool for estimating purposes at the preliminary design stage of airframe development. A sample case along with an explanation of program applications and input preparation is presented.
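
    A crack-growth-life subprogram of the kind described typically integrates a Paris-type growth law cycle by cycle. The sketch below is a generic illustration with invented material constants and an idealized center-crack geometry, not VDEP-2's actual models.

```python
import math

# Hypothetical Paris-law crack growth: da/dN = C * (dK)^m, dK = dS * sqrt(pi * a).
C, m = 1e-12, 3.0          # material constants (illustrative units: m, MPa*sqrt(m))
dS = 100.0                 # applied stress range, MPa
a, a_crit = 0.001, 0.02    # initial and critical crack lengths, m

# Forward-Euler integration in blocks of 1000 cycles until the crack
# reaches its critical length.
cycles, step = 0, 1000
while a < a_crit:
    dK = dS * math.sqrt(math.pi * a)
    a += C * dK**m * step
    cycles += step

print(cycles)  # on the order of 9 million cycles for these constants
```

    In a trade-study tool this life estimate would be evaluated inside the sizing loop, alongside static strength, weight and cost, for every candidate structural gauge.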

  8. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-06-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, geologist, and geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth from 8000 to 10,000 ft.

  9. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-09-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, geologist, and geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth from 8000 to 10,000 ft.

  10. BDA: A novel method for identifying defects in body-centered cubic crystals.

    PubMed

    Möller, Johannes J; Bitzek, Erik

    2016-01-01

    The accurate and fast identification of crystallographic defects plays a key role for the analysis of atomistic simulation output data. For face-centered cubic (fcc) metals, most existing structure analysis tools allow for the direct distinction of common defects, such as stacking faults or certain low-index surfaces. For body-centered cubic (bcc) metals, on the other hand, a robust way to identify such defects is currently not easily available. We therefore introduce a new method for analyzing atomistic configurations of bcc metals, the BCC Defect Analysis (BDA). It uses existing structure analysis algorithms and combines their results to uniquely distinguish between typical defects in bcc metals. In essence, the BDA method offers the following features: (1) identification of typical defect structures in bcc metals; (2) reduction of erroneously identified defects by iterative comparison to the defects in the atom's neighborhood; (3) availability as a ready-to-use Python script for the widespread visualization tool OVITO (http://ovito.org).
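
    The core idea, classifying atoms by their local environment, can be approximated with a simple coordination-number count: bulk bcc atoms have exactly 8 first neighbors, so anything else is surface or defect. This is a crude stand-in for the structure-analysis algorithms BDA actually combines; the lattice below is a toy 4x4x4 block.

```python
import numpy as np

a0 = 1.0  # lattice parameter (reduced units)
# Build a small bcc block: simple-cubic corner sites plus body centers.
cells = np.arange(4)
base = np.array(np.meshgrid(cells, cells, cells)).T.reshape(-1, 3).astype(float)
atoms = np.vstack([base, base + 0.5]) * a0

# Count neighbors within a cutoff placed between the 1st (sqrt(3)/2 * a0)
# and 2nd (a0) bcc neighbor shells.
cutoff = 0.95 * a0
d = np.linalg.norm(atoms[:, None, :] - atoms[None, :, :], axis=2)
coord = ((d > 1e-9) & (d < cutoff)).sum(axis=1)

# Bulk bcc atoms have 8 first neighbors; anything else is surface or defect.
print((coord == 8).sum(), len(atoms))
```

    Coordination counts alone misclassify many configurations (e.g. they cannot tell a stacking-fault-like environment from a surface), which is why BDA cross-checks several analysis methods against each atom's neighborhood.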

  11. CAVER 3.0: A Tool for the Analysis of Transport Pathways in Dynamic Protein Structures

    PubMed Central

    Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri

    2012-01-01

    Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz. PMID:23093919

  12. CAVER 3.0: a tool for the analysis of transport pathways in dynamic protein structures.

    PubMed

    Chovancova, Eva; Pavelka, Antonin; Benes, Petr; Strnad, Ondrej; Brezovsky, Jan; Kozlikova, Barbora; Gora, Artur; Sustr, Vilem; Klvana, Martin; Medek, Petr; Biedermannova, Lada; Sochor, Jiri; Damborsky, Jiri

    2012-01-01

    Tunnels and channels facilitate the transport of small molecules, ions and water solvent in a large variety of proteins. Characteristics of individual transport pathways, including their geometry, physico-chemical properties and dynamics are instrumental for understanding of structure-function relationships of these proteins, for the design of new inhibitors and construction of improved biocatalysts. CAVER is a software tool widely used for the identification and characterization of transport pathways in static macromolecular structures. Herein we present a new version of CAVER enabling automatic analysis of tunnels and channels in large ensembles of protein conformations. CAVER 3.0 implements new algorithms for the calculation and clustering of pathways. A trajectory from a molecular dynamics simulation serves as the typical input, while detailed characteristics and summary statistics of the time evolution of individual pathways are provided in the outputs. To illustrate the capabilities of CAVER 3.0, the tool was applied for the analysis of molecular dynamics simulation of the microbial enzyme haloalkane dehalogenase DhaA. CAVER 3.0 safely identified and reliably estimated the importance of all previously published DhaA tunnels, including the tunnels closed in DhaA crystal structures. Obtained results clearly demonstrate that analysis of molecular dynamics simulation is essential for the estimation of pathway characteristics and elucidation of the structural basis of the tunnel gating. CAVER 3.0 paves the way for the study of important biochemical phenomena in the area of molecular transport, molecular recognition and enzymatic catalysis. The software is freely available as a multiplatform command-line application at http://www.caver.cz.

  13. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages, where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high-precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB-based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of the structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady-state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model.
The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature-varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model and makes the structural and thermal models compatible, in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature-dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, we present applications in which the thermal modeling is "in the loop" with sensitivity analysis, optimization, and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST).
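The linear-conduction-network idea described above can be sketched in a few lines. The following is a toy illustration, not IMOS code: two-node conduction elements are assembled into a conductance matrix, prescribed temperatures are moved to the right-hand side, and temperature-dependent conductances are handled by a simple fixed-point iteration (IMOS itself uses a restricted step size Newton method). All element data and values are hypothetical.

```python
import numpy as np

def assemble_conduction(n_nodes, elements, T):
    """Assemble the conduction matrix G from two-node elements.

    Each element is (i, j, k_fn) where k_fn(T_avg) returns the
    conductance, allowing temperature-dependent materials.
    """
    G = np.zeros((n_nodes, n_nodes))
    for i, j, k_fn in elements:
        k = k_fn(0.5 * (T[i] + T[j]))  # evaluate at mean element temperature
        G[i, i] += k; G[j, j] += k
        G[i, j] -= k; G[j, i] -= k
    return G

def solve_steady_state(n_nodes, elements, Q, fixed, tol=1e-8, max_iter=50):
    """Fixed-point iteration for G(T) T = Q with prescribed temperatures."""
    T = np.zeros(n_nodes)
    for node, temp in fixed.items():
        T[node] = temp
    free = [n for n in range(n_nodes) if n not in fixed]
    for _ in range(max_iter):
        G = assemble_conduction(n_nodes, elements, T)
        # move known-temperature contributions to the right-hand side
        rhs = Q[free] - G[np.ix_(free, list(fixed))] @ np.array(list(fixed.values()))
        T_free = np.linalg.solve(G[np.ix_(free, free)], rhs)
        if np.max(np.abs(T_free - T[free])) < tol:
            T[free] = T_free
            break
        T[free] = T_free
    return T

# Three-node rod: node 0 held at 300 K, 10 W load at node 2, k = 2 W/K.
elements = [(0, 1, lambda T: 2.0), (1, 2, lambda T: 2.0)]
Q = np.array([0.0, 0.0, 10.0])
T = solve_steady_state(3, elements, Q, fixed={0: 300.0})
```

With constant conductances the iteration converges immediately; each element carries the full 10 W, so the temperature rises by 5 K per element along the rod.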

  14. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

Application performance is hindered by a variety of factors, most notably the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key to optimizing its performance, and understanding application performance properties is facilitated by various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools, such as the cache simulator Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics due to the emphasis on the tool's output, which consists of trace files.
Moreover, the tool depends on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss the theoretical benefits that could be achieved if such modules were present.
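The per-process/thread/function/variable mapping described above can be pictured with a minimal data model. This is a hypothetical sketch of the idea, not Gleipnir's actual trace format: each memory transaction carries its context, and the flat stream rolls up into counts per (function, variable).

```python
from collections import Counter, namedtuple

# Hypothetical trace record; Gleipnir's real format differs, but each
# transaction carries the access type, address, size, and the owning
# thread, function, and (when resolvable) data structure or variable.
Access = namedtuple("Access", "op addr size tid func var")

def aggregate(trace):
    """Roll a flat transaction stream up into per-(function, variable) counts."""
    counts = Counter()
    for rec in trace:
        counts[(rec.func, rec.var)] += 1
    return counts

trace = [
    Access("L", 0x1000, 8, 0, "solve", "matrix_a"),  # load
    Access("S", 0x2000, 8, 0, "solve", "rhs"),       # store
    Access("L", 0x1008, 8, 1, "solve", "matrix_a"),
]
counts = aggregate(trace)
```

The same stream could equally be grouped by thread or by address range; the point is that every record retains enough context to attribute traffic to a data structure.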

  15. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, with emphasis on its structural reliability assessment capability. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
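The component-reliability computation can be illustrated with a brute-force stand-in. NESSUS uses advanced probabilistic algorithms rather than plain sampling; the Monte Carlo sketch below, with made-up stress/strength numbers, only shows the underlying limit-state idea of estimating P(response >= resistance).

```python
import random

def failure_probability(response, resistance, n=200_000, seed=1):
    """Plain Monte Carlo estimate of P(response >= resistance).

    `response` and `resistance` are (mean, std) pairs for normally
    distributed variables; a failure is a sample where the structural
    response meets or exceeds the structural resistance.
    """
    rng = random.Random(seed)
    mu_s, sd_s = response
    mu_r, sd_r = resistance
    failures = sum(
        rng.gauss(mu_s, sd_s) >= rng.gauss(mu_r, sd_r) for _ in range(n)
    )
    return failures / n

# Hypothetical case: stress ~ N(300, 30) MPa vs. strength ~ N(400, 40) MPa.
# The margin is then N(100, 50), so the exact answer is P(Z >= 2) ~ 0.0228.
pf = failure_probability(response=(300.0, 30.0), resistance=(400.0, 40.0))
```

Sampling converges slowly for small probabilities, which is exactly why codes like NESSUS pair structural models with faster probabilistic algorithms.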

  16. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, with emphasis on its structural reliability assessment capability. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  17. Nonlinear Shaping Architecture Designed with Using Evolutionary Structural Optimization Tools

    NASA Astrophysics Data System (ADS)

    Januszkiewicz, Krystyna; Banachowicz, Marta

    2017-10-01

The paper explores the possibilities of using Evolutionary Structural Optimization (ESO) digital tools in integrated structural and architectural design, in response to current needs geared towards sustainability, combining ecological and economic efficiency. The first part of the paper defines the Evolutionary Structural Optimization tools, which were developed specifically for engineering purposes using finite element analysis as a framework. The development of ESO has led to several incarnations, which are all briefly discussed (Additive ESO, Bi-directional ESO, Extended ESO). The second part presents results of using these tools in structural and architectural design. Actual building projects in which optimization was part of the original design process are presented (the Crematorium in Kakamigahara, Gifu, Japan, 2006, and SANAA's Learning Centre, EPFL in Lausanne, Switzerland, 2008, among others). The conclusion emphasizes that structural engineering and architectural design mean directing attention to the solutions used by Nature, designing works that are optimally shaped and form their own environments. Architectural forms never constitute the optimum shape derived through a form-finding process driven only by structural optimization, but rather embody and integrate a multitude of parameters. It might be assumed that there is a similarity between these processes in nature and the presented design methods. Contemporary digital methods make the simulation of such processes possible, and thus enable us to refer back to the empirical methods of previous generations.
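The hard-kill removal loop at the core of basic ESO can be sketched in a few lines. This toy, with hypothetical stress values, reuses a fixed stress field purely to show the control flow; a real ESO implementation recomputes element stresses by finite element analysis after every removal step.

```python
def evolutionary_optimize(element_stress, rejection_ratio=0.1, target_fraction=0.5):
    """Toy hard-kill ESO loop.

    Repeatedly delete elements whose stress falls below a fraction
    (rejection_ratio) of the current maximum, until only target_fraction
    of the material remains. Returns the indices of surviving elements.
    """
    alive = dict(enumerate(element_stress))
    target = max(1, int(target_fraction * len(element_stress)))
    while len(alive) > target:
        threshold = rejection_ratio * max(alive.values())
        removable = [e for e, s in alive.items() if s < threshold]
        if not removable:
            break  # nothing under-stressed enough to remove
        for e in removable[: len(alive) - target]:
            del alive[e]
    return sorted(alive)

# Six hypothetical elements; the three lightly stressed ones are removed.
kept = evolutionary_optimize([0.05, 0.9, 0.02, 0.7, 0.6, 0.01], target_fraction=0.5)
```

The bi-directional variant mentioned in the abstract additionally re-adds material next to highly stressed regions, so the design can recover from over-aggressive removal.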

  18. Experimental Modal Analysis and Dynamic Strain Fiber Bragg Gratings for Structural Health Monitoring of Composite Aerospace Structures

    NASA Astrophysics Data System (ADS)

    Panopoulou, A.; Fransen, S.; Gomez Molinero, V.; Kostopoulos, V.

    2012-07-01

    The objective of this work is to develop a new structural health monitoring system for composite aerospace structures based on dynamic response strain measurements and experimental modal analysis techniques. Fibre Bragg Grating (FBG) optical sensors were used for monitoring the dynamic response of the composite structure. The structural dynamic behaviour has been numerically simulated and experimentally verified by means of vibration testing. The hypothesis of all vibration tests was that actual damage in composites reduces their stiffness, producing the same effect as a mass increase. Thus, damage was simulated by slightly varying the local mass of the structure at different zones. Experimental modal analysis based on the strain responses was conducted, and the extracted strain mode shapes were the input for the damage detection expert system. A feed-forward back-propagation neural network was the core of the damage detection system. The features input to the neural network consisted of the strain mode shapes extracted from the experimental modal analysis. Dedicated training and validation activities were carried out based on the experimental results. The system showed high reliability, confirmed by the ability of the neural network to recognize the size and the position of damage on the structure. The experiments were performed on a real structure, i.e., a lightweight antenna sub-reflector, manufactured and tested at EADS CASA ESPACIO. An integrated FBG sensor network, based on the advantage of multiplexing, was mounted on the structure with optimum topology. Numerical simulation of both structures was used as a support tool at all steps of the work. Potential applications for the proposed system are during extensive ground qualification tests of space structures and, during the mission, as an on-board modal analysis tool able to identify a potential failure via the FBG responses.
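The damage-detection core, a feed-forward network trained by back-propagation on strain mode shape features, can be sketched as follows. Everything here is synthetic: the "strain mode shapes" are random vectors and the damage label is a made-up rule, standing in for features extracted from the FBG-based experimental modal analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in features: each row plays the role of a flattened
# strain mode shape; the label marks the (hypothetical) damage zone.
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)  # invented damage rule

# One hidden layer with sigmoid units, trained by plain back-propagation.
W1 = rng.normal(scale=0.5, size=(8, 6)); b1 = np.zeros(6)
W2 = rng.normal(scale=0.5, size=(6, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    h = sig(X @ W1 + b1)                    # forward pass, hidden layer
    p = sig(h @ W2 + b2)[:, 0]              # predicted damage probability
    g_out = (p - y)[:, None] / len(X)       # cross-entropy gradient at output
    g_h = g_out @ W2.T * h * (1 - h)        # back-propagate to hidden layer
    W2 -= h.T @ g_out; b2 -= g_out.sum(0)   # gradient-descent updates
    W1 -= X.T @ g_h;   b1 -= g_h.sum(0)

accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

In the actual system the network outputs would encode both the size and the position of the damage, and training data would come from the vibration tests with locally added mass.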

  19. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's life cycle. In State Analysis, requirements, models, and operations information are consistent State Analysis artifacts stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  20. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  1. A Simple Tool for the Design and Analysis of Multiple-Reflector Antennas in a Multi-Disciplinary Environment

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea

    2000-01-01

    The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.

  2. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  3. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. In rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.
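The flavor of the parameter update can be shown with a regularized least-squares toy: measured static deflections pull five parameters away from their nominal values only as far as the data demands. The sensitivity matrix, noise level, and "true" parameters below are all invented; the actual work uses probabilistic analysis tools rather than this simple regularized fit.

```python
import numpy as np

# Hypothetical linearized model: deflections d = S @ theta, with a
# sensitivity matrix S mapping five parameters to six measurement
# locations, mirroring the six-deflection / five-parameter setup above.
rng = np.random.default_rng(3)
S = rng.normal(size=(6, 5))
theta_true = np.array([1.1, 0.9, 1.2, 1.0, 0.8])
d_meas = S @ theta_true + rng.normal(scale=0.01, size=6)  # noisy "test data"

# Regularized update: pull toward the nominal parameters theta0 with
# prior weight lam, so uncertain parameters move only as far as the
# measurements demand (the probabilistic idea in deterministic dress).
theta0 = np.ones(5)
lam = 0.1
A = S.T @ S + lam * np.eye(5)
b = S.T @ d_meas + lam * theta0
theta_upd = np.linalg.solve(A, b)

residual_before = np.linalg.norm(S @ theta0 - d_meas)
residual_after = np.linalg.norm(S @ theta_upd - d_meas)
```

As in the abstract's conclusion, individual responses need not match exactly, but the overall residual drops substantially for moderate parameter changes.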

  4. A User's Guide to Topological Data Analysis

    ERIC Educational Resources Information Center

    Munch, Elizabeth

    2017-01-01

    Topological data analysis (TDA) is a collection of powerful tools that can quantify shape and structure in data in order to answer questions from the data's domain. This is done by representing some aspect of the structure of the data in a simplified topological signature. In this article, we introduce two of the most commonly used topological…

  5. Honeycomb: Visual Analysis of Large Scale Social Networks

    NASA Astrophysics Data System (ADS)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

    The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not well suited to handling the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger-scale data (with millions of connections), which we illustrate using a large corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns, and discuss lessons learned during design and implementation.

  6. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  7. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  8. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  9. Development and application of an algorithm to compute weighted multiple glycan alignments.

    PubMed

    Hosoda, Masae; Akune, Yukie; Aoki-Kinoshita, Kiyoko F

    2017-05-01

    A glycan consists of monosaccharides linked by glycosidic bonds; it has branches and forms complex molecular structures. Databases have been developed to store large amounts of glycan-binding experiments, including glycan arrays with glycan-binding proteins. However, there are few bioinformatics techniques to analyze large amounts of data for glycans, because there are few tools that can handle the complexity of glycan structures. Thus, we have developed the MCAW (Multiple Carbohydrate Alignment with Weights) tool that can align multiple glycan structures, to aid in the understanding of their function as binding recognition molecules. We have described in detail the first algorithm to perform multiple glycan alignments by modeling glycans as trees. To test our tool, we prepared several data sets, and as a result, we found that the glycan motif could be successfully aligned without any prior knowledge applied to the tool, and the known recognition binding sites of glycans could be aligned at a high rate amongst all our datasets tested. We thus claim that our tool is able to find meaningful glycan recognition and binding patterns using data obtained by glycan-binding experiments. The development and availability of an effective multiple glycan alignment tool opens possibilities for many other glycoinformatics analyses, making this work a big step towards furthering glycomics analysis. Availability: http://www.rings.t.soka.ac.jp. Contact: kkiyoko@soka.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  10. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight

    PubMed Central

    Chéron, Jean-Baptiste; Triki, Dhoha; Senac, Caroline; Flatters, Delphine; Camproux, Anne-Claude

    2017-01-01

    Protein flexibility is often implied in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through available redundant structures of a target, called multiple target conformations (MTC), obtained using experimental or modeling methods, under different biological conditions, or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we have developed an efficient tool, SA-conf, dedicated to capturing and linking the amino acid and local structure variability and analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB, obtained using NMR, crystallography, or homology models. This tool could also be applied to analyze MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective in quantifying the structural variability of an MTC set and in localizing the structurally variable positions and regions of the target. By selecting adapted MTC subsets and comparing their variability detected by SA-conf, we highlighted different sources of target flexibility, such as that induced by a binding partner, by mutation, and intrinsic flexibility. Our results support the interest of mining available structures associated with a target using SA-conf to offer valuable insight into target flexibility and interaction mechanisms. 
The SA-conf executable script, along with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels. PMID:28817602

  11. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight.

    PubMed

    Regad, Leslie; Chéron, Jean-Baptiste; Triki, Dhoha; Senac, Caroline; Flatters, Delphine; Camproux, Anne-Claude

    2017-01-01

    Protein flexibility is often implied in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through available redundant structures of a target, called multiple target conformations (MTC), obtained using experimental or modeling methods, under different biological conditions, or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we have developed an efficient tool, SA-conf, dedicated to capturing and linking the amino acid and local structure variability and analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB, obtained using NMR, crystallography, or homology models. This tool could also be applied to analyze MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective in quantifying the structural variability of an MTC set and in localizing the structurally variable positions and regions of the target. By selecting adapted MTC subsets and comparing their variability detected by SA-conf, we highlighted different sources of target flexibility, such as that induced by a binding partner, by mutation, and intrinsic flexibility. Our results support the interest of mining available structures associated with a target using SA-conf to offer valuable insight into target flexibility and interaction mechanisms. 
The SA-conf executable script, along with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels.

  12. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

    A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities, and limited correlation results are presented in this report. Correlations with scale-model wind tunnel results show that the analysis can adequately predict trends of vibration variation with airspeed and higher harmonic control effects. Predictions of absolute vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.

  13. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
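The two ingredients of this analysis translate directly into code. The sketch below, with a made-up four-atom planar cluster, computes a PCA shape descriptor (principal-axis spreads of the atomic positions) for one time step, and the Pearson correlation between two frames' interatomic distance sets; it illustrates the stated method, not the authors' code.

```python
import numpy as np

def shape_descriptor(positions):
    """Principal-axis spreads of a cluster from PCA of atomic positions.

    Eigenvalues of the position covariance matrix give the variance
    along each principal axis; their ratios summarize the cluster's
    geometric shape (sphere-like, oblate, prolate) at one MD time step.
    """
    centered = positions - positions.mean(axis=0)
    cov = centered.T @ centered / len(positions)
    return np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending

def bond_correlation(dists_a, dists_b):
    """Pearson correlation between two frames' interatomic distance sets."""
    a, b = np.asarray(dists_a, float), np.asarray(dists_b, float)
    return float(np.corrcoef(a, b)[0, 1])

# Toy 4-atom planar cluster: PCA flags the zero out-of-plane spread.
pos = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0],
                [0.0, 0.5, 0.0], [0.0, -0.5, 0.0]])
axes = shape_descriptor(pos)

# Identical bond patterns up to uniform scaling correlate perfectly.
r = bond_correlation([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

Tracking `axes` over every time step gives the stability measure described above; comparing bond-distance correlations across frames (or cluster sizes) gives the PCC view.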

  14. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  15. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  16. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  17. Multi-level factors influence the implementation and use of complex innovations in cancer care: a multiple case study of synoptic reporting.

    PubMed

    Urquhart, Robin; Porter, Geoffrey A; Sargeant, Joan; Jackson, Lois; Grunfeld, Eva

    2014-09-16

The implementation of innovations (i.e., new tools and practices) in healthcare organizations remains a significant challenge. The objective of this study was to examine the key interpersonal, organizational, and system level factors that influenced the implementation and use of synoptic reporting tools in three specific areas of cancer care. Using case study methodology, we studied three cases in Nova Scotia, Canada, wherein synoptic reporting tools were implemented within clinical departments/programs. Synoptic reporting tools capture and present information about a medical or surgical procedure in a structured, checklist-like format and typically report only items critical for understanding the disease and subsequent impacts on patient care. Data were collected through semi-structured interviews with key informants, document analysis, nonparticipant observation, and tool use/examination. Analysis involved production of case histories, in-depth analysis of each case, and a cross-case analysis. Numerous techniques were used during the research design, data collection, and data analysis stages to increase the rigour of this study. The analysis revealed five common factors that were particularly influential to implementation and use of synoptic reporting tools across the three cases: stakeholder involvement, managing the change process (e.g., building demand, communication, training and support), champions and respected colleagues, administrative and managerial support, and innovation attributes (e.g., complexity, compatibility with interests and values). The direction of influence (facilitating or impeding) of each of these factors differed across and within cases. The findings demonstrate the importance of a multi-level contextual analysis to gaining both breadth and depth to our understanding of innovation implementation and use in health care. They also provide new insights into several important issues under-reported in the literature on moving innovations into healthcare practice, including the role of middle managers in implementation efforts and the importance of attending to the interpersonal aspects of implementation.

  18. ITRB Spar Domestic Source

    DTIC Science & Technology

    2012-12-14

Each pair of rollers is designed to capture the shafts mounted to both ends of the tool lid. Additionally, a safety pin can be put in place to...ITRB for the AH-64D. The scope of the program included structural design, materials selection, manufacturing producibility analysis, tooling design...responsible for tooling design and fabrication, fabrication process development and fabrication of spars and test samples; G3 who designed the RTM

  19. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix and Polymer Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan J.; Walton, Owen J.; Arnold, Steven M.

    2016-01-01

Stochastic-based, discrete-event progressive damage simulations of ceramic-matrix composite and polymer matrix composite material structures have been enabled through the development of a unique multiscale modeling tool. This effort involves coupling three independently developed software programs: (1) the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC), (2) the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program (CARES/Life), and (3) the Abaqus finite element analysis (FEA) program. MAC/GMC contributes multiscale modeling capabilities and micromechanics relations to determine stresses and deformations at the microscale of the composite material repeating unit cell (RUC). CARES/Life contributes statistical multiaxial failure criteria that can be applied to the individual brittle-material constituents of the RUC. Abaqus is used at the global scale to model the overall composite structure. An Abaqus user-defined material (UMAT) interface, referred to here as "FEAMAC/CARES," was developed that enables MAC/GMC and CARES/Life to operate seamlessly with the Abaqus FEA code. For each FEAMAC/CARES simulation trial, the stochastic nature of brittle material strength results in random, discrete damage events, which incrementally progress and lead to ultimate structural failure. This report describes the FEAMAC/CARES methodology and discusses examples that illustrate the performance of the tool. A comprehensive example problem, simulating the progressive damage of laminated ceramic matrix composites under various off-axis loading conditions and including a double notched tensile specimen geometry, is described in a separate report.
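The stochastic-strength idea behind such simulations can be sketched in miniature. This is an illustrative two-parameter Weibull model, not the CARES/Life implementation; the characteristic strength, Weibull modulus, and load schedule below are all invented numbers.

```python
import numpy as np

def sample_strengths(n, sigma0, m, rng):
    """Draw n element strengths from a two-parameter Weibull distribution.

    The failure probability P_f(s) = 1 - exp(-(s/sigma0)**m) is inverted
    with uniform random numbers to sample one strength per element.
    """
    u = rng.random(n)
    return sigma0 * (-np.log(1.0 - u)) ** (1.0 / m)

def progressive_failure(strengths, load_steps):
    """Cumulative count of failed elements as the applied stress ramps up:
    an element fails (a discrete damage event) once the stress reaches
    its sampled strength."""
    return [int(np.sum(strengths <= s)) for s in load_steps]

rng = np.random.default_rng(0)
s = sample_strengths(10_000, sigma0=300.0, m=10.0, rng=rng)
failed = progressive_failure(s, load_steps=[200.0, 300.0, 400.0])
```

Because each trial draws fresh random strengths, repeated trials yield a distribution of damage histories, which is the sense in which the damage progression is stochastic.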

  20. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  1. Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu

    In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.
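One way a cause-and-effect matrix can be organized into a hierarchical structure is longest-path layering of the causal graph: root causes sit at level 0 and each fact sits one level below its deepest cause. This is a sketch of that general technique, not the authors' tool, and the mini-case is invented.

```python
def hierarchy_levels(edges, facts):
    """Assign each fact a level in the causal hierarchy.

    edges are (cause, effect) pairs from the cause-and-effect matrix;
    facts with no causes get level 0, and every other fact gets one
    more than the deepest of its causes (longest-path layering).
    Assumes the causal relationships form a DAG (no cycles).
    """
    causes = {f: [] for f in facts}
    for cause, effect in edges:
        causes[effect].append(cause)

    levels = {}
    def level(f):
        if f not in levels:
            levels[f] = 0 if not causes[f] else 1 + max(level(c) for c in causes[f])
        return levels[f]
    for f in facts:
        level(f)
    return levels

# Invented mini-case: fatigue and time pressure both lead to a skipped
# inspection, which leads to the failure.
facts = ["fatigue", "time pressure", "skipped inspection", "failure"]
edges = [("fatigue", "skipped inspection"),
         ("time pressure", "skipped inspection"),
         ("skipped inspection", "failure")]
lv = hierarchy_levels(edges, facts)
```

Grouping facts by level then gives the layered picture of the case that the tool is described as producing.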

  2. The collaboratory for MS3D: a new cyberinfrastructure for the structural elucidation of biological macromolecules and their assemblies using mass spectrometry-based approaches.

    PubMed

    Yu, Eizadora T; Hawkins, Arie; Kuntz, Irwin D; Rahn, Larry A; Rothfuss, Andrew; Sale, Kenneth; Young, Malin M; Yang, Christine L; Pancerella, Carmen M; Fabris, Daniele

    2008-11-01

Modern biomedical research is evolving with the rapid growth of diverse data types, biophysical characterization methods, computational tools and extensive collaboration among researchers spanning various communities and having complementary backgrounds and expertise. Collaborating researchers are increasingly dependent on shared data and tools made available by other investigators with common interests, thus forming communities that transcend the traditional boundaries of the single research laboratory or institution. Barriers, however, remain to the formation of these virtual communities, usually due to the steep learning curve associated with becoming familiar with new tools, or with the difficulties associated with transferring data between tools. Recognizing the need for shared reference data and analysis tools, we are developing an integrated knowledge environment that supports productive interactions among researchers. Here we report on our current collaborative environment, which focuses on bringing together structural biologists working in the area of mass spectrometry-based methods for the analysis of tertiary and quaternary macromolecular structures (MS3D) called the Collaboratory for MS3D (C-MS3D). C-MS3D is a Web-portal designed to provide collaborators with a shared work environment that integrates data storage and management with data analysis tools. Files are stored and archived along with pertinent meta data in such a way as to allow file handling to be tracked (data provenance) and data files to be searched using keywords and modification dates. While at this time the portal is designed around a specific application, the shared work environment is a general approach to building collaborative work groups. The goal is not only to provide a common data-sharing and archiving system, but also to assist in building new collaborations and to spur the development of new tools and technologies.

  3. Structured illumination microscopy as a diagnostic tool for nephrotic disease

    NASA Astrophysics Data System (ADS)

    Nylk, Jonathan; Pullman, James M.; Campbell, Elaine C.; Gunn-Moore, Frank J.; Prystowsky, Michael B.; Dholakia, Kishan

    2017-02-01

Nephrotic disease is a group of debilitating and sometimes lethal diseases affecting kidney function, specifically the loss of ability to retain vital proteins in the blood while smaller molecules are removed through filtration into the urine. Treatment routes are often dictated by microscopic analysis of kidney biopsies. Podocytes within the glomeruli of the kidney have many interdigitating projections (foot processes), which form the main filtration system. Nephrotic disease is characterised by the loss of this tightly interdigitating substructure, and observation by electron microscopy (EM) is required, as these structures are typically 250-500 nm wide, with 40 nm spacing. Diagnosis by EM is both expensive and time consuming; it can take up to one week to complete the preparation, imaging, and analysis of a single sample. We propose structured illumination microscopy (SIM) as an alternative, optical diagnostic tool. Our results show that SIM can resolve the structure of fluorescent probes tagged to podocin, a protein localised to the periphery of the podocyte foot processes. Three-dimensional podocin maps were acquired in healthy tissue and tissue from patients diagnosed with two different nephrotic disease states: minimal change disease and membranous nephropathy. These structures correlated well with EM images of the same structure. Preparation, imaging, and analysis could be achieved in several hours. Additionally, the volumetric information of the SIM images revealed morphological changes in disease states not observed by EM. This evidence supports the use of SIM as a diagnostic tool for nephrotic disease and can potentially reduce the time and cost per diagnosis.

  4. An Analysis of Organizational Approaches to Online Course Structures

    ERIC Educational Resources Information Center

    Lee, Cheng-Yuan; Dickerson, Jeremy; Winslow, Joe

    2012-01-01

    The structure of an online course, including the navigational interface, visual design of materials and information, as well as the communication tools to facilitate learning, can affect students, instructors, programs and educational organizations in various ways. This paper examines online course structural issues derived from previous research…

  5. Guidelines for the analysis of free energy calculations

    PubMed Central

    Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.

    2015-01-01

Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
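As a minimal illustration of one estimator the review covers, thermodynamic integration reduces to integrating the per-window averages of dU/dλ over λ. The λ schedule and averages below are invented numbers, and the trapezoidal rule stands in for whatever quadrature a real analysis tool would use.

```python
import numpy as np

# Hypothetical per-window averages of <dU/dlambda> from an alchemical
# free energy calculation (both the lambda schedule and the values are
# invented for illustration).
lambdas = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
dudl = np.array([40.0, 22.0, 10.0, 4.0, 1.0])   # kcal/mol

# Thermodynamic integration: Delta G = integral_0^1 <dU/dlambda> dlambda,
# approximated here by the trapezoidal rule over the lambda windows.
widths = lambdas[1:] - lambdas[:-1]
delta_G = float(np.sum(widths * (dudl[1:] + dudl[:-1]) / 2.0))
```

A real analysis would additionally estimate uncertainties (e.g. from block averaging of each window's time series), which is one of the practices the review addresses.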

  6. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  7. MOLEonline: a web-based tool for analyzing channels, tunnels and pores (2018 update).

    PubMed

    Pravda, Lukáš; Sehnal, David; Toušek, Dominik; Navrátilová, Veronika; Bazgier, Václav; Berka, Karel; Svobodová Vareková, Radka; Koca, Jaroslav; Otyepka, Michal

    2018-04-30

    MOLEonline is an interactive, web-based application for the detection and characterization of channels (pores and tunnels) within biomacromolecular structures. The updated version of MOLEonline overcomes limitations of the previous version by incorporating the recently developed LiteMol Viewer visualization engine and providing a simple, fully interactive user experience. The application enables two modes of calculation: one is dedicated to the analysis of channels while the other was specifically designed for transmembrane pores. As the application can use both PDB and mmCIF formats, it can be leveraged to analyze a wide spectrum of biomacromolecular structures, e.g. stemming from NMR, X-ray and cryo-EM techniques. The tool is interconnected with other bioinformatics tools (e.g., PDBe, CSA, ChannelsDB, OPM, UniProt) to help both setup and the analysis of acquired results. MOLEonline provides unprecedented analytics for the detection and structural characterization of channels, as well as information about their numerous physicochemical features. Here we present the application of MOLEonline for structural analyses of α-hemolysin and transient receptor potential mucolipin 1 (TRMP1) pores. The MOLEonline application is freely available via the Internet at https://mole.upol.cz.

  8. Scientific Benchmarks for Guiding Macromolecular Energy Function Improvement

    PubMed Central

    Leaver-Fay, Andrew; O’Meara, Matthew J.; Tyka, Mike; Jacak, Ron; Song, Yifan; Kellogg, Elizabeth H.; Thompson, James; Davis, Ian W.; Pache, Roland A.; Lyskov, Sergey; Gray, Jeffrey J.; Kortemme, Tanja; Richardson, Jane S.; Havranek, James J.; Snoeyink, Jack; Baker, David; Kuhlman, Brian

    2013-01-01

Accurate energy functions are critical to macromolecular modeling and design. We describe new tools for identifying inaccuracies in energy functions and guiding their improvement, and illustrate the application of these tools to improvement of the Rosetta energy function. The feature analysis tool identifies discrepancies between structures deposited in the PDB and low energy structures generated by Rosetta; these likely arise from inaccuracies in the energy function. The optE tool optimizes the weights on the different components of the energy function by maximizing the recapitulation of a wide range of experimental observations. We use the tools to examine three proposed modifications to the Rosetta energy function: improving the unfolded state energy model (reference energies), using bicubic spline interpolation to generate knowledge-based torsional potentials, and incorporating the recently developed Dunbrack 2010 rotamer library (Shapovalov and Dunbrack, 2011). PMID:23422428
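The bicubic-spline idea can be sketched independently of Rosetta. The (φ, ψ) energy table below is a made-up stand-in for a knowledge-based torsion table, and SciPy's `RectBivariateSpline` is used in place of Rosetta's own interpolation; the point is only that a cubic spline in both dimensions gives smooth energies (and derivatives) between tabulated grid points.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Invented stand-in for a knowledge-based torsion energy table on a
# coarse (phi, psi) grid, 30-degree spacing.
phi = np.linspace(-180.0, 180.0, 13)
psi = np.linspace(-180.0, 180.0, 13)
P, S = np.meshgrid(phi, psi, indexing="ij")
energy = np.cos(np.radians(P)) + 0.5 * np.cos(np.radians(S))

# kx=ky=3 makes the interpolation bicubic; s=0 (default) means the
# spline passes exactly through the tabulated values.
spline = RectBivariateSpline(phi, psi, energy, kx=3, ky=3)

e_knot = float(spline(0.0, 0.0)[0, 0])      # recovers the table value
e_interp = float(spline(-60.0, 45.0)[0, 0])  # smooth off-grid evaluation
```

Smoothness between grid points matters for gradient-based minimization, which is a plausible motivation for replacing simpler interpolation with bicubic splines.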

  9. Assessing hospital disaster preparedness: a comparison of an on-site survey, directly observed drill performance, and video analysis of teamwork.

    PubMed

    Kaji, Amy H; Langford, Vinette; Lewis, Roger J

    2008-09-01

There is currently no validated method for assessing hospital disaster preparedness. We determine the degree of correlation between the results of 3 methods for assessing hospital disaster preparedness: administration of an on-site survey, drill observation using a structured evaluation tool, and video analysis of team performance in the hospital incident command center. This was a prospective, observational study conducted during a regional disaster drill, comparing the results from an on-site survey, a structured disaster drill evaluation tool, and a video analysis of teamwork, performed at six 911-receiving hospitals in Los Angeles County, CA. The on-site survey was conducted separately from the drill and assessed hospital disaster plan structure, vendor agreements, modes of communication, medical and surgical supplies, involvement of law enforcement, mutual aid agreements with other facilities, drills and training, surge capacity, decontamination capability, and pharmaceutical stockpiles. The drill evaluation tool, developed by Johns Hopkins University under contract from the Agency for Healthcare Research and Quality, was used to assess various aspects of drill performance, such as the availability of the hospital disaster plan, the geographic configuration of the incident command center, whether drill participants were identifiable, whether the noise level interfered with effective communication, and how often key information (e.g., number of available staffed floor, intensive care, and isolation beds; number of arriving victims; expected triage level of victims; number of potential discharges) was received by the incident command center. Teamwork behaviors in the incident command center were quantitatively assessed, using the MedTeams analysis of the video recordings obtained during the disaster drill. Spearman rank correlations of the results between pair-wise groupings of the 3 assessment methods were calculated. The 3 evaluation methods demonstrated qualitatively different results with respect to each hospital's level of disaster preparedness. The Spearman rank correlation coefficient between the results of the on-site survey and the video analysis of teamwork was -0.34; between the results of the on-site survey and the structured drill evaluation tool, 0.15; and between the results of the video analysis and the drill evaluation tool, 0.82. The disparate results obtained from the 3 methods suggest that each measures distinct aspects of disaster preparedness, and perhaps no single method adequately characterizes overall hospital preparedness.
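The pair-wise comparisons above rest on the Spearman rank correlation, i.e. the Pearson correlation computed on ranks. A minimal tie-free implementation is sketched below; the hospital scores are invented for illustration and are not the study's data.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no tied values assumed).

    Converts each series to ranks via double argsort, then computes the
    Pearson correlation of the centered ranks.  It quantifies how
    consistently two assessment methods order the same hospitals,
    regardless of the scale each method uses.
    """
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2)))

# Invented preparedness scores for six hospitals from two methods.
survey = [72, 85, 60, 90, 78, 66]
video = [65, 80, 55, 88, 70, 75]
rho = spearman_rho(survey, video)
```

Because only the orderings enter the statistic, a high rho means two methods rank the hospitals similarly even if their raw scores differ, which is exactly the question the study asks.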

  10. Artificial neural network prediction of aircraft aeroelastic behavior

    NASA Astrophysics Data System (ADS)

    Pesonen, Urpo Juhani

    An Artificial Neural Network that predicts aeroelastic behavior of aircraft is presented. The neural net was designed to predict the shape of a flexible wing in static flight conditions using results from a structural analysis and an aerodynamic analysis performed with traditional computational tools. To generate reliable training and testing data for the network, an aeroelastic analysis code using these tools as components was designed and validated. To demonstrate the advantages and reliability of Artificial Neural Networks, a network was also designed and trained to predict airfoil maximum lift at low Reynolds numbers where wind tunnel data was used for the training. Finally, a neural net was designed and trained to predict the static aeroelastic behavior of a wing without the need to iterate between the structural and aerodynamic solvers.

  11. A guide to the visual analysis and communication of biomolecular structural data.

    PubMed

    Johnson, Graham T; Hertig, Samuel

    2014-10-01

    Biologists regularly face an increasingly difficult task - to effectively communicate bigger and more complex structural data using an ever-expanding suite of visualization tools. Whether presenting results to peers or educating an outreach audience, a scientist can achieve maximal impact with minimal production time by systematically identifying an audience's needs, planning solutions from a variety of visual communication techniques and then applying the most appropriate software tools. A guide to available resources that range from software tools to professional illustrators can help researchers to generate better figures and presentations tailored to any audience's needs, and enable artistically inclined scientists to create captivating outreach imagery.

  12. [New methods for the evaluation of bone quality. Assessment of bone structural property using imaging].

    PubMed

    Ito, Masako

Structural properties of bone include the micro- and nano-structural properties of trabecular and cortical bone, as well as macroscopic geometry. Radiological techniques are useful for analyzing bone structural properties; multi-detector row CT (MDCT) or high-resolution peripheral QCT (HR-pQCT) is available to analyze human bone in vivo. For the analysis of hip geometry, CT-based hip structure analysis (HSA) is available as well as DXA-based HSA. These structural parameters are related to biomechanical properties, and these assessment tools provide information on pathological changes and on the effects of anti-osteoporotic agents on bone.

  13. Effects of various tool pin profiles on mechanical and metallurgical properties of friction stir welded joints of cryorolled AA2219 aluminium alloy

    NASA Astrophysics Data System (ADS)

    Kamal Babu, Karupannan; Panneerselvam, Kavan; Sathiya, Paulraj; Noorul Haq, Abdul Haq; Sundarrajan, Srinivasan; Mastanaiah, Potta; Srinivasa Murthy, Chunduri Venkata

    2018-02-01

Friction stir welding (FSW) was conducted on cryorolled (CR) AA2219 plates using different tool pin profiles: cylindrical, threaded cylindrical, square and hexagonal. The FSW was carried out on pairs of 6 mm thick CR aluminium plates with each tool pin profile, and the mechanical behavior (tensile strength, impact toughness and hardness) and metallurgical characteristics of the resulting welds were analyzed. The mechanical analysis revealed that the joint made with the hexagonal pin tool had higher strength than those made with the other pin profiles. This was attributed to the pulsating action and material flow of the tool, which resulted in dynamic recrystallization in the weld zone, and was confirmed by the formation of an ultra-fine grain structure in the Weld Nugget (WN) of the hexagonal pin tool joint with a higher percentage of precipitate dissolution. The fractograph of the hexagonal pin tool weld confirmed a finer dimple morphology without any interior defects compared to the other tool pin profiles. The lowest weld joint strength was obtained with the cylindrical pin profile, owing to insufficient material flow during welding. Transmission Electron Microscope and EDX analyses showed the dissolution of the metastable θ″ and θ' (Al2Cu) partial precipitates in the WN and demonstrated the influence of metastable precipitates on the enhancement of the weld's mechanical behavior. The XRD results also confirmed Al2Cu precipitate dissolution in the weld zone.

  14. [Short interspersed repetitive sequences (SINEs) and their use as a phylogenetic tool].

    PubMed

    Kramerov, D A; Vasetskiĭ, N S

    2009-01-01

The data on one of the most common repetitive elements of eukaryotic genomes, short interspersed elements (SINEs), are reviewed. Their structure, origin, and functioning in the genome are discussed. The variation and abundance of these neutral genomic markers make them a convenient and reliable tool for phylogenetic analysis. The main methods of such analysis are presented, and the potential and limitations of this approach are discussed using specific examples.

  15. System data communication structures for active-control transport aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
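A Monte Carlo sketch of the kind of reliability/connectivity analysis such tools perform is given below. The topology, failure probability, and trial count are invented for illustration; this is a generic technique, not the tools described in the report.

```python
import random

def connected(nodes, links, failed):
    """Breadth-first search over surviving links: is every node still
    reachable from the first node when the links in `failed` are down?"""
    alive = [l for l in links if l not in failed]
    seen, frontier = {nodes[0]}, [nodes[0]]
    while frontier:
        n = frontier.pop()
        for a, b in alive:
            for u, v in ((a, b), (b, a)):
                if u == n and v not in seen:
                    seen.add(v)
                    frontier.append(v)
    return len(seen) == len(nodes)

def mesh_reliability(nodes, links, p_fail, trials, seed=0):
    """Monte Carlo estimate of the probability that the mesh network
    stays fully connected when each link fails independently with
    probability p_fail."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        failed = {l for l in links if rng.random() < p_fail}
        ok += connected(nodes, links, failed)
    return ok / trials

# Invented 4-node mesh: a ring plus one cross-link; each link fails
# independently with probability 0.05.
nodes = ["A", "B", "C", "D"]
links = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("A", "C")]
r = mesh_reliability(nodes, links, p_fail=0.05, trials=2000)
```

Adding the cross-link raises the estimated reliability relative to the bare ring, which is the kind of trade-off a network-design analysis tool would quantify.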

  16. Analysis about diamond tool wear in nano-metric cutting of single crystal silicon using molecular dynamics method

    NASA Astrophysics Data System (ADS)

    Wang, Zhiguo; Liang, Yingchun; Chen, Mingjun; Tong, Zhen; Chen, Jiaxuan

    2010-10-01

Tool wear not only degrades a tool's geometric accuracy and integrity, but also decreases the machining precision and surface integrity of the workpiece, affecting the workpiece's performance and service life in ultra-precision machining. Researchers have carried out many experimental studies and simulation analyses, but large disagreements remain about the wear mechanism, especially at the nanoscale. In this paper, a three-dimensional simulation model is built to simulate nanometric cutting of single crystal silicon with a non-rigid right-angle diamond tool with 0° rake angle and 0° clearance angle using the molecular dynamics (MD) simulation approach, which is used to investigate diamond tool wear during the nanometric cutting process. A Tersoff potential is employed for the interactions between carbon-carbon, silicon-silicon and carbon-silicon atoms. Because the tool experiences high alternating shear stress, wear first appears at the cutting edge, where strength is low. At the corner, the tool splits along the {1 1 1} crystal plane, forming edge chipping (tipping). Wear at the flank face is a structural transformation of diamond, in which the diamond structure transforms into a sheet graphite structure. Owing to tool wear, the cutting force increases.

  17. PROVAT: a tool for Voronoi tessellation analysis of protein structures and complexes.

    PubMed

    Gore, Swanand P; Burke, David F; Blundell, Tom L

    2005-08-01

    Voronoi tessellation has proved to be a useful tool in protein structure analysis. We have developed PROVAT, a versatile public domain software that enables computation and visualization of Voronoi tessellations of proteins and protein complexes. It is a set of Python scripts that integrate freely available specialized software (Qhull, Pymol etc.) into a pipeline. The calculation component of the tool computes Voronoi tessellation of a given protein system in a way described by a user-supplied XML recipe and stores resulting neighbourhood information as text files with various styles. The Python pickle file generated in the process is used by the visualization component, a Pymol plug-in, that offers a GUI to explore the tessellation visually. PROVAT source code can be downloaded from http://raven.bioc.cam.ac.uk/~swanand/Provat1, which also provides a webserver for its calculation component, documentation and examples.
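The core neighbourhood computation can be sketched with SciPy's bindings to the Qhull library (the same library PROVAT integrates; this toy 2-D example is not PROVAT itself). Two points are Voronoi neighbours exactly when their cells share a face, i.e. when an edge of the dual Delaunay triangulation joins them.

```python
import numpy as np
from scipy.spatial import Delaunay

def voronoi_neighbours(points):
    """Return the set of Voronoi-neighbour index pairs of a point set.

    Voronoi neighbourhood is read off the dual Delaunay triangulation
    (computed via Qhull): every edge of a Delaunay simplex joins a pair
    of points whose Voronoi cells share a face.
    """
    tri = Delaunay(points)
    pairs = set()
    for simplex in tri.simplices:
        for i in simplex:
            for j in simplex:
                if i < j:
                    pairs.add((int(i), int(j)))
    return pairs

# Toy "atoms": the four corners of a square plus its centre.  The
# centre point is a Voronoi neighbour of all four corners.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
pairs = voronoi_neighbours(pts)
```

For protein structures the input would be (selections of) atomic coordinates in 3-D, and the neighbour pairs would then be stored with the kind of per-contact metadata PROVAT writes out.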

  18. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  19. Visualizing vascular structures in virtual environments

    NASA Astrophysics Data System (ADS)

    Wischgoll, Thomas

    2013-01-01

    In order to learn more about the cause of coronary heart diseases and develop diagnostic tools, the extraction and visualization of vascular structures from volumetric scans for further analysis is an important step. By determining a geometric representation of the vasculature, the geometry can be inspected and additional quantitative data calculated and incorporated into the visualization of the vasculature. To provide a more user-friendly visualization tool, virtual environment paradigms can be utilized. This paper describes techniques for interactive rendering of large-scale vascular structures within virtual environments. This can be applied to almost any virtual environment configuration, such as CAVE-type displays. Specifically, the tools presented in this paper were tested on a Barco I-Space and a large 62x108 inch passive projection screen with a Kinect sensor for user tracking.

  20. Analysis and control on changeable wheel tool system of hybrid grinding and polishing machine tool for blade finishing

    NASA Astrophysics Data System (ADS)

    He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji

    2017-01-01

    Blades are key components in energy and power equipment such as turbines and aircraft engines, and research on processes and equipment for blade finishing is both important and difficult. To precisely control the tool system of a developed hybrid grinding and polishing machine tool for blade finishing, the changeable-wheel tool system for belt polishing is analyzed in this paper. Firstly, the belt length and the wrap angle of each wheel are analyzed for different tension-wheel swing angles during the wheel-changing process. The reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion amount of the contact wheel are obtained. Then, the control system for the changeable-wheel tool structure is developed. Lastly, the surface roughness achieved in blade finishing is verified by experiments. Theoretical analysis and experimental results show that the proposed analysis method yields a reasonable belt length and wheel wrap angles, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of the blade after grinding meets the design requirements.
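
    The abstract does not give the authors' multi-wheel belt-length calculation; as a hedged stand-in, the classic machine-design approximation for an open belt around two pulleys of diameters D and d at centre distance C is L ≈ 2C + π(D + d)/2 + (D − d)²/(4C):

```python
import math

def open_belt_length(D, d, C):
    """Approximate open-belt length for two pulleys.

    D, d : pulley diameters; C : centre distance.
    Standard approximation; the multi-wheel system in the paper would sum
    tangent segments and wrap-angle arcs over all wheels instead.
    """
    return 2 * C + math.pi * (D + d) / 2 + (D - d) ** 2 / (4 * C)
```

    With equal pulleys the correction term vanishes and the length reduces to 2C + πD, which makes a convenient sanity check.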

  1. AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.

  2. modPDZpep: a web resource for structure based analysis of human PDZ-mediated interaction networks.

    PubMed

    Sain, Neetu; Mohanty, Debasisa

    2016-09-21

    PDZ domains recognize short sequence stretches, usually at the C-terminus of their interaction partners. Because of the involvement of PDZ domains in many important biological processes, several attempts have been made to develop bioinformatics tools for genome-wide identification of PDZ interaction networks. Currently available tools for predicting the interaction partners of PDZ domains use a machine learning approach. Since they have been trained on experimental substrate-specificity data for specific PDZ families, their applicability is limited to PDZ families closely related to the training set. These tools also do not allow analysis of PDZ-peptide interaction interfaces. We have used a structure-based approach to develop modPDZpep, a program to predict the interaction partners of human PDZ domains and analyze the structural details of PDZ interaction interfaces. modPDZpep predicts interaction partners by using structural models of PDZ-peptide complexes and evaluating binding energy scores using residue-based statistical pair potentials. Since it does not require training on experimental peptide-binding affinity data, it can predict substrates for diverse PDZ families. Because of the use of a simple scoring function for binding energy, it is also fast enough for genome-scale structure-based analysis of PDZ interaction networks. Benchmarking using artificial as well as real negative datasets indicates good predictive power, with ROC-AUC values in the range of 0.7 to 0.9 for a large number of human PDZ domains. Another novel feature of modPDZpep is its ability to map novel PDZ-mediated interactions in human protein-protein interaction networks, either by utilizing available experimental phage display data or by structure-based predictions. In summary, we have developed modPDZpep, a web-server for structure-based analysis of human PDZ domains. It is freely available at http://www.nii.ac.in/modPDZpep.html or http://202.54.226.235/modPDZpep.html.
This article was reviewed by Michael Gromiha and Zoltán Gáspári.
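
    The residue-based statistical pair potentials mentioned above are typically knowledge-based scores of the form -ln(f_obs/f_ref) summed over interface contacts. The sketch below illustrates that idea with an entirely hypothetical frequency table; the actual modPDZpep potential and its parameters are not reproduced here:

```python
import math

# Hypothetical contact frequencies; a real potential would be derived from
# residue-pair contact statistics observed in known complexes.
PAIR_FREQ = {("LEU", "VAL"): 0.04, ("ASP", "ARG"): 0.06, ("GLY", "GLY"): 0.01}
REF_FREQ = 0.02  # expected frequency under a uniform reference state

def pair_potential_score(contacts):
    """Sum -ln(f_obs / f_ref) over residue-residue interface contacts.

    Lower (more negative) scores correspond to more favourable interfaces,
    as is conventional for knowledge-based potentials.
    """
    score = 0.0
    for a, b in contacts:
        f = PAIR_FREQ.get((a, b)) or PAIR_FREQ.get((b, a), REF_FREQ)
        score += -math.log(f / REF_FREQ)
    return score
```

    Scoring candidate peptides this way needs no affinity training data, which is why such structure-based schemes generalize across families.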

  3. [Factor structure of regional CBF and CMRglu values as a tool for the study of default mode of the brain].

    PubMed

    Kataev, G V; Korotkov, A D; Kireev, M V; Medvedev, S V

    2013-01-01

    In the present article it is shown that the functional connectivity of brain structures, revealed by factor analysis of resting PET rCBF and rCMRglu data, is an adequate tool for studying the default mode of the human brain. The identification of the neuroanatomic systems of the default mode (the default mode network) during routine clinical PET investigations is important for further studying the functional organization of the normal brain and its reorganization in pathological conditions.

  4. Aeroelastic Optimization of Generalized Tube and Wing Aircraft Concepts Using HCDstruct Version 2.0

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2017-01-01

    Major enhancements were made to the Higher-fidelity Conceptual Design and structural optimization (HCDstruct) tool developed at NASA Langley Research Center (LaRC). Whereas previous versions were limited to hybrid wing body (HWB) configurations, the current version of HCDstruct now supports the analysis of generalized tube and wing (TW) aircraft concepts. Along with significantly enhanced user input options for all aircraft configurations, these enhancements represent HCDstruct version 2.0. Validation was performed using a Boeing 737-200 aircraft model, for which primary structure weight estimates agreed well with available data. Additionally, preliminary analysis of the NASA D8 (ND8) aircraft concept was performed, highlighting several new features of the tool.

  5. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  6. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2011-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  7. Numerical analysis of the performance of rock weirs: Effects of structure configuration on local hydraulics

    USGS Publications Warehouse

    Holmquist-Johnson, C. L.

    2009-01-01

    River spanning rock structures are being constructed for water delivery as well as to enable fish passage at barriers and to provide or improve aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs are not well understood. Without an accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near-bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases, giving an increased range of applicability for developing design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will be tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. © 2009 ASCE.

  8. New software for statistical analysis of Cambridge Structural Database data

    PubMed Central

    Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.

    2011-01-01

    A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
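
    Principal components analysis of the kind described above reduces a table of geometric parameters (one row per CSD fragment) to a few dominant modes of variation. A minimal sketch via SVD of the mean-centred data matrix (not the Mercury implementation) might look like:

```python
import numpy as np

def principal_components(data):
    """PCA via SVD of the mean-centred data matrix.

    Rows = fragments, columns = geometric parameters (bond lengths, angles...).
    Returns (explained variances, component directions as rows).
    """
    X = data - data.mean(axis=0)
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    variances = s**2 / (len(data) - 1)
    return variances, vt
```

    Plotting fragments in the space of the first two components is a common way to spot conformational clusters in CSD search results.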

  9. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is in the process of an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  10. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    NASA Astrophysics Data System (ADS)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
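
    One of the properties GCAT tests, comma-freeness, has a compact definition: a codon set X is comma-free if no out-of-frame 3-letter window of any concatenation x+y (x, y in X) is itself in X. A minimal check (an independent sketch, not GCAT's Java implementation) is:

```python
from itertools import product

def is_comma_free(codons):
    """Test whether a set of trinucleotide codons is comma-free.

    For every ordered pair (x, y), the two frame-shifted windows of the
    6-letter string x+y must not belong to the set.
    """
    X = set(codons)
    for x, y in product(X, repeat=2):
        s = x + y
        if s[1:4] in X or s[2:5] in X:
            return False
    return True
```

    Comma-free codes allow the reading frame to be recovered without punctuation, which is why such properties are of interest for the structure of the genetic code.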

  11. Surveying the factor structure and reliability of the Persian version of the Jefferson Scale of Physician Lifelong Learning (JeffSPLL) in staff of medical sciences.

    PubMed

    Karimi, Fatemeh Zahra; Alesheikh, Aytay; Pakravan, Soheila; Abdollahi, Mahbubeh; Damough, Mozhdeh; Anbaran, Zahra Khosravi; Farahani, Leila Amiri

    2017-10-01

    In medical sciences, commitment to lifelong learning has been identified as an important element. Today, due to the rapid development of medical information and technology, lifelong learning is critical for safe medical care and for progress in medical research. JeffSPLL is one of the scales for measuring lifelong learning among the staff of medical sciences, and it had never been used in Iran. The aim of the present study was to determine the factor structure and reliability of the Persian version of JeffSPLL among Persian-speaking staff of universities of medical sciences in Iran. Methodologically, this was a cross-sectional study, conducted in 2012-2013. In this study, 210 staff members of Birjand University of Medical Sciences were selected. The data collection tool was the Persian version of JeffSPLL. To investigate the factor structure of this tool, confirmatory factor analysis was used; to evaluate model fit, goodness-of-fit indices were used, including the root mean square error of approximation (RMSEA), the ratio of chi-square to its associated degrees of freedom, the comparative fit index (CFI), and the root mean square residual (RMR). To investigate the reliability of the tool, Cronbach's alpha was employed. Data analysis was conducted using LISREL 8.8 and SPSS 20 software. Confirmatory factor analysis showed that RMSEA was close to 0.1, and CFI and GFI were close to one. Therefore, the four-factor model was appropriate. Cronbach's alpha was 0.92 for the whole tool and between 0.82 and 0.89 for the subscales. The present study verified the four-factor structure of the 19-item Persian version of JeffSPLL, comprising professional learning beliefs and motivation, scholarly activities, attention to learning opportunities, and technical skills in information seeking among the staff. In addition, this tool has acceptable reliability. Therefore, it is appropriate for assessing lifelong learning in the Persian-speaking staff population.
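
    Of the fit indices named above, RMSEA has a simple closed form given the model chi-square, its degrees of freedom and the sample size: RMSEA = sqrt(max(χ² − df, 0) / (df·(N − 1))). A small sketch (the χ² value below is illustrative, not taken from the study):

```python
import math

def rmsea(chi_sq, df, n):
    """Root mean square error of approximation for a fitted CFA model.

    chi_sq : model chi-square statistic
    df     : model degrees of freedom
    n      : sample size
    """
    return math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))
```

    Values near zero indicate close fit; a perfectly fitting model (χ² ≤ df) yields exactly zero, while values around 0.1, as reported in the study, sit at the boundary of acceptable fit.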

  12. Developments in the CCP4 molecular-graphics project.

    PubMed

    Potterton, Liz; McNicholas, Stuart; Krissinel, Eugene; Gruber, Jan; Cowtan, Kevin; Emsley, Paul; Murshudov, Garib N; Cohen, Serge; Perrakis, Anastassis; Noble, Martin

    2004-12-01

    Progress towards structure determination that is both high-throughput and high-value is dependent on the development of integrated and automatic tools for electron-density map interpretation and for the analysis of the resulting atomic models. Advances in map-interpretation algorithms are extending the resolution regime in which fully automatic tools can work reliably, but at present human intervention is required to interpret poor regions of macromolecular electron density, particularly where crystallographic data is only available to modest resolution [for example, I/sigma(I) < 2.0 for minimum resolution 2.5 Å]. In such cases, a set of manual and semi-manual model-building molecular-graphics tools is needed. At the same time, converting the knowledge encapsulated in a molecular structure into understanding is dependent upon visualization tools, which must be able to communicate that understanding to others by means of both static and dynamic representations. CCP4mg is a program designed to meet these needs in a way that is closely integrated with the ongoing development of CCP4 as a program suite suitable for both low- and high-intervention computational structural biology. As well as providing a carefully designed user interface to advanced algorithms of model building and analysis, CCP4mg is intended to present a graphical toolkit to developers of novel algorithms in these fields.

  13. An overview of tools for the validation of protein NMR structures.

    PubMed

    Vuister, Geerten W; Fogh, Rasmus H; Hendrickx, Pieter M S; Doreleijers, Jurgen F; Gutmanas, Aleksandras

    2014-04-01

    Biomolecular structures at atomic resolution present a valuable resource for the understanding of biology. NMR spectroscopy accounts for 11% of all structures in the PDB repository. In response to serious problems with the accuracy of some of the NMR-derived structures and in order to facilitate proper analysis of the experimental models, a number of program suites are available. We discuss nine of these tools in this review: PROCHECK-NMR, PSVS, GLM-RMSD, CING, Molprobity, Vivaldi, ResProx, NMR constraints analyzer and QMEAN. We evaluate these programs for their ability to assess the structural quality, restraints and their violations, chemical shifts, peaks and the handling of multi-model NMR ensembles. We document both the input required by the programs and output they generate. To discuss their relative merits we have applied the tools to two representative examples from the PDB: a small, globular monomeric protein (Staphylococcal nuclease from S. aureus, PDB entry 2kq3) and a small, symmetric homodimeric protein (a region of human myosin-X, PDB entry 2lw9).

  14. Fast-NPS-A Markov Chain Monte Carlo-based analysis tool to obtain structural information from single-molecule FRET measurements

    NASA Astrophysics Data System (ADS)

    Eilert, Tobias; Beckers, Maximilian; Drechsler, Florian; Michaelis, Jens

    2017-10-01

    The analysis tool and software package Fast-NPS can be used to analyse smFRET data to obtain quantitative structural information about macromolecules in their natural environment. In the algorithm a Bayesian model gives rise to a multivariate probability distribution describing the uncertainty of the structure determination. Since Fast-NPS aims to be an easy-to-use general-purpose analysis tool for a large variety of smFRET networks, we established an MCMC based sampling engine that approximates the target distribution and requires no parameter specification by the user at all. For an efficient local exploration we automatically adapt the multivariate proposal kernel according to the shape of the target distribution. In order to handle multimodality, the sampler is equipped with a parallel tempering scheme that is fully adaptive with respect to temperature spacing and number of chains. Since the molecular surrounding of a dye molecule affects its spatial mobility and thus the smFRET efficiency, we introduce dye models which can be selected for every dye molecule individually. These models allow the user to represent the smFRET network in great detail leading to an increased localisation precision. Finally, a tool to validate the chosen model combination is provided.
    Programme files doi: http://dx.doi.org/10.17632/7ztzj63r68.1
    Licensing provisions: Apache-2.0
    Programming language: GUI in MATLAB (The MathWorks); core sampling engine in C++
    Nature of problem: Sampling of highly diverse multivariate probability distributions in order to solve for macromolecular structures from smFRET data.
    Solution method: MCMC algorithm with fully adaptive proposal kernel and parallel tempering scheme.
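
    The combination of within-chain Metropolis updates and temperature-swap moves described above can be sketched in a few lines. This is a deliberately minimal illustration of parallel tempering on a one-dimensional target (fixed temperatures, no kernel adaptation), not the Fast-NPS engine:

```python
import numpy as np

def tempered_metropolis(logp, n_steps, betas=(1.0, 0.5, 0.25), step=1.0, seed=0):
    """Metropolis sampler with a parallel-tempering swap move.

    logp  : log of the (unnormalised) target density
    betas : inverse temperatures, betas[0] = 1 is the chain we keep
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(len(betas))               # one scalar chain per temperature
    samples = []
    for _ in range(n_steps):
        for k, b in enumerate(betas):      # within-chain Metropolis updates
            prop = x[k] + step * rng.normal()
            if np.log(rng.random()) < b * (logp(prop) - logp(x[k])):
                x[k] = prop
        k = rng.integers(len(betas) - 1)   # propose swapping adjacent chains
        d = (betas[k] - betas[k + 1]) * (logp(x[k + 1]) - logp(x[k]))
        if np.log(rng.random()) < d:
            x[k], x[k + 1] = x[k + 1], x[k]
        samples.append(x[0])               # record the cold chain only
    return np.array(samples)
```

    The hotter chains explore broadly and feed distant modes to the cold chain through swaps, which is what makes the scheme robust to the multimodal posteriors that arise in smFRET structure determination.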

  15. PDBe: Protein Data Bank in Europe

    PubMed Central

    Velankar, S.; Alhroub, Y.; Best, C.; Caboche, S.; Conroy, M. J.; Dana, J. M.; Fernandez Montecelo, M. A.; van Ginkel, G.; Golovin, A.; Gore, S. P.; Gutmanas, A.; Haslam, P.; Hendrickx, P. M. S.; Heuson, E.; Hirshberg, M.; John, M.; Lagerstedt, I.; Mir, S.; Newman, L. E.; Oldfield, T. J.; Patwardhan, A.; Rinaldi, L.; Sahni, G.; Sanz-García, E.; Sen, S.; Slowley, R.; Suarez-Uruena, A.; Swaminathan, G. J.; Symmons, M. F.; Vranken, W. F.; Wainwright, M.; Kleywegt, G. J.

    2012-01-01

    The Protein Data Bank in Europe (PDBe; pdbe.org) is a partner in the Worldwide PDB organization (wwPDB; wwpdb.org) and as such actively involved in managing the single global archive of biomacromolecular structure data, the PDB. In addition, PDBe develops tools, services and resources to make structure-related data more accessible to the biomedical community. Here we describe recently developed, extended or improved services, including an animated structure-presentation widget (PDBportfolio), a widget to graphically display the coverage of any UniProt sequence in the PDB (UniPDB), chemistry- and taxonomy-based PDB-archive browsers (PDBeXplore), and a tool for interactive visualization of NMR structures, corresponding experimental data as well as validation and analysis results (Vivaldi). PMID:22110033

  16. LOOS: an extensible platform for the structural analysis of simulations.

    PubMed

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and Mac OS X, but is written to be portable and should work on most Unix-based platforms.
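
    The effect of a C-style selection expression such as name == "CA" && resid < 100 can be mimicked with a simple predicate filter. The sketch below is an illustration of the idea in Python, with made-up atom fields; it is not the LOOS API:

```python
from dataclasses import dataclass

@dataclass
class Atom:
    # Illustrative fields only; real trajectory atoms carry many more.
    name: str
    resid: int
    segid: str

def select(atoms, predicate):
    """Return the atoms for which the predicate expression is true."""
    return [a for a in atoms if predicate(a)]

atoms = [Atom("CA", 10, "A"), Atom("CB", 10, "A"), Atom("CA", 250, "B")]
# analogue of the selection string: name == "CA" && resid < 100
calpha_early = select(atoms, lambda a: a.name == "CA" and a.resid < 100)
```

    In LOOS itself the selection string is parsed into such a predicate once and then applied across every frame of a trajectory.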

  17. Advances in structural and functional analysis of membrane proteins by electron crystallography

    PubMed Central

    Wisedchaisri, Goragot; Reichow, Steve L.; Gonen, Tamir

    2011-01-01

    Summary Electron crystallography is a powerful technique for the study of membrane protein structure and function in the lipid environment. When well-ordered two-dimensional crystals are obtained the structure of both protein and lipid can be determined and lipid-protein interactions analyzed. Protons and ionic charges can be visualized by electron crystallography and the protein of interest can be captured for structural analysis in a variety of physiologically distinct states. This review highlights the strengths of electron crystallography and the momentum that is building up in automation and the development of high throughput tools and methods for structural and functional analysis of membrane proteins by electron crystallography. PMID:22000511

  18. Advances in structural and functional analysis of membrane proteins by electron crystallography.

    PubMed

    Wisedchaisri, Goragot; Reichow, Steve L; Gonen, Tamir

    2011-10-12

    Electron crystallography is a powerful technique for the study of membrane protein structure and function in the lipid environment. When well-ordered two-dimensional crystals are obtained the structure of both protein and lipid can be determined and lipid-protein interactions analyzed. Protons and ionic charges can be visualized by electron crystallography and the protein of interest can be captured for structural analysis in a variety of physiologically distinct states. This review highlights the strengths of electron crystallography and the momentum that is building up in automation and the development of high throughput tools and methods for structural and functional analysis of membrane proteins by electron crystallography. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. A magnetic and gravity investigation of the Liberia Basin, West Africa

    NASA Astrophysics Data System (ADS)

    Morris Cooper, S.; Liu, Tianyou

    2011-02-01

    Gravity and magnetic analyses provide an opportunity to deduce and understand, to a large extent, the stratigraphy, structure and shape of the substructure. Euler deconvolution is a useful tool for estimating the locations and depths of magnetic and gravity sources. Wavelet analysis is an effective tool for filtering and improving geophysical data. The application of these two methods to gravity and magnetic data of the Liberia Basin enables the definition of the geometry and depth of the subsurface geologic structures. The study reveals that the basin is sub-divided, and that the depth to basement of the basin structure ranges from about 5 km at its north-west end to 10 km at its broadest section eastward. Magnetic data analysis indicates shallow intrusives ranging in depth from 0.09 km to 0.42 km, with an average depth of 0.25 km along the margin. Other intrusives are found at average depths of 0.6 km and 1.7 km within the confines of the basin. Analysis of the gravity data indicates deep faults intersecting the transform zone.
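
    Euler deconvolution rests on Euler's homogeneity equation, (x − x0)∂f/∂x + (y − y0)∂f/∂y + (z − z0)∂f/∂z = N(B − f), solved in the least-squares sense for the source position (x0, y0, z0) and regional field B, given the structural index N. A minimal sketch (a single-window solver on synthetic data, not the processing chain used in the paper):

```python
import numpy as np

def euler_deconvolution(xyz, f, grads, N):
    """Least-squares solution of Euler's homogeneity equation.

    xyz   : (n, 3) observation coordinates
    f     : (n,) field values
    grads : (n, 3) field gradients at the observation points
    N     : structural index of the assumed source type
    Returns (source position, regional field B).
    """
    A = np.column_stack([grads, np.full(len(f), float(N))])
    b = np.sum(xyz * grads, axis=1) + N * f
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3], sol[3]
```

    In practice the equation is solved in a sliding window over the gridded survey, and the cloud of window solutions is clustered to map source locations and depths.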

  20. PDB2Graph: A toolbox for identifying critical amino acids map in proteins based on graph theory.

    PubMed

    Niknam, Niloofar; Khakzad, Hamed; Arab, Seyed Shahriar; Naderi-Manesh, Hossein

    2016-05-01

    The integrative and cooperative nature of protein structure involves the assessment of topological and global features of constituent parts. Network concept takes complete advantage of both of these properties in the analysis concomitantly. High compatibility to structural concepts or physicochemical properties in addition to exploiting a remarkable simplification in the system has made network an ideal tool to explore biological systems. There are numerous examples in which different protein structural and functional characteristics have been clarified by the network approach. Here, we present an interactive and user-friendly Matlab-based toolbox, PDB2Graph, devoted to protein structure network construction, visualization, and analysis. Moreover, PDB2Graph is an appropriate tool for identifying critical nodes involved in protein structural robustness and function based on centrality indices. It maps critical amino acids in protein networks and can greatly aid structural biologists in selecting proper amino acid candidates for manipulating protein structures in a more reasonable and rational manner. To introduce the capability and efficiency of PDB2Graph in detail, the structural modification of Calmodulin through allosteric binding of Ca(2+) is considered. In addition, a mutational analysis for three well-identified model proteins including Phage T4 lysozyme, Barnase and Ribonuclease HI, was performed to inspect the influence of mutating important central residues on protein activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
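
    Centrality indices of the kind PDB2Graph computes rank residues by their position in the contact network. The simplest, normalised degree centrality, can be sketched without any toolbox (this is an independent Python illustration, not the Matlab code):

```python
def degree_centrality(adjacency):
    """Normalised degree centrality for a residue-contact graph.

    adjacency : {node: set of neighbouring nodes}
    A node touching every other node scores 1.0; high-centrality residues
    are candidates for structurally or functionally critical positions.
    """
    n = len(adjacency)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adjacency.items()}
```

    Betweenness and closeness centralities, which PDB2Graph also reports, weight residues by their role on shortest paths rather than by raw contact count.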

  1. Efficient simulation of press hardening process through integrated structural and CFD analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek

    Press hardened steel parts are being increasingly used in automotive structures for their higher strength, to meet safety standards while reducing vehicle weight to improve fuel consumption. However, manufacturing sheet metal parts by the press hardening process to achieve desired properties is extremely challenging, as it involves complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and for understanding the interaction among the numerous process parameters, in order to control the press hardening process and consistently achieve desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process: forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, and heat loss from the tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite, involving RADIOSS®, a non-linear finite element based structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares Finite Element Method, has been utilized to develop an efficient solution for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, heat transfer between the blank and tool, and microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through the water channels in the tools. The approach is demonstrated through some case studies.
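
    The blank-to-tool heat transfer that drives the quench can be illustrated with a one-dimensional explicit finite-difference model. This is a toy sketch with made-up parameters (fixed-temperature tool contact at both faces), not the coupled RADIOSS/AcuSolve model:

```python
import numpy as np

def cool_blank(T, alpha, dx, dt, steps, T_tool):
    """Explicit 1-D conduction: a hot blank cooled by tool contact.

    T      : initial temperature profile across the blank thickness
    alpha  : thermal diffusivity; stability requires alpha*dt/dx**2 <= 0.5
    T_tool : tool temperature imposed at both contact faces
    """
    T = T.copy()
    r = alpha * dt / dx**2
    for _ in range(steps):
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0] = T[-1] = T_tool   # tool held at constant temperature
    return T
```

    A real press-hardening model replaces the fixed boundary with a contact conductance to the tool, whose own temperature is set by the water-channel flow solved on the CFD side.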

  2. Data mining and visualization from planetary missions: the VESPA-Europlanet2020 activity

    NASA Astrophysics Data System (ADS)

    Longobardo, Andrea; Capria, Maria Teresa; Zinzi, Angelo; Ivanovski, Stavro; Giardino, Marco; di Persio, Giuseppe; Fonte, Sergio; Palomba, Ernesto; Antonelli, Lucio Angelo; Giommi, Paolo; Europlanet VESPA 2020 Team

    2017-06-01

    This paper presents the VESPA (Virtual European Solar and Planetary Access) activity, developed in the context of the Europlanet 2020 Horizon project, aimed at providing tools for the analysis and visualization of planetary data provided by space missions. In particular, the activity is focused on minor bodies of the Solar System. The structure of the computation node, the algorithms developed for the analysis of planetary surfaces and cometary comae, and the tools for data visualization are presented.

  3. Mechanical Response Analysis of Long-life Asphalt Pavement Structure of Yunluo High-speed on the Semi-rigid Base

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Wu, Chuanhai; Xu, Xinquan; Li, Hao; Wang, Zhixiang

    2018-01-01

    To grasp how strain changes in a semi-rigid asphalt pavement structure under FWD loading, and to provide a reliable theoretical and practical basis for pavement structure design, the test section of the Guangdong Yunluo expressway was studied with the FWD as the loading tool. Using the finite element analysis software ANSYS, the internal variation rules of each pavement structural layer were obtained. Based on the results of the theoretical analysis, strain sensors were installed in the corresponding layers of the pavement structure, and a strain test plan was determined. Analysis of the strain data obtained from several structural layers and from field monitoring verified the rationality of both the pavement structure type and the strain test scheme, providing useful support for the design and maintenance of the pavement structure.

  4. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN-based Modal Correlation Tools

    NASA Technical Reports Server (NTRS)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
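
    The cross-orthogonality check that TACT computes through NASTRAN weights mode shapes by the mass matrix; its mass-free cousin, the Modal Assurance Criterion (MAC), conveys the same correlation idea and is easy to sketch. The mode shapes below are hypothetical accelerometer readings, not data from the cited wind tunnel model:

```python
def mac(phi_a, phi_t):
    """Modal Assurance Criterion between one analysis and one test mode shape.
    MAC = (phi_a . phi_t)^2 / ((phi_a . phi_a) * (phi_t . phi_t));
    1.0 means identically shaped modes, values near 0 mean unrelated modes."""
    dot = sum(a * t for a, t in zip(phi_a, phi_t))
    return dot * dot / (sum(a * a for a in phi_a) * sum(t * t for t in phi_t))

# Hypothetical mode shapes sampled at four accelerometer locations.
analysis_mode = [1.0, 0.8, -0.5, -1.0]
test_mode_good = [0.98, 0.85, -0.48, -1.02]  # same shape, small test noise
test_mode_bad = [1.0, -0.8, 0.5, -1.0]       # a different mode

assert mac(analysis_mode, analysis_mode) == 1.0
# A well-paired analysis/test mode typically shows MAC > 0.9;
# mismatched pairs fall well below that.
```

    Mode pairing then amounts to scanning the MAC matrix for the largest entry in each row, alongside the frequency percent differences the abstract mentions.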

  5. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600
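
    The general idea of aggregating charge over spatial neighborhoods can be illustrated with a toy per-residue version. This sketch is emphatically not the published SCM formula or its parameterization; the coordinates, charges, and cutoff are invented for illustration:

```python
import math

# Toy per-residue coordinates and partial charges (hypothetical values).
residues = [
    # (x, y, z, charge)
    (0.0, 0.0, 0.0, -1.0),   # acidic residue
    (4.0, 0.0, 0.0, -1.0),   # acidic residue
    (8.0, 0.0, 0.0,  0.0),   # neutral residue
    (30.0, 0.0, 0.0, -1.0),  # isolated acidic residue
    (34.0, 0.0, 0.0, +1.0),  # basic residue
]

CUTOFF = 10.0  # neighborhood radius in angstroms (assumed)

def local_negative_charge(residues, cutoff):
    """For each residue, sum the negative partial charges within the cutoff."""
    scores = []
    for (xi, yi, zi, _) in residues:
        total = 0.0
        for (xj, yj, zj, qj) in residues:
            if qj < 0 and math.dist((xi, yi, zi), (xj, yj, zj)) <= cutoff:
                total += qj
        scores.append(total)
    return scores

scores = local_negative_charge(residues, CUTOFF)
# Residues in the clustered acidic patch score -2.0, while the isolated acidic
# residue scores only -1.0: local clustering of negative charge, rather than
# net charge alone, is the kind of feature SCM-like metrics flag.
```
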

  6. NMR studies of protein-nucleic acid interactions.

    PubMed

    Varani, Gabriele; Chen, Yu; Leeper, Thomas C

    2004-01-01

    Protein-DNA and protein-RNA complexes play key functional roles in every living organism. Therefore, the elucidation of their structure and dynamics is an important goal of structural and molecular biology. Nuclear magnetic resonance (NMR) studies of protein and nucleic acid complexes have common features with studies of protein-protein complexes: the interaction surfaces between the molecules must be carefully delineated, the relative orientation of the two species needs to be accurately and precisely determined, and close intermolecular contacts defined by nuclear Overhauser effects (NOEs) must be obtained. However, differences in NMR properties (e.g., chemical shifts) and biosynthetic pathways for sample productions generate important differences. Chemical shift differences between the protein and nucleic acid resonances can aid the NMR structure determination process; however, the relatively limited dispersion of the RNA ribose resonances makes the process of assigning intermolecular NOEs more difficult. The analysis of the resulting structures requires computational tools unique to nucleic acid interactions. This chapter summarizes the most important elements of the structure determination by NMR of protein-nucleic acid complexes and their analysis. The main emphasis is on recent developments (e.g., residual dipolar couplings and new Web-based analysis tools) that have facilitated NMR studies of these complexes and expanded the type of biological problems to which NMR techniques of structural elucidation can now be applied.

  7. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization on the optimized model, using composite physics- and knowledge-based force fields, for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, and a provided example submission, and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  8. Open, Cross Platform Chemistry Application Unifying Structure Manipulation, External Tools, Databases and Visualization

    DTIC Science & Technology

    2012-11-27

    with powerful analysis tools and an informatics approach leveraging best-of-breed NoSQL databases, in order to store, search and retrieve relevant...dictionaries, and JavaScript also has good support. The MongoDB project[15] was chosen as a scalable NoSQL data store for the cheminformatics components

  9. Parameter identification and optimization of slide guide joint of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Sun, B. B.

    2017-11-01

    The joint surface has an important influence on the performance of CNC machine tools. To identify the dynamic parameters of the slide guide joint, a parametric finite element model of the joint was established and an optimum design method was applied, based on finite element simulation and modal testing. The mode with the greatest influence on the dynamics of the slide joint was then found through harmonic response analysis. Taking the frequency of this mode as the objective, a sensitivity analysis of the stiffness of each joint surface was carried out using Latin hypercube sampling and Monte Carlo simulation. The results show that the vertical stiffness of the slide joint surface formed by the bed and the slide plate has the most pronounced influence on the structure. This stiffness was therefore taken as the optimization variable, and its optimal value was obtained by studying the relationship between structural dynamic performance and stiffness. Substituting the stiffness values from before and after optimization into the finite element model of the machine tool shows that the optimization improves the machine tool's dynamic performance.
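
    The sensitivity step described above, Latin hypercube sampling of joint stiffnesses followed by a sensitivity measure on a frequency response, can be sketched as follows. The surrogate frequency model, the stiffness bounds, and the use of Pearson correlation as the sensitivity measure are all assumptions for illustration, not the cited study's actual FEM or method:

```python
import random

random.seed(0)

def latin_hypercube(n_samples, bounds):
    """One stratified sample per interval in each dimension, shuffled independently."""
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        random.shuffle(strata)
        for i in range(n_samples):
            u = (strata[i] + random.random()) / n_samples  # point inside the stratum
            samples[i][d] = lo + u * (hi - lo)
    return samples

# Toy surrogate: first natural frequency dominated by stiffness k1 and only
# weakly affected by k2 (a hypothetical model, not a machine-tool FEM).
def first_frequency(k1, k2):
    return (k1 + 0.05 * k2) ** 0.5

bounds = [(1e8, 5e8), (1e8, 5e8)]  # stiffness ranges in N/m (assumed)
X = latin_hypercube(200, bounds)
y = [first_frequency(k1, k2) for k1, k2 in X]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy)

s1 = pearson([x[0] for x in X], y)
s2 = pearson([x[1] for x in X], y)
# s1 comes out close to 1 while s2 stays near 0: k1 is the stiffness
# worth carrying forward as the optimization variable.
```
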

  10. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one- and two-dimensional finite elements, which are regarded as the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.

  11. Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions

    NASA Technical Reports Server (NTRS)

    Dobyns, Alan; Saff, Charles; Johns, Robert

    1992-01-01

    An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the method through which design, manufacturing, and engineering communicate.

  12. Tool for analysis of early age transverse cracking of composite bridge decks.

    DOT National Transportation Integrated Search

    2011-08-29

    "Executive Summary: Computational methods and associated software were developed : to compute stresses in HP concrete composite bridge decks due to temperature, shrinkage, and : vehicle loading. The structural analysis program uses a layered finite e...

  13. Guidelines for the analysis of free energy calculations.

    PubMed

    Klimovich, Pavel V; Shirts, Michael R; Mobley, David L

    2015-05-01

    Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover the analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
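
    One of the estimators such an analysis covers, thermodynamic integration, reduces to numerical quadrature of the per-window averages of dU/dλ. A minimal sketch with synthetic window averages (invented numbers, not output from pymbar or any simulation package):

```python
# Thermodynamic integration: dG = integral over [0, 1] of <dU/dlambda> dlambda,
# here approximated by the trapezoid rule over per-window averages. A real
# analysis would first average the dU/dlambda time series from each window.

lambdas = [0.0, 0.25, 0.5, 0.75, 1.0]
dudl_means = [12.0, 8.5, 4.0, 1.5, 0.5]  # <dU/dlambda> per window, kcal/mol (synthetic)

def trapezoid(xs, ys):
    """Composite trapezoid rule over (x, y) pairs with arbitrary spacing."""
    total = 0.0
    for i in range(len(xs) - 1):
        total += 0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
    return total

delta_g = trapezoid(lambdas, dudl_means)
# → 5.0625 kcal/mol for these synthetic averages
```

    The statistical checks the review surveys (time-series equilibration, uncertainty of each window mean) sit on top of this core quadrature step.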

  14. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
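
    The forward mode of AD mentioned above can be demonstrated with dual numbers, in which each value carries its derivative through the chain rule automatically; this is a generic illustration of the technique, not the specific tools assessed in the paper:

```python
class Dual:
    """Dual number val + der*eps with eps^2 = 0; der carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        # Product rule: (uv)' = u'v + uv'
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.der + self.der * o.val)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

x = Dual(4.0, 1.0)  # seed the input's derivative with 1
y = f(x)
# y.val == 57.0 (the function value), y.der == 26.0 (the exact derivative),
# with no truncation error, unlike a finite-difference estimate.
```

    Reverse mode, also discussed in the paper, instead records the computation and propagates sensitivities backward, which is cheaper when one output depends on many inputs.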

  15. Developing Optimized Trajectories Derived from Mission and Thermo-Structural Constraints

    NASA Technical Reports Server (NTRS)

    Lear, Matthew H.; McGrath, Brian E.; Anderson, Michael P.; Green, Peter W.

    2008-01-01

    In conjunction with NASA and the Department of Defense, the Johns Hopkins University Applied Physics Laboratory (JHU/APL) has been investigating analytical techniques to address many of the fundamental issues associated with solar exploration spacecraft and high-speed atmospheric vehicle systems. These issues include: thermo-structural response including the effects of thermal management via the use of surface optical properties for high-temperature composite structures; aerodynamics with the effects of non-equilibrium chemistry and gas radiation; and aero-thermodynamics with the effects of material ablation for a wide range of thermal protection system (TPS) materials. The need exists to integrate these discrete tools into a common framework that enables the investigation of interdisciplinary interactions (including analysis tool, applied load, and environment uncertainties) to provide high fidelity solutions. In addition to developing robust tools for the coupling of aerodynamically induced thermal and mechanical loads, JHU/APL has been studying the optimal design of high-speed vehicles as a function of their trajectory. Under traditional design methodology the optimization of system level mission parameters such as range and time of flight is performed independently of the optimization for thermal and mechanical constraints such as stress and temperature. A truly optimal trajectory should optimize over the entire range of mission and thermo-mechanical constraints. Under this research, a framework for the robust analysis of high-speed spacecraft and atmospheric vehicle systems has been developed. It has been built around a generic, loosely coupled framework such that a variety of readily available analysis tools can be used. The methodology immediately addresses many of the current analysis inadequacies and allows for future extension in order to handle more complex problems.

  16. LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  17. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
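
    The simplest route to a cumulative probability of exceedance is direct Monte Carlo sampling of the uncertain inputs, which is the brute-force reference against which PFEM/PAAM/PBEM-style approximations are judged. The response function, distributions, and allowable stress below are invented for illustration:

```python
import random

random.seed(1)

# Monte Carlo estimate of an exceedance probability for a toy stress response:
# stress = load / area, with both load and area uncertain.
N = 100_000
LIMIT = 450.0  # allowable stress (MPa, assumed)

exceed = 0
for _ in range(N):
    load = random.gauss(10_000.0, 1_500.0)   # applied load, N (assumed scatter)
    area = random.gauss(30.0, 2.0)           # section area, mm^2 (assumed scatter)
    if load / area > LIMIT:
        exceed += 1

p_exceed = exceed / N
# p_exceed estimates P(stress > allowable); the Monte Carlo standard error is
# roughly sqrt(p * (1 - p) / N), which bounds the confidence interval on the
# CDF value -- the kind of confidence statement PSAM is meant to provide.
```
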

  18. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  19. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
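
    For a single failure mode with normally distributed load and strength, the numerical-integration step collapses to a closed form via the standard normal CDF. A sketch of this stress-strength interference calculation with illustrative numbers (not values from the cited methodology):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Reliability = P(strength > load)
#             = Phi((mu_S - mu_L) / sqrt(sigma_S^2 + sigma_L^2))
# when both strength S and load L are normal and independent.
mu_strength, sig_strength = 600.0, 40.0   # MPa (assumed material scatter)
mu_load, sig_load = 450.0, 30.0           # MPa (assumed applied-load scatter)

beta = (mu_strength - mu_load) / math.hypot(sig_strength, sig_load)
reliability = normal_cdf(beta)
# beta == 3.0 here (the reliability index), giving reliability ≈ 0.99865
```

    Non-normal scatter in material strength or load, as characterized in the paper, requires the general numerical integration the abstract describes rather than this closed form.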

  20. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data, based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or as time windows triggered on some event.
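
    A basic operation in this kind of toolbox, detecting extracellular spikes by threshold crossing on a continuous recording, can be sketched as follows. The trace and threshold are synthetic; real pipelines typically derive the threshold from a noise estimate such as a multiple of the signal's standard deviation:

```python
# Synthetic extracellular trace with two negative-going spike events.
trace = [0.1, -0.2, 0.0, -4.8, -1.0, 0.2, 0.1, -5.2, -4.9, 0.3, 0.0]
THRESHOLD = -3.0  # detection threshold in the trace's units (assumed)

def detect_spikes(signal, threshold):
    """Return sample indices where the signal first crosses below the threshold."""
    spikes = []
    below = False
    for i, v in enumerate(signal):
        if v < threshold and not below:
            spikes.append(i)   # leading edge of the crossing
            below = True
        elif v >= threshold:
            below = False      # re-arm once the signal recovers
    return spikes

spike_times = detect_spikes(trace, THRESHOLD)
# → [3, 7]: two events; the consecutive sub-threshold samples at 7 and 8
# count as a single spike because detection re-arms only after recovery.
```
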

  1. Multidisciplinary analysis of actively controlled large flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Young, John W.; Sutter, Thomas R.

    1986-01-01

    The Control of Flexible Structures (COFS) program has supported the development of an analysis capability at the Langley Research Center called the Integrated Multidisciplinary Analysis Tool (IMAT) which provides an efficient data storage and transfer capability among commercial computer codes to aid in the dynamic analysis of actively controlled structures. IMAT is a system of computer programs which transfers Computer-Aided-Design (CAD) configurations, structural finite element models, material property and stress information, structural and rigid-body dynamic model information, and linear system matrices for control law formulation among various commercial applications programs through a common database. Although general in its formulation, IMAT was developed specifically to aid in the evaluation of the structures. A description of the IMAT system and results of an application of the system are given.

  2. A tool to measure whether business management capacity in general practice impacts on the quality of chronic illness care.

    PubMed

    Holton, Christine H; Proudfoot, Judith G; Jayasinghe, Upali W; Grimm, Jane; Bubner, Tanya K; Winstanley, Julie; Harris, Mark F; Beilby, Justin J

    2010-11-01

    Our aim was to develop a tool to identify specific features of the business and financial management of practices that facilitate better quality care for chronic illness in primary care. Domains of management were identified, resulting in the development of a structured interview tool that was administered in 97 primary care practices in Australia. Interview items were screened and subjected to factor analysis, subscales identified and the overall model fit determined. The instrument's validity was assessed against another measure of quality of care. Analysis provided a four-factor solution containing 21 items, which explained 42.5% of the variance in the total scores. The factors related to administrative processes, human resources, marketing analysis and business development. All scores increased significantly with practice size. The business development subscale and total score were higher for rural practices. There was a significant correlation between the business development subscale and quality of care. The indicators of business and financial management in the final tool appear to be useful predictors of the quality of care. The instrument may help inform policy regarding the structure of general practice and implementation of a systems approach to chronic illness care. It can provide information to practices about areas for further development.

  3. Dynamic Analysis With Stress Mode Animation by the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1997-01-01

    Dynamic animation of stresses and displacements, which complement each other, can be a useful tool in the analysis and design of structural components. At the present time only displacement-mode animation is available through the popular stiffness formulation. This paper attempts to complete this valuable visualization tool by augmenting the existing art with stress mode animation. The reformulated method of forces, which in the literature is known as the integrated force method (IFM), became the analyzer of choice for the development of stress mode animation because stresses are the primary unknowns of its dynamic analysis. Animation of stresses and displacements, which have been developed successfully through the IFM analyzers, is illustrated in several examples along with a brief introduction to IFM dynamic analysis. The usefulness of animation in design optimization is illustrated considering the spacer structure component of the International Space Station as an example. An overview of the integrated force method analysis code (IFM/ANALYZERS) is provided in the appendix.

  4. P-TRAP: a Panicle TRAit Phenotyping tool.

    PubMed

    A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza

    2013-08-29

    In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. 
In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared with field-operator and expert verification and with well-known academic methods.

  5. P-TRAP: a Panicle Trait Phenotyping tool

    PubMed Central

    2013-01-01

    Background: In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. High-throughput image analysis of the qualitative and quantitative traits of rice panicles is therefore essential both for understanding the diversity of the panicle and for breeding programs. Results: This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and runs on multiple platforms (the user-friendly Graphical User Interface (GUI) uses the NetBeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or together on the same image. Results are reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage; the software is practical and collects far more data than a human operator can. Conclusions: P-TRAP is a new open source software package that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field-operator counts, expert verification, and well-known academic methods. PMID:23987653
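As an illustration of how P-TRAP's CSV output might be consumed downstream, the sketch below parses a seed-trait table and computes summary statistics. The column names and values are invented for the example and do not reflect P-TRAP's actual output schema.

```python
import csv
import io

# Hypothetical seed-trait CSV; the columns are illustrative, not P-TRAP's schema.
SAMPLE = """seed_id,length_mm,width_mm
1,8.2,3.1
2,7.9,2.9
3,8.4,3.3
"""

def summarize_seed_traits(csv_text):
    """Return grain count and mean length/width (mm) from a seed-trait CSV."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    n = len(rows)
    mean_len = sum(float(r["length_mm"]) for r in rows) / n
    mean_wid = sum(float(r["width_mm"]) for r in rows) / n
    return n, round(mean_len, 2), round(mean_wid, 2)
```

A downstream script of this kind is how the per-image CSV files could feed a breeding-program database.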

  6. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear]

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  7. The Papillomavirus Episteme: a central resource for papillomavirus sequence data and analysis.

    PubMed

    Van Doorslaer, Koenraad; Tan, Qina; Xirasagar, Sandhya; Bandaru, Sandya; Gopalan, Vivek; Mohamoud, Yasmin; Huyen, Yentram; McBride, Alison A

    2013-01-01

    The goal of the Papillomavirus Episteme (PaVE) is to provide an integrated resource for the analysis of papillomavirus (PV) genome sequences and related information. The PaVE is a freely accessible, web-based tool (http://pave.niaid.nih.gov) created around a relational database, which enables storage, analysis and exchange of sequence information. From a design perspective, the PaVE adopts an Open Source software approach and stresses the integration and reuse of existing tools. Reference PV genome sequences have been extracted from publicly available databases and reannotated using a custom-created tool. To date, the PaVE contains 241 annotated PV genomes, 2245 genes and regions, 2004 protein sequences and 47 protein structures, which users can explore, analyze or download. The PaVE provides scientists with the data and tools needed to accelerate scientific progress for the study and treatment of diseases caused by PVs.
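Genome resources like PaVE typically let users download sequences in FASTA format for local analysis. The minimal parser below, written against an invented two-record sample (the headers and sequences are illustrative, not real PaVE entries), sketches the kind of post-download processing a user might script:

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into {header: sequence}.
    Minimal sketch of a common post-download step; not a PaVE API."""
    records = {}
    header = None
    for line in text.strip().splitlines():
        if line.startswith(">"):
            header = line[1:].strip()
            records[header] = []
        elif header is not None:
            records[header].append(line.strip())
    return {h: "".join(parts) for h, parts in records.items()}

# Invented example records; headers and sequences are placeholders.
FASTA_TEXT = """>HPV16_E7_example
ATGCATGGAGATACACCTACA
>HPV16_E6_example
ATGCACCAAAAGAGAACTGCA
"""
```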

  8. Improved Design of Tunnel Supports : Volume 1 : Simplified Analysis for Ground-Structure Interaction in Tunneling

    DOT National Transportation Integrated Search

    1980-06-01

    The purpose of this report is to provide the tunneling profession with improved practical tools in the technical or design area, which provide more accurate representations of the ground-structure interaction in tunneling. The design methods range fr...

  9. mvMapper: statistical and geographical data exploration and visualization of multivariate analysis of population structure

    USDA-ARS?s Scientific Manuscript database

    Characterizing population genetic structure across geographic space is a fundamental challenge in population genetics. Multivariate statistical analyses are powerful tools for summarizing genetic variability, but geographic information and accompanying metadata is not always easily integrated into t...

  10. Science as Structured Imagination

    ERIC Educational Resources Information Center

    De Cruz, Helen; De Smedt, Johan

    2010-01-01

    This paper offers an analysis of scientific creativity based on theoretical models and experimental results of the cognitive sciences. Its core idea is that scientific creativity--like other forms of creativity--is structured and constrained by prior ontological expectations. Analogies provide scientists with a powerful epistemic tool to overcome…

  11. PDBFlex: exploring flexibility in protein structures

    PubMed Central

    Hrabe, Thomas; Li, Zhanwen; Sedova, Mayya; Rotkiewicz, Piotr; Jaroszewski, Lukasz; Godzik, Adam

    2016-01-01

    The PDBFlex database, available freely and with no login requirements at http://pdbflex.org, provides information on the flexibility of protein structures as revealed by the analysis of variations between depositions of different structural models of the same protein in the Protein Data Bank (PDB). PDBFlex collects information on all instances of such depositions, identifying them by a 95% sequence identity threshold, analyzes their structural differences and clusters them according to their structural similarities for easy analysis. PDBFlex contains tools and viewers enabling in-depth examination of structural variability, including: 2D-scaling visualization of RMSD distances between structures of the same protein; graphs of average local RMSD in the aligned structures of protein chains; graphical presentation of differences in secondary structure and observed structural disorder (unresolved residues); difference distance maps between all sets of coordinates; and 3D views of individual structures and simulated transitions between different conformations, the latter displayed using the JSMol visualization software. PMID:26615193
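The RMSD distances PDBFlex visualizes can be illustrated with a minimal sketch: for two already-superimposed coordinate sets of the same chain, RMSD is the square root of the mean squared inter-atom distance. The toy three-atom models below are invented for the example; PDBFlex's own pipeline also handles alignment and clustering.

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equal-length coordinate sets.
    Assumes the structures are already superimposed (no alignment step)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Two toy 3-atom "models" of the same chain, one shifted by 0.5 along y.
model1 = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
model2 = [(0.0, 0.5, 0.0), (1.5, 0.5, 0.0), (3.0, 0.5, 0.0)]
```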

  12. Basic analysis of reflectometry data software package for the analysis of multilayered structures according to reflectometry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.

    2012-01-15

    The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.

  13. Lunar hand tools

    NASA Technical Reports Server (NTRS)

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.

    1987-01-01

    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  14. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, which search for already-known patterns. With a human in the loop, analysts can also bring domain knowledge and subject matter expertise to the analysis. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data.
    In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that must be addressed before broad adoption of these kinds of tools can occur, and offers next steps to overcome these challenges.
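The facet view described above reduces, at its core, to counting the values of a categorical field across structured records. The sketch below, with invented shipment records (not PIERS data), illustrates that counting step; T.Rex's actual interface layers interactive drill-down on top of it.

```python
from collections import Counter

# Illustrative shipment records; fields and values are invented, not PIERS data.
shipments = [
    {"origin": "Country A", "destination": "Country B", "commodity": "aluminum tubes"},
    {"origin": "Country A", "destination": "Country B", "commodity": "machine parts"},
    {"origin": "Country C", "destination": "Country B", "commodity": "aluminum tubes"},
]

def facet_counts(records, field):
    """Count how often each value of a categorical field occurs,
    mimicking the aggregation a facet view presents interactively."""
    return Counter(r[field] for r in records)
```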

  15. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spears, Robert Edward; Coleman, Justin Leigh

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion, these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e., faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology.
    This methodology is known as Nonlinear Soil-Structure Interaction (NLSSI). In general, NLSSI analysis should provide a more accurate representation of the seismic demands on nuclear facilities, their systems, and components. INL, in collaboration with a Nuclear Power Plant Vendor (NPP-V), will develop a generic Nuclear Power Plant (NPP) structural design to be used in development of the methodology and for comparison with SASSI. This generic NPP design has been evaluated for the INL soil site because of the ease of access and quality of the site-specific data. It is now being evaluated for a second site at Vogtle, located approximately 15 miles east-northeast of Waynesboro, Georgia, adjacent to the Savannah River. The Vogtle site consists of many soil layers extending down to a depth of 1058 feet. Two soil sites were chosen in order to demonstrate the methodology across multiple soil profiles. The project will drive the models (soil and structure) using acceleration time histories of successively increasing amplitude. The models will be run in time domain codes such as ABAQUS, LS-DYNA, and/or ESSI and compared with the same models run in SASSI. The project is focused on developing and documenting a method for performing time domain, nonlinear seismic soil-structure interaction (SSI) analysis. Development of this method will provide the Department of Energy (DOE) and industry with another tool to perform seismic SSI analysis.
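The gapping nonlinearity discussed above can be illustrated with a one-element sketch: a soil-structure interface spring that transmits compression but opens a gap under tension, unlike the equivalent linear approach, which treats tension and compression identically. This is a conceptual toy, not part of the NLSSI methodology itself.

```python
def interface_force(relative_displacement, stiffness):
    """Normal force at a soil-structure interface that supports gapping.
    Sign convention (assumed for this sketch): positive displacement means
    separation. Separation opens a gap and transmits no force; closure
    (negative displacement) transmits compression through the spring."""
    if relative_displacement >= 0.0:
        return 0.0                              # gap open: no force
    return -stiffness * relative_displacement   # compression transmitted
```

An equivalent linear model would instead return `-stiffness * relative_displacement` for both signs, which is exactly the approximation the nonlinear time domain methodology removes.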

  16. Effect of crumb cellular structure characterized by image analysis on cake softness.

    PubMed

    Dewaest, Marine; Villemejane, Cindy; Berland, Sophie; Neron, Stéphane; Clement, Jérôme; Verel, Aliette; Michon, Camille

    2018-06-01

    Sponge cake is a cereal product characterized by an aerated crumb and appreciated for its softness. When formulating such a product, it is useful to be able to characterize the crumb structure using image analysis and to understand the effects of the crumb cellular structure on the mechanical properties that contribute to softness. An image analysis method based on mathematical morphology was adapted from the one developed for bread crumb. In order to evaluate its ability to discriminate cellular structures, series of cakes were prepared using two rather similar emulsifiers, and also using flours with different aging times before use. The mechanical properties of the crumbs of these different cakes were also characterized. The method allowed a cell structure classification taking into account cell size and homogeneity, as well as cell wall thickness and the number of holes in the walls. Interestingly, the cellular structure differences had a larger impact on the Young's modulus of the aerated crumb than did wall firmness. Increasing the aging time of flour before use led to firmer crumbs due to coarser and more inhomogeneous cellular structures. Changing the composition of the emulsifier may change the cellular structure and, depending on the type of structural change, affect the firmness of the crumb. Cellular structure rather than cell wall firmness was found to determine cake crumb firmness. The new fast and automated tool for cake crumb structure analysis quickly detects any change in cell size or homogeneity, as well as in cell wall thickness and the number of holes in the walls (openness degree). To obtain a softer crumb, the options appear to be decreasing the cell size and the cell wall thickness and/or increasing the openness degree. It is then possible to easily evaluate the effects of ingredients (flour composition, emulsifier …) or changes in the process on the crumb structure and thus its softness.
Moreover, this image analysis is a very efficient tool for quality control. © 2017 Wiley Periodicals, Inc.
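Counting gas cells in a binarized crumb image is, at its simplest, a connected-component labeling problem. The flood-fill sketch below illustrates the idea on a toy 3x4 image; the paper's mathematical-morphology pipeline is considerably more sophisticated, also measuring cell size, wall thickness, and hole counts.

```python
def count_cells(grid):
    """Count connected pore regions (4-connectivity) in a binary crumb image,
    where 1 = gas cell pixel and 0 = cell wall. A toy stand-in for the
    morphology-based analysis described in the abstract."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    cells = 0
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] == 1 and not seen[i][j]:
                cells += 1                       # found a new cell: flood-fill it
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and grid[y][x] == 1 and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return cells

# Toy binary image: three separate gas cells.
crumb = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 1],
]
```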

  17. CAVER Analyst 1.0: graphic tool for interactive visualization and analysis of tunnels and channels in protein structures.

    PubMed

    Kozlikova, Barbora; Sebestova, Eva; Sustr, Vilem; Brezovsky, Jan; Strnad, Ondrej; Daniel, Lukas; Bednar, David; Pavelka, Antonin; Manak, Martin; Bezdeka, Martin; Benes, Petr; Kotry, Matus; Gora, Artur; Damborsky, Jiri; Sochor, Jiri

    2014-09-15

    The transport of ligands, ions or solvent molecules into proteins with buried binding sites or through the membrane is enabled by protein tunnels and channels. CAVER Analyst is a software tool for calculation, analysis and real-time visualization of access tunnels and channels in static and dynamic protein structures. It provides an intuitive graphical user interface for setting up the calculation and for interactive exploration of identified tunnels/channels and their characteristics. CAVER Analyst is a multi-platform software tool written in Java. Binaries and documentation are freely available for non-commercial use at http://www.caver.cz. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
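One core quantity behind tunnel characterization is the free radius along a candidate path: the distance from each path point to the nearest atom surface, whose minimum over the path gives the tunnel bottleneck. The sketch below illustrates that geometric idea with invented coordinates; it is not CAVER's actual algorithm, which is considerably more elaborate.

```python
import math

def bottleneck_radius(path, atoms):
    """Smallest free radius along a candidate tunnel path.
    path:  list of (x, y, z) points along the tunnel axis.
    atoms: list of ((x, y, z), van_der_waals_radius).
    A geometric sketch of tunnel profiling, not CAVER's method."""
    def free_radius(p):
        # distance to nearest atom surface (centre distance minus atom radius)
        return min(math.dist(p, centre) - r for centre, r in atoms)
    return min(free_radius(p) for p in path)

# Toy system: two atoms flanking a straight path (all values invented).
atoms = [((0.0, 0.0, 2.0), 1.0), ((0.0, 0.0, -2.0), 1.0)]
path = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
```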

  18. Teaching NMR spectra analysis with nmr.cheminfo.org.

    PubMed

    Patiny, Luc; Bolaños, Alejandro; Castillo, Andrés M; Bernal, Andrés; Wist, Julien

    2018-06-01

    Teaching spectra analysis and structure elucidation requires students to get trained on real problems. This involves solving exercises of increasing complexity and when necessary using computational tools. Although desktop software packages exist for this purpose, nmr.cheminfo.org platform offers students an online alternative. It provides a set of exercises and tools to help solving them. Only a small number of exercises are currently available, but contributors are invited to submit new ones and suggest new types of problems. Copyright © 2018 John Wiley & Sons, Ltd.

  19. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    NASA Astrophysics Data System (ADS)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project will look to take this existing knowledge and apply it to this new manufacturing method capable of building more complex parts and develop a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. 
Mechanical testing of the T-joints was performed using the T-joint pull-off test, and the results were compared to those obtained with traditional tooling methods. Components made with the cellular core tooling method showed improved strength at the joints. This knowledge is expected to help optimize the processing of complex, integrated structures and to benefit aerospace applications where lighter, structurally efficient components are advantageous.

  20. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    PubMed

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on their structure-derived physicochemical properties. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.
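A minimal ligand-based screen of the kind reviewed can be sketched as nearest-neighbour classification in a descriptor space. The descriptor vectors and labels below are invented for illustration; real pipelines use hundreds of descriptors, feature scaling, and stronger learners such as support vector machines.

```python
import math

# Invented training set of (descriptor vector, label) pairs. The three
# descriptors loosely stand for molecular weight, logP and H-bond donors,
# chosen purely for illustration.
train = [
    ((300.0, 2.1, 1.0), "active"),
    ((310.0, 2.4, 2.0), "active"),
    ((120.0, -0.5, 0.0), "inactive"),
    ((150.0, 0.2, 1.0), "inactive"),
]

def predict(descriptor):
    """1-nearest-neighbour classification in descriptor space, the simplest
    ligand-based model of the family this review compares."""
    return min(train, key=lambda item: math.dist(item[0], descriptor))[1]
```

Note that without descriptor scaling the largest-magnitude feature dominates the distance, which is one reason real screening pipelines normalize their descriptors.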

  1. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material property tailorability. To fully exploit the capability of composites, it is necessary to know the load carrying capacity of the parts made from them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize their behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools that can predict the complex failure mechanisms accurately, reducing cost to the associated computational expense and yielding significant savings. The most desired features in a virtual testing tool are: (1) accurate representation of failure mechanisms: failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture; (2) computational efficiency: the greatest advantages of a virtual tool are the savings in time and money, so computational efficiency is essential; (3) applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic and fatigue conditions, and a good virtual testing tool should make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing the simulations against experiments for a selected number of quasi-static loading cases.

  2. Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis

    NASA Astrophysics Data System (ADS)

    Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea

    2015-01-01

    CID-MS/MS cleavable cross-linkers hold enormous potential for the automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The characteristic fragment ion patterns they create can easily be used for an automated assignment and discrimination of cross-linked products. To date, there are only a few software solutions available that make use of these properties, but none allows for an automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows automatic screening of characteristic fragment ions, considering static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for a correct assignment of cross-links, and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access to an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at www.StavroX.com.
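The characteristic fragment ion pattern of a cleavable cross-linker shows up in a spectrum as peak pairs separated by a fixed mass difference. The sketch below illustrates that doublet search on invented peak masses; MeroX's actual assignment and scoring are far more involved.

```python
def find_doublets(peaks, delta, tol=0.01):
    """Find peak pairs separated by a fixed mass difference, the kind of
    signature a cleavable cross-linker leaves in an MS/MS spectrum.
    peaks: m/z values; delta: expected mass difference; tol: tolerance (Da).
    A conceptual sketch, not MeroX's scoring algorithm."""
    peaks = sorted(peaks)
    return [(a, b) for i, a in enumerate(peaks) for b in peaks[i + 1:]
            if abs((b - a) - delta) <= tol]

# Invented peak list containing two doublets separated by delta = 25.98 Da.
observed = [100.0, 125.98, 200.0, 225.98]
```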

  3. Design and Control of Compliant Tensegrity Robots Through Simulation and Hardware Validation

    NASA Technical Reports Server (NTRS)

    Caluwaerts, Ken; Despraz, Jeremie; Iscen, Atil; Sabelhaus, Andrew P.; Bruce, Jonathan; Schrauwen, Benjamin; Sunspiral, Vytas

    2014-01-01

    To better understand the role of tensegrity structures in biological systems and their application to robotics, the Dynamic Tensegrity Robotics Lab at NASA Ames Research Center has developed and validated two different software environments for the analysis, simulation, and design of tensegrity robots. These tools, along with new control methodologies and the modular hardware components developed to validate them, are presented as a system for the design of actuated tensegrity structures. As evidenced by their appearance in many biological systems, tensegrity ("tensile-integrity") structures have unique physical properties which make them ideal for interaction with uncertain environments. Yet these characteristics, such as variable structural compliance and global multi-path load distribution through the tension network, make the design and control of bio-inspired tensegrity robots extremely challenging. This work presents progress in using these two tools to tackle the design and control challenges. The results of this analysis include multiple novel control approaches for mobility and terrain interaction of spherical tensegrity structures. The current hardware prototype of a six-bar tensegrity, code-named ReCTeR, is presented in the context of this validation.

  4. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales.

    PubMed

    Rueckl, Martin; Lenzi, Stephen C; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales.
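A typical quantity extracted from an ROI's fluorescence time series is ΔF/F, the change relative to a baseline. The sketch below is a generic illustration of that computation on an invented trace; it does not call SamuROI's API.

```python
def delta_f_over_f(trace, baseline_frames=3):
    """Compute ΔF/F for a fluorescence trace: the change in each frame
    relative to the mean of the first `baseline_frames` frames.
    A generic illustration of the quantity ROI tools plot, not SamuROI code."""
    f0 = sum(trace[:baseline_frames]) / baseline_frames
    return [(f - f0) / f0 for f in trace]

# Invented ROI trace: three baseline frames, then a transient.
trace = [10.0, 10.0, 10.0, 15.0, 12.0]
```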

  5. SamuROI, a Python-Based Software Tool for Visualization and Analysis of Dynamic Time Series Imaging at Multiple Spatial Scales

    PubMed Central

    Rueckl, Martin; Lenzi, Stephen C.; Moreno-Velasquez, Laura; Parthier, Daniel; Schmitz, Dietmar; Ruediger, Sten; Johenning, Friedrich W.

    2017-01-01

    The measurement of activity in vivo and in vitro has shifted from electrical to optical methods. While the indicators for imaging activity have improved significantly over the last decade, tools for analysing optical data have not kept pace. Most available analysis tools are limited in their flexibility and applicability to datasets obtained at different spatial scales. Here, we present SamuROI (Structured analysis of multiple user-defined ROIs), an open source Python-based analysis environment for imaging data. SamuROI simplifies exploratory analysis and visualization of image series of fluorescence changes in complex structures over time and is readily applicable at different spatial scales. In this paper, we show the utility of SamuROI in Ca2+-imaging based applications at three spatial scales: the micro-scale (i.e., sub-cellular compartments including cell bodies, dendrites and spines); the meso-scale, (i.e., whole cell and population imaging with single-cell resolution); and the macro-scale (i.e., imaging of changes in bulk fluorescence in large brain areas, without cellular resolution). The software described here provides a graphical user interface for intuitive data exploration and region of interest (ROI) management that can be used interactively within Jupyter Notebook: a publicly available interactive Python platform that allows simple integration of our software with existing tools for automated ROI generation and post-processing, as well as custom analysis pipelines. SamuROI software, source code and installation instructions are publicly available on GitHub and documentation is available online. SamuROI reduces the energy barrier for manual exploration and semi-automated analysis of spatially complex Ca2+ imaging datasets, particularly when these have been acquired at different spatial scales. PMID:28706482

  6. Computer aided drug design

    NASA Astrophysics Data System (ADS)

    Jain, A.

    2017-08-01

    Computer based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, saving time as well as cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing and storing models of complex molecular structures that can help to interpret structure-activity relationships. The use of various techniques of molecular mechanics and dynamics, together with software for computer aided drug design and statistical analysis, is a powerful tool for the medicinal chemist seeking to design and synthesize effective therapeutic drugs with minimal side effects.
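A concrete example of a molecular mechanics ingredient is the 12-6 Lennard-Jones term that force fields sum over nonbonded atom pairs. The sketch below uses placeholder parameter values, not those of any published force field.

```python
def lennard_jones(r, epsilon=0.2, sigma=3.4):
    """12-6 Lennard-Jones pair energy: 4*eps*((sigma/r)**12 - (sigma/r)**6).
    epsilon sets the well depth and sigma the zero-crossing distance.
    Parameter values are placeholders for illustration only."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)
```

The energy is zero at r = sigma, reaches its minimum of -epsilon at r = 2**(1/6) * sigma, and rises steeply at shorter range, which is the repulsive wall a molecular mechanics minimizer pushes atoms away from.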

  7. Measuring Individual Skills in Household Waste Recycling: Implications for Citizens' Education and Communication in Six Urban Contexts

    ERIC Educational Resources Information Center

    Passafaro, Paola; Bacciu, Anna; Caggianelli, Ilaria; Castaldi, Viviana; Fucci, Eleonora; Ritondale, Deborah; Trabalzini, Eleonora

    2016-01-01

    This article reports the analysis of six urban contexts in which a practical tool measuring individual skills concerning household waste recycling was tested. The tool is a structured questionnaire including a simulation task that assesses respondents' abilities to sort household waste adequately in a given context/municipality. Results indicate…

  8. Investigation Organizer

    NASA Technical Reports Server (NTRS)

    Panontin, Tina; Carvalho, Robert; Keller, Richard

    2004-01-01

    Contents include the following: Overview of the Application; Input Data; Analytical Process; Tool's Output; and Application of the Results of the Analysis. The tool enables the first element through a Web-based application that can be accessed by distributed teams to store and retrieve any type of digital investigation material in a secure environment. The second is accomplished by making the relationships between information explicit through the use of a semantic network: a structure that literally allows an investigator or team to "connect the dots." The third element, the significance of the correlated information, is established through causality and consistency tests using a number of different methods embedded within the tool, including fault trees, event sequences, and other accident models. Finally, the evidence gathered and structured within the tool can be directly, electronically archived to preserve the evidence and investigative reasoning.

  9. Structure identification by Mass Spectrometry Non-Targeted Analysis using the US EPA’s CompTox Chemistry Dashboard

    EPA Science Inventory

    Identification of unknowns in mass spectrometry based non-targeted analyses (NTA) requires the integration of complementary pieces of data to arrive at a confident, consensus structure. Researchers use chemical reference databases, spectral matching, fragment prediction tools, r...

  10. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. 
The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  11. RNApdbee 2.0: multifunctional tool for RNA structure annotation.

    PubMed

    Zok, Tomasz; Antczak, Maciej; Zurkowski, Michal; Popenda, Mariusz; Blazewicz, Jacek; Adamiak, Ryszard W; Szachniuk, Marta

    2018-04-30

    In the field of RNA structural biology and bioinformatics, access to correctly annotated RNA structure is of crucial importance, especially for secondary and 3D structure prediction. The RNApdbee webserver, introduced in 2014, primarily aimed to address the problem of RNA secondary structure extraction from PDB files. Its new version, RNApdbee 2.0, is a highly advanced multifunctional tool for RNA structure annotation, revealing the relationship between RNA secondary and 3D structure given in the PDB or PDBx/mmCIF format. The upgraded version incorporates new algorithms for recognition and classification of high-order pseudoknots in large RNA structures. It allows analysis of the impact of isolated base pairs on RNA structure. It can visualize RNA secondary structures, including those of quadruplexes, with depiction of non-canonical interactions. It also annotates motifs to ease identification of stems, loops and single-stranded fragments in the input RNA structure. RNApdbee 2.0 is implemented as a publicly available webserver with an intuitive interface and can be freely accessed at http://rnapdbee.cs.put.poznan.pl/.

  12. Hekate: Software Suite for the Mass Spectrometric Analysis and Three-Dimensional Visualization of Cross-Linked Protein Samples

    PubMed Central

    2013-01-01

    Chemical cross-linking of proteins combined with mass spectrometry provides an attractive and novel method for the analysis of native protein structures and protein complexes. Analysis of the data, however, is complex. Only a small number of cross-linked peptides are produced during sample preparation and must be identified against a background of more abundant native peptides. To facilitate the search and identification of cross-linked peptides, we have developed a novel software suite, named Hekate. Hekate is a suite of tools that address the challenges involved in analyzing protein cross-linking experiments when combined with mass spectrometry. The software is an integrated pipeline for the automation of the data analysis workflow and provides a novel scoring system based on principles of linear peptide analysis. In addition, it provides a tool for the visualization of identified cross-links using three-dimensional models, which is particularly useful when combining chemical cross-linking with other structural techniques. Hekate was validated by the comparative analysis of cytochrome c (bovine heart) against previously reported data [1]. Further validation was carried out on known structural elements of DNA polymerase III, the catalytic α-subunit of the Escherichia coli DNA replisome, along with new insight into the previously uncharacterized C-terminal domain of the protein. PMID:24010795

  13. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  14. A projection-based model reduction strategy for the wave and vibration analysis of rotating periodic structures

    NASA Astrophysics Data System (ADS)

    Beli, D.; Mencik, J.-M.; Silva, P. B.; Arruda, J. R. F.

    2018-05-01

    The wave finite element method has proved to be an efficient and accurate numerical tool for performing the free and forced vibration analysis of linear reciprocal periodic structures, i.e. those conforming to symmetrical wave fields. In this paper, its use is extended to the analysis of rotating periodic structures, which, due to the gyroscopic effect, exhibit asymmetric wave propagation. A projection-based strategy using a reduced symplectic wave basis is employed, which provides a well-conditioned eigenproblem for computing waves in rotating periodic structures. The proposed formulation is applied to the free and forced response analysis of homogeneous, multi-layered and phononic ring structures. In all test cases, the following features are highlighted: well-conditioned dispersion diagrams, good accuracy, and low computational time. The proposed strategy is particularly convenient for simulating rotating structures, where parametric analyses at several rotational speeds are usually required, e.g. for calculating Campbell diagrams. This provides an efficient and flexible framework for the analysis of rotordynamic problems.

  15. ESTEST: An Open Science Platform for Electronic Structure Research

    ERIC Educational Resources Information Center

    Yuan, Gary

    2012-01-01

    Open science platforms in support of data generation, analysis, and dissemination are becoming indispensible tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…

  16. High-resolution DNA melting analysis in plant research

    USDA-ARS?s Scientific Manuscript database

    Genetic and genomic studies provide valuable insight into the inheritance, structure, organization, and function of genes. The knowledge gained from the analysis of plant genes is beneficial to all aspects of plant research, including crop improvement. New methods and tools are continually developed...

  17. Fine-Scale Structure Design for 3D Printing

    NASA Astrophysics Data System (ADS)

    Panetta, Francis Julian

    Modern additive fabrication technologies can manufacture shapes whose geometric complexities far exceed what existing computational design tools can analyze or optimize. At the same time, falling costs have placed these fabrication technologies within the average consumer's reach. Especially for inexpert designers, new software tools are needed to take full advantage of 3D printing technology. This thesis develops such tools and demonstrates the exciting possibilities enabled by fine-tuning objects at the small scales achievable by 3D printing. The thesis applies two high-level ideas to invent these tools: two-scale design and worst-case analysis. The two-scale design approach addresses the problem that accurately simulating--let alone optimizing--the full-resolution geometry sent to the printer requires orders of magnitude more computational power than currently available. However, we can decompose the design problem into a small-scale problem (designing tileable structures achieving a particular deformation behavior) and a macro-scale problem (deciding where to place these structures in the larger object). This separation is particularly effective, since structures for every useful behavior can be designed once, stored in a database, then reused for many different macroscale problems. Worst-case analysis refers to determining how likely an object is to fracture by studying the worst possible scenario: the forces most efficiently breaking it. This analysis is needed when the designer has insufficient knowledge or experience to predict what forces an object will undergo, or when the design is intended for use in many different scenarios unknown a priori. The thesis begins by summarizing the physics and mathematics necessary to rigorously approach these design and analysis problems. Specifically, the second chapter introduces linear elasticity and periodic homogenization. 
The third chapter presents a pipeline to design microstructures achieving a wide range of effective isotropic elastic material properties on a single-material 3D printer. It also proposes a macroscale optimization algorithm placing these microstructures to achieve deformation goals under prescribed loads. The thesis then turns to worst-case analysis, first considering the macroscale problem: given a user's design, the fourth chapter aims to determine the distribution of pressures over the surface creating the highest stress at any point in the shape. Solving this problem exactly is difficult, so we introduce two heuristics: one to focus our efforts on only regions likely to concentrate stresses and another converting the pressure optimization into an efficient linear program. Finally, the fifth chapter introduces worst-case analysis at the microscopic scale, leveraging the insight that the structure of periodic homogenization enables us to solve the problem exactly and efficiently. Then we use this worst-case analysis to guide a shape optimization, designing structures with prescribed deformation behavior that experience minimal stresses in generic use.

  18. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs used for finite element structural analysis, developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions that communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from combining the NICE command language with SPAR computational modules.

  19. LoopX: A Graphical User Interface-Based Database for Comprehensive Analysis and Comparative Evaluation of Loops from Protein Structures.

    PubMed

    Kadumuri, Rajashekar Varma; Vadrevu, Ramakrishna

    2017-10-01

    Due to their crucial role in function, folding, and stability, protein loops are being targeted for grafting/designing to create novel or alter existing functionality and improve stability and foldability. With a view to facilitating thorough analysis and effective search options for extracting and comparing loops for sequence and structural compatibility, we developed LoopX, a comprehensively compiled library of sequence and conformational features of ∼700,000 loops from protein structures. The database, equipped with a graphical user interface, is empowered with diverse query tools and search algorithms, with various rendering options to visualize the sequence- and structural-level information along with hydrogen bonding patterns and backbone φ, ψ dihedral angles of both the target and candidate loops. Two new features, (i) conservation of the polar/nonpolar environment and (ii) conservation of sequence and conformation of specific residues within the loops, have also been incorporated in the search and retrieval of compatible loops for a chosen target loop. Thus, the LoopX server not only serves as a database and visualization tool for sequence and structural analysis of protein loops but also aids in extracting and comparing candidate loops for a given target loop based on user-defined search options.

  20. PDBStat: a universal restraint converter and restraint analysis software package for protein NMR.

    PubMed

    Tejero, Roberto; Snyder, David; Mao, Binchen; Aramini, James M; Montelione, Gaetano T

    2013-08-01

    The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.

  2. Characterization of technical surfaces by structure function analysis

    NASA Astrophysics Data System (ADS)

    Kalms, Michael; Kreis, Thomas; Bergmann, Ralf B.

    2018-03-01

    The structure function is a tool for characterizing technical surfaces that exhibits a number of advantages over Fourier-based analysis methods. It is thus optimally suited for analyzing the height distributions of surfaces measured by full-field non-contacting methods, and is a useful method for extracting global or local criteria such as periodicities, waviness, lay, or roughness to analyze and evaluate technical surfaces. After defining the line and area structure functions and presenting effective procedures for their calculation, this paper gives examples using simulated and measured data of technical surfaces, including aircraft parts.
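
As a concrete illustration, the line structure function S(τ) = ⟨[z(x+τ) − z(x)]²⟩ of a sampled height profile can be computed directly. This is a generic numpy sketch under the standard definition; the paper's own calculation procedures may differ.

```python
import numpy as np

def structure_function(z, dx=1.0):
    """Line structure function S(tau) = <[z(x+tau) - z(x)]^2>.

    z: 1-D array of surface heights sampled at spacing dx.
    Returns (lags, S) for lags from dx up to half the profile length.
    """
    n = len(z)
    lags = np.arange(1, n // 2 + 1)
    S = np.array([np.mean((z[k:] - z[:-k]) ** 2) for k in lags])
    return lags * dx, S

# a sinusoidal surface: S(tau) follows 2*sin(tau/2)^2, dipping back
# toward zero at lags equal to the surface period
x = np.linspace(0, 10 * np.pi, 1000)
taus, S = structure_function(np.sin(x), dx=x[1] - x[0])
```

For a periodic profile like this, the dip in S(τ) at the period is exactly the kind of global criterion (periodicity) the abstract mentions.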

  3. Implementation of an ADME enabling selection and visualization tool for drug discovery.

    PubMed

    Stoner, Chad L; Gifford, Eric; Stankovic, Charles; Lepsy, Christopher S; Brodfuehrer, Joanne; Prasad, J V N Vara; Surendran, Narayanan

    2004-05-01

    The pharmaceutical industry has large investments in compound library enrichment, high throughput biological screening, and biopharmaceutical (ADME) screening. As the number of compounds submitted for in vitro ADME screens increases, data analysis, interpretation, and reporting will become rate limiting in providing ADME-structure-activity relationship information to guide the synthetic strategy for chemical series. To meet these challenges, a software tool was developed and implemented that enables scientists to explore in vitro and in silico ADME and chemistry data in a multidimensional framework. The present work integrates physicochemical and ADME data, encompassing results for Caco-2 permeability, human liver microsomal half-life, rat liver microsomal half-life, kinetic solubility, measured log P, rule of 5 descriptors (molecular weight, hydrogen bond acceptors, hydrogen bond donors, calculated log P), polar surface area, chemical stability, and CYP450 3A4 inhibition. To facilitate interpretation of this data, a semicustomized software solution using Spotfire was designed that allows for multidimensional data analysis and visualization. The solution also enables simultaneous viewing and export of chemical structures with the corresponding ADME properties, enabling a more facile analysis of ADME-structure-activity relationship. In vitro and in silico ADME data were generated for 358 compounds from a series of human immunodeficiency virus protease inhibitors, resulting in a data set of 5370 experimental values which were subsequently analyzed and visualized using the customized Spotfire application. Implementation of this analysis and visualization tool has accelerated the selection of molecules for further development based on optimum ADME characteristics, and provided medicinal chemistry with specific, data driven structural recommendations for improvements in the ADME profile. Copyright 2004 Wiley-Liss, Inc. 
and the American Pharmacists Association. J Pharm Sci 93: 1131-1141, 2004

  4. Shape optimization and CAD

    NASA Technical Reports Server (NTRS)

    Rasmussen, John

    1990-01-01

    Structural optimization has attracted attention since the days of Galileo. Olhoff and Taylor have produced an excellent overview of the classical research within this field. However, the interest in structural optimization has increased greatly during the last decade due to the advent of reliable general numerical analysis methods and the computer power necessary to use them efficiently. This has created the possibility of developing general numerical systems for shape optimization. Several authors, e.g., Esping; Braibant & Fleury; Bennet & Botkin; Botkin, Yang, and Bennet; and Stanton, have published practical and successful applications of general optimization systems. Ding and Homlein have produced extensive overviews of available systems. Furthermore, a number of commercial optimization systems based on well-established finite element codes have been introduced. Systems like ANSYS, IDEAS, OASIS, and NISAOPT are widely known examples. In parallel with this development, the technology of computer aided design (CAD) has gained a large influence on the design process of mechanical engineering. CAD technology has already lived through a rapid development driven by the drastically growing capabilities of digital computers. However, the systems of today are still considered to be only the first generation of a long line of computer integrated manufacturing (CIM) systems. The systems to come will offer an integrated environment for design, analysis, and fabrication of products of almost any character. Thus, the CAD system could be regarded as simply a database for geometrical information equipped with a number of tools to help the user in the design process. Among these tools are facilities for structural analysis and optimization as well as present standard CAD features like drawing, modeling, and visualization tools.
The state of the art in structural optimization is that a large number of mathematical and mechanical techniques are available for the solution of individual problems. By implementing collections of the available techniques in general software systems, operational environments for structural optimization have been created. The forthcoming years must bring solutions to the problem of integrating such systems into more general design environments. The result of this work should be CAD systems for rational design in which structural optimization is one important design tool among many others.

  5. Some uses of wavelets for imaging dynamic processes in live cochlear structures

    NASA Astrophysics Data System (ADS)

    Boutet de Monvel, J.

    2007-09-01

    A variety of image and signal processing algorithms based on wavelet filtering tools have been developed during the last few decades, that are well adapted to the experimental variability typically encountered in live biological microscopy. A number of processing tools are reviewed, that use wavelets for adaptive image restoration and for motion or brightness variation analysis by optical flow computation. The usefulness of these tools for biological imaging is illustrated in the context of the restoration of images of the inner ear and the analysis of cochlear motion patterns in two and three dimensions. I also report on recent work that aims at capturing fluorescence intensity changes associated with vesicle dynamics at synaptic zones of sensory hair cells. This latest application requires one to separate the intensity variations associated with the physiological process under study from the variations caused by motion of the observed structures. A wavelet optical flow algorithm for doing this is presented, and its effectiveness is demonstrated on artificial and experimental image sequences.
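
The paper's own wavelet restoration and optical-flow algorithms are not reproduced here. Purely as a minimal, self-contained illustration of the kind of wavelet filtering the abstract refers to, the following sketches Haar-wavelet soft-thresholding of a 1-D signal; the function names, the number of levels, and the threshold are illustrative assumptions.

```python
import numpy as np

def haar_step(s):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    a = (s[0::2] + s[1::2]) / np.sqrt(2)
    d = (s[0::2] - s[1::2]) / np.sqrt(2)
    return a, d

def haar_inverse(a, d):
    """Invert one Haar level, interleaving the reconstructed samples."""
    s = np.empty(2 * len(a))
    s[0::2] = (a + d) / np.sqrt(2)
    s[1::2] = (a - d) / np.sqrt(2)
    return s

def denoise(signal, levels=3, thresh=0.5):
    """Soft-threshold the detail coefficients, keep the approximation."""
    a, details = np.asarray(signal, dtype=float), []
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0))
    for d in reversed(details):
        a = haar_inverse(a, d)
    return a
```

With `thresh=0.0` the transform reconstructs the input exactly; with a positive threshold, small detail coefficients (typically noise) are suppressed while large-scale structure is kept, which is the basic mechanism behind the adaptive restoration tools the abstract surveys.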

  6. Mutagenicity in a Molecule: Identification of Core Structural Features of Mutagenicity Using a Scaffold Analysis

    PubMed Central

    Hsu, Kuo-Hsiang; Su, Bo-Han; Tu, Yi-Shu; Lin, Olivia A.; Tseng, Yufeng J.

    2016-01-01

    With advances in the development and application of Ames mutagenicity in silico prediction tools, the International Conference on Harmonisation (ICH) has amended its M7 guideline to reflect the use of such prediction models for the detection of mutagenic activity in early drug safety evaluation processes. Since current Ames mutagenicity prediction tools only focus on functional group alerts or side chain modifications of an analog series, these tools are unable to identify mutagenicity derived from core structures or specific scaffolds of a compound. In this study, a large collection of 6512 compounds are used to perform scaffold tree analysis. By relating different scaffolds on constructed scaffold trees with Ames mutagenicity, four major and one minor novel mutagenic groups of scaffold are identified. The recognized mutagenic groups of scaffold can serve as a guide for medicinal chemists to prevent the development of potentially mutagenic therapeutic agents in early drug design or development phases, by modifying the core structures of mutagenic compounds to form non-mutagenic compounds. In addition, five series of substructures are provided as recommendations, for direct modification of potentially mutagenic scaffolds to decrease associated mutagenic activities. PMID:26863515

  7. Research on criticality analysis method of CNC machine tools components under fault rate correlation

    NASA Astrophysics Data System (ADS)

    Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han

    2018-02-01

    In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relations are then organized hierarchically using the interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the Pagerank algorithm is used to determine the relative influence values; combined with the component fault rates under time correlation, a comprehensive fault rate can be obtained. Based on fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified, providing a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
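
The abstract names the Pagerank algorithm for computing relative influence values from the fault adjacency matrix. The sketch below shows one way such a computation could look; the adjacency matrix, component count, and damping factor are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def pagerank(A, damping=0.85, tol=1e-10):
    """Relative influence via power iteration on a fault adjacency
    matrix A, where A[i, j] = 1 means a fault in component i
    propagates to component j."""
    n = A.shape[0]
    out = A.sum(axis=1, keepdims=True)
    # row-stochastic transition matrix; rows with no outgoing
    # influence (dangling components) spread uniformly
    P = np.where(out > 0, A / np.maximum(out, 1), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - damping) / n + damping * (P.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# hypothetical 4-component fault graph: faults in components 0 and 1
# both propagate to component 3; component 0 also affects component 1
A = np.array([[0, 1, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0],
              [0, 0, 0, 0]], dtype=float)
rank = pagerank(A)
```

Here the component that accumulates the most propagated fault influence (component 3) receives the highest rank, which is the quantity the method combines with fault rates to score criticality.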

  8. Structural Performance’s Optimally Analysing and Implementing Based on ANSYS Technology

    NASA Astrophysics Data System (ADS)

    Han, Na; Wang, Xuquan; Yue, Haifang; Sun, Jiandong; Wu, Yongchun

    2017-06-01

    Computer-aided Engineering (CAE) is a hotspot both in the academic field and in modern engineering practice. The Analysis System (ANSYS) simulation software has, owing to its excellent performance, become an outstanding member of the CAE family; it is committed to innovation in engineering simulation, helping users shorten the design process and improve product innovation and performance. Aiming to provide a structural performance optimization analysis model for engineering enterprises, this paper introduces CAE and its development, analyzes the necessity of structural optimization analysis as well as the framework of structural optimization analysis based on ANSYS technology, and uses ANSYS to carry out an optimization analysis of the structural performance of a reinforced concrete slab, displaying the displacement vector chart and the stress intensity chart. Finally, the paper compares the ANSYS simulation results with the measured results, showing that ANSYS is an indispensable engineering calculation tool.

  9. Structural Analysis of Competitive Forces in Higher Education Industry: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Sisaye, Seleshi

    This report describes how colleges and universities in the Not-for-Profit sector can bridge the strategic management research gap by applying competitive analysis in the strategic planning process. This business analysis tool can be used to assist colleges and universities, just as it assists businesses, in understanding the competitive forces…

  10. Structural Equation Model Trees

    ERIC Educational Resources Information Center

    Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman

    2013-01-01

    In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…

  11. 20180318 - Structure identification by Mass Spectrometry Non-Targeted Analysis using the US EPA’s CompTox Chemistry Dashboard (ACS Spring)

    EPA Science Inventory

    Identification of unknowns in mass spectrometry based non-targeted analyses (NTA) requires the integration of complementary pieces of data to arrive at a confident, consensus structure. Researchers use chemical reference databases, spectral matching, fragment prediction tools, r...

  12. Stress Analysis of Columns and Beam Columns by the Photoelastic Method

    NASA Technical Reports Server (NTRS)

    Ruffner, B F

    1946-01-01

    Principles of similarity and other factors in the design of models for photoelastic testing are discussed. Some approximate theoretical equations, useful in the analysis of results obtained from photoelastic tests are derived. Examples of the use of photoelastic techniques and the analysis of results as applied to uniform and tapered beam columns, circular rings, and statically indeterminate frames, are given. It is concluded that this method is an effective tool for the analysis of structures in which column action is present, particularly in tapered beam columns, and in statically indeterminate structures in which the distribution of loads in the structures is influenced by bending moments due to axial loads in one or more members.

  13. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  14. Factor Structure of the New Imaginary Audience Scale in a Sample of Female College Students

    ERIC Educational Resources Information Center

    Kuterbach, James M.

    2007-01-01

    The New Imaginary Audience Scale (NIAS; Lapsley, FitzGerald, Rice, & Jackson, 1989) has been used as a research tool with both high school and college aged samples, yet there is no structural validity evidence for its use with college students. This study examined the structural validity of the NIAS via an exploratory factor analysis, using a…

  15. Symposium II: Mechanochemistry in Materials Science, MRS Fall Meeting, Nov 30-Dec 4, 2009, Boston, MA

    DTIC Science & Technology

    2010-09-02

    Dynamic Mechanical Analysis (DMA). The fracture behavior of the mechanophore-linked polymer is also examined through the Double Cleavage Drilled ...multinary complex structures. Structural, microstructural, and chemical characterizations were explored by metrological tools to support this...simple hydrocarbons in order to quantitatively define structure-property relationships for reacting materials under shock compression. Embedded gauge

  16. Trajectory Control for Very Flexible Aircraft

    DTIC Science & Technology

    2006-10-30

    aircraft are coupled with the aeroelastic equations that govern the geometrically nonlinear structural response of the vehicle. A low -order strain...nonlinear structural formulation, the finite state aerodynamic model, and the nonlinear rigid body equations together provide a low -order complete...nonlinear aircraft analysis tool. Due to the inherent flexibility of the aircraft modeling, the low order structural fre- quencies are of the same order

  17. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support analysis of the HRECG signal in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.

  18. Intensive mothering ideology in France: A pilot study.

    PubMed

    Loyal, D; Sutter Dallay, A-L; Rascle, N

    2017-12-01

    The aim of this pilot study was to adapt the intensive mothering ideology concept to a French sample and to obtain an assessment tool. First, the Intensive Parenting Attitudes Questionnaire (IPAQ), a U.S. scale comprising 25 items, was translated and administered online to French mothers and mothers-to-be (n=250). Structural validity was tested through confirmatory factor analysis, with poor results. Second, to increase the cultural validity of a new tool, new items were derived from French women's speech. French mothers and mothers-to-be (n=22) were asked about their views regarding motherhood and childcare (semi-structured interviews). A thematic content analysis was performed with good inter-judge agreement (0.53-0.86), and 27 items were created. Finally, the total set of 52 items was administered online to French mothers and mothers-to-be (n=474). The structure was tested through exploratory factor analysis. A new tool called the Measure of Intensive Mothering Ideology (MIMI) was obtained. This 21-item scale with 6 dimensions (Essentialism, Consuming Fulfillment, Child-centrism, Challenge, Sacrifice and Stimulation) explains 59.75% of the variance. Internal consistencies were satisfactory (0.61-0.83) and most dimensions were positively and moderately correlated (0.17-0.38). The MIMI is the first French-language scale assessing intensive mothering ideology and offers interesting research avenues, notably regarding perinatal parental adaptation. Copyright © 2017 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  19. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen-bonding network combined with atomic-level energy minimization of the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through text or file input submission, email notification, and a provided example submission, and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. A guide to analysis and reconstruction of serial block face scanning electron microscopy data

    PubMed Central

    TAGGART, M.; RIND, F.C.; WHITE, K.

    2018-01-01

    Summary Serial block face scanning electron microscopy (SBF‐SEM) is a relatively new technique that allows the acquisition of serially sectioned, imaged and digitally aligned ultrastructural data. There is a wealth of information that can be obtained from the resulting image stacks but this presents a new challenge for researchers – how to computationally analyse and make best use of the large datasets produced. One approach is to reconstruct structures and features of interest in 3D. However, the software programmes can appear overwhelming, time‐consuming and not intuitive for those new to image analysis. There are a limited number of published articles that provide sufficient detail on how to do this type of reconstruction. Therefore, the aim of this paper is to provide a detailed step‐by‐step protocol, accompanied by tutorial videos, for several types of analysis programmes that can be used on raw SBF‐SEM data, although there are more options available than can be covered here. To showcase the programmes, datasets of skeletal muscle from foetal and adult guinea pigs are initially used with procedures subsequently applied to guinea pig cardiac tissue and locust brain. The tissue is processed using the heavy metal protocol developed specifically for SBF‐SEM. Trimmed resin blocks are placed into a Zeiss Sigma SEM incorporating the Gatan 3View and the resulting image stacks are analysed in three different programmes, Fiji, Amira and MIB, using a range of tools available for segmentation. The results from the image analysis comparison show that the analysis tools are often more suited to a particular type of structure. For example, larger structures, such as nuclei and cells, can be segmented using interpolation, which speeds up analysis; single contrast structures, such as the nucleolus, can be segmented using the contrast‐based thresholding tools. 
Knowing the nature of the tissue and its specific structures (complexity, contrast, if there are distinct membranes, size) will help to determine the best method for reconstruction and thus maximize informative output from valuable tissue. PMID:29333754

  1. A guide to analysis and reconstruction of serial block face scanning electron microscopy data.

    PubMed

    Cocks, E; Taggart, M; Rind, F C; White, K

    2018-05-01

    Serial block face scanning electron microscopy (SBF-SEM) is a relatively new technique that allows the acquisition of serially sectioned, imaged and digitally aligned ultrastructural data. There is a wealth of information that can be obtained from the resulting image stacks but this presents a new challenge for researchers - how to computationally analyse and make best use of the large datasets produced. One approach is to reconstruct structures and features of interest in 3D. However, the software programmes can appear overwhelming, time-consuming and not intuitive for those new to image analysis. There are a limited number of published articles that provide sufficient detail on how to do this type of reconstruction. Therefore, the aim of this paper is to provide a detailed step-by-step protocol, accompanied by tutorial videos, for several types of analysis programmes that can be used on raw SBF-SEM data, although there are more options available than can be covered here. To showcase the programmes, datasets of skeletal muscle from foetal and adult guinea pigs are initially used with procedures subsequently applied to guinea pig cardiac tissue and locust brain. The tissue is processed using the heavy metal protocol developed specifically for SBF-SEM. Trimmed resin blocks are placed into a Zeiss Sigma SEM incorporating the Gatan 3View and the resulting image stacks are analysed in three different programmes, Fiji, Amira and MIB, using a range of tools available for segmentation. The results from the image analysis comparison show that the analysis tools are often more suited to a particular type of structure. For example, larger structures, such as nuclei and cells, can be segmented using interpolation, which speeds up analysis; single contrast structures, such as the nucleolus, can be segmented using the contrast-based thresholding tools. 
Knowing the nature of the tissue and its specific structures (complexity, contrast, if there are distinct membranes, size) will help to determine the best method for reconstruction and thus maximize informative output from valuable tissue. © 2018 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.

  2. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  3. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud.

    PubMed

    Duvick, Jon; Standage, Daniel S; Merchant, Nirav; Brendel, Volker P

    2016-04-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today's pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant's Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. © 2016 American Society of Plant Biologists. All rights reserved.

  4. Experimental strain modal analysis for beam-like structure by using distributed fiber optics and its damage detection

    NASA Astrophysics Data System (ADS)

    Cheng, Liangliang; Busca, Giorgio; Cigada, Alfredo

    2017-07-01

    Modal analysis is commonly considered as an effective tool to obtain the intrinsic characteristics of structures including natural frequencies, modal damping ratios, and mode shapes, which are significant indicators for monitoring the health status of engineering structures. The complex mode indicator function (CMIF) can be regarded as an effective numerical tool to perform modal analysis. In this paper, experimental strain modal analysis based on the CMIF has been introduced. Moreover, a distributed fiber-optic sensor, as a dense measuring device, has been applied to acquire strain data along a beam surface. Thanks to the dense spatial resolution of the distributed fiber optics, more detailed mode shapes could be obtained. In order to test the effectiveness of the method, a mass lump—considered as a linear damage component—has been attached to the surface of the beam, and damage detection based on strain mode shape has been carried out. The results manifest that strain modal parameters can be estimated effectively by utilizing the CMIF based on the corresponding simulations and experiments. Furthermore, damage detection based on strain mode shapes benefits from the accuracy of strain mode shape recognition and the excellent performance of the distributed fiber optics.
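    The CMIF computation mentioned above amounts to taking the singular values of the frequency response function (FRF) matrix at every spectral line; peaks in the leading singular-value curve indicate candidate modes. A minimal sketch on synthetic data follows: the natural frequencies, damping ratio, and mode shapes are all invented for illustration and are unrelated to the paper's experiment.

```python
import numpy as np

# Synthesise a 3-output x 2-input FRF matrix H(f) from three SDOF
# modal contributions, then compute the CMIF as the singular values
# of H at each frequency line.
freqs = np.linspace(1.0, 100.0, 2000)           # Hz
modes = [20.0, 45.0, 80.0]                      # assumed natural frequencies
zeta = 0.01                                     # assumed modal damping ratio

rng = np.random.default_rng(0)
phi = rng.standard_normal((3, len(modes)))      # output mode shapes (invented)
L = rng.standard_normal((2, len(modes)))        # input participation (invented)
H = np.zeros((len(freqs), 3, 2), dtype=complex)
for k, fn in enumerate(modes):
    wn, w = 2 * np.pi * fn, 2 * np.pi * freqs
    g = 1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)   # SDOF FRF kernel
    H += g[:, None, None] * np.outer(phi[:, k], L[:, k])

# CMIF: singular values of H at every frequency line.
cmif = np.linalg.svd(H, compute_uv=False)        # shape (n_freq, 2)
peak_freq = freqs[np.argmax(cmif[:, 0])]         # dominant resonance
```

    In practice the FRF matrix comes from measured input/output spectra (here, strain responses from the distributed fiber optics), and mode shapes are recovered from the corresponding left singular vectors.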

  5. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification, and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using a modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
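    The majority-voting rollup described above can be sketched as follows. This is an illustrative reconstruction, not the ATLAS implementation: a contig's taxonomy is walked rank by rank over the lineages of its ORFs and truncated at the lowest rank where the vote falls below a threshold.

```python
from collections import Counter

def contig_taxonomy(orf_lineages, threshold=0.5):
    """Majority-vote consensus lineage for a contig.

    orf_lineages: list of per-ORF lineages, each a tuple such as
    ('Bacteria', 'Proteobacteria', 'Gammaproteobacteria', ...).
    """
    consensus = []
    depth = max(len(l) for l in orf_lineages)
    for rank in range(depth):
        votes = Counter(l[rank] for l in orf_lineages if len(l) > rank)
        taxon, count = votes.most_common(1)[0]
        if count / len(orf_lineages) >= threshold:
            consensus.append(taxon)
        else:
            break          # disagreement: stop at the lowest agreed rank
    return tuple(consensus)

# Invented example: three ORFs on one contig.
orfs = [
    ('Bacteria', 'Proteobacteria', 'Gammaproteobacteria'),
    ('Bacteria', 'Proteobacteria', 'Alphaproteobacteria'),
    ('Bacteria', 'Firmicutes'),
]
print(contig_taxonomy(orfs))  # ('Bacteria', 'Proteobacteria')
```

    The vote threshold and the treatment of unclassified ORFs are assumptions here; ATLAS's modified LCA analysis has its own rules.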

  6. Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows

    PubMed Central

    Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi

    2016-01-01

    Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788

  7. Using Velocity Anisotropy to Analyze Magnetohydrodynamic Turbulence in Giant Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Madrid, Alecio; Hernandez, Audra

    2018-01-01

    Structure function (SF) analysis is a strong tool for gauging the Alfvénic properties of magnetohydrodynamic (MHD) simulations, yet there is a lack of literature rigorously investigating its limitations in the context of radio spectroscopy. This study takes an in-depth approach to studying the limitations of SF analysis for analyzing MHD turbulence in giant molecular cloud (GMC) spectroscopy data. MHD turbulence plays a critical role in the structure and evolution of GMCs as well as in the formation of sub-structures known to spawn stellar progenitors. Existing methods of detection are neither economical nor robust (e.g. dust polarization), and nowhere is this clearer than in the theoretical-observational divide in the current literature. A significant limitation of GMC spectroscopy results from the large variation in methods used for extracting GMCs from survey data. Thus, a robust method for studying MHD turbulence must correctly gauge physical properties regardless of the data extraction method used. While SF analysis has demonstrated strong potential across a range of simulated conditions, this study finds significant concerns regarding its feasibility as a robust tool in GMC spectroscopy.

  8. Dynamic analysis of space structures including elastic, multibody, and control behavior

    NASA Technical Reports Server (NTRS)

    Pinson, Larry; Soosaar, Keto

    1989-01-01

    The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.

  9. NESSUS/EXPERT - An expert system for probabilistic structural analysis methods

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Palmer, K.; Fink, P.

    1988-01-01

    An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.

  10. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    The prototype implementation of an expert system was developed to assist the user in the computer aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate the successful use in full scale scientific applications.

  11. Infrared spectroscopy as a tool to characterise starch ordered structure--a joint FTIR-ATR, NMR, XRD and DSC study.

    PubMed

    Warren, Frederick J; Gidley, Michael J; Flanagan, Bernadine M

    2016-03-30

    Starch has a heterogeneous, semi-crystalline granular structure, and the degree of ordered structure can affect its behaviour in foods and bioplastics. A range of methodologies are employed to study starch structure: differential scanning calorimetry, (13)C nuclear magnetic resonance, X-ray diffraction and Fourier transform infrared spectroscopy (FTIR). Despite the appeal of FTIR as a rapid, non-destructive methodology, there is currently no systematically defined quantitative relationship between FTIR spectral features and other starch structural measures. Here, we subject 61 starch samples to structural analysis, and systematically correlate FTIR spectra with other measures of starch structure. A hydration-dependent peak position shift in the FTIR spectra of starch is observed, resulting from increased molecular order, but with complex, non-linear behaviour. We demonstrate that FTIR is a tool that can quantitatively probe short-range interactions in starch structure. However, the assumptions of linear relationships between starch ordered structure and peak ratios are overly simplistic. Copyright © 2015 Elsevier Ltd. All rights reserved.
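    The peak-ratio convention the authors examine is commonly computed as the absorbance ratio of the ~1047 cm⁻¹ (ordered) to ~1022 cm⁻¹ (amorphous) bands. A minimal sketch on a synthetic spectrum follows; the band positions, widths, and intensities are illustrative only, and, as the paper argues, treating such a ratio as linearly related to molecular order is an oversimplification.

```python
import numpy as np

def order_ratio(wavenumbers, absorbance, ordered=1047.0, amorphous=1022.0):
    """Absorbance ratio A(1047)/A(1022), a common short-range order index."""
    a_ord = np.interp(ordered, wavenumbers, absorbance)
    a_amo = np.interp(amorphous, wavenumbers, absorbance)
    return a_ord / a_amo

# Synthetic two-band spectrum (Gaussian bands; parameters invented).
wn = np.linspace(950, 1100, 301)                   # cm^-1, ascending
spec = (0.8 * np.exp(-((wn - 1047) / 12) ** 2)     # "ordered" band
        + 1.0 * np.exp(-((wn - 1022) / 15) ** 2))  # "amorphous" band
ratio = order_ratio(wn, spec)
```

    Real workflows deconvolve overlapping bands or baseline-correct before taking the ratio; this sketch skips both.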

  12. FDTD simulation tools for UWB antenna analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
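    The leapfrog time-stepping at the heart of FDTD can be illustrated with a minimal one-dimensional Cartesian example in normalized units; the paper itself derives spherical-coordinate update equations for the conical antenna, which are not reproduced here. Grid size, Courant number, and the Gaussian source are arbitrary illustrative choices.

```python
import numpy as np

# Minimal 1D FDTD (Yee leapfrog) sketch in normalized units.
nz, nt = 200, 400
ez = np.zeros(nz)          # electric field on integer grid points
hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
c = 0.5                    # Courant number (<= 1 for stability)

peak = 0.0
for t in range(nt):
    hy += c * (ez[1:] - ez[:-1])               # H update (half time step)
    ez[1:-1] += c * (hy[1:] - hy[:-1])         # E update (endpoints act as PEC)
    ez[100] += np.exp(-((t - 30) / 10) ** 2)   # soft Gaussian source at mid-grid
    peak = max(peak, float(np.abs(ez).max()))
```

    A production simulation would add absorbing boundaries and, for the pulsed-excitation comparison in the paper, record time histories for Fourier analysis.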

  13. PANTHER version 11: expanded annotation data from Gene Ontology and Reactome pathways, and data analysis tool enhancements.

    PubMed

    Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D

    2017-01-04

    The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
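    Gene-list enrichment of the kind described above is typically an over-representation test. A minimal sketch using the hypergeometric upper tail follows; the gene counts are invented, and this is not PANTHER's actual implementation (the tool also offers other statistics and multiple-testing correction).

```python
from math import comb

def enrichment_p(total_genes, category_genes, list_size, list_hits):
    """P(X >= list_hits) when drawing list_size genes without replacement
    from total_genes, of which category_genes carry the annotation."""
    return sum(
        comb(category_genes, k) * comb(total_genes - category_genes, list_size - k)
        for k in range(list_hits, min(list_size, category_genes) + 1)
    ) / comb(total_genes, list_size)

# Invented numbers: 5 of 100 uploaded genes fall in a category that
# annotates 200 of 20,000 genes (expected ~1 by chance).
p = enrichment_p(total_genes=20000, category_genes=200,
                 list_size=100, list_hits=5)
```

    A hierarchical view like PANTHER's would additionally roll results up the ontology graph rather than testing each category independently.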

  14. Eros in Stereo

    NASA Image and Video Library

    2000-05-07

    Stereo imaging, an important tool aboard NASA's NEAR Shoemaker spacecraft for geologic analysis of Eros, provides three-dimensional information on the asteroid's landforms and structures. 3D glasses are necessary to view this image.

  15. Low-cost composite blades for the Mod-0A wind turbines

    NASA Technical Reports Server (NTRS)

    Weingart, O.

    1982-01-01

    Low cost approaches to the design and fabrication of blades for a two-bladed 200 kW wind turbine were identified and the applicability of the techniques to larger and smaller blades was assessed. Blade tooling was designed and fabricated. Two complete blades and a partial blade for tool tryout were built. The patented TFT process was used to wind the entire blade. This process allows rapid winding of an axially oriented composite onto a tapered mandrel, with tapered wall thickness. The blade consists of a TFT glass-epoxy airfoil structure filament wound onto a steel root end fitting. The fitting is, in turn, bolted to a conical steel adapter section to provide for mounting attachment to the hub. Structural analysis, blade properties, and cost and weight analysis are described.

  16. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr.; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex, high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of today's early-career engineers are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient at building complex 3D models of complicated aerospace components. However, current trends show a blind acceptance of finite element analysis results. This paper aims to raise awareness of this situation. Examples of commonly encountered problems are presented. To counter these trends, guidelines and suggestions for analysts, senior engineers, and educators are offered.

  17. Replication Analysis in Exploratory Factor Analysis: What It Is and Why It Makes Your Analysis Better

    ERIC Educational Resources Information Center

    Osborne, Jason W.; Fitzpatrick, David C.

    2012-01-01

    Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome.…

  18. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Astrophysics Data System (ADS)

    Wray, Richard B.

    1991-12-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis yielding a static avionics model was developed, operating concepts for simulating the requirements were assembled, and a dynamic analysis of the execution needs for operating the dynamic model was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  19. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships were defined, a static analysis yielding a static avionics model was developed, operating concepts for simulating the requirements were assembled, and a dynamic analysis of the execution needs for operating the dynamic model was planned. The systems engineering approach was used to perform a top-down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  20. SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology

    NASA Astrophysics Data System (ADS)

    Abdelwahed, Mohamed F.

    2012-03-01

    Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved as SAC, ASCII, or PS (PostScript) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-squares fitting, auto-picking, fast Fourier transform (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.
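    Auto-picking of arrivals, one of the tools listed above, is classically implemented as a short-term/long-term average (STA/LTA) ratio. The sketch below is not SGRAPH's code; the window lengths and the synthetic trace are invented for illustration.

```python
import random

def sta_lta(signal, n_sta, n_lta):
    """Classic STA/LTA onset detector: ratio of a short-term to a long-term
    average of absolute amplitude. Returns zeros until n_lta samples exist."""
    ratio = [0.0] * len(signal)
    for i in range(n_lta, len(signal)):
        sta = sum(abs(x) for x in signal[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in signal[i - n_lta:i]) / n_lta
        ratio[i] = sta / lta if lta > 0 else 0.0
    return ratio

# Synthetic trace: low-amplitude noise, then a strong arrival at sample 200.
random.seed(1)
trace = [random.gauss(0, 0.1) for _ in range(200)] + \
        [random.gauss(0, 1.0) for _ in range(100)]
r = sta_lta(trace, n_sta=10, n_lta=100)
onset = max(range(len(r)), key=r.__getitem__)  # index where the ratio peaks
```

In practice a fixed trigger threshold on the ratio (rather than its maximum) is used, so picks arrive in real time as the trace streams in.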

  1. Structural Analysis Methods for Structural Health Management of Future Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander

    2007-01-01

    Two finite element based computational methods, Smoothing Element Analysis (SEA) and the inverse Finite Element Method (iFEM), are reviewed, and examples of their use for structural health monitoring are discussed. Due to their versatility, robustness, and computational efficiency, the methods are well suited for real-time structural health monitoring of future space vehicles, large space structures, and habitats. The methods may be effectively employed to enable real-time processing of sensing information, specifically for identifying three-dimensional deformed structural shapes as well as the internal loads. In addition, they may be used in conjunction with evolutionary algorithms to design optimally distributed sensors. These computational tools have demonstrated substantial promise for utilization in future Structural Health Management (SHM) systems.

  2. Inter-subject phase synchronization for exploratory analysis of task-fMRI.

    PubMed

    Bolt, Taylor; Nomi, Jason S; Vij, Shruti G; Chang, Catie; Uddin, Lucina Q

    2018-08-01

    Analysis of task-based fMRI data is conventionally carried out using a hypothesis-driven approach, where blood-oxygen-level dependent (BOLD) time courses are correlated with a hypothesized temporal structure. In some experimental designs, this temporal structure can be difficult to define. In other cases, experimenters may wish to take a more exploratory, data-driven approach to detecting task-driven BOLD activity. In this study, we demonstrate the efficiency and power of an inter-subject synchronization approach for exploratory analysis of task-based fMRI data. Combining the tools of instantaneous phase synchronization and independent component analysis, we characterize whole-brain task-driven responses in terms of group-wise similarity in temporal signal dynamics of brain networks. We applied this framework to fMRI data collected during performance of a simple motor task and a social cognitive task. Analyses using an inter-subject phase synchronization approach revealed a large number of brain networks that dynamically synchronized to various features of the task, often not predicted by the hypothesized temporal structure of the task. We suggest that this methodological framework, along with readily available tools in the fMRI community, provides a powerful exploratory, data-driven approach for analysis of task-driven BOLD activity. Copyright © 2018 Elsevier Inc. All rights reserved.
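    Instantaneous phase synchronization between two time series is commonly summarized by the phase-locking value (PLV). The sketch below illustrates only that statistic; the study's full pipeline (analytic-signal phase estimation and independent component analysis) is omitted, and the toy phase series are invented.

```python
import cmath
import math
import random

def phase_locking_value(phases_a, phases_b):
    """|mean of exp(i*(phi_a - phi_b))| over time.
    1.0 = perfectly consistent phase difference; near 0 = no relation."""
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(s) / len(phases_a)

# Two "subjects" whose instantaneous phases differ by a constant offset
t = [0.01 * k for k in range(1000)]
phi1 = [2 * math.pi * 1.0 * tk for tk in t]        # 1 Hz phase ramp
phi2 = [p + 0.3 for p in phi1]                      # constant lag: locked
random.seed(0)
phi3 = [random.uniform(0, 2 * math.pi) for _ in t]  # unrelated phases

locked = phase_locking_value(phi1, phi2)    # ~1.0 (constant phase difference)
unlocked = phase_locking_value(phi1, phi3)  # near 0
```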

  3. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    PubMed

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among them ISO standards. Although there is evidence of the contributions these standards provide, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, grounded in a structuralist worldview and a descriptive, deductive method, that aims to analyze the convergence of the management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) with sustainable development and positioning them according to organizational levels (strategic, tactical and operational). The structure was designed based on the Brundtland report concept. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results indicate that the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.

  4. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber-optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier transform (FT) spectrometer has been used in the range from 2.5 to 12 micrometers. This technique serves as an ideal diagnostic tool for surface analysis of numerous and diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy. The FOVS technique is nondestructive, noninvasive, fast (15 s), and capable of operating in a remote-sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber-optic spectroscopy operating with optical fibers have been suggested as powerful new tools. These are highly sensitive techniques for structural studies in materials research and for process analysis to determine molecular composition, chemical bonds, and molecular conformations. They could also be developed into a new tool for quality control of numerous materials, as well as for noninvasive biopsy.

  5. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
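    Ripley's K-function for a 3D point pattern can be estimated naively as below. This is a border-uncorrected, O(n²) sketch on synthetic data, not RipleyGUI's MATLAB implementation, which applies proper edge corrections and statistical tests.

```python
import math
import random

def ripley_k_3d(points, r, volume):
    """Naive, border-uncorrected 3D Ripley K estimate:
    K(r) = V / (n*(n-1)) * (number of ordered pairs closer than r)."""
    n = len(points)
    pairs = 0
    for i in range(n):
        for j in range(n):
            if i != j and math.dist(points[i], points[j]) <= r:
                pairs += 1
    return volume * pairs / (n * (n - 1))

# Complete spatial randomness in a unit cube: K(r) should be near (4/3)*pi*r^3
random.seed(42)
pts = [(random.random(), random.random(), random.random()) for _ in range(500)]
k = ripley_k_3d(pts, r=0.1, volume=1.0)
csr = 4.0 / 3.0 * math.pi * 0.1 ** 3  # theoretical K under randomness
```

Clustered cell distributions push K(r) above the csr reference curve, while regular (inhibited) patterns fall below it; the uncorrected estimator here biases K slightly low near the cube faces.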

  6. Turbine Engine Hot Section Technology, 1987

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Presentations were made concerning the development of design analysis tools for combustor liners, turbine vanes, and turbine blades. Presentations were organized into sections covering instrumentation; combustion; turbine heat transfer; structural analysis; fatigue and fracture; surface protective coatings; constitutive behavior of materials; and stress-strain response and life prediction methods.

  7. Comparative Reliability of Structured Versus Unstructured Interviews in the Admission Process of a Residency Program

    PubMed Central

    Blouin, Danielle; Day, Andrew G.; Pavlov, Andrey

    2011-01-01

    Background Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. Methods In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Results Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. Conclusions A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program. PMID:23205201
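    The interitem reliability discussed above is typically quantified with Cronbach's alpha. A minimal stdlib sketch follows; the rating data are hypothetical, not the study's.

```python
def cronbach_alpha(items):
    """items: list of item-score lists, one list per item, same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical ratings: three interview items that largely track each other,
# as one would expect from a unidimensional (high-interitem) tool.
item1 = [4, 3, 5, 2, 4, 3, 5, 2]
item2 = [5, 3, 4, 2, 4, 3, 5, 1]
item3 = [4, 2, 5, 3, 4, 2, 5, 2]
alpha = cronbach_alpha([item1, item2, item3])
```

When items measure distinct dimensions, as in the structured tool above, the item variances no longer co-vary into the total score and alpha drops, which is exactly the interitem effect the study reports.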

  8. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    PubMed

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program.

  9. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.
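    A sampling-based PSA can be illustrated on a toy one-degree-of-freedom model: payload mass and adapter stiffness are sampled as random variables, and the sensitivity of the first frequency is summarized by correlation coefficients. The model, parameter ranges, and sample size below are invented for illustration; the paper uses a realistic integrated launch vehicle model.

```python
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy 1-DOF model: f = sqrt(k/m) / (2*pi), with payload mass m [kg]
# and adapter stiffness k [N/m] treated as uniform random variables.
random.seed(7)
masses = [random.uniform(800, 1200) for _ in range(2000)]
stiffs = [random.uniform(3.0e6, 5.0e6) for _ in range(2000)]
freqs = [math.sqrt(k / m) / (2 * math.pi) for k, m in zip(stiffs, masses)]

sens_mass = pearson(masses, freqs)   # negative: heavier payload lowers frequency
sens_stiff = pearson(stiffs, freqs)  # positive: stiffer adapter raises it
```

Such correlation-based measures give a global (GRPSA-style) ranking of which inputs drive the response; regional methods instead condition the samples on the response exceeding a bound.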

  10. Preliminary Development of an Object-Oriented Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.

  11. Organization and Visualization for Initial Analysis of Forced-Choice Ipsative Data

    ERIC Educational Resources Information Center

    Cochran, Jill A.

    2015-01-01

    Forced-choice ipsative data are common in personality, philosophy and other preference-based studies. However, this type of data inherently contains dependencies that are challenging for usual statistical analysis. In order to utilize the structure of the data as a guide for analysis rather than as a challenge to manage, a visualisation tool was…

  12. Null Models for Everyone: A Two-Step Approach to Teaching Null Model Analysis of Biological Community Structure

    ERIC Educational Resources Information Center

    McCabe, Declan J.; Knight, Evelyn J.

    2016-01-01

    Since being introduced by Connor and Simberloff in response to Diamond's assembly rules, null model analysis has been a controversial tool in community ecology. Despite being commonly used in the primary literature, null model analysis has not featured prominently in general textbooks. Complexity of approaches along with difficulty in interpreting…
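    One widely used null-model statistic for co-occurrence analysis is Stone and Roberts' C-score, compared against a distribution from randomized matrices. The sketch below uses toy presence/absence data and an equiprobable row-shuffle null, which is just one of several null algorithms in the literature.

```python
import random
from itertools import combinations

def c_score(matrix):
    """Mean checkerboard units over all species (row) pairs:
    CU = (Ri - S)(Rj - S), where S is the number of shared sites."""
    pairs = list(combinations(matrix, 2))
    total = 0.0
    for a, b in pairs:
        shared = sum(1 for x, y in zip(a, b) if x and y)
        total += (sum(a) - shared) * (sum(b) - shared)
    return total / len(pairs)

# Presence/absence of 3 species across 6 sites (toy data)
obs = [[1, 1, 0, 0, 1, 0],
       [0, 0, 1, 1, 0, 1],
       [1, 0, 1, 0, 1, 0]]
observed = c_score(obs)

# Null distribution: shuffle site occupancy within each species (keeps row totals)
random.seed(3)
null = [c_score([random.sample(row, len(row)) for row in obs])
        for _ in range(999)]
p_high = (1 + sum(1 for s in null if s >= observed)) / (1 + len(null))
```

An observed C-score in the upper tail of the null distribution (small `p_high`) suggests more segregation among species than randomness alone would produce.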

  13. Expert Approaches to Analysis

    DTIC Science & Technology

    1999-03-01

    … of epistemic forms and games, which can form the basis for building a tool to support expert analyses. Subject terms: expert analysis, epistemic forms, epistemic games. Principal Investigators: Allan Collins & William Ferguson, BBN Technologies.

  14. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  15. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial and in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and completely integrated into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as optimization problems, have been successfully integrated into the MDAO tool. Better synchronization across all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  16. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.

    PubMed

    McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S

    2015-10-20

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
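    One of the "common order parameters" mentioned above is the radius of gyration. The stdlib-only sketch below computes it from raw coordinates as an independent illustration with made-up coordinates; MDTraj itself exposes such calculations through its own API.

```python
import math

def radius_of_gyration(coords):
    """Mass-unweighted radius of gyration of a set of 3D coordinates:
    sqrt(mean squared distance of points from their centroid)."""
    n = len(coords)
    cx = sum(c[0] for c in coords) / n
    cy = sum(c[1] for c in coords) / n
    cz = sum(c[2] for c in coords) / n
    msd = sum((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
              for x, y, z in coords) / n
    return math.sqrt(msd)

# One toy "frame" of four atoms (arbitrary units)
frame = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
rg = radius_of_gyration(frame)  # exactly 0.75 for this geometry
```

Tracking this quantity frame by frame across a trajectory gives a compact signal of folding or compaction, which is the kind of per-frame analysis a trajectory library vectorizes over thousands of frames.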

  17. MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories

    PubMed Central

    McGibbon, Robert T.; Beauchamp, Kyle A.; Harrigan, Matthew P.; Klein, Christoph; Swails, Jason M.; Hernández, Carlos X.; Schwantes, Christian R.; Wang, Lee-Ping; Lane, Thomas J.; Pande, Vijay S.

    2015-01-01

    As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python. PMID:26488642

  18. Structural analyses for the modification and verification of the Viking aeroshell

    NASA Technical Reports Server (NTRS)

    Stephens, W. B.; Anderson, M. S.

    1976-01-01

    The Viking aeroshell is an extremely lightweight flexible shell structure that has undergone thorough buckling analyses in the course of its development. The analytical tools and modeling technique required to reveal the structural behavior are presented. Significant results are given which illustrate the complex failure modes not usually observed in simple models and analyses. Both shell-of-revolution analysis for the pressure loads and thermal loads during entry and a general shell analysis for concentrated tank loads during launch were used. In many cases fixes or alterations to the structure were required, and the role of the analytical results in determining these modifications is indicated.

  19. Soak Up the Rain New England Webinar Series: National ...

    EPA Pesticide Factsheets

    Presenters will provide an introduction to the most recent EPA green infrastructure tools for R1 stakeholders and their use in making decisions about implementing green infrastructure. We will discuss structuring your green infrastructure decision, finding appropriate information and tools, evaluating options, and selecting the right mix of Best Management Practices for your needs. WMOST (Watershed Management Optimization Support Tool): for screening a wide range of practices for cost-effectiveness in achieving watershed or water utility management goals. GIWiz (Green Infrastructure Wizard): a web application connecting communities to EPA green infrastructure tools and resources. Opti-Tool: designed to assist in developing technically sound, optimized, cost-effective stormwater management plans. National Stormwater Calculator: a desktop application for estimating the impact of land cover change and green infrastructure controls on stormwater runoff. DASEES-GI (Decision Analysis for a Sustainable Environment, Economy, and Society): a framework for linking objectives and measures with green infrastructure methods.

  20. Exploring Rating Quality in Rater-Mediated Assessments Using Mokken Scale Analysis

    ERIC Educational Resources Information Center

    Wind, Stefanie A.; Engelhard, George, Jr.

    2016-01-01

    Mokken scale analysis is a probabilistic nonparametric approach that offers statistical and graphical tools for evaluating the quality of social science measurement without placing potentially inappropriate restrictions on the structure of a data set. In particular, Mokken scaling provides a useful method for evaluating important measurement…

  1. Bridging the gap between individual-level risk for HIV and structural determinants: using root cause analysis in strategic planning.

    PubMed

    Willard, Nancy; Chutuape, Kate; Stines, Stephanie; Ellen, Jonathan M

    2012-01-01

    HIV prevention efforts have expanded beyond individual-level interventions to address structural determinants of risk. Coalitions have been an important vehicle for addressing similarly intractable and deeply rooted health-related issues. A root cause analysis process may aid coalitions in identifying fundamental, structural-level contributors to risk and appropriate solutions. For this article, the strategic plans of 13 coalitions were analyzed before and after a root cause analysis approach was applied, to determine the plans' potential impact and comprehensiveness. After root cause analysis, strategic plans trended toward targeting policies and practices rather than single-agency programmatic changes. Plans expanded to target multiple sectors, and several changes within each sector, so as to penetrate deeply into a sector or system. Findings suggest that root cause analysis may be a viable tool to assist coalitions in identifying structural determinants and possible solutions for HIV risk.

  2. Advanced composites structural concepts and materials technologies for primary aircraft structures. Structural response and failure analysis: ISPAN modules users manual

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Huang, Jui-Ten; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    The ISPAN program (Interactive Stiffened Panel Analysis) is an interactive design tool intended to provide a means of performing simple, self-contained preliminary analysis of aircraft primary structures made of composite materials. The program combines a series of modules with the finite element code DIAL as its backbone. Four ISPAN modules were developed and are documented: (1) flat stiffened panel; (2) curved stiffened panel; (3) flat tubular panel; and (4) curved geodesic panel. Users interactively input geometric and material properties, load information, and the type of analysis (linear, bifurcation buckling, or post-buckling). Using this information, the program generates a finite element mesh and performs the analysis. The output, in the form of summary tables of stress or margins of safety, contour plots of loads or stress, and deflected shape plots, may be generated and used to evaluate a specific design.

  3. JavaProtein Dossier: a novel web-based data visualization tool for comprehensive analysis of protein structure

    PubMed Central

    Neshich, Goran; Rocchia, Walter; Mancini, Adauto L.; Yamagishi, Michel E. B.; Kuser, Paula R.; Fileto, Renato; Baudet, Christian; Pinto, Ivan P.; Montagner, Arnaldo J.; Palandrani, Juliana F.; Krauchenco, Joao N.; Torres, Renato C.; Souza, Savio; Togawa, Roberto C.; Higa, Roberto H.

    2004-01-01

    JavaProtein Dossier (JPD) is a new concept, database and visualization tool providing one of the largest collections of the physicochemical parameters describing proteins' structure, stability, function and interaction with other macromolecules. By collecting as many descriptors/parameters as possible within a single database, we can achieve a better use of the available data and information. Furthermore, data grouping allows us to generate different parameters with the potential to provide new insights into the sequence–structure–function relationship. In JPD, residue selection can be performed according to multiple criteria. JPD can simultaneously display and analyze all the physicochemical parameters of any pair of structures, using precalculated structural alignments, allowing direct parameter comparison at corresponding amino acid positions among homologous structures. In order to focus on the physicochemical (and consequently pharmacological) profile of proteins, visualization tools (showing the structure and structural parameters) also had to be optimized. Our response to this challenge was the use of Java technology with its exceptional level of interactivity. JPD is freely accessible (within the Gold Sting Suite) at http://sms.cbi.cnptia.embrapa.br, http://mirrors.rcsb.org/SMS, http://trantor.bioc.columbia.edu/SMS and http://www.es.embnet.org/SMS/ (Option: JavaProtein Dossier). PMID:15215458

  4. Full-scale testing, production and cost analysis data for the advanced composite stabilizer for Boeing 737 aircraft. Volume 1: Technical summary

    NASA Technical Reports Server (NTRS)

    Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.

    1983-01-01

    The full-scale ground test, ground vibration test, and flight tests conducted to demonstrate a composite stabilizer for the Boeing 737 aircraft and obtain FAA certification are described. Detail tools, assembly tools, and overall production are discussed. Cost analysis aspects covered include production costs, composite material usage factors, and cost comparisons.

  5. Measuring Educators' Perceptions of Their Skills Relative to Response to Intervention: A Psychometric Study of a Survey Tool

    ERIC Educational Resources Information Center

    Castillo, Jose M.; March, Amanda L.; Stockslager, Kevin M.; Hines, Constance V.

    2016-01-01

    The "Perceptions of RtI Skills Survey" is a self-report measure that assesses educators' perceptions of their data-based problem-solving skills--a critical element of many Response-to-Intervention (RtI) models. Confirmatory factor analysis (CFA) was used to evaluate the underlying factor structure of this tool. Educators from 68 (n =…

  6. Extraction, integration and analysis of alternative splicing and protein structure distributed information

    PubMed Central

    D'Antonio, Matteo; Masseroli, Marco

    2009-01-01

    Background: Alternative splicing has been demonstrated to affect most human genes; different isoforms from the same gene encode proteins that differ by a limited number of residues, thus yielding similar structures. This suggests possible correlations between alternative splicing and protein structure. To support the investigation of such relationships, we have developed the Alternative Splicing and Protein Structure Scrutinizer (PASS), a Web application to automatically extract, integrate and analyze human alternative splicing and protein structure data sparsely available in the Alternative Splicing Database, the Ensembl databank and the Protein Data Bank. Primary data from these databases have been integrated and analyzed using the Protein Identifier Cross-Reference, BLAST, CLUSTALW and FeatureMap3D software tools. Results: A database has been developed to store the primary data and the results of their analysis; a system of Perl scripts automatically creates and updates the database and analyzes the integrated data; a Web interface makes the analyses easily accessible; and a separate database manages user access to the PASS Web application and stores users' data and searches. Conclusion: PASS automatically integrates data from the Alternative Splicing Database with protein structure data from the Protein Data Bank. Additionally, it comprehensively analyzes the integrated data with publicly available, well-known bioinformatics tools in order to generate structural information for isoform pairs. Further analysis of this information might reveal interesting relationships between alternative splicing and protein structure differences, which may be significantly associated with different functions. PMID:19828075

  7. Development and Use of Hidrosig

    NASA Technical Reports Server (NTRS)

    Gupta, Vijay K.; Milne, Bruce T.

    2003-01-01

    The NASA portion of this joint NSF-NASA grant consists of objective 2 and part of objective 3. A major effort was made on objective 2: developing a numerical GIS environment called Hidrosig. This research tool is being developed by the University of Colorado for conducting river-network-based scaling analyses of coupled water-energy-landform-vegetation interactions, including water and energy balances and floods and droughts, at multiple space-time scales. Objective 2 is to analyze the relevant remotely sensed products from satellites, radars and ground measurements to compute the transported water mass for each complete Strahler stream using an 'assimilated water balance equation' at daily and other appropriate time scales. This objective requires analysis of concurrent data sets for precipitation (PPT), evapotranspiration (ET) and stream flows (Q) on river networks. To solve this major problem, our decision was to develop Hidrosig, a new open-source GIS software package. A research group in Colombia, South America, developed the first version of Hidrosig, and Ricardo Mantilla was part of this effort as an undergraduate student before joining the graduate program at the University of Colorado in 2001. Hidrosig automatically extracts river networks from large DEMs and creates a "link-based" data structure, which is required to conduct a variety of analyses under objective 2. It is programmed in Java, a multi-platform programming language freely distributed by Sun under a GPL license. Existing commercial tools such as Arc-Info, RiverTools and others are not suitable for our purpose for two reasons. First, the source code needed to build on the network data structure is not available. Second, these tools use programming languages that are not the most versatile for our purposes; for example, RiverTools uses an IDL platform that is not very efficient for organizing diverse data sets on river networks.
Hidrosig establishes a clear data organization framework that allows simultaneous analysis of spatial fields along river network structures within the Horton-Strahler framework. Software tools for network extraction from DEMs and for network-based analysis of geomorphologic and topologic variables were developed during the first year and part of the second year.

  8. StructRNAfinder: an automated pipeline and web server for RNA families prediction.

    PubMed

    Arias-Carrasco, Raúl; Vásquez-Morán, Yessenia; Nakaya, Helder I; Maracaja-Coutinho, Vinicius

    2018-02-17

    The function of many noncoding RNAs (ncRNAs) depends upon their secondary structures. Over the last decades, several methodologies have been developed to predict such structures or to use them to functionally annotate RNAs into RNA families. However, to fully perform this analysis, researchers must use multiple tools, with constant parsing and processing of several intermediate files. This makes the large-scale prediction and annotation of RNAs a daunting task, even for researchers with good computational or bioinformatics skills. We present an automated pipeline named StructRNAfinder that predicts and annotates RNA families in transcript or genome sequences. This single tool not only displays the sequence/structural consensus alignments for each RNA family according to the Rfam database, but also provides a taxonomic overview for each assigned functional RNA. Moreover, we implemented a user-friendly web service that allows researchers to upload their own nucleotide sequences in order to perform the whole analysis. Finally, we provide a stand-alone version of StructRNAfinder for use in large-scale projects. The tool was developed under the GNU General Public License (GPLv3) and is freely available at http://structrnafinder.integrativebioinformatics.me . The main advantage of StructRNAfinder lies in its large-scale processing and integration of the data obtained from each tool and database employed along the workflow; the many files generated are consolidated into user-friendly reports, useful for downstream analyses and data exploration.

  9. PDB@: an offline toolkit for exploration and analysis of PDB files.

    PubMed

    Mani, Udayakumar; Ravisankar, Sadhana; Ramakrishnan, Sai Mukund

    2013-12-01

    The Protein Data Bank (PDB) is a freely accessible archive of the 3-D structural data of biological molecules. Structure-based studies offer a unique vantage point for inferring the properties of a protein molecule from structural data. This is too big a task to be done manually, and no single tool, software package or server comprehensively analyses all structure-based properties. The objective of the present work is to develop an offline computational toolkit, PDB@, containing built-in algorithms that help categorize the structural properties of a protein molecule. The user can view and edit the PDB file as needed. Some features of the present work are unique in themselves, and others improve upon existing tools. Also, the representation of protein properties in both graphical and textual formats helps in predicting all the necessary details of a protein molecule on a single platform.

  10. PBxplore: a tool to analyze local protein structure and deformability with Protein Blocks

    PubMed Central

    Craveur, Pierrick; Joseph, Agnel Praveen; Jallu, Vincent

    2017-01-01

    This paper describes the development and application of a suite of tools, called PBxplore, to analyze the dynamics and deformability of protein structures using Protein Blocks (PBs). Proteins are highly dynamic macromolecules, and a classical way to analyze their inherent flexibility is to perform molecular dynamics simulations. The advantage of using small structural prototypes such as PBs is that they give a good approximation of the local structure of the protein backbone. More importantly, by reducing the conformational complexity of protein structures, PBs allow analysis of local protein deformability, which cannot be done with other methods and has been used effectively in different applications. PBxplore is able to process large amounts of data such as those produced by molecular dynamics simulations. It produces frequency, entropy and information logo outputs as text and graphics. PBxplore is available at https://github.com/pierrepo/PBxplore and is released under the open-source MIT license. PMID:29177113

  11. Teacher Reporting Attitudes Scale (TRAS): confirmatory and exploratory factor analyses with a Malaysian sample.

    PubMed

    Choo, Wan Yuen; Walsh, Kerryann; Chinna, Karuthan; Tey, Nai Peng

    2013-01-01

    The Teacher Reporting Attitude Scale (TRAS) is a newly developed tool to assess teachers' attitudes toward reporting child abuse and neglect. This article reports on an investigation of the factor structure and psychometric properties of the short form Malay version of the TRAS. A self-report cross-sectional survey was conducted with 667 teachers in 14 randomly selected schools in Selangor state, Malaysia. Analyses were conducted in a 3-stage process using both confirmatory (stages 1 and 3) and exploratory factor analyses (stage 2) to test, modify, and confirm the underlying factor structure of the TRAS in a non-Western teacher sample. Confirmatory factor analysis did not support a 3-factor model previously reported in the original TRAS study. Exploratory factor analysis revealed an 8-item, 4-factor structure. Further confirmatory factor analysis demonstrated appropriateness of the 4-factor structure. Reliability estimates for the four factors (commitment, value, concern, and confidence) were moderate. The modified short form TRAS (Malay version) has potential to be used as a simple tool for relatively quick assessment of teachers' attitudes toward reporting child abuse and neglect. Cross-cultural differences in attitudes toward reporting may exist and the transferability of newly developed instruments to other populations should be evaluated.

  12. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools

    PubMed Central

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K. Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-01

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies, including mass spectrometry (MS) and nuclear magnetic resonance (NMR) spectroscopy data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS- and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institutes of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. PMID:26467476

  13. Building and evaluating an informatics tool to facilitate analysis of a biomedical literature search service in an academic medical center library.

    PubMed

    Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha

    2013-01-01

    This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.

  14. Metabolic network flux analysis for engineering plant systems.

    PubMed

    Shachar-Hill, Yair

    2013-04-01

    Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools for plant systems have advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA, but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
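The basic idea behind flux-based analysis can be illustrated with a toy flux balance problem: maximize a "biomass" flux subject to steady-state mass balance on the network's stoichiometric matrix. This is a minimal sketch with an invented three-reaction network, not a model from the article.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> M1, r2: M1 -> M2, biomass: M2 ->
# Columns = reactions (v_uptake, v_r2, v_biomass); rows = metabolites M1, M2.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

# Steady state requires S v = 0; uptake is capped at 10 flux units.
# linprog minimizes, so negate the objective to maximize biomass flux.
res = linprog(c=[0, 0, -1],
              A_eq=S, b_eq=[0, 0],
              bounds=[(0, 10), (0, None), (0, None)])
print(res.x)  # optimal flux distribution → [10. 10. 10.]
```

At the optimum all three fluxes equal the uptake bound, since every unit taken up can flow straight through to biomass. Real NFA models have hundreds of reactions, but the linear-programming core is the same.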

  15. Learning from Adverse Events in Obstetrics: Is a Standardized Computer Tool an Effective Strategy for Root Cause Analysis?

    PubMed

    Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah

    2015-08-01

    Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.

  16. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned (i.e., blanket test wafer) measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high-aspect-ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address these challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation, including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high-throughput ultra-clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we explore the use of model-based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high-density dynamic random access memory (DRAM) chips, including its capability for characterizing multiple geometric parameters of the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.

  17. Climate Impact and GIS Education Using Realistic Applications of Data.gov Thematic Datasets in a Structured Lesson-Based Workbook

    NASA Astrophysics Data System (ADS)

    Amirazodi, S.; Griffin, R.; Bugbee, K.; Ramachandran, R.; Weigel, A. M.

    2016-12-01

    This project created a workbook which teaches Earth Science to undergraduate and graduate students through guided in-class activities and take-home assignments organized around climate topics which use GIS to teach key geospatial analysis techniques and cartography skills. The workbook is structured to the White House's Data.gov climate change themes, which include Coastal Flooding, Ecosystem Vulnerability, Energy Infrastructure, Arctic, Food Resilience, Human Health, Transportation, Tribal Nations, and Water. Each theme provides access to framing questions, associated data, interactive tools, and further reading (e.g. the US Climate Resilience Toolkit and National Climate Assessment). Lessons make use of the respective theme's available resources. The structured thematic approach is designed to encourage independent exploration. The goal is to teach climate concepts and concerns, GIS techniques and approaches, and effective cartographic representation and communication of results; and foster a greater awareness of publicly available resources and datasets. To reach more audiences more effectively, a two-level approach was used. Level 1 serves as an introductory study and relies on only freely available interactive tools to reach audiences with fewer resources and less familiarity. Level 2 presents a more advanced case study, and focuses on supporting common commercially available tool use and real-world analysis techniques.

  18. Climate Impact and GIS Education Using Realistic Applications of Data.gov Thematic Datasets in a Structured Lesson-Based Workbook

    NASA Technical Reports Server (NTRS)

    Amirazodi, Sara; Griffin, Robert; Bugbee, Kaylin; Ramachandran, Rahul; Weigel, Amanda

    2016-01-01

    This project created a workbook which teaches Earth Science to undergraduate and graduate students through guided in-class activities and take-home assignments organized around climate topics which use GIS to teach key geospatial analysis techniques and cartography skills. The workbook is structured to the White House's Data.gov climate change themes, which include Coastal Flooding, Ecosystem Vulnerability, Energy Infrastructure, Arctic, Food Resilience, Human Health, Transportation, Tribal Nations, and Water. Each theme provides access to framing questions, associated data, interactive tools, and further reading (e.g. the US Climate Resilience Toolkit and National Climate Assessment). Lessons make use of the respective theme's available resources. The structured thematic approach is designed to encourage independent exploration. The goal is to teach climate concepts and concerns, GIS techniques and approaches, and effective cartographic representation and communication of results; and foster a greater awareness of publicly available resources and datasets. To reach more audiences more effectively, a two-level approach was used. Level 1 serves as an introductory study and relies on only freely available interactive tools to reach audiences with fewer resources and less familiarity. Level 2 presents a more advanced case study, and focuses on supporting common commercially available tool use and real-world analysis techniques.

  19. miRToolsGallery: a tag-based and rankable microRNA bioinformatics resources database portal

    PubMed Central

    Chen, Liang; Heikkinen, Liisa; Wang, ChangLiang; Yang, Yang; Knott, K Emily

    2018-01-01

    Hundreds of bioinformatics tools have been developed for microRNA (miRNA) investigations, including those used for identification, target prediction, and structure and expression profile analysis. However, finding the correct tool for a specific application requires the tedious and laborious process of locating, downloading, testing and validating the appropriate tool from a group of nearly a thousand. To facilitate this process, we developed a novel database portal named miRToolsGallery, constructed by manually curating > 950 miRNA analysis tools and resources. In the portal, tools are searchable, filterable and rankable, which expedites locating the appropriate one. The ranking feature is vital for quickly separating the more useful tools from the obscure ones. Tools are ranked by different criteria, including the PageRank algorithm, date of publication, number of citations, average of votes and number of publications. miRToolsGallery provides links and data for a comprehensive collection of currently available miRNA tools, with a ranking function that can be adjusted using different criteria according to specific requirements. Database URL: http://www.mirtoolsgallery.org PMID:29688355
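The PageRank criterion mentioned above can be sketched in a few lines of power iteration. This is a generic illustration (the toy graph and function name are ours), not the portal's actual implementation.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Power-iteration PageRank on a dense adjacency matrix.

    adj[i][j] = 1 means node i links to node j. Dangling nodes
    (no out-links) are treated as linking to every node.
    """
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out = A.sum(axis=1)
    # Row-stochastic transition matrix; dangling rows become uniform.
    P = np.where(out[:, None] > 0,
                 A / np.where(out, out, 1)[:, None],
                 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - damping) / n + damping * P.T @ r
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Three tools: 0 and 2 both link to 1; 1 links back to 0.
ranks = pagerank([[0, 1, 0], [1, 0, 0], [0, 1, 0]])
print(ranks)  # tool 1, with two in-links, ranks highest
```

In a real portal, the damping factor and the definition of a "link" (citation, cross-reference, shared workflow) are tuning choices; the iteration itself is unchanged.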

  20. A Case Study for Probabilistic Methods Validation (MSFC Center Director's Discretionary Fund, Project No. 94-26)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Ortega, R.

    1998-01-01

    Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure mode level.
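The basic principle behind such probabilistic methods can be illustrated with a stress-strength Monte Carlo estimate of failure probability: sample the uncertain drivers, count how often load exceeds capacity. The distribution parameters below are invented for illustration and are not drawn from the study.

```python
import numpy as np

# Stress-strength reliability demo: failure occurs when load exceeds
# capacity. Both quantities are treated as normally distributed drivers.
rng = np.random.default_rng(42)
n = 200_000
strength = rng.normal(loc=40.0, scale=4.0, size=n)  # capacity samples
load = rng.normal(loc=30.0, scale=3.0, size=n)      # demand samples

pf = np.mean(load > strength)   # estimated failure probability
print(f"P(failure) ~ {pf:.4f}")
```

For these parameters the safety margin is normal with mean 10 and standard deviation 5, so the analytic failure probability is about 0.023; the sampled estimate converges to it as n grows.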

  1. Structural behavior of composites with progressive fracture

    NASA Technical Reports Server (NTRS)

    Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.

    1989-01-01

    The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.

  2. GoPros™ as an underwater photogrammetry tool for citizen science

    PubMed Central

    David, Peter A.; Dupont, Sally F.; Mathewson, Ciaran P.; O’Neill, Samuel J.; Powell, Nicholas N.; Williamson, Jane E.

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time. PMID:27168973

  3. GoPros™ as an underwater photogrammetry tool for citizen science.

    PubMed

    Raoult, Vincent; David, Peter A; Dupont, Sally F; Mathewson, Ciaran P; O'Neill, Samuel J; Powell, Nicholas N; Williamson, Jane E

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time.

  4. Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft

    NASA Technical Reports Server (NTRS)

    Kim, Young; Stough, Robert; Whorton, Mark

    2005-01-01

    Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.

  5. Toward a dynamical theory of body movement in musical performance

    PubMed Central

    Demos, Alexander P.; Chaffin, Roger; Kant, Vivek

    2014-01-01

    Musicians sway expressively as they play in ways that seem clearly related to the music, but quantifying the relationship has been difficult. We suggest that a complex systems framework and its accompanying tools for analyzing non-linear dynamical systems can help identify the motor synergies involved. Synergies are temporary assemblies of parts that come together to accomplish specific goals. We assume that the goal of the performer is to convey musical structure and expression to the audience and to other performers. We provide examples of how dynamical systems tools, such as recurrence quantification analysis (RQA), can be used to examine performers' movements and relate them to the musical structure and to the musician's expressive intentions. We show how detrended fluctuation analysis (DFA) can be used to identify synergies and discover how they are affected by the performer's expressive intentions. PMID:24904490
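
    One of the tools named in this abstract, detrended fluctuation analysis (DFA), is easy to sketch. The minimal implementation below runs on a synthetic test signal; the window sizes are illustrative choices, not the authors' settings.

```python
import numpy as np

def dfa(signal, window_sizes):
    """Detrended fluctuation analysis: returns the scaling exponent alpha,
    the slope of log F(n) versus log n."""
    x = np.asarray(signal, dtype=float)
    # Step 1: integrate the mean-centred series (the "profile").
    profile = np.cumsum(x - x.mean())
    fluctuations = []
    for n in window_sizes:
        n_windows = len(profile) // n
        f2 = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            # Step 2: remove the local linear trend in each window.
            coeffs = np.polyfit(t, seg, 1)
            detrended = seg - np.polyval(coeffs, t)
            f2.append(np.mean(detrended ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # Step 3: alpha is the log-log slope of F(n) against n.
    return np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)[0]
```

    White noise should yield alpha near 0.5, while an integrated (random-walk) signal yields a markedly larger exponent, which is the kind of contrast the authors use to characterize movement time series.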

  6. Application of 3D-QSAR in the rational design of receptor ligands and enzyme inhibitors.

    PubMed

    Mor, Marco; Rivara, Silvia; Lodola, Alessio; Lorenzi, Simone; Bordi, Fabrizio; Plazzi, Pier Vincenzo; Spadoni, Gilberto; Bedini, Annalida; Duranti, Andrea; Tontini, Andrea; Tarzia, Giorgio

    2005-11-01

Quantitative structure-activity relationships (QSARs) are frequently employed in medicinal chemistry projects, both to rationalize structure-activity relationships (SAR) for known series of compounds and to help in the design of innovative structures endowed with desired pharmacological actions. Unlike so-called structure-based drug design tools, they do not require knowledge of the biological target structure but are based on the comparison of drug structural features, and are therefore termed ligand-based drug design tools. In the 3D-QSAR approach, structural descriptors are calculated from molecular models of the ligands, as interaction fields within a three-dimensional (3D) lattice of points surrounding the ligand structure. These descriptors are collected in a large X matrix, which is submitted to multivariate analysis to look for correlations with biological activity. As with other QSARs, the reliability and usefulness of the correlation models depend on the validity of the assumptions and on the quality of the data. A careful selection of compounds and pharmacological data can improve the application of 3D-QSAR analysis in drug design. Some examples of the application of the CoMFA and CoMSIA approaches to the SAR study and design of receptor or enzyme ligands are described, with attention to the fields of melatonin receptor ligands and FAAH inhibitors.
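
    The "large X matrix" workflow can be illustrated with a toy calculation. The sketch below stands in for the real procedure with ordinary least squares on a small synthetic descriptor matrix; actual CoMFA/CoMSIA studies use partial least squares (PLS) on thousands of field points, and every number here is invented.

```python
import numpy as np

# Hypothetical data: 20 ligands, each described by interaction-field
# values at 5 lattice points (columns of X), plus measured activities y.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
true_w = np.array([1.0, -0.5, 0.0, 2.0, 0.0])   # invented "field" weights
y = X @ true_w + 0.01 * rng.normal(size=20)

# Centre descriptors and activities, then fit a linear model (a stand-in
# for the PLS regression used in practice).
Xc = X - X.mean(axis=0)
yc = y - y.mean()
w, *_ = np.linalg.lstsq(Xc, yc, rcond=None)

# Lattice points with large |coefficient| mark regions where field
# changes correlate most strongly with activity.
important = sorted(np.argsort(-np.abs(w))[:2].tolist())
```

    In a real study the high-weight lattice points are mapped back onto 3D space around the ligands to suggest where steric or electrostatic modifications should help.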

  7. Scientific Platform as a Service - Tools and solutions for efficient access to and analysis of oceanographic data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hansen, Morten W.; Korosov, Anton

    2017-04-01

Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog to store granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker).
The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
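
    As a hedged illustration of the catalog-plus-API idea described above, the sketch below implements a minimal in-memory metadata catalog with a search method. The record fields, class names, and URLs are hypothetical, not the actual SPaaS schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record structure; field names are illustrative only.
@dataclass
class DatasetRecord:
    dataset_id: str
    kind: str          # "satellite", "model", or "in situ"
    start: date
    end: date
    url: str           # e.g. an OPeNDAP endpoint

class MetadataCatalog:
    """Toy stand-in for a distributed granular-metadata catalog."""

    def __init__(self):
        self.records = []

    def add(self, rec):
        self.records.append(rec)

    def search(self, kind=None, day=None):
        """Return records matching a data kind and covering a given day."""
        hits = []
        for r in self.records:
            if kind is not None and r.kind != kind:
                continue
            if day is not None and not (r.start <= day <= r.end):
                continue
            hits.append(r)
        return hits
```

    In the SPaaS design the equivalent search would be served over an API to both interactive clients and server-side processing chains, with the returned URLs used for subset streaming.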

  8. Coal gasification systems engineering and analysis. Appendix H: Work breakdown structure

    NASA Technical Reports Server (NTRS)

    1980-01-01

A work breakdown structure (WBS) is presented which encompasses the multiple facets (hardware, software, services, and other tasks) of the coal gasification program. The WBS is shown to provide the basis for management and control; cost estimating; budgeting and reporting; scheduling activities; organizational structuring; specification tree generation; weight allocation and control; and procurement and contracting activities. It also serves as a tool for program evaluation.

  9. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
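
    The core reconstruction idea, recovering structure from a tilt series of projections, can be caricatured with plain backprojection (here in 2D for brevity). This toy uses only two orthogonal "tilt angles", whereas real ET acquires many projections over a tilt range and uses weighted or iterative reconstruction.

```python
import numpy as np

# A 32x32 phantom with one small dense feature.
phantom = np.zeros((32, 32))
phantom[10:14, 20:24] = 1.0

proj0 = phantom.sum(axis=0)          # projection along columns (0 deg)
proj90 = phantom.sum(axis=1)         # projection along rows (90 deg)

# Backprojection: smear each 1D projection back across the volume and sum.
recon = np.tile(proj0, (32, 1)) + np.tile(proj90[:, None], (1, 32))

# Even two projections localize the feature, though with smearing
# artifacts that a full tilt series would suppress.
peak = np.unravel_index(np.argmax(recon), recon.shape)
```

    The "missing wedge" problem mentioned in ET literature corresponds to angles that can never be acquired, which leaves exactly this kind of smearing along the unsampled directions.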

  10. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  11. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio and number of control surfaces. A doublet lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool. Therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.

  12. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  13. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
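
    The rational function approximation step described in these reports can be sketched as a linear least-squares fit of Roger's form, Q(s) ≈ A0 + A1·s + A2·s² + Σⱼ Aⱼ₊₂·s/(s + bⱼ) with s = ik. The frequency-response data, lag roots, and scalar (rather than matrix) generalized force below are synthetic placeholders, not values from the tool.

```python
import numpy as np

k = np.linspace(0.01, 1.0, 40)          # reduced frequencies
s = 1j * k
b = np.array([0.2, 0.6])                # fixed lag roots (chosen a priori)

# Synthetic frequency-response "data" generated from known coefficients.
Q = 1.0 + 0.5 * s + 0.1 * s**2 + 0.8 * s / (s + b[0]) - 0.3 * s / (s + b[1])

# Basis: [1, s, s^2, s/(s+b1), s/(s+b2)]; solve a real least-squares
# problem by stacking real and imaginary parts of each equation.
basis = np.column_stack([np.ones_like(s), s, s**2,
                         s / (s + b[0]), s / (s + b[1])])
A = np.vstack([basis.real, basis.imag])
rhs = np.concatenate([Q.real, Q.imag])
coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)
```

    Because the lag terms are rational in s, the fitted model converts directly into extra first-order states, which is what makes the resulting aeroservoelastic model a finite-dimensional state space system.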

  14. GlycoWorkbench: a tool for the computer-assisted annotation of mass spectra of glycans.

    PubMed

    Ceroni, Alessio; Maass, Kai; Geyer, Hildegard; Geyer, Rudolf; Dell, Anne; Haslam, Stuart M

    2008-04-01

Mass spectrometry is the main analytical technique currently used to address the challenges of glycomics as it offers unrivalled levels of sensitivity and the ability to handle complex mixtures of different glycan variations. Determination of glycan structures from analysis of MS data is a major bottleneck in high-throughput glycomics projects, and robust solutions to this problem are of critical importance. However, all the approaches currently available have inherent restrictions to the type of glycans they can identify, and none of them have proved to be a definitive tool for glycomics. GlycoWorkbench is a software tool developed by the EUROCarbDB initiative to assist the manual interpretation of MS data. The main task of GlycoWorkbench is to evaluate a set of structures proposed by the user by matching the corresponding theoretical list of fragment masses against the list of peaks derived from the spectrum. The tool provides an easy-to-use graphical interface, a comprehensive and increasing set of structural constituents, an exhaustive collection of fragmentation types, and a broad list of annotation options. The aim of GlycoWorkbench is to offer complete support for the routine interpretation of MS data. The software is available for download from: http://www.eurocarbdb.org/applications/ms-tools.
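
    The core matching step described here, comparing a theoretical fragment list against observed peaks within a mass tolerance, is simple to sketch. The fragment names, masses, and tolerance below are invented placeholders, not values or code from GlycoWorkbench.

```python
def annotate(theoretical, peaks, tol=0.3):
    """Return (fragment_name, peak_mass) pairs whose masses agree
    within tol (in Da)."""
    matches = []
    for name, mass in theoretical:
        for p in peaks:
            if abs(p - mass) <= tol:
                matches.append((name, p))
    return matches

# Hypothetical theoretical fragments for a candidate structure, and
# hypothetical observed peak masses.
fragments = [("B2", 528.19), ("Y3", 692.25), ("C1", 384.13)]
peaks = [528.3, 610.0, 692.1]
hits = annotate(fragments, peaks)
```

    A real annotation engine would additionally score competing candidate structures by how many (and how intense) peaks each explains.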

  15. Instruments of scientific visual representation in atomic databases

    NASA Astrophysics Data System (ADS)

    Kazakov, V. V.; Kazakov, V. G.; Meshkov, O. I.

    2017-10-01

    Graphic tools of spectral data representation provided by operating information systems on atomic spectroscopy—ASD NIST, VAMDC, SPECTR-W3, and Electronic Structure of Atoms—for the support of scientific-research and human-resource development are presented. Such tools of visual representation of scientific data as those of the spectrogram and Grotrian diagram plotting are considered. The possibility of comparative analysis of the experimentally obtained spectra and reference spectra of atomic systems formed according to the database of a resource is described. The access techniques to the mentioned graphic tools are presented.

  16. Design, Fabrication, and Testing of Composite Energy-Absorbing Keel Beams for General Aviation Type Aircraft

    NASA Technical Reports Server (NTRS)

    Kellas, Sotiris; Knight, Norman F., Jr.

    2002-01-01

A lightweight energy-absorbing keel-beam concept was developed and retrofitted in a general aviation type aircraft to improve crashworthiness performance. The energy-absorbing beam consisted of a foam-filled cellular structure with glass-fiber and hybrid glass/Kevlar cell walls. Design, analysis, fabrication, and testing of the keel beams prior to installation and subsequent full-scale crash testing of the aircraft are described. Factors such as material and fabrication constraints, damage tolerance, crush stress/strain response, seat-rail loading, and post-crush integrity, which influenced the course of the design process, are also presented. A theory similar to the one often used for ductile metal box structures was employed, with appropriate modifications, to estimate the sustained crush loads for the beams. This analytical tool, coupled with dynamic finite element simulation using MSC.Dytran, served as the prime design and analysis tool. The validity of the theory as a reliable design tool was examined against test data from static crush tests of beam sections, while the overall performance of the energy-absorbing subfloor was assessed through dynamic testing of 24-in-long subfloor assemblies.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suresh, Niraj; Stephens, Sean A.; Adams, Lexor

Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as processes with important implications to climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen was used to visualize its root structure. A combination of the open-source software RooTrak and DDV was employed to segment the root from the soil and calculate its isosurface, respectively. Our own computer script named 3DRoot-SV was developed and used to calculate root volume and surface area from a triangular mesh. The process utilizing a unique combination of tools, from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
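
    The volume and surface-area computation attributed to 3DRoot-SV can be sketched from first principles: for a closed triangular mesh with outward-oriented faces, the divergence theorem gives the volume as a sum of signed tetrahedra. This is the generic method, not the 3DRoot-SV source.

```python
import numpy as np

def mesh_volume_area(vertices, faces):
    """Volume (sum of signed tetrahedra against the origin) and surface
    area of a closed triangular mesh with outward-wound faces."""
    v = np.asarray(vertices, dtype=float)
    vol = 0.0
    area = 0.0
    for a, b, c in faces:
        pa, pb, pc = v[a], v[b], v[c]
        n = np.cross(pb - pa, pc - pa)       # face normal (unnormalized)
        area += 0.5 * np.linalg.norm(n)      # triangle area
        vol += np.dot(pa, np.cross(pb, pc)) / 6.0   # signed tetra volume
    return vol, area

# Example: a unit right tetrahedron (faces wound so normals point outward).
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
vol, area = mesh_volume_area(verts, faces)
```

    Consistent outward winding is the one thing the mesh must guarantee; with mixed winding the signed volumes partially cancel and the result is meaningless.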

  18. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  19. Fatigue-Crack-Growth Structural Analysis

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1986-01-01

Elastic and plastic deformations calculated under variety of loading conditions. Prediction of fatigue-crack-growth lives made with Fatigue-Crack-Growth Structural Analysis (FASTRAN) computer program. As cyclic loads are applied to initial crack configuration, FASTRAN predicts crack length and other parameters until complete break occurs. Loads are tensile or compressive and of variable or constant amplitude. FASTRAN incorporates linear-elastic fracture mechanics with modifications of load-interaction effects caused by crack closure. FASTRAN considered research tool, because of lengthy calculation times. FASTRAN written in FORTRAN IV for batch execution.
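
    As a simplified stand-in for FASTRAN's closure-modified growth model, the sketch below integrates the classical Paris law cycle by cycle for a center crack in a wide plate under constant-amplitude loading, da/dN = C(ΔK)^m with ΔK = Δσ√(πa). The material constants and crack sizes are illustrative only.

```python
import math

# Illustrative constants (units: a in m, stress in MPa, K in MPa*sqrt(m)).
C, m = 1e-11, 3.0            # Paris-law material constants (made up)
d_sigma = 100.0              # constant-amplitude stress range
a, a_final = 0.001, 0.01     # initial and final half crack length

cycles = 0
while a < a_final:
    dK = d_sigma * math.sqrt(math.pi * a)   # stress-intensity range
    a += C * dK ** m                        # growth in this cycle
    cycles += 1
```

    FASTRAN's value over this textbook integration is precisely the crack-closure model, which captures load-interaction (retardation/acceleration) effects that the plain Paris law cannot.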

  20. Structured Analysis of the Logistics Support Analysis (LSA) Task, and Integrated Logistic Support (ILS) Element, ’Standardization and Interoperability (S and I)’.

    DTIC Science & Technology

    1988-11-01

system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit... boxes into hierarchies suitable for computer implementation. 3. Structured Design uses tools, especially graphic ones, to render systems readily... Keywords: LSA, processes, data flows, data stores, external entities, overall systems design process.

  1. Structural Analysis Computer Programs for Rigid Multicomponent Pavement Structures with Discontinuities--WESLIQID and WESLAYER. Report 1. Program Development and Numerical Presentations.

    DTIC Science & Technology

    1981-05-01

represented as a Winkler foundation. The program can treat any number of slabs connected by steel bars or other load transfer devices at the joints...dimensional finite element method. The inherent flexibility of such an approach permits the analysis of a rigid pavement with steel bars and stabilized layers and provides an efficient tool for analyzing stress conditions at the joint. Unfortunately, such a procedure would require a tremendously

  2. Automatic Differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.; Hall, Laura E.

    1992-01-01

Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different AD tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation. The paper concludes with the observation that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available.
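
    The forward mode mentioned in this abstract is easy to demonstrate with dual numbers: each value carries its derivative, and overloaded arithmetic applies the chain rule automatically. This is a generic illustration, not one of the tools assessed in the paper.

```python
class Dual:
    """A value paired with its derivative (forward-mode AD)."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule applied automatically.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f(x) and df/dx in a single forward pass."""
    out = f(Dual(x, 1.0))
    return out.val, out.dot

val, grad = derivative(lambda x: x * x + 3 * x + 1, 2.0)
```

    Unlike finite differencing, the derivative here is exact to machine precision and costs one extra arithmetic stream per input, which matches the paper's observation that AD is usually preferable when storage permits.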

  3. Analysis of carbon functional groups in mobile humic acid and recalcitrant calcium humate extracted from eight US soils

    USDA-ARS?s Scientific Manuscript database

    Solid state 13C nuclear magnetic resonance (NMR) spectroscopy is a common tool to study the structure of soil humic fractions; however, knowledge regarding carbon structural relationships in humic fractions is limited. In this study, mobile humic acid (MHA) and recalcitrant calcium humate (CaHA) fr...

  4. An Alternative Approach for the Analyses and Interpretation of Attachment Sort Items

    ERIC Educational Resources Information Center

    Kirkland, John; Bimler, David; Drawneek, Andrew; McKim, Margaret; Scholmerich, Axel

    2004-01-01

    Attachment Q-Sort (AQS) is a tool for quantifying observations about toddler/caregiver relationships. Previous studies have applied factor analysis to the full 90 AQS item set to explore the structure underlying them. Here we explore that structure by applying multidimensional scaling (MDS) to judgements of inter-item similarity. AQS items are…
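
    The MDS step mentioned here can be sketched with the classical (Torgerson) algorithm: double-center the squared distance matrix and take the leading eigenvectors. The toy distances below are invented, not AQS judgement data.

```python
import numpy as np

def classical_mds(D, dims=1):
    """Classical MDS: embed items so Euclidean distances approximate D."""
    D = np.asarray(D, dtype=float)
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)               # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:dims]          # keep the largest
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Hypothetical dissimilarities for 3 items that lie on a line (0, 1, 3).
D = [[0, 1, 3],
     [1, 0, 2],
     [3, 2, 0]]
coords = classical_mds(D, dims=1)
```

    The embedding is only determined up to rotation and reflection, so interpretation focuses on inter-item distances and the axes recovered, not on signs.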

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  6. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  7. Crysalis: an integrated server for computational analysis and design of protein crystallization.

    PubMed

    Wang, Huilin; Feng, Liubin; Zhang, Ziding; Webb, Geoffrey I; Lin, Donghai; Song, Jiangning

    2016-02-24

    The failure of multi-step experimental procedures to yield diffraction-quality crystals is a major bottleneck in protein structure determination. Accordingly, several bioinformatics methods have been successfully developed and employed to select crystallizable proteins. Unfortunately, the majority of existing in silico methods only allow the prediction of crystallization propensity, seldom enabling computational design of protein mutants that can be targeted for enhancing protein crystallizability. Here, we present Crysalis, an integrated crystallization analysis tool that builds on support-vector regression (SVR) models to facilitate computational protein crystallization prediction, analysis, and design. More specifically, the functionality of this new tool includes: (1) rapid selection of target crystallizable proteins at the proteome level, (2) identification of site non-optimality for protein crystallization and systematic analysis of all potential single-point mutations that might enhance protein crystallization propensity, and (3) annotation of target protein based on predicted structural properties. We applied the design mode of Crysalis to identify site non-optimality for protein crystallization on a proteome-scale, focusing on proteins currently classified as non-crystallizable. Our results revealed that site non-optimality is based on biases related to residues, predicted structures, physicochemical properties, and sequence loci, which provides in-depth understanding of the features influencing protein crystallization. Crysalis is freely available at http://nmrcen.xmu.edu.cn/crysalis/.

  8. Crysalis: an integrated server for computational analysis and design of protein crystallization

    PubMed Central

    Wang, Huilin; Feng, Liubin; Zhang, Ziding; Webb, Geoffrey I.; Lin, Donghai; Song, Jiangning

    2016-01-01

    The failure of multi-step experimental procedures to yield diffraction-quality crystals is a major bottleneck in protein structure determination. Accordingly, several bioinformatics methods have been successfully developed and employed to select crystallizable proteins. Unfortunately, the majority of existing in silico methods only allow the prediction of crystallization propensity, seldom enabling computational design of protein mutants that can be targeted for enhancing protein crystallizability. Here, we present Crysalis, an integrated crystallization analysis tool that builds on support-vector regression (SVR) models to facilitate computational protein crystallization prediction, analysis, and design. More specifically, the functionality of this new tool includes: (1) rapid selection of target crystallizable proteins at the proteome level, (2) identification of site non-optimality for protein crystallization and systematic analysis of all potential single-point mutations that might enhance protein crystallization propensity, and (3) annotation of target protein based on predicted structural properties. We applied the design mode of Crysalis to identify site non-optimality for protein crystallization on a proteome-scale, focusing on proteins currently classified as non-crystallizable. Our results revealed that site non-optimality is based on biases related to residues, predicted structures, physicochemical properties, and sequence loci, which provides in-depth understanding of the features influencing protein crystallization. Crysalis is freely available at http://nmrcen.xmu.edu.cn/crysalis/. PMID:26906024
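
    The mutation-scanning mode, point (2) in this abstract, can be sketched generically. The scoring function below is a made-up placeholder standing in for Crysalis's trained SVR models, and the sequence is hypothetical.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def crystallizability_score(seq):
    # Placeholder score: fraction of residues from an arbitrary "favorable"
    # set. Real predictors use SVR models over many sequence and
    # structure-derived features.
    return sum(1.0 for aa in seq if aa in "ADEGKS") / len(seq)

def scan_mutations(seq):
    """Score every single-point mutant; return the wild-type score and
    the best-scoring mutant found."""
    base = crystallizability_score(seq)
    best = (base, seq)
    for i, wt in enumerate(seq):
        for aa in AMINO_ACIDS:
            if aa == wt:
                continue
            mutant = seq[:i] + aa + seq[i + 1:]
            score = crystallizability_score(mutant)
            if score > best[0]:
                best = (score, mutant)
    return base, best

base, (best_score, best_seq) = scan_mutations("MKVLF")
```

    The exhaustive loop is the important structural point: for a protein of length L there are 19L single-point mutants, all of which a design-mode tool can afford to score when prediction is cheap.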

  9. Azahar: a PyMOL plugin for construction, visualization and analysis of glycan molecules

    NASA Astrophysics Data System (ADS)

    Arroyuelo, Agustina; Vila, Jorge A.; Martin, Osvaldo A.

    2016-08-01

    Glycans are key molecules in many physiological and pathological processes. As with other molecules, like proteins, visualization of the 3D structures of glycans adds valuable information for understanding their biological function. Hence, here we introduce Azahar, a computing environment for the creation, visualization and analysis of glycan molecules. Azahar is implemented in Python and works as a plugin for the well-known PyMOL package (Schrodinger in The PyMOL molecular graphics system, version 1.3r1, 2010). Besides the visualization and analysis options already provided by PyMOL, Azahar includes three cartoon-like representations and tools for 3D structure characterization, such as a conformational search using a Monte Carlo-with-minimization routine, as well as tools to analyse single glycans or trajectories/ensembles, including the calculation of the radius of gyration, Ramachandran plots and hydrogen bonds. Azahar is freely available to download from http://www.pymolwiki.org/index.php/Azahar and the source code is available at https://github.com/agustinaarroyuelo/Azahar.

  10. AUSPEX: a graphical tool for X-ray diffraction data analysis.

    PubMed

    Thorn, Andrea; Parkhurst, James; Emsley, Paul; Nicholls, Robert A; Vollmar, Melanie; Evans, Gwyndaf; Murshudov, Garib N

    2017-09-01

    In this paper, AUSPEX, a new software tool for experimental X-ray data analysis, is presented. Exploring the behaviour of diffraction intensities and the associated estimated uncertainties facilitates the discovery of underlying problems and can help users to improve their data acquisition and processing in order to obtain better structural models. The program enables users to inspect the distribution of observed intensities (or amplitudes) against resolution as well as the associated estimated uncertainties (sigmas). It is demonstrated how AUSPEX can be used to visually and automatically detect ice-ring artefacts in integrated X-ray diffraction data. Such artefacts can hamper structure determination, but may be difficult to identify from the raw diffraction images produced by modern pixel detectors. The analysis suggests that a significant portion of the data sets deposited in the PDB contain ice-ring artefacts. Furthermore, it is demonstrated how other problems in experimental X-ray data caused, for example, by scaling and data-conversion procedures can be detected by AUSPEX.
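
A toy version of automatic ice-ring detection in the spirit of AUSPEX (not its actual algorithm) might flag resolution shells near the strongest hexagonal-ice d-spacings whose mean intensity sits far above the local background; the cutoffs and synthetic data below are assumptions for illustration only:

```python
import numpy as np

# d-spacings (in angstroms) of the strongest hexagonal-ice diffraction rings
ICE_D_SPACINGS = (3.90, 3.67, 3.44)

def flag_ice_rings(d, intensity, width=0.03, z_cut=3.0):
    """Flag ice-ring-like outliers: shells near known ice d-spacings whose
    mean intensity is anomalously high relative to a local background shell."""
    flags = []
    for d0 in ICE_D_SPACINGS:
        ring = np.abs(d - d0) < width
        near = (np.abs(d - d0) < 5 * width) & ~ring   # surrounding background
        if ring.sum() < 3 or near.sum() < 3:
            continue  # not enough reflections to judge
        z = (intensity[ring].mean() - intensity[near].mean()) / (
            intensity[near].std() + 1e-9)
        if z > z_cut:
            flags.append(d0)
    return flags

# synthetic reflection list with one injected ice ring at 3.67 angstroms
rng = np.random.default_rng(2)
d = rng.uniform(2.0, 5.0, 20000)
inten = rng.gamma(2.0, 50.0, 20000)
inten[np.abs(d - 3.67) < 0.02] += 2000.0
flags = flag_ice_rings(d, inten)
```

The z-score here compares shell means in units of the background spread; a real tool would also model the resolution-dependent falloff of intensities before flagging.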

  11. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  12. Application of wavelet analysis for monitoring the hydrologic effects of dam operation: Glen Canyon Dam and the Colorado River at Lees Ferry, Arizona

    USGS Publications Warehouse

    White, M.A.; Schmidt, J.C.; Topping, D.J.

    2005-01-01

    Wavelet analysis is a powerful tool with which to analyse the hydrologic effects of dam construction and operation on river systems. Using continuous records of instantaneous discharge from the Lees Ferry gauging station and records of daily mean discharge from upstream tributaries, we conducted wavelet analyses of the hydrologic structure of the Colorado River in Grand Canyon. The wavelet power spectrum (WPS) of daily mean discharge provided a highly compressed and integrative picture of the post-dam elimination of pronounced annual and sub-annual flow features. The WPS of the continuous record showed the influence of diurnal and weekly power generation cycles, shifts in discharge management, and the 1996 experimental flood in the post-dam period. Normalization of the WPS by local wavelet spectra revealed the fine structure of modulation in discharge scale and amplitude and provides an extremely efficient tool with which to assess the relationships among hydrologic cycles and ecological and geomorphic systems. We extended our analysis to sections of the Snake River and showed how wavelet analysis can be used as a data mining technique. The wavelet approach is an especially promising tool with which to assess dam operation in less well-studied regions and to evaluate management attempts to reconstruct desired flow characteristics. Copyright © 2005 John Wiley & Sons, Ltd.
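
The wavelet power spectrum at the heart of this record can be sketched in a few lines of NumPy; the Morlet discretization, window width, scales and synthetic "discharge" series below are illustrative assumptions, not the study's implementation or data:

```python
import numpy as np

def morlet_power_spectrum(x, scales, w0=6.0):
    """Wavelet power |W(s,t)|^2 via convolution with a discretized Morlet wavelet."""
    n = len(x)
    x = x - x.mean()
    wps = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        half = int(min(4 * s, (n - 1) // 2))   # cap window so the kernel fits the signal
        t = np.arange(-half, half + 1)
        psi = (np.pi ** -0.25) * np.exp(1j * w0 * t / s - 0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)                      # keep power comparable across scales
        wps[i] = np.abs(np.convolve(x, np.conj(psi)[::-1], mode="same")) ** 2
    return wps

# synthetic two-year daily series: strong annual cycle plus weak weekly cycle
days = np.arange(2 * 365)
q = 10 + 3 * np.sin(2 * np.pi * days / 365) + 0.5 * np.sin(2 * np.pi * days / 7)
scales = np.array([7, 30, 91, 365])           # roughly periods in days for w0 = 6
wps = morlet_power_spectrum(q, scales)
```

Averaging `wps` over time gives the global spectrum, in which the annual band dominates the weekly band for this test signal, mirroring the pronounced annual features the study describes.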

  13. A CFD/CSD Interaction Methodology for Aircraft Wings

    NASA Technical Reports Server (NTRS)

    Bhardwaj, Manoj K.

    1997-01-01

    With advanced subsonic transports and military aircraft operating in the transonic regime, it is becoming important to determine the effects of the coupling between aerodynamic loads and elastic forces. Since aeroelastic effects can contribute significantly to the design of these aircraft, there is a strong need in the aerospace industry to predict these aero-structure interactions computationally. To perform static aeroelastic analysis in the transonic regime, high fidelity computational fluid dynamics (CFD) analysis tools must be used in conjunction with high fidelity computational structural dynamics (CSD) analysis tools due to the nonlinear behavior of the aerodynamics in the transonic regime. There is also a need to be able to use a wide variety of CFD and CSD tools to predict these aeroelastic effects in the transonic regime. Because source codes are not always available, it is necessary to couple the CFD and CSD codes without alteration of the source codes. In this study, an aeroelastic coupling procedure is developed which will perform static aeroelastic analysis using any CFD and CSD code with little code integration. The aeroelastic coupling procedure is demonstrated on an F/A-18 Stabilator using NASTD (an in-house McDonnell Douglas CFD code) and NASTRAN. In addition, the Aeroelastic Research Wing (ARW-2) is used for demonstration of the aeroelastic coupling procedure by using ENSAERO (NASA Ames Research Center CFD code) and a finite element wing-box code (developed as part of this research).

  14. The coming of age of the first hybrid metrology software platform dedicated to nanotechnologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian

    2017-03-01

    The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control physico-chemical properties as a function of the application. These properties include physical attributes such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical attributes such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential) and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists of using the strength of each technique to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part is dedicated to modeling the process flow associated with a fleet of metrology tools. The second part introduces the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic or semi-automated analysis).
    The final part introduces two ways of performing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging (SAXS) techniques. The first approach is high-level fusion: combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each, to obtain a new, more accurate result. The second approach is deep-level fusion: combining raw data from various tools to create a new raw data set. We introduce a new concept of a virtual tool creator based on deep-level fusion. As a conclusion, we discuss the implementation of hybrid metrology in a semiconductor environment for advanced process control.

  15. Effects of structured written feedback by cards on medical students' performance at Mini Clinical Evaluation Exercise (Mini-CEX) in an outpatient clinic.

    PubMed

    Haghani, Fariba; Hatef Khorami, Mohammad; Fakhari, Mohammad

    2016-07-01

    Feedback cards are recommended as a feasible tool for delivering structured written feedback in clinical education, yet the effectiveness of this tool on medical students' performance remains questionable. The purpose of this study was to compare the effects of structured written feedback by cards combined with verbal feedback versus verbal feedback alone on the clinical performance of medical students at the Mini Clinical Evaluation Exercise (Mini-CEX) in an outpatient clinic. This is a quasi-experimental study with pre- and post-test comprising four groups across two terms of medical students' externship. The students' performance was assessed through the Mini-CEX as a clinical performance evaluation tool. Structured written feedback was given to the two experimental groups via designed feedback cards in addition to verbal feedback, while in the two control groups feedback was delivered verbally, the routine approach in clinical education. By consecutive sampling, 62 externship students were enrolled in this study; seven students were excluded from the final analysis due to three days of absence. According to ANOVA and the post hoc Tukey test, no statistically significant difference was observed among the four groups at the pre-test, whereas a statistically significant difference was observed between the experimental and control groups at the post-test (F = 4.023, p = 0.012). The effect size of the structured written feedback on clinical performance was 0.19. Structured written feedback by cards produced a statistically significant improvement in the performance of medical students. Further studies should be conducted in other clinical courses over longer durations.
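
The four-group comparison reported here (one-way ANOVA plus an effect size) can be sketched with SciPy; the group means, sizes and random seed below are hypothetical stand-ins for Mini-CEX scores, not the study's data:

```python
import numpy as np
from scipy import stats

def one_way_anova_with_eta_sq(*groups):
    """One-way ANOVA across groups, plus eta-squared as the effect size."""
    f, p = stats.f_oneway(*groups)
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    # eta-squared = between-group sum of squares / total sum of squares
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_total = ((all_data - grand_mean) ** 2).sum()
    return f, p, ss_between / ss_total

# hypothetical post-test scores: two control and two feedback-card groups
rng = np.random.default_rng(0)
ctrl1 = rng.normal(5.0, 1.0, 14)
ctrl2 = rng.normal(5.1, 1.0, 14)
card1 = rng.normal(6.0, 1.0, 14)
card2 = rng.normal(6.1, 1.0, 13)
f, p, eta_sq = one_way_anova_with_eta_sq(ctrl1, ctrl2, card1, card2)
```

A significant omnibus F would then normally be followed by a pairwise post hoc test (e.g. Tukey's HSD) to locate which groups differ, as the study did.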

  16. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and has led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494
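
The variant-identification step surveyed here can be illustrated with a deliberately naive caller; the pileup representation and thresholds below are illustrative assumptions, far simpler than the production variant callers the survey evaluates:

```python
from collections import Counter

def call_variants(reference, pileups, min_depth=10, min_alt_frac=0.2):
    """Naive variant identification from per-position base pileups.

    reference: string of reference bases; pileups: one string per position,
    holding the bases observed in aligned reads at that position.
    """
    variants = []
    for pos, (ref_base, bases) in enumerate(zip(reference, pileups)):
        depth = len(bases)
        if depth < min_depth:
            continue  # too little coverage to call confidently
        counts = Counter(bases.upper())
        # most frequent non-reference base, if any
        alt, alt_count = max(
            ((b, c) for b, c in counts.items() if b != ref_base.upper()),
            key=lambda t: t[1], default=(None, 0))
        if alt and alt_count / depth >= min_alt_frac:
            variants.append((pos, ref_base, alt, alt_count / depth))
    return variants

ref = "ACGTACGT"
piles = ["A" * 12, "C" * 12, "G" * 6 + "T" * 6, "T" * 12,
         "A" * 12, "C" * 12, "G" * 12, "T" * 5]
calls = call_variants(ref, piles)  # heterozygous-like call at position 2
```

Real callers add base-quality weighting, mapping-quality filters and statistical genotype models on top of this depth/frequency skeleton.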

  17. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/ where 112... methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled... “assumption criticality” or “theorem root set size”... SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that...

  18. Dynamic analysis of flexible mechanical systems using LATDYN

    NASA Technical Reports Server (NTRS)

    Wu, Shih-Chin; Chang, Che-Wei; Housner, Jerrold M.

    1989-01-01

    A 3-D, finite element based simulation tool for flexible multibody systems is presented. Hinge degrees of freedom are built into the equations of motion to reduce geometric constraints. The approach avoids the difficulty of selecting deformation modes for flexible components that arises with the assumed-mode method. The tool is applied to simulate a practical space structure deployment problem. Results of the examples demonstrate the capability of the code and the approach.

  19. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  20. Mapping care processes within a hospital: a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2003-01-01

    Today, the economic and regulatory environment is pressuring hospitals and healthcare professionals to account for their results and methods of care delivery. The evaluation of the quality and safety of care, the traceability of the acts performed and the evaluation of practices are some of the reasons underpinning current interest in clinical and hospital information systems. The structured collection of users' needs and system requirements is fundamental when installing such systems. This stage takes time, is generally misconstrued by caregivers, and is of limited analytical efficacy. We used a modelling technique designed for manufacturing processes (SADT: Structured Analysis and Design Technique). We enhanced the initial activity model of this method and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary from the description of a given process and to locate documents (procedures, recommendations, instructions). Aimed at structuring needs and storing information provided by the teams directly involved regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make to the analysis of clinical information systems.

  1. NIR monitoring of in-service wood structures

    Treesearch

    Michela Zanetti; Timothy G. Rials; Douglas Rammer

    2005-01-01

    Near infrared spectroscopy (NIRS) was used to study a set of Southern Yellow Pine boards exposed to natural weathering for different periods of exposure time. This non-destructive spectroscopic technique is a very powerful tool to predict the weathering of wood when used in combination with multivariate analysis (Principal Component Analysis, PCA, and Projection to...

  2. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  3. Searching for Tools versus Asking for Answers: A Taxonomy of Counselee Behavioral Styles during Career Counseling.

    ERIC Educational Resources Information Center

    Sagiv, Lilach

    1999-01-01

    A taxonomy of decision behavior styles (independence/dependence, active/passive, insightful/not) tested with 372 career counseling clients was supported by similar structure analysis and confirmatory factor analysis. Counselors were more likely to be satisfied with decisions of clients they perceived to be insightful. (SK)

  4. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with little or no knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone tools, Matlab-based tools, ImageJ-based tools, free demo versions of commercial tools and data sharing tools. The review consists of two parts: first, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks involving figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most importantly, a tool's usability must match its functionality. To be usable, specialized tools with less functionality need to fulfill fewer usability criteria, whereas multipurpose tools need a well-structured menu and an intuitive graphical user interface. © 2014 Fraunhofer-Institute for Integrated Circuits IIS. Journal of Microscopy © 2014 Royal Microscopical Society.

  5. Mid-frequency Band Dynamics of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.; Adams, Douglas S.

    2004-01-01

    High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.

  6. DNAproDB: an interactive tool for structural analysis of DNA–protein complexes

    PubMed Central

    Sagendorf, Jared M.

    2017-01-01

    Abstract Many biological processes are mediated by complex interactions between DNA and proteins. Transcription factors, various polymerases, nucleases and histones recognize and bind DNA with different levels of binding specificity. To understand the physical mechanisms that allow proteins to recognize DNA and achieve their biological functions, it is important to analyze structures of DNA–protein complexes in detail. DNAproDB is a web-based interactive tool designed to help researchers study these complexes. DNAproDB provides an automated structure-processing pipeline that extracts structural features from DNA–protein complexes. The extracted features are organized in structured data files, which are easily parsed with any programming language or viewed in a browser. We processed a large number of DNA–protein complexes retrieved from the Protein Data Bank and created the DNAproDB database to store this data. Users can search the database by combining features of the DNA, protein or DNA–protein interactions at the interface. Additionally, users can upload their own structures for processing privately and securely. DNAproDB provides several interactive and customizable tools for creating visualizations of the DNA–protein interface at different levels of abstraction that can be exported as high quality figures. All functionality is documented and freely accessible at http://dnaprodb.usc.edu. PMID:28431131

  7. DSSR-enhanced visualization of nucleic acid structures in Jmol

    PubMed Central

    Hanson, Robert M.

    2017-01-01

    Abstract Sophisticated and interactive visualizations are essential for making sense of the intricate 3D structures of macromolecules. For proteins, secondary structural components are routinely featured in molecular graphics visualizations. However, the field of RNA structural bioinformatics is still lagging behind; for example, current molecular graphics tools lack built-in support even for base pairs, double helices, or hairpin loops. DSSR (Dissecting the Spatial Structure of RNA) is an integrated and automated command-line tool for the analysis and annotation of RNA tertiary structures. It calculates a comprehensive and unique set of features for characterizing RNA, as well as DNA structures. Jmol is a widely used, open-source Java viewer for 3D structures, with a powerful scripting language. JSmol, its reincarnation based on native JavaScript, has a predominant position in the post Java-applet era for web-based visualization of molecular structures. The DSSR-Jmol integration presented here makes salient features of DSSR readily accessible, either via the Java-based Jmol application itself, or its HTML5-based equivalent, JSmol. The DSSR web service accepts 3D coordinate files (in mmCIF or PDB format) initiated from a Jmol or JSmol session and returns DSSR-derived structural features in JSON format. This seamless combination of DSSR and Jmol/JSmol brings the molecular graphics of 3D RNA structures to a similar level as that for proteins, and enables a much deeper analysis of structural characteristics. It fills a gap in RNA structural bioinformatics, and is freely accessible (via the Jmol application or the JSmol-based website http://jmol.x3dna.org). PMID:28472503

  8. Structural Analysis Of CD59 Of Chinese Tree Shrew: A New Reference Molecule For Human Immune System Specific CD59 Drug Discovery.

    PubMed

    Panda, Subhamay; Kumari, Leena; Panda, Santamay

    2017-11-17

    Chinese tree shrews (Tupaia belangeri chinensis) bear several characteristics that make them valuable as experimental animal models in biomedical research. Following the identification of key aspects and signaling pathways in their nervous and immune systems, it has become clear that tree shrews possess both shared and unique characteristics, and hence offer a genetic basis for employing this animal as a prospective model for biomedical research. CD59 glycoprotein, commonly referred to as MAC-inhibitory protein (MAC-IP), membrane inhibitor of reactive lysis (MIRL), or protectin, is encoded by the CD59 gene in human beings. It is a member of the LY6/uPAR/alpha-neurotoxin protein family. Against this background, the objective of this study was to determine a comparative composite-based structure of CD59 of the Chinese tree shrew. Additional objectives were to examine the distribution of negatively and positively charged amino acids over the modeled structure, the distribution of secondary structural elements, the hydrophobicity of the molecular surface and the electrostatic potential, with the assistance of several bioinformatics analytical tools. The CD59 amino acid sequence of the Chinese tree shrew was collected from the online database of the National Center for Biotechnology Information. The SignalP 4.0 online server was employed to detect a signal peptide within the CD59 protein sequence. A molecular model structure of the CD59 protein was generated by the Iterative Threading ASSEmbly Refinement (I-TASSER) suite. The three-dimensional structural model was evaluated by structure validation tools. The location of negatively and positively charged amino acids over the modeled structure, the distribution of secondary structural elements, and the hydrophobicity of the molecular surface were analyzed with the help of the Chimera tool.
    Electrostatic potential analysis was carried out with the Adaptive Poisson-Boltzmann Solver package. The validated model was subsequently used for prediction of functionally critical amino acids and the active site; the functionally critical amino acids and the ligand-binding site (LBS) of the modeled protein were determined using the COACH program. Analysis of the Ramachandran plot showed that 100% of the residues in the homology model were in allowed and favored regions, validating the quality of the generated structural model. The overall G-factor score for CD59 of the Chinese tree shrew was found to be -0.66, which was larger than the generally acceptable value, suggesting the significance and acceptability of the modeled structure. The molecular model data, together with other relevant post-model analysis data, provide molecular insight into the protective activity of the CD59 protein of the Chinese tree shrew. In the present study, we have proposed the first molecular model structure of the uncharted CD59 of the Chinese tree shrew using a comparative composite modeling approach. The combined molecular model and post-model analysis data advance molecular understanding of the protective activity of CD59, toward better insight into the features of this natural lead compound. Copyright © Bentham Science Publishers.

  9. Multicriteria decision analysis: Overview and implications for environmental decision making

    USGS Publications Warehouse

    Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene

    2007-01-01

    Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
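
The PROMETHEE outranking procedure mentioned above can be sketched compactly; the version below assumes the simplest ("usual") preference function, and the watershed alternatives, criteria weights and scores are invented for illustration, not taken from the chapter:

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """PROMETHEE II net outranking flows with the 'usual' preference function.

    scores: (alternatives x criteria) matrix; weights sum to 1;
    maximize: bool per criterion (False marks a cost criterion to minimize).
    """
    a, c = scores.shape
    s = np.where(maximize, scores, -scores)    # turn cost criteria into benefits
    pref = np.zeros((a, a))                    # aggregated preference of i over j
    for k in range(c):
        d = s[:, k][:, None] - s[:, k][None, :]
        pref += weights[k] * (d > 0)           # usual criterion: strict 0/1 preference
    phi_plus = pref.sum(axis=1) / (a - 1)      # leaving (outranking) flow
    phi_minus = pref.sum(axis=0) / (a - 1)     # entering (outranked) flow
    return phi_plus - phi_minus                # net flow; higher ranks better

# hypothetical watershed options scored on cost (minimize) and two benefits
scores = np.array([[3.0, 8.0, 7.0],
                   [5.0, 9.0, 6.0],
                   [4.0, 6.0, 9.0]])
weights = np.array([0.4, 0.3, 0.3])
phi = promethee_ii(scores, weights, maximize=np.array([False, True, True]))
ranking = np.argsort(-phi)
```

Full PROMETHEE implementations offer six preference-function shapes with indifference and preference thresholds per criterion, which is where much of the stakeholder dialogue happens.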

  10. Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum

    NASA Astrophysics Data System (ADS)

    Guan, Shan; Song, Weijie; Pang, Hongyang

    2017-09-01

    In the metal cutting process, the measured signal contains a wealth of information about the tool wear state. An analysis and feature extraction method for tool wear signals based on the Hilbert marginal spectrum is proposed. First, the tool wear signal is decomposed by the empirical mode decomposition (EMD) algorithm, and the intrinsic mode functions carrying the main information are screened out by the correlation coefficient and the variance contribution rate. Second, the Hilbert transform is applied to the main intrinsic mode functions to obtain the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indexes are extracted from the Hilbert marginal spectrum to construct the recognition feature vector of the tool wear state. The results show that the extracted features effectively characterize the different wear states of the tool, providing a basis for monitoring tool wear condition.
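
    The pipeline described (analytic signal via the Hilbert transform, instantaneous amplitude and frequency, then accumulation over time into a marginal spectrum) can be sketched as follows. This is a minimal illustration using two synthetic tones in place of real EMD-derived intrinsic mode functions; the EMD step itself is omitted.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def hilbert_marginal_spectrum(imfs, fs, n_bins=64):
    """Sum instantaneous amplitude into frequency bins over all IMFs."""
    edges = np.linspace(0.0, fs / 2, n_bins + 1)
    spectrum = np.zeros(n_bins)
    for imf in imfs:
        z = analytic_signal(imf)
        amp = np.abs(z)[:-1]                            # instantaneous amplitude
        phase = np.unwrap(np.angle(z))
        inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz
        idx = np.clip(np.digitize(inst_freq, edges) - 1, 0, n_bins - 1)
        np.add.at(spectrum, idx, amp)                   # accumulate over time
    return edges[:-1], spectrum

# Synthetic 'IMFs': two tones standing in for EMD output
fs, t = 1000.0, np.arange(2000) / 1000.0
imfs = [np.sin(2 * np.pi * 50 * t), 0.5 * np.sin(2 * np.pi * 120 * t)]
freqs, hms = hilbert_marginal_spectrum(imfs, fs)
print(freqs[np.argmax(hms)])  # lower edge of the bin holding the dominant 50 Hz tone
```

    In practice the intrinsic mode functions come from EMD of the measured cutting signal, and amplitude-domain indexes computed on the marginal spectrum form the recognition feature vector.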

  11. The Department of Energy (DOE) research program in structural analysis of vertical-axis wind turbines

    NASA Astrophysics Data System (ADS)

    Sullivan, W. N.

    The Darrieus-type Vertical Axis Wind Turbine (VAWT) presents a variety of unusual structural problems to designers. The level of understanding of these structural problems governs, to a large degree, the success or failure of today's rotor designs. A survey is presented of the technology available for rotor structural design with emphasis on the DOE research program now underway. Itemizations are included of the major structural issues unique to the VAWT along with discussion of available analysis techniques for each problem area. It is concluded that tools are available to at least approximately address the most important problems. However, experimental data for confirmation is rather limited in terms of volume and the range of rotor configurations tested.

  12. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  13. Tertiary structure prediction and identification of druggable pocket in the cancer biomarker – Osteopontin-c

    PubMed Central

    2014-01-01

    Background Osteopontin (Eta, secreted sialoprotein 1, opn) is secreted from different cell types, including cancer cells. Three splice variants, osteopontin-a, osteopontin-b and osteopontin-c, have been identified. Strikingly, osteopontin-c is found to be elevated in almost all types of cancer cells, which motivated its sequence analysis and structure prediction, opening avenues for prognostic, therapeutic and preventive cancer research. Methods The osteopontin-c gene sequence was determined from a breast cancer sample and translated to a protein sequence, which was then analyzed using various software and web tools for binding pockets, docking and druggability analysis. Owing to the lack of homologous templates, the tertiary structure was predicted using the ab initio server I-TASSER and was evaluated after refinement using web tools. The refined structure was compared with the known bone sialoprotein electron microscopic structure and docked with CD44 for binding analysis, and binding pockets were identified for drug design. Results A signal sequence of about sixteen amino acid residues was identified using signal sequence prediction servers. Because no structures of similar proteins were available, the three-dimensional structure of osteopontin-c was predicted using the I-TASSER server, refined with the SUMMA server, and validated using the SAVES server. Molecular dynamics analysis was carried out using the GROMACS software. The final model was built and used for docking with CD44, and druggable pockets were identified using pocket energies. Conclusions The tertiary structure of osteopontin-c was predicted successfully using the ab initio method; the predictions indicate that osteopontin-c is fibrous in nature, comparable to fibronectin. Docking studies showed significant similarities in the QSAET motif in the interaction of CD44 with the normal and splice-variant forms of osteopontin, and binding pocket analyses revealed several pockets, paving the way to the identification of a druggable pocket. PMID:24401206

  14. Integrated dynamic analysis simulation of space stations with controllable solar array

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    A methodology is formulated and presented for the integrated structural dynamic analysis of space stations with controllable solar arrays and non-controllable appendages. The structural flexibility characteristics of the system are considered in the dynamic analysis by a synthesis technique in which free-free space station modal coordinates and cantilever appendage coordinates are inertially coupled. A digital simulation of this analysis method is described and verified by comparing interaction load solutions with other methods of solution. Motion equations are simulated for both the zero-gravity and artificial-gravity (spinning) orbital conditions. Closed-loop control dynamics, for both orientation control of the arrays and attitude control of the space station, are provided in the simulation by various generic types of controlling systems. The capability of the simulation as a design tool is demonstrated using typical space station and solar array structural representations and a specific structural perturbing force. Response and interaction load solutions are presented for this structural configuration and indicate the importance of using an integrated analysis for predicting structural interactions.

  15. Structural analysis of glycoproteins: building N-linked glycans with Coot.

    PubMed

    Emsley, Paul; Crispin, Max

    2018-04-01

    Coot is a graphics application that is used to build or manipulate macromolecular models; its particular forte is manipulation of the model at the residue level. The model-building tools of Coot have been combined and extended to assist or automate the building of N-linked glycans. The model is built by the addition of monosaccharides, placed by variation of internal coordinates. The subsequent model is refined by real-space refinement, which is stabilized with modified and additional restraints. It is hoped that these enhanced building tools will help to reduce building errors of N-linked glycans and improve our knowledge of the structures of glycoproteins.

  16. Frequency analysis for modulation-enhanced powder diffraction.

    PubMed

    Chernyshov, Dmitry; Dyadkin, Vadim; van Beek, Wouter; Urakawa, Atsushi

    2016-07-01

    Periodic modulation of external conditions on a crystalline sample with a consequent analysis of periodic diffraction response has been recently proposed as a tool to enhance experimental sensitivity for minor structural changes. Here the intensity distributions for both a linear and nonlinear structural response induced by a symmetric and periodic stimulus are analysed. The analysis is further extended for powder diffraction when an external perturbation changes not only the intensity of Bragg lines but also their positions. The derived results should serve as a basis for a quantitative modelling of modulation-enhanced diffraction data measured in real conditions.
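
    The basic effect can be demonstrated numerically: because measured intensity is the squared modulus of the structure factor, even a purely linear structural response to a sinusoidal stimulus produces intensity harmonics at both the stimulus frequency and its double. The toy simulation below (invented numbers, not data from the paper) shows exactly those two harmonics.

```python
import numpy as np

# Intensity response of a Bragg reflection under a periodic stimulus:
# a linear structure-factor response F(t) = F0 + a*sin(w*t) gives
# I(t) = |F|^2 with harmonics at w (cross term) and 2w (sin^2 term).
n_period, n_cycles = 64, 8
t = np.arange(n_period * n_cycles)
w = 2 * np.pi / n_period                 # stimulus frequency (rad/sample)
F0, a = 10.0, 1.0
intensity = (F0 + a * np.sin(w * t)) ** 2

spec = np.abs(np.fft.rfft(intensity - intensity.mean()))
k = np.argsort(spec)[::-1][:2]           # two strongest harmonics
print(sorted((k // n_cycles).tolist()))  # harmonic orders relative to the stimulus
```

    Analyzing the measured intensities at these harmonic frequencies is what isolates the small responding part of the structure from the static background.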

  17. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  18. A simplified method of evaluating the stress wave environment of internal equipment

    NASA Technical Reports Server (NTRS)

    Colton, J. D.; Desmond, T. P.

    1979-01-01

    A simplified method called the transfer function technique (TFT) was devised for evaluating the stress wave environment in a structure containing internal equipment. The TFT consists of following the initial in-plane stress wave that propagates through a structure subjected to a dynamic load and characterizing how the wave is altered as it is transmitted through intersections of structural members. As a basis for evaluating the TFT, impact experiments and detailed stress wave analyses were performed for structures with two, three, or more members. Transfer functions that relate the wave transmitted through an intersection to the incident wave were deduced from the predicted wave response. By sequentially applying these transfer functions to a structure with several intersections, it was found that the environment produced by the initial stress wave propagating through the structure can be approximated well. The TFT can be used as a design tool or as an analytical tool to determine whether a more detailed wave analysis is warranted.
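
    The sequential application of transfer functions described above amounts to chaining per-intersection factors along the wave path. The sketch below uses invented transmission factors purely to illustrate the bookkeeping; real factors would come from the impact experiments or detailed wave analyses mentioned in the study.

```python
# Transfer function technique (TFT) sketch: the initial stress wave is
# followed through a chain of member intersections, each characterized
# by a transmission factor. The values below are illustrative only.

def transmitted_amplitude(incident, transfer_factors):
    """Apply intersection transfer functions sequentially."""
    amp = incident
    for tf in transfer_factors:
        amp *= tf
    return amp

# A unit-amplitude wave passing three intersections
print(transmitted_amplitude(1.0, [0.6, 0.8, 0.5]))
```

    In general each transfer function could be frequency-dependent (an array per intersection) rather than a scalar, but the sequential-multiplication structure is the same.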

  19. A Data-Driven Diagnostic Framework for Wind Turbine Structures: A Holistic Approach

    PubMed Central

    Bogoevska, Simona; Spiridonakos, Minas; Chatzi, Eleni; Dumova-Jovanoska, Elena; Höffer, Rudiger

    2017-01-01

    The complex dynamics of operational wind turbine (WT) structures challenges the applicability of existing structural health monitoring (SHM) strategies for condition assessment. At the center of Europe’s renewable energy strategic planning, WT systems call for implementation of strategies that may describe the WT behavior in its complete operational spectrum. The framework proposed in this paper relies on the symbiotic treatment of acting environmental/operational variables and the monitored vibration response of the structure. The approach aims at accurate simulation of the temporal variability characterizing the WT dynamics, and subsequently at the tracking of the evolution of this variability in a longer-term horizon. The bi-component analysis tool is applied on long-term data, collected as part of continuous monitoring campaigns on two actual operating WT structures located in different sites in Germany. The obtained data-driven structural models verify the potential of the proposed strategy for development of an automated SHM diagnostic tool. PMID:28358346

  20. Structural coloration of metallic surfaces with micro/nano-structures induced by elliptical vibration texturing

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Pan, Yayue; Guo, Ping

    2017-04-01

    Creating orderly periodic micro/nano-structures on metallic surfaces, or structural coloration, for control of surface apparent color and optical reflectivity has been an exciting research topic over the years. The direct applications of structural coloration include color marking, display devices, and invisibility cloak. This paper presents an efficient method to colorize metallic surfaces with periodic micro/nano-gratings using elliptical vibration texturing. When the tool vibration is coupled with a constant cutting velocity, controlled periodic ripples can be generated due to the overlapping tool trajectory. These periodic ripples with a wavelength near visible spectrum can act as micro-gratings to introduce iridescent colors. The proposed technique also provides a flexible method for color marking of metallic surfaces with arbitrary patterns and images by precise control of the spacing distance and orientation of induced micro/nano-ripples. Theoretical analysis and experimental results are given to demonstrate structural coloration of metals by a direct mechanical machining technique.
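
    The link between machining parameters and color can be made concrete with the grating equation. Assuming normal incidence and a ripple spacing set by cutting speed divided by vibration frequency (illustrative numbers below, not values from the paper), the first diffraction order for a given wavelength appears at:

```python
import math

# Ripples left by elliptical vibration texturing act as a surface grating.
# First-order grating equation at normal incidence: d * sin(theta) = m * lambda.
def first_order_angle_deg(spacing_nm, wavelength_nm):
    """Diffraction angle (degrees) of the first order, if it exists."""
    s = wavelength_nm / spacing_nm
    return math.degrees(math.asin(s)) if s <= 1.0 else None

# Ripple spacing = cutting speed / vibration frequency, e.g.
# 2 m/min at 40 kHz -> (2/60) m/s / 40000 Hz, converted to nm
spacing = (2 / 60) / 40_000 * 1e9
print(round(spacing))                                   # ~833 nm
print(round(first_order_angle_deg(spacing, 550.0), 1))  # green light
```

    Changing the spacing or the viewing angle shifts which wavelength is diffracted toward the observer, which is the mechanism behind the apparent color control.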

  1. Image Analysis of DNA Fiber and Nucleus in Plants.

    PubMed

    Ohmido, Nobuko; Wako, Toshiyuki; Kato, Seiji; Fukui, Kiichi

    2016-01-01

    Advances in cytology have led to the application of a wide range of visualization methods in plant genome studies. Image analysis methods are indispensable tools where morphology, density, and color play important roles in biological systems. Visualization and image analysis are useful techniques for analyzing the detailed structure and function of extended DNA fibers (EDFs) and interphase nuclei. The EDF offers the highest spatial resolving power for revealing genome structure and can be used for physical mapping, especially for closely spaced genes and tandemly repeated sequences. On the other hand, analyzing nuclear DNA and proteins reveals nuclear structure and function. In this chapter, we describe an image analysis protocol for quantitatively analyzing two types of plant genome material: EDFs and interphase nuclei.
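
    For EDF-based physical mapping, the key quantitative step is converting a traced fiber length in pixels to a size in kilobases. A minimal sketch, assuming B-form DNA (0.34 nm per base pair) and an invented pixel calibration:

```python
def fiber_kb(length_px, um_per_px=0.1):
    """Approximate size (kb) of a traced DNA fiber, assuming the fiber is
    extended close to the B-DNA contour length of 0.34 nm per base pair.
    The 0.1 um/pixel calibration is a placeholder, not a measured value."""
    length_um = length_px * um_per_px          # pixels -> micrometres
    length_bp = length_um * 1000.0 / 0.34      # nm per um / nm per bp -> bp
    return length_bp / 1000.0                  # -> kilobases

# A 100-pixel fiber at 0.1 um/pixel is 10 um, i.e. roughly 29 kb
print(round(fiber_kb(100), 1))
```

    Real EDF preparations may be stretched relative to the crystallographic value, so a sample-specific calibration factor is normally applied.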

  2. Composite structural materials

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1981-01-01

    The composite aircraft program component (CAPCOMP) is a graduate level project conducted in parallel with a composite structures program. The composite aircraft program glider (CAPGLIDE) is an undergraduate demonstration project which has as its objectives the design, fabrication, and testing of a foot launched ultralight glider using composite structures. The objective of the computer aided design (COMPAD) portion of the composites project is to provide computer tools for the analysis and design of composite structures. The major thrust of COMPAD is in the finite element area with effort directed at implementing finite element analysis capabilities and developing interactive graphics preprocessing and postprocessing capabilities. The criteria for selecting research projects to be conducted under the innovative and supporting research (INSURE) program are described.

  3. WEBnm@ v2.0: Web server and services for comparing protein flexibility.

    PubMed

    Tiwari, Sandhya P; Fuglebakk, Edvin; Hollup, Siv M; Skjærven, Lars; Cragnolini, Tristan; Grindhaug, Svenn H; Tekle, Kidane M; Reuter, Nathalie

    2014-12-30

    Normal mode analysis (NMA) using elastic network models is a reliable and cost-effective computational method to characterise protein flexibility and, by extension, their dynamics. Further insight into the dynamics-function relationship can be gained by comparing protein motions between protein homologs and functional classifications. This can be achieved by comparing normal modes obtained from sets of evolutionary related proteins. We have developed an automated tool for comparative NMA of a set of pre-aligned protein structures. The user can submit a sequence alignment in the FASTA format and the corresponding coordinate files in the Protein Data Bank (PDB) format. The computed normalised squared atomic fluctuations and atomic deformation energies of the submitted structures can be easily compared on graphs provided by the web user interface. The web server provides pairwise comparison of the dynamics of all proteins included in the submitted set using two measures: the Root Mean Squared Inner Product and the Bhattacharyya Coefficient. The Comparative Analysis has been implemented on our web server for NMA, WEBnm@, which also provides recently upgraded functionality for NMA of single protein structures. This includes new visualisations of protein motion, visualisation of inter-residue correlations and the analysis of conformational change using the overlap analysis. In addition, programmatic access to WEBnm@ is now available through a SOAP-based web service. WEBnm@ is available at http://apps.cbu.uib.no/webnma . WEBnm@ v2.0 is an online tool offering unique capability for comparative NMA on multiple protein structures. Along with a convenient web interface, powerful computing resources, and several methods for mode analyses, WEBnm@ facilitates the assessment of protein flexibility within protein families and superfamilies. These analyses can give a good view of how the structures move and how the flexibility is conserved over the different structures.
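
    The elastic-network idea underlying such NMA servers can be illustrated with its simplest variant, a Gaussian network model: connect residues within a cutoff, diagonalize the resulting Kirchhoff matrix, and read relative fluctuations off the non-zero modes. This is a generic sketch, not WEBnm@'s actual implementation.

```python
import numpy as np

def gnm_fluctuations(coords, cutoff=7.0):
    """Relative squared fluctuations from a Gaussian network model,
    the simplest elastic-network flavour of normal mode analysis."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    gamma = -(d < cutoff).astype(float)          # contact map -> off-diagonal
    np.fill_diagonal(gamma, 0.0)
    np.fill_diagonal(gamma, -gamma.sum(axis=1))  # graph-Laplacian diagonal
    vals, vecs = np.linalg.eigh(gamma)
    # skip the zero mode (rigid body); fluctuations ~ sum v_i^2 / lambda
    flucts = np.zeros(n)
    for lam, v in zip(vals[1:], vecs.T[1:]):
        flucts += v ** 2 / lam
    return flucts / flucts.max()

# Toy 'C-alpha trace': a straight 10-residue chain at 3.8 A spacing;
# the free ends should fluctuate most
coords = np.array([[i * 3.8, 0.0, 0.0] for i in range(10)])
f = gnm_fluctuations(coords)
print(np.argmax(f))
```

    Comparative NMA then amounts to computing such mode sets for each aligned structure and comparing them with overlap measures like the Root Mean Squared Inner Product.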

  4. PAT: predictor for structured units and its application for the optimization of target molecules for the generation of synthetic antibodies.

    PubMed

    Jeon, Jouhyun; Arnold, Roland; Singh, Fateh; Teyra, Joan; Braun, Tatjana; Kim, Philip M

    2016-04-01

    The identification of structured units in a protein sequence is an important first step for most biochemical studies. Importantly for this study, the identification of a stable structured region is a crucial first step in generating novel synthetic antibodies. While many approaches to find domains or predict structured regions exist, important limitations remain, such as the optimization of domain boundaries and the lack of identification of non-domain structured units. Moreover, no integrated tool exists to find and optimize structural domains within protein sequences. Here, we describe a new tool, PAT ( http://www.kimlab.org/software/pat ) that can efficiently identify both domains (with optimized boundaries) and non-domain putative structured units. PAT automatically analyzes various structural properties, evaluates the folding stability, and reports possible structural domains in a given protein sequence. For reliability evaluation of PAT, we applied PAT to identify antibody target molecules based on the notion that soluble and well-defined protein secondary and tertiary structures are appropriate target molecules for synthetic antibodies. PAT is an efficient and sensitive tool to identify structured units. A performance analysis shows that PAT can characterize structurally well-defined regions in a given sequence and outperforms other efforts to define reliable boundaries of domains. Notably, PAT successfully identifies experimentally confirmed target molecules for antibody generation. PAT also offers the pre-calculated results of 20,210 human proteins to accelerate common queries. PAT can therefore help to investigate large-scale structured domains and improve the success rate for synthetic antibody generation.

  5. Structural optimization: Status and promise

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.

    Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)

  6. The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.

    PubMed

    Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A

    2010-03-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).

  7. The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software

    PubMed Central

    Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung

    2010-01-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162

  8. Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2001-01-01

    A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.

  9. Direct glycan structure determination of intact N-linked glycopeptides by low-energy collision-induced dissociation tandem mass spectrometry and predicted spectral library searching.

    PubMed

    Pai, Pei-Jing; Hu, Yingwei; Lam, Henry

    2016-08-31

    Intact glycopeptide MS analysis to reveal site-specific protein glycosylation is an important frontier of proteomics. However, computational tools for analyzing MS/MS spectra of intact glycopeptides are still limited and not well-integrated into existing workflows. In this work, a new computational tool which combines the spectral library building/searching tool, SpectraST (Lam et al. Nat. Methods 2008, 5, 873-875), and the glycopeptide fragmentation prediction tool, MassAnalyzer (Zhang et al. Anal. Chem. 2010, 82, 10194-10202) for intact glycopeptide analysis has been developed. Specifically, this tool enables the determination of the glycan structure directly from low-energy collision-induced dissociation (CID) spectra of intact glycopeptides. Given a list of possible glycopeptide sequences as input, a sample-specific spectral library of MassAnalyzer-predicted spectra is built using SpectraST. Glycan identification from CID spectra is achieved by spectral library searching against this library, in which both m/z and intensity information of the possible fragmentation ions are taken into consideration for improved accuracy. We validated our method using a standard glycoprotein, human transferrin, and evaluated its potential to be used in site-specific glycosylation profiling of glycoprotein datasets from LC-MS/MS. In addition, we further applied our method to reveal, for the first time, the site-specific N-glycosylation profile of recombinant human acetylcholinesterase expressed in HEK293 cells. For maximum usability, SpectraST is developed as part of the Trans-Proteomic Pipeline (TPP), a freely available and open-source software suite for MS data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
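
    The core scoring step of spectral library searching, matching both m/z and intensity, is commonly a normalized dot product between binned spectra. The sketch below is a generic illustration with made-up peak lists, not SpectraST's actual scoring function.

```python
import numpy as np

def bin_spectrum(peaks, bin_width=1.0, max_mz=2000.0):
    """Turn (m/z, intensity) pairs into a fixed-length vector."""
    vec = np.zeros(int(max_mz / bin_width))
    for mz, inten in peaks:
        vec[int(mz / bin_width)] += inten
    return vec

def dot_score(query, library):
    """Normalized dot product, the classic spectral-library match score."""
    q, l = bin_spectrum(query), bin_spectrum(library)
    return float(q @ l / (np.linalg.norm(q) * np.linalg.norm(l)))

# Hypothetical predicted vs. observed glycopeptide fragment peaks
predicted = [(204.09, 50.0), (366.14, 100.0), (528.19, 30.0)]
observed  = [(204.10, 45.0), (366.10, 90.0), (600.00, 5.0)]
print(round(dot_score(observed, predicted), 3))
```

    Matching against every candidate glycan's predicted spectrum and keeping the best score is what turns fragmentation prediction into direct glycan structure determination.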

  10. Factor analysis and Mokken scaling of the Organizational Commitment Questionnaire in nurses.

    PubMed

    Al-Yami, M; Galdas, P; Watson, R

    2018-03-22

    To generate an Arabic version of the Organizational Commitment Questionnaire that would be easily understood by Arabic speakers and sensitive to Arabic culture. The nursing workforce in Saudi Arabia is undergoing a process of Saudization, but there is a need to understand the factors that will help to retain this workforce. No organizational commitment tools specifically designed for health organizations exist in Arabic. An Arabic version of the organizational commitment tool could help Arabic-speaking employers understand their employees' perceptions of their organizations. Translation and back-translation were followed by factor analysis (principal components analysis and confirmatory factor analysis) to test factorial validity, and by item response theory (Mokken scaling). A two-factor structure was obtained for the Organizational Commitment Questionnaire, comprising Factor 1: Value commitment, and Factor 2: Commitment to stay, with acceptable reliability measured by internal consistency. A Mokken scale was obtained including items from both factors, showing a hierarchy of items running from commitment to the organization to commitment to self. This study shows that the Arabic version of the OCQ retained the established two-factor structure of the original English-language version, although the two factors ('value commitment' and 'commitment to stay') contradict the original developers' single-factor claim. A useful insight into the structure of the Organizational Commitment Questionnaire has been obtained, with the novel addition of a hierarchical scale. The Organizational Commitment Questionnaire is now ready to be used with nurses in the Arabic-speaking world and could serve as a tool to measure the contemporary commitment of nursing employees and in future interventions aimed at increasing commitment and retention of valuable nursing staff. © 2018 International Council of Nurses.
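
    The reliability reported for each factor, internal consistency, is usually Cronbach's alpha, which can be computed directly from an item-score matrix. Below is a generic sketch on simulated Likert-style data, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 1-5 Likert responses to four items driven by one latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=200)
items = np.clip(np.round(3 + latent[:, None]
                         + rng.normal(scale=0.7, size=(200, 4))), 1, 5)
print(round(cronbach_alpha(items), 2))
```

    Alpha above roughly 0.7 is the conventional threshold for acceptable internal consistency of a subscale.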

  11. A Petri Net Approach Based Elementary Siphons Supervisor for Flexible Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Abdul-Hussin, Mowafak Hassan

    2015-05-01

    This paper presents an approach to constructing a class of S3PR nets for the modeling, simulation, and control of processes occurring in flexible manufacturing systems (FMS), based on the elementary siphons of a Petri net. Siphons are central to the analysis and control of deadlocks in FMS, and controlling them is a significant objective. Petri net models enable efficient structural analysis of FMSs, and different control policies can be implemented to achieve deadlock prevention. We present an effective deadlock-free policy for a special class of Petri nets called S3PR. Structural analysis and reachability graph analysis are used in simulation to analyze and control the nets. Petri nets have been successfully established as one of the most powerful tools for modeling FMS; using structural analysis, we show that liveness of such systems can be attributed to the absence of undermarked siphons.
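
    A siphon is a set of places S whose input transitions are a subset of its output transitions (•S ⊆ S•); once a siphon loses all its tokens it can never regain them, which is why insufficiently marked siphons are tied to deadlock. A minimal membership check (toy two-place net, not an S3PR example from the paper):

```python
def is_siphon(places, pre, post):
    """Check whether a set of places is a siphon: every transition that
    deposits tokens into the set also consumes from it.

    pre[t]  : places transition t consumes from
    post[t] : places transition t deposits into
    """
    s = set(places)
    input_trans = {t for t, outs in post.items() if s & set(outs)}   # •S
    output_trans = {t for t, ins in pre.items() if s & set(ins)}     # S•
    return input_trans <= output_trans

# Tiny net: t1 moves a token p1 -> p2, t2 moves it back p2 -> p1
pre  = {"t1": ["p1"], "t2": ["p2"]}
post = {"t1": ["p2"], "t2": ["p1"]}
print(is_siphon({"p1", "p2"}, pre, post))  # the whole place set is a siphon
```

    Deadlock-prevention policies like the one described add monitor places so that every (elementary) siphon stays sufficiently marked in all reachable states.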

  12. NASA Tech Briefs, January 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics covered include: Fiber-Optic Sensor Would Monitor Growth of Polymer Film; Sensors for Pointing Moving Instruments Toward Each Other; Pd/CeO2/SiC Chemical Sensors; Microparticle Flow Sensor; Scattering-Type Surface-Plasmon-Resonance Biosensors; Diode-Laser-Based Spectrometer for Sensing Gases; Improved Cathode Structure for a Direct Methanol Fuel Cell; X-Band, 17-Watt Solid-State Power Amplifier; Improved Anode for a Direct Methanol Fuel Cell; Tools for Designing and Analyzing Structures; Interactive Display of Scenes with Annotations; Solving Common Mathematical Problems; Tools for Basic Statistical Analysis; Program Calculates Forces in Bolted Structural Joints; Integrated Structural Analysis and Test Program; Molybdate Coatings for Protecting Aluminum Against Corrosion; Synthesizing Diamond from Liquid Feedstock; Modifying Silicates for Better Dispersion in Nanocomposites; Powder-Collection System for Ultrasonic/Sonic Drill/Corer; Semiautomated, Reproducible Batch Processing of Soy; Hydrogen Peroxide Enhances Removal of NOx from Flue Gases; Subsurface Ice Probe; Real-Time Simulation of Aeroheating of the Hyper-X Airplane; Using Laser-Induced Incandescence To Measure Soot in Exhaust; Method of Real-Time Principal-Component Analysis; Insect-Inspired Flight Control for Unmanned Aerial Vehicles; Domain Compilation for Embedded Real-Time Planning; Semantic Metrics for Analysis of Software; Simulation of Laser Cooling and Trapping in Engineering Applications; Large Fluvial Fans and Exploration for Hydrocarbons; Doping-Induced Interband Gain in InAs/AlSb Quantum Wells; Development of Software for a Lidar-Altimeter Processor; Upgrading the Space Shuttle Caution and Warning System; and Fractal Reference Signals in Pulse-Width Modulation.

  13. Exploring Human Diseases and Biological Mechanisms by Protein Structure Prediction and Modeling.

    PubMed

    Wang, Juexin; Luttrell, Joseph; Zhang, Ning; Khan, Saad; Shi, NianQing; Wang, Michael X; Kang, Jing-Qiong; Wang, Zheng; Xu, Dong

    2016-01-01

    Protein structure prediction and modeling provide a tool for understanding protein functions by computationally constructing protein structures from amino acid sequences and analyzing them. With help from protein prediction tools and web servers, users can obtain three-dimensional protein structure models and gain knowledge of the proteins' functions. In this chapter, we will provide several examples of such studies. As an example, structure modeling methods were used to investigate the relationship between mutation-caused protein misfolding and human diseases including epilepsy and leukemia. Protein structure prediction and modeling were also applied to nucleotide-gated channels and their interaction interfaces to investigate their roles in brain and heart cells. In molecular mechanism studies of plants, the rice salinity tolerance mechanism was studied via structure modeling of crucial proteins identified by systems biology analysis; trait-associated protein-protein interactions were modeled, which sheds some light on the roles of mutations in soybean oil/protein content. In the age of precision medicine, we believe protein structure prediction and modeling will play increasingly important roles in investigating the biomedical mechanisms of diseases and in drug design.

  14. Structural design of the Sandia 34-M Vertical Axis Wind Turbine

    NASA Astrophysics Data System (ADS)

    Berg, D. E.

    Sandia National Laboratories, as the lead DOE laboratory for Vertical Axis Wind Turbine (VAWT) development, is currently designing a 34-meter diameter Darrieus-type VAWT. This turbine will be a research test bed which provides a focus for advancing technology and validating design and fabrication techniques in a size range suitable for utility use. Structural data from this machine will allow structural modeling to be refined and verified for a turbine on which the gravity effects and stochastic wind loading are significant. Performance data from it will allow aerodynamic modeling to be refined and verified. The design effort incorporates Sandia's state-of-the-art analysis tools in the design of a complete machine. The analytic tools used in this design are discussed and the conceptual design procedure is described.

  15. Study of metallic structural design concepts for an arrow wing supersonic cruise configuration

    NASA Technical Reports Server (NTRS)

    Turner, M. J.; Grande, D. L.

    1977-01-01

    A structural design study was made, to assess the relative merits of various metallic structural concepts and materials for an advanced supersonic aircraft cruising at Mach 2.7. Preliminary studies were made to ensure compliance of the configuration with general design criteria, integrate the propulsion system with the airframe, select structural concepts and materials, and define an efficient structural arrangement. An advanced computerized structural design system was used, in conjunction with a relatively large, complex finite element model, for detailed analysis and sizing of structural members to satisfy strength and flutter criteria. A baseline aircraft design was developed for assessment of current technology. Criteria, analysis methods, and results are presented. The effect on design methods of using the computerized structural design system was appraised, and recommendations are presented concerning further development of design tools, development of materials and structural concepts, and research on basic technology.

  16. Interactive visualization of multi-data-set Rietveld analyses using Cinema:Debye-Scherrer.

    PubMed

    Vogel, Sven C; Biwer, Chris M; Rogers, David H; Ahrens, James P; Hackenberg, Robert E; Onken, Drew; Zhang, Jianzhong

    2018-06-01

    A tool named Cinema:Debye-Scherrer to visualize the results of a series of Rietveld analyses is presented. The multi-axis visualization of the high-dimensional data sets resulting from powder diffraction analyses allows identification of analysis problems, prediction of suitable starting values, identification of gaps in the experimental parameter space and acceleration of scientific insight from the experimental data. The tool is demonstrated with analysis results from 59 U-Nb alloy samples with different compositions, annealing times and annealing temperatures as well as with a high-temperature study of the crystal structure of CsPbBr 3 . A script to extract parameters from a series of Rietveld analyses employing the widely used GSAS Rietveld software is also described. Both software tools are available for download.
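The "multi-axis visualization" described here is a parallel-coordinates display. As a hedged sketch of its core preprocessing step (parameter names and values are invented, not data from the paper), each fitted parameter is rescaled to a shared [0, 1] axis so that every Rietveld run becomes one polyline across the axes:

```python
import numpy as np

# rows = Rietveld analysis runs, columns = fitted parameters, e.g.
# (lattice constant, Rwp, phase fraction, annealing temperature);
# all values are invented for illustration
params = np.array([
    [3.52, 6.1, 0.82, 300.0],
    [3.54, 5.4, 0.75, 450.0],
    [3.57, 7.8, 0.60, 600.0],
])

# rescale each column to [0, 1] so axes with different units can share
# one chart; each row then traces one polyline across the parallel axes
lo, hi = params.min(axis=0), params.max(axis=0)
polylines = (params - lo) / (hi - lo)
```

Problematic analyses then stand out as polylines that break away from the bundle formed by the well-behaved runs.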

  17. Interactive visualization of multi-data-set Rietveld analyses using Cinema:Debye-Scherrer

    PubMed Central

    Biwer, Chris M.; Rogers, David H.; Ahrens, James P.; Hackenberg, Robert E.; Onken, Drew; Zhang, Jianzhong

    2018-01-01

    A tool named Cinema:Debye-Scherrer to visualize the results of a series of Rietveld analyses is presented. The multi-axis visualization of the high-dimensional data sets resulting from powder diffraction analyses allows identification of analysis problems, prediction of suitable starting values, identification of gaps in the experimental parameter space and acceleration of scientific insight from the experimental data. The tool is demonstrated with analysis results from 59 U–Nb alloy samples with different compositions, annealing times and annealing temperatures as well as with a high-temperature study of the crystal structure of CsPbBr3. A script to extract parameters from a series of Rietveld analyses employing the widely used GSAS Rietveld software is also described. Both software tools are available for download. PMID:29896062

  18. TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.

    PubMed

    Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han

    2017-03-01

    High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline, and are difficult to appropriately integrate with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.
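As a rough illustration of two of the pipeline stages listed above (TRAPR itself is an R package; this is a hedged sketch in Python with invented counts), low-count genes are filtered out before a per-sample normalization and transformation:

```python
import numpy as np

# invented count matrix: rows = genes, columns = samples
counts = np.array([
    [100.0, 120.0,  90.0],   # steadily expressed gene
    [  2.0,   0.0,   1.0],   # low-count gene, treated as low quality
    [500.0, 450.0, 520.0],
])

keep = counts.mean(axis=1) >= 10.0   # filtering step: drop low-count genes
filtered = counts[keep]

# normalization step: counts-per-million per sample (column),
# followed by a log transform to stabilize variance
cpm = filtered / filtered.sum(axis=0) * 1e6
log_cpm = np.log2(cpm + 1.0)
```

The filtering threshold and the CPM/log2 choices here are illustrative conventions, not necessarily TRAPR's defaults.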

  19. The project organization as a policy tool in implementing welfare reforms in the public sector.

    PubMed

    Jensen, Christian; Johansson, Staffan; Löfström, Mikael

    2013-01-01

    Organizational design is considered in policy literature as a forceful policy tool to put policy to action. However, previous research has not analyzed the project organization as a specific form of organizational design and, hence, has not given much attention to such organizations as a strategic choice when selecting policy tools. The purpose of the article is to investigate the project as a policy tool; how do such temporary organizations function as a specific form of organization when public policy is implemented? The article is based on a framework of policy implementation and is illustrated with two welfare reforms in the Swedish public sector, which were organized and implemented as project organizations. The case studies and the analysis show that it is crucial that a project organization fits into the overall governance structure when used as a policy tool. If not, the project will remain encapsulated and will not have sufficient impact on the permanent organizational structure. The concept of encapsulation indicates a need to protect the project from a potential hostile environment. The implication of this is that organizational design as a policy tool is a matter that deserves more attention in the strategic discussion on implementing public policies and on the suitability of using certain policy tools. Copyright © 2012 John Wiley & Sons, Ltd.

  20. A resource for benchmarking the usefulness of protein structure models.

    PubMed

    Carbajo, Daniel; Tramontano, Anna

    2012-08-02

    Increasingly, biologists and biochemists use computational tools to design experiments to probe the function of proteins and/or to engineer them for a variety of different purposes. The most effective strategies rely on knowledge of the three-dimensional structure of the protein of interest. However, it is often the case that an experimental structure is not available and that models of different quality are used instead. On the other hand, the relationship between the quality of a model and its appropriate use is not easy to derive in general, and so far it has been analyzed in detail only for specific applications. This paper describes a database and related software tools that allow testing of a given structure-based method on models of a protein representing different levels of accuracy. The comparison of the results of a computational experiment on the experimental structure and on a set of its decoy models will allow developers and users to assess which is the specific threshold of accuracy required to perform the task effectively. The ModelDB server automatically builds decoy models of different accuracy for a given protein of known structure and provides a set of useful tools for their analysis. Pre-computed data for a non-redundant set of deposited protein structures are available for analysis and download in the ModelDB database. IMPLEMENTATION, AVAILABILITY AND REQUIREMENTS: Project name: A resource for benchmarking the usefulness of protein structure models. Project home page: http://bl210.caspur.it/MODEL-DB/MODEL-DB_web/MODindex.php. Operating system(s): Platform independent. Programming language: Perl-BioPerl (program); mySQL, Perl DBI and DBD modules (database); php, JavaScript, Jmol scripting (web server). Other requirements: Java Runtime Environment v1.4 or later, Perl, BioPerl, CPAN modules, HHsearch, Modeller, LGA, NCBI Blast package, DSSP, Speedfill (Surfnet) and PSAIA. License: Free. Any restrictions to use by non-academics: No.

  1. Portfolio: a prototype workstation for development and evaluation of tools for analysis and management of digital portal images.

    PubMed

    Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G

    1998-09-01

    The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented data base infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. The acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.

  2. Iterative Development of an Application to Support Nuclear Magnetic Resonance Data Analysis of Proteins.

    PubMed

    Ellis, Heidi J C; Nowling, Ronald J; Vyas, Jay; Martyn, Timothy O; Gryk, Michael R

    2011-04-11

    The CONNecticut Joint University Research (CONNJUR) team is a group of biochemical and software engineering researchers at multiple institutions. The vision of the team is to develop a comprehensive application that integrates a variety of existing analysis tools with workflow and data management to support the process of protein structure determination using Nuclear Magnetic Resonance (NMR). The use of multiple disparate tools and lack of data management, currently the norm in NMR data processing, provides strong motivation for such an integrated environment. This manuscript briefly describes the domain of NMR as used for protein structure determination and explains the formation of the CONNJUR team and its operation in developing the CONNJUR application. The manuscript also describes the evolution of the CONNJUR application through four prototypes and describes the challenges faced while developing the CONNJUR application and how those challenges were met.

  3. Snoopy--a unifying Petri net framework to investigate biomolecular networks.

    PubMed

    Rohr, Christian; Marwan, Wolfgang; Heiner, Monika

    2010-04-01

    To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).

  4. Advanced composite elevator for Boeing 727 aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.

    1980-01-01

    Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.

  5. Bioinformatics and molecular modeling in glycobiology

    PubMed Central

    Schloissnig, Siegfried

    2010-01-01

    The field of glycobiology is concerned with the study of the structure, properties, and biological functions of the family of biomolecules called carbohydrates. Bioinformatics for glycobiology is a particularly challenging field, because carbohydrates exhibit a high structural diversity and their chains are often branched. Significant improvements in experimental analytical methods over recent years have led to a tremendous increase in the amount of carbohydrate structure data generated. Consequently, the availability of databases and tools to store, retrieve and analyze these data in an efficient way is of fundamental importance to progress in glycobiology. In this review, the various graphical representations and sequence formats of carbohydrates are introduced, and an overview of newly developed databases, the latest developments in sequence alignment and data mining, and tools to support experimental glycan analysis are presented. Finally, the field of structural glycoinformatics and molecular modeling of carbohydrates, glycoproteins, and protein–carbohydrate interaction are reviewed. PMID:20364395

  6. Protein Data Bank Japan (PDBj): updated user interfaces, resource description framework, analysis tools for large structures

    PubMed Central

    Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki

    2017-01-01

    The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. PMID:27789697

  7. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  8. Aggregating concept map data to investigate the knowledge of beginning CS students

    NASA Astrophysics Data System (ADS)

    Mühling, Andreas

    2016-07-01

    Concept maps have a long history in educational settings as a tool for teaching, learning, and assessing. As an assessment tool, they are predominantly used to extract the structural configuration of learners' knowledge. This article presents an investigation of the knowledge structures of a large group of beginning CS students. The investigation is based on a method that collects, aggregates, and automatically analyzes the concept maps of a group of learners as a whole, to identify common structural configurations and differences in the learners' knowledge. It shows that those students who attended CS classes during their secondary schooling have, on average, configured their knowledge about typical core CS/OOP concepts differently. Also, artifacts of their particular CS curriculum are visible in their externalized knowledge. The data structures and analysis methods necessary for working with concept landscapes have been implemented as a GNU R package that is freely available.
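The aggregation step described above can be sketched as counting how often each proposition (a directed edge between two concepts) occurs across the learners' maps; the maps and threshold below are invented for illustration, not taken from the article:

```python
from collections import Counter

# each learner's concept map as a set of propositions (directed edges)
maps = [
    {("class", "object"), ("object", "method")},
    {("class", "object"), ("object", "attribute")},
    {("class", "object"), ("object", "method")},
]

# aggregate: how often does each proposition occur across the group?
edge_freq = Counter(edge for m in maps for edge in m)

# a "common structural configuration": propositions shared by a
# majority of the learners
common = {edge for edge, n in edge_freq.items() if n / len(maps) > 0.5}
```

Comparing the `common` sets of two subgroups (e.g. students with and without prior CS education) is one way such differences in knowledge configuration could be surfaced.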

  9. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared to not suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as requirements elicitation method. An information processing model was developed through an analysis of think aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  10. Investigating High-School Chemical Kinetics: The Greek Chemistry Textbook and Students' Difficulties

    ERIC Educational Resources Information Center

    Gegios, Theodoros; Salta, Katerina; Koinis, Spyros

    2017-01-01

    In this study we present an analysis of how the structure and content of the Greek school textbook approaches the concepts of chemical kinetics, and an investigation of the difficulties that 11th grade Greek students face regarding these concepts. Based on the structure and content of the Greek textbook, a tool was developed and applied to…

  11. Nuclear magnetic resonance studies of phosphorus(v) pesticides. Part I. Chemical shifts of protons as a means of identification of pesticides

    USGS Publications Warehouse

    Babad, H.; Herbert, W.; Goldberg, M.C.

    1968-01-01

    Correlations of structural and proton chemical-shift data for 40 commercial phosphorus(V) pesticides are reported. Correlations of structure with the phosphorus coupling constants are discussed, and general trends are noted which aid in the use of NMR as a tool for identification and analysis of phosphorus(V) compounds. © 1968.

  12. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    PubMed

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with completely automated software, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
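The core idea of identifiability can be shown with a numerical sketch (this is not DAISY's differential-algebra algorithm): if two distinct parameter vectors produce identical input-output behaviour, the parameters are not globally identifiable. The toy model below is invented for illustration:

```python
import numpy as np

def output(t, k1, k2, x0=1.0):
    # toy model: x' = -(k1 + k2) * x,  y = x,  so y(t) = x0 * exp(-(k1 + k2) t);
    # the output depends on k1 and k2 only through their sum
    return x0 * np.exp(-(k1 + k2) * t)

t = np.linspace(0.0, 5.0, 50)
y_a = output(t, k1=0.3, k2=0.7)   # one parameter vector
y_b = output(t, k1=0.7, k2=0.3)   # a different one, with the same sum

# identical outputs from distinct parameter vectors: k1 and k2 are not
# globally identifiable individually (only k1 + k2 is). A numerical match
# like this is evidence, not proof; DAISY establishes this symbolically.
indistinguishable = bool(np.allclose(y_a, y_b))
```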

  13. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years, the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises in which they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.
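One of the items such exercises ask students to interpret is the rotation matrix built from exterior-orientation angles. A hedged sketch of the standard omega-phi-kappa composition (the angle values are invented, and PhoX's actual angle conventions may differ):

```python
import numpy as np

def rot_opk(omega, phi, kappa):
    # compose R = R_x(omega) @ R_y(phi) @ R_z(kappa); rotation-order
    # conventions vary between textbooks and software, so treat this
    # particular order as an assumption
    co, so = np.cos(omega), np.sin(omega)
    cp, s_p = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, co, -so], [0.0, so, co]])
    Ry = np.array([[cp, 0.0, s_p], [0.0, 1.0, 0.0], [-s_p, 0.0, cp]])
    Rz = np.array([[ck, -sk, 0.0], [sk, ck, 0.0], [0.0, 0.0, 1.0]])
    return Rx @ Ry @ Rz

R = rot_opk(0.10, -0.05, 1.20)   # invented orientation angles, in radians
```

A quick sanity check of the kind students perform: a valid rotation matrix is orthonormal (R Rᵀ = I) and has determinant +1.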

  14. New Tools Being Developed for Engine- Airframe Blade-Out Structural Simulations

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles

    2003-01-01

    One of the primary concerns of aircraft structure designers is the accurate simulation of the blade-out event. This is required for the aircraft to pass Federal Aviation Administration (FAA) certification and to ensure that the aircraft is safe for operation. Typically, the most severe blade-out occurs when a first-stage fan blade in a high-bypass gas turbine engine is released. Structural loading results from both the impact of the blade onto the containment ring and the subsequent instantaneous unbalance of the rotating components. Reliable simulations of blade-out are required to ensure structural integrity during flight as well as to guarantee successful blade-out certification testing. The loads generated by these analyses are critical to the design teams for several components of the airplane structures including the engine, nacelle, strut, and wing, as well as the aircraft fuselage. Currently, a collection of simulation tools is used for aircraft structural design. Detailed high-fidelity simulation tools are used to capture the structural loads resulting from blade loss, and then these loads are used as input into an overall system model that includes complete structural models of both the engines and the airframe. The detailed simulation (shown in the figure) includes the time-dependent trajectory of the lost blade and its interactions with the containment structure, and the system simulation includes the lost blade loadings and the interactions between the rotating turbomachinery and the remaining aircraft structural components. General-purpose finite element structural analysis codes are typically used, and special provisions are made to include transient effects from the blade loss and rotational effects resulting from the engine's turbomachinery. To develop and validate these new tools with test data, the NASA Glenn Research Center has teamed with GE Aircraft Engines, Pratt & Whitney, Boeing Commercial Aircraft, Rolls-Royce, and MSC.Software.

  15. Differential Targeting of Unpaired Bases within Duplex DNA by the Natural Compound Clerocidin: A Valuable Tool to Dissect DNA Secondary Structure

    PubMed Central

    Nadai, Matteo; Palù, Giorgio; Palumbo, Manlio; Richter, Sara N.

    2012-01-01

    Non-canonical DNA structures have been postulated to mediate protein-nucleic acid interactions and to function as intermediates in the generation of frame-shift mutations when errors in DNA replication occur, which result in a variety of diseases and cancers. Compounds capable of binding to non-canonical DNA conformations may thus have significant diagnostic and therapeutic potential. Clerocidin is a natural diterpenoid which has been shown to selectively react with single-stranded bases without targeting the double helix. Here we performed a comprehensive analysis on several non-canonical DNA secondary structures, namely mismatches, nicks, bulges, hairpins, with sequence variations in both the single-stranded region and the double-stranded flanking segment. By analysis of clerocidin reactivity, we were able to identify the exposed reactive residues which provided information on both the secondary structure and the accessibility of the non-paired sites. Mismatches longer than 1 base were necessary to be reached by clerocidin reactive groups, while 1-base nicks were promptly targeted by clerocidin; in hairpins, clerocidin reactivity increased with the length of the hairpin loop, while, interestingly, reactivity towards bulges reached a maximum in 3-base-long bulges and declined in longer bulges. Electrophoretic mobility shift analysis demonstrated that bulges longer than 3 bases (i.e. 5- and 7-bases) folded or stacked on the duplex region therefore being less accessible by the compound. Clerocidin thus represents a new valuable diagnostic tool to dissect DNA secondary structures. PMID:23285245

  16. Differential targeting of unpaired bases within duplex DNA by the natural compound clerocidin: a valuable tool to dissect DNA secondary structure.

    PubMed

    Nadai, Matteo; Palù, Giorgio; Palumbo, Manlio; Richter, Sara N

    2012-01-01

    Non-canonical DNA structures have been postulated to mediate protein-nucleic acid interactions and to function as intermediates in the generation of frame-shift mutations when errors in DNA replication occur, which result in a variety of diseases and cancers. Compounds capable of binding to non-canonical DNA conformations may thus have significant diagnostic and therapeutic potential. Clerocidin is a natural diterpenoid which has been shown to selectively react with single-stranded bases without targeting the double helix. Here we performed a comprehensive analysis on several non-canonical DNA secondary structures, namely mismatches, nicks, bulges, hairpins, with sequence variations in both the single-stranded region and the double-stranded flanking segment. By analysis of clerocidin reactivity, we were able to identify the exposed reactive residues which provided information on both the secondary structure and the accessibility of the non-paired sites. Mismatches longer than 1 base were necessary to be reached by clerocidin reactive groups, while 1-base nicks were promptly targeted by clerocidin; in hairpins, clerocidin reactivity increased with the length of the hairpin loop, while, interestingly, reactivity towards bulges reached a maximum in 3-base-long bulges and declined in longer bulges. Electrophoretic mobility shift analysis demonstrated that bulges longer than 3 bases (i.e. 5- and 7-bases) folded or stacked on the duplex region therefore being less accessible by the compound. Clerocidin thus represents a new valuable diagnostic tool to dissect DNA secondary structures.

  17. A study of self organized criticality in ion temperature gradient mode driven gyrokinetic turbulence

    NASA Astrophysics Data System (ADS)

    Mavridis, M.; Isliker, H.; Vlahos, L.; Görler, T.; Jenko, F.; Told, D.

    2014-10-01

    An investigation on the characteristics of self organized criticality (Soc) in ITG mode driven turbulence is made, with the use of various statistical tools (histograms, power spectra, Hurst exponents estimated with the rescaled range analysis, and the structure function method). For this purpose, local non-linear gyrokinetic simulations of the cyclone base case scenario are performed with the GENE software package. Although most authors concentrate on global simulations, which seem to be a better choice for such an investigation, we use local simulations in an attempt to study the locally underlying mechanisms of Soc. We also study the structural properties of radially extended structures, with several tools (fractal dimension estimate, cluster analysis, and two dimensional autocorrelation function), in order to explore whether they can be characterized as avalanches. We find that, for large enough driving temperature gradients, the local simulations exhibit most of the features of Soc, with the exception of the probability distribution of observables, which show a tail, yet they are not of power-law form. The radial structures have the same radial extent at all temperature gradients examined; radial motion (transport) though appears only at large temperature gradients, in which case the radial structures can be interpreted as avalanches.
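    The rescaled range (R/S) estimate of the Hurst exponent mentioned in this abstract can be sketched in plain Python. The window sizes and the white-noise test series below are illustrative assumptions for the sketch, not the GENE simulation data:

```python
import math
import random

def rescaled_range(series, n):
    """Average R/S statistic over non-overlapping windows of length n."""
    rs_values = []
    for start in range(0, len(series) - n + 1, n):
        window = series[start:start + n]
        mean = sum(window) / n
        devs = [x - mean for x in window]
        # cumulative deviations from the window mean
        cum, z = 0.0, []
        for d in devs:
            cum += d
            z.append(cum)
        r = max(z) - min(z)                          # range of cumulative deviations
        s = math.sqrt(sum(d * d for d in devs) / n)  # window standard deviation
        if s > 0:
            rs_values.append(r / s)
    return sum(rs_values) / len(rs_values)

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Slope of log(R/S) versus log(n) estimates the Hurst exponent H."""
    xs = [math.log(n) for n in window_sizes]
    ys = [math.log(rescaled_range(series, n)) for n in window_sizes]
    x_mean, y_mean = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

random.seed(0)
# uncorrelated Gaussian noise as a reference signal
noise = [random.gauss(0, 1) for _ in range(4096)]
print(round(hurst_exponent(noise), 2))
```

For uncorrelated noise the slope comes out close to 0.5 (small-sample bias pushes the R/S estimate slightly higher), while persistent, avalanche-like dynamics of the kind associated with SOC push H toward 1.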

  18. A refined Frequency Domain Decomposition tool for structural modal monitoring in earthquake engineering

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2017-07-01

    Output-only structural identification is developed by a refined Frequency Domain Decomposition (rFDD) approach, towards assessing current modal properties of heavy-damped buildings (in terms of identification challenge), under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of all current strong ground motion modal parameters. At this stage, such analysis tool may be employed for convenient application in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
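    The Chebyshev Type II bandpass filtering at the core of the rFDD procedure can be illustrated with SciPy. The two-mode synthetic response, sampling rate, and band edges below are assumptions made for the sketch, not the paper's actual frames or records:

```python
import numpy as np
from scipy import signal

fs = 200.0                      # assumed sampling rate of the response (Hz)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
# synthetic two-mode response: decaying sinusoids at 1.5 Hz and 4.8 Hz plus noise
x = (np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.5 * t)
     + 0.7 * np.exp(-0.08 * t) * np.sin(2 * np.pi * 4.8 * t)
     + 0.1 * rng.standard_normal(t.size))

# Chebyshev Type II bandpass isolating the first mode: flat passband,
# at least 40 dB attenuation in the stopband
sos = signal.cheby2(8, 40, [1.0, 2.0], btype='bandpass', fs=fs, output='sos')
y = signal.sosfiltfilt(sos, x)  # zero-phase filtering avoids phase distortion

# dominant frequency of the filtered signal via the FFT peak
freqs = np.fft.rfftfreq(y.size, 1 / fs)
peak = freqs[np.argmax(np.abs(np.fft.rfft(y)))]
print(round(peak, 2))
```

After filtering, the spectral peak sits at the first mode (near 1.5 Hz in this synthetic case), which is the precondition for estimating its frequency and damping from the band-limited signal.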

  19. A study of self organized criticality in ion temperature gradient mode driven gyrokinetic turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mavridis, M.; Isliker, H.; Vlahos, L.

    2014-10-15

    An investigation on the characteristics of self organized criticality (Soc) in ITG mode driven turbulence is made, with the use of various statistical tools (histograms, power spectra, Hurst exponents estimated with the rescaled range analysis, and the structure function method). For this purpose, local non-linear gyrokinetic simulations of the cyclone base case scenario are performed with the GENE software package. Although most authors concentrate on global simulations, which seem to be a better choice for such an investigation, we use local simulations in an attempt to study the locally underlying mechanisms of Soc. We also study the structural properties of radially extended structures, with several tools (fractal dimension estimate, cluster analysis, and two dimensional autocorrelation function), in order to explore whether they can be characterized as avalanches. We find that, for large enough driving temperature gradients, the local simulations exhibit most of the features of Soc, with the exception of the probability distribution of observables, which show a tail, yet they are not of power-law form. The radial structures have the same radial extent at all temperature gradients examined; radial motion (transport) though appears only at large temperature gradients, in which case the radial structures can be interpreted as avalanches.

  20. Neurocognitive inefficacy of the strategy process.

    PubMed

    Klein, Harold E; D'Esposito, Mark

    2007-11-01

    The most widely used (and taught) protocols for strategic analysis-Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Force Framework for industry analysis-have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process-deductive reasoning-channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies) and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be and, indeed, are entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies with the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain imaging technology to explore complex data handling protocols with richer mental representation and greater potential for strategy creation.

  1. Evaluation in undergraduate medical education: Conceptualizing and validating a novel questionnaire for assessing the quality of bedside teaching.

    PubMed

    Dreiling, Katharina; Montano, Diego; Poinstingl, Herbert; Müller, Tjark; Schiekirka-Schwake, Sarah; Anders, Sven; von Steinbüchel, Nicole; Raupach, Tobias

    2017-08-01

    Evaluation is an integral part of curriculum development in medical education. Given the peculiarities of bedside teaching, specific evaluation tools for this instructional format are needed. Development of these tools should be informed by appropriate frameworks. The purpose of this study was to develop a specific evaluation tool for bedside teaching based on the Stanford Faculty Development Program's clinical teaching framework. Based on a literature review yielding 47 evaluation items, an 18-item questionnaire was compiled and subsequently completed by undergraduate medical students at two German universities. Reliability and validity were assessed in an exploratory full information item factor analysis (study one) and a confirmatory factor analysis as well as a measurement invariance analysis (study two). The exploratory analysis involving 824 students revealed a three-factor structure. Reliability estimates of the subscales were satisfactory (α = 0.71-0.84). The model yielded satisfactory fit indices in the confirmatory factor analysis involving 1043 students. The new questionnaire is short and yet based on a widely-used framework for clinical teaching. The analyses presented here indicate good reliability and validity of the instrument. Future research needs to investigate whether feedback generated from this tool helps to improve teaching quality and student learning outcome.
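    The subscale reliability estimates (Cronbach's α) reported above follow a standard formula: α = k/(k−1) · (1 − Σ item variances / variance of totals). A minimal sketch with hypothetical item scores (not the study's data):

```python
from statistics import variance

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per questionnaire item."""
    k = len(items)
    respondents = list(zip(*items))           # rows: one respondent's answers
    totals = [sum(r) for r in respondents]    # each respondent's subscale total
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# hypothetical 5-point ratings from 6 students on a 3-item subscale
item_scores = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
]
print(round(cronbach_alpha(item_scores), 2))  # → 0.87
```

Values in the 0.71-0.84 range reported by the study indicate acceptable internal consistency for each factor of the questionnaire.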

  2. Functional Analysis of OMICs Data and Small Molecule Compounds in an Integrated "Knowledge-Based" Platform.

    PubMed

    Dubovenko, Alexey; Nikolsky, Yuri; Rakhmatulin, Eugene; Nikolskaya, Tatiana

    2017-01-01

    Analysis of NGS and other sequencing data, gene variants, gene expression, proteomics, and other high-throughput (OMICs) data is challenging because of its biological complexity and high level of technical and biological noise. One way to deal with both problems is to perform analysis with a high fidelity annotated knowledgebase of protein interactions, pathways, and functional ontologies. This knowledgebase has to be structured in a computer-readable format and must include software tools for managing experimental data, analysis, and reporting. Here, we present MetaCore™ and Key Pathway Advisor (KPA), an integrated platform for functional data analysis. On the content side, MetaCore and KPA encompass a comprehensive database of molecular interactions of different types, pathways, network models, and ten functional ontologies covering human, mouse, and rat genes. The analytical toolkit includes tools for gene/protein list enrichment analysis, statistical "interactome" tool for the identification of over- and under-connected proteins in the dataset, and a biological network analysis module made up of network generation algorithms and filters. The suite also features Advanced Search, an application for combinatorial search of the database content, as well as a Java-based tool called Pathway Map Creator for drawing and editing custom pathway maps. Applications of MetaCore and KPA include molecular mode of action of disease research, identification of potential biomarkers and drug targets, pathway hypothesis generation, analysis of biological effects for novel small molecule compounds and clinical applications (analysis of large cohorts of patients, and translational and personalized medicine).

  3. Simulation tools for guided wave based structural health monitoring

    NASA Astrophysics Data System (ADS)

    Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien

    2018-04-01

    Structural Health Monitoring (SHM) is a thematic derived from Non Destructive Evaluation (NDE) based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on the aforementioned setup is available in the literature but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters such as the temperature or geometrical singularities, making guided wave measurement difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to the industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensors positioning or environmental conditions. The first numerical tool is the guided wave module integrated to the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metals) and orthotropic (composites) plate like-structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration in which each embedded sensor acts as both emitter and receiver of guided waves. This tool is under development and will be adapted to simulate complex real-life structures such as curved composite panels with stiffeners. This communication will present these numerical tools and their main functionalities.

  4. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri-nets and DAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.
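    The purely syntactic analysis described here can be sketched as a rule-dependency graph: an edge from rule A to rule B whenever A asserts a fact class that appears in B's condition. The toy rules below are hypothetical; real OPS5 analysis must also handle variables and negated condition elements:

```python
# hypothetical OPS5-like rules, reduced to the fact classes each rule
# matches on (lhs) and asserts (rhs)
rules = {
    "detect":   {"lhs": {"sensor"},          "rhs": {"alert"}},
    "classify": {"lhs": {"alert"},           "rhs": {"category"}},
    "report":   {"lhs": {"category", "log"}, "rhs": {"summary"}},
}

# purely syntactic analysis: rule A can enable rule B if A asserts a fact
# class that occurs in B's condition; the edges form a transition graph
edges = sorted(
    (a, b)
    for a, ra in rules.items()
    for b, rb in rules.items()
    if ra["rhs"] & rb["lhs"]
)
print(edges)  # → [('classify', 'report'), ('detect', 'classify')]
```

Treating the fact classes as states and the rules as transitions is what yields the finite-state-automaton-like description the abstract refers to.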

  5. Structural models used in real-time biosurveillance outbreak detection and outbreak curve isolation from noisy background morbidity levels

    PubMed Central

    Cheng, Karen Elizabeth; Crary, David J; Ray, Jaideep; Safta, Cosmin

    2013-01-01

    Objective: We discuss the use of structural models for the analysis of biosurveillance related data. Methods and results: Using a combination of real and simulated data, we have constructed a data set that represents a plausible time series resulting from surveillance of a large scale bioterrorist anthrax attack in Miami. We discuss the performance of anomaly detection with structural models for these data using receiver operating characteristic (ROC) and activity monitoring operating characteristic (AMOC) analysis. In addition, we show that these techniques provide a method for predicting the level of the outbreak valid for approximately 2 weeks, post-alarm. Conclusions: Structural models provide an effective tool for the analysis of biosurveillance data, in particular for time series with noisy, non-stationary background and missing data. PMID:23037798
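    The ROC analysis used to score such anomaly detectors can be sketched in plain Python by sweeping an alarm threshold over the detector's daily scores. The scores and outbreak labels below are invented for illustration, not the Miami data set:

```python
def roc_points(scores, labels):
    """Sweep thresholds over anomaly scores; return (FPR, TPR) pairs."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = []
    for thr in sorted(set(scores), reverse=True):
        tp = sum(1 for s, l in zip(scores, labels) if s >= thr and l == 1)
        fp = sum(1 for s, l in zip(scores, labels) if s >= thr and l == 0)
        pts.append((fp / neg, tp / pos))
    return [(0.0, 0.0)] + pts

def auc(points):
    """Trapezoidal area under the ROC curve."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2
    return area

# hypothetical daily anomaly scores; label 1 marks outbreak days
scores = [0.1, 0.2, 0.15, 0.9, 0.8, 0.3, 0.95, 0.25]
labels = [0,   0,   0,    1,   1,   0,   1,    0]
pts = roc_points(scores, labels)
print(round(auc(pts), 2))  # → 1.0
```

An AMOC analysis replaces the TPR axis with time-to-detection, but is driven by the same thresholded alarm scores.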

  6. The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis

    PubMed Central

    Rampp, Markus; Soddemann, Thomas; Lederer, Hermann

    2006-01-01

    We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980

  7. ISPAN (Interactive Stiffened Panel Analysis): A tool for quick concept evaluation and design trade studies

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.

    1993-01-01

    Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy to use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by non finite element users. The results of a parametric study of a blade stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. Also, a non-linear analysis of a test panel was conducted and the results compared to measured data and previous correlation analysis.

  8. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) is developed to address the complexity in design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to best follow guidance commands while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for liftoff and max dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.

  9. Elasto-Plastic Analysis of Tee Joints Using HOT-SMAC

    NASA Technical Reports Server (NTRS)

    Arnold, Steve M. (Technical Monitor); Bednarcyk, Brett A.; Yarrington, Phillip W.

    2004-01-01

    The Higher Order Theory - Structural/Micro Analysis Code (HOT-SMAC) software package is applied to analyze the linearly elastic and elasto-plastic response of adhesively bonded tee joints. Joints of this type are finding an increasing number of applications with the increased use of composite materials within advanced aerospace vehicles, and improved tools for the design and analysis of these joints are needed. The linearly elastic results of the code are validated vs. finite element analysis results from the literature under different loading and boundary conditions, and new results are generated to investigate the inelastic behavior of the tee joint. The comparison with the finite element results indicates that HOT-SMAC is an efficient and accurate alternative to the finite element method and has a great deal of potential as an analysis tool for a wide range of bonded joints.

  10. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
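    The interface-based approach NAO describes can be sketched with abstract classes. The library itself is C++, so the Python below is only an analogy, and the class names (`Function`, `Operator`, `Polynomial`, `Derivative`) are assumptions for the sketch, not NAO's actual API:

```python
from abc import ABC, abstractmethod

class Function(ABC):
    """Common interface for functions consumed by numerical tools."""
    @abstractmethod
    def __call__(self, x: float) -> float: ...

class Operator(ABC):
    """Maps one Function to another (e.g. differentiation)."""
    @abstractmethod
    def apply(self, f: Function) -> Function: ...

class Polynomial(Function):
    def __init__(self, coeffs):            # coeffs[i] multiplies x**i
        self.coeffs = list(coeffs)
    def __call__(self, x):
        return sum(c * x ** i for i, c in enumerate(self.coeffs))

class Derivative(Operator):
    """Exact derivative for polynomials; tools see only the interfaces."""
    def apply(self, f):
        return Polynomial([i * c for i, c in enumerate(f.coeffs)][1:])

# a tool written against the interfaces works with any conforming types,
# which is what lets incompatible packages be combined into one application
p = Polynomial([1.0, 0.0, 3.0])            # 1 + 3x^2
dp = Derivative().apply(p)                 # 6x
print(dp(2.0))                             # → 12.0
```

Defining the tool boundary at such interfaces is what allows, for instance, a "wrapped" Netlib solver and a custom discretizer to exchange functions and operators without format conversions.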

  11. PAINT: a promoter analysis and interaction network generation tool for gene regulatory network identification.

    PubMed

    Vadigepalli, Rajanikanth; Chakravarthula, Praveen; Zak, Daniel E; Schwaber, James S; Gonye, Gregory E

    2003-01-01

    We have developed a bioinformatics tool named PAINT that automates the promoter analysis of a given set of genes for the presence of transcription factor binding sites. Based on coincidence of regulatory sites, this tool produces an interaction matrix that represents a candidate transcriptional regulatory network. This tool currently consists of (1) a database of promoter sequences of known or predicted genes in the Ensembl annotated mouse genome database, (2) various modules that can retrieve and process the promoter sequences for binding sites of known transcription factors, and (3) modules for visualization and analysis of the resulting set of candidate network connections. This information provides a substantially pruned list of genes and transcription factors that can be examined in detail in further experimental studies on gene regulation. Also, the candidate network can be incorporated into network identification methods in the form of constraints on feasible structures in order to render the algorithms tractable for large-scale systems. The tool can also produce output in various formats suitable for use in external visualization and analysis software. In this manuscript, PAINT is demonstrated in two case studies involving analysis of differentially regulated genes chosen from two microarray data sets. The first set is from a neuroblastoma N1E-115 cell differentiation experiment, and the second set is from neuroblastoma N1E-115 cells at different time intervals following exposure to neuropeptide angiotensin II. PAINT is available for use as an agent in BioSPICE simulation and analysis framework (www.biospice.org), and can also be accessed via a WWW interface at www.dbi.tju.edu/dbi/tools/paint/.
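    The coincidence-of-sites interaction matrix that PAINT produces can be sketched as a genes-by-transcription-factors table. The promoter sequences and motifs below are hypothetical, and real binding-site scanning uses weight matrices rather than the exact string matches shown here:

```python
# hypothetical promoter sequences and TF binding motifs (consensus strings)
promoters = {
    "geneA": "TTGACGTCATATAGGCTA",
    "geneB": "CCGGAAGTGATATTTCCC",
    "geneC": "ATGACGTCAGGGGAAGTG",
}
motifs = {
    "CREB": "TGACGTCA",
    "ETS":  "GGAAGT",
}

# candidate regulatory network: rows = genes, columns = transcription factors,
# 1 if the motif occurs anywhere in the promoter sequence
matrix = {
    gene: {tf: int(motif in seq) for tf, motif in motifs.items()}
    for gene, seq in promoters.items()
}
for gene, row in matrix.items():
    print(gene, row)
```

Each 1 in the matrix is a candidate edge of the transcriptional regulatory network; constraining a network-identification algorithm to this sparse candidate set is what keeps large-scale problems tractable.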

  12. Ares I-X Flight Test Validation of Control Design Tools in the Frequency-Domain

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew; Hannan, Mike; Brandon, Jay; Derry, Stephen

    2011-01-01

    A major motivation of the Ares I-X flight test program was to "Design for Data," in order to maximize the usefulness of the data recorded in support of Ares I modeling and validation of design and analysis tools. The Design for Data effort was intended to enable good post-flight characterizations of the flight control system, the vehicle structural dynamics, and also the aerodynamic characteristics of the vehicle. To extract the necessary data from the system during flight, a set of small predetermined Programmed Test Inputs (PTIs) was injected directly into the TVC signal. These PTIs were designed to excite the necessary vehicle dynamics while exhibiting a minimal impact on loads. The method is similar to common approaches in aircraft flight test programs, but with unique launch vehicle challenges due to rapidly changing states, short duration of flight, a tight flight envelope, and an inability to repeat any test. This paper documents the validation effort of the stability analysis tools to the flight data which was performed by comparing the post-flight calculated frequency response of the vehicle to the frequency response calculated by the stability analysis tools used to design and analyze the preflight models during the control design effort. The comparison between flight day frequency response and stability tool analysis for flight of the simulated vehicle shows good agreement and provides a high level of confidence in the stability analysis tools for use in any future program. This is true for both a nominal model as well as for dispersed analysis, which shows that the flight day frequency response is enveloped by the vehicle's preflight uncertainty models.
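    Extracting a frequency response from an injected test signal and the measured output can be sketched with a cross-spectral estimate, H(f) = Pxy/Pxx. The simulated second-order channel and broadband input below stand in for the PTI excitation and flight telemetry; they are assumptions for the sketch, not Ares I-X data:

```python
import numpy as np
from scipy import signal

fs = 100.0                                # assumed telemetry sample rate (Hz)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
u = rng.standard_normal(t.size)           # broadband stand-in for the test input

# stand-in vehicle dynamics: a resonance near 2 Hz plus measurement noise
b, a = signal.iirpeak(2.0, Q=5, fs=fs)
y = signal.lfilter(b, a, u) + 0.05 * rng.standard_normal(t.size)

# empirical frequency response H(f) = Pxy / Pxx from cross/auto spectra
f, pxy = signal.csd(u, y, fs=fs, nperseg=1024)
_, pxx = signal.welch(u, fs=fs, nperseg=1024)
H = pxy / pxx
peak_f = f[np.argmax(np.abs(H))]
print(round(peak_f, 2))
```

Overlaying such an empirical |H(f)| on the response predicted by the preflight models is the kind of comparison the validation effort performed.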

  13. Metabolomics Workbench: An international repository for metabolomics data and metadata, metabolite standards, protocols, tutorials and training, and analysis tools.

    PubMed

    Sud, Manish; Fahy, Eoin; Cotter, Dawn; Azam, Kenan; Vadivelu, Ilango; Burant, Charles; Edison, Arthur; Fiehn, Oliver; Higashi, Richard; Nair, K Sreekumaran; Sumner, Susan; Subramaniam, Shankar

    2016-01-04

    The Metabolomics Workbench, available at www.metabolomicsworkbench.org, is a public repository for metabolomics metadata and experimental data spanning various species and experimental platforms, metabolite standards, metabolite structures, protocols, tutorials, and training material and other educational resources. It provides a computational platform to integrate, analyze, track, deposit and disseminate large volumes of heterogeneous data from a wide variety of metabolomics studies including mass spectrometry (MS) and nuclear magnetic resonance spectrometry (NMR) data spanning over 20 different species covering all the major taxonomic categories including humans and other mammals, plants, insects, invertebrates and microorganisms. Additionally, a number of protocols are provided for a range of metabolite classes, sample types, and both MS and NMR-based studies, along with a metabolite structure database. The metabolites characterized in the studies available on the Metabolomics Workbench are linked to chemical structures in the metabolite structure database to facilitate comparative analysis across studies. The Metabolomics Workbench, part of the data coordinating effort of the National Institute of Health (NIH) Common Fund's Metabolomics Program, provides data from the Common Fund's Metabolomics Resource Cores, metabolite standards, and analysis tools to the wider metabolomics community and seeks data depositions from metabolomics researchers across the world. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. A computational proposal for designing structured RNA pools for in vitro selection of RNAs.

    PubMed

    Kim, Namhee; Gan, Hin Hark; Schlick, Tamar

    2007-04-01

    Although in vitro selection technology is a versatile experimental tool for discovering novel synthetic RNA molecules, finding complex RNA molecules is difficult because most RNAs identified from random sequence pools are simple motifs, consistent with recent computational analysis of such sequence pools. Thus, enriching in vitro selection pools with complex structures could increase the probability of discovering novel RNAs. Here we develop an approach for engineering sequence pools that links RNA sequence space regions with corresponding structural distributions via a "mixing matrix" approach combined with a graph theory analysis. We define five classes of mixing matrices motivated by covariance mutations in RNA; these constructs define nucleotide transition rates and are applied to chosen starting sequences to yield specific nonrandom pools. We examine the coverage of sequence space as a function of the mixing matrix and starting sequence via clustering analysis. We show that, in contrast to random sequences, which are associated only with a local region of sequence space, our designed pools, including a structured pool for GTP aptamers, can target specific motifs. It follows that experimental synthesis of designed pools can benefit from using optimized starting sequences, mixing matrices, and pool fractions associated with each of our constructed pools as a guide. Automation of our approach could provide practical tools for pool design applications for in vitro selection of RNAs and related problems.
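    The mixing-matrix idea can be sketched as a per-position nucleotide transition table applied to a starting sequence. The matrix weights, pool size, and starting sequence below are illustrative assumptions, not the paper's optimized constructs:

```python
import random

# hypothetical mixing matrix: row = base in the starting sequence, columns =
# probabilities of emitting A, C, G, U at that position (each row sums to 1);
# the 0.85 diagonal keeps the pool centered on the starting sequence
BASES = "ACGU"
MIX = {
    "A": [0.85, 0.05, 0.05, 0.05],
    "C": [0.05, 0.85, 0.05, 0.05],
    "G": [0.05, 0.05, 0.85, 0.05],
    "U": [0.05, 0.05, 0.05, 0.85],
}

def sample_pool(start_seq, size, rng):
    """Generate a structured pool biased toward start_seq."""
    return [
        "".join(rng.choices(BASES, weights=MIX[b])[0] for b in start_seq)
        for _ in range(size)
    ]

rng = random.Random(42)
start = "GGACUUCGGUCC"                    # hypothetical starting sequence
pool = sample_pool(start, 1000, rng)
# average identity to the starting sequence tracks the diagonal weight
identity = sum(
    sum(a == b for a, b in zip(s, start)) / len(start) for s in pool
) / len(pool)
print(round(identity, 2))
```

Tuning the diagonal weight (and mixing several starting sequences) controls how far the pool wanders from the seed structure, which is the lever the paper's five matrix classes provide.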

  15. "Rompiendo el Silencio": Meta-Analysis of the Effectiveness of Peer-Mediated Learning at Improving Language Outcomes for ELLs

    ERIC Educational Resources Information Center

    Cole, Mikel W.

    2013-01-01

    This article reports the results of a meta-analysis of the effectiveness of peer-mediated learning for English language learners. Peer-mediated learning is presented as one pedagogical tool with promise for interrupting a legacy of structural and instructional silencing of culturally and linguistically diverse students. Oral language…

  16. Understanding Groups in Outdoor Adventure Education through Social Network Analysis

    ERIC Educational Resources Information Center

    Jostad, Jeremy; Sibthorp, Jim; Paisley, Karen

    2013-01-01

    Relationships are a critical component of the experience of an outdoor adventure education (OAE) program; therefore, more fruitful ways of investigating groups are needed. Social network analysis (SNA) is an effective tool to study the relationship structure of small groups. This paper provides an explanation of SNA and shows how it was used by the…

  17. Construct Validation of the Louisiana School Analysis Model (SAM) Instructional Staff Questionnaire

    ERIC Educational Resources Information Center

    Bray-Clark, Nikki; Bates, Reid

    2005-01-01

    The purpose of this study was to validate the Louisiana SAM Instructional Staff Questionnaire, a key component of the Louisiana School Analysis Model. The model was designed as a comprehensive evaluation tool for schools. Principal axis factoring with oblique rotation was used to uncover the underlying structure of the SISQ. (Contains 1 table.)

  18. Contrastive Analysis Revisited: Obligatory, Systematic, and Incidental Differences Between Languages.

    ERIC Educational Resources Information Center

    Berman, Ruth Aronson

    1978-01-01

    Contrastive analysis is suggested as a tool in language teaching for such areas as: (1) deciding how much to focus on different aspects of the target language; (2) making generalizations about its structure; and (3) explaining texts or constructions which might otherwise be incomprehensible. The claim is made that such procedures need to be based…

  19. Controls-Structures Interaction (CSI) technology program summary. Earth orbiting platforms program area of the space platforms technology program

    NASA Technical Reports Server (NTRS)

    Newsom, Jerry R.

    1991-01-01

Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.

  20. A proof of concept for epidemiological research using structured reporting with pulmonary embolism as a use case.

    PubMed

Pinto Dos Santos, Daniel; Scheibl, Sonja; Arnhold, Gordon; Maehringer-Kunz, Aline; Düber, Christoph; Mildenberger, Peter; Kloeckner, Roman

    2018-05-10

This paper studies the possibilities of an integrated IT-based workflow for epidemiological research in pulmonary embolism using freely available tools and structured reporting. We included a total of 521 consecutive cases which had been referred to the radiology department for computed tomography pulmonary angiography (CTPA) with suspected pulmonary embolism (PE). Free-text reports were transformed into structured reports using a freely available IHE-MRRT-compliant reporting platform. D-dimer values were retrieved from the hospital's laboratory results system. All information was stored in the platform's database and visualized using freely available tools. For further analysis, we directly accessed the platform's database with an advanced analytics tool (RapidMiner). We were able to develop an integrated workflow for epidemiological statistics from reports obtained in clinical routine. The report data allowed for automated calculation of epidemiological parameters. Prevalence of pulmonary embolism was 27.6%. The mean age of patients with and without pulmonary embolism did not differ (62.8 years and 62.0 years, respectively, p=0.987). As expected, there was a significant difference in mean D-dimer values (10.13 mg/L FEU and 3.12 mg/L FEU, respectively, p<0.001). Structured reporting can make data obtained in clinical routine more accessible. Designing practical workflows is feasible using freely available tools and allows for the calculation of epidemiological statistics on a near real-time basis. Therefore, radiologists should push for the implementation of structured reporting in clinical routine. Advances in knowledge: Theoretical benefits of structured reporting have long been discussed, but practical implementations demonstrating those benefits have been lacking. Here we present a first experience providing proof that structured reporting can make data from clinical routine more accessible.

  1. Comparative evaluation of the content and structure of communication using two handoff tools: implications for patient safety.

    PubMed

    Abraham, Joanna; Kannampallil, Thomas G; Almoosa, Khalid F; Patel, Bela; Patel, Vimla L

    2014-04-01

Handoffs vary in their structure and content, raising concerns regarding standardization. We conducted a comparative evaluation of the nature and patterns of communication on 2 functionally similar but conceptually different handoff tools: Subjective, Objective, Assessment and Plan (SOAP), based on a patient problem-based format, and the Handoff Intervention Tool (HAND-IT), based on a body system-based format. A nonrandomized pre-post prospective intervention study supported by audio recordings and observations of 82 resident handoffs was conducted in a medical intensive care unit. Qualitative analysis was complemented with exploratory sequential pattern analysis techniques to capture the characteristics and types of communication events (CEs) and breakdowns. Use of HAND-IT led to fewer communication breakdowns (F(1,80) = 45.66; P < .0001) and a greater number of CEs (t(40) = 4.56; P < .001), with more ideal CEs than SOAP (t(40) = 9.27; P < .001). In addition, the use of HAND-IT was characterized by more request-response CE transitions. HAND-IT's body-system-based structure afforded physicians the ability to better organize and comprehend patient information and led to interactive and streamlined communication with limited external input. Our results also emphasize the importance of information organization using a medical knowledge hierarchical format for fostering effective communication. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. HEDEA: A Python Tool for Extracting and Analysing Semi-structured Information from Medical Records

    PubMed Central

    Aggarwal, Anshul; Garhwal, Sunita

    2018-01-01

Objectives: One of the most important functions for a medical practitioner while treating a patient is to study the patient's complete medical history by going through all records, from test results to doctor's notes. With the increasing use of technology in medicine, these records are mostly digital, alleviating the problem of looking through a stack of papers, which are easily misplaced, but some of them are in an unstructured form. Large parts of clinical reports are in written text form and are tedious to use directly without appropriate pre-processing. In medical research, such health records may be a good, convenient source of medical data; however, the lack of structure means that the data are unfit for statistical evaluation. In this paper, we introduce a system to extract, store, retrieve, and analyse information from health records, with a focus on the Indian healthcare scene. Methods: A Python-based tool, Healthcare Data Extraction and Analysis (HEDEA), has been designed to extract structured information from various medical records using a regular expression-based approach. Results: The HEDEA system is working, covering a large set of formats, to extract and analyse health information. Conclusions: This tool can be used to generate analysis reports and charts using the central database. This information is only provided after prior approval has been received from the patient for medical research purposes. PMID:29770248

  3. HEDEA: A Python Tool for Extracting and Analysing Semi-structured Information from Medical Records.

    PubMed

    Aggarwal, Anshul; Garhwal, Sunita; Kumar, Ajay

    2018-04-01

One of the most important functions for a medical practitioner while treating a patient is to study the patient's complete medical history by going through all records, from test results to doctor's notes. With the increasing use of technology in medicine, these records are mostly digital, alleviating the problem of looking through a stack of papers, which are easily misplaced, but some of them are in an unstructured form. Large parts of clinical reports are in written text form and are tedious to use directly without appropriate pre-processing. In medical research, such health records may be a good, convenient source of medical data; however, the lack of structure means that the data are unfit for statistical evaluation. In this paper, we introduce a system to extract, store, retrieve, and analyse information from health records, with a focus on the Indian healthcare scene. A Python-based tool, Healthcare Data Extraction and Analysis (HEDEA), has been designed to extract structured information from various medical records using a regular expression-based approach. The HEDEA system is working, covering a large set of formats, to extract and analyse health information. This tool can be used to generate analysis reports and charts using the central database. This information is only provided after prior approval has been received from the patient for medical research purposes.
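
The regular-expression approach described for HEDEA can be illustrated with a short Python sketch. The report layout, field names, and patterns below are invented for illustration; the abstract does not publish HEDEA's actual formats.

```python
import re

# A toy semi-structured report (invented layout, not a HEDEA format).
RECORD = """Patient Name: A. Sharma
Age: 54 Sex: M
Hb: 11.2 g/dL
Impression: mild anaemia"""

# One pattern per field of interest (hypothetical field names).
PATTERNS = {
    "name": r"Patient Name:\s*(.+)",
    "age": r"Age:\s*(\d+)",
    "hb": r"Hb:\s*([\d.]+)\s*g/dL",
}

def extract(text):
    """Pull structured fields out of a semi-structured report."""
    fields = {}
    for key, pat in PATTERNS.items():
        m = re.search(pat, text)
        if m:
            fields[key] = m.group(1).strip()
    return fields

print(extract(RECORD))  # {'name': 'A. Sharma', 'age': '54', 'hb': '11.2'}
```

Once extracted, such dictionaries can be written to a central database table, which is what makes the records usable for statistical evaluation.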

  4. Structure prediction, expression, and antigenicity of c-terminal of GRP78.

    PubMed

    Aghamollaei, Hossein; Mousavi Gargari, Seyed Latif; Ghanei, Mostafa; Rasaee, Mohamad Javad; Amani, Jafar; Bakherad, Hamid; Farnoosh, Gholamreza

    2017-01-01

Glucose-regulated protein 78 (GRP78) is a typical endoplasmic reticulum luminal chaperone having a main role in the activation of the unfolded protein response. Because of hypoxia and nutrient deprivation in the tumor microenvironment, expression of GRP78 in these cells becomes higher than in native cells, which makes it a suitable candidate for cancer targeting. Suppression of survival signals by antibody production against the C-terminal domain of GRP78 (CGRP) can induce apoptosis of cancer cells. The aim of this study was in silico analysis, recombinant production, and characterization of CGRP in Escherichia coli. Structural prediction of CGRP by bioinformatics tools was done and the construct containing the optimized sequence was transferred to E. coli T7 shuffle. Expression was induced by isopropyl-β-d-thiogalactoside, and recombinant protein was purified on Ni-NTA agarose resin. The content of secondary structures was obtained from the circular dichroism (CD) spectrum. CGRP immunogenicity was evaluated from the immunized mouse sera. SDS-PAGE analysis showed CGRP expression in E. coli. The CD spectrum also confirmed the structures predicted by bioinformatics tools. The enzyme-linked immunosorbent assay using sera from immunized mice revealed CGRP as a good immunogen. The results obtained in this study showed that the structure of truncated CGRP is very similar to its structure in the whole protein context. This protein can be used in cancer research. © 2015 International Union of Biochemistry and Molecular Biology, Inc.

  5. Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
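
A minimal sketch of one combination described above, a maximum-stress initiation check followed by ply discounting of the constitutive coefficients, might look like the following. The allowables, lamina moduli, and degradation factor are invented for illustration, not values from the paper.

```python
# Assumed in-plane stress allowables (MPa) and residual stiffness fraction.
ALLOWABLES = {"s11": 1500.0, "s22": 40.0, "s12": 70.0}
DEGRADE = 0.01

def max_stress_failed(stress):
    """Maximum-stress criterion: any component exceeding its allowable fails the ply."""
    return any(abs(stress[k]) > ALLOWABLES[k] for k in ALLOWABLES)

def discount_ply(moduli, stress):
    """Ply discounting: degrade the local constitutive coefficients of a failed ply."""
    if max_stress_failed(stress):
        return {k: v * DEGRADE for k, v in moduli.items()}
    return moduli

ply = {"E1": 140e3, "E2": 10e3, "G12": 5e3}  # MPa, assumed lamina moduli
print(discount_ply(ply, {"s11": 800.0, "s22": 55.0, "s12": 30.0}))
# {'E1': 1400.0, 'E2': 100.0, 'G12': 50.0}  (transverse stress exceeds its allowable)
```

In a real implementation this check runs per ply and per integration point inside the nonlinear finite element loop, and the degraded coefficients feed back into the next equilibrium iteration.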

  6. Tropical Pacific moisture variability: Its detection, synoptic structure and consequences in the general circulation

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1990-01-01

Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multi-variate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in situ observations; satellite observations from VAS (moisture, infrared, and visible), NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids), and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied and objective quantitative results are emphasized.

  7. EXAFS: New tool for study of battery and fuel cell materials

    NASA Technical Reports Server (NTRS)

    Mcbreen, James; Ogrady, William E.; Pandya, Kaumudi I.

    1987-01-01

Extended X-ray absorption fine structure (EXAFS) is a powerful technique for probing the local atomic structure of battery and fuel cell materials. The major advantages of EXAFS are that both the probe and the signal are X-rays and that the technique is element-selective and applicable to all states of matter. This permits in situ studies of electrodes and determination of the structure of single components in composite electrodes, or even complete cells. EXAFS specifically probes short-range order and yields coordination numbers, bond distances, and the chemical identity of nearest neighbors. Thus, it is ideal for structural studies of ions in solution and of the poorly crystallized materials that are often the active materials or catalysts in batteries and fuel cells. Studies on typical battery and fuel cell components are used to describe the technique and the capability of EXAFS as a structural tool in these applications. Typical experimental and data analysis procedures are outlined. The advantages and limitations of the technique are also briefly discussed.

  8. Tannin structural elucidation and quantitative ³¹P NMR analysis. 2. Hydrolyzable tannins and proanthocyanidins.

    PubMed

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    An unprecedented analytical method that allows simultaneous structural and quantitative characterization of all functional groups present in tannins is reported. In situ labeling of all labile H groups (aliphatic and phenolic hydroxyls and carboxylic acids) with a phosphorus-containing reagent (Cl-TMDP) followed by quantitative ³¹P NMR acquisition constitutes a novel fast and reliable analytical tool for the analysis of tannins and proanthocyanidins with significant implications for the fields of food and feed analyses, tannery, and the development of natural polyphenolics containing products.

  9. Integrated Software for Analyzing Designs of Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Philips, Alan D.

    2003-01-01

Launch Vehicle Analysis Tool (LVA) is a computer program for preliminary design structural analysis of launch vehicles. Before LVA was developed, in order to analyze the structure of a launch vehicle, it was necessary to estimate its weight, feed this estimate into a program to obtain pre-launch and flight loads, then feed these loads into structural and thermal analysis programs to obtain a second weight estimate. If the first and second weight estimates differed, it was necessary to reiterate these analyses until the solution converged. This process generally took six to twelve person-months of effort. LVA incorporates a text-to-structural-layout converter, configuration drawing, mass-properties generation, pre-launch and flight loads analysis, loads output plotting, direct-solution structural analysis, and thermal analysis subprograms. These subprograms are integrated in LVA so that solutions can be iterated automatically. LVA incorporates expert-system software that makes fundamental design decisions without intervention by the user. It also includes unique algorithms based on extensive research. The total integration of analysis modules drastically reduces the need for interaction with the user. A typical solution can be obtained in 30 to 60 minutes. Subsequent runs can be done in less than two minutes.
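
The automated iteration LVA performs can be caricatured as a fixed-point loop: a weight estimate drives loads, the loads drive structural sizing, and the sizing returns a new weight, repeating until the estimate converges. The load factor and sizing coefficients below are invented for illustration.

```python
def loads_from_weight(w):
    """Design loads as a function of vehicle weight (assumed load factor)."""
    return 4.0 * w

def weight_from_loads(p):
    """New weight estimate: fixed equipment plus load-sized structure (assumed)."""
    return 500.0 + 0.05 * p

def converge(w0=1000.0, tol=1e-6, max_iter=100):
    """Iterate weight -> loads -> sizing -> weight until the estimate settles."""
    w = w0
    for _ in range(max_iter):
        w_new = weight_from_loads(loads_from_weight(w))
        if abs(w_new - w) < tol:
            return w_new
        w = w_new
    return w

print(round(converge(), 3))  # 625.0, the fixed point of w = 500 + 0.2 w
```

Because the composite map here contracts (slope 0.2), the loop converges in a handful of iterations, which is the same reason the manual estimate/re-estimate cycle eventually settled, only months slower.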

  10. ShapeRotator: An R tool for standardized rigid rotations of articulated three-dimensional structures with application for geometric morphometrics.

    PubMed

    Vidal-García, Marta; Bandara, Lashi; Keogh, J Scott

    2018-05-01

    The quantification of complex morphological patterns typically involves comprehensive shape and size analyses, usually obtained by gathering morphological data from all the structures that capture the phenotypic diversity of an organism or object. Articulated structures are a critical component of overall phenotypic diversity, but data gathered from these structures are difficult to incorporate into modern analyses because of the complexities associated with jointly quantifying 3D shape in multiple structures. While there are existing methods for analyzing shape variation in articulated structures in two-dimensional (2D) space, these methods do not work in 3D, a rapidly growing area of capability and research. Here, we describe a simple geometric rigid rotation approach that removes the effect of random translation and rotation, enabling the morphological analysis of 3D articulated structures. Our method is based on Cartesian coordinates in 3D space, so it can be applied to any morphometric problem that also uses 3D coordinates (e.g., spherical harmonics). We demonstrate the method by applying it to a landmark-based dataset for analyzing shape variation using geometric morphometrics. We have developed an R tool (ShapeRotator) so that the method can be easily implemented in the commonly used R package geomorph and MorphoJ software. This method will be a valuable tool for 3D morphological analyses in articulated structures by allowing an exhaustive examination of shape and size diversity.
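
One simple way to remove random translation and rotation from a 3D landmark configuration, in the spirit of ShapeRotator's rigid-rotation approach, is to translate the first landmark to the origin, rotate the second onto the +x axis, and roll the third into the xy-plane. This pure-Python sketch is only an analogue of the R tool; the choice of reference landmarks is arbitrary, and degenerate (collinear) configurations are not handled.

```python
import math

def rot_z(p, a):
    c, s = math.cos(a), math.sin(a)
    return [c * p[0] - s * p[1], s * p[0] + c * p[1], p[2]]

def rot_y(p, a):
    c, s = math.cos(a), math.sin(a)
    return [c * p[0] + s * p[2], p[1], -s * p[0] + c * p[2]]

def rot_x(p, a):
    c, s = math.cos(a), math.sin(a)
    return [p[0], c * p[1] - s * p[2], s * p[1] + c * p[2]]

def standardize(landmarks):
    """Rigidly standardize an n x 3 landmark list (translation + rotation only)."""
    p0 = landmarks[0]
    pts = [[x - p0[0], y - p0[1], z - p0[2]] for x, y, z in landmarks]
    yaw = math.atan2(pts[1][1], pts[1][0])      # landmark 1 into the xz-plane
    pts = [rot_z(p, -yaw) for p in pts]
    pitch = math.atan2(pts[1][2], pts[1][0])    # landmark 1 onto the +x axis
    pts = [rot_y(p, pitch) for p in pts]
    roll = math.atan2(pts[2][2], pts[2][1])     # landmark 2 into the xy-plane
    pts = [rot_x(p, -roll) for p in pts]
    return pts

aligned = standardize([[1.0, 2.0, 3.0], [4.0, 6.0, 3.0], [2.0, 5.0, 9.0], [0.0, 0.0, 1.0]])
```

Because only rigid motions are applied, all inter-landmark distances are preserved, so shape information is untouched while the arbitrary pose is removed.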

  11. Visualization of protein interaction networks: problems and solutions

    PubMed Central

    2013-01-01

Background Visualization concerns the representation of data visually and is an important task in scientific research. Protein-protein interactions (PPI) are discovered using either wet lab techniques, such as mass spectrometry, or in silico prediction tools, resulting in large collections of interactions stored in specialized databases. The set of all interactions of an organism forms a protein-protein interaction network (PIN) and is an important tool for studying the behaviour of the cell machinery. Since graphic representation of PINs may highlight important substructures, e.g. protein complexes, visualization is increasingly used to study the underlying graph structure of PINs. Although graphs are well-known data structures, several open problems affect PIN visualization: the high number of nodes and connections, the heterogeneity of nodes (proteins) and edges (interactions), and the possibility to annotate proteins and interactions with biological information extracted from ontologies (e.g. Gene Ontology), which enriches the PINs with semantic information but complicates their visualization. Methods In recent years many software tools for the visualization of PINs have been developed. Initially intended for visualization only, some of them have since been enriched with new functions for PPI data management and PIN analysis. The paper analyzes the main software tools for PIN visualization considering four main criteria: (i) technology, i.e. availability/license of the software and supported OS (Operating System) platforms; (ii) interoperability, i.e. ability to import/export networks in various formats, ability to export data in a graphic format, and extensibility of the system, e.g. through plug-ins; (iii) visualization, i.e. supported layout and rendering algorithms and availability of parallel implementation; (iv) analysis, i.e. availability of network analysis functions, such as clustering or mining of the graph, and the possibility to interact with external databases. Results Currently, many tools are available, and it is not easy for users to choose one of them. Some tools offer sophisticated 2D and 3D network visualization with many layout algorithms; other tools are more data-oriented and support integration of interaction data coming from different sources and data annotation. Finally, some specialized tools are dedicated to the analysis of pathways and cellular processes and are oriented toward systems biology studies, where the dynamic aspects of the processes being studied are central. Conclusion A current trend is the deployment of open, extensible visualization tools (e.g. Cytoscape) that may be incrementally enriched by the interactomics community with novel and more powerful functions for PIN analysis through the development of plug-ins. Another emerging trend regards the efficient and parallel implementation of the visualization engine, which may provide high interactivity and near real-time response, as in NAViGaTOR. From a technological point of view, open-source, free, and extensible tools like Cytoscape guarantee long-term sustainability due to the size of the developer and user communities and provide great flexibility, since new functions are continuously added by the developer community through new plug-ins; on the other hand, parallel, often closed-source tools like NAViGaTOR can offer near real-time response even in the analysis of very large PINs. PMID:23368786

  12. Structure Computation of Quiet Spike[Trademark] Flight-Test Data During Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2008-01-01

    System identification or mathematical modeling is used in the aerospace community for development of simulation models for robust control law design. These models are often described as linear time-invariant processes. Nevertheless, it is well known that the underlying process is often nonlinear. The reason for using a linear approach has been due to the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades, the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of F-15B Quiet Spike(TradeMark) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description that may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion, which may save significant development time and costs. The objectives of this study are to demonstrate via analysis of F-15B Quiet Spike aeroservoelastic flight-test data for several flight conditions that 1) linear models are inefficient for modeling aeroservoelastic data, 2) nonlinear identification provides a parsimonious model description while providing a high percent fit for cross-validated data, and 3) the model structure and parameters vary as the flight condition is altered.
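
Structure detection, selecting the subset of candidate terms that best describes the observed output, can be illustrated with a toy example: compare a linear-only fit against a linear-plus-cubic fit and keep the extra term only if it clearly reduces the residual. The data-generating system, noise level, and acceptance threshold below are invented and are far simpler than aeroservoelastic flight data.

```python
import random

rng = random.Random(1)
xs = [rng.uniform(-1, 1) for _ in range(400)]
# Hidden "true" system: y = 2 x - 0.5 x^3 plus small noise (invented).
ys = [2.0 * x - 0.5 * x**3 + 0.01 * rng.gauss(0, 1) for x in xs]

def residual(cols, ys):
    """RMS residual of a least-squares fit with one or two regressors."""
    if len(cols) == 1:
        a = sum(c * y for c, y in zip(cols[0], ys)) / sum(c * c for c in cols[0])
        fit = [a * c for c in cols[0]]
    else:  # solve the 2x2 normal equations directly
        u, v = cols
        suu = sum(a * a for a in u); svv = sum(a * a for a in v)
        suv = sum(a * b for a, b in zip(u, v))
        bu = sum(a * y for a, y in zip(u, ys)); bv = sum(a * y for a, y in zip(v, ys))
        det = suu * svv - suv * suv
        cu = (svv * bu - suv * bv) / det; cv = (suu * bv - suv * bu) / det
        fit = [cu * a + cv * b for a, b in zip(u, v)]
    return (sum((y - f) ** 2 for y, f in zip(ys, fit)) / len(ys)) ** 0.5

lin, cub = list(xs), [x**3 for x in xs]
r1 = residual([lin], ys)
r2 = residual([lin, cub], ys)
# Keep the cubic term only if it buys a clear residual reduction.
selected = ["x", "x^3"] if r1 - r2 > 0.05 * r1 else ["x"]
print(selected)  # ['x', 'x^3']
```

The cubic term survives because the linear-only residual is dominated by the unmodeled nonlinearity, which is the same parsimony argument the paper makes against purely linear aeroservoelastic models.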

  13. Rapid fabrication of embossing tools for the production of polymeric microfluidic devices for bioanalytical applications

    NASA Astrophysics Data System (ADS)

    Ford, Sean M.; McCandless, Andrew B.; Liu, Xuezhu; Soper, Steven A.

    2001-09-01

In this paper we present embossing tools that were fabricated using both UV and X-ray lithography. The embossing tools were used to emboss microfluidic channels for bioanalytical applications. Specifically, two tools were fabricated. One, fabricated using x-ray lithography, was designed for electrophoretic DNA restriction fragment analysis. A second tool, fabricated using SU8, was designed for micro-PCR applications. Depths of both tools were approximately 100 micrometers. Both tools were made by directly electroforming nickel on a stainless steel base. Fabrication time for the tool fabricated using x-ray lithography was less than 1 week and largely depended on the availability of the x-ray source. The SU8 embossing tool was fabricated in less than 24 hours. The resulting nickel electroforms from both processes were extremely robust and did not fail under the embossing conditions required for PMMA and/or polycarbonate. Some problems removing SU8 after electroforming were seen for smaller gaps between nickel structures.

  14. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, H; Tan, J; Kavanaugh, J

Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern recognition of contours was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g. the visualization toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models from critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometry variations of normal structures based on physician-approved RT contours as a training dataset. The in-house design of a supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module, avoiding unnecessary manual verification for physicians/dosimetrists. In addition, its nature as a compact and stand-alone tool allows for future extensibility to include additional functions for physicians' clinical needs.
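
A PCA-style normality check of the general kind described can be sketched as follows: features from approved contours define a principal direction, and a new contour is scored by its distance from that direction. The two-feature data and the scoring are invented for illustration, and Python stands in for the C#/Accord.Net implementation.

```python
import math
import random

rng = random.Random(0)
# Training features from "approved" contours, concentrated along direction (3, 1).
train = [(3 * t + rng.gauss(0, 0.1), t + rng.gauss(0, 0.1))
         for t in [rng.uniform(-1, 1) for _ in range(100)]]

mx = sum(p[0] for p in train) / len(train)
my = sum(p[1] for p in train) / len(train)
sxx = sum((p[0] - mx) ** 2 for p in train)
syy = sum((p[1] - my) ** 2 for p in train)
sxy = sum((p[0] - mx) * (p[1] - my) for p in train)
# Leading eigenvector of the 2x2 covariance, in closed form.
theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
ux, uy = math.cos(theta), math.sin(theta)

def abnormality(p):
    """Distance of a feature vector from the learned principal axis."""
    dx, dy = p[0] - mx, p[1] - my
    return abs(dx * uy - dy * ux)  # component orthogonal to (ux, uy)

typical = train[0]
outlier = (train[0][0] - 1.0, train[0][1] + 3.0)  # shifted off-axis
```

Contours whose score exceeds a chosen threshold would be flagged for the physician to check and edit; a production system would of course use many more features and components.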

  15. Damage classification and estimation in experimental structures using time series analysis and pattern recognition

    NASA Astrophysics Data System (ADS)

    de Lautour, Oliver R.; Omenzetter, Piotr

    2010-07-01

Developed for studying long sequences of regularly sampled data, time series analysis methods are increasingly being investigated for use in Structural Health Monitoring (SHM). In this research, Autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures, a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in undamaged and a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input into an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, performing well with a small number of damage-sensitive features and limited sensors.
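
The feature-extraction step above, fitting an AR model and treating its coefficients as damage-sensitive features, can be sketched in plain Python. The simulated signals and AR parameters are invented, and the ANN classifier is omitted; only the coefficient shift that such a classifier would learn from is shown.

```python
import random

def simulate_ar2(a1, a2, n, rng):
    """Synthetic acceleration record from a stable AR(2) process (invented stand-in)."""
    x = [0.0, 0.0]
    for _ in range(n - 2):
        x.append(a1 * x[-1] + a2 * x[-2] + rng.gauss(0, 1))
    return x

def ar2_coeffs(x):
    """Least-squares AR(2) coefficients via the 2x2 normal equations."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        x1, x2 = x[t - 1], x[t - 2]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * x[t]; b2 += x2 * x[t]
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

rng = random.Random(2)
undamaged = ar2_coeffs(simulate_ar2(1.5, -0.9, 4000, rng))
damaged = ar2_coeffs(simulate_ar2(1.3, -0.8, 4000, rng))  # stiffness change shifts the process
shift = ((undamaged[0] - damaged[0]) ** 2 + (undamaged[1] - damaged[1]) ** 2) ** 0.5
```

The coefficient vectors from many such records, one per sensor and state, would form the small feature set fed to the ANN for classification or stiffness estimation.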

  16. DSSR-enhanced visualization of nucleic acid structures in Jmol.

    PubMed

    Hanson, Robert M; Lu, Xiang-Jun

    2017-07-03

    Sophisticated and interactive visualizations are essential for making sense of the intricate 3D structures of macromolecules. For proteins, secondary structural components are routinely featured in molecular graphics visualizations. However, the field of RNA structural bioinformatics is still lagging behind; for example, current molecular graphics tools lack built-in support even for base pairs, double helices, or hairpin loops. DSSR (Dissecting the Spatial Structure of RNA) is an integrated and automated command-line tool for the analysis and annotation of RNA tertiary structures. It calculates a comprehensive and unique set of features for characterizing RNA, as well as DNA structures. Jmol is a widely used, open-source Java viewer for 3D structures, with a powerful scripting language. JSmol, its reincarnation based on native JavaScript, has a predominant position in the post Java-applet era for web-based visualization of molecular structures. The DSSR-Jmol integration presented here makes salient features of DSSR readily accessible, either via the Java-based Jmol application itself, or its HTML5-based equivalent, JSmol. The DSSR web service accepts 3D coordinate files (in mmCIF or PDB format) initiated from a Jmol or JSmol session and returns DSSR-derived structural features in JSON format. This seamless combination of DSSR and Jmol/JSmol brings the molecular graphics of 3D RNA structures to a similar level as that for proteins, and enables a much deeper analysis of structural characteristics. It fills a gap in RNA structural bioinformatics, and is freely accessible (via the Jmol application or the JSmol-based website http://jmol.x3dna.org). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Tools for Designing and Analyzing Structures

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

Structural Design and Analysis Toolset is a collection of approximately 26 Microsoft Excel spreadsheet programs, each of which performs calculations within a different subdiscipline of structural design and analysis. These programs present input and output data in user-friendly, menu-driven formats. Although these programs cannot solve complex cases like those treated by larger finite element codes, they do yield quick solutions to numerous common problems, thereby making it possible to rapidly perform multiple preliminary analyses - e.g., to establish approximate limits prior to detailed analyses by the larger finite element codes. These programs perform different types of calculations, as follows: 1. determination of geometric properties for a variety of standard structural components; 2. analysis of static, vibrational, and thermal-gradient loads and deflections in certain structures (mostly beams and, in the case of thermal gradients, mirrors); 3. kinetic energies of fans; 4. detailed analysis of stress and buckling in beams, plates, columns, and a variety of shell structures; and 5. temperature-dependent properties of materials, including figures of merit that characterize strength, stiffness, and deformation response to thermal gradients.
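
The flavor of these spreadsheet calculations can be shown with one of the simplest cases, the tip deflection of an end-loaded cantilever, delta = P L^3 / (3 E I). The numbers below are illustrative, and Python stands in for the Excel formulas.

```python
def rect_section_inertia(b, h):
    """Second moment of area of a b x h rectangular section about its centroid."""
    return b * h**3 / 12

def cantilever_tip_deflection(P, L, E, I):
    """Tip deflection of a cantilever under end load P (Euler-Bernoulli beam)."""
    return P * L**3 / (3 * E * I)

# Illustrative case: 50 mm x 100 mm steel section, 2 m span, 1 kN end load.
I = rect_section_inertia(0.05, 0.1)                    # m^4
delta = cantilever_tip_deflection(1000.0, 2.0, 200e9, I)
print(f"{delta * 1000:.3f} mm")  # 3.200 mm
```

Exactly this kind of closed-form check is what makes such a toolset useful for establishing approximate limits before committing to a full finite element run.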

  18. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
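
A log file conforming to a generic XML event schema of the kind proposed can be processed with a few lines of Python; the tag and attribute names below are invented, not the schema from the report.

```python
import xml.etree.ElementTree as ET

# A toy session log (hypothetical tags/attributes, not the report's schema).
LOG = """<session user="s042">
  <event time="3.2" type="move" target="valve"/>
  <event time="7.9" type="inspect" target="gauge"/>
  <event time="12.4" type="move" target="pump"/>
</session>"""

def event_counts(xml_text):
    """Tally event types from one session log."""
    root = ET.fromstring(xml_text)
    counts = {}
    for ev in root.iter("event"):
        t = ev.get("type")
        counts[t] = counts.get(t, 0) + 1
    return counts

print(event_counts(LOG))  # {'move': 2, 'inspect': 1}
```

Because the schema fixes the tag and attribute vocabulary in advance, the same few lines of analysis code work across every G/SBA that emits conforming logs, which is the point of proposing a generic data model.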

  19. Further Analysis of Boiling Points of Small Molecules, CH[subscript w]F[subscript x]Cl[subscript y]Br[subscript z]

    ERIC Educational Resources Information Center

    Beauchamp, Guy

    2005-01-01

A study to present specific hypotheses that satisfactorily explain the boiling points of a number of molecules, CH[subscript w]F[subscript x]Cl[subscript y]Br[subscript z], having similar structure, and then analyze the model with the help of multiple linear regression (MLR), a data analysis tool. The MLR analysis was useful in selecting the…

  20. Preliminary design review package on air flat plate collector for solar heating and cooling system

    NASA Technical Reports Server (NTRS)

    1977-01-01

Guidelines to be used in the development and fabrication of a prototype air flat plate collector subsystem containing 320 square feet (ten 4 ft x 8 ft panels) of collector area are presented. Topics discussed include: (1) verification plan; (2) thermal analysis; (3) safety hazard analysis; (4) drawing list; (5) special handling, installation and maintenance tools; (6) structural analysis; and (7) selected drawings.
