Sample records for computer test structures

  1. CSI computer system/remote interface unit acceptance test results

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.

    1992-01-01

    The validation tests conducted on the Control/Structures Interaction (CSI) Computer System (CCS)/Remote Interface Unit (RIU) are discussed. The CCS/RIU consists of a commercially available, Langley Research Center (LaRC) programmed, space-flight-qualified computer and a flight data acquisition and filtering computer, developed at LaRC. The tests were performed in the Space Structures Research Laboratory (SSRL) and included open loop excitation, closed loop control, safing, RIU digital filtering, and RIU stand-alone testing with the CSI Evolutionary Model (CEM) Phase-0 testbed. The test results indicated that the CCS/RIU system is comparable to ground-based systems in performing real-time control-structure experiments.

  2. Correlation of predicted and measured thermal stresses on a truss-type aircraft structure

    NASA Technical Reports Server (NTRS)

    Jenkins, J. M.; Schuster, L. S.; Carter, A. L.

    1978-01-01

    A test structure representing a portion of a hypersonic vehicle was instrumented with strain gages and thermocouples. This test structure was then subjected to laboratory heating representative of supersonic and hypersonic flight conditions. A finite element computer model of this structure was developed using several types of elements with the NASA structural analysis (NASTRAN) computer program. Temperature inputs from the test were used to generate predicted model thermal stresses and these were correlated with the test measurements.

  3. Causal Learning with Local Computations

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Sloman, Steven A.

    2009-01-01

    The authors proposed and tested a psychological theory of causal structure learning based on local computations. Local computations simplify complex learning problems via cues available on individual trials to update a single causal structure hypothesis. Structural inferences from local computations make minimal demands on memory, require…

  4. Designing for aircraft structural crashworthiness

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Caiafa, C.

    1981-01-01

    This report describes structural aviation crash dynamics research activities being conducted on general aviation and transport aircraft. The report includes experimental and analytical correlations of load-limiting subfloor and seat configurations tested dynamically in vertical drop tests and in a horizontal sled deceleration facility. Computer predictions, made with DYCAST, a nonlinear finite-element computer program, of the acceleration time histories of these innovative seat and subfloor structures are presented. The proposed application of these computer techniques, and of the nonlinear lumped-mass computer program KRASH, to transport aircraft crash dynamics is discussed. A proposed FAA full-scale crash test of a fully instrumented, radio-controlled transport airplane is also described.

  5. Aeroelastic Modeling of a Nozzle Startup Transient

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2014-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during testing. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage when structural strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a tightly coupled aeroelastic modeling algorithm by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed under the framework of modal analysis. Transient aeroelastic nozzle startup analyses at sea level were performed, and the computed transient nozzle fluid-structure interaction physics are presented.

  6. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include the development of new design concepts, such as active structures, and new tools, such as combined structure and control optimization algorithms, and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computational capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters, and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  7. Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen

    2013-01-01

    Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during testing. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage when structural strengthening measures are not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and the fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.

  8. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determining the system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time-variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with a significantly smaller number of samples than is needed in a direct simulation study. Notably, we show that the ideas from Girsanov-transformation-based Monte Carlo simulations can be extended to laboratory testing to assess the system reliability of engineering structures with a reduced number of samples and hence reduced testing times. Illustrative examples include computational studies on a 10-degree-of-freedom nonlinear system model and laboratory/computational investigations on the road load response of an automotive system tested on a four-post test rig.
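    The importance-sampling idea behind the abstract can be illustrated on a toy first-passage problem. The sketch below is an assumption-laden stand-in, not the authors' code: it simulates an Ornstein-Uhlenbeck process under a shifted drift and corrects each sample with the Girsanov likelihood ratio; the function name `failure_prob_girsanov` and all parameter values are hypothetical.

```python
import numpy as np

def failure_prob_girsanov(n_paths=2000, n_steps=500, dt=0.01,
                          barrier=2.5, shift=1.5, seed=0):
    """Estimate P(max_t X_t > barrier) for dX = -X dt + dW by importance
    sampling: simulate under the shifted drift (-X + shift) and reweight
    each sample with the Girsanov likelihood ratio
    exp(-shift * W_T - 0.5 * shift**2 * T)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_paths)
    log_lr = np.zeros(n_paths)                # running log-likelihood ratio
    crossed = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)  # increments of the new-measure BM
        x += (-x + shift) * dt + dw                 # Euler-Maruyama under shifted drift
        log_lr += -shift * dw - 0.5 * shift**2 * dt
        crossed |= x > barrier
    return float(np.mean(crossed * np.exp(log_lr)))

p_hat = failure_prob_girsanov()
```

    Because the shifted drift pushes many paths toward the barrier, nearly every sample contributes information, and the likelihood ratio removes the bias introduced by the shift.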

  9. Structural Analysis Made 'NESSUSary'

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Everywhere you look, chances are something that was designed and tested by a computer will be in plain view. Computers are now utilized to design and test just about everything imaginable, from automobiles and airplanes to bridges and boats, and elevators and escalators to streets and skyscrapers. Computer-design engineering first emerged in the 1970s, in the automobile and aerospace industries. Since computers were in their infancy, however, architects and engineers at the time were limited to producing only designs similar to hand-drafted drawings. (At the end of the 1970s, a typical computer-aided design system was a 16-bit minicomputer with a price tag of $125,000.) Eventually, computers became more affordable and related software became more sophisticated, offering designers the "bells and whistles" to go beyond the limits of basic drafting and rendering and venture into more skillful applications. One of the major advancements was the ability to test the objects being designed for the probability of failure. This advancement was especially important for the aerospace industry, where complicated and expensive structures are designed. The ability to perform reliability and risk assessment without extensive hardware testing is critical to design and certification. In 1984, NASA initiated the Probabilistic Structural Analysis Methods (PSAM) project at Glenn Research Center to develop analysis methods and computer programs for the probabilistic structural analysis of select engine components for the current Space Shuttle and future space propulsion systems. NASA envisioned that these methods and computational tools would play a critical role in establishing increased system performance and durability, and would assist in structural system qualification and certification. Not only was the PSAM project beneficial to aerospace, it paved the way for a commercial risk-probability tool that is evaluating risks in diverse, down-to-Earth applications.

  10. Theoretical, Experimental, and Computational Evaluation of Disk-Loaded Circular Wave Guides

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Qureshi, A. Haq

    1994-01-01

    A disk-loaded circular wave guide structure and test fixture were fabricated. The dispersion characteristics were found by theoretical analysis, experimental testing, and computer simulation using the codes ARGUS and SOS. Interaction impedances were computed based on the corresponding dispersion characteristics. Finally, an equivalent circuit model for one period of the structure was chosen using equivalent circuit models for cylindrical wave guides of different radii. Optimum values for the discrete capacitors and inductors describing discontinuities between cylindrical wave guides were found using the computer code TOUCHSTONE.

  11. Theoretical, Experimental, and Computational Evaluation of Several Vane-Type Slow-Wave Structures

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Qureshi, A. Haq

    1994-01-01

    Several types of periodic vane slow-wave structures were fabricated. The dispersion characteristics were found by theoretical analysis, experimental testing, and computer simulation using the MAFIA code. Computer-generated characteristics agreed to approximately within 2 percent of the experimental characteristics for all structures. The theoretical characteristics, however, deviated increasingly as the width to height ratio became smaller. Interaction impedances were also computed based on the experimental and computer-generated resonance frequency shifts due to the introduction of a perturbing dielectric rod.

  12. Pleurisy and Other Pleural Disorders

    MedlinePlus

    ... structures in your chest. This test provides a computer-generated picture of your lungs that can show ... chest MRI, uses radio waves, magnets, and a computer to create detailed pictures of the structures in ...

  13. Energy and Technology Review

    NASA Astrophysics Data System (ADS)

    Poggio, Andrew J.

    1988-10-01

    This issue of Energy and Technology Review contains: Neutron Penumbral Imaging of Laser-Fusion Targets--using our new penumbral-imaging diagnostic, we have obtained the first images that can be used to measure directly the deuterium-tritium burn region in laser-driven fusion targets; Computed Tomography for Nondestructive Evaluation--various computed tomography systems and computational techniques are used in nondestructive evaluation; Three-Dimensional Image Analysis for Studying Nuclear Chromatin Structure--we have developed an optic-electronic system for acquiring cross-sectional views of cell nuclei, and computer codes to analyze these images and reconstruct the three-dimensional structures they represent; Imaging in the Nuclear Test Program--advanced techniques produce images of unprecedented detail and resolution from Nevada Test Site data; and Computational X-Ray Holography--visible-light experiments and numerically simulated holograms test our ideas about an X-ray microscope for biological research.

  14. Thermal/structural modeling of a large scale in situ overtest experiment for defense high level waste at the Waste Isolation Pilot Plant Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, H.S.; Stone, C.M.; Krieg, R.D.

    Several large scale in situ experiments in bedded salt formations are currently underway at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, USA. In these experiments, the thermal and creep responses of salt around several different underground room configurations are being measured. Data from the tests are to be compared to thermal and structural responses predicted in pretest reference calculations. The purpose of these comparisons is to evaluate computational models developed from laboratory data prior to fielding of the in situ experiments. In this paper, the computational models used in the pretest reference calculation for one of the large scale tests, the Overtest for Defense High Level Waste, are described; and the pretest computed thermal and structural responses are compared to early data from the experiment. The comparisons indicate that computed and measured temperatures for the test agree to within ten percent, but that measured deformation rates are between two and three times greater than corresponding computed rates. 10 figs., 3 tabs.

  15. A Comparative Study of Multi-material Data Structures for Computational Physics Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Rao Veerabhadra; Robey, Robert W.

    The data structures used to represent the multi-material state of a computational physics application can have a drastic impact on the performance of the application. We look at efficient data structures for sparse applications where there may be many materials, but only one or few in most computational cells. We develop simple performance models for use in selecting possible data structures and programming patterns. We verify the analytic models of performance through a small test program of the representative cases.
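    The sparse-versus-dense trade-off the abstract describes can be sketched with a toy comparison. The CSR-like compact layout below is an illustrative assumption, not the paper's actual data structures; all sizes and names are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
ncells, nmats = 10_000, 20

# Dense layout: one volume fraction per (cell, material) pair, mostly zeros.
dense = np.zeros((ncells, nmats))
mat_of_cell = rng.integers(0, nmats, ncells)
dense[np.arange(ncells), mat_of_cell] = 1.0

mixed = rng.choice(ncells, 200, replace=False)    # a few two-material cells
for c in mixed:
    other = (mat_of_cell[c] + 1) % nmats
    dense[c, mat_of_cell[c]] = 0.6
    dense[c, other] = 0.4

# Compact (CSR-like) layout: store only the nonzero entries of each cell.
indptr, mats, fracs = [0], [], []
for c in range(ncells):
    nz = np.nonzero(dense[c])[0]
    mats.extend(nz.tolist())
    fracs.extend(dense[c, nz].tolist())
    indptr.append(len(mats))

dense_entries, compact_entries = dense.size, len(fracs)
```

    With one or two materials per cell, the compact layout stores roughly `ncells` entries instead of `ncells * nmats`, which is the kind of memory and bandwidth saving the performance models in the paper are meant to predict.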

  16. Lewis Structures Technology, 1988. Volume 1: Structural Dynamics

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the Structures Division of the Lewis Research Center and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive testing, dynamical systems, fatigue and damage, wind turbines, hot section technology, structural mechanics codes, computational methods for dynamics, structural optimization, and applications of structural dynamics.

  17. A Computer-Based Approach for Deriving and Measuring Individual and Team Knowledge Structure from Essay Questions

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Wallace, Patricia

    2007-01-01

    This proof-of-concept investigation describes a computer-based approach for deriving the knowledge structure of individuals and of groups from their written essays, and considers the convergent criterion-related validity of the computer-based scores relative to human rater essay scores and multiple-choice test scores. After completing a…

  18. Causal learning with local computations.

    PubMed

    Fernbach, Philip M; Sloman, Steven A

    2009-05-01

    The authors proposed and tested a psychological theory of causal structure learning based on local computations. Local computations simplify complex learning problems via cues available on individual trials to update a single causal structure hypothesis. Structural inferences from local computations make minimal demands on memory, require relatively small amounts of data, and need not respect normative prescriptions as inferences that are principled locally may violate those principles when combined. Over a series of 3 experiments, the authors found (a) systematic inferences from small amounts of data; (b) systematic inference of extraneous causal links; (c) influence of data presentation order on inferences; and (d) error reduction through pretraining. Without pretraining, a model based on local computations fitted data better than a Bayesian structural inference model. The data suggest that local computations serve as a heuristic for learning causal structure. Copyright 2009 APA, all rights reserved.

  19. Numerical Modeling of the Lake Mary Road Bridge for Foundation Reuse Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitek, M. A.; Bojanowski, C.; Lottes, S. A.

    This project uses numerical techniques to assess the structural integrity and capacity of the bridge foundations and, as a result, reduces the risk associated with reusing the same foundation for a new superstructure. Nondestructive test methods of different types were used in combination with numerical modeling and analysis. The onsite tests included visual inspection, tomography, ground penetrating radar, and the drilling of boreholes and coreholes, along with laboratory tests on recovered samples. The results were used to identify the current geometry of the structure and foundation, including the hidden geometry of the abutments and piers, and the soil and foundation material properties. This data was used to build the numerical models and run computational analyses on a high performance computer cluster to assess the structural integrity of the bridge and foundations, including the suitability of the foundation for reuse with a new superstructure and traffic that will increase the load on the foundations. Computational analysis is more cost-effective and offers the advantage of more detailed knowledge of the structural response. It also makes it possible to go beyond nondestructive testing and find the failure conditions without destroying the structure under consideration.

  20. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

    Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows the similarity searching process to be scaled vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.

  1. Cloud4Psi: cloud computing for 3D protein structure similarity searching

    PubMed Central

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-01-01

    Summary: Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows the similarity searching process to be scaled vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Availability and implementation: Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. Contact: dariusz.mrozek@polsl.pl PMID:24930141

  2. A computer vision-based approach for structural displacement measurement

    NASA Astrophysics Data System (ADS)

    Ji, Yunfeng

    2010-04-01

    Along with the incessant advancement of optics, electronics, and computer technologies over the last three decades, commercial digital video cameras have undergone a remarkable evolution and can now be employed to measure complex motions of objects with sufficient accuracy, rendering great assistance to structural displacement measurement in civil engineering. This paper proposes a computer vision-based approach for the dynamic measurement of structures. A digital camera is used to capture image sequences of planar targets mounted on vibrating structures. The mathematical relationship between the image plane and real space is established based on computer vision theory. The structural dynamic displacement at the target locations can then be quantified using point reconstruction rules. Compared with traditional sensor-based displacement measurement methods, such as accelerometers, linear variable differential transducers (LVDTs), and the global positioning system (GPS), the proposed approach offers the advantages of great flexibility, a non-contact working mode, and ease of increasing the number of measurement points. For validation, four tests were performed: sinusoidal motion of a point, free vibration of a cantilever beam, a wind tunnel test of a cross-section bridge model, and a field test of bridge displacement measurement. Results show that the proposed approach attains excellent accuracy compared with analytical results or measurements from conventional transducers, and delivers an innovative, low-cost solution to structural displacement measurement.
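    The planar point-reconstruction step can be sketched with a direct linear transform (DLT) homography, a standard computer-vision technique that is consistent with, but not necessarily identical to, the paper's method. The calibration matrix `H_true` and all coordinates below are fabricated values for illustration.

```python
import numpy as np

def homography(world, pixel):
    """DLT estimate of the 3x3 homography mapping planar world coords to pixels."""
    A = []
    for (X, Y), (u, v) in zip(world, pixel):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, float))
    return vt[-1].reshape(3, 3)          # null vector of A, up to scale

def to_world(H, uv):
    """Map a pixel back to planar world coordinates via the inverse homography."""
    p = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]

# Four calibration targets of known world position (metres) and pixel location.
world = [(0, 0), (1, 0), (1, 1), (0, 1)]
H_true = np.array([[400, 20, 100], [-15, 380, 80], [0.01, 0.02, 1.0]])
pixel = []
for X, Y in world:
    p = H_true @ np.array([X, Y, 1.0])
    pixel.append((p[0] / p[2], p[1] / p[2]))

H = homography(world, pixel)
# Displacement: track the target from its rest pixel to a displaced pixel.
rest = to_world(H, pixel[0])
moved = H_true @ np.array([0.05, 0.0, 1.0])       # target shifted 50 mm in X
disp = to_world(H, (moved[0] / moved[2], moved[1] / moved[2])) - rest
```

    Once the homography is calibrated from known target positions, any tracked pixel can be converted to an in-plane displacement in physical units.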

  3. Integrated circuit test-port architecture and method and apparatus of test-port generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teifel, John

    A method and apparatus are provided for generating RTL code for a test-port interface of an integrated circuit. In an embodiment, a test-port table is provided as input data. A computer automatically parses the test-port table into data structures and analyzes it to determine input, output, local, and output-enable port names. The computer generates address-detect and test-enable logic constructed from combinational functions. The computer generates one-hot multiplexer logic for at least some of the output ports. The one-hot multiplexer logic for each port is generated so as to enable the port to toggle between data signals and test signals. The computer then completes the generation of the RTL code.
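    The parse-then-generate flow described above can be sketched in a few lines. Everything here is an assumption for illustration: the table format, the Verilog-style output, and the plain two-way select (standing in for the patent's one-hot multiplexer logic) are all hypothetical.

```python
# Toy test-port table: name, direction, width (this format is an assumption).
TABLE = """\
clk      input   1
dout     output  8
dout_en  oen     1
scratch  local   8
"""

def parse_table(text):
    """Parse the table into per-direction data structures, mirroring the
    described first step of determining input/output/local/output-enable names."""
    ports = {"input": [], "output": [], "local": [], "oen": []}
    for line in text.splitlines():
        name, direction, width = line.split()
        ports[direction].append((name, int(width)))
    return ports

def gen_mux(port, width, test_bus="tp_data", test_en="tp_sel"):
    """Emit a Verilog-style select between functional and test data.
    (A two-way select stands in for the one-hot multiplexer logic.)"""
    return f"assign {port} = {test_en} ? {test_bus}[{width - 1}:0] : {port}_func;"

ports = parse_table(TABLE)
rtl = [gen_mux(name, w) for name, w in ports["output"]]
```

    The point of the sketch is the shape of the flow: structured data in, combinational select logic out, one generated assignment per output port.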

  4. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational burden of this analysis, it does not fully automate the use of the replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available for download from http://strauto.popgen.org.
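    The replicate-distribution and Evanno ΔK steps can be sketched as follows. The `run_structure` function is a stand-in for launching the STRUCTURE binary (the real pipeline runs it as a subprocess and parses ln P(D) from its output); the toy likelihood values are fabricated so that the ΔK peak lands at K = 3.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product
from statistics import mean, stdev

def run_structure(k, rep):
    """Stand-in for one STRUCTURE run at a given K. The toy ln P(D) values
    below plateau at K = 3, with a deterministic per-replicate spread."""
    jitter = 0.5 * ((rep * 4) % 5 - 2)
    return min(k, 3) * 50.0 + jitter

ks, reps = range(1, 6), range(10)
# Distribute all (K, replicate) runs over a pool of workers.
with ThreadPoolExecutor(max_workers=8) as pool:
    futs = {(k, r): pool.submit(run_structure, k, r) for k, r in product(ks, reps)}
lnp = {k: [futs[k, r].result() for r in reps] for k in ks}

# Evanno delta-K: |second difference of mean ln P(D)| / stdev of ln P(D) at K.
delta_k = {
    k: abs(mean(lnp[k + 1]) - 2 * mean(lnp[k]) + mean(lnp[k - 1])) / stdev(lnp[k])
    for k in ks if k - 1 in lnp and k + 1 in lnp
}
best_k = max(delta_k, key=delta_k.get)
```

    Because each (K, replicate) run is independent, the wall-clock time scales down roughly with the number of workers, which is the speedup StrAuto's benchmarks report.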

  5. Current Progress of a Finite Element Computational Fluid Dynamics Prediction of Flutter for the AeroStructures Test Wing

    NASA Technical Reports Server (NTRS)

    Arena, Andrew S., Jr.

    2002-01-01

    This progress report focuses on the use of the SOLIDS input of the STructural Analysis RoutineS suite program for the AeroStructures Test Wing. The AeroStructures Test Wing project as a whole is described. The use of the SOLIDS code to find the mode shapes of a structure is discussed. The frequencies and the structural dynamics to which they relate are examined. The results of the CFD predictions are compared to experimental data from a Ground Vibration Test.

  6. A Structure for Creating Quality Software.

    ERIC Educational Resources Information Center

    Christensen, Larry C.; Bodey, Michael R.

    1990-01-01

    Addresses the issue of assuring quality software for use in computer-aided instruction and presents a structure by which developers can create quality courseware. Differences between courseware and computer-aided instruction software are discussed, methods for testing software are described, and human factors issues as well as instructional design…

  7. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    ERIC Educational Resources Information Center

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed response (discrete) (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  8. Test results of a 40-kW Stirling engine and comparison with the NASA Lewis computer code predictions

    NASA Technical Reports Server (NTRS)

    Allen, David J.; Cairelli, James E.

    1988-01-01

    A Stirling engine was tested without auxiliaries at NASA Lewis. Three different regenerator configurations were tested with hydrogen. The test objectives were: (1) to obtain steady-state and dynamic engine data, including indicated power, for validation of an existing computer model for this engine; and (2) to evaluate structurally the use of silicon carbide regenerators. This paper presents comparisons of the measured brake performance, indicated mean effective pressure, and cyclic pressure variations with those predicted by the code. The silicon carbide foam regenerators appear to be structurally suitable, but the foam matrix showed severely reduced performance.

  9. Software for Testing Electroactive Structural Components

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar

    2003-01-01

    A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.

  10. Braided Composites for Aerospace Applications. (Latest citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the design, fabrication, and testing of structural composites formed by braiding machines. Topics include computer aided design and associated computer aided manufacture of braided tubular and flat forms. Applications include aircraft and spacecraft structures, where high shear strength and stiffness are required.

  11. Aeroelastic Model Structure Computation for Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    Structure detection is a procedure for selecting a subset of candidate terms, from a full model description, that best describes the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modelling may be of critical importance in the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion which may save significant development time and costs. In this study, a least absolute shrinkage and selection operator (LASSO) technique is investigated for computing efficient model descriptions of nonlinear aeroelastic systems. The LASSO minimises the residual sum of squares by the addition of an l(sub 1) penalty term on the parameter vector of the traditional l(sub 2) minimisation problem. Its use for structure detection is a natural extension of this constrained minimisation approach to pseudolinear regression problems which produces some model parameters that are exactly zero and, therefore, yields a parsimonious system description. Applicability of this technique for model structure computation for the F/A-18 Active Aeroelastic Wing using flight test data is shown for several flight conditions (Mach numbers) by identifying a parsimonious system description with a high percent fit for cross-validated data.
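
    The LASSO step can be sketched with a generic coordinate-descent implementation (a textbook version, not the flight-test code): soft-thresholding is what drives some coefficients exactly to zero and so performs the structure detection.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimise ||y - X b||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # Soft-thresholding sets small coefficients exactly to zero.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))              # 10 candidate terms
true_b = np.zeros(10); true_b[[0, 3]] = [2.0, -1.5]  # only 2 are active
y = X @ true_b + 0.01 * rng.normal(size=200)
b = lasso_cd(X, y, lam=0.1)
print(np.nonzero(np.abs(b) > 1e-8)[0])      # indices of retained terms
```

    The recovered support matches the two active terms, at the cost of a small shrinkage bias in their magnitudes.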

  12. Infinite possibilities: Computational structures technology

    NASA Astrophysics Data System (ADS)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. 
As these environments and systems evolve, computational structures technology will evolve. By using CST in the design and operation of future structures systems, engineers will have a better understanding of how a system responds and lasts, more cost-effective methods of designing and testing models, and improved productivity. For informational and educational purposes, a videotape is being produced using both static and dynamic images from research institutions, software and hardware companies, private individuals, and historical photographs and drawings. The extensive number of CST resources indicates its widespread use. Applications run the gamut from simpler university-simulated problems to those requiring solutions on supercomputers. In some cases, an image or an animation will be mapped onto the actual structure to show the relevance of the computer model to the structure.
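
    The finite element core of CST can be illustrated with a toy problem. The sketch below is a generic textbook example (material values are illustrative, not from the article): assemble element stiffness matrices for a two-element axial bar, apply a fixed support, and recover the analytic tip displacement F*L/(E*A).

```python
import numpy as np

# Two-element axial bar, fixed at the left end, tip load F (illustrative values).
E, A, L, F = 70e9, 1e-4, 1.0, 1000.0      # Pa, m^2, m, N
ne = 2
le = L / ne
k = E * A / le * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness

K = np.zeros((ne + 1, ne + 1))
for e in range(ne):                        # assemble the global stiffness matrix
    K[e:e + 2, e:e + 2] += k

f = np.zeros(ne + 1); f[-1] = F            # point load at the free tip
u = np.zeros(ne + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # impose u[0] = 0 at the support

print(u[-1])  # tip displacement; analytic value is F*L/(E*A)
```

    Production codes differ only in scale: more element types, more degrees of freedom, and sparse solvers, but the same assemble-constrain-solve pattern.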

  13. Infinite possibilities: Computational structures technology

    NASA Technical Reports Server (NTRS)

    Beam, Sherilee F.

    1994-01-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. 
As these environments and systems evolve, computational structures technology will evolve. By using CST in the design and operation of future structures systems, engineers will have a better understanding of how a system responds and lasts, more cost-effective methods of designing and testing models, and improved productivity. For informational and educational purposes, a videotape is being produced using both static and dynamic images from research institutions, software and hardware companies, private individuals, and historical photographs and drawings. The extensive number of CST resources indicates its widespread use. Applications run the gamut from simpler university-simulated problems to those requiring solutions on supercomputers. In some cases, an image or an animation will be mapped onto the actual structure to show the relevance of the computer model to the structure. Transferring the digital files to videotape presents a number of problems related to maintaining the quality of the original image, while still producing a broadcast quality videotape. Since researchers normally do not create a computer image using traditional composition theories or video production requirements, often the image loses some of its original digital quality and impact when transferred to videotape. Although many CST images are currently available, those that are edited into the final project must meet two important criteria: they must complement the narration, and they must be broadcast quality when recorded on videotape.

  14. Flight-vehicle materials, structures, and dynamics - Assessment and future directions. Vol. 5 - Structural dynamics and aeroelasticity

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Editor); Venneri, Samuel L. (Editor)

    1993-01-01

    Various papers on flight vehicle materials, structures, and dynamics are presented. Individual topics addressed include: general modeling methods, component modeling techniques, time-domain computational techniques, dynamics of articulated structures, structural dynamics in rotating systems, structural dynamics in rotorcraft, damping in structures, structural acoustics, structural design for control, structural modeling for control, control strategies for structures, system identification, and overall assessment of needs and benefits in structural dynamics and controlled structures. Also discussed are: experimental aeroelasticity in wind tunnels, aeroservoelasticity, nonlinear aeroelasticity, aeroelasticity problems in turbomachines, rotary-wing aeroelasticity with application to VTOL vehicles, computational aeroelasticity, and structural dynamic testing and instrumentation.

  15. A Diagnostic Study of Computer Application of Structural Communication Grid

    ERIC Educational Resources Information Center

    Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol

    2009-01-01

    In this article, the Structural Communication Grid (SCG), an alternative measurement and evaluation technique, is first summarised, and the design, development, and implementation of a computer-based SCG system are introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…

  16. National Wind Technology Center Provides Dual Axis Resonant Blade Testing

    ScienceCinema

    Felker, Fort

    2018-01-16

    NREL's Structural Testing Laboratory at the National Wind Technology Center (NWTC) provides experimental laboratories, computer facilities for analytical work, space for assembling components and turbines for atmospheric testing as well as office space for industry researchers. Fort Felker, center director at the NWTC, discusses NREL's state-of-the-art structural testing capabilities and shows a flapwise and edgewise blade test in progress.

  17. National Wind Technology Center Provides Dual Axis Resonant Blade Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felker, Fort

    2013-11-13

    NREL's Structural Testing Laboratory at the National Wind Technology Center (NWTC) provides experimental laboratories, computer facilities for analytical work, space for assembling components and turbines for atmospheric testing as well as office space for industry researchers. Fort Felker, center director at the NWTC, discusses NREL's state-of-the-art structural testing capabilities and shows a flapwise and edgewise blade test in progress.

  18. Test-bed for the remote health monitoring system for bridge structures using FBG sensors

    NASA Astrophysics Data System (ADS)

    Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog

    2009-05-01

    This paper reports on a test-bed for the long-term health monitoring system for bridge structures employing fiber Bragg grating (FBG) sensors, which is remotely accessible via the web, to provide real-time quantitative information on a bridge's response to live loading and environmental changes, and fast prediction of the structure's integrity. The sensors are attached at several locations on the structure and connected to a data acquisition system permanently installed onsite. The system can be accessed through remote communication using an optical cable network, through which the bridge behavior under live loading can be evaluated at places far away from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is connected securely to the internet, where data can be retrieved, processed, and stored for remote web-based health monitoring. The test-bed revealed that remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.
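
    The conversion from a measured Bragg wavelength shift to strain follows the standard FBG relation; the constants below (1550 nm grating, photo-elastic coefficient 0.22) are typical values for silica fiber, not figures from the paper.

```python
# Converting an FBG wavelength shift to strain (illustrative constants).
lambda0 = 1550.0e-9      # nominal Bragg wavelength, m
p_e = 0.22               # effective photo-elastic coefficient (typical for silica)

def shift_to_strain(d_lambda):
    # d_lambda / lambda0 = (1 - p_e) * eps  ->  eps = d_lambda / (lambda0 * (1 - p_e))
    return d_lambda / (lambda0 * (1.0 - p_e))

# A 1.2 pm wavelength shift corresponds to roughly 1 microstrain.
eps = shift_to_strain(1.2e-12)
print(round(eps * 1e6, 2))  # microstrain
```

    In practice a temperature term is subtracted first (often via a reference grating), since a Bragg wavelength responds to both strain and temperature.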

  19. Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.

    PubMed

    Cianfrocco, Michael A; Leschziner, Andres E

    2015-05-08

    The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
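
    The per-structure cost estimate quoted above comes down to simple arithmetic: nodes × hours × hourly price. A sketch with hypothetical figures (not actual Amazon rates):

```python
# Back-of-envelope cloud cost for a cryo-EM refinement
# (hypothetical node count and hourly price, not actual Amazon rates).
def cluster_cost(n_nodes, hours, price_per_node_hour):
    """Total spend for a fixed-size cluster running for `hours`."""
    return n_nodes * hours * price_per_node_hour

# e.g. 30 nodes (480 CPUs at 16 CPUs/node) for 24 hours at $1.50 per node-hour:
cost = cluster_cost(n_nodes=30, hours=24, price_per_node_hour=1.50)
print(cost)  # 1080.0
```

    The quoted $50-$1500 range then reflects how dataset size, resolution target, and spot-versus-on-demand pricing move each factor.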

  20. A brief overview of computational structures technology related activities at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.

    1992-01-01

    The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance, formulation, and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting the emerging advances in computer technology, and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.

  1. Functional Equivalence Acceptance Testing of FUN3D for Entry Descent and Landing Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Wood, William A.; Kleb, William L.; Alter, Stephen J.; Glass, Christopher E.; Padilla, Jose F.; Hammond, Dana P.; White, Jeffery A.

    2013-01-01

    The functional equivalence of the unstructured grid code FUN3D to the structured grid code LAURA (Langley Aerothermodynamic Upwind Relaxation Algorithm) is documented for applications of interest to the Entry, Descent, and Landing (EDL) community. Examples from an existing suite of regression tests are used to demonstrate the functional equivalence, encompassing various thermochemical models and vehicle configurations. Algorithm modifications required for the node-based unstructured grid code (FUN3D) to reproduce functionality of the cell-centered structured code (LAURA) are also documented. Challenges associated with computation on tetrahedral grids versus computation on structured-grid derived hexahedral systems are discussed.

  2. Study of Geometric Porosity on Static Stability and Drag Using Computational Fluid Dynamics for Rigid Parachute Shapes

    NASA Technical Reports Server (NTRS)

    Greathouse, James S.; Schwing, Alan M.

    2015-01-01

    This paper explores the use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.

  3. Structural, Linguistic and Topic Variables in Verbal and Computational Problems in Elementary Mathematics.

    ERIC Educational Resources Information Center

    Beardslee, Edward C.; Jerman, Max E.

    Five structural, four linguistic and twelve topic variables are used in regression analyses on results of a 50-item achievement test. The test items are related to 12 topics from the third-grade mathematics curriculum. The items reflect one of two cases of the structural variable, cognitive level; the two levels are characterized, inductive…

  4. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  5. Preliminary Findings on the Computer-Administered Multiple-Choice Online Causal Comprehension Assessment, a Diagnostic Reading Comprehension Test

    ERIC Educational Resources Information Center

    Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen

    2018-01-01

    The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…

  6. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Progress is reported on the development of: (1) a frame system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.
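
    A Markov model of the kind fitted to CAT data can be sketched by counting transitions between successive actions and row-normalising; the action labels below are hypothetical, for illustration only.

```python
from collections import Counter

# First-order Markov model of diagnostic behaviour, estimated from observed
# action sequences (hypothetical states, not the study's coding scheme).
sequences = [
    ["scan", "test", "test", "repair"],
    ["scan", "scan", "test", "repair"],
    ["scan", "test", "repair"],
]

pair_counts = Counter()
state_counts = Counter()
for seq in sequences:
    for a, b in zip(seq, seq[1:]):   # count each observed transition a -> b
        pair_counts[(a, b)] += 1
        state_counts[a] += 1

states = sorted({s for seq in sequences for s in seq})
# Row-normalised transition probabilities P[a][b] = count(a->b) / count(a->*).
P = {a: {b: pair_counts[(a, b)] / state_counts[a] for b in states}
     for a in state_counts}
print(P["scan"]["test"], P["test"]["repair"])  # 0.75 0.75
```

    Distances between such transition matrices give one way to rank subjects' information-management strategies.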

  7. Aeroelastic Model Structure Computation for Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    Structure detection is a procedure for selecting a subset of candidate terms, from a full model description, that best describes the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modeling may be of critical importance in the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion that may save significant development time and costs. In this study, a least absolute shrinkage and selection operator (LASSO) technique is investigated for computing efficient model descriptions of non-linear aeroelastic systems. The LASSO minimises the residual sum of squares with the addition of an l(sub 1) penalty term on the parameter vector of the traditional l(sub 2) minimisation problem. Its use for structure detection is a natural extension of this constrained minimisation approach to pseudo-linear regression problems which produces some model parameters that are exactly zero and, therefore, yields a parsimonious system description. Applicability of this technique for model structure computation for the F/A-18 (McDonnell Douglas, now The Boeing Company, Chicago, Illinois) Active Aeroelastic Wing project using flight test data is shown for several flight conditions (Mach numbers) by identifying a parsimonious system description with a high percent fit for cross-validated data.

  8. The NASA computer aided design and test system

    NASA Technical Reports Server (NTRS)

    Gould, J. M.; Juergensen, K.

    1973-01-01

    A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost effective microelectronic subsystems based on custom designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described stressing the interaction of programs rather than detail of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.

  9. Middeck Active Control Experiment (MACE), phase A

    NASA Technical Reports Server (NTRS)

    Crawley, Edward F.; Deluis, Javier; Miller, David W.

    1989-01-01

    A rationale was derived to determine which structural experiments are sufficient to verify the design of structures employing Controlled Structures Technology. A survey of proposed NASA missions was undertaken to identify candidate test articles for use in the Middeck Active Control Experiment (MACE). The survey revealed that potential test articles could be classified into one of three roles: development, demonstration, and qualification, depending on the maturity of the technology and the mission the structure must fulfill. A set of criteria was derived that allowed determination of which role a potential test article must fulfill. A review of the capabilities and limitations of the STS middeck was conducted. A reference design for the MACE test article was presented. Computing requirements for running typical closed-loop controllers were determined, and various computer configurations were studied. The various components required to manufacture the structure were identified. A management plan was established for the remainder of the program: experiment development, flight and ground systems development, and integration to the carrier. Procedures for configuration control, fiscal control, and safety, reliability, and quality assurance were developed.

  10. Damage progression in Composite Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    1996-01-01

    A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in composites testing. Iosipescu shear testing using the V-notched beam specimen is a convenient method to measure both shear strength and shear stiffness simultaneously. The evaluation of composite test response can be made more productive and informative via computational simulation of progressive damage and fracture. Computational simulation performs a complete evaluation of laminated composite fracture via assessment of ply and subply level damage/fracture processes.
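
    The two quantities the Iosipescu test measures reduce to two formulas: average shear stress across the notched section, and shear modulus from strain gauges oriented at +/-45 degrees between the notches. A minimal sketch with illustrative numbers (not data from the study):

```python
# Shear strength and modulus from a V-notched (Iosipescu) shear test
# (illustrative numbers; gauges at +/-45 degrees between the notches).
def shear_stress(load_n, notch_width_m, thickness_m):
    """Average shear stress across the section between the notch roots."""
    return load_n / (notch_width_m * thickness_m)

def shear_modulus(tau, eps_p45, eps_m45):
    gamma = eps_p45 - eps_m45          # engineering shear strain
    return tau / gamma

tau = shear_stress(load_n=2000.0, notch_width_m=0.012, thickness_m=0.004)
G = shear_modulus(tau, eps_p45=2.0e-3, eps_m45=-2.0e-3)
print(round(tau / 1e6, 2), round(G / 1e9, 2))  # MPa, GPa
```

    Simulation adds value beyond these two numbers by tracking which plies and damage modes drive the nonlinearity before final fracture.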

  11. Crashworthiness of light aircraft fuselage structures: A numerical and experimental investigation

    NASA Technical Reports Server (NTRS)

    Nanyaro, A. P.; Tennyson, R. C.; Hansen, J. S.

    1984-01-01

    The dynamic behavior of aircraft fuselage structures subject to various impact conditions was investigated. An analytical model was developed based on a self-consistent finite element (CFE) formulation utilizing shell, curved beam, and stringer type elements. Equations of motion were formulated and linearized (i.e., for small displacements), although material nonlinearity was retained to treat local plastic deformation. The equations were solved using the implicit Newmark-Beta method with a frontal solver routine. Stiffened aluminum fuselage models were also tested in free flight using the UTIAS pendulum crash test facility. Data were obtained on dynamic strains, g-loads, and transient deformations (using high speed photography in the latter case) during the impact process. Correlations between tests and predicted results are presented, together with computer graphics, based on the CFE model. These results include level and oblique angle impacts as well as the free-flight crash test. Comparisons with a hybrid, lumped mass finite element computer model demonstrate that the CFE formulation provides the best overall agreement with impact test data for comparable computing costs.
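
    The Newmark-Beta scheme named above can be sketched for a single degree of freedom. This is a generic average-acceleration implementation (beta = 1/4, gamma = 1/2), not the CFE code from the study:

```python
import math

def newmark_sdof(m, c, k, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
    """Implicit Newmark-Beta integration of m*u'' + c*u' + k*u = 0."""
    u, v = u0, v0
    a = (-c * v - k * u) / m                      # consistent initial acceleration
    keff = k + gamma * c / (beta * dt) + m / (beta * dt * dt)
    for _ in range(n_steps):
        # Effective load from the previous state (no external force here).
        feff = (m * (u / (beta * dt * dt) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
                + c * (gamma * u / (beta * dt) + (gamma / beta - 1.0) * v
                       + dt * (0.5 * gamma / beta - 1.0) * a))
        u_new = feff / keff
        a_new = ((u_new - u) / (beta * dt * dt) - v / (beta * dt)
                 - (0.5 / beta - 1.0) * a)
        v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
    return u, v

# Undamped free vibration: u(t) = cos(w*t) for m = k = 1, u0 = 1, v0 = 0.
w = 1.0
T = 2 * math.pi / w
u, _ = newmark_sdof(m=1.0, c=0.0, k=1.0, u0=1.0, v0=0.0, dt=T / 200, n_steps=200)
print(round(u, 3))  # close to cos(2*pi) = 1; only a tiny period-elongation error
```

    The average-acceleration variant is unconditionally stable for linear problems, which is why it suits stiff crash models; the frontal solver in the study handles the large `keff` system that replaces the scalar division here.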

  12. Composite structural materials. [aircraft applications]

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1981-01-01

    The development of composite materials for aircraft applications is addressed with specific consideration of physical properties, structural concepts and analysis, manufacturing, reliability, and life prediction. The design and flight testing of composite ultralight gliders is documented. Advances in computer aided design and methods for nondestructive testing are also discussed.

  13. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is aimed at achieving, at Marshall Space Flight Center, a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed, and the elimination from that model of the uncontrollable rigid body modes was examined. The hardware and software requirements for adequate computation speed were also studied.

  14. A Note on Testing Mediated Effects in Structural Equation Models: Reconciling Past and Current Research on the Performance of the Test of Joint Significance

    ERIC Educational Resources Information Center

    Valente, Matthew J.; Gonzalez, Oscar; Miocevic, Milica; MacKinnon, David P.

    2016-01-01

    Methods to assess the significance of mediated effects in education and the social sciences are well studied and fall into two categories: single sample methods and computer-intensive methods. A popular single sample method to detect the significance of the mediated effect is the test of joint significance, and a popular computer-intensive method…
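
    The test of joint significance described above is simple to state: the mediated effect a*b is declared significant only when both constituent paths reject their individual null hypotheses. A minimal sketch with hypothetical path estimates (1.96 is the two-sided critical value at alpha = .05):

```python
# Test of joint significance for the mediated effect a*b: declare mediation
# significant when both the a-path and b-path z tests reject at alpha = .05.
def joint_significance(a, se_a, b, se_b, z_crit=1.96):
    z_a = a / se_a
    z_b = b / se_b
    return abs(z_a) > z_crit and abs(z_b) > z_crit

# Hypothetical path estimates and standard errors:
print(joint_significance(a=0.40, se_a=0.10, b=0.25, se_b=0.08))  # True
print(joint_significance(a=0.40, se_a=0.10, b=0.10, se_b=0.08))  # False
```

    The computer-intensive alternatives (e.g. bootstrap confidence intervals for a*b) replace the two fixed cutoffs with a resampled distribution of the product.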

  15. Efficient grid-based techniques for density functional theory

    NASA Astrophysics Data System (ADS)

    Rodriguez-Hernandez, Juan Ignacio

    Understanding the chemical and physical properties of molecules and materials at a fundamental level often requires quantum-mechanical models of these substances' electronic structure. This type of many-body quantum mechanics calculation is computationally demanding, hindering its application to substances with more than a few hundred atoms. The overarching goal of much research in quantum chemistry---and the topic of this dissertation---is to develop more efficient computational algorithms for electronic structure calculations. In particular, this dissertation develops two new numerical integration techniques for computing molecular and atomic properties within conventional Kohn-Sham Density Functional Theory (KS-DFT) of molecular electronic structure. The first of these grid-based techniques is based on the transformed sparse grid construction. In this construction, a sparse grid is generated in the unit cube and then mapped to real space according to the promolecular density using the conditional distribution transformation. The transformed sparse grid was implemented in the program deMon2k, where it is used as the numerical integrator for the exchange-correlation energy and potential in the KS-DFT procedure. We tested our grid by computing ground state energies, equilibrium geometries, and atomization energies. The accuracy of these test calculations shows that our grid is more efficient than some previous integration methods: our grids use fewer points to obtain the same accuracy. The transformed sparse grids were also tested for integrating, interpolating, and differentiating in different dimensions (n = 1, 2, 3, 6). The second technique is a grid-based method for computing atomic properties within QTAIM. It was also implemented in deMon2k. The performance of the method was tested by computing QTAIM atomic energies, charges, dipole moments, and quadrupole moments. For medium accuracy, our method is the fastest one we know of.
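
    The conditional distribution transformation can be sketched in one dimension: uniform nodes in the unit interval are pushed through the inverse CDF of a model density, concentrating quadrature points where the density is large. The density used below, exp(-r), is a crude stand-in for a promolecular density, not the actual construction in deMon2k:

```python
import math

# 1-D sketch of the "transformed grid" idea: uniform midpoint nodes in (0, 1)
# are mapped to [0, inf) through the inverse CDF of rho(r) = exp(-r).
def transformed_nodes(n):
    nodes = []
    for k in range(1, n + 1):
        u = (k - 0.5) / n                 # midpoint nodes in the unit interval
        r = -math.log(1.0 - u)            # inverse CDF of exp(-r)
        w = 1.0 / (n * math.exp(-r))      # quadrature weight = du / rho(r)
        nodes.append((r, w))
    return nodes

# Integrate f(r) = r * exp(-r) over [0, inf); the exact value is 1.
approx = sum(w * r * math.exp(-r) for r, w in transformed_nodes(2000))
print(round(approx, 3))  # ~ 1.0
```

    The payoff is that integrands resembling the model density become nearly flat after the transformation, so far fewer points are needed for a given accuracy.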

  16. The Construct Validity of Higher Order Structure-of-Intellect Abilities in a Battery of Tests Emphasizing the Product of Transformations: A Confirmatory Maximum Likelihood Factor Analysis.

    ERIC Educational Resources Information Center

    Khattab, Ali-Maher; And Others

    1982-01-01

    A causal modeling system, using confirmatory maximum likelihood factor analysis with the LISREL IV computer program, evaluated the construct validity underlying the higher order factor structure of a given correlation matrix of 46 structure-of-intellect tests emphasizing the product of transformations. (Author/PN)

  17. Computed tomography (CT) as a nondestructive test method used for composite helicopter components

    NASA Astrophysics Data System (ADS)

    Oster, Reinhold

    1991-09-01

    The first components of primary helicopter structures to be made of glass fiber reinforced plastics were the main and tail rotor blades of the Bo105 and BK 117 helicopters. These blades are now successfully produced in series. New developments in rotor components, e.g., the rotor blade technology of the Bo108 and PAH2 programs, make use of very complex fiber reinforced structures to achieve simplicity and strength. Computed tomography was found to be an outstanding nondestructive test method for examining the internal structure of components. A CT scanner generates x-ray attenuation measurements which are used to produce computer reconstructed images of any desired part of an object. The system images a range of flaws in composites in a number of views and planes. Several CT investigations and their results are reported taking composite helicopter components as an example.

  19. Computer Vision-Based Structural Displacement Measurement Robust to Light-Induced Image Degradation for In-Service Bridges

    PubMed Central

    Lee, Junhwa; Lee, Kyoung-Chan; Cho, Soojin

    2017-01-01

    The displacement responses of a civil engineering structure can provide important information regarding structural behaviors that help in assessing safety and serviceability. A displacement measurement using conventional devices, such as the linear variable differential transformer (LVDT), is challenging owing to issues related to inconvenient sensor installation that often requires additional temporary structures. A promising alternative is offered by computer vision, which typically provides a low-cost and non-contact displacement measurement that converts the movement of an object, mostly an attached marker, in the captured images into structural displacement. However, there is limited research on addressing light-induced measurement error caused by the inevitable sunlight in field-testing conditions. This study presents a computer vision-based displacement measurement approach tailored to a field-testing environment with enhanced robustness to strong sunlight. An image-processing algorithm with an adaptive region-of-interest (ROI) is proposed to reliably determine a marker’s location even when the marker is indistinct due to unfavorable light. The performance of the proposed system is experimentally validated in both laboratory-scale and field experiments. PMID:29019950
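    The adaptive region-of-interest idea can be sketched independently of the paper's full pipeline: the search window is re-centered on the marker's last detected position each frame, so the marker stays inside the ROI as the structure moves. The function name and the intensity-centroid detector below are illustrative assumptions, not the authors' algorithm:

```python
def track_marker(frames, start, roi_half=3, thresh=128):
    """Hypothetical adaptive-ROI tracker: the search window re-centers on
    the marker's last detected position in each frame (illustrative only)."""
    cy, cx = start
    path = []
    for img in frames:
        ys = range(max(0, cy - roi_half), min(len(img), cy + roi_half + 1))
        xs = range(max(0, cx - roi_half), min(len(img[0]), cx + roi_half + 1))
        m = sy = sx = 0.0
        for y in ys:
            for x in xs:
                if img[y][x] >= thresh:       # bright marker pixels only
                    m += img[y][x]
                    sy += img[y][x] * y
                    sx += img[y][x] * x
        if m > 0:                             # re-center the ROI on the marker
            cy, cx = int(round(sy / m)), int(round(sx / m))
        path.append((cy, cx))
    return path

# two synthetic 10x10 frames with a one-pixel marker moving (4,4) -> (5,5)
f1 = [[0] * 10 for _ in range(10)]; f1[4][4] = 255
f2 = [[0] * 10 for _ in range(10)]; f2[5][5] = 255
path = track_marker([f1, f2], start=(4, 4))
```

    When no pixel clears the threshold (e.g., the marker is washed out by sunlight), the tracker simply keeps the previous center, a crude stand-in for the paper's robustness measures.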

  20. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    EPA Science Inventory

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for...

  1. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    NASA Technical Reports Server (NTRS)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  2. The pKa Cooperative: A Collaborative Effort to Advance Structure-Based Calculations of pKa values and Electrostatic Effects in Proteins

    PubMed Central

    Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.

    2012-01-01

    The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foment experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a depository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized in the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877

  3. Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2006-01-01

    The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient-based and non-gradient-based optimization algorithms. Results show significant improvements in model predictions after parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainty quantification are presented.

  4. Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud

    PubMed Central

    Cianfrocco, Michael A; Leschziner, Andres E

    2015-01-01

    The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969

  5. Composite panel development at JPL

    NASA Technical Reports Server (NTRS)

    Mcelroy, Paul; Helms, Rich

    1988-01-01

    Parametric computer studies can be used in a cost effective manner to determine optimized composite mirror panel designs. An InterDisciplinary computer Model (IDM) was created to aid in the development of high precision reflector panels for LDR. The materials properties, thermal responses, structural geometries, and radio/optical precision are synergistically analyzed for specific panel designs. Promising panel designs are fabricated and tested so that comparison with panel test results can be used to verify performance prediction models and accommodate design refinement. The iterative approach of computer design and model refinement with performance testing and materials optimization has shown good results for LDR panels.

  6. Accurate optimization of amino acid form factors for computing small-angle X-ray scattering intensity of atomistic protein structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, Dudu; Yang, Sichun; Lu, Lanyuan

    2016-06-20

    Structure modelling via small-angle X-ray scattering (SAXS) data generally requires intensive computations of scattering intensity from any given biomolecular structure, where the accurate evaluation of SAXS profiles using coarse-grained (CG) methods is vital to improve computational efficiency. To date, most CG SAXS computing methods have been based on a single-bead-per-residue approximation but have neglected structural correlations between amino acids. To improve the accuracy of scattering calculations, accurate CG form factors of amino acids are now derived using a rigorous optimization strategy, termed electron-density matching (EDM), to best fit electron-density distributions of protein structures. This EDM method is compared with and tested against other CG SAXS computing methods, and the resulting CG SAXS profiles from EDM agree better with all-atom theoretical SAXS data. By including the protein hydration shell represented by explicit CG water molecules and the correction of protein excluded volume, the developed CG form factors also reproduce the selected experimental SAXS profiles with very small deviations. Taken together, these EDM-derived CG form factors present an accurate and efficient computational approach for SAXS computing, especially when higher molecular details (represented by the q range of the SAXS data) become necessary for effective structure modelling.

  7. Response of basic structural elements and B-52 structural components to simulated nuclear overpressure. Volume II-program data (basic structural elements). Final report, 1 June 1977-30 September 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syring, R.P.; Grubb, R.L.

    1979-09-30

    This document reports on the following: (1) experimental determination of the response of 16 basic structural elements and 7 B-52 components to simulated nuclear overpressure environments (utilizing Sandia Corporation's Thunderpipe Shock Tube), (2) analysis of these test specimens utilizing the NOVA-2 computer program, and (3) correlation of test and analysis results.

  8. Structural Stability of Mathematical Models of National Economy

    NASA Astrophysics Data System (ADS)

    Ashimov, Abdykappar A.; Sultanov, Bahyt T.; Borovskiy, Yuriy V.; Adilov, Zheksenbek M.; Ashimov, Askar A.

    2011-12-01

    In the paper we test the robustness of particular dynamic systems in compact regions of a plane and the weak structural stability of one high-order dynamic system in a compact region of its phase space. The tests were carried out based on the fundamental theory of dynamical systems on a plane and on the conditions for weak structural stability of high-order dynamic systems. A numerical algorithm for testing the weak structural stability of high-order dynamic systems has been proposed. Based on this algorithm we assess the weak structural stability of one computable general equilibrium model.
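    A standard ingredient of structural stability tests on a plane is verifying that each equilibrium is hyperbolic, i.e., that no eigenvalue of the Jacobian lies on the imaginary axis. For a 2x2 Jacobian this reduces to conditions on the trace and determinant; the sketch below is a generic illustration of that check, not the authors' algorithm:

```python
def is_hyperbolic_2d(jac, tol=1e-9):
    """True if a planar equilibrium with Jacobian jac is hyperbolic
    (no eigenvalue on the imaginary axis)."""
    (a, b), (c, d) = jac
    tr, det = a + d, a * d - b * c
    if abs(det) < tol:              # zero eigenvalue
        return False
    if det > 0 and abs(tr) < tol:   # purely imaginary pair (a center)
        return False
    return True

saddle = [[1.0, 0.0], [0.0, -1.0]]    # eigenvalues +1, -1: hyperbolic
center = [[0.0, 1.0], [-1.0, 0.0]]    # eigenvalues +/- i: not hyperbolic
spiral = [[-1.0, 1.0], [-1.0, -1.0]]  # eigenvalues -1 +/- i: hyperbolic
```

    For det > 0 the eigenvalues share the real part tr/2, so hyperbolicity fails only when tr = 0; for det < 0 the equilibrium is a saddle and always hyperbolic.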

  9. Dynamic Docking Test System (DDTS) active table computer program NASA Advanced Docking System (NADS)

    NASA Technical Reports Server (NTRS)

    Gates, R. M.; Jantz, R. E.

    1974-01-01

    A computer program was developed to describe the three-dimensional motion of the Dynamic Docking Test System active table. The input consists of inertia and geometry data, actuator structural data, forcing function data, hydraulics data, servo electronics data, and integration control data. The output consists of table responses, actuator bending responses, and actuator responses.

  10. The NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory

    NASA Technical Reports Server (NTRS)

    Mcgaw, M. A.; Bartolotta, P. A.

    1987-01-01

    The physical organization of the NASA Lewis Research Center High Temperature Fatigue and Structures Laboratory is described. Particular attention is given to uniaxial test systems, high cycle/low cycle testing systems, axial torsional test systems, computer system capabilities, and a laboratory addition. The proposed addition will double the floor area of the present laboratory and will be equipped with its own control room.

  11. The Evidence for a Subscore Structure in a Test of English Language Competency for English Language Learners

    ERIC Educational Resources Information Center

    Reckase, Mark D.; Xu, Jing-Ru

    2015-01-01

    How to compute and report subscores for a test that was originally designed for reporting scores on a unidimensional scale has been a topic of interest in recent years. In the research reported here, we describe an application of multidimensional item response theory to identify a subscore structure in a test designed for reporting results using a…

  12. Structural integrity of a confinement vessel for testing nuclear fuels for space propulsion

    NASA Astrophysics Data System (ADS)

    Bergmann, V. L.

    Nuclear propulsion systems for rockets could significantly reduce the travel time to distant destinations in space. However, long before such a concept can become reality, a significant effort must be invested in analysis and ground testing to guide the development of nuclear fuels. Any testing in support of development of nuclear fuels for space propulsion must be safely contained to prevent the release of radioactive materials. This paper describes analyses performed to assess the structural integrity of a test confinement vessel. The confinement structure, a stainless steel pressure vessel with bolted flanges, was designed for operating static pressures in accordance with the ASME Boiler and Pressure Vessel Code. In addition to the static operating pressures, the confinement barrier must withstand static overpressures from off-normal conditions without releasing radioactive material. Results from axisymmetric finite element analyses are used to evaluate the response of the confinement structure under design and accident conditions. For the static design conditions, the stresses computed from the ASME code are compared with the stresses computed by the finite element method.
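    The kind of code-allowable comparison described above can be illustrated with the textbook thin-wall estimate for a cylindrical vessel, where the hoop stress is sigma = p r / t. The check below is a generic sketch with illustrative numbers, not the paper's finite element analysis or the ASME code procedure:

```python
def hoop_stress(p, r, t):
    """Textbook thin-wall cylinder hoop stress, sigma = p * r / t."""
    return p * r / t

def within_allowable(p, r, t, allowable):
    """Generic pass/fail check against an allowable stress (illustrative)."""
    return hoop_stress(p, r, t) <= allowable

# illustrative numbers: 2 MPa internal pressure, 0.5 m radius, 10 mm wall
sigma = hoop_stress(2.0e6, 0.5, 0.010)          # 100 MPa
ok = within_allowable(2.0e6, 0.5, 0.010, allowable=1.38e8)
```

    A finite element model replaces this closed form near flanges and bolted joints, where the thin-wall assumption breaks down.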

  13. Embellishing Problem-Solving Examples with Deep Structure Information Facilitates Transfer

    ERIC Educational Resources Information Center

    Lee, Hee Seung; Betts, Shawn; Anderson, John R.

    2017-01-01

    Appreciation of problem structure is critical to successful learning. Two experiments investigated effective ways of communicating problem structure in a computer-based learning environment and tested whether verbal instruction is necessary to specify solution steps, when deep structure is already embellished by instructional examples.…

  14. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    PubMed

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structural health monitoring is a topic of great interest in port structures due to the ageing of structures and the limitations of existing evaluation methods. This paper presents a cloud computing-based stability evaluation platform for a pier type port structure using Fiber Bragg Grating (FBG) sensors in a system consisting of an FBG strain sensor, FBG displacement gauge, FBG angle meter, gateway, and cloud computing-based web server. The sensors were installed on core components of the structure and measurements were taken to evaluate the structures. The measurement values were transmitted to the web server via the gateway to analyze and visualize them. All data were analyzed and visualized in the web server to evaluate the structure based on the safety evaluation index (SEI). The stability evaluation platform for pier type port structures enables efficient monitoring of the structures, which can be carried out easily anytime and anywhere by converging new technologies such as cloud computing and FBG sensors. In addition, the platform has been successfully implemented at “Maryang Harbor,” situated in Maryang-Meyon of Korea, to test its durability.

  15. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
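    pvsolve is Choleski-based; the factor-and-substitute structure it parallelizes can be sketched in scalar form. This plain-Python version shows only the algorithm and omits the parallel-vector optimizations that give pvsolve its speedup:

```python
import math

def cholesky_solve(A, b):
    """Solve A x = b for symmetric positive-definite A via A = L L^T."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][i] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    y = [0.0] * n                      # forward substitution: L y = b
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n                      # back substitution: L^T x = y
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

x = cholesky_solve([[4.0, 2.0], [2.0, 3.0]], [10.0, 8.0])
```

    In a finite-element setting the inner sums over k are the long vectorizable loops; distributing blocks of columns across processors is what the paper's solver exploits.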

  16. Response of basic structural elements and B-52 structural components to simulated nuclear overpressure. Volume I-program description and results (basic structural elements). Final report, 1 June 1977-30 September 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syring, R.P.; Grubb, R.L.

    1979-09-30

    This document reports on the following: (1) experimental determination of the response of 16 basic structural elements and 7 B-52 components to simulated nuclear overpressure environments (utilizing Sandia Corporation's Thunderpipe Shock Tube), (2) analysis of these test specimens utilizing the NOVA-2 computer program, and (3) correlation of test and analysis results.

  17. First Test of Fan Active Noise Control (ANC) Completed

    NASA Technical Reports Server (NTRS)

    2005-01-01

    With the advent of ultrahigh-bypass engines, the space available for passive acoustic treatment is becoming more limited, whereas noise regulations are becoming more stringent. Active noise control (ANC) holds promise as a solution to this problem. It uses secondary (added) noise sources to reduce or eliminate the offending noise radiation. The first active noise control test on the low-speed fan test bed was a General Electric Company system designed to control either the exhaust or inlet fan tone. This system consists of a "ring source," an in-duct array of error microphones, and a control computer. Fan tone noise propagates in a duct in the form of spinning waves. These waves are detected by the microphone array, and the computer identifies their spinning structure. The computer then controls the "ring source" to generate waves that have the same spinning structure and amplitude, but are 180° out of phase with the fan noise. This computer-generated tone cancels the fan tone before it radiates from the duct and is heard in the far field. The "ring source" used in these tests is a cylindrical array of 16 flat-plate acoustic radiators that are driven by thin piezoceramic sheets bonded to their back surfaces. The resulting source can produce spinning waves up to mode 7 at levels high enough to cancel the fan tone. The control software is flexible enough to work on spinning mode orders from -6 to 6. In this test, the fan was configured to produce a tone of order 6. The complete modal (spinning and radial) structure of the tones was measured with two built-in sets of rotating microphone rakes. These rakes provide a measurement of the system performance independent of the control system error microphones. In addition, the far-field noise was measured with a semicircular array of 28 microphones. This test represents the first in a series of tests that demonstrate different active noise control concepts, each on a progressively more complicated modal structure. The tests are in preparation for a demonstration on a flight-type engine.
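    The cancellation principle in this description is superposition: the control source emits a wave with the same structure and amplitude as the fan tone but 180 degrees out of phase, so the sum is ideally zero. A minimal single-revolution sketch of that superposition, with illustrative sampling numbers:

```python
import math

N, mode = 64, 6
# fan tone sampled at N stations around the duct circumference (spinning mode 6)
fan = [math.sin(2 * math.pi * mode * k / N) for k in range(N)]
# control tone: equal amplitude, shifted by 180 degrees (pi radians)
anti = [math.sin(2 * math.pi * mode * k / N + math.pi) for k in range(N)]
residual = [f + a for f, a in zip(fan, anti)]
```

    In the real system the phase and amplitude of the "ring source" are tuned by the controller from the error-microphone signals rather than set analytically as here.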

  18. Fast normal mode computations of capsid dynamics inspired by resonance

    NASA Astrophysics Data System (ADS)

    Na, Hyuntae; Song, Guang

    2018-07-01

    Increasingly more and larger structural complexes are being determined experimentally. The sizes of these systems pose a formidable computational challenge to the study of their vibrational dynamics by normal mode analysis. To overcome this challenge, this work presents a novel resonance-inspired approach. Tests on large shell structures of protein capsids demonstrate that there is a strong resonance between the vibrations of a whole capsid and those of individual capsomeres. We then show how this resonance can be taken advantage of to significantly speed up normal mode computations.

  19. Minimum Conflict Mainstreaming.

    ERIC Educational Resources Information Center

    Awen, Ed; And Others

    Computer technology is discussed as a tool for facilitating the implementation of the mainstreaming process. Minimum conflict mainstreaming/merging (MCM) is defined as an approach which utilizes computer technology to circumvent such structural obstacles to mainstreaming as transportation scheduling, screening and assignment of students, testing,…

  20. Ultrasound

    MedlinePlus

    ... reflect off body structures. A computer receives the waves and uses them to create a picture. Unlike with an x-ray or CT scan, this test does not use ionizing radiation. The test is done in the ultrasound ...

  1. Predicting multi-wall structural response to hypervelocity impact using the hull code

    NASA Technical Reports Server (NTRS)

    Schonberg, William P.

    1993-01-01

    Previously, multi-wall structures have been analyzed extensively, primarily through experiment, as a means of increasing the meteoroid/space debris impact protection of spacecraft. As structural configurations become more varied, the number of tests required to characterize their response increases dramatically. As an alternative to experimental testing, numerical modeling of high-speed impact phenomena is often being used to predict the response of a variety of structural systems under different impact loading conditions. The results of comparing experimental tests to Hull Hydrodynamic Computer Code predictions are reported. Also, the results of a numerical parametric study of multi-wall structural response to hypervelocity cylindrical projectile impact are presented.

  2. CSI Flight Computer System and experimental test results

    NASA Technical Reports Server (NTRS)

    Sparks, Dean W., Jr.; Peri, F., Jr.; Schuler, P.

    1993-01-01

    This paper describes the CSI Computer System (CCS) and the experimental tests performed to validate its functionality. This system is comprised of two major components: the space flight qualified Excitation and Damping Subsystem (EDS) which performs controls calculations; and the Remote Interface Unit (RIU) which is used for data acquisition, transmission, and filtering. The flight-like RIU is the interface between the EDS and the sensors and actuators positioned on the particular structure under control. The EDS and RIU communicate over the MIL-STD-1553B, a space flight qualified bus. To test the CCS under realistic conditions, it was connected to the Phase-0 CSI Evolutionary Model (CEM) at NASA Langley Research Center. The following schematic shows how the CCS is connected to the CEM. Various tests were performed which validated the ability of the system to perform control/structures experiments.

  3. A Computational Approach for Model Update of an LS-DYNA Energy Absorbing Cell

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Jackson, Karen E.; Kellas, Sotiris

    2008-01-01

    NASA and its contractors are working on structural concepts for absorbing impact energy of aerospace vehicles. Recently, concepts in the form of multi-cell honeycomb-like structures designed to crush under load have been investigated for both space and aeronautics applications. Efforts to understand these concepts are progressing from tests of individual cells to tests of systems with hundreds of cells. Because of fabrication irregularities, geometry irregularities, and material properties uncertainties, the problem of reconciling analytical models, in particular LS-DYNA models, with experimental data is a challenge. A first look at the correlation results between single cell load/deflection data with LS-DYNA predictions showed problems which prompted additional work in this area. This paper describes a computational approach that uses analysis of variance, deterministic sampling techniques, response surface modeling, and genetic optimization to reconcile test with analysis results. Analysis of variance provides a screening technique for selection of critical parameters used when reconciling test with analysis. In this study, complete ignorance of the parameter distribution is assumed and, therefore, the value of any parameter within the range that is computed using the optimization procedure is considered to be equally likely. Mean values from tests are matched against LS-DYNA solutions by minimizing the square error using a genetic optimization. The paper presents the computational methodology along with results obtained using this approach.
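    The genetic optimization step can be caricatured with a toy real-coded GA that minimizes a squared-error objective between a model response and a measured mean. The population size, operators, one-parameter search space, and the linear "analysis response" below are illustrative choices, not those of the study:

```python
import random

def genetic_min(err, lo, hi, pop=30, gens=60, seed=1):
    """Toy real-coded GA: tournament selection, blend crossover, mutation."""
    rng = random.Random(seed)
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=err)    # tournament parent 1
            b = min(rng.sample(P, 3), key=err)    # tournament parent 2
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.02 * (hi - lo))
            nxt.append(min(max(child, lo), hi))   # clamp to bounds
        P = nxt
    return min(P, key=err)

# match a measured mean load of 12.5 (units arbitrary) by tuning one parameter
measured = 12.5
model = lambda x: 5.0 + 3.0 * x        # hypothetical analysis response
best = genetic_min(lambda x: (model(x) - measured) ** 2, 0.0, 5.0)
```

    Because the parameter distribution is assumed unknown, any value inside the bounds returned by the optimizer is treated as equally likely, mirroring the "complete ignorance" assumption in the abstract.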

  4. Robustness of Ability Estimation to Multidimensionality in CAST with Implications to Test Assembly

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Nandakumar, Ratna

    2006-01-01

    Computer Adaptive Sequential Testing (CAST) is a test delivery model that combines features of the traditional conventional paper-and-pencil testing and item-based computerized adaptive testing (CAT). The basic structure of CAST is a panel composed of multiple testlets adaptively administered to examinees at different stages. Current applications…

  5. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic Willam-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  6. Combining 3D structure of real video and synthetic objects

    NASA Astrophysics Data System (ADS)

    Kim, Man-Bae; Song, Mun-Sup; Kim, Do-Kyoon

    1998-04-01

    This paper presents a new approach to combining real video and synthetic objects. The purpose of this work is to use the proposed technology in the fields of advanced animation, virtual reality, games, and so forth. Computer graphics has been used in the fields previously mentioned. Recently, some applications have added real video to graphic scenes to augment the realism that computer graphics lacks. This approach, called augmented or mixed reality, can produce a more realistic environment than one built entirely from computer graphics. Our approach differs from virtual reality and augmented reality in that computer-generated graphic objects are combined with 3D structure extracted from monocular image sequences. The extraction of the 3D structure requires the estimation of 3D depth followed by the construction of a height map. Graphic objects are then combined with the height map. The realization of our proposed approach is carried out in the following steps: (1) We derive 3D structure from test image sequences. The extraction of the 3D structure requires the estimation of depth and the construction of a height map. Due to the contents of the test sequence, the height map represents the 3D structure. (2) The height map is modeled by Delaunay triangulation or a Bezier surface and each planar surface is texture-mapped. (3) Finally, graphic objects are combined with the height map. Because the 3D structure of the height map is already known, Step (3) is easily manipulated. Following this procedure, we produced an animation video demonstrating the combination of the 3D structure and graphic models. Users can navigate the realistic 3D world whose associated image is rendered on the display monitor.
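    Compositing a synthetic object onto the extracted 3D structure requires sampling the height map at arbitrary positions. The paper models the surface with Delaunay triangulation or Bezier patches; as a simpler stand-in, a bilinear lookup on a regular grid illustrates the sampling step:

```python
def height_at(hmap, x, y):
    """Bilinear interpolation of a regular height-map grid at (x, y)."""
    x0 = min(int(x), len(hmap[0]) - 2)   # clamp so the 2x2 cell stays in range
    y0 = min(int(y), len(hmap) - 2)
    dx, dy = x - x0, y - y0
    h00, h10 = hmap[y0][x0], hmap[y0][x0 + 1]
    h01, h11 = hmap[y0 + 1][x0], hmap[y0 + 1][x0 + 1]
    top = h00 * (1 - dx) + h10 * dx      # interpolate along x on both rows
    bot = h01 * (1 - dx) + h11 * dx
    return top * (1 - dy) + bot * dy     # then along y

hmap = [[0.0, 1.0],
        [2.0, 3.0]]
h = height_at(hmap, 0.5, 0.5)
```

    A graphic object placed at (x, y) is then translated vertically by `height_at(hmap, x, y)` so it rests on the recovered surface.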

  7. Optimal Sequential Rules for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  8. A ridge tracking algorithm and error estimate for efficient computation of Lagrangian coherent structures.

    PubMed

    Lipinski, Doug; Mohseni, Kamran

    2010-03-01

    A ridge tracking algorithm for the computation and extraction of Lagrangian coherent structures (LCS) is developed. This algorithm takes advantage of the spatial coherence of LCS by tracking the ridges which form LCS to avoid unnecessary computations away from the ridges. We also make use of the temporal coherence of LCS by approximating the time dependent motion of the LCS with passive tracer particles. To justify this approximation, we provide an estimate of the difference between the motion of the LCS and that of tracer particles which begin on the LCS. In addition to the speedup in computational time, the ridge tracking algorithm uses less memory and results in smaller output files than the standard LCS algorithm. Finally, we apply our ridge tracking algorithm to two test cases, an analytically defined double gyre as well as the more complicated example of the numerical simulation of a swimming jellyfish. In our test cases, we find up to a 35 times speedup when compared with the standard LCS algorithm.
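    The LCS ridges being tracked are ridges of the finite-time Lyapunov exponent (FTLE) field. A minimal sketch of the underlying FTLE computation on the analytically defined double gyre (the parameter values and the simple Euler integrator are common illustrative choices, not taken from the paper):

    ```python
    import numpy as np

    # Double-gyre velocity field, a standard analytic LCS test case.
    A, eps, omega = 0.1, 0.25, 2 * np.pi / 10  # assumed textbook values

    def velocity(x, y, t):
        a = eps * np.sin(omega * t)
        b = 1 - 2 * eps * np.sin(omega * t)
        f = a * x**2 + b * x
        dfdx = 2 * a * x + b
        u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
        v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
        return u, v

    def flow_map(x0, y0, t0=0.0, T=5.0, steps=50):
        # Advect tracer particles with explicit Euler steps (RK4 would be
        # used in practice; Euler keeps the sketch short).
        x, y, dt = x0.copy(), y0.copy(), T / steps
        for k in range(steps):
            u, v = velocity(x, y, t0 + k * dt)
            x, y = x + dt * u, y + dt * v
        return x, y

    # FTLE on a coarse grid: finite-difference the flow map, then take the
    # largest eigenvalue of the 2x2 Cauchy-Green deformation tensor.
    nx, ny, T = 40, 20, 5.0
    xg, yg = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
    fx, fy = flow_map(xg, yg, T=T)
    dxdx = np.gradient(fx, axis=1) / np.gradient(xg, axis=1)
    dxdy = np.gradient(fx, axis=0) / np.gradient(yg, axis=0)
    dydx = np.gradient(fy, axis=1) / np.gradient(xg, axis=1)
    dydy = np.gradient(fy, axis=0) / np.gradient(yg, axis=0)
    c11 = dxdx**2 + dydx**2
    c12 = dxdx * dxdy + dydx * dydy
    c22 = dxdy**2 + dydy**2
    lmax = 0.5 * (c11 + c22) + np.sqrt(0.25 * (c11 - c22)**2 + c12**2)
    ftle = np.log(np.maximum(lmax, 1e-12)) / (2 * T)
    # LCS ridges are the curves of locally maximal FTLE; the paper's ridge
    # tracker marches along these maxima instead of evaluating every grid point.
    ```

    The speedup reported in the abstract comes precisely from avoiding the full-grid evaluation shown here.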

  9. Detection of Answer Copying Based on the Structure of a High-Stakes Test

    ERIC Educational Resources Information Center

    Belov, Dmitry I.

    2011-01-01

    This article presents the Variable Match Index (VM-Index), a new statistic for detecting answer copying. The power of the VM-Index relies on two-dimensional conditioning as well as the structure of the test. The asymptotic distribution of the VM-Index is analyzed by reduction to Poisson trials. A computational study comparing the VM-Index with the…

  10. Thermal/Structural Tailoring of Engine Blades (T/STAEBL) User's manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1994-01-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.

  11. Thermal/Structural Tailoring of Engine Blades (T/STAEBL): User's manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.

    1994-03-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.

  12. A Worst-Case Approach for On-Line Flutter Prediction

    NASA Technical Reports Server (NTRS)

    Lind, Rick C.; Brenner, Martin J.

    1998-01-01

    Worst-case flutter margins may be computed for a linear model with respect to a set of uncertainty operators using the structured singular value. This paper considers an on-line implementation to compute these robust margins in a flight test program. Uncertainty descriptions are updated at test points to account for unmodeled time-varying dynamics of the airplane by ensuring the robust model is not invalidated by measured flight data. Robust margins computed with respect to this uncertainty remain conservative to the changing dynamics throughout the flight. A simulation clearly demonstrates this method can improve the efficiency of flight testing by accurately predicting the flutter margin to improve safety while reducing the necessary flight time.

  13. Heat transfer, thermal stress analysis and the dynamic behaviour of high power RF structures. [MARC and SUPERFISH codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKeown, J.; Labrie, J.P.

    1983-08-01

    A general purpose finite element computer code called MARC is used to calculate the temperature distribution and dimensional changes in linear accelerator rf structures. Both steady state and transient behaviour are examined with the computer model. Combining results from MARC with the cavity evaluation computer code SUPERFISH, the static and dynamic behaviour of a structure under power is investigated. Structure cooling is studied to minimize loss in shunt impedance and frequency shifts during high power operation. Results are compared with an experimental test carried out on a cw 805 MHz on-axis coupled structure at an energy gradient of 1.8 MeV/m. The model has also been used to compare the performance of on-axis and coaxial structures and has guided the mechanical design of structures suitable for average gradients in excess of 2.0 MeV/m at 2.45 GHz.

  14. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    NASA Astrophysics Data System (ADS)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications. Besides the random vibration specification, the total mass, and the turnover frequency of the load (test item), the factor C2 is a very important parameter. A number of computational methods to estimate C2 are described in the literature, namely the simple and the complex two-degrees-of-freedom systems (STDFS and CTDFS, respectively). Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure that transfers the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for computing a realistic value of C2 for a representative force-limited random vibration test when the description of the adjacent structure (source) is more or less unknown. Marchand formulated a conservative estimate of C2 based on the maximum modal effective mass and damping of the test item (load) when no description of the supporting structure (source) is available [13]. Marchand also gave a formal description of C2 in terms of the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2; however, finite element models are needed to compute the PSD spectra of both the acceleration and the force at that interface. Stevens presented the coupled systems modal approach (CSMA), in which simplified asparagus-patch models (parallel-oscillator representations) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies.
    When the random acceleration vibration specification is given, the CSMA method is suitable for computing the value of the parameter C2. When no mathematical model of the source is available, estimates of C2 can be found in the literature. In this paper a probabilistic mathematical representation of the unknown source is proposed, such that the asparagus-patch model of the source can be approximated. The value of C2 can then be computed with the CSMA method, knowing the apparent mass of the load and the random acceleration specification at the load-source interface. Strength and stiffness design rules for spacecraft, instrumentation, units, etc., as given in ECSS standards and handbooks, launch vehicle user's manuals, papers, and books, are applied, with a probabilistic description of the design parameters. A simple experiment is worked out as an example.
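    Once a value of C2 is chosen, the force limit itself follows Scharton's semi-empirical form: below the turnover frequency, the force PSD is C2 times the total mass squared times the acceleration PSD, rolled off above it. A minimal sketch; all numerical values here (C2 = 2, a 50 kg load, an 80 Hz turnover, a flat 0.04 g²/Hz specification) are illustrative assumptions, not taken from the paper:

    ```python
    import numpy as np

    G = 9.81                    # m/s^2 per g
    C2 = 2.0                    # the factor discussed in the paper (assumed value)
    M0 = 50.0                   # total mass of the test item, kg (assumed)
    f0 = 80.0                   # turnover frequency, Hz (assumed)
    f = np.linspace(20, 2000, 200)
    S_aa = np.full_like(f, 0.04) * G**2   # acceleration PSD, (m/s^2)^2/Hz

    # Semi-empirical force limit: S_FF = C2 * M0^2 * S_AA below f0,
    # rolled off as (f0/f)^2 above the turnover frequency.
    S_ff = C2 * M0**2 * S_aa
    S_ff[f > f0] *= (f0 / f[f > f0])**2
    ```

    During the test, the measured interface force PSD would be notched wherever it exceeds this limit.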

  15. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program which is essentially based on modal synthesis combined with a central, finite difference, numerical integration scheme. The methodology leads to a modular or building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.

  16. Methods for evaluating the predictive accuracy of structural dynamic models

    NASA Technical Reports Server (NTRS)

    Hasselman, Timothy K.; Chrostowski, Jon D.

    1991-01-01

    Modeling uncertainty is defined in terms of the difference between predicted and measured eigenvalues and eigenvectors. Data compiled from 22 sets of analysis/test results were used to create statistical databases for large truss-type space structures and both pretest and posttest models of conventional satellite-type space structures. Modeling uncertainty is propagated through the model to produce intervals of uncertainty on frequency response functions, both amplitude and phase. This methodology was used successfully to evaluate the predictive accuracy of several structures, including the NASA CSI Evolutionary Structure tested at Langley Research Center. Test measurements for this structure were, for the most part, within the ±one-sigma intervals of predicted accuracy, demonstrating the validity of the methodology and computer code.

  17. The effects of computer simulation versus hands-on dissection and the placement of computer simulation within the learning cycle on student achievement and attitude

    NASA Astrophysics Data System (ADS)

    Hopkins, Kathryn Susan

    The value of dissection as an instructional strategy has been debated, but not evidenced in research literature. The purpose of this study was to examine the efficacy of using computer simulated frog dissection as a substitute for traditional hands-on frog dissection and to examine the possible enhancement of achievement by combining the two strategies in a specific sequence. In this study, 134 biology students at two Central Texas schools were divided into the five following treatment groups: computer simulation of frog dissection, computer simulation before dissection, traditional hands-on frog dissection, dissection before computer simulation, and textual worksheet materials. The effects on achievement were evaluated by having students label 10 structures on three diagrams, identify 11 pinned structures on a prosected frog, and answer 9 multiple-choice questions on the dissection process. Attitude was evaluated using a thirty-item survey with a five-point Likert scale. The quasi-experimental design was a pretest/post-test/delayed post-test nonequivalent group design for both control and experimental groups, a 2 x 2 x 5 completely randomized factorial design (gender, school, five treatments). The pretest/post-test design was incorporated to control for prior knowledge using analysis of covariance. The dissection-only group evidenced significantly higher performance than all other treatments except dissection-then-computer on the post-test segment requiring students to label pinned anatomical parts on a prosected frog. Interactions between treatment and school, as well as between treatment and gender, were found to be significant. The diagram and attitude post-tests evidenced no significant difference. Results on the nine multiple-choice questions about dissection procedures indicated a significant difference between schools. The interaction between treatment and school was also found to be significant.
On a delayed post-test, a significant difference in gender was found on the diagram labeling segment of the post-test. Males were reported to have the higher score. Since existing research conflicts with this study's results, additional research using authentic assessment is recommended. Instruction should be aligned with dissection content and process objectives for each treatment group, and the teacher variable should be controlled.

  18. Structure Computation of Quiet Spike[Trademark] Flight-Test Data During Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2008-01-01

    System identification or mathematical modeling is used in the aerospace community for development of simulation models for robust control law design. These models are often described as linear time-invariant processes. Nevertheless, it is well known that the underlying process is often nonlinear. The reason for using a linear approach has been due to the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades, the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of F-15B Quiet Spike(TradeMark) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description that may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion, which may save significant development time and costs. The objectives of this study are to demonstrate via analysis of F-15B Quiet Spike aeroservoelastic flight-test data for several flight conditions that 1) linear models are inefficient for modeling aeroservoelastic data, 2) nonlinear identification provides a parsimonious model description while providing a high percent fit for cross-validated data, and 3) the model structure and parameters vary as the flight condition is altered.

  19. DETECTING UNSPECIFIED STRUCTURE IN LOW-COUNT IMAGES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Nathan M.; Dyk, David A. van; Kashyap, Vinay L.

    Unexpected structure in images of astronomical sources often presents itself upon visual inspection of the image, but such apparent structure may either correspond to true features in the source or be due to noise in the data. This paper presents a method for testing whether inferred structure in an image with Poisson noise represents a significant departure from a baseline (null) model of the image. To infer image structure, we conduct a Bayesian analysis of a full model that uses a multiscale component to allow flexible departures from the posited null model. As a test statistic, we use a tail probability of the posterior distribution under the full model. This choice of test statistic allows us to estimate a computationally efficient upper bound on a p-value that enables us to draw strong conclusions even when there are limited computational resources that can be devoted to simulations under the null model. We demonstrate the statistical performance of our method on simulated images. Applying our method to an X-ray image of the quasar 0730+257, we find significant evidence against the null model of a single point source and uniform background, lending support to the claim of an X-ray jet.

  20. Computational Modeling of Liquid and Gaseous Control Valves

    NASA Technical Reports Server (NTRS)

    Daines, Russell; Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Moore, Arden; Sulyma, Peter

    2005-01-01

    In this paper, computational modeling efforts undertaken at NASA Stennis Space Center in support of rocket engine component testing are discussed. Such analyses include structurally complex cryogenic liquid valves and gas valves operating at high pressures and flow rates. Basic modeling and initial successes are documented, and other issues that make valve modeling at SSC somewhat unique are also addressed. These include transient behavior, valve stall, and the determination of flow patterns in LOX valves. Hexahedral structured grids are used for valves that can be simplified through the use of an axisymmetric approximation. Hybrid unstructured methodology is used for structurally complex valves that have disparate length scales and complex flow paths involving strong swirl, local recirculation zones, and secondary-flow effects. Hexahedral (structured), unstructured, and hybrid meshes are compared for accuracy and computational efficiency. Accuracy is determined using verification and validation techniques.

  1. Smart command recognizer (SCR) - For development, test, and implementation of speech commands

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.; Bunnell, John W.; Krones, Robert R.

    1988-01-01

    The SCR, a rapid prototyping system for the development, testing, and implementation of speech commands in a flight simulator or test aircraft, is described. A single unit performs all functions needed during these three phases of system development, while the use of common software and speech command data structure files greatly reduces the preparation time for successive development phases. As a smart peripheral to a simulation or flight host computer, the SCR interprets the pilot's spoken input and passes command codes to the simulation or flight computer.

  2. Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data

    NASA Technical Reports Server (NTRS)

    Johnson, Marty E.; Lalime, Aimee L.; Grosveld, Ferdinand W.; Rizzi, Stephen A.; Sullivan, Brenda M.

    2003-01-01

    Applying binaural simulation techniques to structural acoustic data can be very computationally intensive because the number of discrete noise sources can be very large. Typically, Head Related Transfer Functions (HRTFs) are used to individually filter the signals from each of the sources in the acoustic field. Therefore, creating a binaural simulation implies the use of potentially hundreds of real-time filters. This paper details two methods of reducing the number of real-time computations required by: (i) using the singular value decomposition (SVD) to reduce the complexity of the HRTFs by breaking them into dominant singular values and vectors and (ii) using equivalent source reduction (ESR) to reduce the number of sources to be analyzed in real time by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. The ESR and SVD reduction methods can be combined to provide an estimated computation time reduction of 99.4% for the structural acoustic data tested. In addition, preliminary tests have shown a 97% correlation between the results of the combined reduction methods and the results found with the current binaural simulation techniques.
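    The SVD step (i) can be sketched as follows. The filter bank below is a synthetic stand-in for measured HRTFs, assumed only for illustration; the idea is that each source then needs a few mixing gains into a small set of shared real-time filters instead of its own dedicated filter:

    ```python
    import numpy as np

    # Toy filter bank: one FIR filter (row) per source direction.
    rng = np.random.default_rng(1)
    n_sources, n_taps = 200, 128
    directions = np.linspace(0, np.pi, n_sources)[:, None]
    taps = np.arange(n_taps)[None, :]
    # Smoothly varying filters -> rapidly decaying singular values.
    H = np.cos(directions * 3) * np.exp(-taps / 20.0) \
        + 0.3 * np.sin(directions) * np.exp(-taps / 40.0)

    # Truncated SVD: r shared filters (rows of Vt) plus per-source gains.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    r = 4
    H_approx = (U[:, :r] * s[:r]) @ Vt[:r]

    err = np.linalg.norm(H - H_approx) / np.linalg.norm(H)
    print(f"relative error with {r} shared filters: {err:.2e}")
    ```

    Real-time cost then scales with r rather than with the number of sources, which is the essence of the reduction described in the abstract.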

  3. Using Molecular Dynamics Simulations as an Aid in the Prediction of Domain Swapping of Computationally Designed Protein Variants.

    PubMed

    Mou, Yun; Huang, Po-Ssu; Thomas, Leonard M; Mayo, Stephen L

    2015-08-14

    In standard implementations of computational protein design, a positive-design approach is used to predict sequences that will be stable on a given backbone structure. Possible competing states are typically not considered, primarily because appropriate structural models are not available. One potential competing state, the domain-swapped dimer, is especially compelling because it is often nearly identical with its monomeric counterpart, differing by just a few mutations in a hinge region. Molecular dynamics (MD) simulations provide a computational method to sample different conformational states of a structure. Here, we tested whether MD simulations could be used as a post-design screening tool to identify sequence mutations leading to domain-swapped dimers. We hypothesized that a successful computationally designed sequence would have backbone structure and dynamics characteristics similar to that of the input structure and that, in contrast, domain-swapped dimers would exhibit increased backbone flexibility and/or altered structure in the hinge-loop region to accommodate the large conformational change required for domain swapping. While attempting to engineer a homodimer from a 51-amino-acid fragment of the monomeric protein engrailed homeodomain (ENH), we had instead generated a domain-swapped dimer (ENH_DsD). MD simulations on these proteins showed increased B-factors derived from MD simulation in the hinge loop of the ENH_DsD domain-swapped dimer relative to monomeric ENH. Two point mutants of ENH_DsD designed to recover the monomeric fold were then tested with an MD simulation protocol. The MD simulations suggested that one of these mutants would adopt the target monomeric structure, which was subsequently confirmed by X-ray crystallography. Copyright © 2015. Published by Elsevier Ltd.

  4. Finite Element Analysis of an Energy Absorbing Sub-floor Structure

    NASA Technical Reports Server (NTRS)

    Moore, Scott C.

    1995-01-01

    As part of the Advanced General Aviation Transportation Experiments program, the National Aeronautics and Space Administration's Langley Research Center is conducting tests to design energy absorbing structures to improve occupant survivability in aircraft crashes. An effort is currently underway to design an Energy Absorbing (EA) sub-floor structure which will reduce occupant loads in an aircraft crash. However, a recent drop test of a fuselage specimen with a proposed EA sub-floor structure demonstrated that the effects of sectioning the fuselage on both the fuselage section's stiffness and the performance of the EA structure were not fully understood. Therefore, attempts are underway to model the proposed sub-floor structure on computers using the DYCAST finite element code to provide a better understanding of the structure's behavior in testing, and in an actual crash.

  5. Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
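    The two-parameter Weibull distribution with power-law subcritical crack growth can be sketched in the spirit of the model described above. The material numbers here (Weibull modulus, characteristic strength, SCG exponent, and the B parameter) are illustrative assumptions, not CARES/LIFE data:

    ```python
    import math

    def failure_probability(sigma, t, m=10.0, sigma0=400.0, N=20.0, B=500.0):
        """Failure probability under constant stress sigma (MPa) held for time t.

        Inert (t = 0) case reduces to the two-parameter Weibull form
        Pf = 1 - exp(-(sigma/sigma0)^m). With power-law SCG, strength is
        degraded via the transformed equivalent stress
        sigma_eq = (sigma^N * t / B + sigma^(N-2))^(1/(N-2)).
        All parameter values are assumed for illustration.
        """
        sigma_eq = (sigma**N * t / B + sigma**(N - 2)) ** (1.0 / (N - 2))
        return 1.0 - math.exp(-((sigma_eq / sigma0) ** m))

    p_inert = failure_probability(300.0, 0.0)   # fast-fracture reliability
    p_aged = failure_probability(300.0, 1.0)    # after sustained loading
    print(p_inert, p_aged)
    ```

    Because the equivalent stress grows with hold time, the aged failure probability is always at least the inert one, reflecting slow crack growth.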

  6. Model verification of large structural systems

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1977-01-01

    A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.

  7. Survey of NASA research on crash dynamics

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Carden, H. D.; Hayduk, R. J.

    1984-01-01

    Ten years of structural crash dynamics research activities conducted on general aviation aircraft by the National Aeronautics and Space Administration (NASA) are described. Thirty-two full-scale crash tests were performed at Langley Research Center, and pertinent data on airframe and seat behavior were obtained. Concurrent with the experimental program, analytical methods were developed to help predict structural behavior during impact. The effects of flight parameters at impact on cabin deceleration pulses at the seat/occupant interface, experimental and analytical correlation of data on load-limiting subfloor and seat configurations, airplane section test results for computer modeling validation, and data from emergency-locator-transmitter (ELT) investigations to determine probable cause of false alarms and nonactivations are assessed. Computer programs which provide designers with analytical methods for predicting accelerations, velocities, and displacements of collapsing structures are also discussed.

  8. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 2. Computer-program documentation. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.

  9. Modelling rollover behaviour of excavator-based forest machines

    Treesearch

    M.W. Veal; S.E. Taylor; Robert B. Rummer

    2003-01-01

    This poster presentation provides results from analytical and computer simulation models of rollover behaviour of hydraulic excavators. These results are being used as input to the operator protective structure standards development process. Results from rigid body mechanics and computer simulation methods agree well with field rollover test data. These results show...

  10. The Impact of Iranian Teachers Cultural Values on Computer Technology Acceptance

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Saribagloo, Javad Amani; Aghdam, Samad Hanifepour; Mahmoudi, Hojjat

    2014-01-01

    This study was conducted with the aim of testing the technology acceptance model and the impact of Hofstede cultural values (masculinity/femininity, uncertainty avoidance, individualism/collectivism, and power distance) on computer technology acceptance among teachers at Urmia city (Iran) using the structural equation modeling approach. From among…

  11. Photovoltaic module encapsulation design and materials selection, volume 2

    NASA Technical Reports Server (NTRS)

    Cuddihy, E. F.

    1984-01-01

    Tests for chemical structure, material properties, water absorption, aging, and the curing agent of Ethylene Vinyl Acetate (EVA), along with UV absorption studies, were carried out. A computer model was developed for thermal optical modeling, to investigate the dependence of module operating temperature on solar insolation and heat dissipation behavior. Structural analyses were performed to determine the stress distribution under wind and heat conditions; curves are shown for thermal loading conditions. An electrical isolation study was carried out to investigate electrical stress aging of non-metallic encapsulation materials and limiting material flaws, and to develop a computer model of electrical fields and stresses in encapsulation materials. In addition, a mathematical model was developed and tests were conducted to predict hygroscopic and thermal expansion and contraction of a plastic-coated wooden substrate. Thermal cycle and humidity-freezing cycle tests, partial discharge tests, and hail impact tests were also carried out. Finally, the effects of soiling on the surface of photovoltaic modules were investigated. Two antisoiling coatings, a fluorinated silane and perfluorodecanoic acid, were considered.

  12. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.

  13. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1983-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally-useful method of estimating the critical time step for linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
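    The critical time step that governs where the explicit update may be used can be illustrated on a 1D conduction problem (the 2D quadrilateral-element estimate in the report is analogous; the values below are illustrative assumptions):

    ```python
    import numpy as np

    # 1D transient heat conduction: the explicit (forward Euler) update is
    # stable only for dt <= dt_crit = dx^2 / (2 * alpha) in this linear 1D
    # case. A mixed implicit-explicit scheme would apply the cheap explicit
    # update only in regions where the chosen dt satisfies this bound.
    alpha, L, n = 1.0e-5, 0.1, 21       # diffusivity (m^2/s), length (m), nodes
    dx = L / (n - 1)
    dt_crit = dx**2 / (2 * alpha)

    T = np.zeros(n)
    T[0] = 100.0                        # fixed hot boundary, deg C
    dt = 0.9 * dt_crit                  # just inside the stability limit
    for _ in range(200):
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

    # With dt below dt_crit the discrete maximum principle holds:
    # interior temperatures stay between the boundary values.
    print(T.max(), T.min())
    ```

    Stepping with dt above dt_crit would instead produce growing oscillations, which is why the report's time-step estimate is central to the mixed method.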

  14. Computational Approaches for Revealing the Structure of Membrane Transporters: Case Study on Bilitranslocase.

    PubMed

    Venko, Katja; Roy Choudhury, A; Novič, Marjana

    2017-01-01

    The structural and functional details of transmembrane proteins are vastly underexplored, mostly due to experimental difficulties regarding their solubility and stability. Currently, the majority of transmembrane protein structures are still unknown, and this presents a huge experimental and computational challenge. Nowadays, thanks to X-ray crystallography and NMR spectroscopy, over 3000 structures of membrane proteins have been solved, among them only a few hundred unique ones. Due to the vast biological and pharmaceutical interest in the elucidation of the structure and functional mechanisms of transmembrane proteins, several computational methods have been developed to overcome the experimental gap. If combined with experimental data, the computational information enables rapid, low-cost and successful predictions of the molecular structure of unsolved proteins. The reliability of the predictions depends on the availability and accuracy of experimental data associated with structural information. In this review, the following methods are proposed for in silico structure elucidation: sequence-dependent predictions of transmembrane regions, predictions of transmembrane helix-helix interactions, helix arrangements in membrane models, and testing their stability with molecular dynamics simulations. We also demonstrate the usage of these computational methods by proposing a model for the molecular structure of the transmembrane protein bilitranslocase. Bilitranslocase is a bilirubin membrane transporter which shares similar tissue distribution and functional properties with some members of the Organic Anion Transporter family and is the only member classified in the Bilirubin Transporter Family. Given these unique properties, bilitranslocase is a potentially interesting drug target.

  15. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.

  16. An experiment on the use of disposable plastics as a reinforcement in concrete beams

    NASA Technical Reports Server (NTRS)

    Chowdhury, Mostafiz R.

    1992-01-01

    The concept of reinforced concrete structures is illustrated here through computer simulation and an inexpensive hands-on design experiment. The students in our construction management program use disposable plastics as reinforcement to demonstrate their understanding of reinforced concrete and prestressed concrete beams. The plastics used for such an experiment vary from plastic bottles to steel-reinforced auto tires. The experiment shows the extent to which plastic reinforcement increases the strength of a concrete beam. The procedure of using such throw-away plastics in an experiment to explain the interaction between the reinforcement material and concrete, and a comparison of the test results for different types of waste plastics, are discussed. A computer analysis simulating the structural response is used to compare against the test results and to understand the analytical background of reinforced concrete design. This interplay of using computers to analyze structures and relating the output to real experimentation was found to be a very useful method for teaching a math-based analytical subject to our non-engineering students.

  17. Conifer ovulate cones accumulate pollen principally by simple impaction.

    PubMed

    Cresswell, James E; Henning, Kevin; Pennel, Christophe; Lahoubi, Mohamed; Patrick, Michael A; Young, Phillipe G; Tabor, Gavin R

    2007-11-13

    In many pine species (Family Pinaceae), ovulate cones structurally resemble a turbine, which has been widely interpreted as an adaptation for improving pollination by producing complex aerodynamic effects. We tested the turbine interpretation by quantifying patterns of pollen accumulation on ovulate cones in a wind tunnel and by using simulation models based on computational fluid dynamics. We used computer-aided design and computed tomography to create computational fluid dynamics model cones. We studied three species: Pinus radiata, Pinus sylvestris, and Cedrus libani. Irrespective of the approach or species studied, we found no evidence that turbine-like aerodynamics made a significant contribution to pollen accumulation, which instead occurred primarily by simple impaction. Consequently, we suggest alternative adaptive interpretations for the structure of ovulate cones.

  18. Conifer ovulate cones accumulate pollen principally by simple impaction

    PubMed Central

    Cresswell, James E.; Henning, Kevin; Pennel, Christophe; Lahoubi, Mohamed; Patrick, Michael A.; Young, Phillipe G.; Tabor, Gavin R.

    2007-01-01

    In many pine species (Family Pinaceae), ovulate cones structurally resemble a turbine, which has been widely interpreted as an adaptation for improving pollination by producing complex aerodynamic effects. We tested the turbine interpretation by quantifying patterns of pollen accumulation on ovulate cones in a wind tunnel and by using simulation models based on computational fluid dynamics. We used computer-aided design and computed tomography to create computational fluid dynamics model cones. We studied three species: Pinus radiata, Pinus sylvestris, and Cedrus libani. Irrespective of the approach or species studied, we found no evidence that turbine-like aerodynamics made a significant contribution to pollen accumulation, which instead occurred primarily by simple impaction. Consequently, we suggest alternative adaptive interpretations for the structure of ovulate cones. PMID:17986613

  19. Predicting Ligand Binding Sites on Protein Surfaces by 3-Dimensional Probability Density Distributions of Interacting Atoms

    PubMed Central

    Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei

    2016-01-01

    Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851

  20. Rapid Design and Testing of Novel Gas/liquid Contacting Devices for Post-Combustion CO2 Capture via 3D Printing - Phase II Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panaccione, Charles; Staab, Greg; Meuleman, Erik

    ION has developed a mathematically driven model for a contacting device incorporating mass transfer, heat transfer, and computational fluid dynamics. This model is based upon a parametric structure for purposes of future commercialization. The most promising design from modeling was 3D printed and tested in a bench scale CO2 capture unit and compared to commercially available structured packing tested in the same unit.

  1. Smiles2Monomers: a link between chemical and biological structures for polymers.

    PubMed

    Dufresne, Yoann; Noé, Laurent; Leclère, Valérie; Pupin, Maude

    2015-01-01

    The monomeric composition of polymers is powerful for structure comparison and synthetic biology, among others. Many databases give access to the atomic structure of compounds, but the monomeric structure of polymers is often lacking. We have designed a smart algorithm, implemented in the tool Smiles2Monomers (s2m), to infer efficiently and accurately the monomeric structure of a polymer from its chemical structure. Our strategy is divided into two steps: first, monomers are mapped on the atomic structure by an efficient subgraph-isomorphism algorithm; second, the best tiling is computed so that non-overlapping monomers cover the whole structure of the target polymer. The mapping is based on a Markovian index built by a dynamic programming algorithm. The index enables s2m to quickly search for all the given monomers on a target polymer. A greedy algorithm then combines the mapped monomers into a consistent monomeric structure, and a local branch and cut algorithm refines the structure. We tested this method on two manually annotated databases of polymers and reconstructed the structures de novo with a sensitivity over 90%. The average computation time per polymer is 2 s. s2m automatically creates de novo monomeric annotations for polymers, efficiently in terms of computation time and sensitivity. s2m allowed us to detect annotation errors in the tested databases and to easily find the accurate structures. s2m could therefore be integrated into the curation process of databases of small compounds to verify the current entries and accelerate the annotation of new polymers. The full method can be downloaded or accessed via a website for peptide-like polymers at http://bioinfo.lifl.fr/norine/smiles2monomers.jsp.
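    The tiling step described above can be caricatured in a few lines. In the sketch below, each candidate match is a monomer name plus the set of polymer atom indices it covers, and a greedy pass keeps non-overlapping matches, largest first; this illustrates the idea only, not the actual s2m algorithm, and the monomer names and indices are invented.

    ```python
    # Toy tiling: pick non-overlapping monomer matches, largest first,
    # so the kept tiles cover the polymer's atoms without conflicts.
    # Illustrative sketch, not the s2m implementation.

    def greedy_tiling(matches):
        """matches: list of (monomer_name, frozenset_of_atom_indices)."""
        chosen, covered = [], set()
        for name, atoms in sorted(matches, key=lambda m: -len(m[1])):
            if atoms.isdisjoint(covered):   # keep only non-overlapping tiles
                chosen.append(name)
                covered |= atoms
        return chosen, covered

    matches = [
        ("Ala", frozenset({0, 1, 2})),
        ("Gly", frozenset({3, 4})),
        ("Ser", frozenset({2, 3})),        # overlaps both, so it is skipped
    ]
    names, covered = greedy_tiling(matches)
    ```

    The paper's branch and cut refinement would then revisit such greedy choices where a different tiling covers more of the polymer.
    
    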

  2. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distribution. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated on results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
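    The randomized safety format described above can be sketched with a toy limit state standing in for the nonlinear finite element model: material randomness is sampled by Monte Carlo, and safety is reported as a failure probability and a reliability index. All distributions and numbers below are invented for illustration.

    ```python
    import math
    import random

    # Toy randomized safety assessment: a trivial resistance-minus-load
    # limit state replaces the nonlinear FE analysis. Invented numbers.

    def monte_carlo_pf(n=100_000, seed=1):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            resistance = rng.gauss(30.0, 3.0)   # randomized material property
            load = rng.gauss(20.0, 2.0)         # randomized load effect
            if resistance - load < 0.0:         # limit state g = R - S < 0
                failures += 1
        return failures / n

    pf = monte_carlo_pf()
    # For g = R - S with independent normals the reliability index is exact:
    beta = (30.0 - 20.0) / math.sqrt(3.0**2 + 2.0**2)
    ```

    In the methodology above, each Monte Carlo sample would instead trigger a full nonlinear finite element run, which is why efficient sampling matters in practice.
    
    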

  3. Composite structural materials. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1980-01-01

    The use of filamentary composite materials in the design and construction of primary aircraft structures is considered with emphasis on efforts to develop advanced technology in the areas of physical properties, structural concepts and analysis, manufacturing, and reliability and life prediction. The redesign of a main spar/rib region on the Boeing 727 elevator near its actuator attachment point is discussed. A composite fabrication and test facility is described as well as the use of minicomputers for computer aided design. Other topics covered include (1) advanced structural analysis methods for composites; (2) ultrasonic nondestructive testing of composite structures; (3) optimum combination of hardeners in the cure of epoxy; (4) fatigue in composite materials; (5) resin matrix characterization and properties; (6) postbuckling analysis of curved laminate composite panels; and (7) acoustic emission testing of composite tensile specimens.

  4. Crashworthy airframe design concepts: Fabrication and testing

    NASA Technical Reports Server (NTRS)

    Cronkhite, J. D.; Berry, V. L.

    1982-01-01

    Crashworthy floor concepts applicable to general aviation aircraft metal airframe structures were investigated. Initially several energy absorbing lower fuselage structure concepts were evaluated. Full scale floor sections representative of a twin engine, general aviation airplane lower fuselage structure were designed and fabricated. The floors featured an upper high strength platform with an energy absorbing, crushable structure underneath. Eighteen floors were fabricated that incorporated five different crushable subfloor concepts. The floors were then evaluated through static and dynamic testing. Computer programs NASTRAN and KRASH were used for the static and dynamic analysis of the floor section designs. Two twin engine airplane fuselages were modified to incorporate the most promising crashworthy floor sections for test evaluation.

  5. Accurate Cold-Test Model of Helical TWT Slow-Wave Circuits

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.; Dayton, James A., Jr.

    1997-01-01

    Recently, a method has been established to accurately calculate cold-test data for helical slow-wave structures using the three-dimensional electromagnetic computer code, MAFIA. Cold-test parameters have been calculated for several helical traveling-wave tube (TWT) slow-wave circuits possessing various support rod configurations, and results are presented here showing excellent agreement with experiment. The helical models include tape thickness, dielectric support shapes and material properties consistent with the actual circuits. The cold-test data from this helical model can be used as input into large-signal helical TWT interaction codes making it possible, for the first time, to design a complete TWT via computer simulation.

  6. Composite structural materials

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Loewy, R. G.; Wiberley, S. E.

    1981-01-01

    The composite aircraft program component (CAPCOMP) is a graduate level project conducted in parallel with a composite structures program. The composite aircraft program glider (CAPGLIDE) is an undergraduate demonstration project which has as its objectives the design, fabrication, and testing of a foot launched ultralight glider using composite structures. The objective of the computer aided design (COMPAD) portion of the composites project is to provide computer tools for the analysis and design of composite structures. The major thrust of COMPAD is in the finite element area with effort directed at implementing finite element analysis capabilities and developing interactive graphics preprocessing and postprocessing capabilities. The criteria for selecting research projects to be conducted under the innovative and supporting research (INSURE) program are described.

  7. The Use of Computers and Video Games in Brain Damage Therapy.

    ERIC Educational Resources Information Center

    Lorimer, David

    The use of computer assisted therapy (CAT) in the rehabilitation of individuals with brain damage is examined. Hardware considerations are explored, and the variety of software programs available for brain injury rehabilitation is discussed. Structured testing and treatment programs in time measurement, memory, and direction finding are described,…

  8. Invariance of an Extended Technology Acceptance Model Across Gender and Age Group

    ERIC Educational Resources Information Center

    Ahmad, Tunku Badariah Tunku; Madarsha, Kamal Basha; Zainuddin, Ahmad Marzuki; Ismail, Nik Ahmad Hisham; Khairani, Ahmad Zamri; Nordin, Mohamad Sahari

    2011-01-01

    In this study, we examined the likelihood of a TAME (extended technology acceptance model), in which the interrelationships among computer self-efficacy, perceived usefulness, intention to use and self-reported use of computer-mediated technology were tested. In addition, the gender- and age-invariant of its causal structure were evaluated. The…

  9. Teaching through Interactive Pictures: Computer, Video and Human Realities.

    ERIC Educational Resources Information Center

    Strauss, Andre

    A technique designed to make the classroom use of videotapes for second language teaching for specific purposes more efficient is described. The technique begins with a classroom review of basic vocabulary and structures, which is then tested with a computer exercise. A second stage requires discussion and memorization of vocabulary and phrases…

  10. Ensuring Positiveness of the Scaled Difference Chi-Square Test Statistic

    ERIC Educational Resources Information Center

    Satorra, Albert; Bentler, Peter M.

    2010-01-01

    A scaled difference test statistic T[tilde][subscript d] that can be computed from standard software of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (Psychometrika 66:507-514, 2001). The statistic T[tilde][subscript d] is asymptotically equivalent to the scaled difference test statistic T[bar][subscript…
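    The hand calculation referred to above follows the Satorra and Bentler (2001) recipe: the scaled difference statistic divides the difference of the uncorrected chi-squares by a pooled scaling correction. The sketch below uses invented numbers purely for illustration.

    ```python
    # Hand calculation of the Satorra-Bentler (2001) scaled difference
    # chi-square statistic from standard SEM software output. T0, d0, c0
    # come from the more restricted (nested) model; T1, d1, c1 from the
    # comparison model. T are uncorrected chi-squares, d degrees of
    # freedom, c the reported scaling correction factors. Invented values.

    def scaled_difference(T0, d0, c0, T1, d1, c1):
        # Pooled scaling correction for the difference test; in finite
        # samples it can turn out negative, the problem the paper treats.
        cd = (d0 * c0 - d1 * c1) / (d0 - d1)
        return (T0 - T1) / cd, cd

    Td, cd = scaled_difference(T0=95.0, d0=40, c0=1.20, T1=60.0, d1=35, c1=1.10)
    ```

    When cd comes out negative, the scaled difference is uninterpretable, which is exactly the positiveness issue this record addresses.
    
    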

  11. A three degree of freedom manipulator used for store separation wind tunnel test

    NASA Astrophysics Data System (ADS)

    Wei, R.; Che, B.-H.; Sun, C.-B.; Zhang, J.; Lu, Y.-Q.

    2018-06-01

    A three degree of freedom manipulator used for store separation wind tunnel tests is presented. It is a mechatronic product with small volume and large torque. The paper discusses the design principles of wind tunnel test equipment and describes the transmission principle design, physical design, control system design, drive element selection calculation and verification, dynamics computation and static structural computation of the manipulator. To satisfy the design principles of wind tunnel test equipment, several design optimizations were made, including optimization of the drive element and cable structure, the fairing configuration and the overall dimensions, so as to make the device more suitable for the wind tunnel test. Tests were made to verify the parameters of the manipulator. The results show that the device improves the load from 100 Nm to 250 Nm and the control accuracy from 0.1° to 0.05° in pitch and yaw, and improves the load from 10 Nm to 20 Nm and the control accuracy from 0.1° to 0.05° in roll.

  12. The effects of a visualization-centered curriculum on conceptual understanding and representational competence in high school biology

    NASA Astrophysics Data System (ADS)

    Wilder, Anna

    The purpose of this study was to investigate the effects of a visualization-centered curriculum, Hemoglobin: A Case of Double Identity, on conceptual understanding and representational competence in high school biology. Sixty-nine students enrolled in three sections of freshman biology taught by the same teacher participated in this study. Online Chemscape Chime computer-based molecular visualizations were incorporated into the 10-week curriculum to introduce students to fundamental structure and function relationships. Measures used in this study included a Hemoglobin Structure and Function Test, Mental Imagery Questionnaire, Exam Difficulty Survey, the Student Assessment of Learning Gains, the Group Assessment of Logical Thinking, the Attitude Toward Science in School Assessment, audiotapes of student interviews, students' artifacts, weekly unit activity surveys, informal researcher observations and a teacher's weekly questionnaire. The Hemoglobin Structure and Function Test, consisting of Parts A and B, was administered as a pre and posttest. Part A used exclusively verbal test items to measure conceptual understanding, while Part B used visual-verbal test items to measure conceptual understanding and representational competence. Results of the Hemoglobin Structure and Function pre and posttest revealed statistically significant gains in conceptual understanding and representational competence, suggesting the visualization-centered curriculum implemented in this study was effective in supporting positive learning outcomes. The large positive correlation between posttest results on Part A, comprised of all-verbal test items, and Part B, using visual-verbal test items, suggests this curriculum supported students' mutual development of conceptual understanding and representational competence. 
Evidence based on student interviews, Student Assessment of Learning Gains ratings and weekly activity surveys indicated positive attitudes toward the use of Chemscape Chime software and the computer-based molecular visualization activities as learning tools. Evidence from these same sources also indicated that students felt computer-based molecular visualization activities in conjunction with other classroom activities supported their learning. Implications for instructional design are discussed.

  13. Test-Free Fracture Toughness

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2003-01-01

    Computational simulation can predict damage growth and progression and the fracture toughness of composite structures. Experimental data from the literature provide environmental effects on the fracture behavior of metallic or fiber composite structures. However, the traditional experimental methods for analyzing the influence of the imposed conditions are expensive and time consuming. This research used the CODSTRAN code to model the effects of temperature, scaling, and loading on the damage initiation and energy release rates of fiber/braided composite specimens with and without fiber-optic sensors. The load-displacement relationship and fracture toughness assessment approach are compared with test results from the literature, and it is verified that the computational simulation, with the use of established material modeling and finite element modules, adequately tracks the changes of fracture toughness and subsequent fracture propagation for any fiber/braided composite structure due to changes of fiber orientation, the presence of large-diameter optical fibers, and any loading conditions.

  14. Test-Free Fracture Toughness

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2003-01-01

    Computational simulation can predict damage growth and progression and the fracture toughness of composite structures. Experimental data from the literature provide environmental effects on the fracture behavior of metallic or fiber composite structures. However, the traditional experimental methods for analyzing the influence of the imposed conditions are expensive and time consuming. This research used the CODSTRAN code to model the effects of temperature, scaling, and loading on the damage initiation and energy release rates of fiber/braided composite specimens with and without fiber-optic sensors. The load-displacement relationship and fracture toughness assessment approach are compared with test results from the literature, and it is verified that the computational simulation, with the use of established material modeling and finite element modules, adequately tracks the changes of fracture toughness and subsequent fracture propagation for any fiber/braided composite structure due to changes of fiber orientation, the presence of large-diameter optical fibers, and any loading conditions.

  15. Structural-Vibration-Response Data Analysis

    NASA Technical Reports Server (NTRS)

    Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.

    1983-01-01

    Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
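    As a simple stand-in for the least-squares curve-fitting procedure described above, frequency and damping can be recovered from a synthetic free-decay record via the logarithmic decrement. This sketch is illustrative only, not the program's algorithm, and the signal parameters are invented.

    ```python
    import math

    # Identify frequency and damping from a synthetic free-decay record
    # using the logarithmic decrement between successive peaks.

    def free_decay(t, f=2.0, zeta=0.05, amp=1.0):
        wn = 2.0 * math.pi * f                 # natural frequency (rad/s)
        wd = wn * math.sqrt(1.0 - zeta**2)     # damped frequency
        return amp * math.exp(-zeta * wn * t) * math.cos(wd * t)

    dt = 0.001
    y = [free_decay(i * dt) for i in range(4000)]

    # Local maxima of the record give the damped period and log decrement.
    peaks = [i for i in range(1, len(y) - 1) if y[i - 1] < y[i] > y[i + 1]]
    period = (peaks[-1] - peaks[0]) * dt / (len(peaks) - 1)
    delta = math.log(y[peaks[0]] / y[peaks[1]])
    zeta_est = delta / math.sqrt(4.0 * math.pi**2 + delta**2)
    f_est = 1.0 / period
    ```

    A time-domain least-squares fit, as in the record above, generalizes this to noisy multiple-response data by fitting several damped sinusoids simultaneously.
    
    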

  16. Factor Structure and Incremental Validity of the Enhanced Computer- Administered Tests

    DTIC Science & Technology

    1992-07-01

    The report examines the factor structure and incremental validity of enhanced computer-administered tests, including a computerized adaptive testing version of the ASVAB (CAT-ASVAB, Armed Services Vocational Aptitude Battery) and the psychomotor portion of the General Aptitude Test Battery (GATB), for predicting performance in the mechanical maintenance specialties.

  17. How to Create, Modify, and Interface Aspen In-House and User Databanks for System Configuration 1:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camp, D W

    2000-10-27

    The goal of this document is to provide detailed instructions to create, modify, interface, and test Aspen User and In-House databanks with minimal frustration. The instructions are aimed at a novice Aspen Plus simulation user who is neither a programming nor a computer-system expert. The instructions are tailored to Version 10.1 of Aspen Plus and the specific computing configuration summarized in the title of this document and detailed in Section 2. Many details of setting up databanks depend on the specifics of the computing environment, such as the machines, operating systems, command languages, directory structures, inter-computer communications software, the version of the Aspen Engine and Graphical User Interface (GUI), and the directory structure into which these were installed.

  18. Effect of strain rate and temperature on mechanical properties of selected building Polish steels

    NASA Astrophysics Data System (ADS)

    Moćko, Wojciech; Kruszka, Leopold

    2015-09-01

    Computer programs of the CAD type are currently a basic tool for designing various structures under impact loading. Application of numerical calculations makes it possible to substantially reduce the time required for the design stage of such projects. However, the proper use of computer-aided design techniques requires input data for the numerical software, including elastic-plastic models of the structural materials. This work deals with the constitutive model developed by Rusinek and Klepaczko (RK), applied to the modelling of the mechanical behaviour of selected grades of structural steel (St0S, St3SX, 18GS and 34GS), and presents results of experimental and empirical analyses describing the dynamic elastic-plastic behaviour of the tested materials over a wide range of temperatures. In order to calibrate the RK constitutive model, series of compression tests over a wide range of strain rates, including static, quasi-static and dynamic investigations at lowered, room and elevated temperatures, were carried out using two testing stands: a servo-hydraulic machine and a split Hopkinson bar. The results were analysed to determine the influence of temperature and strain rate on the visco-plastic response of the tested steels, and show good correlation with experimental data.

  19. Test Input Generation for Red-Black Trees using Abstraction

    NASA Technical Reports Server (NTRS)

    Visser, Willem; Pasareanu, Corina S.; Pelanek, Radek

    2005-01-01

    We consider the problem of test input generation for code that manipulates complex data structures. Test inputs are sequences of method calls from the data structure interface. We describe test input generation techniques that rely on state matching to avoid generation of redundant tests. Exhaustive techniques use explicit state model checking to explore all the possible test sequences up to predefined input sizes. Lossy techniques rely on abstraction mappings to compute and store abstract versions of the concrete states; they explore under-approximations of all the possible test sequences. We have implemented the techniques on top of the Java PathFinder model checker and we evaluate them using a Java implementation of red-black trees.
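    The state-matching idea described above can be sketched on a much simpler structure than red-black trees. The toy below enumerates call sequences over an invented sorted-list interface and keeps a sequence as a test only if it reaches a concrete state not seen before; the interface and values are made up for illustration.

    ```python
    from itertools import product

    # Toy exhaustive test-input generation with state matching: enumerate
    # interface-call sequences up to a length bound, prune any sequence
    # whose resulting concrete state was already reached.

    OPS = [("add", v) for v in (1, 2)] + [("remove", v) for v in (1, 2)]

    def execute(seq):
        """Run a sequence of interface calls; return the concrete state."""
        state = []
        for op, v in seq:
            if op == "add" and v not in state:
                state.append(v)
                state.sort()
            elif op == "remove" and v in state:
                state.remove(v)
        return tuple(state)

    def generate(max_len):
        seen, tests = set(), []
        for n in range(1, max_len + 1):
            for seq in product(OPS, repeat=n):
                s = execute(seq)
                if s not in seen:          # state matching prunes redundancy
                    seen.add(s)
                    tests.append(seq)
        return tests, seen

    tests, states = generate(max_len=2)
    ```

    With values {1, 2} only four concrete states exist, so only four of the twenty candidate sequences survive as tests; abstraction mappings, as in the record above, apply the same pruning to abstracted rather than concrete states.
    
    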

  20. Turbocharged molecular discovery of OLED emitters: from high-throughput quantum simulation to highly efficient TADF devices

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; Aguilera-Iparraguirre, Jorge; Hirzel, Timothy D.; Ha, Dong-Gwang; Einzinger, Markus; Wu, Tony; Baldo, Marc A.; Aspuru-Guzik, Alán

    2016-09-01

    Discovering new OLED emitters requires many experiments to synthesize candidates and test their performance in devices. Large-scale computer simulation can greatly speed this search process, but the problem remains challenging enough that brute-force application of massive computing power is not enough to successfully identify novel structures. We report a successful high-throughput virtual screening study that leveraged a range of methods to optimize the search process. The generation of candidate structures was constrained to contain combinatorial explosion. Simulations were tuned to the specific problem and calibrated with experimental results. Experimentalists and theorists actively collaborated, such that experimental feedback was regularly utilized to update and shape the computational search. Supervised machine learning methods prioritized candidate structures prior to quantum chemistry simulation, to avoid wasting compute on likely poor performers. With this combination of techniques, each multiplying the strength of the search, the effort navigated an area of molecular space and identified hundreds of promising OLED candidate structures. An experimentally validated selection of this set shows emitters with external quantum efficiencies as high as 22%.

  1. A 3D moisture-stress FEM analysis for time dependent problems in timber structures

    NASA Astrophysics Data System (ADS)

    Fortino, Stefania; Mirianon, Florian; Toratti, Tomi

    2009-11-01

    This paper presents a 3D moisture-stress numerical analysis for timber structures under variable humidity and load conditions. An orthotropic viscoelastic-mechanosorptive material model is specialized on the basis of previous models. Both the constitutive model and the equations needed to describe the moisture flow across the structure are implemented into user subroutines of the Abaqus finite element code and a coupled moisture-stress analysis is performed for several types of mechanical loads and moisture changes. The presented computational approach is validated by analyzing some wood tests described in the literature and comparing the computational results with the reported experimental data.

  2. Buckling analysis and test correlation of hat stiffened panels for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Percy, Wendy C.; Fields, Roger A.

    1990-01-01

    The paper discusses the design, analysis, and test of hat stiffened panels subjected to a variety of thermal and mechanical load conditions. The panels were designed using data from structural optimization computer codes and finite element analysis. Test methods included the grid shadow moire method and a single gage force stiffness method. The agreement between the test data and analysis provides confidence in the methods that are currently being used to design structures for hypersonic vehicles. The agreement also indicates that post buckled strength may potentially be used to reduce the vehicle weight.

  3. Active Control Technology at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Antcliff, Richard R.; McGowan, Anna-Marie R.

    2000-01-01

NASA Langley has a long history of attacking important technical opportunities from a broad base of supporting disciplines. The research and development at Langley in this subject area range from the test tube to the test flight. The work covered here ranges from the development of innovative new materials, sensors, and actuators, to the incorporation of smart sensors and actuators in practical devices, to the optimization of the location of these devices, and finally to a wide variety of applications of these devices utilizing Langley's facilities and expertise. Advanced materials are being developed for sensors and actuators, as well as polymers for integrating smart devices into composite structures. Contributions reside in three key areas: computational materials, advanced piezoelectric materials, and integrated composite structures. The computational materials effort is focused on developing predictive tools for the efficient design of new materials with the appropriate combination of properties for next-generation smart airframe systems. Research in the area of advanced piezoelectrics includes optimizing the efficiency, force output, use temperature, and energy transfer between the structure and device for both ceramic and polymeric materials. For structural health monitoring, advanced non-destructive techniques, including fiber optics, are being developed for detection of delaminations, cracks, and environmental deterioration in aircraft structures. Innovative fabrication techniques for processing structural composites with integrated sensors and actuators are being developed.

  4. Structural health monitoring for DOT using magnetic shape memory alloy cables in concrete

    NASA Astrophysics Data System (ADS)

    Davis, Allen; Mirsayar, Mirmilad; Sheahan, Emery; Hartl, Darren

    2018-03-01

    Embedding shape memory alloy (SMA) wires in concrete components offers the potential to monitor their structural health via external magnetic field sensing. Currently, structural health monitoring (SHM) is dominated by acoustic emission and vibration-based methods. Thus, it is attractive to pursue alternative damage sensing techniques that may lower the cost or increase the accuracy of SHM. In this work, SHM via magnetic field detection applied to embedded magnetic shape memory alloy (MSMA) is demonstrated both experimentally and using computational models. A concrete beam containing iron-based MSMA wire is subjected to a 3-point bend test where structural damage is induced, thereby resulting in a localized phase change of the MSMA wire. Magnetic field lines passing through the embedded MSMA domain are altered by this phase change and can thus be used to detect damage within the structure. A good correlation is observed between the computational and experimental results. Additionally, the implementation of stranded MSMA cables in place of the MSMA wire is assessed through similar computational models. The combination of these computational models and their subsequent experimental validation provide sufficient support for the feasibility of SHM using magnetic field sensing via MSMA embedded components.

  5. Conformational energy calculations on polypeptides and proteins: use of a statistical mechanical procedure for evaluating structure and properties.

    PubMed

    Scheraga, H A; Paine, G H

    1986-01-01

    We are using a variety of theoretical and computational techniques to study protein structure, protein folding, and higher-order structures. Our earlier work involved treatments of liquid water and aqueous solutions of nonpolar and polar solutes, computations of the stabilities of the fundamental structures of proteins and their packing arrangements, conformations of small cyclic and open-chain peptides, structures of fibrous proteins (collagen), structures of homologous globular proteins, introduction of special procedures as constraints during energy minimization of globular proteins, and structures of enzyme-substrate complexes. Recently, we presented a new methodology for predicting polypeptide structure (described here); the method is based on the calculation of the probable and average conformation of a polypeptide chain by the application of equilibrium statistical mechanics in conjunction with an adaptive, importance sampling Monte Carlo algorithm. As a test, it was applied to Met-enkephalin.

  6. Full equations utilities (FEQUTL) model for the approximation of hydraulic characteristics of open channels and control structures during unsteady flow

    USGS Publications Warehouse

    Franz, Delbert D.; Melching, Charles S.

    1997-01-01

    The Full EQuations UTiLities (FEQUTL) model is a computer program for computation of tables that list the hydraulic characteristics of open channels and control structures as a function of upstream and downstream depths; these tables facilitate the simulation of unsteady flow in a stream system with the Full Equations (FEQ) model. Simulation of unsteady flow requires many iterations for each time period computed. Thus, computation of hydraulic characteristics during the simulations is impractical, and preparation of function tables and application of table look-up procedures facilitates simulation of unsteady flow. Three general types of function tables are computed: one-dimensional tables that relate hydraulic characteristics to upstream flow depth, two-dimensional tables that relate flow through control structures to upstream and downstream flow depth, and three-dimensional tables that relate flow through gated structures to upstream and downstream flow depth and gate setting. For open-channel reaches, six types of one-dimensional function tables contain different combinations of the top width of flow, area, first moment of area with respect to the water surface, conveyance, flux coefficients, and correction coefficients for channel curvilinearity. For hydraulic control structures, one type of one-dimensional function table contains relations between flow and upstream depth, and two types of two-dimensional function tables contain relations among flow and upstream and downstream flow depths. For hydraulic control structures with gates, a three-dimensional function table lists the system of two-dimensional tables that contain the relations among flow and upstream and downstream flow depths that correspond to different gate openings. 
Hydraulic control structures for which function tables containing flow relations are prepared in FEQUTL include expansions, contractions, bridges, culverts, embankments, weirs, closed conduits (circular, rectangular, and pipe-arch shapes), dam failures, floodways, and underflow gates (sluice and tainter gates). The theory for computation of the hydraulic characteristics is presented for open channels and for each hydraulic control structure. For the hydraulic control structures, the theory is developed from the results of experimental tests of flow through the structure for different upstream and downstream flow depths. These tests were done to describe flow hydraulics for a single, steady-flow design condition and, thus, do not provide complete information on flow transitions (for example, between free- and submerged-weir flow) that may result in simulation of unsteady flow. Therefore, new procedures are developed to approximate the hydraulics of flow transitions for culverts, embankments, weirs, and underflow gates.
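
The table look-up idea that makes unsteady-flow simulation tractable can be sketched with a one-dimensional function table. The conveyance formula and all numbers below are our own illustrative assumptions (a wide rectangular channel under Manning's equation), not FEQUTL's actual table formats.

```python
import bisect

class FunctionTable:
    """1D function table: precomputed (depth, value) pairs queried by linear
    interpolation, standing in for table look-up during unsteady-flow simulation.
    Queries outside the tabulated range are clamped to the end values."""
    def __init__(self, depths, values):
        self.depths, self.values = list(depths), list(values)
    def __call__(self, d):
        i = bisect.bisect_left(self.depths, d)
        if i == 0:
            return self.values[0]
        if i == len(self.depths):
            return self.values[-1]
        d0, d1 = self.depths[i - 1], self.depths[i]
        v0, v1 = self.values[i - 1], self.values[i]
        return v0 + (v1 - v0) * (d - d0) / (d1 - d0)

# Hypothetical conveyance K(depth) for a wide rectangular channel:
# K = (1/n) * A * R^(2/3), with A ~ b*d and R ~ d when width b >> depth d.
n, b = 0.03, 10.0
depths = [0.5 * k for k in range(1, 11)]
table = FunctionTable(depths, [(1 / n) * b * d * d ** (2 / 3) for d in depths])

print(round(table(1.25), 2))  # interpolated conveyance between the 1.0 and 1.5 entries
```

During a simulation with many iterations per time step, such a table replaces repeated evaluation of the hydraulic formulas with a cheap search-and-interpolate.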

  7. An application of high authority/low authority control and positivity

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.; Irwin, D.; Tollison, D.; Waites, H. B.

    1988-01-01

Control Dynamics Company (CDy), in conjunction with NASA Marshall Space Flight Center (MSFC), has supported the U.S. Air Force Wright Aeronautical Laboratory (AFWAL) in conducting an investigation of the implementation of several DOD control techniques. These techniques are to provide vibration suppression and precise attitude control for flexible space structures. AFWAL issued a contract to Control Dynamics to perform this work under the Active Control Technique Evaluation for Spacecraft (ACES) Program. The High Authority Control/Low Authority Control (HAC/LAC) and Positivity control techniques, which were cultivated under the DARPA Active Control of Space Structures (ACOSS) Program, were applied to a structural model of the NASA/MSFC Ground Test Facility ACES configuration. The control system designs were accomplished, and linear post-analyses of the closed-loop systems are provided. The designs take into account the effects of sampling and delay in the control computer. Nonlinear simulation runs were used to verify the control system designs and their implementations in the facility control computers. Finally, test results are given to verify operation of the control systems in the test facility.

  8. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  9. A new test of computational protein design: predicting posttranslational modification specificity for the enzyme SMYD2.

    PubMed

    Reynolds, Kimberly A

    2015-01-06

In this issue of Structure, Lanouette and colleagues use a combination of computation and experiment to define a specificity motif for the lysine methyltransferase SMYD2. Using this motif, they predict and experimentally verify four new SMYD2 substrates.

  10. A Theorem on the Rank of a Product of Matrices with Illustration of Its Use in Goodness of Fit Testing.

    PubMed

    Satorra, Albert; Neudecker, Heinz

    2015-12-01

This paper develops a theorem that facilitates computing the degrees of freedom of Wald-type chi-square tests for moment restrictions when there is rank deficiency of key matrices involved in the definition of the test. An if-and-only-if (iff) condition is developed under which a simple difference-of-ranks rule can be used when computing the desired degrees of freedom of the test. The theorem is developed using basic tools of matrix algebra. It is shown to play a key role in proving the asymptotic chi-squaredness of a goodness-of-fit test in moment structure analysis, and in finding the degrees of freedom of this chi-square statistic.
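
The rank algebra involved can be illustrated numerically. The matrices below are invented (a hypothetical rank-deficient Jacobian of moment restrictions and a selection matrix); the snippet only shows the product-rank bound that any difference-of-ranks rule must respect, not the paper's iff condition itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Jacobian of moment restrictions: 6 moments, 4 parameters,
# made rank deficient on purpose by duplicating a column.
D = rng.normal(size=(6, 4))
D[:, 3] = D[:, 0]                     # rank(D) drops from 4 to 3

A = rng.normal(size=(6, 2))           # selection matrix for the restrictions under test

r_A = np.linalg.matrix_rank(A)
r_D = np.linalg.matrix_rank(D)
r_AD = np.linalg.matrix_rank(A.T @ D)

# Any product obeys rank(A'D) <= min(rank(A), rank(D)); the paper's theorem
# characterizes exactly when a simple difference of such ranks equals the
# degrees of freedom of the Wald-type test.
print(r_A, r_D, r_AD)
```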

  11. The NASA B-757 HIRF Test Series: Flight Test Results

    NASA Technical Reports Server (NTRS)

    Moeller, Karl J.; Dudley, Kenneth L.

    1997-01-01

    In 1995, the NASA Langley Research Center conducted a series of aircraft tests aimed at characterizing the electromagnetic environment (EME) in and around a Boeing 757 airliner. Measurements were made of the electromagnetic energy coupled into the aircraft and the signals induced on select structures as the aircraft was flown past known RF transmitters. These measurements were conducted to provide data for the validation of computational techniques for the assessment of electromagnetic effects in commercial transport aircraft. This paper reports on the results of flight tests using RF radiators in the HF, VHF, and UHF ranges and on efforts to use computational and analytical techniques to predict RF field levels inside the airliner at these frequencies.

  12. Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.

    PubMed

    Satorra, Albert; Bentler, Peter M

    2010-06-01

    A scaled difference test statistic [Formula: see text] that can be computed from standard software of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (2001). The statistic [Formula: see text] is asymptotically equivalent to the scaled difference test statistic T̄(d) introduced in Satorra (2000), which requires more involved computations beyond standard output of SEM software. The test statistic [Formula: see text] has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic T̄(d) that avoids negative chi-square values.
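
The Satorra-Bentler (2001) hand calculation that this note builds on fits in a few lines: the scaling correction for the difference test is a weighted combination of the two models' corrections, and it is this quantity that can turn negative. The numerical values below are invented for illustration.

```python
def scaled_difference(T0, c0, d0, T1, c1, d1):
    """Satorra-Bentler (2001) hand calculation of the scaled difference chi-square.
    Model 0 is the more restricted (nested) model, so d0 > d1.
    T*: mean-scaled test statistics; c*: scaling corrections; d*: degrees of freedom."""
    cd = (d0 * c0 - d1 * c1) / (d0 - d1)   # scaling correction for the difference
    return (T0 - T1) / cd, cd

# Typical case: positive scaling correction, well-behaved difference test.
Td, cd = scaled_difference(T0=90.2, c0=1.20, d0=40, T1=60.5, c1=1.15, d1=30)
print(round(Td, 2), round(cd, 3))

# The pathology the note addresses: c1 large enough that cd, and hence the
# difference statistic, goes negative.
Td_bad, cd_bad = scaled_difference(T0=90.2, c0=1.00, d0=40, T1=60.5, c1=1.60, d1=30)
print(cd_bad < 0)   # True: the 2001 statistic can be negative
```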

  13. Stochastic model of the NASA/MSFC ground facility for large space structures with uncertain parameters: The maximum entropy approach, part 2

    NASA Technical Reports Server (NTRS)

    Hsia, Wei Shen

    1989-01-01

A validated technology data base is being developed in the areas of control/structures interaction, deployment dynamics, and system performance for Large Space Structures (LSS). A Ground Facility (GF), in which the dynamics and control systems being considered for LSS applications can be verified, was designed and built. One of the important aspects of the GF is to verify the analytical model for the control system design. The procedure is to describe the control system mathematically as well as possible, then to perform tests on the control system, and finally to factor those results into the mathematical model. The reduction of the order of a higher-order control plant was addressed. The computer program for the maximum entropy principle adopted in Hyland's MEOP method was improved and tested against the test problem, yielding a very close match. Two methods of model reduction were examined: Wilson's model reduction method and Hyland's optimal projection (OP) method. Design of a computer program for Hyland's OP method was attempted; because of the difficulty encountered at the stage where a special matrix factorization technique is needed to obtain the required projection matrix, the program was successful up to finding the Linear Quadratic Gaussian solution but not beyond. Numerical results are presented along with computer programs that employed ORACLS.

  14. Semi-automatic forensic approach using mandibular midline lingual structures as fingerprint: a pilot study.

    PubMed

    Shaheen, E; Mowafy, B; Politis, C; Jacobs, R

    2017-12-01

Previous research proposed the use of the mandibular midline neurovascular canal structures as a forensic fingerprint. In that observer study, an average correct identification rate of 95% was reached, which triggered this study. The aim here is to present a semi-automatic computer recognition approach to replace the observers and to validate the accuracy of this newly proposed method. Imaging data from Computed Tomography (CT) and Cone Beam Computed Tomography (CBCT) of mandibles scanned at two different moments were collected to simulate an ante-mortem (AM) and post-mortem (PM) situation, where the first scan represented AM and the second scan was used to simulate PM. Ten cases with 20 scans were used to build a classifier that relies on voxel-based matching and assigns each comparison to one of two groups: "Unmatched" and "Matched". This protocol was then tested using five other scans from the database. Unpaired t-testing was applied, and the accuracy of the computerized approach was determined. A significant difference was found between the "Unmatched" and "Matched" classes, with means of 0.41 and 0.86, respectively. Furthermore, the testing phase showed an accuracy of 100%. The validation of this method pushes the protocol further toward a fully automatic identification procedure for victim identification based on the mandibular midline canal structures in cases with available AM and PM CBCT/CT data.

  15. Impacts: NIST Building and Fire Research Laboratory (technical and societal)

    NASA Astrophysics Data System (ADS)

    Raufaste, N. J.

    1993-08-01

    The Building and Fire Research Laboratory (BFRL) of the National Institute of Standards and Technology (NIST) is dedicated to the life cycle quality of constructed facilities. The report describes major effects of BFRL's program on building and fire research. Contents of the document include: structural reliability; nondestructive testing of concrete; structural failure investigations; seismic design and construction standards; rehabilitation codes and standards; alternative refrigerants research; HVAC simulation models; thermal insulation; residential equipment energy efficiency; residential plumbing standards; computer image evaluation of building materials; corrosion-protection for reinforcing steel; prediction of the service lives of building materials; quality of construction materials laboratory testing; roofing standards; simulating fires with computers; fire safety evaluation system; fire investigations; soot formation and evolution; cone calorimeter development; smoke detector standards; standard for the flammability of children's sleepwear; smoldering insulation fires; wood heating safety research; in-place testing of concrete; communication protocols for building automation and control systems; computer simulation of the properties of concrete and other porous materials; cigarette-induced furniture fires; carbon monoxide formation in enclosure fires; halon alternative fire extinguishing agents; turbulent mixing research; materials fire research; furniture flammability testing; standard for the cigarette ignition resistance of mattresses; support of navy firefighter trainer program; and using fire to clean up oil spills.

  16. 33 CFR 148.8 - How are certifying entities designated and used for purposes of this subchapter?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-based structures and project-related structures, systems, and equipment; (6) Technical capabilities, including professional certifications and organizational memberships of the nominee or the primary staff to..., appropriate technology such as computer modeling programs and hardware or testing materials and equipment; (8...

  17. Imagining Garage Start-Ups: Interactive Effects of Imaginative Capacities on Entrepreneurial Intention

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Yao, Shu-Nung; Chen, Shi-An; King, Jung-Tai; Liang, Chaoyun

    2016-01-01

    This article describes a structural examination of the interaction among different imaginative capacities and the entrepreneurial intention of electrical and computer engineering students. Two studies were combined to confirm the factor structure of survey items and test the hypothesised interaction model. The results indicated that imaginative…

  18. 33 CFR 148.8 - How are certifying entities designated and used for purposes of this subchapter?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... be associated with the CE's duties for the specific project; (7) In-house availability of, or access to, appropriate technology such as computer modeling programs and hardware or testing materials and...-based structures and project-related structures, systems, and equipment; (6) Technical capabilities...

  19. 33 CFR 148.8 - How are certifying entities designated and used for purposes of this subchapter?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... be associated with the CE's duties for the specific project; (7) In-house availability of, or access to, appropriate technology such as computer modeling programs and hardware or testing materials and...-based structures and project-related structures, systems, and equipment; (6) Technical capabilities...

  20. Blending Determinism with Evolutionary Computing: Applications to the Calculation of the Molecular Electronic Structure of Polythiophene.

    PubMed

    Sarkar, Kanchan; Sharma, Rahul; Bhattacharyya, S P

    2010-03-09

A density-matrix-based soft-computing solution to the quantum mechanical problem of computing the molecular electronic structure of fairly long polythiophene (PT) chains is proposed. The soft-computing solution is based on a "random mutation hill climbing" scheme, modified by blending it with a deterministic method based on a trial single-particle density matrix P^(0)(R) for the guessed structural parameters (R), which is allowed to evolve under a unitary transformation generated by the Hamiltonian H(R). The Hamiltonian itself changes as the geometrical parameters (R) defining the polythiophene chain undergo mutation. The scale (λ) of the transformation is optimized by making the energy E(λ) stationary with respect to λ. The robustness and performance of variants of the algorithm are analyzed and compared with those of other derivative-free methods. The method is further tested successfully by optimizing the geometry of bipolaron-doped long PT chains.
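
The stochastic half of this scheme, random mutation hill climbing, is simple to state in code. The sketch below is generic and uses an invented quadratic stand-in for the energy E(R); it omits the paper's deterministic density-matrix evolution entirely.

```python
import random

def rmhc(energy, x0, step=0.1, iters=2000, seed=42):
    """Random mutation hill climbing: mutate one parameter at a time and
    keep the mutation only if the objective (here, an energy) improves."""
    rng = random.Random(seed)
    x, e = list(x0), energy(x0)
    for _ in range(iters):
        trial = list(x)
        i = rng.randrange(len(trial))
        trial[i] += rng.uniform(-step, step)      # random mutation of one parameter
        e_trial = energy(trial)
        if e_trial < e:                           # greedy accept on a minimization problem
            x, e = trial, e_trial
    return x, e

# Hypothetical stand-in for E(R): a smooth energy with its minimum at (1.4, 0.7).
energy = lambda r: (r[0] - 1.4) ** 2 + (r[1] - 0.7) ** 2
x, e = rmhc(energy, [0.0, 0.0])
print([round(v, 2) for v in x], round(e, 4))
```

Blending in a deterministic local step, as the paper does, is what rescues this greedy scheme from slow convergence near the optimum.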

  1. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClanahan, Richard; De Leon, Phillip L.

The majority of state-of-the-art speaker recognition (SR) systems utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of these systems, the posterior probability and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM–UBM which uses Runnalls' Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off a reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we reduce this computation by a factor of 15× while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.

  2. Reducing computation in an i-vector speaker recognition system using a tree-structured universal background model

    DOE PAGES

    McClanahan, Richard; De Leon, Phillip L.

    2014-08-20

The majority of state-of-the-art speaker recognition (SR) systems utilize speaker models that are derived from an adapted universal background model (UBM) in the form of a Gaussian mixture model (GMM). This is true for GMM supervector systems, joint factor analysis systems, and most recently i-vector systems. In all of these systems, the posterior probability and sufficient statistics calculations represent a computational bottleneck in both enrollment and testing. We propose a multi-layered hash system, employing a tree-structured GMM–UBM which uses Runnalls' Gaussian mixture reduction technique, in order to reduce the number of these calculations. Moreover, with this tree-structured hash, we can trade off a reduction in computation against a corresponding degradation of equal error rate (EER). As an example, we reduce this computation by a factor of 15× while incurring less than 10% relative degradation of EER (or 0.3% absolute EER) when evaluated with NIST 2010 speaker recognition evaluation (SRE) telephone data.
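
The shortlist effect of a tree over mixture components can be sketched as follows. This is a simplified centroid tree of our own design, not the paper's Runnalls-reduction construction: each internal node routes a feature frame toward the nearer child centroid, so only a leaf's few components need full posterior and sufficient-statistic computation.

```python
import numpy as np

rng = np.random.default_rng(3)
means = rng.normal(size=(64, 8))      # hypothetical 64-component UBM in 8-D

def build_tree(idx, leaf_size=8):
    """Recursively split components on their most variable mean dimension."""
    if len(idx) <= leaf_size:
        return ("leaf", idx)
    d = np.argmax(means[idx].var(axis=0))
    order = idx[np.argsort(means[idx][:, d])]
    left, right = order[: len(order) // 2], order[len(order) // 2:]
    return ("node", means[left].mean(axis=0), build_tree(left),
                    means[right].mean(axis=0), build_tree(right))

def shortlist(tree, x):
    """Descend toward the nearer child centroid; the leaf's components are the
    only ones for which posteriors get computed for this frame."""
    while tree[0] == "node":
        _, cl, left, cr, right = tree
        tree = left if np.linalg.norm(x - cl) <= np.linalg.norm(x - cr) else right
    return tree[1]

tree = build_tree(np.arange(len(means)))
frame = rng.normal(size=8)
active = shortlist(tree, frame)
print(len(active), "of", len(means), "components evaluated for this frame")
```

The computation/accuracy trade-off in the abstract corresponds to the leaf size here: smaller leaves mean fewer posterior evaluations but a greater chance of routing a frame away from its truly dominant components.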

  3. Simulation of blast action on civil structures using ANSYS Autodyn

    NASA Astrophysics Data System (ADS)

    Fedorova, N. N.; Valger, S. A.; Fedorov, A. V.

    2016-10-01

The paper presents the results of 3D numerical simulations of shock wave propagation in a cityscape area. The ANSYS Autodyn software is used for the computations. Different test cases are investigated numerically. On the basis of the computations, the complex transient flowfield structure formed in the vicinity of prismatic bodies was obtained and analyzed. The simulation results have been compared to experimental data. The ability of two numerical schemes to correctly predict the pressure history at several gauges placed on the walls of the obstacles is also studied.

  4. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    NASA Astrophysics Data System (ADS)

    Brown, Gail Laverne

The presence of a digital divide, the effectiveness of computer and information technology integration, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9-11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills training. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and a McNemar test determined mean differences between student pre-study and post-study perceived skill levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access, an indication of digital inclusion. Sophomores had more at-home computer and Internet access than other grade levels, indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest, which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skill levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  5. Use of Hilbert Curves in Parallelized CUDA code: Interaction of Interstellar Atoms with the Heliosphere

    NASA Astrophysics Data System (ADS)

    Destefano, Anthony; Heerikhuisen, Jacob

    2015-04-01

Fully 3D particle simulations can be a computationally and memory expensive task, especially when high resolution grid cells are required. The problem becomes further complicated when parallelization is needed. In this work we focus on computational methods to resolve these difficulties. Hilbert curves are used to map the 3D particle space to the 1D contiguous memory space. This method of organization minimizes cache misses on the GPU and yields a sorted structure that is equivalent to an octree data structure. This type of sorted structure is attractive for use in adaptive mesh implementations due to the logarithmic search time. Implementations using the Message Passing Interface (MPI) library and NVIDIA's parallel computing platform CUDA will be compared, as MPI is commonly used on server nodes with many CPUs. We will also compare static grid structures with those of adaptive mesh structures. The physical test bed will simulate heavy interstellar atoms interacting with a background plasma, the heliosphere, computed with a fully consistent coupled MHD/kinetic particle code. It is known that charge exchange is an important factor in space plasmas; specifically, it modifies the structure of the heliosphere itself. We would like to thank the Alabama Supercomputer Authority for the use of their computational resources.
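
The 3D-to-1D mapping idea can be illustrated with a space-filling curve. A full 3D Hilbert encoder is lengthy, so the sketch below uses the simpler Morton (Z-order) curve as a stand-in: it interleaves coordinate bits into one index the same way, though a true Hilbert curve preserves spatial locality better at the cost of more bit logic.

```python
from itertools import product

def part1by2(n):
    """Spread the low 10 bits of n so each lands in every third bit position."""
    n &= 0x3FF
    n = (n | (n << 16)) & 0x030000FF
    n = (n | (n << 8))  & 0x0300F00F
    n = (n | (n << 4))  & 0x030C30C3
    n = (n | (n << 2))  & 0x09249249
    return n

def morton3d(x, y, z):
    """Interleave coordinate bits into one Z-order index: ...z1y1x1 z0y0x0."""
    return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2)

# Sorting grid cells by their curve index places spatial neighbors near each
# other in the 1D particle array, which is what improves cache behavior.
cells = sorted(product(range(4), repeat=3), key=lambda c: morton3d(*c))
print(cells[:4])   # the first cells on the curve form a tight 2x2 spatial cluster
```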

  6. Exponential approximations in optimal design

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Rajan, S. D.; Rajgopal, J.

    1990-01-01

One-point and two-point exponential functions have been developed and proved to be very effective approximations of structural response. The exponential has been compared to the linear, reciprocal, and quadratic fit methods. Four test problems in structural analysis have been selected. The use of such approximations is attractive in structural optimization to reduce the number of exact analyses, which involve computationally expensive finite element analysis.
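
A one-point exponential approximation can be written in a few lines: fit g̃(x) = g(x0) * (x/x0)^p, with the exponent p chosen to match the exact derivative at x0. The bar-stress example and all symbols below are our own illustration, chosen because the reciprocal response makes the exponential fit exact where a linear fit is not.

```python
def exponential_approx(g, dg, x0):
    """One-point exponential approximation g~(x) = g(x0) * (x/x0)**p,
    with the exponent p fitted from the exact value and derivative at x0."""
    g0 = g(x0)
    p = x0 * dg(x0) / g0
    return lambda x: g0 * (x / x0) ** p, p

# Stress in a bar under a fixed load varies as 1/A: sigma(A) = F / A.
F = 1000.0
sigma  = lambda A: F / A
dsigma = lambda A: -F / A ** 2

approx, p = exponential_approx(sigma, dsigma, x0=2.0)
linear = lambda A: sigma(2.0) + dsigma(2.0) * (A - 2.0)   # first-order Taylor fit

print(p)                                   # -1.0: the reciprocal behavior is recovered
print(approx(4.0), linear(4.0), sigma(4.0))
```

This is why such approximations cut the number of exact finite element analyses: one analysis plus sensitivities at x0 yields a surrogate that stays accurate far from the expansion point for stress-like responses.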

  7. Materials Database Development for Ballistic Impact Modeling

    NASA Technical Reports Server (NTRS)

    Pereira, J. Michael

    2007-01-01

A set of experimental data is being generated under the Fundamental Aeronautics Program Supersonics project to help create and validate accurate computational impact models of jet engine impact events. The data generated will include material property data generated at a range of different strain rates, from 1×10^-4/sec to 5×10^4/sec, over a range of temperatures. In addition, carefully instrumented ballistic impact tests will be conducted on flat plates and curved structures to provide material and structural response information to help validate the computational models. The material property data and the ballistic impact data will be generated using materials from the same lot, as far as possible. It was found in preliminary testing that the surface finish of test specimens has an effect on measured high strain rate tension response of AL2024. Both the maximum stress and maximum elongation are greater on specimens with a smoother finish. This report gives an overview of the testing that is being conducted and presents results of preliminary testing of the surface finish study.

  8. Load Balancing Strategies for Multiphase Flows on Structured Grids

    NASA Astrophysics Data System (ADS)

    Olshefski, Kristopher; Owkes, Mark

    2017-11-01

    The computation time required to perform large simulations of complex systems is currently one of the leading bottlenecks of computational research. Parallelization allows multiple processing cores to perform calculations simultaneously and reduces computational times. However, load imbalances between processors waste computing resources as processors wait for others to complete imbalanced tasks. In multiphase flows, these imbalances arise from the additional computational effort required at the gas-liquid interface, yet many current load balancing schemes are designed only for unstructured grid applications. The purpose of this research is to develop a load balancing strategy while maintaining the simplicity of a structured grid. Several approaches are investigated, including brute-force oversubscription, node oversubscription through Message Passing Interface (MPI) commands, and shared-memory load balancing using OpenMP. Each of these strategies is tested with a simple one-dimensional model prior to implementation in the three-dimensional NGA code. Current results show load balancing will reduce computational time by at least 30%.
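
    The imbalance the abstract describes can be sketched with a 1D model like the one the authors test against: per-cell costs are larger near the interface, and a greedy prefix-sum split assigns contiguous blocks of cells so each rank's summed work is close to even. This is an illustrative sketch, not the paper's scheme:

```python
def partition_cells(weights, nproc):
    """Split a 1D array of per-cell costs into nproc contiguous blocks,
    cutting each block once its running sum reaches the ideal share."""
    total = sum(weights)
    target = total / nproc
    blocks, block, acc = [], [], 0.0
    for w in weights:
        block.append(w)
        acc += w
        # cut when this block has its share, keeping cells for later ranks
        if acc >= target and len(blocks) < nproc - 1:
            blocks.append(block)
            block, acc = [], 0.0
    blocks.append(block)
    return blocks

# Interface cells (cost 10) are expensive; bulk cells (cost 1) are cheap.
weights = [1, 1, 1, 10, 10, 1, 1, 1]
blocks = partition_cells(weights, 2)
loads = [sum(b) for b in blocks]
```

    A naive equal-cell split would give one rank both interface cells and most of the work; the weighted split equalizes cost instead of cell count while keeping each rank's block contiguous, preserving the structured-grid layout.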

  9. Understanding health care communication preferences of veteran primary care users.

    PubMed

    LaVela, Sherri L; Schectman, Gordon; Gering, Jeffrey; Locatelli, Sara M; Gawron, Andrew; Weaver, Frances M

    2012-09-01

    To assess veterans' health communication preferences (in-person, telephone, or electronic) for primary care needs and the impact of computer use on preferences. Structured patient interviews (n=448). Bivariate analyses examined preferences for primary care by 'infrequent' vs. 'regular' computer users. Only 54% were regular computer users, nearly all of whom had ever used the internet. 'Telephone' was preferred for 6 of 10 reasons (general medical questions, medication questions and refills, preventive care reminders, scheduling, and test results); although telephone was preferred by markedly fewer regular computer users. 'In-person' was preferred for new/ongoing conditions/symptoms, treatment instructions, and next care steps; these preferences were unaffected by computer use frequency. Among regular computer users, 1/3 preferred 'electronic' for preventive reminders (37%), test results (34%), and refills (32%). For most primary care needs, telephone communication was preferred, although by a greater proportion of infrequent vs. regular computer users. In-person communication was preferred for reasons that may require an exam or visual instructions. About 1/3 of regular computer users prefer electronic communication for routine needs, e.g., preventive reminders, test results, and refills. These findings can be used to plan patient-centered care that is aligned with veterans' preferred health communication methods. Published by Elsevier Ireland Ltd.

  10. Abdominal CT scan

    MedlinePlus

    Computed tomography scan - abdomen; CT scan - abdomen; CT abdomen and pelvis ... An abdominal CT scan makes detailed pictures of the structures inside your belly very quickly. This test may be used to look ...

  11. Revisiting the blind tests in crystal structure prediction: accurate energy ranking of molecular crystals.

    PubMed

    Asmadi, Aldi; Neumann, Marcus A; Kendrick, John; Girard, Pascale; Perrin, Marc-Antoine; Leusen, Frank J J

    2009-12-24

    In the 2007 blind test of crystal structure prediction hosted by the Cambridge Crystallographic Data Centre (CCDC), a hybrid DFT/MM method correctly ranked each of the four experimental structures as having the lowest lattice energy of all the crystal structures predicted for each molecule. The work presented here further validates this hybrid method by optimizing the crystal structures (experimental and submitted) of the first three CCDC blind tests held in 1999, 2001, and 2004. Except for the crystal structures of compound IX, all structures were reminimized and ranked according to their lattice energies. The hybrid method computes the lattice energy of a crystal structure as the sum of the DFT total energy and a van der Waals (dispersion) energy correction. Considering all four blind tests, the crystal structure with the lowest lattice energy corresponds to the experimentally observed structure for 12 out of 14 molecules. Moreover, good geometrical agreement is observed between the structures determined by the hybrid method and those measured experimentally. In comparison with the correct submissions made by the blind test participants, all hybrid optimized crystal structures (apart from compound II) have the smallest calculated root mean squared deviations from the experimentally observed structures. It is predicted that a new polymorph of compound V exists under pressure.
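
    The hybrid energy expression described above, lattice energy = DFT total energy + dispersion correction, can be sketched with a toy pairwise term. The -C6/r^6 form with short-range damping is the generic shape of such corrections; the constants and damping function here are illustrative, not the paper's parametrization:

```python
def dispersion_energy(coords, c6=1.0, r0=3.0):
    """Toy pairwise London dispersion correction: a sum over atom pairs of
    -C6/r^6, damped at short range so the term vanishes as r -> 0."""
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j])) ** 0.5
            damp = 1.0 / (1.0 + (r0 / r) ** 12)  # simple Fermi-like damping
            e += -c6 / r ** 6 * damp
    return e

def lattice_energy(e_dft, coords):
    """Hybrid scheme of the abstract: DFT total energy plus the van der
    Waals (dispersion) energy correction."""
    return e_dft + dispersion_energy(coords)
```

    Ranking candidate crystal structures then reduces to evaluating this corrected energy for each and sorting, with the experimentally observed polymorph expected at the minimum.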

  12. Analysis of the structural behaviour of colonic segments by inflation tests: Experimental activity and physio-mechanical model.

    PubMed

    Carniel, Emanuele L; Mencattelli, Margherita; Bonsignori, Gabriella; Fontanella, Chiara G; Frigo, Alessandro; Rubini, Alessandro; Stefanini, Cesare; Natali, Arturo N

    2015-11-01

    A coupled experimental and computational approach is provided for the identification of the structural behaviour of gastrointestinal regions, accounting for both elastic and visco-elastic properties. The developed procedure is applied to characterize the mechanics of gastrointestinal samples from pig colons. Experimental data about the structural behaviour of colonic segments are provided by inflation tests. Different inflation processes are performed according to progressively increasing top pressure conditions. Each inflation test consists of an air in-flow, according to an almost constant increasing pressure rate, such as 3.5 mmHg/s, up to a prescribed top pressure, which is held constant for about 300 s to allow the development of creep phenomena. Different tests are interspersed by 600 s of rest to allow the recovery of the tissues' mechanical condition. Data from structural tests are post-processed by a physio-mechanical model in order to identify the mechanical parameters that interpret both the non-linear elastic behaviour of the sample, as the instantaneous pressure-stretch trend, and the time-dependent response, as the stretch increase during the creep processes. The parameters are identified by minimizing the discrepancy between experimental and model results. Different sets of parameters are evaluated for different specimens from different pigs. A statistical analysis is performed to evaluate the distribution of the parameters and to assess the reliability of the experimental and computational activities. © IMechE 2015.
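
    The identification step above (minimizing the discrepancy between model and inflation-test data) can be sketched with a single-exponential creep response and a coarse grid search. Both the model form and the fitting procedure are illustrative stand-ins, not the paper's physio-mechanical model:

```python
import math

def creep_model(t, lam0, dlam, tau):
    """Stretch during a constant-pressure hold: instantaneous value lam0
    plus a creep increment dlam developing with time constant tau."""
    return lam0 + dlam * (1.0 - math.exp(-t / tau))

def fit_tau(times, data, lam0, dlam, taus):
    """Pick the time constant that minimizes the squared discrepancy
    between model and measured stretch (the identification idea of the
    abstract, reduced to one parameter)."""
    best, best_err = None, float("inf")
    for tau in taus:
        err = sum((creep_model(t, lam0, dlam, tau) - d) ** 2
                  for t, d in zip(times, data))
        if err < best_err:
            best, best_err = tau, err
    return best

# Synthetic "experiment": a 300 s hold whose true time constant is 50 s.
times = list(range(0, 300, 10))
data = [creep_model(t, 1.2, 0.05, 50.0) for t in times]
tau_fit = fit_tau(times, data, 1.2, 0.05, [10.0, 25.0, 50.0, 100.0])
```

    In practice all parameters (elastic and visco-elastic) are fit jointly and per specimen, which is what enables the statistical analysis across pigs described in the abstract.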

  13. A Combined Experimental/Computational Investigation of a Rocket Based Combined Cycle Inlet

    NASA Technical Reports Server (NTRS)

    Smart, Michael K.; Trexler, Carl A.; Goldman, Allen L.

    2001-01-01

    A rocket based combined cycle inlet geometry has undergone wind tunnel testing and computational analysis with Mach 4 flow at the inlet face. Performance parameters obtained from the wind tunnel tests were the mass capture, the maximum back-pressure, and the self-starting characteristics of the inlet. The CFD analysis supplied a confirmation of the mass capture, the inlet efficiency and the details of the flowfield structure. Physical parameters varied during the test program were cowl geometry, cowl position, body-side bleed magnitude and ingested boundary layer thickness. An optimum configuration was determined for the inlet as a result of this work.

  14. Factor Structure and Reliability of Test Items for Saudi Teacher Licence Assessment

    ERIC Educational Resources Information Center

    Alsadaawi, Abdullah Saleh

    2017-01-01

    The Saudi National Assessment Centre administers the Computer Science Teacher Test for teacher certification. The aim of this study is to explore gender differences in candidates' scores, and investigate dimensionality, reliability, and differential item functioning using confirmatory factor analysis and item response theory. The confirmatory…

  15. Integrated Structural Analysis and Test Program

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2005-01-01

    An integrated structural-analysis and structure-testing computer program is being developed in order to: automate repetitive processes in testing and analysis; accelerate pre-test analysis; accelerate reporting of tests; facilitate planning of tests; improve execution of tests; create a vibration, acoustics, and shock test database; and integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls, with minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After the desired input file is selected, the program proceeds to either an analysis data process or a test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.

  16. LAVA Simulations for the AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Housman, Jeffrey A.; Sozer, Emre; Moini-Yekta , Shayan; Kiris, Cetin C.

    2014-01-01

    Computational simulations using the Launch Ascent and Vehicle Aerodynamics (LAVA) framework are presented for the First AIAA Sonic Boom Prediction Workshop test cases. The framework is utilized with both structured overset and unstructured meshing approaches. The three workshop test cases include an axisymmetric body, a Delta Wing-Body model, and a complete low-boom supersonic transport concept. Solution sensitivity to mesh type and sizing, and to several numerical convective flux discretization choices, is presented and discussed. Favorable comparisons between the computational simulations and experimental data of near- and mid-field pressure signatures were obtained.

  17. Variable-Complexity Multidisciplinary Optimization on Parallel Computers

    NASA Technical Reports Server (NTRS)

    Grossman, Bernard; Mason, William H.; Watson, Layne T.; Haftka, Raphael T.

    1998-01-01

    This report covers work conducted under grant NAG1-1562 for the NASA High Performance Computing and Communications Program (HPCCP) from December 7, 1993, to December 31, 1997. The objective of the research was to develop new multidisciplinary design optimization (MDO) techniques that exploit parallel computing to reduce the computational burden of aircraft MDO. The design of the High-Speed Civil Transport (HSCT) aircraft was selected as a test case to demonstrate the utility of our MDO methods. The three major tasks of this research grant were: (1) development of parallel multipoint approximation methods for the aerodynamic design of the HSCT; (2) use of parallel multipoint approximation methods for structural optimization of the HSCT; and (3) mathematical and algorithmic development, including support for integrating parallel computation into tasks (1) and (2). These tasks have been accomplished with the development of a response surface methodology that incorporates multi-fidelity models. For the aerodynamic design we were able to optimize with up to 20 design variables using hundreds of expensive Euler analyses together with thousands of inexpensive linear theory simulations, thereby demonstrating the application of CFD to a large aerodynamic design problem. For predicting structural weight we were able to combine hundreds of structural optimizations of refined finite element models with thousands of optimizations based on coarse models. Computations have been carried out on the Intel Paragon with up to 128 nodes. The parallel computation allowed us to perform combined aerodynamic-structural optimization using state-of-the-art models of complex aircraft configurations.
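
    The multi-fidelity pairing described above (few expensive analyses, many cheap ones) can be sketched with a first-order correction-factor approach: evaluate the ratio of high- to low-fidelity models at one anchor design, then use the scaled cheap model everywhere. This is a hedged illustration of the general variable-complexity idea, not the grant's actual response surface methodology:

```python
def scaled_low_fidelity(f_hi, f_lo, x_anchor):
    """Variable-complexity modeling sketch: beta = f_hi/f_lo from one
    expensive anchor evaluation corrects the cheap model globally."""
    beta = f_hi(x_anchor) / f_lo(x_anchor)
    return lambda x: beta * f_lo(x)

# Toy drag-like models: the cheap model has the right trend, wrong level.
f_hi = lambda x: 2.0 * x * x + 0.5   # stand-in for an expensive Euler analysis
f_lo = lambda x: 1.0 * x * x + 0.25  # stand-in for cheap linear theory
model = scaled_low_fidelity(f_hi, f_lo, x_anchor=1.0)
```

    An optimizer can then run thousands of iterations against `model`, refreshing the correction with a new expensive analysis only occasionally, which is the economy the report exploits.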

  18. Introduction to bioinformatics.

    PubMed

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: Collect statistics from biological data. Build a computational model. Solve a computational modeling problem. Test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices and analysis of microarray data mostly involves statistics analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs and graph theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
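
    One of the classical sequence-analysis problems mentioned above, pairwise sequence alignment, can be sketched with the standard Needleman-Wunsch dynamic program (global alignment score only, no traceback; the scoring values are illustrative):

```python
def global_align_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment: dp[i][j] holds the best score
    aligning a[:i] with b[:j]; each cell extends a shorter alignment by a
    match/mismatch of the next characters or by a gap in either sequence."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * gap          # align a[:i] against nothing
    for j in range(1, cols):
        dp[0][j] = j * gap          # align b[:j] against nothing
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]
```

    The same dynamic-programming pattern underlies many of the sequence problems listed in the chapter, from homolog scoring to profile matching.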

  19. The Proposal of an Evolutionary Strategy Generating Data Structures Based on a Horizontal Tree for Tests

    NASA Astrophysics Data System (ADS)

    Żukowicz, Marek; Markiewicz, Michał

    2016-09-01

    The aim of this article is to present a mathematical definition of the object model known in computer science as TreeList, and to show how this model can be used to design an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of representing data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used to determine the utility of structures created through this model, as well as the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.

  20. Conflicts of interest improve collective computation of adaptive social structures

    PubMed Central

    Brush, Eleanor R.; Krakauer, David C.; Flack, Jessica C.

    2018-01-01

    In many biological systems, the functional behavior of a group is collectively computed by the system’s individual components. An example is the brain’s ability to make decisions via the activity of billions of neurons. A long-standing puzzle is how the components’ decisions combine to produce beneficial group-level outputs, despite conflicts of interest and imperfect information. We derive a theoretical model of collective computation from mechanistic first principles, using results from previous work on the computation of power structure in a primate model system. Collective computation has two phases: an information accumulation phase, in which (in this study) pairs of individuals gather information about their fighting abilities and make decisions about their dominance relationships, and an information aggregation phase, in which these decisions are combined to produce a collective computation. To model information accumulation, we extend a stochastic decision-making model—the leaky integrator model used to study neural decision-making—to a multiagent game-theoretic framework. We then test alternative algorithms for aggregating information—in this study, decisions about dominance resulting from the stochastic model—and measure the mutual information between the resultant power structure and the “true” fighting abilities. We find that conflicts of interest can improve accuracy to the benefit of all agents. We also find that the computation can be tuned to produce different power structures by changing the cost of waiting for a decision. The successful application of a similar stochastic decision-making model in neural and social contexts suggests general principles of collective computation across substrates and scales. PMID:29376116
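
    The information-accumulation phase above extends a leaky integrator model to interacting agents. A minimal deterministic sketch of the underlying mechanism, with the stochastic noise term omitted for clarity and all parameter values illustrative, has two accumulators racing to a decision threshold:

```python
def leaky_race(input_a, input_b, leak=0.1, threshold=5.0, dt=0.1,
               max_steps=10000):
    """Two leaky integrators x' = input - leak*x race to a threshold; the
    first to cross wins the decision. In the paper's setting the inputs
    would reflect fighting-ability signals and a noise term would be added."""
    xa = xb = 0.0
    for step in range(max_steps):
        xa += dt * (input_a - leak * xa)
        xb += dt * (input_b - leak * xb)
        if xa >= threshold:
            return "A", step
        if xb >= threshold:
            return "B", step
    return None, max_steps

winner, steps = leaky_race(1.0, 0.6)
```

    Raising the threshold trades speed for accuracy, which corresponds to the paper's finding that the collective computation can be tuned by changing the cost of waiting for a decision.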

  1. JPL Energy Consumption Program (ECP) documentation: A computer model simulating heating, cooling and energy loads in buildings. [low cost solar array efficiency

    NASA Technical Reports Server (NTRS)

    Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.

    1978-01-01

    The engineering manual provides complete companion documentation on the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and detailed descriptions of all the analytic and logical expressions and flow charts used in the computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both the user's time and costs without sacrificing accuracy. The user can expect a CPU-time cost of approximately $5.00 per building zone, excluding printing costs. The accuracy, measured by the deviation of simulated consumption from watt-hour meter readings, was found in many simulation tests not to exceed a ±10 percent margin.

  2. GPU implementation of the linear scaling three dimensional fragment method for large scale electronic structure calculations

    NASA Astrophysics Data System (ADS)

    Jia, Weile; Wang, Jue; Chi, Xuebin; Wang, Lin-Wang

    2017-02-01

    LS3DF, the linear scaling three-dimensional fragment method, is an efficient linear scaling ab initio total energy electronic structure calculation code based on a divide-and-conquer strategy. In this paper, we present our GPU implementation of the LS3DF code. Our test results show that the GPU code can calculate systems with about ten thousand atoms fully self-consistently on the order of 10 min using thousands of computing nodes. This makes the electronic structure calculations of 10,000-atom nanosystems routine work. This speed is 4.5-6 times faster than the CPU calculations using the same number of nodes on the Titan machine in the Oak Ridge Leadership Computing Facility (OLCF). Such speedup is achieved by (a) a careful redesign of the computationally heavy kernels and (b) a redesign of the communication pattern for heterogeneous supercomputers.

  3. Thermal/Structural Tailoring of Engine Blades (T/STAEBL). Theoretical Manual

    NASA Technical Reports Server (NTRS)

    Brown, K. W.; Clevenger, W. B.

    1994-01-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual describes the T/STAEBL data block structure and system organization. The approximate analysis and optimization modules are detailed, and a validation test case is provided.

  4. Fire development and wall endurance in sandwich and wood-frame structures

    Treesearch

    Carlton A. Holmes; Herbert W. Eickner; John J. Brenden; Curtis C. Peters; Robert H. White

    1980-01-01

    Large-scale fire tests were conducted on seven 16- by 24-foot structures. Four of these structures were of sandwich construction with cores of plastic or paper honeycomb, and three were of wood-frame construction. The walls were loaded to a computed design loading, and the fire endurance was determined under a fire exposure from a typical building contents loading of 4-1/2...

  5. Thermal/structural tailoring of engine blades (T/STAEBL). Theoretical manual

    NASA Astrophysics Data System (ADS)

    Brown, K. W.; Clevenger, W. B.

    1994-03-01

    The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a family of computer programs executed by a control program. The T/STAEBL system performs design optimizations of cooled, hollow turbine blades and vanes. This manual describes the T/STAEBL data block structure and system organization. The approximate analysis and optimization modules are detailed, and a validation test case is provided.

  6. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  7. Quasi-Maximum Likelihood Estimation of Structural Equation Models with Multiple Interaction and Quadratic Effects

    ERIC Educational Resources Information Center

    Klein, Andreas G.; Muthen, Bengt O.

    2007-01-01

    In this article, a nonlinear structural equation model is introduced and a quasi-maximum likelihood method for simultaneous estimation and testing of multiple nonlinear effects is developed. The focus of the new methodology lies on efficiency, robustness, and computational practicability. Monte-Carlo studies indicate that the method is highly…

  8. Testing a Wheeled Landing Gear System for the TH-57 Helicopter

    DTIC Science & Technology

    1992-12-01

    An initial comparison was done using a structural analysis program, GIFTS, to simultaneously analyze and compare the gear systems. Experimental data was used... Element Total System (GIFTS) structural analysis program, which is resident on the Aeronautical Engineering Department computer system; an analysis

  9. 33 CFR 148.8 - How are certifying entities designated and used for purposes of this subchapter?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... be associated with its duties for the specific project; (7) In-house availability of, or access to, appropriate technology such as computer modeling programs and hardware or testing materials and equipment; (8...-based structures and project-related structures, systems, and equipment; (6) Technical capabilities...

  10. The Power of the Test for Treatment Effects in Three-Level Block Randomized Designs

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2008-01-01

    Experiments that involve nested structures may assign treatment conditions either to subgroups (such as classrooms) or individuals within subgroups (such as students). The design of such experiments requires knowledge of the intraclass correlation structure to compute the sample sizes necessary to achieve adequate power to detect the treatment…

  11. Factor Structure and Scale Reliabilities of the Adjective Check List Across Time

    ERIC Educational Resources Information Center

    Miller, Stephen H.; And Others

    1978-01-01

    Investigated factor structure and scale reliabilities of Gough's Adjective Check List (ACL) and their stability over time. Employees in a community mental health center completed the ACL twice, separated by a one-year interval. After each administration, separate factor analyses were computed. All scales had highly significant test-retest…

  12. 33 CFR 148.8 - How are certifying entities designated and used for purposes of this subchapter?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... be associated with its duties for the specific project; (7) In-house availability of, or access to, appropriate technology such as computer modeling programs and hardware or testing materials and equipment; (8...-based structures and project-related structures, systems, and equipment; (6) Technical capabilities...

  13. TRANSTRAIN: A program to compute strain transformations in composite materials

    NASA Technical Reports Server (NTRS)

    Ahmed, Rafiq

    1990-01-01

    Over the years, the solid rocket motor community has made increasing use of composite materials for thermal and structural applications. This is particularly true of solid rocket nozzles, which have used carbon phenolic and, increasingly, carbon-carbon materials to provide structural integrity and thermal protection at the high temperatures encountered during motor burn. To evaluate the structural performance of nozzles and their materials and to verify analysis models, many subscale and full-scale tests are run. These provide engineers with valuable data needed to optimize design and to analyze nozzle hardware, including strains, pressures, thrust, temperatures, and displacements. Recent nozzle test hardware has made increasing use of strain gauges embedded in the carbon composite material to measure internal strains. In order to evaluate strength, these data must be transformed into strains along the fiber directions, from which the fiber-direction stresses can then be calculated. A computer program written to help engineers correctly manipulate the strain data into a form that can be used to evaluate the structural integrity of the nozzle is examined.
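
    The transformation the program performs can be sketched with the standard plane-strain rotation formula. This is the textbook relation for the normal strain along a rotated direction, not TRANSTRAIN's actual source; the example values are illustrative:

```python
import math

def strain_along(theta_deg, eps_x, eps_y, gamma_xy):
    """Normal strain along a direction at angle theta from the x-axis:
        eps = eps_x*cos^2(t) + eps_y*sin^2(t) + gamma_xy*sin(t)*cos(t),
    the classical 2D strain transformation used to rotate gauge strains
    into a fiber direction."""
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    return eps_x * c * c + eps_y * s * s + gamma_xy * s * c

# Gauge strains in the global frame -> strain along a 30-degree fiber.
eps_fiber = strain_along(30.0, 1000e-6, -200e-6, 400e-6)
```

    With the fiber-direction strains in hand, fiber-direction stresses follow from the lamina constitutive law, which is the strength-evaluation step the abstract describes.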

  14. General aviation crash safety program at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.

    1976-01-01

    The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.

  15. Hierarchical nonlinear behavior of hot composite structures

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Hierarchical computational procedures are described to simulate the multiple scale thermal/mechanical behavior of high temperature metal matrix composites (HT-MMC) in the following three broad areas: (1) behavior of HT-MMC's from micromechanics to laminate via METCAN (Metal Matrix Composite Analyzer), (2) tailoring of HT-MMC behavior for optimum specific performance via MMLT (Metal Matrix Laminate Tailoring), and (3) HT-MMC structural response for hot structural components via HITCAN (High Temperature Composite Analyzer). Representative results from each area are presented to illustrate the effectiveness of computational simulation procedures and accompanying computer codes. The sample case results show that METCAN can be used to simulate material behavior such as the entire creep span; MMLT can be used to concurrently tailor the fabrication process and the interphase layer for optimum performance such as minimum residual stresses; and HITCAN can be used to predict the structural behavior such as the deformed shape due to component fabrication. These codes constitute virtual portable desk-top test laboratories for characterizing HT-MMC laminates, tailoring the fabrication process, and qualifying structural components made from them.

  16. Oxford International Conference on the Mechanical Properties of Materials at High Rates of Strain (4th) Held in Oxford, United Kingdom on 19-22 March 1989

    DTIC Science & Technology

    1989-03-22

    Models used in the computer program EPIC2 to describe the structural response in the cylinder impact test are compared and the differences are... This paper describes the development and application of a computer program... performed using a dynamic viscoplastic finite element computer program. The resolution of the procedure has been investigated by obtaining replicate

  17. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
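
    The series-parallel reduction underlying such procedures can be sketched as a recursive evaluation: sequential composition sums task times, parallel composition takes their maximum. This is a deterministic completion-time sketch of the reduction idea only, not the paper's operational queueing-network analysis:

```python
def completion_time(task):
    """Evaluate a series-parallel task system given as nested tuples:
    ('seq', ...) runs children one after another (sum of times),
    ('par', ...) runs children concurrently (max of times),
    and a bare number is a leaf task's service time."""
    if isinstance(task, (int, float)):
        return task
    op, *children = task
    times = [completion_time(c) for c in children]
    return sum(times) if op == "seq" else max(times)

# A task of 2, then two parallel branches (3 vs. a 1+1 chain), then 1.
system = ("seq", 2, ("par", 3, ("seq", 1, 1)), 1)
```

    Hierarchical decomposition works the same way: each reduced subtree is replaced by a single equivalent task, so the whole system collapses bottom-up in one pass.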

  18. Advanced concepts for transformers pressboard dielectric constant and mechanical strength

    NASA Astrophysics Data System (ADS)

    1982-03-01

    Of the numerous electrical considerations in a material, the value of the dielectric constant serves as an important criterion in designing proper insulation systems. Ways to reduce the dielectric constant of solid (fibrous) insulating materials were investigated. A literature search was made on cellulosic and synthetic fibers and also on additives which offered the potential for dielectric constant reduction of the solid insulation. Sample board structures were produced in the laboratory and tested for electrical, mechanical, and chemical characteristics. Electrical tests determined the suitability of the material at transformer test and operating conditions. The mechanical tests established the physical characteristics of the modified board structures. Chemical tests checked the conductivity of the aqueous extract, acidity, and ash content. Further, compatibility with transformer oil was checked and some aging tests were performed. An actual transformer design was computed based on one of the modified board structures, and the reductions in core steel and transformer losses were shown.

  19. Consistent structures and interactions by density functional theory with small atomic orbital basis sets.

    PubMed

    Grimme, Stefan; Brandenburg, Jan Gerit; Bannwarth, Christoph; Hansen, Andreas

    2015-08-07

    A density functional theory (DFT) based composite electronic structure approach is proposed to efficiently compute structures and interaction energies in large chemical systems. It is based on the well-known and numerically robust Perdew-Burke-Ernzerhof (PBE) generalized gradient approximation in a modified global hybrid functional with a relatively large amount of non-local Fock-exchange. The orbitals are expanded in Ahlrichs-type valence-double zeta atomic orbital (AO) Gaussian basis sets, which are available for many elements. In order to correct for the basis set superposition error (BSSE) and to account for the important long-range London dispersion effects, our well-established atom-pairwise potentials are used. In the design of the new method, particular attention has been paid to an accurate description of structural parameters in various covalent and non-covalent bonding situations as well as in periodic systems. Together with the recently proposed three-fold corrected (3c) Hartree-Fock method, the new composite scheme (termed PBEh-3c) represents the next member in a hierarchy of "low-cost" electronic structure approaches. They are mainly free of BSSE and account for most interactions in a physically sound and asymptotically correct manner. PBEh-3c yields good results for thermochemical properties in the huge GMTKN30 energy database. Furthermore, the method shows excellent performance for non-covalent interaction energies in small and large complexes. For evaluating its performance on equilibrium structures, a new compilation of standard test sets is suggested. These consist of small (light) molecules, partially flexible, medium-sized organic molecules, molecules comprising heavy main group elements, larger systems with long bonds, 3d-transition metal systems, non-covalently bound complexes (S22 and S66×8 sets), and peptide conformations.
For these sets, overall deviations from accurate reference data are smaller than for various other tested DFT methods and reach that of triple-zeta AO basis set second-order perturbation theory (MP2/TZ) level at a tiny fraction of computational effort. Periodic calculations conducted for molecular crystals to test structures (including cell volumes) and sublimation enthalpies indicate very good accuracy competitive to computationally more involved plane-wave based calculations. PBEh-3c can be applied routinely to several hundreds of atoms on a single processor and it is suggested as a robust "high-speed" computational tool in theoretical chemistry and physics.
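
    The atom-pairwise dispersion correction mentioned above has the general form of a damped -C6/R^6 pair sum. As a hedged sketch of that form only: the C6 coefficients, cutoff radii, and damping parameters below are illustrative placeholders, not the published D3/BJ parameters used by PBEh-3c.

```python
import itertools
import math

def e_dispersion(coords, c6, r0, s6=1.0, a1=0.4, a2=4.5):
    """E_disp = -s6 * sum_{i<j} C6_ij / (R_ij^6 + f_ij^6), with a
    Becke-Johnson-style damping f_ij = a1*sqrt(R0_i*R0_j) + a2 (a.u.)."""
    e = 0.0
    for i, j in itertools.combinations(range(len(coords)), 2):
        r = math.dist(coords[i], coords[j])
        c6ij = math.sqrt(c6[i] * c6[j])   # simple geometric combination rule
        f = a1 * math.sqrt(r0[i] * r0[j]) + a2
        e -= s6 * c6ij / (r**6 + f**6)
    return e

# two "atoms" 6 bohr apart with placeholder C6 = 40 a.u.
print(e_dispersion([(0, 0, 0), (6, 0, 0)], c6=[40.0, 40.0], r0=[3.0, 3.0]))
```

    The damping term keeps the correction finite at short range, so the attraction grows smoothly as atoms approach rather than diverging.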

  20. Improve SSME power balance model

    NASA Technical Reports Server (NTRS)

    Karr, Gerald R.

    1992-01-01

    Effort was dedicated to development and testing of a formal strategy for reconciling uncertain test data with physically limited computational prediction. Specific weaknesses in the logical structure of the current Power Balance Model (PBM) version are described with emphasis given to the main routing subroutines BAL and DATRED. Selected results from a variational analysis of PBM predictions are compared to Technology Test Bed (TTB) variational study results to assess PBM predictive capability. The motivation for systematic integration of uncertain test data with computational predictions based on limited physical models is provided. The theoretical foundation for the reconciliation strategy developed in this effort is presented, and results of a reconciliation analysis of the Space Shuttle Main Engine (SSME) high pressure fuel side turbopump subsystem are examined.

  1. Method and apparatus for predicting the direction of movement in machine vision

    NASA Technical Reports Server (NTRS)

    Lawton, Teri B. (Inventor)

    1992-01-01

    A computer-simulated cortical network is presented. The network is capable of computing the visibility of shifts in the direction of movement. Additionally, the network can compute the following: (1) the magnitude of the position difference between the test and background patterns; (2) localized contrast differences at different spatial scales analyzed by computing temporal gradients of the difference and sum of the outputs of paired even- and odd-symmetric bandpass filters convolved with the input pattern; and (3) the direction of a test pattern moved relative to a textured background. The direction of movement of an object in the field of view of a robotic vision system is detected in accordance with nonlinear Gabor function algorithms. The movement of objects relative to their background is used to infer the 3-dimensional structure and motion of object surfaces.
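
    The paired even- and odd-symmetric bandpass filters in item (2) form a quadrature pair: summing their squared outputs gives a local, phase-insensitive contrast energy. A minimal one-dimensional numpy sketch (filter sizes and frequencies are illustrative, not the network's parameters):

```python
import numpy as np

def gabor_pair(sigma=4.0, freq=0.25, radius=12):
    """Even (cosine) and odd (sine) Gabor filters sharing one envelope."""
    x = np.arange(-radius, radius + 1)
    env = np.exp(-x**2 / (2 * sigma**2))
    return env * np.cos(2 * np.pi * freq * x), env * np.sin(2 * np.pi * freq * x)

def local_energy(signal, sigma=4.0, freq=0.25):
    even, odd = gabor_pair(sigma, freq)
    e = np.convolve(signal, even, mode='same')
    o = np.convolve(signal, odd, mode='same')
    return e**2 + o**2          # quadrature energy, independent of phase

x = np.arange(256)
pattern = np.sin(2 * np.pi * 0.25 * x)   # grating matched to the filter band
energy = local_energy(pattern)
print(energy[100:150].std() / energy[100:150].mean())  # ~flat in the interior
```

    Because the pair responds in quadrature, the energy tracks local contrast at the filter's scale while ignoring the phase of the underlying pattern, which is what makes it useful for comparing test and background patterns.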

  2. Energy Finite Element Analysis Developments for Vibration Analysis of Composite Aircraft Structures

    NASA Technical Reports Server (NTRS)

    Vlahopoulos, Nickolas; Schiller, Noah H.

    2011-01-01

    The Energy Finite Element Analysis (EFEA) has been utilized successfully for modeling complex structural-acoustic systems with isotropic structural material properties. In this paper, a formulation for modeling structures made out of composite materials is presented. An approach based on spectral finite element analysis is utilized first for developing the equivalent material properties for the composite material. These equivalent properties are employed in the EFEA governing differential equations for representing the composite materials and deriving the element level matrices. The power transmission characteristics at connections between members made out of non-isotropic composite material are considered for deriving suitable power transmission coefficients at junctions of interconnected members. These coefficients are utilized for computing the joint matrix that is needed to assemble the global system of EFEA equations. The global system of EFEA equations is solved numerically and the vibration levels within the entire system can be computed. The new EFEA formulation for modeling composite laminate structures is validated through comparison to test data collected from a representative composite aircraft fuselage that is made out of a composite outer shell and composite frames and stiffeners. NASA Langley constructed the composite cylinder and conducted the test measurements utilized in this work.

  3. AnchorDock: Blind and Flexible Anchor-Driven Peptide Docking.

    PubMed

    Ben-Shimon, Avraham; Niv, Masha Y

    2015-05-05

    The huge conformational space stemming from the inherent flexibility of peptides is among the main obstacles to successful and efficient computational modeling of protein-peptide interactions. Current peptide docking methods typically overcome this challenge using prior knowledge from the structure of the complex. Here we introduce AnchorDock, a peptide docking approach, which automatically targets the docking search to the most relevant parts of the conformational space. This is done by precomputing the free peptide's structure and by computationally identifying anchoring spots on the protein surface. Next, a free peptide conformation undergoes anchor-driven simulated annealing molecular dynamics simulations around the predicted anchoring spots. In the challenging task of a completely blind docking test, AnchorDock produced exceptionally good results (backbone root-mean-square deviation ≤ 2.2 Å, rank ≤ 15) for 10 of 13 unbound cases tested. The impressive performance of AnchorDock supports a molecular recognition pathway that is driven by pre-existing local structural elements. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Tenth NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The development of the NASTRAN computer program, a general purpose finite element computer code for structural analysis, was discussed. The application and development of NASTRAN are presented in the following topics: improvements and enhancements; development of pre- and postprocessors; interactive review system; the use of harmonic expansions in magnetic field problems; improving a dynamic model with test data using Linwood; solution of axisymmetric fluid structure interaction problems; large displacements and stability analysis of nonlinear propeller structures; prediction of bead area contact load at the tire wheel interface; elastic plastic analysis of an overloaded breech ring; finite element solution of torsion and other 2-D Poisson equations; new capability for elastic aircraft airloads; usage of substructuring analysis in the Get Away Special program; solving symmetric structures with nonsymmetric loads; evaluation and reduction of errors induced by Guyan transformation.
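
    The last topic refers to Guyan (static) condensation, in which slave degrees of freedom s are eliminated from the stiffness matrix through the static constraint K_red = K_mm - K_ms K_ss^{-1} K_sm; the result is exact for statics but only approximate for dynamics, which is the source of the errors studied. A minimal numpy sketch on an illustrative three-spring chain (not a NASTRAN model):

```python
import numpy as np

def guyan_reduce(K, master, slave):
    """Condense slave DOFs out of stiffness matrix K (Guyan reduction)."""
    Kmm = K[np.ix_(master, master)]
    Kms = K[np.ix_(master, slave)]
    Ksm = K[np.ix_(slave, master)]
    Kss = K[np.ix_(slave, slave)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# chain of three unit springs, fixed at one end; DOFs are the free nodes
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
K_red = guyan_reduce(K, master=[2], slave=[0, 1])
print(K_red)   # [[1/3]]: three unit springs in series
```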

  5. Ice interaction with offshore structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cammaert, A.B.; Muggeridge, D.B.

    1988-01-01

    Oil platforms and other offshore structures being built in the arctic regions must be able to withstand icebergs, ice islands, and pack ice. This reference explains the effect ice has on offshore structures and demonstrates design and construction methods that allow such structures to survive in harsh, ice-ridden environments. It analyzes the characteristics of sea ice as well as dynamic ice forces on structures. Techniques for ice modeling and field testing facilitate the design and construction of sturdy, offshore constructions. Computer programs are included.

  6. Dispersion Interactions between Rare Gas Atoms: Testing the London Equation Using ab Initio Methods

    ERIC Educational Resources Information Center

    Halpern, Arthur M.

    2011-01-01

    A computational chemistry experiment is described in which students can use advanced ab initio quantum mechanical methods to test the ability of the London equation to account quantitatively for the attractive (dispersion) interactions between rare gas atoms. Using readily available electronic structure applications, students can calculate the…

  7. Characterizing Facesheet/Core Disbonding in Honeycomb Core Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Rinker, Martin; Ratcliffe, James G.; Adams, Daniel O.; Krueger, Ronald

    2013-01-01

    Results are presented from an experimental investigation into facesheet core disbonding in carbon fiber reinforced plastic/Nomex honeycomb sandwich structures using a Single Cantilever Beam test. Specimens with three-, six-, and twelve-ply facesheets were tested. Specimens with different honeycomb cores consisting of four different cell sizes were also tested, in addition to specimens with three different widths. Three different data reduction methods were employed for computing apparent fracture toughness values from the test data, namely an area method, a compliance calibration technique and a modified beam theory method. The compliance calibration and modified beam theory approaches yielded comparable apparent fracture toughness values, which were generally lower than those computed using the area method. Disbonding in the three-ply facesheet specimens took place at the facesheet/core interface and yielded the lowest apparent fracture toughness values. Disbonding in the six- and twelve-ply facesheet specimens took place within the core, near the facesheet/core interface. Specimen width was not found to have a significant effect on apparent fracture toughness. The amount of scatter in the apparent fracture toughness data was found to increase with honeycomb core cell size.
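
    Two of the data-reduction methods named above can be sketched for a single cantilever beam specimen of width b: modified beam theory (MBT) and a compliance-calibration (CC) fit of the form C = A*a^n. The synthetic load, deflection, and crack-length values below are illustrative, not the paper's test data.

```python
import numpy as np

def g_mbt(P, delta, b, a, Delta=0.0):
    """Modified beam theory: G = 3*P*delta / (2*b*(a + |Delta|))."""
    return 3.0 * P * delta / (2.0 * b * (a + abs(Delta)))

def g_cc(P, delta, b, a, a_all, C_all):
    """Compliance calibration: fit log C = log A + n*log a, then
    G = n*P*delta / (2*b*a)."""
    n, _ = np.polyfit(np.log(a_all), np.log(C_all), 1)
    return n * P * delta / (2.0 * b * a)

# synthetic calibration obeying a beam-like compliance law C ~ a^3
a_all = np.array([20., 25., 30., 35.])      # crack lengths, mm
C_all = 1e-4 * a_all**3                      # compliance, mm/N
P, a, b = 50.0, 30.0, 25.0                   # load (N), crack (mm), width (mm)
delta = P * 1e-4 * a**3                      # deflection from the same law
print(g_mbt(P, delta, b, a), g_cc(P, delta, b, a, a_all, C_all))
```

    For data that follow the ideal cubic compliance law exactly, the two reductions coincide; with real test data the fitted exponent n and the crack-length correction Delta absorb deviations from simple beam behavior, which is why the paper found the two methods comparable.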

  8. Improved computer-aided detection of small polyps in CT colonography using interpolation for curvature estimation

    PubMed Central

    Liu, Jiamin; Kabadi, Suraj; Van Uitert, Robert; Petrick, Nicholas; Deriche, Rachid; Summers, Ronald M.

    2011-01-01

    Purpose: Surface curvatures are important geometric features for the computer-aided analysis and detection of polyps in CT colonography (CTC). However, the general kernel approach for curvature computation can yield erroneous results for small polyps and for polyps that lie on haustral folds. Those erroneous curvatures will reduce the performance of polyp detection. This paper presents an analysis of interpolation’s effect on curvature estimation for thin structures and its application to computer-aided detection of small polyps in CTC. Methods: The authors demonstrated that a simple technique, image interpolation, can improve the accuracy of curvature estimation for thin structures and thus significantly improve the sensitivity of small polyp detection in CTC. Results: Our experiments showed that interpolation yielded more accurate curvature values for simulated data and isolated polyps near folds in clinical data. After testing on a large clinical data set, it was observed that linear, quadratic B-spline, and cubic B-spline interpolations all significantly improved the sensitivity of small polyp detection. Conclusions: Image interpolation can improve the accuracy of curvature estimation for thin structures and thus improve the computer-aided detection of small polyps in CTC. PMID:21859029

  9. Evaluation of the discrete vortex wake cross flow model using vector computers. Part 1: Theory and application

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The objective of the current program was to modify a discrete vortex wake method to efficiently compute the aerodynamic forces and moments on high fineness ratio bodies (f approximately 10.0). The approach is to increase computational efficiency by structuring the program to take advantage of new computer vector software and by developing new algorithms where vector software cannot be used efficiently. An efficient program was written and substantial savings achieved. Several test cases were run for fineness ratios up to f = 16.0 and angles of attack up to 50 degrees.

  10. X-ray-induced acoustic computed tomography of concrete infrastructure

    NASA Astrophysics Data System (ADS)

    Tang, Shanshan; Ramseyer, Chris; Samant, Pratik; Xiang, Liangzhong

    2018-02-01

    X-ray-induced Acoustic Computed Tomography (XACT) takes advantage of both X-ray absorption contrast and high ultrasonic resolution in a single imaging modality by making use of the thermoacoustic effect. In XACT, X-ray absorption by defects and other structures in concrete creates thermally induced pressure jumps that launch ultrasonic waves, which are then received by acoustic detectors to form images. In this research, XACT imaging was used to non-destructively test and identify defects in concrete. For concrete structures, we conclude that XACT imaging allows multiscale imaging at depths ranging from centimeters to meters, with spatial resolutions from sub-millimeter to centimeters. XACT imaging also holds promise for single-side testing of concrete infrastructure and provides an optimal solution for nondestructive inspection of existing bridges, pavement, nuclear power plants, and other concrete infrastructure.

  11. Active-learning strategies in computer-assisted drug discovery.

    PubMed

    Reker, Daniel; Schneider, Gisbert

    2015-04-01

    High-throughput compound screening is time- and resource-consuming, and considerable effort is invested into screening compound libraries, profiling, and selecting the most promising candidates for further testing. Active-learning methods assist the selection process by focusing on areas of chemical space that have the greatest chance of success while considering structural novelty. The core feature of these algorithms is their ability to adapt the structure-activity landscapes through feedback. Instead of full-deck screening, only focused subsets of compounds are tested, and the experimental readout is used to refine molecule selection for subsequent screening cycles. Once implemented, these techniques have the potential to reduce costs and save precious materials. Here, we provide a comprehensive overview of the various computational active-learning approaches and outline their potential for drug discovery. Copyright © 2014 Elsevier Ltd. All rights reserved.
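
    The screening cycle described above can be sketched as an exploit-plus-explore loop: each round picks some candidates by predicted activity and some by distance from everything screened so far (a stand-in for structural novelty). The descriptors, the linear model, and the selection sizes below are toy assumptions, not any specific published method.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                  # toy compound descriptors
w_true = rng.normal(size=8)
y_true = X @ w_true                            # hidden "activity" landscape

screened = list(range(5))                      # initial screened subset
for cycle in range(4):
    Xs, ys = X[screened], y_true[screened]     # experimental readout so far
    w, *_ = np.linalg.lstsq(Xs, ys, rcond=None)   # refit the activity model
    pool = [i for i in range(len(X)) if i not in screened]
    pred = X[pool] @ w
    exploit = [pool[i] for i in np.argsort(pred)[-3:]]     # top predicted
    d = np.min(np.linalg.norm(X[pool][:, None] - Xs[None], axis=2), axis=1)
    explore = [pool[i] for i in np.argsort(d)[-3:]]        # most novel
    screened += list(dict.fromkeys(exploit + explore))     # dedupe picks

print(len(screened), float(np.max(y_true[screened])))
```

    The feedback step (refitting on each new readout) is what distinguishes this from a one-shot ranked screen; the novelty picks keep the model from collapsing onto one region of descriptor space.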

  12. Iterative Refinement of a Binding Pocket Model: Active Computational Steering of Lead Optimization

    PubMed Central

    2012-01-01

    Computational approaches for binding affinity prediction are most frequently demonstrated through cross-validation within a series of molecules or through performance shown on a blinded test set. Here, we show how such a system performs in an iterative, temporal lead optimization exercise. A series of gyrase inhibitors with known synthetic order formed the set of molecules that could be selected for “synthesis.” Beginning with a small number of molecules, based only on structures and activities, a model was constructed. Compound selection was done computationally, each time making five selections based on confident predictions of high activity and five selections based on a quantitative measure of three-dimensional structural novelty. Compound selection was followed by model refinement using the new data. Iterative computational candidate selection produced rapid improvements in selected compound activity, and incorporation of explicitly novel compounds uncovered much more diverse active inhibitors than strategies lacking active novelty selection. PMID:23046104

  13. Loading tests of a wing structure for a hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Fields, R. A.; Reardon, L. F.; Siegel, W. H.

    1980-01-01

    Room-temperature loading tests were conducted on a wing structure designed with a beaded panel concept for a Mach 8 hypersonic research airplane. Strain, stress, and deflection data were compared with the results of three finite-element structural analysis computer programs and with design data. The test program data were used to evaluate the structural concept and the methods of analysis used in the design. A force stiffness technique was utilized in conjunction with load conditions which produced various combinations of panel shear and compression loading to determine the failure envelope of the buckling-critical beaded panels. The force-stiffness data did not result in any predictions of buckling failure. It was, therefore, concluded that the panels were conservatively designed as a result of design constraints and assumptions of panel eccentricities. The analysis programs calculated strains and stresses accurately. Comparisons between calculated and measured structural deflections showed good agreement. The test program offered a positive demonstration of the beaded panel concept subjected to room-temperature load conditions.

  14. Performance on Cambridge Neuropsychological Test Automated Battery Subtests Sensitive to Frontal Lobe Function in People with Autistic Disorder: Evidence from the Collaborative Programs of Excellence in Autism Network

    ERIC Educational Resources Information Center

    Ozonoff, Sally; Cook, Ian; Coon, Hilary; Dawson, Geraldine; Joseph, Robert M.; Klin, Ami; McMahon, William M.; Minshew, Nancy; Munson, Jeffrey A.

    2004-01-01

    Recent structural and functional imaging work, as well as neuropathology and neuropsychology studies, provide strong empirical support for the involvement of frontal cortex in autism. The Cambridge Neuropsychological Test Automated Battery (CANTAB) is a computer-administered set of neuropsychological tests developed to examine specific components…

  15. Advanced thermally stable jet fuels. Technical progress report, January 1995--March 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schobert, H.H.; Eser, S.; Song, C.

    Quantitative structure-property relationships have been applied to study the thermal stability of pure hydrocarbons typical of jet fuel components. A simple method of chemical structure description in terms of Benson groups was tested in searching for structure-property relationships for the hydrocarbons tested experimentally in this program. Molecular connectivity as a structure-based approach to chemical structure-property relationship analysis was also tested. Further development of both the experimental data base and computational methods will be necessary. Thermal decomposition studies, using glass tube reactors, were extended to two additional model compounds: n-decane and n-dodecane. Efforts on refining the deposit growth measurement and characterization of suspended matter in stressed fuels have led to improvements in the analysis of stressed fuels. Catalytic hydrogenation and dehydrogenation studies utilizing a molybdenum sulfide catalyst are also described.

  16. Model verification of large structural systems. [space shuttle model response

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1978-01-01

    A computer program for the application of parameter identification on the structural dynamic models of space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model and a linear estimator is used in an iterative fashion on highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
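
    The iterated linear (Bayesian) estimation step can be sketched on a one-degree-of-freedom surrogate: a prior stiffness-scaling parameter is conditioned on a measured modal frequency, relinearizing each pass because frequency depends nonlinearly on stiffness. All numbers below are illustrative, not values from the program.

```python
import numpy as np

def model_freq(theta, k0=1.0e6, m=2.0):
    """Modal frequency (Hz) of a 1-DOF surrogate with scaled stiffness."""
    return np.sqrt(theta * k0 / m) / (2 * np.pi)

theta, P = 1.0, 0.04        # prior mean and variance of the stiffness scaling
f_meas, R = 120.0, 0.25     # measured frequency (Hz) and its variance

for _ in range(5):          # iterated, relinearized Bayesian update
    H = (model_freq(theta + 1e-6) - model_freq(theta)) / 1e-6  # sensitivity
    gain = P * H / (H * P * H + R)
    theta = theta + gain * (f_meas - model_freq(theta))

print(theta, model_freq(theta))   # updated scaling; frequency near 120 Hz
```

    With hundreds of mass and stiffness parameters the scalars become vectors and matrices, but the structure of the update (prior covariance, sensitivity, gain, residual) is the same.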

  17. Aero/structural tailoring of engine blades (AERO/STAEBL)

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1988-01-01

    This report describes the Aero/Structural Tailoring of Engine Blades (AERO/STAEBL) program, which is a computer code used to perform engine fan and compressor blade aero/structural numerical optimizations. These optimizations seek a blade design of minimum operating cost that satisfies realistic blade design constraints. This report documents the overall program (i.e., input, optimization procedures, approximate analyses) and also provides a detailed description of the validation test cases.

  18. NASA experiments on the B-720 structure and seats

    NASA Astrophysics Data System (ADS)

    Alfaro-Bou, E.

    1986-01-01

    Two experiments onboard a remotely piloted transport aircraft that was crashed on landing are discussed. The structural experiment deals with the location and distribution of the instrumentation throughout the airplane structure. In the seat experiment, the development and testing of an energy absorbing seat are discussed. The objective of the structural experiment was to obtain a data base of structural crash loads for use in the advancement of crashworthy technology of materials (such as composites) in structural design and for use in the comparison between computer and experimental results. The objective of the seat experiment was to compare the performance of an energy absorbing transport seat and a standard seat when subjected to similar crash pulses. Details are given on the location of instrumentation, on the dynamic seat test pulse and headward acceleration limits.

  19. Design of Bioprosthetic Aortic Valves using biaxial test data.

    PubMed

    Dabiri, Y; Paulson, K; Tyberg, J; Ronsky, J; Ali, I; Di Martino, E; Narine, K

    2015-01-01

    Bioprosthetic Aortic Valves (BAVs) do not have the serious limitations of mechanical aortic valves in terms of thrombosis. However, the lifetime of BAVs is too short, often requiring repeated surgeries. The lifetime of BAVs might be improved by using computer simulations of the structural behavior of the leaflets. The goal of this study was to develop a numerical model applicable to the optimization of durability of BAVs. The constitutive equations were derived using biaxial tensile tests. Using a Fung model, stress and strain data were computed from biaxial test data. SolidWorks was used to develop the geometry of the leaflets, and the ABAQUS finite element software package was used for finite element calculations. Results showed that the model was consistent with experimental observations. Reaction forces computed by the model corresponded with experimental measurements when the biaxial test was simulated. Likewise, the locations of maximum stress corresponded to the locations of frequent tearing of BAV leaflets. Results suggest that BAV design can be optimized with respect to durability.
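
    A two-dimensional Fung-type strain-energy function, the kind of constitutive law fitted to biaxial data like this, gives second Piola-Kirchhoff stresses by differentiating W = (c/2)(exp(Q) - 1) with Q = a1*E11^2 + a2*E22^2 + 2*a3*E11*E22. The coefficients below are illustrative placeholders, not the study's fitted values.

```python
import math

def fung_stress(E11, E22, c=10.0, a1=8.0, a2=4.0, a3=1.0):
    """S_ij = dW/dE_ij for W = (c/2)(exp(Q) - 1) (coefficients in kPa-like
    units). Returns (S11, S22) for given in-plane Green strains."""
    Q = a1 * E11**2 + a2 * E22**2 + 2 * a3 * E11 * E22
    expQ = math.exp(Q)
    S11 = c * expQ * (a1 * E11 + a3 * E22)    # dW/dE11
    S22 = c * expQ * (a2 * E22 + a3 * E11)    # dW/dE22
    return S11, S22

for E in (0.02, 0.06, 0.10):                  # equibiaxial Green strains
    print(E, fung_stress(E, E))
```

    The exponential in Q reproduces the strong strain-stiffening typical of soft tissue: doubling the strain more than doubles the stress, which is the behavior a leaflet-durability model needs to capture.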

  20. Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data

    NASA Technical Reports Server (NTRS)

    Lalime, Aimee L.; Johnson, Marty E.; Rizzi, Stephen A. (Technical Monitor)

    2002-01-01

    Binaural or "virtual acoustic" representation has been proposed as a method of analyzing acoustic and vibroacoustic data. Unfortunately, this binaural representation can require extensive computer power to apply the Head Related Transfer Functions (HRTFs) to a large number of sources, as with a vibrating structure. This work focuses on reducing the number of real-time computations required in this binaural analysis through the use of Singular Value Decomposition (SVD) and Equivalent Source Reduction (ESR). The SVD method reduces the complexity of the HRTF computations by breaking the HRTFs into dominant singular values (and vectors). The ESR method reduces the number of sources to be analyzed in real-time computation by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. It is shown that the effectiveness of the SVD and ESR methods improves as the complexity of the source increases. In addition, preliminary auralization tests have shown that the results from both the SVD and ESR methods are indistinguishable from the results found with the exhaustive method.
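
    The SVD step can be sketched directly: a matrix of transfer functions (frequency bins by source positions) is replaced by its dominant singular vectors, so the real-time stage filters through a few basis responses instead of one HRTF per source. The matrix below is synthetic low-rank-plus-noise data, not measured HRTFs.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic "HRTF" matrix: 128 frequency bins x 72 source directions,
# built to have a few dominant components plus small noise
H = (rng.normal(size=(128, 4)) @ rng.normal(size=(4, 72))
     + 0.01 * rng.normal(size=(128, 72)))

U, s, Vt = np.linalg.svd(H, full_matrices=False)
# keep enough singular values to capture 99.9% of the energy
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
H_k = U[:, :k] * s[:k] @ Vt[:k]                 # rank-k approximation

err = np.linalg.norm(H - H_k) / np.linalg.norm(H)
print(k, err)    # a handful of singular values capture nearly all energy
```

    The real-time saving comes from the shapes: k basis filters plus k-dimensional mixing weights per source replace a full filter per source, and k is typically much smaller than the number of sources on a vibrating structure.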

  1. Navier-Stokes simulations of slender axisymmetric shapes in supersonic, turbulent flow

    NASA Astrophysics Data System (ADS)

    Moran, Kenneth J.; Beran, Philip S.

    1994-07-01

    Computational fluid dynamics is used to study flows about slender, axisymmetric bodies at very high speeds. Numerical experiments are conducted to simulate a broad range of flight conditions. Mach number is varied from 1.5 to 8 and Reynolds number is varied from 1 × 10^6/m to 1 × 10^8/m. The primary objective is to develop and validate a computational methodology for the accurate simulation of a wide variety of flow structures. Accurate results are obtained for detached bow shocks, recompression shocks, corner-point expansions, base-flow recirculations, and turbulent boundary layers. Accuracy is assessed through comparison with theory and experimental data; computed surface pressure, shock structure, base-flow structure, and velocity profiles are within measurement accuracy throughout the range of conditions tested. The methodology is both practical and general: general in its applicability, and practical in its performance. To achieve high accuracy, modifications to previously reported techniques are implemented in the scheme. These modifications improve computed results in the vicinity of symmetry lines and in the base flow region, including the turbulent wake.

  2. Image-Based Patient-Specific Ventricle Models with Fluid-Structure Interaction for Cardiac Function Assessment and Surgical Design Optimization

    PubMed Central

    Tang, Dalin; Yang, Chun; Geva, Tal; del Nido, Pedro J.

    2010-01-01

    Recent advances in medical imaging technology and computational modeling techniques are making it possible for patient-specific computational ventricle models to be constructed and used to test surgical hypotheses and replace empirical and often risky clinical experimentation to examine the efficiency and suitability of various reconstructive procedures in diseased hearts. In this paper, we provide a brief review of recent development in ventricle modeling and its potential application in surgical planning and management of tetralogy of Fallot (ToF) patients. Aspects of data acquisition, model selection and construction, tissue material properties, ventricle layer structure and tissue fiber orientations, pressure condition, model validation and virtual surgery procedures (changing patient-specific ventricle data and performing computer simulations) were reviewed. Results from a case study using patient-specific cardiac magnetic resonance (CMR) imaging and a right/left ventricle and patch (RV/LV/Patch) combination model with fluid-structure interactions (FSI) were reported. The models were used to evaluate and optimize the human pulmonary valve replacement/insertion (PVR) surgical procedure and patch design and to test a surgical hypothesis that PVR with a small patch and aggressive scar tissue trimming in PVR surgery may lead to improved recovery of RV function and reduced stress/strain conditions in the patch area. PMID:21344066

  3. Aircraft Metal Skin Repair and Honeycomb Structure Repair; Sheet Metal Work 3: 9857.02.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    The course helps students determine types of repairs, compute repair sizes, and complete the repair through surface protection. Course content includes goals, specific objectives, protection of metals, repairs to metal skin, and honeycomb structure repair. A bibliography and post-test are appended. A prerequisite for this course is mastery of the…

  4. Web-Beagle: a web server for the alignment of RNA secondary structures.

    PubMed

    Mattei, Eugenio; Pietrosanto, Marco; Ferrè, Fabrizio; Helmer-Citterich, Manuela

    2015-07-01

    Web-Beagle (http://beagle.bio.uniroma2.it) is a web server for the pairwise global or local alignment of RNA secondary structures. The server exploits a new encoding for RNA secondary structure and a substitution matrix of RNA structural elements to perform RNA structural alignments. The web server allows the user to compute up to 10 000 alignments in a single run, taking as input sets of RNA sequences and structures or primary sequences alone. In the latter case, the server computes the secondary structure prediction for the RNAs on-the-fly using RNAfold (free energy minimization). The user can also compare a set of input RNAs to one of five pre-compiled RNA datasets including lncRNAs and 3' UTRs. All types of comparison produce as output the pairwise alignments along with structural similarity and statistical significance measures for each resulting alignment. A graphical color-coded representation of the alignments allows the user to easily identify structural similarities between RNAs. Web-Beagle can be used for finding structurally related regions in two or more RNAs, for the identification of homologous regions or for functional annotation. Benchmark tests show that Web-Beagle has lower computational complexity and running time, and better performance than other available methods. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
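
    The underlying idea, global alignment over an encoding of structural elements with a substitution scheme, can be sketched with standard Needleman-Wunsch dynamic programming on dot-bracket strings. The scoring values here are toy assumptions; Web-Beagle's actual encoding and substitution matrix are more refined.

```python
import numpy as np

def align(s1, s2, gap=-2):
    """Global alignment score of two dot-bracket structures."""
    def score(a, b):
        return 2 if a == b else -1            # toy structural substitution
    n, m = len(s1), len(s2)
    F = np.zeros((n + 1, m + 1))
    F[:, 0] = gap * np.arange(n + 1)
    F[0, :] = gap * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            F[i, j] = max(F[i-1, j-1] + score(s1[i-1], s2[j-1]),
                          F[i-1, j] + gap,
                          F[i, j-1] + gap)
    return F[n, m]

print(align("((..))", "((..))"))    # identical hairpins: 6 matches -> 12.0
print(align("((..))", "((....))"))  # larger loop: 6 matches + 2 gaps -> 8.0
```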

  5. Design, fabrication and test of graphite/epoxy metering truss structure components, phase 3

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design, materials, tooling, manufacturing processes, quality control, test procedures, and results associated with the fabrication and test of graphite/epoxy metering truss structure components exhibiting a near zero coefficient of thermal expansion are described. Analytical methods were utilized, with the aid of a computer program, to define the most efficient laminate configurations in terms of thermal behavior and structural requirements. This was followed by an extensive material characterization and selection program, conducted for several graphite/graphite/hybrid laminate systems to obtain experimental data in support of the analytical predictions. Mechanical property tests as well as the coefficient of thermal expansion tests were run on each laminate under study, the results of which were used as the selection criteria for the single most promising laminate. Further coefficient of thermal expansion measurement was successfully performed on three subcomponent tubes utilizing the selected laminate.

  6. Treatment of atomic and molecular line blanketing by opacity sampling [atmospheric optics, stellar atmospheres]

    NASA Technical Reports Server (NTRS)

    Johnson, H. R.; Krupp, B. M.

    1975-01-01

    An opacity sampling (OS) technique for treating the radiative opacity of large numbers of atomic and molecular lines in cool stellar atmospheres is presented. Tests were conducted, and the results show that the structure of atmospheric models is accurately determined using 1000 frequency points, and 500 frequency points are often adequate. The effects of atomic and molecular lines are separately studied. A test model computed by using the OS method agrees very well with a model having identical atmospheric parameters computed by the giant line (opacity distribution function) method.
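
    The sampling idea can be illustrated numerically: rather than evaluating a line-blanketed opacity on a dense frequency grid, average it over a modest number of randomly sampled frequency points. The line shapes and counts below are invented for illustration only.

```python
import random

# Toy opacity sampling: compare the mean opacity over a dense frequency
# grid with the mean over 1000 randomly sampled frequency points.

def opacity(nu, lines):
    """Sum of Lorentzian line profiles at frequency nu."""
    return sum(s * g / ((nu - c) ** 2 + g ** 2) for c, s, g in lines)

random.seed(1)
lines = [(random.uniform(0, 100), random.uniform(0.5, 2.0), 0.3)
         for _ in range(200)]                       # 200 hypothetical lines

dense = [opacity(0.01 * i, lines) for i in range(10000)]   # dense grid
full_mean = sum(dense) / len(dense)

sample_nus = [random.uniform(0, 100) for _ in range(1000)]  # 1000 OS points
os_mean = sum(opacity(nu, lines) for nu in sample_nus) / 1000

print(full_mean, os_mean)   # the sampled mean approximates the dense one
```

    The sampled estimate converges statistically, which is why a fixed budget of roughly 1000 points can pin down the model structure.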

  7. Experimental study and numerical simulation on the structural and mechanical properties of Typha leaves through multimodal microscopy approaches.

    PubMed

    Liu, Jingjing; Zhang, Zhihui; Yu, Zhenglei; Liang, Yunhong; Li, Xiujuan; Ren, Luquan

    2018-01-01

    The Typha leaf, with its special multi-level structure, low density, and excellent mechanical properties, is an ideal bionic prototype for lightweight design. To further study the relationship between structure and mechanical properties, the three-dimensional macroscopic morphology of Typha leaves was characterized by micro-computed tomography (micro-CT) and the internal microstructure was observed by scanning electron microscopy (SEM). This paper combines experimental and computational research to reveal and verify the effect of the multi-level structure on the mechanical properties. A universal testing machine and a self-developed high-precision, low-load mechanical testing apparatus were used to measure the axial compression and lateral bending behavior of the leaves, respectively. Three models with different internal structures were established based on the above-mentioned three-dimensional morphologies. The results demonstrated that the partitions and diaphragms within the Typha leaf form a structure of reinforcing ribs that provides multiple load paths and resists compression and bending. Nonlinear finite element analysis in LS-DYNA further showed that the internal structure improves the models' resistance to compression and deformation. This investigation can serve as a reference for lightweight thin-walled structure design and inspire applications of bionic structural materials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Estimation of Unsteady Aerodynamic Models from Dynamic Wind Tunnel Data

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick; Klein, Vladislav

    2011-01-01

    Demanding aerodynamic modelling requirements for military and civilian aircraft have motivated researchers to improve computational and experimental techniques and to pursue closer collaboration in these areas. Model identification and validation techniques are key components for this research. This paper presents mathematical model structures and identification techniques that have been used successfully to model more general aerodynamic behaviours in single-degree-of-freedom dynamic testing. Model parameters, characterizing aerodynamic properties, are estimated using linear and nonlinear regression methods in both time and frequency domains. Steps in identification, including model structure determination, parameter estimation, and model validation, are addressed in this paper with examples using data from one-degree-of-freedom dynamic wind tunnel and water tunnel experiments. These techniques offer a methodology for expanding the utility of computational methods in application to flight dynamics, stability, and control problems. Since flight test is not always an option for early model validation, time history comparisons are commonly made between computational and experimental results, and model adequacy is inferred by corroborating results. An extension is offered to this conventional approach, where more general model parameter estimates and their standard errors are compared.
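
    The linear-regression step can be sketched as follows. The pitching-moment model, the "true" derivative values, and the data are invented for illustration; the point is estimating parameters from measurements and reporting their standard errors, as the paper's extension compares.

```python
import numpy as np

# Sketch: estimate aerodynamic derivatives by ordinary least squares
# from simulated measurements, then compute parameter standard errors.

rng = np.random.default_rng(0)
n = 200
alpha = rng.uniform(-0.1, 0.1, n)        # angle of attack (rad)
q = rng.uniform(-0.05, 0.05, n)          # pitch-rate term
# Hypothetical model: Cm = Cm0 + Cm_alpha*alpha + Cm_q*q
true = np.array([0.02, -0.8, -4.0])
X = np.column_stack([np.ones(n), alpha, q])
Cm = X @ true + rng.normal(0, 1e-3, n)   # measurements with noise

theta, *_ = np.linalg.lstsq(X, Cm, rcond=None)
resid = Cm - X @ theta
sigma2 = resid @ resid / (n - X.shape[1])      # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)          # parameter covariance
stderr = np.sqrt(np.diag(cov))
print(theta, stderr)
```

    Comparing estimates and standard errors across candidate model structures is one simple way to judge model adequacy.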

  9. Computational analysis of human and mouse CREB3L4 Protein

    PubMed Central

    Velpula, Kiran Kumar; Rehman, Azeem Abdul; Chigurupati, Soumya; Sanam, Ramadevi; Inampudi, Krishna Kishore; Akila, Chandra Sekhar

    2012-01-01

    CREB3L4 is a member of the CREB/ATF transcription factor family, characterized by regulation of gene expression through the cAMP-responsive element. Previous studies identified this protein in mice and humans. Whereas CREB3L4 in mice (referred to as Tisp40) is found in the testes and functions in spermatogenesis, human CREB3L4 is primarily detected in the prostate and has been implicated in cancer. We conducted computational analyses to compare the structural homology between murine Tisp40α and human CREB3L4. Our results reveal that the primary and secondary structures of the two proteins show high similarity. Additionally, the predicted helical transmembrane structures suggest that the proteins likely have similar structure and function. This study offers preliminary findings that support the translation of mouse Tisp40α findings into human models, based on structural homology. PMID:22829733

  10. NASA/FAA general aviation crash dynamics program - An update

    NASA Technical Reports Server (NTRS)

    Hayduk, R. J.; Thomson, R. G.; Carden, H. D.

    1979-01-01

    Work in progress in the NASA/FAA General Aviation Crash Dynamics Program for the development of technology for increased crash-worthiness and occupant survivability of general aviation aircraft is presented. Full-scale crash testing facilities and procedures are outlined, and a chronological summary of full-scale tests conducted and planned is presented. The Plastic and Large Deflection Analysis of Nonlinear Structures and Modified Seat Occupant Model for Light Aircraft computer programs which form part of the effort to predict nonlinear geometric and material behavior of sheet-stringer aircraft structures subjected to large deformations are described, and excellent agreement between simulations and experiments is noted. The development of structural concepts to attenuate the load transmitted to the passenger through the seats and subfloor structure is discussed, and an apparatus built to test emergency locator transmitters in a realistic environment is presented.

  11. APINetworks: A general API for the treatment of complex networks in arbitrary computational environments

    NASA Astrophysics Data System (ADS)

    Niño, Alfonso; Muñoz-Caro, Camelia; Reyes, Sebastián

    2015-11-01

    The last decade witnessed a great development of the structural and dynamic study of complex systems described as networks of elements. Such systems can be described as a set of, possibly, heterogeneous entities or agents (the network nodes) interacting in, possibly, different ways (defining the network edges). In this context, it is of practical interest to model and handle not only static and homogeneous networks but also dynamic, heterogeneous ones. Depending on the size and type of the problem, these networks may require different computational approaches involving sequential, parallel or distributed systems, with or without the use of disk-based data structures. In this work, we develop an Application Programming Interface (APINetworks) for the modeling and treatment of general networks in arbitrary computational environments. To minimize dependency between components, we decouple the network structure from its function using different packages for grouping sets of related tasks. The structural package, the one in charge of building and handling the network structure, is the core element of the system. In this work, we focus on this structural component of the API. We apply an object-oriented approach that makes use of inheritance and polymorphism. In this way, we can model static and dynamic networks with heterogeneous elements in the nodes and heterogeneous interactions in the edges. In addition, this approach permits a unified treatment of different computational environments. Tests performed on a C++11 version of the structural package show that, on current standard computers, the system can handle, in main memory, directed and undirected linear networks formed by tens of millions of nodes and edges. Our results compare favorably to those of existing tools.
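
    The inheritance/polymorphism idea, heterogeneous node types coexisting in one network behind a common interface, can be sketched as follows. APINetworks itself is C++11; the class names here are illustrative, not the actual API.

```python
# Sketch of the structural idea: decouple network structure from
# function, and use inheritance so heterogeneous node types coexist.

class Node:
    def __init__(self, nid):
        self.nid = nid

    def describe(self):
        return f"node {self.nid}"

class AgentNode(Node):                   # a heterogeneous node type
    def __init__(self, nid, strategy):
        super().__init__(nid)
        self.strategy = strategy

    def describe(self):                  # polymorphic behavior
        return f"agent {self.nid} ({self.strategy})"

class Network:
    def __init__(self, directed=False):
        self.directed = directed
        self.nodes = {}
        self.adj = {}

    def add_node(self, node):
        self.nodes[node.nid] = node
        self.adj.setdefault(node.nid, set())

    def add_edge(self, a, b):
        self.adj[a].add(b)
        if not self.directed:
            self.adj[b].add(a)

    def degree(self, nid):
        return len(self.adj[nid])

net = Network()
net.add_node(Node(0))
net.add_node(AgentNode(1, "cooperate"))
net.add_edge(0, 1)
print([net.nodes[i].describe() for i in net.nodes], net.degree(0))
```

    Because the network container only depends on the base-class interface, the same structure code serves homogeneous and heterogeneous networks alike.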

  12. Alzheimer's and Dementia Testing for Earlier Diagnosis

    MedlinePlus

    ... focused on early detection of Alzheimer's disease. Imaging technologies used in Alzheimer's research Structural imaging provides information ... chemical changes linked to specific diseases. Molecular imaging technologies include PET, fMRI and single photon emission computed ...

  13. Wing Shape Sensing from Measured Strain

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi

    2015-01-01

    A new two-step theory is investigated for predicting the deflection and slope of an entire structure using strain measurements at discrete locations. In the first step, the measured strain is fitted using a piecewise least squares curve fitting method together with the cubic spline technique. These fitted strains are integrated twice to obtain deflection data along the fibers. In the second step, the computed deflections along the fibers are combined with a finite element model of the structure in order to extrapolate the deflection and slope of the entire structure through the use of the System Equivalent Reduction and Expansion Process. The theory is first validated on a computational model, a cantilevered rectangular wing. It is then applied to test data from a cantilevered swept wing model.
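
    The first step, double integration of fitted strain, can be sketched for an idealized cantilever. A tip-loaded cantilever has linearly varying surface strain, so the "measured" strain profile and scale below are invented; curvature is taken equal to strain (unit section depth) purely for illustration.

```python
import numpy as np

# Sketch: integrate bending strain twice (trapezoidal rule) along a
# cantilever to recover deflection, with clamped-root conditions.

L, c = 1.0, 1e-3                      # span and a strain scale (made up)
x = np.linspace(0.0, L, 101)
strain = c * (L - x)                  # "measured" strain along the fiber
kappa = strain                        # curvature ~ strain / depth; depth = 1

# first integration: slope; second integration: deflection
slope = np.concatenate(
    [[0.0], np.cumsum((kappa[1:] + kappa[:-1]) / 2 * np.diff(x))])
defl = np.concatenate(
    [[0.0], np.cumsum((slope[1:] + slope[:-1]) / 2 * np.diff(x))])

# analytic result for linearly varying curvature:
#   w(x) = c * (L*x^2/2 - x^3/6)
exact = c * (L * x**2 / 2 - x**3 / 6)
print(abs(defl - exact).max())        # small discretization error
```

    In the method itself, the fitted (spline) strain replaces this analytic profile, and the second step expands the fiber deflections to the full structure via the finite element model.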

  14. Edge detection based on computational ghost imaging with structured illuminations

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Xiang, Dong; Liu, Xuemei; Zhou, Xin; Bing, Pibin

    2018-03-01

    Edge detection is one of the most important tools for recognizing the features of an object. In this paper, we propose an optical edge detection method based on computational ghost imaging (CGI) with structured illuminations generated by an interference system. The structured intensity patterns are designed so that the edge of an object is imaged directly from the detected data in CGI. This edge detection method can extract the boundaries of both binary and grayscale objects in any direction at one time. We also numerically test the influence of distance deviations in the interference system on edge extraction, i.e., the tolerance of the optical edge detection system to distance deviation. Hopefully, it may provide a guideline for scholars building an experimental system.
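
    For context, conventional CGI reconstruction can be sketched as follows: correlate known random illumination patterns with single-pixel "bucket" measurements. The paper's structured (interference-generated) patterns yield the edge directly; here a tiny binary object is reconstructed the standard way and an edge map is taken afterwards by finite differences, purely for illustration. Object, pattern count, and sizes are invented.

```python
import numpy as np

# Sketch of conventional computational ghost imaging on an 8x8 object.

rng = np.random.default_rng(3)
obj = np.zeros((8, 8))
obj[2:6, 2:6] = 1.0                          # toy binary object

M = 4000
patterns = rng.random((M, 8, 8))             # known illumination patterns
bucket = (patterns * obj).sum(axis=(1, 2))   # single-pixel measurements

# correlation reconstruction: G = <B*I> - <B><I>
G = (bucket[:, None, None] * patterns).mean(axis=0) \
    - bucket.mean() * patterns.mean(axis=0)
img = (G - G.min()) / (G.max() - G.min())    # normalize to [0, 1]

edges = np.abs(np.diff(img, axis=1))         # crude horizontal edge map
print(img.round(1))
```

    With designed structured patterns, as in the paper, the correlation itself produces the edge map, avoiding the post-processing differentiation step.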

  15. High-resolution subject-specific mitral valve imaging and modeling: experimental and computational methods.

    PubMed

    Toma, Milan; Bloodworth, Charles H; Einstein, Daniel R; Pierce, Eric L; Cochran, Richard P; Yoganathan, Ajit P; Kunzelman, Karyn S

    2016-12-01

    The diversity of mitral valve (MV) geometries and multitude of surgical options for correction of MV diseases necessitates the use of computational modeling. Numerical simulations of the MV would allow surgeons and engineers to evaluate repairs, devices, procedures, and concepts before performing them and before moving on to more costly testing modalities. Constructing, tuning, and validating these models rely upon extensive in vitro characterization of valve structure, function, and response to change due to diseases. Micro-computed tomography (μCT) allows for unmatched spatial resolution for soft tissue imaging. However, it is still technically challenging to obtain an accurate geometry of the diastolic MV. We discuss here the development of a novel technique for treating MV specimens with glutaraldehyde fixative in order to minimize geometric distortions in preparation for μCT scanning. The technique provides a resulting MV geometry which is significantly more detailed in chordal structure, accurate in leaflet shape, and closer to its physiological diastolic geometry. In this paper, computational fluid-structure interaction (FSI) simulations are used to show the importance of more detailed subject-specific MV geometry with 3D chordal structure to simulate a proper closure validated against μCT images of the closed valve. Two computational models, before and after use of the aforementioned technique, are used to simulate closure of the MV.

  16. Space shuttle low cost/risk avionics study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    All work breakdown structure elements containing any avionics related effort were examined for pricing the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipments to MIL quality standards, basing redundancy on cost effective analysis, minimizing software complexity and reducing cross strapping and computer-managed functions, utilizing compilers and floating point computers, and evolving the design as dictated by the horizontal flight test schedules.

  17. Computer-aided design of polymers and composites

    NASA Technical Reports Server (NTRS)

    Kaelble, D. H.

    1985-01-01

    This book on computer-aided design of polymers and composites introduces and discusses the subject from the viewpoint of atomic and molecular models. Thus, the origins of stiffness, strength, extensibility, and fracture toughness in composite materials can be analyzed directly in terms of chemical composition and molecular structure. Aspects of polymer composite reliability are considered along with characterization techniques for composite reliability, relations between atomic and molecular properties, computer aided design and manufacture, polymer CAD/CAM models, and composite CAD/CAM models. Attention is given to multiphase structural adhesives, fibrous composite reliability, metal joint reliability, polymer physical states and transitions, chemical quality assurance, processability testing, cure monitoring and management, nondestructive evaluation (NDE), surface NDE, elementary properties, ionic-covalent bonding, molecular analysis, acid-base interactions, the manufacturing science, and peel mechanics.

  18. Development of a Fluid Structures Interaction Test Technique for Fabrics

    NASA Technical Reports Server (NTRS)

    Zilliac, Gregory G.; Heineck, James T.; Schairer, Edward T.; Mosher, Robert N.; Garbeff, Theodore Joseph

    2012-01-01

    Application of fluid structures interaction (FSI) computational techniques to configurations of interest to the entry, descent and landing (EDL) community is limited by two factors: limited characterization of the material properties of fabrics of interest, and insufficient experimental data to validate the FSI codes. Recently ILC Dover Inc. performed standard tests to characterize the static stress-strain response of four candidate fabrics for use in EDL applications. The objective of the tests described here is to address the need for an FSI dataset for CFD validation purposes. To reach this objective, the structural response of fabrics was measured in a very simple aerodynamic environment with well controlled boundary conditions. Two test series were undertaken. The first series covered a range of tunnel conditions, and the second focused on conditions that resulted in fabric panel buckling.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimme, Stefan, E-mail: grimme@thch.uni-bonn.de; Brandenburg, Jan Gerit; Bannwarth, Christoph

    A density functional theory (DFT) based composite electronic structure approach is proposed to efficiently compute structures and interaction energies in large chemical systems. It is based on the well-known and numerically robust Perdew-Burke-Ernzerhof (PBE) generalized gradient approximation in a modified global hybrid functional with a relatively large amount of non-local Fock-exchange. The orbitals are expanded in Ahlrichs-type valence-double-zeta atomic orbital (AO) Gaussian basis sets, which are available for many elements. In order to correct for the basis set superposition error (BSSE) and to account for the important long-range London dispersion effects, our well-established atom-pairwise potentials are used. In the design of the new method, particular attention has been paid to an accurate description of structural parameters in various covalent and non-covalent bonding situations as well as in periodic systems. Together with the recently proposed three-fold corrected (3c) Hartree-Fock method, the new composite scheme (termed PBEh-3c) represents the next member in a hierarchy of “low-cost” electronic structure approaches. They are mainly free of BSSE and account for most interactions in a physically sound and asymptotically correct manner. PBEh-3c yields good results for thermochemical properties in the huge GMTKN30 energy database. Furthermore, the method shows excellent performance for non-covalent interaction energies in small and large complexes. For evaluating its performance on equilibrium structures, a new compilation of standard test sets is suggested. These consist of small (light) molecules, partially flexible, medium-sized organic molecules, molecules comprising heavy main group elements, larger systems with long bonds, 3d-transition metal systems, non-covalently bound complexes (S22 and S66×8 sets), and peptide conformations. For these sets, overall deviations from accurate reference data are smaller than for various other tested DFT methods and reach the triple-zeta AO basis set second-order perturbation theory (MP2/TZ) level at a tiny fraction of the computational effort. Periodic calculations conducted for molecular crystals to test structures (including cell volumes) and sublimation enthalpies indicate very good accuracy, competitive with computationally more involved plane-wave based calculations. PBEh-3c can be applied routinely to several hundreds of atoms on a single processor, and it is suggested as a robust “high-speed” computational tool in theoretical chemistry and physics.

  20. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
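
    The calibration idea behind such an emulator can be sketched simply: fit a low-order cost model once from a few benchmark runs, then predict resources for a new problem at negligible expense. The cost-model form (a banded-solver-like N·b² term plus a linear term) and all calibration numbers below are invented for illustration; they are not SCOPE's actual model.

```python
import numpy as np

# Sketch: calibrate a CPU-cost model from benchmark runs, then predict.

# calibration runs: (number of equations N, mean bandwidth b, CPU seconds)
runs = np.array([
    [100.0, 10.0,  0.8],
    [200.0, 12.0,  2.1],
    [400.0, 15.0,  5.9],
    [800.0, 20.0, 16.5],
])
N, b, t = runs[:, 0], runs[:, 1], runs[:, 2]

# assumed cost form: t ~ c1*N*b^2 (factorization) + c2*N (overhead)
A = np.column_stack([N * b**2, N])
c, *_ = np.linalg.lstsq(A, t, rcond=None)

def predict_cpu(N_new, b_new):
    return c[0] * N_new * b_new**2 + c[1] * N_new

print(predict_cpu(1600, 25))   # forecast for a larger problem
```

    As the abstract notes, the accuracy of such predictions hinges entirely on the quality of the one-time calibration data for the hardware and analysis code of interest.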

  1. A Flexible and Non-intrusive Approach for Computing Complex Structural Coverage Metrics

    NASA Technical Reports Server (NTRS)

    Whalen, Michael W.; Person, Suzette J.; Rungta, Neha; Staats, Matt; Grijincu, Daniela

    2015-01-01

    Software analysis tools and techniques often leverage structural code coverage information to reason about the dynamic behavior of software. Existing techniques instrument the code with the required structural obligations and then monitor the execution of the compiled code to report coverage. Instrumentation-based approaches often incur considerable runtime overhead for complex structural coverage metrics such as Modified Condition/Decision Coverage (MC/DC). Code instrumentation, in general, has to be approached with great care to ensure it does not modify the behavior of the original code. Furthermore, instrumented code cannot be used in conjunction with other analyses that reason about the structure and semantics of the code under test. In this work, we introduce a non-intrusive preprocessing approach for computing structural coverage information. It uses a static partial evaluation of the decisions in the source code and a source-to-bytecode mapping to generate the information necessary to efficiently track structural coverage metrics during execution. Our technique is flexible; the results of the preprocessing can be used by a variety of coverage-driven software analysis tasks, including automated analyses that are not possible for instrumented code. Experimental results in the context of symbolic execution show the efficiency and flexibility of our non-intrusive approach for computing code coverage information.
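
    To make the MC/DC obligation concrete: for each condition in a decision, one must find a pair of test vectors that differ only in that condition and flip the decision's outcome. A brute-force sketch over a small example decision (not the paper's partial-evaluation technique, which avoids this enumeration):

```python
from itertools import product

# Sketch: enumerate MC/DC "independence pairs" for each condition
# of a small boolean decision by truth-table search.

def mcdc_pairs(decision, n_conds):
    """Per condition, the truth-vector pairs that differ only in that
    condition and change the decision outcome."""
    pairs = {i: [] for i in range(n_conds)}
    for v in product([False, True], repeat=n_conds):
        for i in range(n_conds):
            w = list(v)
            w[i] = not w[i]
            w = tuple(w)
            # count each unordered pair once
            if v < w and decision(*v) != decision(*w):
                pairs[i].append((v, w))
    return pairs

decision = lambda a, b, c: a and (b or c)   # example decision
pairs = mcdc_pairs(decision, 3)
for i, p in pairs.items():
    print(f"condition {i}: {p}")
```

    A test suite achieves MC/DC on this decision if, for every condition, it contains at least one such pair; tracking which pairs have been exercised at runtime is exactly what makes MC/DC costly for instrumentation-based tools.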

  2. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used across industries for structural parts because of their high strength-to-weight ratio, good fatigue life, corrosion resistance, and material property tailorability. To fully exploit the capability of composites, the load-carrying capacity of parts made from them must be known. Unlike metals, composites are orthotropic and fail in complex ways under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely heavily on coupon- and component-level testing to estimate the design space. Because of the complex failure mechanisms, composite materials require a very large number of coupon-level tests to fully characterize their behavior, making the testing process time consuming and costly. The alternative is to use virtual testing tools that can predict the complex failure mechanisms accurately, reducing costs to the associated computational expense. Some of the most desired features in a virtual testing tool are: (1) accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed on the mechanisms it can capture; (2) computational efficiency: the greatest advantages of virtual tools are the savings in time and money, so computational efficiency is essential; (3) applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic, and fatigue, and a good virtual testing tool should make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool that can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing simulations against experiments for a selected number of quasi-static loading cases.

  3. Vehicle interior interactions and kinematics of rear facing child restraints in frontal crashes.

    PubMed

    Sherwood, C P; Gopalan, S; Abdelilah, Y; Marshall, R J; Crandall, J R

    2005-01-01

    The performance of rear facing child restraints in frontal crashes can be determined by controlling a) the child's kinematics and b) interactions with vehicle structures. Twelve sled tests were performed to analyze the effect of the location and structural properties of vehicle interior components. The role of restraint kinematics was studied by developing computational models which underwent idealized motions. Stiff structures originally offset from the restraint, but which contact the restraint late in the test, cause increased injury values. Attachment methods which reduce child restraint rotation and more rigidly couple the restraint to the vehicle result in the best safety performance.

  4. Vehicle Interior Interactions and Kinematics of Rear Facing Child Restraints in Frontal Crashes

    PubMed Central

    Sherwood, C. P.; Gopalan, S.; Abdelilah, Y.; Marshall, R. J.; Crandall, J. R.

    2005-01-01

    The performance of rear facing child restraints in frontal crashes can be determined by controlling a) the child’s kinematics and b) interactions with vehicle structures. Twelve sled tests were performed to analyze the effect of the location and structural properties of vehicle interior components. The role of restraint kinematics was studied by developing computational models which underwent idealized motions. Stiff structures originally offset from the restraint, but which contact the restraint late in the test, cause increased injury values. Attachment methods which reduce child restraint rotation and more rigidly couple the restraint to the vehicle result in the best safety performance. PMID:16179150

  5. Method and apparatus for thermographically and quantitatively analyzing a structure for disbonds and/or inclusions

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Inventor); Winfree, William P. (Inventor); Cramer, K. Elliott (Inventor); Zalamedia, Joseph N. (Inventor)

    1996-01-01

    A heat source such as a magnetic induction/eddy current generator remotely heats a region of a surface of a test structure to a desired depth. For example, the frequency of the heating source can be varied to heat to the desired depth. A thermal sensor senses temperature changes in the heated region as a function of time. A computer compares these sensed temperature changes with calibration standards of a similar sample having known disbond and/or inclusion geography(ies) to analyze the test structure. A plurality of sensors can be arranged linearly to sense vector heat flow.

  6. Parallel deterministic transport sweeps of structured and unstructured meshes with overloaded mesh decompositions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pautz, Shawn D.; Bailey, Teresa S.

    Here, the efficiency of discrete ordinates transport sweeps depends on the scheduling algorithm, the domain decomposition, the problem to be solved, and the computational platform. Sweep scheduling algorithms may be categorized by their approach to several issues. In this paper we examine the strategy of domain overloading for mesh partitioning as one of the components of such algorithms. In particular, we extend the domain overloading strategy, previously defined and analyzed for structured meshes, to the general case of unstructured meshes. We also present computational results for both the structured and unstructured domain overloading cases. We find that an appropriate amount of domain overloading can greatly improve the efficiency of parallel sweeps for both structured and unstructured partitionings of the test problems examined on up to 10^5 processor cores.

  7. Parallel deterministic transport sweeps of structured and unstructured meshes with overloaded mesh decompositions

    DOE PAGES

    Pautz, Shawn D.; Bailey, Teresa S.

    2016-11-29

    Here, the efficiency of discrete ordinates transport sweeps depends on the scheduling algorithm, the domain decomposition, the problem to be solved, and the computational platform. Sweep scheduling algorithms may be categorized by their approach to several issues. In this paper we examine the strategy of domain overloading for mesh partitioning as one of the components of such algorithms. In particular, we extend the domain overloading strategy, previously defined and analyzed for structured meshes, to the general case of unstructured meshes. We also present computational results for both the structured and unstructured domain overloading cases. We find that an appropriate amount of domain overloading can greatly improve the efficiency of parallel sweeps for both structured and unstructured partitionings of the test problems examined on up to 10^5 processor cores.

  8. Measuring the Impact of Haptic Feedback Using the SOLO Taxonomy

    ERIC Educational Resources Information Center

    Minogue, James; Jones, Gail

    2009-01-01

    The application of Biggs' and Collis' Structure of Observed Learning Outcomes taxonomy in the evaluation of student learning about cell membrane transport via a computer-based learning environment is described in this study. Pre-test-post-test comparisons of student outcome data (n = 80) were made across two groups of randomly assigned students:…

  9. Structural Anomalies Detected in Ceramic Matrix Composites Using Combined Nondestructive Evaluation and Finite Element Analysis (NDE and FEA)

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Bhatt, Ramakrishna T.

    2003-01-01

    Most reverse engineering approaches involve imaging or digitizing an object and then creating a computerized reconstruction that can be integrated, in three dimensions, into a particular design environment. The rapid prototyping technique builds high-quality physical prototypes directly from computer-aided design files. This fundamental technique for interpreting and interacting with large data sets is being used here via Velocity2 (an integrated image-processing software package, ref. 1) using computed tomography (CT) data to produce a prototype three-dimensional test specimen model for analyses. A study at the NASA Glenn Research Center proposes to use these capabilities to conduct a combined nondestructive evaluation (NDE) and finite element analysis (FEA) to screen pretest and posttest structural anomalies in structural components. A tensile specimen made of silicon nitride (Si3N4) ceramic matrix composite was considered to evaluate structural durability and deformity. Ceramic matrix composites are being sought as candidate materials to replace nickel-base superalloys for turbine engine applications. They have the unique characteristics of being able to withstand higher operating temperatures and harsh combustion environments. In addition, their low densities relative to metals help reduce component mass (ref. 2). Detailed three-dimensional volume rendering of the tensile test specimen was successfully carried out with Velocity2 (ref. 1) using two-dimensional images that were generated via computed tomography. Subsequent three-dimensional finite element analyses were performed, and the results obtained were compared with those predicted by NDE-based calculations and experimental tests. It was shown that Velocity2 software can be used to render a three-dimensional object from a series of CT scan images with a minimum level of complexity. The analytical results (ref. 3) show that the high-stress regions correlated well with the damage sites identified by the CT scans and the experimental data. Furthermore, modeling of the voids collected via NDE offered an analytical advantage that resulted in more accurate assessments of the material's structural strength. The top figure shows a CT scan image of the specimen test section illustrating various hidden structural entities in the material and an optical image of the test specimen considered in this study. The bottom figure represents the stress response predicted from the finite element analyses (ref. 3) for a selected CT slice, where it clearly illustrates the correspondence of the high stress risers due to voids in the material with those predicted by the NDE. This study is continuing, and efforts are concentrated on improving the modeling capabilities to imitate the structural anomalies as detected.

  10. Fast Bayesian approach for modal identification using free vibration data, Part I - Most probable value

    NASA Astrophysics Data System (ADS)

    Zhang, Feng-Liang; Ni, Yan-Chun; Au, Siu-Kui; Lam, Heung-Fai

    2016-03-01

    The identification of modal properties from field testing of civil engineering structures is becoming economically viable, thanks to the advent of modern sensor and data acquisition technology. Its demand is driven by innovative structural designs and increased performance requirements of dynamic-prone structures that call for a close cross-checking or monitoring of their dynamic properties and responses. Existing instrumentation capabilities and modal identification techniques allow structures to be tested under free vibration, forced vibration (known input) or ambient vibration (unknown broadband loading). These tests can be considered complementary rather than competing as they are based on different modeling assumptions in the identification model and have different implications on costs and benefits. Uncertainty arises naturally in the dynamic testing of structures due to measurement noise, sensor alignment error, modeling error, etc. This is especially relevant in field vibration tests because the test condition in the field environment can hardly be controlled. In this work, a Bayesian statistical approach is developed for modal identification using the free vibration response of structures. A frequency domain formulation is proposed that makes statistical inference based on the Fast Fourier Transform (FFT) of the data in a selected frequency band. This significantly simplifies the identification model because only the modes dominating the frequency band need to be included. It also legitimately ignores the information in the excluded frequency bands that are either irrelevant or difficult to model, thereby significantly reducing modeling error risk. The posterior probability density function (PDF) of the modal parameters is derived rigorously from modeling assumptions and Bayesian probability logic. 
Computational difficulties associated with calculating the posterior statistics, including the most probable value (MPV) and the posterior covariance matrix, are addressed. Fast computational algorithms for determining the MPV are proposed so that the method can be practically implemented. In the companion paper (Part II), analytical formulae are derived for the posterior covariance matrix so that it can be evaluated without resorting to the finite difference method. The proposed method is verified using synthetic data. It is also applied to modal identification of full-scale field structures.
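    As a rough illustration of the underlying idea (not the authors' frequency-domain formulation), the most probable value of the modal parameters under Gaussian noise and a flat prior coincides with the least-squares fit of a damped-cosine model to the free-decay data. The sketch below uses a brute-force grid search over hypothetical frequency and damping ranges:

```python
import numpy as np

# --- synthetic free-decay data from a single mode (assumed ground truth) ---
fs, T = 100.0, 20.0                      # sampling rate [Hz], duration [s]
t = np.arange(0, T, 1/fs)
f_true, zeta_true = 2.0, 0.01            # natural frequency [Hz], damping ratio
wn = 2*np.pi*f_true
wd = wn*np.sqrt(1 - zeta_true**2)
x = np.exp(-zeta_true*wn*t)*np.cos(wd*t) \
    + 0.02*np.random.default_rng(0).standard_normal(t.size)

def misfit(f, zeta):
    """Least-squares residual of a damped-cosine model fitted to the data."""
    w = 2*np.pi*f
    env = np.exp(-zeta*w*t)
    wdm = w*np.sqrt(1 - zeta**2)
    G = np.column_stack([env*np.cos(wdm*t), env*np.sin(wdm*t)])  # regressors
    coef, *_ = np.linalg.lstsq(G, x, rcond=None)
    return np.sum((x - G @ coef)**2)

# grid search for the most probable value (MPV) over a narrow band
fgrid = np.linspace(1.9, 2.1, 41)
zgrid = np.linspace(0.001, 0.05, 50)
F, Z = min(((f, z) for f in fgrid for z in zgrid), key=lambda p: misfit(*p))
print(F, Z)   # close to the true (2.0, 0.01)
```

    In the paper the inference is instead carried out on the FFT of the data in a selected frequency band, with fast algorithms replacing the brute-force search used here.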

  11. Dynamic analysis for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques, in which experimental verification is performed by vibrating only spacecraft components and deducing the modes and frequencies of the complete vehicle from the results obtained in the component tests.

  12. Congruence Between Pulmonary Function and Computed Tomography Imaging Assessment of Cystic Fibrosis Severity.

    PubMed

    Rybacka, Anna; Goździk-Spychalska, Joanna; Rybacki, Adam; Piorunek, Tomasz; Batura-Gabryel, Halina; Karmelita-Katulska, Katarzyna

    2018-05-04

    In cystic fibrosis, pulmonary function tests (PFTs) and computed tomography are used to assess lung function and structure, respectively. Although both techniques of assessment are congruent, there are lingering doubts about which PFT variables show the best congruence with computed tomography scoring. In this study we addressed the issue by reinvestigating the association between PFT variables and the score of changes seen in computed tomography scans in patients with cystic fibrosis with and without pulmonary exacerbation. This retrospective study comprised 40 patients in whom PFTs and computed tomography were performed no longer than 3 weeks apart. Images (inspiratory: 0.625 mm slice thickness, 0.625 mm interval; expiratory: 1.250 mm slice thickness, 10 mm interval) were evaluated with the Bhalla scoring system. The most frequent structural abnormalities found in the scans were bronchiectasis and peribronchial thickening. The strongest relationship was found between the Bhalla score and forced expiratory volume in 1 s (FEV1). The Bhalla score also was related to forced vital capacity (FVC), FEV1/FVC ratio, residual volume (RV), and RV/total lung capacity (TLC) ratio. We conclude that lung structural data obtained from the computed tomography examination are highly congruent with lung function data. Thus, computed tomography imaging may supersede functional assessment in cases of poor compliance with spirometry procedures in the elderly or children. Computed tomography also seems more sensitive than PFTs in the assessment of cystic fibrosis progression. Moreover, in early phases of cystic fibrosis, computed tomography, due to its excellent resolution, may be irreplaceable in monitoring pulmonary damage.

  13. Langley Ground Facilities and Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.

    2010-01-01

    A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of the previously completed and ongoing studies at the Agency and National levels. This integrated approach takes into consideration the overall decline in test business base within the nation and reduced utilization in each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for capability needs to meet the Agency programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers agility to right-size our capability and capacity from a national perspective, to accommodate the dynamic nature of the testing needs, and will address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate and high-quality test results at an affordable cost to support design information needs for flight regimes where the computational capability is not adequate and to verify and validate the existing and emerging computational tools. Each of the above goals is planned to be achieved, keeping in mind the increasing small-industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.

  14. DockTrina: docking triangular protein trimers.

    PubMed

    Popov, Petr; Ritchie, David W; Grudinin, Sergei

    2014-01-01

    In spite of the abundance of oligomeric proteins within a cell, the structural characterization of protein-protein interactions is still a challenging task. In particular, many of these interactions involve heteromeric complexes, which are relatively difficult to determine experimentally. Hence there is growing interest in using computational techniques to model such complexes. However, assembling large heteromeric complexes computationally is a highly combinatorial problem. Nonetheless the problem can be simplified greatly by considering interactions between protein trimers. After dimers and monomers, triangular trimers (i.e. trimers with pair-wise contacts between all three pairs of proteins) are the most frequently observed quaternary structural motifs according to the three-dimensional (3D) complex database. This article presents DockTrina, a novel protein docking method for modeling the 3D structures of nonsymmetrical triangular trimers. The method takes as input pair-wise contact predictions from a rigid body docking program. It then scans and scores all possible combinations of pairs of monomers using a very fast root mean square deviation test. Finally, it ranks the predictions using a scoring function which combines triples of pair-wise contact terms and a geometric clash penalty term. The overall approach takes less than 2 min per complex on a modern desktop computer. The method is tested and validated using a benchmark set of 220 bound and seven unbound protein trimer structures. DockTrina will be made available at http://nano-d.inrialpes.fr/software/docktrina. Copyright © 2013 Wiley Periodicals, Inc.
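    The core consistency check can be sketched as follows: if pairwise docking proposes rigid-body transforms A-to-B, B-to-C and C-to-A, a triangular trimer is geometrically consistent only when the composed loop transform is close to the identity, which can be scored with a fast RMSD test. The transforms below are toy examples assumed for illustration, not DockTrina's actual scoring function:

```python
import numpy as np

def rz(deg):
    """4x4 homogeneous rotation about the z axis."""
    a = np.deg2rad(deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    return T

def loop_rmsd(T_ab, T_bc, T_ca, pts):
    """RMSD between pts and pts mapped through the closed A->B->C->A loop."""
    loop = T_ca @ T_bc @ T_ab
    P = np.c_[pts, np.ones(len(pts))]          # homogeneous coordinates
    Q = (loop @ P.T).T[:, :3]
    return np.sqrt(np.mean(np.sum((pts - Q)**2, axis=1)))

rng = np.random.default_rng(1)
pts = rng.standard_normal((50, 3)) + [5.0, 0.0, 0.0]   # "atoms" of monomer A, off-axis

good = loop_rmsd(rz(120), rz(120), rz(120), pts)   # consistent C3 trimer: loop = identity
bad  = loop_rmsd(rz(120), rz(135), rz(120), pts)   # inconsistent pairwise poses
print(good, bad)   # good is ~0, bad is large
```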

  15. Estimation of the vortex length scale and intensity from two-dimensional samples

    NASA Technical Reports Server (NTRS)

    Reuss, D. L.; Cheng, W. P.

    1992-01-01

    A method is proposed for estimating flow features that influence flame wrinkling in reciprocating internal combustion engines, where traditional statistical measures of turbulence are suspect. Candidate methods were tested in a computed channel flow where traditional turbulence measures are valid and performance can be rationally evaluated. Two concepts are tested. First, spatial filtering is applied to the two-dimensional velocity distribution and found to reveal structures corresponding to the vorticity field. Decreasing the spatial-frequency cutoff of the filter locally changes the character and size of the flow structures that are revealed by the filter. Second, vortex length scale and intensity are estimated by computing the ensemble-average velocity distribution conditionally sampled on the vorticity peaks. The resulting conditionally sampled 'average vortex' has a peak velocity less than half the rms velocity and a size approximately equal to the two-point-correlation integral-length scale.
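    A minimal sketch of the two concepts, on a synthetic two-dimensional velocity sample with hypothetical parameters: compute the vorticity field by finite differences, then apply a low-pass spatial filter and locate the vorticity peak that would seed the conditional sampling:

```python
import numpy as np

# synthetic 2-D velocity sample: one Gaussian vortex plus measurement noise
n = 64
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x, indexing='xy')
r2 = X**2 + Y**2
u = -Y*np.exp(-r2/0.1)        # swirling velocity with Gaussian envelope
v =  X*np.exp(-r2/0.1)
rng = np.random.default_rng(2)
u += 0.05*rng.standard_normal(u.shape)
v += 0.05*rng.standard_normal(v.shape)

# vorticity w = dv/dx - du/dy by central differences
dx = x[1] - x[0]
w = np.gradient(v, dx, axis=1) - np.gradient(u, dx, axis=0)

def lowpass(f, keep=8):
    """Low-pass spatial filter: zero out high spatial frequencies in the FFT."""
    F = np.fft.fft2(f)
    k = np.abs(np.fft.fftfreq(n)*n)
    mask = (k[None, :] <= keep) & (k[:, None] <= keep)
    return np.real(np.fft.ifft2(F*mask))

w_f = lowpass(w)
iy, ix = np.unravel_index(np.argmax(np.abs(w_f)), w_f.shape)
print(x[ix], x[iy])   # filtered vorticity peak lies near the vortex centre (0, 0)
```

    Decreasing `keep` plays the role of lowering the spatial-frequency cutoff discussed above: it smooths the noisy vorticity field so that coherent structures stand out.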

  16. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

    This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce of C[l]=-1.377 in comparison to C[l]=-1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag of C[d]=0.115 in comparison to C[d]=0.143 for the original wing.
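    The iterative CFD/FE coupling scheme can be illustrated on the simplest static aeroelastic model, a rigid aerofoil on a torsional spring, with all parameters hypothetical: the aerodynamic load is recomputed for the current deformed shape and the structural response is updated until the twist converges:

```python
# Minimal sketch of iterative fluid-structure coupling (all parameters hypothetical):
# a rigid aerofoil on a torsional spring, with lift acting at offset e from the spring.
q, S, CLa, e, K = 2000.0, 0.3, 5.0, 0.05, 400.0  # dyn. pressure, area, lift slope, offset, stiffness
alpha0 = 0.05                                     # rigid incidence [rad]

theta = 0.0
for _ in range(100):                     # "CFD" load -> "FE" twist, iterated to convergence
    lift = q*S*CLa*(alpha0 + theta)      # aerodynamic load at the current deformed shape
    theta_new = e*lift/K                 # structural twist response to that load
    if abs(theta_new - theta) < 1e-12:
        break
    theta = theta_new

c = q*S*CLa*e/K                          # coupling factor; iteration converges for c < 1
print(theta, c*alpha0/(1 - c))           # fixed-point iterate vs closed-form aeroelastic twist
```

    The same fixed-point structure underlies the full CFD/FE loop described above; there the "lift" step is a Navier-Stokes solve and the "twist" step a finite element solve, with mesh deflection transferring data between them.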

  17. Lesion site patterns in severe, nonverbal aphasia to predict outcome with a computer-assisted treatment program.

    PubMed

    Naeser, M A; Baker, E H; Palumbo, C L; Nicholas, M; Alexander, M P; Samaraweera, R; Prete, M N; Hodge, S M; Weissman, T

    1998-11-01

    To test whether lesion site patterns in patients with chronic, severe aphasia who have no meaningful spontaneous speech are predictive of outcome following treatment with a nonverbal, icon-based computer-assisted visual communication (C-ViC) program. Retrospective study in which computed tomographic scans performed 3 months after onset of stroke and aphasia test scores obtained before C-ViC therapy were reviewed for patients after receiving C-ViC treatment. A neurology department and speech pathology service of a Department of Veterans Affairs medical center and a university aphasia research center. Seventeen patients with stroke and severe aphasia who began treatment with C-ViC from 3 months to 10 years after onset of stroke. Level of ability to use C-ViC on a personal computer to communicate. All patients with bilateral lesions failed to learn C-ViC. For patients with unilateral left hemisphere lesion sites, statistical analyses accurately discriminated those who could initiate communication with C-ViC from those who were only able to answer directed questions. The critical lesion areas involved temporal lobe structures (Wernicke cortical area and the subcortical temporal isthmus), supraventricular frontal lobe structures (supplementary motor area or cingulate gyrus 24), and the subcortical medial subcallosal fasciculus, deep to the Broca area. Specific lesion sites were also identified for appropriate candidacy for C-ViC. Lesion site patterns on computed tomographic scans are helpful to define candidacy for C-ViC training, and to predict outcome level. A practical method is presented for clinical application of these lesion site results in combination with aphasia test scores.

  18. A simple and fast heuristic for protein structure comparison.

    PubMed

    Pelta, David A; González, Juan R; Moreno Vega, Marcos

    2008-03-25

    Protein structure comparison is a key problem in bioinformatics. There exist several methods for protein comparison, the solution of the Maximum Contact Map Overlap problem (MAX-CMO) being one of the alternatives available. Although this problem may be solved using exact algorithms, researchers require approximate algorithms that obtain good quality solutions using fewer computational resources than the former. We propose a variable neighborhood search metaheuristic for solving MAX-CMO. We analyze this strategy in two aspects: 1) from an optimization point of view the strategy is tested on two different datasets, obtaining an error of 3.5% (over 2702 pairs) and 1.7% (over 161 pairs) with respect to optimal values, thus leading to highly accurate solutions in a simpler and less expensive way than exact algorithms; 2) in terms of protein structure classification, we conduct experiments on three datasets and show that it is feasible to detect structural similarities at SCOP's family and CATH's architecture levels using normalized overlap values. Some limitations and the role of normalization are outlined for doing classification at SCOP's fold level. We designed, implemented and tested a new tool for solving MAX-CMO, based on a well-known metaheuristic technique. The good balance between solution quality and computational effort makes it a valuable tool. Moreover, to the best of our knowledge, this is the first time the MAX-CMO measure is tested at SCOP's fold and CATH's architecture levels with encouraging results.
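    The MAX-CMO objective itself can be sketched in a few lines: build boolean contact maps from coordinates and count the contacts of one protein preserved in the other under a candidate residue alignment. The variable neighborhood search over alignments is omitted, and the helix-like "structure" below is a toy assumed for illustration:

```python
import numpy as np

def contact_map(coords, cutoff=7.5):
    """Boolean contact map: non-adjacent residue pairs within the cutoff distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    idx = np.arange(len(coords))
    return (d < cutoff) & (np.abs(idx[:, None] - idx[None, :]) > 1)

def overlap(cm_a, cm_b, align):
    """Number of contacts of A preserved in B under the alignment i -> align[i]."""
    count = 0
    for i, j in zip(*np.where(np.triu(cm_a))):
        if cm_b[align[i], align[j]]:
            count += 1
    return count

# toy "structures": a helix-like curve compared against a copy of itself
t = np.linspace(0, 4*np.pi, 30)
A = np.c_[np.cos(t), np.sin(t), 0.5*t]*3.0
B = A.copy()
ident = np.arange(30)                    # identity alignment
n_common = overlap(contact_map(A), contact_map(B), ident)
print(n_common)   # self-overlap equals the total number of contacts in A
```

    MAX-CMO maximizes this overlap over all order-preserving alignments; the metaheuristic explores neighborhoods of candidate alignments, and normalized overlap values support the classification experiments described above.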

  19. Reverse-Time Imaging Based on Full-Waveform Inverted Velocity Model for Nondestructive Testing of Heterogeneous Engineered Structures

    NASA Astrophysics Data System (ADS)

    Nguyen, L. T.; Modrak, R. T.; Saenger, E. H.; Tromp, J.

    2017-12-01

    Reverse-time migration (RTM) can reconstruct reflectors and scatterers by cross-correlating the source wavefield and the receiver wavefield given a known velocity model of the background. In nondestructive testing, however, the engineered structure under inspection is often composed of layers of various materials and the background material has been degraded non-uniformly because of environmental or operational effects. On the other hand, ultrasonic waveform tomography based on the principles of full-waveform inversion (FWI) has succeeded in detecting anomalous features in engineered structures. But building a wave velocity model that comprehensively captures small, high-contrast defects is difficult because it requires computationally expensive high-frequency numerical wave simulations and an accurate understanding of large-scale background variations of the engineered structure. To reduce computational cost and improve detection of small defects, a useful approach is to divide the waveform tomography procedure into two steps: first, a low-frequency model-building step aimed at recovering background structure using FWI, and second, a high-frequency imaging step targeting defects using RTM. Through synthetic test cases, we show that the two-step procedure appears more promising in most cases than a single-step inversion. In particular, we find that the new workflow succeeds in the challenging scenario where the defect lies along a preexisting layer interface in a composite bridge deck and in related experiments involving noisy data or inaccurate source parameters. The results reveal the potential of the new wavefield imaging method and encourage further developments in data processing, enhancing computational power, and optimizing the imaging workflow itself so that the procedure can efficiently be applied to geometrically complex 3D solids and waveguides.
Lastly, owing to the scale invariance of the elastic wave equation, this imaging procedure can be transferred to applications in regional scales as well.

  20. Structural Anomaly Detection Using Fiber Optic Sensors and Inverse Finite Element Method

    NASA Technical Reports Server (NTRS)

    Quach, Cuong C.; Vazquez, Sixto L.; Tessler, Alex; Moore, Jason P.; Cooper, Eric G.; Spangler, Jan. L.

    2005-01-01

    NASA Langley Research Center is investigating a variety of techniques for mitigating aircraft accidents due to structural component failure. One technique under consideration combines distributed fiber optic strain sensing with an inverse finite element method for detecting and characterizing structural anomalies that may provide early indication of airframe structural degradation. The technique identifies structural anomalies that result in observable changes in localized strain but do not impact the overall surface shape. Surface shape information is provided by an inverse finite element method that computes full-field displacements and internal loads using strain data from in situ fiber optic sensors. This paper describes a prototype of such a system and reports results from a series of laboratory tests conducted on a test coupon subjected to increasing levels of damage.
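    The underlying idea, recovering a displacement field from sparse strain measurements by least squares on a smooth basis, can be sketched as follows. This is a hypothetical cantilever-like example with assumed parameters, not the inverse finite element formulation used at Langley:

```python
import numpy as np

L_beam, z = 1.0, 0.01                 # beam length, gauge offset from the neutral axis
xs = np.linspace(0.05, 0.95, 12)      # strain-gauge stations (hypothetical)

# true deflection w(x) = x^2 (cantilever-like); bending strain = -z * w''(x)
eps = -z*2.0*np.ones_like(xs) + 1e-7*np.random.default_rng(3).standard_normal(xs.size)

# displacement basis w(x) = sum_k c_k x^(k+2), which satisfies w(0) = w'(0) = 0;
# each row of A is -z times the basis curvature evaluated at a gauge station
K = 3
A = np.column_stack([-z*(k + 2)*(k + 1)*xs**k for k in range(K)])
c, *_ = np.linalg.lstsq(A, eps, rcond=None)   # fit basis coefficients to strains

x_eval = np.linspace(0, L_beam, 5)
w_rec = sum(c[k]*x_eval**(k + 2) for k in range(K))
print(np.round(w_rec, 4))             # reconstructed deflection, close to x^2
```

    An anomaly such as local damage would show up as a localized residual between the measured strains and those implied by the smooth reconstructed shape, which is the detection principle described above.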

  1. Computed crystal energy landscapes for understanding and predicting organic crystal structures and polymorphism.

    PubMed

    Price, Sarah L

    2009-01-20

    The phenomenon of polymorphism, the ability of a molecule to adopt more than one crystal structure, is a well-established property of crystalline solids. The possible variations in physical properties between polymorphs make the reliable reproduction of a crystalline form essential for all research using organic materials, as well as quality control in manufacture. Thus, the last two decades have seen both an increase in interest in polymorphism and the availability of the computer power needed to make the computational prediction of organic crystal structures a practical possibility. In the past decade, researchers have made considerable improvements in the theoretical basis for calculating the sets of structures that are within the energy range of possible polymorphism, called crystal energy landscapes. It is common to find that a molecule has a wide variety of ways of packing with lattice energy within a few kilojoules per mole of the most stable structure. However, as we develop methods to search for and characterize "all" solid forms, it is also now usual for polymorphs and solvates to be found. Thus, the computed crystal energy landscape reflects and to an increasing extent "predicts" the emerging complexity of the solid state observed for many organic molecules. This Account will discuss the ways in which the calculation of the crystal energy landscape of a molecule can be used as a complementary technique to solid form screening for polymorphs. Current methods can predict the known crystal structure, even under "blind test" conditions, but such successes are generally restricted to those structures that are the most stable over a wide range of thermodynamic conditions. The other low-energy structures can be alternative polymorphs, which have sometimes been found in later experimental studies. 
Examining the computed structures reveals the various compromises between close packing, hydrogen bonding, and pi-pi stacking that can result in energetically feasible structures. Indeed, we have observed that systems with many almost equi-energetic structures that contain a common interchangeable motif correlate with a tendency to disorder and problems with control of the crystallization product. Thus, contrasting the computed crystal energy landscape with the known crystal structures of a given molecule provides a valuable complement to solid form screening, and the examination of the low-energy structures often leads to a rationalization of the forms found.

  2. Generating Topic Headings during Reading of Screen-Based Text Facilitates Learning of Structural Knowledge and Impairs Learning of Lower-Level Knowledge

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Marker, Anthony W.

    2007-01-01

    This investigation considers the effects of learner-generated headings on memory. Participants (N = 63) completed a computer-based lesson with or without learner-generated text topic headings. Posttests included a cued recall test of factual knowledge and a sorting task measure of structural knowledge. A significant disordinal interaction was…

  3. Further investigations of the W-test for pairwise epistasis testing.

    PubMed

    Howey, Richard; Cordell, Heather J

    2017-01-01

    Background: In a recent paper, a novel W-test for pairwise epistasis testing was proposed that appeared, in computer simulations, to have higher power than competing alternatives. Application to genome-wide bipolar data detected significant epistasis between SNPs in genes of relevant biological function. Network analysis indicated that the implicated genes formed two separate interaction networks, each containing genes highly related to autism and neurodegenerative disorders. Methods: Here we investigate further the properties and performance of the W-test via theoretical evaluation, computer simulations and application to real data. Results: We demonstrate that, for common variants, the W-test is closely related to several existing tests of association allowing for interaction, including logistic regression on 8 degrees of freedom, although logistic regression can show inflated type I error for low minor allele frequencies, whereas the W-test shows good/conservative type I error control. Although in some situations the W-test can show higher power, logistic regression is not limited to tests on 8 degrees of freedom but can instead be tailored to impose greater structure on the assumed alternative hypothesis, offering a power advantage when the imposed structure matches the true structure. Conclusions: The W-test is a potentially useful method for testing for association - without necessarily implying interaction - between genetic variants and disease, particularly when one or more of the genetic variants are rare. For common variants, the advantages of the W-test are less clear, and, indeed, there are situations where existing methods perform better. In our investigations, we further uncover a number of problems with the practical implementation and application of the W-test (to bipolar disorder) previously described, apparently due to inadequate use of standard data quality-control procedures. 
This observation leads us to urge caution in interpretation of the previously-presented results, most of which we consider are highly likely to be artefacts.

  4. Aggregating Data for Computational Toxicology Applications ...

    EPA Pesticide Factsheets

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built usi

  5. Optimal structure and parameter learning of Ising models

    DOE PAGES

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...

    2018-03-16

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
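    A minimal sketch of the interaction-screening estimator, without the regularization used in the paper and on an assumed toy model: for each node u, minimize the empirical average of exp(-s_u * theta . s_rest), a convex objective whose minimizer recovers that node's couplings:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(4)
n = 4
J = np.zeros((n, n))                        # true couplings (assumed for the demo)
J[0, 1] = J[1, 0] = 0.5
J[1, 2] = J[2, 1] = -0.5
J[2, 3] = J[3, 2] = 0.5

# exact sampling of the Ising distribution p(s) ~ exp(sum_{i<j} J_ij s_i s_j)
confs = np.array(list(product([-1, 1], repeat=n)))
E = np.array([0.5*s @ J @ s for s in confs])
p = np.exp(E)
p /= p.sum()
samples = confs[rng.choice(len(confs), size=20000, p=p)]

# interaction screening for node u: minimize (1/M) sum exp(-s_u * theta . s_rest)
u = 0
s_u = samples[:, u]
s_rest = np.delete(samples, u, axis=1)
theta = np.zeros(n - 1)
for _ in range(500):                         # plain gradient descent on the convex objective
    w = np.exp(-s_u*(s_rest @ theta))
    grad = -(s_u[:, None]*s_rest*w[:, None]).mean(axis=0)
    theta -= 0.2*grad
print(np.round(theta, 2))                    # close to [0.5, 0.0, 0.0] = couplings of node 0
```

    Repeating this local optimization for every node reconstructs the full coupling graph; the paper's version adds an l1 penalty for structure learning from fewer samples.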

  6. Optimal structure and parameter learning of Ising models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant

    Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. Here, the efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.

  7. Evaluation of Semantic Web Technologies for Storing Computable Definitions of Electronic Health Records Phenotyping Algorithms.

    PubMed

    Papež, Václav; Denaxas, Spiros; Hemingway, Harry

    2017-01-01

    Electronic Health Records are electronic data generated during or as a byproduct of routine patient care. Structured, semi-structured and unstructured EHR offer researchers unprecedented phenotypic breadth and depth and have the potential to accelerate the development of precision medicine approaches at scale. A main EHR use-case is defining phenotyping algorithms that identify disease status, onset and severity. Phenotyping algorithms utilize diagnoses, prescriptions, laboratory tests, symptoms and other elements in order to identify patients with or without a specific trait. No common standardized, structured, computable format exists for storing phenotyping algorithms. The majority of algorithms are stored as human-readable descriptive text documents, making their translation to code challenging due to their inherent complexity and hindering their sharing and re-use across the community. In this paper, we evaluate the two key Semantic Web Technologies, the Web Ontology Language and the Resource Description Framework, for enabling computable representations of EHR-driven phenotyping algorithms.

  8. A projection-based model reduction strategy for the wave and vibration analysis of rotating periodic structures

    NASA Astrophysics Data System (ADS)

    Beli, D.; Mencik, J.-M.; Silva, P. B.; Arruda, J. R. F.

    2018-05-01

    The wave finite element method has proved to be an efficient and accurate numerical tool to perform the free and forced vibration analysis of linear reciprocal periodic structures, i.e. those conforming to symmetrical wave fields. In this paper, its use is extended to the analysis of rotating periodic structures, which, due to the gyroscopic effect, exhibit asymmetric wave propagation. A projection-based strategy using a reduced symplectic wave basis is employed, which provides a well-conditioned eigenproblem for computing waves in rotating periodic structures. The proposed formulation is applied to the free and forced response analysis of homogeneous, multi-layered and phononic ring structures. In all test cases, the following features are highlighted: well-conditioned dispersion diagrams, good accuracy, and low computational time. The proposed strategy is particularly convenient in the simulation of rotating structures when parametric analysis for several rotational speeds is usually required, e.g. for calculating Campbell diagrams. This provides an efficient and flexible framework for the analysis of rotordynamic problems.

  9. Bayes Factor Covariance Testing in Item Response Models.

    PubMed

    Fox, Jean-Paul; Mulder, Joris; Sinharay, Sandip

    2017-12-01

    Two marginal one-parameter item response theory models are introduced, by integrating out the latent variable or random item parameter. It is shown that both marginal response models are multivariate (probit) models with a compound symmetry covariance structure. Several common hypotheses concerning the underlying covariance structure are evaluated using (fractional) Bayes factor tests. The support for a unidimensional factor (i.e., assumption of local independence) and differential item functioning are evaluated by testing the covariance components. The posterior distribution of common covariance components is obtained in closed form by transforming latent responses with an orthogonal (Helmert) matrix. This posterior distribution is defined as a shifted-inverse-gamma, thereby introducing a default prior and a balanced prior distribution. Based on that, an MCMC algorithm is described to estimate all model parameters and to compute (fractional) Bayes factor tests. Simulation studies are used to show that the (fractional) Bayes factor tests have good properties for testing the underlying covariance structure of binary response data. The method is illustrated with two real data studies.
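
    The Helmert transformation mentioned above can be illustrated directly: an orthogonal Helmert matrix diagonalizes a compound symmetry covariance structure, which is what makes the posterior of the covariance components tractable. A minimal numpy sketch (the matrix construction is standard; the covariance values are illustrative):

```python
import numpy as np

def helmert(n):
    """Orthogonal Helmert matrix: row 0 is the overall-mean contrast,
    row k (k >= 1) contrasts the first k coordinates against the (k+1)-th."""
    H = np.zeros((n, n))
    H[0] = 1.0 / np.sqrt(n)
    for k in range(1, n):
        H[k, :k] = 1.0 / np.sqrt(k * (k + 1))
        H[k, k] = -k / np.sqrt(k * (k + 1))
    return H

H = helmert(5)
# A compound symmetry covariance: sigma^2 * I + tau^2 * J (J = all-ones).
Sigma = 1.0 * np.eye(5) + 0.5 * np.ones((5, 5))
# H Sigma H^T is diagonal, so transformed latent responses are independent.
D = H @ Sigma @ H.T
```

    The first transformed component carries variance σ² + nτ² and the remaining n − 1 components carry variance σ², which is what separates the common covariance component from the residual one.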

  10. Algorithms for Efficient Computation of Transfer Functions for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1998-01-01

    An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional, full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, the present method was up to two orders of magnitude faster than a traditional method. The present method generally showed good to excellent accuracy throughout the range of test frequencies, while traditional methods gave adequate accuracy for lower frequencies, but generally deteriorated in performance at higher frequencies with worst case errors being many orders of magnitude times the correct values.
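
    The linear-in-system-size cost comes from the diagonal structure of the plant in normal mode coordinates: each mode contributes an independent term to the frequency response, so no dense matrix solve is needed per frequency point. A simplified sketch (not the paper's full open/closed-loop formulation; modal damping and the three-mode data are assumed for illustration):

```python
import numpy as np

def modal_frf(omega, wn, zeta, phi_in, phi_out):
    """Frequency response at omega for a structure in normal mode coordinates.
    Each mode contributes independently, so the cost is O(n_modes) per
    frequency point instead of an O(n^3) dense solve."""
    denom = wn**2 - omega**2 + 2j * zeta * wn * omega
    return np.sum(phi_out * phi_in / denom)

# illustrative 3-mode model: natural frequencies, damping ratios, mode shapes
wn = np.array([1.0, 3.0, 7.0])
zeta = np.full(3, 0.02)
phi_in = np.array([1.0, 0.5, 0.2])    # input mode-shape coefficients
phi_out = np.array([0.3, 1.0, 0.1])   # output mode-shape coefficients

w = 2.5
fast = modal_frf(w, wn, zeta, phi_in, phi_out)

# reference: the equivalent full-matrix solve at the same frequency
A = np.diag(wn**2 - w**2 + 2j * zeta * wn * w)
dense = phi_out @ np.linalg.solve(A, phi_in)
```

    Both paths give the same response; only the per-frequency cost differs, which is what dominates when the sweep covers thousands of frequency points and hundreds of modes.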

  11. RNA-SSPT: RNA Secondary Structure Prediction Tools.

    PubMed

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; Din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult; computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction of a single RNA sequence remains challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study requires only the energetically most favorable secondary structure, and a modification of the algorithm is also available that produces base pairs to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, the NAVIEW layout code, originally written in C, is used and was ported to C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool which serves both secondary structure prediction and secondary structure visualization purposes.
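
    The Nussinov recurrence at the core of this record is compact enough to sketch in full: maximize the number of complementary base pairs in a nested (pseudoknot-free) structure. This is the textbook base-pair-maximization form, not RNA-SSPT's actual implementation; the minimum hairpin loop length of 3 is a common convention, assumed here:

```python
def nussinov(seq, min_loop=3):
    """Nussinov dynamic programming: maximum number of nested base pairs.
    dp[i][j] = best count for subsequence seq[i..j]."""
    ok = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                 # j left unpaired
            for k in range(i, j - min_loop):    # j pairs with some k
                if (seq[k], seq[j]) in ok:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + dp[k + 1][j - 1] + 1)
            dp[i][j] = best
    return dp[0][n - 1]

# three nested G-C pairs enclosing an AAA hairpin loop
pairs = nussinov("GGGAAACCC")
```

    A traceback over the same table recovers the pairing itself; energy-based refinements replace the pair count with stacking free energies, which is the direction of the modification mentioned in the abstract.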

  12. RNA-SSPT: RNA Secondary Structure Prediction Tools

    PubMed Central

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods such as NMR studies to determine RNA secondary structure are expensive and difficult; computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction of a single RNA sequence remains challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study requires only the energetically most favorable secondary structure, and a modification of the algorithm is also available that produces base pairs to lower the total free energy of the secondary structure. For visualization of RNA secondary structure, the NAVIEW layout code, originally written in C, is used and was ported to C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional Edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool which serves both secondary structure prediction and secondary structure visualization purposes. PMID:24250115

  13. Parallel-vector computation for structural analysis and nonlinear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.

    1990-01-01

    Practical engineering applications can often be formulated as constrained optimization problems, and several solution algorithms exist for solving them. One approach is to convert a constrained problem into a series of unconstrained problems; unconstrained solution algorithms can then be used as components of constrained solution algorithms. Structural optimization is an iterative process: one starts with an initial design, a finite element structural analysis is performed to calculate the response of the system (such as displacements, stresses, and eigenvalues), and, based upon sensitivity information on the objective and constraint functions, an optimizer such as ADS or IDESIGN is used to find a new, improved design. For the structural analysis phase, the equation solver for the system of simultaneous linear equations plays a key role, since it is needed for static, eigenvalue, and dynamic analysis alike. For practical, large-scale structural analysis-synthesis applications, computational time can be excessively large. Thus, it is necessary to have a new structural analysis-synthesis code which employs new solution algorithms to exploit both the parallel and vector capabilities offered by modern, high-performance computers such as the Convex, Cray-2 and Cray Y-MP. The objective of this research project is, therefore, to incorporate the latest developments in the parallel-vector equation solver PVSOLVE into a widely used finite-element production code, such as SAP-4. Furthermore, several nonlinear unconstrained optimization subroutines have also been developed and tested in a parallel computing environment. The unconstrained optimization subroutines are not only useful in their own right, but can also be incorporated into a more popular constrained optimization code, such as ADS.

  14. Control of optical systems

    NASA Technical Reports Server (NTRS)

    Founds, D.

    1988-01-01

    Some of the current and planned activities at the Air Force Systems Command in structures and controls for optical-type systems are summarized. Many of the activities are contracted to industry; one task is an in-house program which includes a hardware test program. The objective of the in-house program, referred to as the Aluminum Beam Expander Structure (ABES), is to address issues involved in on-orbit system identification. The structure, which appears similar to the LDR backup structure, is about 35 feet tall. The activity to date has been limited to acquisition of about 250 hours of test data. About 30 hours of data per excitation force is gathered in order to obtain sufficient data for a good statistical estimate of the structural parameters. The development of an Integrated Structural Modeling (ISM) computer program is being done by Boeing Aerospace Company. The objective of the contracted effort is to develop a combined optics, structures, thermal, controls, and multibody dynamics simulation code.

  15. Creation of a computer self-efficacy measure: analysis of internal consistency, psychometric properties, and validity.

    PubMed

    Howard, Matt C

    2014-10-01

    Computer self-efficacy is an often studied construct that has been shown to be related to an array of important individual outcomes. Unfortunately, existing measures of computer self-efficacy suffer from several deficiencies, including criterion contamination, outdated wording, and/or inadequate psychometric properties. For this reason, the current article presents the creation of a new computer self-efficacy measure. In Study 1, an over-representative item list is created and subsequently reduced through exploratory factor analysis to create an initial measure, and the discriminant validity of this initial measure is tested. In Study 2, the unidimensional factor structure of the initial measure is supported through confirmatory factor analysis and further reduced into a final, 12-item measure. In Study 3, the convergent and criterion validity of the 12-item measure is tested. Overall, this three-study process demonstrates that the new computer self-efficacy measure has superb psychometric properties and internal reliability, and provides excellent evidence for several aspects of validity. It is hoped that the 12-item computer self-efficacy measure will be utilized in future research on computer self-efficacy, which is discussed in the current article.

  16. Complementing the characterization of in vivo generated N-glucuronic acid conjugates of stanozolol by collision cross section computation and analysis.

    PubMed

    Thevis, Mario; Dib, Josef; Thomas, Andreas; Höppner, Sebastian; Lagojda, Andreas; Kuehne, Dirk; Sander, Mark; Opfermann, Georg; Schänzer, Wilhelm

    2015-01-01

    Detailed structural information on metabolites serving as target analytes in clinical, forensic, and sports drug testing programmes is of paramount importance to ensure unequivocal test results. In the present study, the utility of collision cross section (CCS) analysis by travelling-wave ion mobility measurements to support drug metabolite characterization efforts was tested on recently identified glucuronic acid conjugates of the anabolic-androgenic steroid stanozolol. Employing travelling-wave ion mobility spectrometry/quadrupole time-of-flight mass spectrometry, drift times of five synthetically derived and fully characterized steroid glucuronides were measured and subsequently correlated to respective CCSs obtained in silico to form an analyte-tailored calibration curve. The CCSs were calculated by equilibrium structure minimization (density functional theory) using the programme ORCA at the B3LYP/6-31G level and MOBCAL utilizing the trajectory method (TM) with nitrogen as drift gas. Under identical experimental conditions, synthesized and/or urinary stanozolol N- and O-glucuronides were analyzed to provide complementary information on the location of glucuronidation. Finally, the obtained data were compared to CCS results generated by the system's internal algorithm based on a calibration employing a polyalanine analyte mixture. The CCSs Ω(N₂) calculated for the five steroid glucuronide calibrants fell between 180 and 208 Å², largely covering the observed and computed CCSs for stanozolol-N1'-, stanozolol-N2'-, and stanozolol-O-glucuronide, found at values between 195.1 and 212.4 Å². The obtained data corroborated the earlier suggested N- and O-glucuronidation of stanozolol and demonstrate the utility of ion mobility and CCS computation in the structural characterization of phase-II metabolic products; however, despite reproducibly measurable differences in the ion mobility of stanozolol-N1'-, N2'-, and O-glucuronides, the discriminatory power of the chosen CCS computation algorithm was insufficient to allow accurate assignment of the two N-conjugated structures. Using polyalanine-based calibrations, significantly different absolute values were obtained for all CCSs, but due to a constant offset of approximately 45 Å² an excellent correlation (R² = 0.9997) between the two approaches was observed. This suggests a substantially accelerated protocol when patterns of computed and polyalanine-based experimental data can be used for structure elucidation instead of creating individual analyte-specific calibration curves. Copyright © 2015 John Wiley & Sons, Ltd.
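
    The reported constant-offset relationship between the two CCS scales amounts to a linear calibration with near-unit slope. The sketch below illustrates that idea with fabricated, hypothetical numbers (not the paper's data): five calibrant CCS values on one scale, the same values shifted by 45 Å² plus noise on the other, and an ordinary least-squares fit:

```python
import numpy as np

# hypothetical calibrant CCS values (angstrom^2) on the computed (TM) scale
computed = np.array([180.0, 187.0, 195.0, 201.0, 208.0])
# polyalanine-scale values: a constant -45 A^2 offset plus small noise (assumed)
polyala = computed - 45.0 + np.random.default_rng(0).normal(0.0, 0.3, 5)

# least-squares calibration line mapping the polyalanine scale to the TM scale
slope, intercept = np.polyfit(polyala, computed, 1)
r = np.corrcoef(polyala, computed)[0, 1]
```

    A slope near 1 with an intercept near 45 Å² reproduces the "constant offset, excellent correlation" pattern: once the offset is characterized, polyalanine-based measurements can be mapped onto computed CCSs without building an analyte-specific calibration curve each time.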

  17. Gravitational birefringence and an exotic formula for redshifts

    NASA Astrophysics Data System (ADS)

    Duval, Christian; Pasquet, Johanna; Schücker, Thomas; Tilquin, André

    2018-06-01

    We compute the birefringence of light in curved Robertson-Walker spacetimes and propose an exotic formula for redshift based on the internal structure of the spinning photon. We then use the Hubble diagram of supernovae to test this formula.

  18. Investigation of prescribed movement in fluid–structure interaction simulation for the human phonation process

    PubMed Central

    Zörner, S.; Kaltenbacher, M.; Döllinger, M.

    2013-01-01

    In a partitioned approach for computational fluid–structure interaction (FSI), the coupling between fluid and structure demands substantial computational resources. Therefore, a convenient alternative is to reduce the problem to a pure flow simulation with preset movement and appropriate boundary conditions. This work investigates the impact of replacing the fully-coupled interface condition with a one-way coupling. To continue to capture structural movement and its effect on the flow field, prescribed wall movements from separate simulations and/or measurements are used. As an appropriate test case, we apply the different coupling strategies to the human phonation process, which is a highly complex interaction of airflow through the larynx and structural vibration of the vocal folds (VF). We obtain vocal fold vibrations from a fully-coupled simulation and use them as input data for the simplified simulation, i.e. just solving the fluid flow. All computations are performed with our research code CFS++, which is based on the finite element (FE) method. The presented results show that a pure fluid simulation with prescribed structural movement can substitute the fully-coupled approach. However, caution must be used to ensure accurate boundary conditions on the interface, and we found that only a pressure-driven flow correctly responds to the physical effects when using specified motion. PMID:24204083

  19. Hybrid, experimental and computational, investigation of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1996-07-01

    Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.

  20. Automatic computational labeling of glomerular textural boundaries

    NASA Astrophysics Data System (ADS)

    Ginley, Brandon; Tomaszewski, John E.; Sarder, Pinaki

    2017-03-01

    The glomerulus, a specialized bundle of capillaries, is the blood filtering unit of the kidney. Each human kidney contains about 1 million glomeruli. Structural damage in the glomerular micro-compartments gives rise to several renal conditions, the most severe of which is proteinuria, where excessive blood proteins flow freely to the urine. The sole way to confirm glomerular structural damage in renal pathology is by examining histopathological or immunofluorescence stained needle biopsies under a light microscope. However, this method is extremely tedious and time consuming, and requires manual scoring of the number and volume of structures. Computational quantification of equivalent features promises to greatly ease this manual burden. The largest obstacle to computational quantification of renal tissue is the ability to recognize complex glomerular textural boundaries automatically. Here we present a computational pipeline to identify glomerular boundaries with high precision and accuracy. The pipeline employs an integrated approach composed of Gabor filtering, Gaussian blurring, statistical F-testing, and distance transform, and performs significantly better than the standard Gabor-based textural segmentation method. Our integrated approach provides a mean accuracy/precision of 0.89/0.97 on n = 200 Hematoxylin and Eosin (HE) glomerulus images, and a mean accuracy/precision of 0.88/0.94 on n = 200 Periodic Acid-Schiff (PAS) glomerulus images. The respective accuracy/precision of the Gabor filter bank based method is 0.83/0.84 for HE and 0.78/0.80 for PAS. Our method will simplify computational partitioning of glomerular micro-compartments hidden within dense textural boundaries. Automatic quantification of glomeruli will streamline structural analysis in the clinic, and can help realize real-time diagnoses and interventions.
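
    The Gabor filtering step at the front of this pipeline is worth a concrete sketch: a Gabor kernel is an oriented sinusoid under a Gaussian envelope, so it responds strongly to texture at its own orientation and weakly to the orthogonal one. The kernel below is the standard real Gabor form; all parameter values are illustrative, not the paper's:

```python
import numpy as np

def gabor_kernel(ksize=31, theta=0.0, lam=8.0, sigma=4.0, gamma=0.5):
    """Real 2D Gabor kernel: a cosine plane wave (wavelength lam, orientation
    theta) modulated by a Gaussian envelope (scale sigma, aspect gamma)."""
    ax = np.arange(ksize) - ksize // 2
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lam)

# a vertical-stripe texture patch matching the kernel's wavelength
stripes = np.cos(2 * np.pi * np.arange(31) / 8.0)[None, :].repeat(31, axis=0)
aligned = abs((gabor_kernel(theta=0.0) * stripes).sum())       # same orientation
orthog = abs((gabor_kernel(theta=np.pi / 2) * stripes).sum())  # rotated 90 deg
```

    A bank of such kernels over several orientations and wavelengths yields the per-pixel texture responses that the later blurring, F-testing, and distance-transform stages operate on.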

  1. Theoretical modeling of multiprotein complexes by iSPOT: Integration of small-angle X-ray scattering, hydroxyl radical footprinting, and computational docking.

    PubMed

    Huang, Wei; Ravikumar, Krishnakumar M; Parisien, Marc; Yang, Sichun

    2016-12-01

    Structural determination of protein-protein complexes such as multidomain nuclear receptors has been challenging for high-resolution structural techniques. Here, we present a combined use of multiple biophysical methods, termed iSPOT, an integration of shape information from small-angle X-ray scattering (SAXS), protection factors probed by hydroxyl radical footprinting, and a large series of computationally docked conformations from rigid-body or molecular dynamics (MD) simulations. Tested on two model systems, iSPOT is demonstrated to accurately predict the structures of a large protein-protein complex (TGFβ-FKBP12) and a multidomain nuclear receptor homodimer (HNF-4α), based on the structures of the individual components of the complexes. Although neither SAXS nor footprinting alone can yield an unambiguous picture for each complex, the combination of both, seamlessly integrated in iSPOT, narrows down the best-fit structures to about 3.2 Å and 4.2 Å in RMSD from their corresponding crystal structures, respectively. Furthermore, this proof-of-principle study, based on data synthetically derived from available crystal structures, shows that iSPOT, using either rigid-body or MD-based flexible docking, is capable of overcoming the shortcomings of standalone computational methods, especially for HNF-4α. By taking advantage of the integration of SAXS-based shape information and footprinting-based protection/accessibility as well as computational docking, the iSPOT platform is set to be a powerful approach towards accurate integrated modeling of many challenging multiprotein complexes. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Computational Aerodynamics of Shuttle Orbiter Damage Scenarios in Support of the Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    Bibb, Karen L.; Prabhu, Ramadas K.

    2004-01-01

    In support of the Columbia Accident Investigation, inviscid computations of the aerodynamic characteristics for various Shuttle Orbiter damage scenarios were performed using the FELISA unstructured CFD solver. Computed delta aerodynamics were compared with the reconstructed delta aerodynamics in order to postulate a progression of damage through the flight trajectory. By performing computations at hypervelocity flight and CF4 tunnel conditions, a bridge was provided between wind tunnel testing in Langley's 20-Inch CF4 facility and the flight environment experienced by Columbia during re-entry. The rapid modeling capability of the unstructured methodology allowed the computational effort to keep pace with the wind tunnel and, at times, guide the wind tunnel efforts. These computations provided a detailed view of the flowfield characteristics and the contribution of orbiter components (such as the vertical tail and wing) to aerodynamic forces and moments that were unavailable from wind tunnel testing. The damage scenarios are grouped into three categories. Initially, single and multiple missing full RCC panels were analyzed to determine the effect of damage location and magnitude on the aerodynamics. Next is a series of cases with progressive damage, increasing in severity, in the region of RCC panel 9. The final group is a set of wing leading edge and windward surface deformations that model possible structural deformation of the wing skin due to internal heating of the wing structure. By matching the aerodynamics from selected damage scenarios to the reconstructed flight aerodynamics, a progression of damage that is consistent with the flight data, debris forensics, and wind tunnel data is postulated.

  3. Automatic testing and assessment of neuroanatomy using a digital brain atlas: method and development of computer- and mobile-based applications.

    PubMed

    Nowinski, Wieslaw L; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G; Marchenko, Yevgen; Volkau, Ihar

    2009-10-01

    Preparation of tests and assessment of students by the instructor are time-consuming. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to Terminologia Anatomica. Because the cerebral models are fully segmented and labeled, our approach enables automatic and random atlas-derived generation of questions to test the location and naming of cerebral structures. This is done in four steps: test individualization by the instructor, test taking by the students at their convenience, automatic student assessment by the application, and communication of the individual assessment to the instructor. A computer-based application with an interactive 3D atlas and a preliminary mobile-based application were developed to realize this approach. The application works in two test modes: instructor and student. In the instructor mode, the instructor customizes the test by setting the scope of testing and student performance criteria, which takes a few seconds. In the student mode, the student is tested and automatically assessed. Self-testing is also feasible at any time and pace. Our approach is automatic both with respect to test generation and student assessment. It is also objective, rapid, and customizable. We believe that this approach is novel from computer-based, mobile-based, and atlas-assisted standpoints.
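
    The automatic question generation described here follows directly from having a fully labeled atlas: pick a labeled structure at random, then build a naming question around it. A minimal sketch with a hypothetical label table (the structure names and label IDs are stand-ins, not the atlas's actual data):

```python
import random

# hypothetical subset of a segmented atlas's label table: name -> label ID
ATLAS_LABELS = {
    "caudate nucleus": 12,
    "putamen": 13,
    "thalamus": 20,
    "hippocampus": 17,
}

def make_naming_question(rng, n_choices=3):
    """Randomly pick a structure and build a multiple-choice naming question:
    the app highlights the structure with this label ID, the student names it."""
    names = list(ATLAS_LABELS)
    answer = rng.choice(names)
    distractors = rng.sample([n for n in names if n != answer], n_choices - 1)
    return {
        "highlighted_label": ATLAS_LABELS[answer],
        "choices": sorted(distractors + [answer]),
        "answer": answer,
    }

q = make_naming_question(random.Random(7))
```

    Because questions are drawn from the segmentation itself, grading is mechanical: the application compares the student's selection against the known label, which is what makes the assessment step automatic.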

  4. Spectral gene set enrichment (SGSE).

    PubMed

    Frost, H Robert; Li, Zhigang; Moore, Jason H

    2015-03-03

    Gene set testing is typically performed in a supervised context to quantify the association between groups of genes and a clinical phenotype. In many cases, however, a gene set-based interpretation of genomic data is desired in the absence of a phenotype variable. Although methods exist for unsupervised gene set testing, they predominantly compute enrichment relative to clusters of the genomic variables, with performance strongly dependent on the clustering algorithm and number of clusters. We propose a novel method, spectral gene set enrichment (SGSE), for unsupervised competitive testing of the association between gene sets and empirical data sources. SGSE first computes the statistical association between gene sets and principal components (PCs) using our principal component gene set enrichment (PCGSE) method. The overall statistical association between each gene set and the spectral structure of the data is then computed by combining the PC-level p-values using the weighted Z-method, with weights set to the PC variance scaled by Tracy-Widom test p-values. Using simulated data, we show that the SGSE algorithm can accurately recover spectral features from noisy data. To illustrate the utility of our method on real data, we demonstrate the superior performance of the SGSE method relative to standard cluster-based techniques for testing the association between MSigDB gene sets and the variance structure of microarray gene expression data. Unsupervised gene set testing can provide important information about the biological signal held in high-dimensional genomic data sets. Because it uses the association between gene sets and sample PCs to generate a measure of unsupervised enrichment, the SGSE method is independent of cluster or network creation algorithms and, most importantly, is able to utilize the statistical significance of PC eigenvalues to ignore elements of the data most likely to represent noise.
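
    The weighted Z-method used to combine PC-level p-values is a standard Stouffer-style combination: convert each p-value to a z-score, take a weighted sum, renormalize, and convert back. A self-contained sketch using only the standard library (the example p-values and weights are illustrative, not SGSE output):

```python
from math import sqrt
from statistics import NormalDist

def weighted_z(pvalues, weights):
    """Weighted Z-method (Stouffer) for one-sided p-values:
    Z = sum(w_i * z_i) / sqrt(sum(w_i^2)),  z_i = Phi^{-1}(1 - p_i)."""
    nd = NormalDist()
    z = [nd.inv_cdf(1 - p) for p in pvalues]
    Z = sum(w * zi for w, zi in zip(weights, z)) / sqrt(sum(w * w for w in weights))
    return 1 - nd.cdf(Z)

# PC-level p-values weighted by (scaled) PC variances, as SGSE does;
# values here are made up for illustration
combined = weighted_z([0.01, 0.20, 0.60], weights=[5.0, 2.0, 0.5])
```

    Down-weighting PCs with small scaled variance is exactly how SGSE lets Tracy-Widom-insignificant components, the ones most likely to be noise, contribute little to the combined enrichment p-value.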

  5. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Differing from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers a new perspective and development methodology, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) using nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
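
    To make the test problem concrete: a hardening spring model of the kind used for comparison is a single-degree-of-freedom oscillator with a cubic stiffness term. The sketch below integrates it with the familiar explicit central-difference scheme, as a stand-in for illustration only; it is not the paper's virtual-pulse method, and all parameter values are assumed:

```python
import numpy as np

def central_difference(u0, dt, steps, m=1.0, k=1.0, k3=0.5):
    """Explicit central-difference integration of the hardening-spring
    oscillator m*u'' + k*u + k3*u^3 = 0, released from rest at u = u0."""
    u = np.zeros(steps)
    u[0] = u0
    a0 = -(k * u0 + k3 * u0**3) / m
    u[1] = u0 + 0.5 * dt**2 * a0              # Taylor start-up step (zero velocity)
    for n in range(1, steps - 1):
        a = -(k * u[n] + k3 * u[n]**3) / m    # nonlinear restoring force
        u[n + 1] = 2 * u[n] - u[n - 1] + dt**2 * a
    return u

u = central_difference(u0=1.0, dt=0.01, steps=2000)
```

    Because the system is conservative and the scheme is run well inside its stability limit, the response stays bounded with amplitude near the release displacement; comparing such an explicit baseline against an implicit trapezoidal (Newmark) run on the same spring models is the shape of the paper's numerical study.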

  6. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  7. Hardware survey for the avionics test bed

    NASA Technical Reports Server (NTRS)

    Cobb, J. M.

    1981-01-01

    A survey of major hardware items that could possibly be used in the development of an avionics test bed for space shuttle attached or autonomous large space structures was conducted in NASA Johnson Space Center building 16. The results of the survey are organized to show the hardware by laboratory usage. Computer systems in each laboratory are described in some detail.

  8. Sixty years of aeronautical research, 1917-1977. [Langley Research Center

    NASA Technical Reports Server (NTRS)

    Anderton, D. A.

    1978-01-01

    The history of Langley Research Center and its contributions to solving problems related to flight over the past six decades are recounted. Technical innovations described include those related to aircraft construction materials, jet and rocket propulsion, flight testing and simulation, wind tunnel tests, noise reduction, supersonic flight, air traffic control, structural analysis, computational aerodynamics, and fuel efficiency.

  9. Current Grid Generation Strategies and Future Requirements in Hypersonic Vehicle Design, Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Papadopoulos, Periklis; Venkatapathy, Ethiraj; Prabhu, Dinesh; Loomis, Mark P.; Olynick, Dave; Arnold, James O. (Technical Monitor)

    1998-01-01

    Recent advances in computational power enable computational fluid dynamic modeling of increasingly complex configurations. A review of grid generation methodologies implemented in support of the computational work performed for the X-38 and X-33 is presented. In strategizing topological constructs and blocking structures, the factors considered are the geometric configuration, optimal grid size, numerical algorithms, accuracy requirements, the physics of the problem at hand, computational expense, and the available computer hardware. Also addressed are grid refinement strategies, the effects of wall spacing, and convergence. The significance of the grid is demonstrated through a comparison of computational and experimental results for the aeroheating environment experienced by the X-38 vehicle. Special grid generation strategies for modeling control surface deflections and material mapping are also addressed.

  10. Impact Damage and Strain Rate Effects for Toughened Epoxy Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    Structural integrity of composite systems under dynamic impact loading is investigated herein. The GENOA virtual testing software environment is used to implement the effects of dynamic loading on fracture progression and damage tolerance. Combinations of graphite and glass fibers with a toughened epoxy matrix are investigated. The effect of a ceramic coating for the absorption of impact energy is also included. Impact and post-impact simulations include verification and prediction of (1) load and impact energy, (2) impact damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, (6) contribution of failure modes to failure mechanisms, (7) impact load versus time, and (8) damage and fracture pattern. A computer model is utilized for the assessment of structural response, progressive fracture, and defect/damage tolerance characteristics. Results show the damage progression sequence and the changes in the structural response characteristics due to dynamic impact. The fundamental premise of computational simulation is that the complete evaluation of composite fracture requires an assessment of ply and subply level damage/fracture processes as the structure is subjected to loads. Simulation results for the graphite/epoxy composite were compared with the impact and tension failure test data; correlation and verification were obtained for (1) impact energy, (2) damage size, (3) maximum impact peak load, (4) residual strength, (5) maximum displacement, and (6) failure mechanisms of the composite structure.

  11. Methods and benefits of experimental seismic evaluation of nuclear power plants. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-07-01

    This study reviews experimental techniques, instrumentation requirements, safety considerations, and benefits of performing vibration tests on nuclear power plant containments and internal components. The emphasis is on testing to improve seismic structural models. Techniques for identification of resonant frequencies, damping, and mode shapes are discussed. The benefits of testing with regard to increased damping and more accurate computer models are outlined. A test plan, schedule, and budget are presented for a typical PWR nuclear power plant.

  12. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates the design and fabrication of traveling-wave-tube (TWT) slow-wave structures. It is used to perform thermal and mechanical analyses of TWTs designed with a variety of configurations, geometries, and materials. Using three-dimensional computer analysis, the designer is able to simulate the building and testing of a TWT, with a consequent substantial saving of time and money. The technique enables a detailed look into the operation of traveling-wave tubes to help improve performance for future communications systems.

  13. Evaluation of a data dictionary system. [information dissemination and computer systems programs]

    NASA Technical Reports Server (NTRS)

    Driggers, W. G.

    1975-01-01

    The usefulness was investigated of a data dictionary/directory system for achieving optimum benefits from existing and planned investments in computer data files in the Data Systems Development Branch and the Institutional Data Systems Division. Potential applications of the data catalogue system are discussed along with an evaluation of the system. Other topics discussed include data description, data structure, programming aids, programming languages, program networks, and test data.

  14. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  15. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines]

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  16. A microprocessor-based automation test system for the experiment of the multi-stage compressor

    NASA Astrophysics Data System (ADS)

    Zhang, Huisheng; Lin, Chongping

    1991-08-01

    An automation test system that is controlled by a microprocessor and used in multistage compressor experiments is described. Based on an analysis of the compressor experiment's performance requirements, a complete hardware system structure is set up, composed of an IBM PC/XT computer, a large-scale sampled-data system, a three-axis traversing mechanism, scanners, digital instrumentation, and output devices. The structure of the real-time software system is also described. The testing results show that this system can measure many parameters at the blade-row locations and in the boundary layer under different operating states. The degree of automation and the accuracy of the experiment are increased, and the experimental cost is reduced.

  17. Hamilton Standard Q-fan demonstrator dynamic pitch change test program, volume 1

    NASA Technical Reports Server (NTRS)

    Demers, W. J.; Nelson, D. J.; Wainauski, H. S.

    1975-01-01

    Tests of a full-scale variable-pitch fan engine to obtain data on the structural characteristics, response times, and fan/core engine compatibility during transient changes in blade angle, fan rpm, and engine power are reported. Steady-state reverse thrust tests with a takeoff nozzle configuration were also conducted. The 1.4-meter-diameter, 13-bladed controllable-pitch fan was driven by a T55-L-11A engine with power and blade angle coordinated by a digital computer. The tests demonstrated an ability to change from full forward thrust to reverse thrust in less than one (1) second. Reverse thrust was effected through feather and through flat pitch; structural characteristics and engine/fan compatibility were within satisfactory limits.

  18. The Cyborg Astrobiologist: scouting red beds for uncommon features with geological significance

    NASA Astrophysics Data System (ADS)

    McGuire, Patrick Charles; Díaz-Martínez, Enrique; Ormö, Jens; Gómez-Elvira, Javier; Rodríguez-Manfredi, José Antonio; Sebastián-Martínez, Eduardo; Ritter, Helge; Haschke, Robert; Oesker, Markus; Ontrup, Jörg

    2005-04-01

    The `Cyborg Astrobiologist' has undergone a second geological field trial, at a site in northern Guadalajara, Spain, near Riba de Santiuste. The site at Riba de Santiuste is dominated by layered deposits of red sandstones. The Cyborg Astrobiologist is a wearable computer and video camera system that has demonstrated a capability to find uncommon interest points in geological imagery in real time in the field. In this second field trial, the computer vision system of the Cyborg Astrobiologist was tested at seven different tripod positions, on three different geological structures. The first geological structure was an outcrop of nearly homogeneous sandstone, which exhibits oxidized-iron impurities in red areas and an absence of these iron impurities in white areas. The white areas in these `red beds' have turned white because the iron has been removed. The iron removal from the sandstone can proceed once the iron has been chemically reduced, perhaps by a biological agent. In one instance the computer vision system found several (iron-free) white spots to be uncommon and therefore interesting, as well as several small and dark nodules. The second geological structure was another outcrop some 600 m to the east, with white, textured mineral deposits on the surface of the sandstone, at the bottom of the outcrop. The computer vision system found these white, textured mineral deposits to be interesting. We acquired samples of the mineral deposits for geochemical analysis in the laboratory. This laboratory analysis of the crust identifies a double layer, consisting of an internal millimetre-size layering of calcite and an external centimetre-size efflorescence of gypsum. The third geological structure was a 50 cm thick palaeosol layer, with fossilized root structures of some plants. The computer vision system also found certain areas of these root structures to be interesting. 
A quasi-blind comparison of the Cyborg Astrobiologist's interest points for these images with the interest points determined afterwards by a human geologist shows that the Cyborg Astrobiologist concurred with the human geologist 68% of the time (true-positive rate), with a 32% false-positive rate and a 32% false-negative rate. The performance of the Cyborg Astrobiologist's computer vision system was by no means perfect, so there is plenty of room for improvement. However, these tests validate the image-segmentation and uncommon-mapping technique that we first employed at a different geological site (Rivas Vaciamadrid) with somewhat different properties for the imagery.

  19. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To help fill this void in the design process, a prototype knowledge-based system, called STRUTEX, has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design, as opposed to being a testbed for new methods for improving structural analysis and optimization. The system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  20. Assessment of Hybrid RANS/LES Turbulence Models for Aeroacoustics Applications

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Lockard, David P.

    2010-01-01

    Predicting the noise from aircraft with exposed landing gear remains a challenging problem for the aeroacoustics community. Although computational fluid dynamics (CFD) has shown promise as a technique that could produce high-fidelity flow solutions, generating grids that can resolve the pertinent physics around complex configurations can be very challenging. Structured grids are often impractical for such configurations. Unstructured grids offer a path forward for simulating complex configurations. However, few unstructured grid codes have been thoroughly tested for unsteady flow problems in the manner needed for aeroacoustic prediction. A widely used unstructured grid code, FUN3D, is examined for resolving the near field in unsteady flow problems. Although the ultimate goal is to compute the flow around complex geometries such as the landing gear, simpler problems that include some of the relevant physics and are easily amenable to structured grid approaches are used for testing the unstructured grid approach. The test cases chosen for this study correspond to the experimental work on single and tandem cylinders conducted in the Basic Aerodynamic Research Tunnel (BART) and the Quiet Flow Facility (QFF) at NASA Langley Research Center. These configurations offer an excellent opportunity to assess the performance of hybrid RANS/LES turbulence models that transition from RANS in unresolved regions near solid bodies to LES in the outer flow field. Several of these models have been implemented and tested in both structured and unstructured grid codes to evaluate their dependence on the solver and mesh type. Comparisons of FUN3D solutions with experimental data and with numerical solutions from a structured grid flow solver are encouraging.

  1. An Architecture for Real-Time Interpretation and Visualization of Structural Sensor Data in a Laboratory Environment

    NASA Technical Reports Server (NTRS)

    Doggett, William; Vazquez, Sixto

    2000-01-01

    A visualization system is being developed out of the need to monitor, interpret, and make decisions based on the information from several thousand sensors during experimental testing, to facilitate development and validation of structural health monitoring algorithms. As an added benefit, the system will enable complete real-time sensor assessment of complex test specimens. Complex structural specimens that have hundreds or thousands of sensors are routinely tested. During a test, it is impossible for a single researcher to effectively monitor all the sensors, and consequently interesting phenomena occur that are not recognized until post-test analysis. The ability to detect and alert the researcher to these unexpected phenomena as the test progresses will significantly enhance the understanding and utilization of complex test articles. Utilization is increased by the ability to halt a test when the health monitoring algorithm response is not satisfactory or when an unexpected phenomenon occurs, enabling focused investigation, potentially through the installation of additional sensors. Often, if the test continues, structural changes make it impossible to reproduce the conditions that exhibited the phenomena. The prohibitive time and costs associated with fabrication, instrumentation, and subsequent testing of additional test articles generally make it impossible to further investigate the phenomena. A scalable architecture is described to address the complex computational demands of structural health monitoring algorithm development and laboratory experimental test monitoring. The researcher monitors the test using a photographic-quality 3D graphical model with actual sensor locations identified. In addition, researchers can quickly activate plots displaying time or load versus selected sensor response, along with the expected values and predefined limits. The architecture has several key features.
    First, distributed dissimilar computers may be seamlessly integrated into the information flow. Second, virtual sensors may be defined that are complex functions of existing sensors or other virtual sensors. Virtual sensors represent a calculated value not directly measured by a particular physical instrument. They can be used, for example, to represent the maximum difference across a range of sensors or the calculated buckling load based on the current strains. Third, the architecture enables autonomous response to preconceived events, whereby the system can be configured to suspend or abort a test if a failure is detected in the load introduction system. Fourth, the architecture is designed to allow cooperative monitoring and control of the test progression from multiple stations, both remote and local to the test system. To illustrate the architecture, a preliminary implementation is described monitoring the Stitched Composite Wing recently tested at LaRC.
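The virtual-sensor idea above lends itself to a small sketch. The following is a minimal illustration, not the system's actual implementation; the class, channel names, and readings are all invented. It defines a derived channel as a function of other channels, here the maximum spread across a set of strain gauges.

```python
from typing import Callable, Dict, List


class VirtualSensor:
    """A derived channel: a function of physical or other virtual sensors."""

    def __init__(self, name: str, inputs: List[str], fn: Callable) -> None:
        self.name = name
        self.inputs = inputs  # names of the channels this sensor depends on
        self.fn = fn          # combines the input readings into one value

    def read(self, readings: Dict[str, float]) -> float:
        """Evaluate the virtual sensor against the latest channel readings."""
        return self.fn(*(readings[i] for i in self.inputs))


# Example: the maximum spread across a range of (hypothetical) strain gauges.
readings = {"strain_01": 410.0, "strain_02": 395.5, "strain_03": 430.2}
spread = VirtualSensor("max_spread", list(readings),
                       lambda *vals: max(vals) - min(vals))
print(round(spread.read(readings), 1))  # largest gauge-to-gauge difference
```

A buckling-load estimate, as mentioned in the abstract, would be another `fn` over the same interface; the monitoring loop only ever calls `read`.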

  2. Computer-generated predictions of the structure and of the IR and Raman spectra of VX. Final report, May-August 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameka, H.F.; Jensen, J.O.

    1993-05-01

    This report presents the computed optimized geometry and vibrational IR and Raman frequencies of the V-agent VX. The computations are performed with the Gaussian 90 program package using 6-31G* basis sets. We assign the vibrational frequencies and correct each frequency by multiplying it by a previously derived 6-31G* correction factor. The result is a computer-generated prediction of the IR and Raman spectra of VX. This study was intended as a blind test of the utility of IR spectral prediction; therefore, we intentionally did not look at experimental data on the IR and Raman spectra of VX. Keywords: IR spectra, VX, Raman spectra, computer predictions.
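The frequency-correction step described above is simple to illustrate. This sketch applies a single empirical scale factor to computed harmonic frequencies; the factor and the frequency values below are made up for illustration, not taken from the report.

```python
# Hypothetical 6-31G*-style scale factor (illustrative, not the report's value).
CORRECTION = 0.89


def correct(frequencies_cm1, factor=CORRECTION):
    """Multiply each computed harmonic frequency (cm^-1) by the scale factor."""
    return [round(f * factor, 1) for f in frequencies_cm1]


# Invented raw ab initio frequencies, cm^-1.
computed = [3100.0, 1650.0, 1100.0]
print(correct(computed))  # scaled predictions for the spectrum
```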

  3. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    PubMed Central

    Judson, Richard S.; Martin, Matthew T.; Egeghy, Peter; Gangwal, Sumit; Reif, David M.; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A.; Richard, Ann M.

    2012-01-01

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases. PMID:22408426
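The aggregation idea can be sketched as a simple merge keyed on a chemical identifier. The repository names below mirror the abstract, but the records, fields, and structure are invented for illustration; ACToR's actual schema is not described at this level of detail.

```python
from collections import defaultdict

# Invented per-repository records, each keyed by a chemical identifier.
repositories = {
    "ToxRefDB":   [{"id": "80-05-7", "in_vivo": "guideline study A"}],
    "ExpoCastDB": [{"id": "80-05-7", "exposure": "observational data"}],
    "ToxCastDB":  [{"id": "80-05-7", "assay": "HTS pathway hit"}],
}


def aggregate(repos):
    """Fold every repository's records into one merged entry per chemical id."""
    merged = defaultdict(dict)
    for source, records in repos.items():
        for rec in records:
            entry = merged[rec["id"]]
            entry.update({k: v for k, v in rec.items() if k != "id"})
            entry.setdefault("sources", []).append(source)  # provenance trail
    return dict(merged)


result = aggregate(repositories)
print(sorted(result["80-05-7"]["sources"]))  # all three repositories contribute
```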

  5. Neural networks as a control methodology

    NASA Technical Reports Server (NTRS)

    Mccullough, Claire L.

    1990-01-01

    While conventional computers must be programmed in a logical fashion by a person who thoroughly understands the task to be performed, the motivation behind neural networks is to develop machines which can train themselves to perform tasks, using available information about desired system behavior and learning from experience. There are three goals of this fellowship program: (1) to evaluate various neural net methods and generate computer software to implement those deemed most promising on a personal computer equipped with Matlab; (2) to evaluate methods currently in the professional literature for system control using neural nets to choose those most applicable to control of flexible structures; and (3) to apply the control strategies chosen in (2) to a computer simulation of a test article, the Control Structures Interaction Suitcase Demonstrator, which is a portable system consisting of a small flexible beam driven by a torque motor and mounted on springs tuned to the first flexible mode of the beam. Results of each are discussed.

  6. Green Bank Telescope active surface system

    NASA Astrophysics Data System (ADS)

    Lacasse, Richard J.

    1998-05-01

    During the design phase of the Green Bank Telescope (GBT), various means of providing an accurate surface on a large-aperture paraboloid were considered. Automated jacks supporting the primary reflector were selected as the appropriate technology since they promised greater performance and potentially lower costs than a homologous or carbon fiber design, and had certain advantages over an active secondary. The design of the active surface has presented many challenges. Since the actuators are mounted on a tipping structure, they were required to support a significant side load. Such devices were not readily available commercially, so they had to be developed. Additional actuator requirements include low backlash, repeatable positioning, and an operational life of at least 230 years. Similarly, no control system capable of controlling the 2209 actuators was commercially available. Again, a prime requirement was reliability. Maintainability was also a very important consideration. The system architecture is tree-like. An active surface 'master computer' controls interaction with the telescope control system and controls ancillary equipment such as power supplies and temperature monitors. Two slave computers interface with the master computer, and each closes approximately 1100 position loops. For simplicity, the servo is an 'on/off' type, yet it achieves a positioning resolution of 25 microns. Each slave computer interfaces with 4 VME I/O cards, which in turn communicate with 140 control modules. The control modules read out the positions of the actuators every 0.1 sec and control the actuators' DC motors. Initial control of the active surface will be based on an elevation-dependent structural model. Later, the model will be improved by holographic observations. Surface accuracy will be improved further by using a laser ranging system which will actively measure the surface figure.
    Several tests have been conducted to assure that the system will perform as desired when installed on the telescope. These include actuator life tests, motor life tests, position transducer accuracy tests, and positioning accuracy tests.
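The 'on/off' servo mentioned above can be sketched as a bang-bang position loop with a deadband. This is a hedged illustration of the control style, not the GBT implementation; the step size, deadband, and target values are invented.

```python
def servo_step(position_um: float, target_um: float,
               deadband_um: float = 25.0) -> int:
    """Return a motor command: +1 extend, -1 retract, 0 hold inside deadband."""
    error = target_um - position_um
    if abs(error) <= deadband_um:
        return 0  # close enough: switch the motor off
    return 1 if error > 0 else -1


def run(position_um: float, target_um: float,
        step_um: float = 10.0, cycles: int = 1000) -> float:
    """Simulate fixed-rate update cycles until the loop settles."""
    for _ in range(cycles):
        cmd = servo_step(position_um, target_um)
        if cmd == 0:
            break
        position_um += cmd * step_um  # the motor moves a fixed step per cycle
    return position_um


print(run(0.0, 180.0))  # settles within the deadband of the target
```

An on/off loop like this needs no DAC or PID tuning, which is one plausible reason the abstract calls the design "for simplicity": each cycle only decides run-forward, run-backward, or stop.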

  7. Modeling of Failure for Analysis of Triaxial Braided Carbon Fiber Composites

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Littell, Justin D.; Binienda, Wieslaw K.

    2010-01-01

    In the development of advanced aircraft-engine fan cases and containment systems, composite materials are beginning to be used due to their low weight and high strength. The design of these structures must include the capability of withstanding impact loads from a released fan blade. Relatively complex triaxially braided fiber architectures have been found to yield the best performance for the fan cases. To properly work with and design these structures, robust analytical tools are required that can be used in the design process. A new analytical approach models triaxially braided carbon fiber composite materials within the environment of a transient dynamic finite-element code, specifically the commercially available LS-DYNA. The geometry of the braided composites is approximated by a series of parallel laminated composites. The composite is modeled by using shell finite elements. The material property data are computed by examining test data from static tests on braided composites, where optical strain measurement techniques are used to examine the local strain variations within the material. These local strain data from the braided composite tests are used, along with a judicious application of composite micromechanics-based methods, to compute the stiffness properties of an equivalent unidirectional laminated composite required for the shell elements. The local strain data from the braided composite tests are also applied to back out strength and failure properties of the equivalent unidirectional composite. The properties utilized are geared towards the application of a continuum damage mechanics-based composite constitutive model available within LS-DYNA. The developed model can be applied to conduct impact simulations of structures composed of triaxially braided composites.
The advantage of this technology is that it facilitates the analysis of the deformation and damage response of a triaxially braided polymer matrix composite within the environment of a transient dynamic finite-element code such as LS-DYNA in a manner which accounts for the local physical mechanisms but is still computationally efficient. This methodology is tightly coupled to experimental tests on the braided composite, which ensures that the material properties have physical significance. Aerospace or automotive companies interested in using triaxially braided composites in their structures, particularly for impact or crash applications, would find the technology useful. By the development of improved design tools, the amount of very expensive impact testing that will need to be performed can be significantly reduced.
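As a rough illustration of the micromechanics step, the classical rule-of-mixtures formulas estimate equivalent unidirectional-ply moduli from constituent properties. The abstract does not specify the method at this level of detail, so this is a generic textbook sketch; the fiber and matrix values below are typical carbon/epoxy numbers, not the paper's data.

```python
def rule_of_mixtures(Ef: float, Em: float, Vf: float):
    """Equivalent unidirectional-ply moduli from fiber/matrix properties.

    E1: longitudinal modulus (Voigt, parallel):  E1 = Vf*Ef + (1 - Vf)*Em
    E2: transverse modulus   (Reuss, series):    E2 = 1 / (Vf/Ef + (1-Vf)/Em)
    """
    Vm = 1.0 - Vf
    E1 = Vf * Ef + Vm * Em
    E2 = 1.0 / (Vf / Ef + Vm / Em)
    return E1, E2


# Carbon-fiber-like and epoxy-like moduli in GPa, 60% fiber volume fraction.
E1, E2 = rule_of_mixtures(Ef=230.0, Em=3.5, Vf=0.6)
print(round(E1, 1), round(E2, 1))  # fiber-dominated E1, matrix-dominated E2
```

The asymmetry between the two estimates (fibers dominate along the fiber direction, matrix dominates transverse to it) is the basic reason an equivalent unidirectional ply needs direction-dependent properties in the shell-element model.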

  8. NASCAP user's manual, 1978

    NASA Technical Reports Server (NTRS)

    Cassidy, J. J., III

    1978-01-01

    NASCAP simulates the charging process for a complex object in either tenuous plasma (geosynchronous orbit) or ground test (electron gun source) environment. Program control words, the structure of user input files, and various user options available are described in this computer programmer's user manual.

  9. Electro-impulse de-icing testing analysis and design

    NASA Technical Reports Server (NTRS)

    Zumwalt, G. W.; Schrag, R. L.; Bernhart, W. D.; Friedberg, R. A.

    1988-01-01

    Electro-Impulse De-Icing (EIDI) is a method of ice removal by sharp blows delivered by a transient electromagnetic field. Detailed results are given for studies of the electrodynamic phenomena. Structural dynamic tests and computations are described. Also reported are ten sets of tests at NASA's Icing Research Tunnel and flight tests by NASA and Cessna Aircraft Company. Fabrication of system components is described and illustrated. Fatigue and electromagnetic interference tests are reported. The necessary information for the design of an EIDI system for aircraft is provided.

  10. Findings from the Supersonic Qualification Program of the Mars Science Laboratory Parachute System

    NASA Technical Reports Server (NTRS)

    Sengupta, Anita; Steltzner, Adam; Witkowski, Allen; Candler, Graham; Pantano, Carlos

    2009-01-01

    In 2012, the Mars Science Laboratory Mission (MSL) will deploy NASA's largest extraterrestrial parachute, a technology integral to the safe landing of its advanced robotic explorer on the surface. The supersonic parachute system is a mortar-deployed 21.5 m disk-gap-band (DGB) parachute, identical in geometric scaling to the Viking-era DGB parachutes of the 1970s. The MSL parachute deployment conditions are Mach 2.3 at a dynamic pressure of 750 Pa. The Viking Balloon Launched Decelerator Test (BLDT) successfully demonstrated a maximum of 700 Pa at Mach 2.2 for a 16.1 m DGB parachute in its AV4 flight. All previous Mars deployments have derived their supersonic qualification from the Viking BLDT test series, obviating the need for full-scale high-altitude supersonic testing. The qualification programs for the Mars Pathfinder, Mars Exploration Rover, and Phoenix Scout missions were all limited to subsonic structural qualification, with supersonic performance and survivability bounded by the BLDT qualification. Because the MSL parachute is at the edge of the supersonic heritage deployment space and 33% larger than the Viking parachute, flying it without addressing the supersonic environment in which it will deploy entails a certain degree of risk. In addition, MSL will spend up to 10 seconds above Mach 1.5, an aerodynamic regime that is associated with a known parachute instability characterized by significant canopy projected-area fluctuation and dynamic drag variation. This aerodynamic instability, referred to as "area oscillations" by the parachute community, has drag performance, inflation stability, and structural implications, introducing risk to mission success if not quantified for the MSL parachute system. To minimize this risk, and as an alternative to a prohibitively expensive high-altitude test program, a multi-phase qualification program using computational simulation validated by subscale test was developed and implemented for MSL.
    The first phase consisted of supersonic wind tunnel testing of a 2%-scale rigid DGB parachute with an entry vehicle to validate two high-fidelity computational fluid dynamics (CFD) tools. The computer codes utilized Large Eddy Simulation and Detached Eddy Simulation numerical approaches to accurately capture the turbulent wake of the entry vehicle and its coupling to the parachute bow shock. The second phase was the development of fluid-structure interaction (FSI) computational tools to predict parachute response to the supersonic flow field. The FSI development included the integration of the CFD from the first phase with a finite element structural model of the parachute membrane and cable elements. In this phase, a 4%-scale supersonic flexible-parachute test program was conducted to provide validation data for the FSI code and an empirical dataset of the MSL parachute in a flight-like environment. The final phase is FSI simulations of the full-scale MSL parachute in a Mars-type deployment. Findings from this program will be presented in terms of code development and validation, empirical findings from the supersonic testing, and drag performance during supersonic operation.

  11. On-Line Mu Method for Robust Flutter Prediction in Expanding a Safe Flight Envelope for an Aircraft Model Under Flight Test

    NASA Technical Reports Server (NTRS)

    Lind, Richard C. (Inventor); Brenner, Martin J.

    2001-01-01

    A structured singular value (mu) analysis method of computing flutter margins assesses the robust stability of a linear aeroelastic model with uncertainty operators (Delta). Flight data are used to update the uncertainty operators to accurately account for errors in the computed model and for the observed range of dynamics of the aircraft under test caused by time-varying aircraft parameters, nonlinearities, and flight anomalies such as test nonrepeatability. By introducing mu as a flutter margin parameter, this approach computes flutter margins that are worst case with respect to the modeling uncertainty. These margins are used to determine when the aircraft is approaching a flutter condition and to define an expanded safe flight envelope that is accepted with more confidence than envelopes from traditional methods, which do not update the analysis with flight data and instead track damping trends as a measure of the tendency toward instability.

  12. Measurement of the inertial constants of a rigid or flexible structure of arbitrary shape through a vibration test

    NASA Technical Reports Server (NTRS)

    Engrand, D.; Cortial, J.

    1983-01-01

    The inertial constants of an aircraft, rocket, or any other structure are determined without materializing any rotation axis. The necessary equipment is very similar to that normally used for ground vibration tests. An elastic suspension is used to obtain the natural modes corresponding to the motions of the structure as a rigid body. From measurements of the generalized masses of these modes it is possible to compute the inertial constants: (1) the center of inertia; (2) the inertia tensor; and (3) the mass. When the structure is not strictly rigid, a purification process based on the least-squares method makes it possible to "rigidify" it, at the price of some approximations and a few additional measurements. Any additional masses that are not part of the structure can also be taken into account.
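The algebra behind this procedure can be sketched numerically: if six independent rigid-body mode shapes (columns of Phi, expressed as 6-DOF vectors at a reference point) and their generalized-mass matrix G = Phi^T M Phi are measured, then M = Phi^-T G Phi^-1, and the mass, center of inertia, and inertia tensor can be read off its blocks. A minimal numpy sketch under idealized assumptions (exact, noise-free rigid-body modes; all numerical values illustrative):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]x, so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# "True" 6x6 rigid-body mass matrix at a reference point O:
#   M = [[ m*I3,   -m*[c]x ],
#        [ m*[c]x,  J      ]]   c = CG offset from O, J = inertia about O
m_true = 120.0
c_true = np.array([0.3, -0.1, 0.05])
J_true = np.diag([40.0, 55.0, 60.0])          # illustrative values
M = np.zeros((6, 6))
M[:3, :3] = m_true * np.eye(3)
M[:3, 3:] = -m_true * skew(c_true)
M[3:, :3] = m_true * skew(c_true)
M[3:, 3:] = J_true

# Simulated vibration test: six independent suspension mode shapes
# (columns of Phi) and the measured generalized-mass matrix G.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((6, 6))
G = Phi.T @ M @ Phi

# Identification from (Phi, G) only: M = Phi^-T G Phi^-1, then read blocks.
Phi_inv = np.linalg.inv(Phi)
M_id = Phi_inv.T @ G @ Phi_inv
m_id = M_id[0, 0]                                             # mass
c_id = np.array([M_id[5, 1], M_id[3, 2], M_id[4, 0]]) / m_id  # center of inertia
J_id = M_id[3:, 3:]                                           # inertia tensor
```

In a real test the mode shapes and generalized masses carry measurement noise and flexibility effects; the "purification" step in the abstract addresses exactly that.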

  13. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants.

    PubMed

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-11-15

    Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored, even though it may play an important role in obtaining optimal power. We compared a standard statistical test (a score test) with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene-gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical; thus, we developed new computationally efficient methods. After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test (up to 23 more), whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but we also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene-gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13,500. Software is available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. Contact: heckerma@microsoft.com. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
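In a toy setting, the LR statistic being compared here is twice the gap between the maximized and null log-likelihoods for y ~ N(0, sg2*K + I), with the null sg2 = 0. A minimal numpy sketch with the residual variance fixed at 1 for simplicity (not the FaST-LMM implementation; the grid search and simulated data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 50
Z = rng.standard_normal((n, p)) / np.sqrt(p)
K = Z @ Z.T                               # set kernel built from p variants
u = rng.standard_normal(p)
y = 0.7 * Z @ u + rng.standard_normal(n)  # polygenic signal + unit noise

# Eigen-decompose K once; the log-likelihood of y ~ N(0, sg2*K + I)
# is then cheap to evaluate for any candidate sg2.
s, U = np.linalg.eigh(K)
yt = U.T @ y

def loglik(sg2):
    d = sg2 * s + 1.0
    return -0.5 * (np.sum(np.log(d)) + np.sum(yt**2 / d) + n * np.log(2 * np.pi))

grid = np.linspace(0.0, 5.0, 501)           # sg2 = 0 (the null) is on the grid
lls = np.array([loglik(g) for g in grid])
lr_stat = 2.0 * (lls.max() - loglik(0.0))   # LR statistic; >= 0 by construction
```

Because the null lies on the boundary of the parameter space, this statistic is conventionally compared to a 50:50 mixture of chi-squared distributions with 0 and 1 degrees of freedom rather than a plain chi-squared(1).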

  14. Dry wind tunnel system

    NASA Technical Reports Server (NTRS)

    Chen, Ping-Chih (Inventor)

    2013-01-01

    This invention is a ground flutter testing system that requires no wind tunnel, called the Dry Wind Tunnel (DWT) system. The DWT system consists of ground vibration test (GVT) hardware, multiple-input multiple-output (MIMO) force-controller software, and real-time unsteady aerodynamic force generation software developed from an aerodynamic reduced-order model (ROM). The ground flutter test using the DWT system operates on the real structure, so no scaled-down structural model, as required by a conventional wind tunnel flutter test, is involved. Furthermore, the impact of structural nonlinearities on aeroelastic stability is included automatically. Moreover, the aeroservoelastic characteristics of the aircraft can be easily measured by simply including the flight control system in the loop. In addition, the computationally generated unsteady aerodynamics are free of wind tunnel wall interference. Finally, the DWT test can be conveniently and inexpensively carried out as a post-GVT test with the same hardware, requiring only a possible rearrangement of the shakers and the inclusion of additional sensors.
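The real-time force-generation step described above amounts to stepping a discrete-time state-space ROM, x[k+1] = A x[k] + B u[k], f[k] = C x[k] + D u[k], with measured structural motion as input and commanded shaker force as output. A minimal sketch (the matrices are arbitrary stable stand-ins, not an actual aerodynamic ROM):

```python
import numpy as np

# Toy discrete-time ROM: x[k+1] = A x[k] + B u[k],  f[k] = C x[k] + D u[k]
A = np.array([[0.9, 0.1], [-0.1, 0.8]])   # stable "aero" states (illustrative)
B = np.array([[1.0], [0.5]])
C = np.array([[0.3, -0.2]])
D = np.array([[0.05]])

def rom_force(u_seq):
    """Generate the unsteady 'aerodynamic' force sample by sample,
    as a real-time controller would inside the GVT loop."""
    x = np.zeros((A.shape[0], 1))
    forces = []
    for u in u_seq:
        u = np.atleast_2d(u)
        forces.append((C @ x + D @ u).item())   # force command at this sample
        x = A @ x + B @ u                       # advance the aero states
    return np.array(forces)

motion = np.sin(np.linspace(0, 4 * np.pi, 100))  # measured displacement input
f = rom_force(motion)
```

In the actual system such a ROM would be identified from CFD or test data and evaluated inside the MIMO force-control loop at the structural sampling rate.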

  15. Measurement of multiaxial ply strength by an off-axis flexure test

    NASA Technical Reports Server (NTRS)

    Crews, John H., Jr.; Naik, Rajiv A.

    1992-01-01

    An off-axis flexure (OAF) test was performed to measure ply strength under multiaxial stress states. This test involves unidirectional off-axis specimens loaded in bending, using an apparatus that allows these anisotropic specimens to twist as well as flex without the complications of a resisting torque. A 3D finite element stress analysis verified that simple beam theory could be used to compute the specimen bending stresses at failure. Unidirectional graphite/epoxy specimens with fiber angles ranging from 90 deg to 15 deg have combined normal and shear stresses on their failure planes that are typical of 45 deg plies in structural laminates. Tests for a range of stress states with AS4/3501-6 specimens showed that both normal and shear stresses on the failure plane influenced cracking resistance. This OAF test may prove to be useful for generating data needed to predict ply cracking in composite structures and may also provide an approach for studying fiber-matrix interface failures under stress states typical of structures.
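The beam-theory relation validated by the finite element analysis is simply sigma = M*c/I, with I = b*h^3/12 for a rectangular cross section. A quick sketch with illustrative specimen numbers (not the paper's data):

```python
def bending_stress(moment_nmm, width_mm, thickness_mm):
    """Outer-fiber bending stress (MPa) from simple beam theory:
    sigma = M*c / I, with c = h/2 and I = b*h^3 / 12."""
    c = thickness_mm / 2.0
    inertia = width_mm * thickness_mm**3 / 12.0
    return moment_nmm * c / inertia

# Illustrative values: 25 mm wide, 2 mm thick specimen, 500 N*mm moment
sigma = bending_stress(500.0, 25.0, 2.0)
print(round(sigma, 1))  # → 30.0 (MPa)
```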

  16. A Matrix-Free Algorithm for Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Lambe, Andrew Borean

    Multidisciplinary design optimization (MDO) is an approach to engineering design that exploits the coupling between components or knowledge disciplines in a complex system to improve the final product. In aircraft design, MDO methods can be used to simultaneously design the outer shape of the aircraft and the internal structure, taking into account the complex interaction between the aerodynamic forces and the structural flexibility. Efficient strategies are needed to solve such design optimization problems and guarantee convergence to an optimal design. This work begins with a comprehensive review of MDO problem formulations and solution algorithms. First, a fundamental MDO problem formulation is defined from which other formulations may be obtained through simple transformations. Using these fundamental problem formulations, decomposition methods from the literature are reviewed and classified. All MDO methods are presented in a unified mathematical notation to facilitate greater understanding. In addition, a novel set of diagrams, called extended design structure matrices, are used to simultaneously visualize both data communication and process flow between the many software components of each method. For aerostructural design optimization, modern decomposition-based MDO methods cannot efficiently handle the tight coupling between the aerodynamic and structural states. This fact motivates the exploration of methods that can reduce the computational cost. A particular structure in the direct and adjoint methods for gradient computation motivates the idea of a matrix-free optimization method. A simple matrix-free optimizer is developed based on the augmented Lagrangian algorithm. This new matrix-free optimizer is tested on two structural optimization problems and one aerostructural optimization problem. 
The results indicate that the matrix-free optimizer is able to efficiently solve structural and multidisciplinary design problems with thousands of variables and constraints. On the aerostructural test problem formulated with thousands of constraints, the matrix-free optimizer is estimated to reduce the total computational time by up to 90% compared to conventional optimizers.
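The augmented-Lagrangian idea underlying the optimizer can be sketched on a toy problem: minimize f(x) subject to c(x) = 0 by repeatedly minimizing f(x) + lam*c(x) + (rho/2)*c(x)^2 with a gradient-only (hence matrix-free) inner loop, then updating lam. A minimal sketch, not the thesis's optimizer:

```python
import numpy as np

# Toy problem: minimize x0^2 + x1^2  subject to  x0 + x1 - 1 = 0
# (analytic solution: x = [0.5, 0.5], multiplier = -1)
def f_grad(x):
    return 2.0 * x

def c(x):
    return np.array([x[0] + x[1] - 1.0])

def c_jac_T(x, v):
    """Jacobian-transpose product J^T v -- only mat-vecs, no explicit matrix."""
    return np.array([v[0], v[0]])

x = np.zeros(2)
lam = np.zeros(1)
rho = 10.0
for _ in range(50):                  # outer augmented-Lagrangian iterations
    for _ in range(200):             # inner loop: plain gradient descent
        g = f_grad(x) + c_jac_T(x, lam + rho * c(x))
        x -= 0.01 * g
    lam += rho * c(x)                # first-order multiplier update

print(np.round(x, 3))  # close to [0.5, 0.5]
```

The thesis's optimizer replaces the naive inner gradient descent with a more capable subproblem solver, but the matrix-free character, relying only on gradient and Jacobian-vector products, is the same.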

  18. Effects of instructional strategies using cross sections on the recognition of anatomical structures in correlated CT and MR images.

    PubMed

    Khalil, Mohammed K; Paas, Fred; Johnson, Tristan E; Su, Yung K; Payer, Andrew F

    2008-01-01

    This research is an effort to best utilize interactive anatomical images for instructional purposes, based on cognitive load theory. Three studies explored the differential effects of three computer-based instructional strategies that use anatomical cross sections to enhance the interpretation of radiological images. These strategies were: (1) cross-sectional images of the head that can be superimposed on radiological images, (2) transparent highlighting of anatomical structures in radiological images, and (3) cross-sectional images of the head presented side-by-side with radiological images. Data collected included: (1) time spent on instruction and on solving test questions, (2) mental effort during instruction and testing, and (3) students' performance in identifying anatomical structures in radiological images. Participants were 28 freshman medical students (15 males and 13 females) and 208 biology students (190 females and 18 males). All studies used a posttest-only control-group design, and the collected data were analyzed by either t test or ANOVA. In self-directed computer-based environments, the strategies that used cross sections to improve students' ability to recognize anatomical structures in radiological images showed no significant positive effects. However, when the complexity of the instructional materials was increased, cross-sectional images imposed a higher cognitive load, as indicated by higher investment of mental effort. There is not enough evidence to claim that the simultaneous combination of cross sections and radiological images has no effect on novices' identification of anatomical structures in radiological images. Further research that controls for students' learning and cognitive styles is needed to reach an informative conclusion.
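The posttest-only comparisons reported here reduce to a two-sample test on group scores. A minimal scipy sketch with made-up identification scores (illustrative only; the studies' actual data are not reproduced):

```python
import numpy as np
from scipy import stats

# Hypothetical structure-identification scores for two groups
treatment = np.array([14, 16, 15, 18, 13, 17, 15, 16])
control   = np.array([12, 15, 13, 14, 12, 16, 13, 14])

# Welch's t-test (no equal-variance assumption), as one might analyze
# a posttest-only control-group design
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(round(t_stat, 2), round(p_value, 3))
```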

  19. TMDIM: an improved algorithm for the structure prediction of transmembrane domains of bitopic dimers.

    PubMed

    Cao, Han; Ng, Marcus C K; Jusoh, Siti Azma; Tai, Hio Kuan; Siu, Shirley W I

    2017-09-01

    α-Helical transmembrane proteins are the most important drug targets in rational drug development. However, solving the experimental structures of these proteins remains difficult; therefore, computational methods to accurately and efficiently predict the structures are in great demand. We present an improved structure prediction method, TMDIM, based on Park et al. (Proteins 57:577-585, 2004) for predicting bitopic transmembrane protein dimers. The three major algorithmic improvements are the introduction of packing-type classification, multiple-condition decoy filtering, and cluster-based candidate selection. In a test of predicting nine known bitopic dimers, approximately 78% of our predictions achieved a successful fit (RMSD <2.0 Å), and 78% of the cases were predicted better than by the two other methods compared. Our method provides an alternative for modeling TM bitopic dimers of unknown structure for further computational studies. TMDIM is freely available on the web at https://cbbio.cis.umac.mo/TMDIM . The website is implemented in PHP, MySQL, and Apache, with all major browsers supported.
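The success criterion quoted above (RMSD < 2.0 Å) is computed after optimal superposition of the predicted onto the reference coordinates. A minimal Kabsch-alignment RMSD sketch (generic, not TMDIM's own code):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two Nx3 coordinate sets after optimal rigid
    superposition (Kabsch algorithm via SVD)."""
    P = P - P.mean(axis=0)                    # remove translations
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(V @ Wt))        # guard against improper rotation
    D = np.diag([1.0, 1.0, d])
    R = V @ D @ Wt                            # optimal rotation of P onto Q
    diff = P @ R - Q
    return np.sqrt((diff**2).sum() / len(P))

# Sanity check: a rotated and translated copy should give RMSD ~ 0
rng = np.random.default_rng(0)
P = rng.standard_normal((30, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, -2.0, 3.0])
print(kabsch_rmsd(P, Q) < 1e-8)  # → True
```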

  20. TMDIM: an improved algorithm for the structure prediction of transmembrane domains of bitopic dimers

    NASA Astrophysics Data System (ADS)

    Cao, Han; Ng, Marcus C. K.; Jusoh, Siti Azma; Tai, Hio Kuan; Siu, Shirley W. I.

    2017-09-01

    α-Helical transmembrane proteins are the most important drug targets in rational drug development. However, solving the experimental structures of these proteins remains difficult; therefore, computational methods to accurately and efficiently predict the structures are in great demand. We present an improved structure prediction method, TMDIM, based on Park et al. (Proteins 57:577-585, 2004) for predicting bitopic transmembrane protein dimers. The three major algorithmic improvements are the introduction of packing-type classification, multiple-condition decoy filtering, and cluster-based candidate selection. In a test of predicting nine known bitopic dimers, approximately 78% of our predictions achieved a successful fit (RMSD <2.0 Å), and 78% of the cases were predicted better than by the two other methods compared. Our method provides an alternative for modeling TM bitopic dimers of unknown structure for further computational studies. TMDIM is freely available on the web at https://cbbio.cis.umac.mo/TMDIM. The website is implemented in PHP, MySQL, and Apache, with all major browsers supported.

  1. SimRNA: a coarse-grained method for RNA folding simulations and 3D structure prediction.

    PubMed

    Boniecki, Michal J; Lach, Grzegorz; Dawson, Wayne K; Tomala, Konrad; Lukasz, Pawel; Soltysinski, Tomasz; Rother, Kristian M; Bujnicki, Janusz M

    2016-04-20

    RNA molecules play fundamental roles in cellular processes. Their function and interactions with other biomolecules are dependent on the ability to form complex three-dimensional (3D) structures. However, experimental determination of RNA 3D structures is laborious and challenging, and therefore, the majority of known RNAs remain structurally uncharacterized. Here, we present SimRNA: a new method for computational RNA 3D structure prediction, which uses a coarse-grained representation, relies on the Monte Carlo method for sampling the conformational space, and employs a statistical potential to approximate the energy and identify conformations that correspond to biologically relevant structures. SimRNA can fold RNA molecules using only sequence information, and, on established test sequences, it recapitulates secondary structure with high accuracy, including correct prediction of pseudoknots. For modeling of complex 3D structures, it can use additional restraints, derived from experimental or computational analyses, including information about secondary structure and/or long-range contacts. SimRNA also can be used to analyze conformational landscapes and identify potential alternative structures. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
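The sampling engine described, Metropolis Monte Carlo over a statistical potential, can be sketched generically on a toy one-dimensional energy (not SimRNA's coarse-grained representation or potential):

```python
import numpy as np

def metropolis(energy, x0, n_steps=20000, step=0.5, kT=1.0, seed=0):
    """Metropolis Monte Carlo: propose a local move, accept with
    probability min(1, exp(-dE/kT)); track the lowest-energy state seen."""
    rng = np.random.default_rng(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        e_new = energy(x_new)
        if e_new <= e or rng.random() < np.exp(-(e_new - e) / kT):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# Toy "statistical potential": tilted double well, global minimum near x = 2
U = lambda x: (x**2 - 4)**2 / 8 - 0.5 * x
best_x, best_e = metropolis(U, x0=-2.0)   # start in the metastable well
```

With enough steps the sampler escapes the shallower well; the step size and kT here are illustrative, and SimRNA additionally uses replica exchange and restraints on top of this basic scheme only insofar as the abstract's description of conformational sampling implies.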

  2. Development of an automated ultrasonic testing system

    NASA Astrophysics Data System (ADS)

    Shuxiang, Jiao; Wong, Brian Stephen

    2005-04-01

    Non-destructive testing is necessary in areas where defects in structures emerge over time through wear and tear and structural integrity must be maintained. Manual testing, however, has many limitations: high training cost, long training procedures, and, worse, inconsistent test results. A prime objective of this project is to develop an automatic non-destructive testing system for the wheel-axle shaft of a railway carriage. Various methods, such as neural networks, pattern recognition methods, and knowledge-based systems, are used for such artificial intelligence problems. In this paper, a statistical pattern recognition approach, the classification tree, is applied. Before feature selection, a thorough study of the ultrasonic signals was carried out. Based on this analysis, three signal processing methods were applied to enhance the signals: cross-correlation, zero-phase filtering, and averaging. The goal of this step is to reduce noise and make the signal characteristics more distinguishable. Four features were selected: (1) autoregressive model coefficients, (2) standard deviation, (3) Pearson correlation, and (4) dispersion uniformity degree. A classification tree was then created and applied to recognize peak positions and amplitudes. A local-maximum search is carried out before feature computation, which saves considerable computation time in real-time testing. Based on this algorithm, a software package called SOFRA was developed to recognize the peaks, calibrate automatically, and test a simulated shaft automatically.
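The three enhancement steps named above (averaging, zero-phase filtering, cross-correlation) are standard numpy/scipy operations. A minimal sketch on a synthetic ultrasonic A-scan (all parameters and the pulse shape are illustrative):

```python
import numpy as np
from scipy import signal

fs = 10e6                                   # 10 MHz sampling rate
t = np.arange(0, 200e-6, 1 / fs)
# Gaussian-windowed 2 MHz tone burst as the known pulse shape
template = np.exp(-((t - 5e-6)**2) / (2 * (0.5e-6)**2)) * np.sin(2*np.pi*2e6*t)

rng = np.random.default_rng(0)
# 1) Averaging: the mean of repeated noisy shots raises SNR by sqrt(N)
shots = np.array([np.roll(template, 300) + 0.5 * rng.standard_normal(t.size)
                  for _ in range(32)])
avg = shots.mean(axis=0)

# 2) Zero-phase band-pass filtering: filtfilt runs the filter forward
#    and backward, so echo positions are not delayed
b, a = signal.butter(4, [1e6, 3e6], btype="band", fs=fs)
filtered = signal.filtfilt(b, a, avg)

# 3) Cross-correlation with the known pulse locates the echo
xc = signal.correlate(filtered, template, mode="full")
lags = signal.correlation_lags(filtered.size, template.size, mode="full")
lag = lags[np.argmax(np.abs(xc))]           # expected near the 300-sample shift
```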

  3. Failure mechanisms in energy-absorbing composite structures

    NASA Astrophysics Data System (ADS)

    Johnson, Alastair F.; David, Matthew

    2010-11-01

    Quasi-static tests are described for determination of the energy-absorption properties of composite crash energy-absorbing segment elements under axial loads. Detailed computer tomography scans of failed specimens were used to identify local compression crush failure mechanisms at the crush front. These mechanisms are important for selecting composite materials for energy-absorbing structures, such as helicopter and aircraft sub-floors. Finite element models of the failure processes are described that could be the basis for materials selection and future design procedures for crashworthy structures.

  4. Structural behavior of the Bitter plate TF magnet for the ZEPHYR ignition test reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bobrov, E.S.; Becker, H.

    1981-01-01

    This paper discusses methods and results of the computer structural analysis of the Bitter plate toroidal field magnet design for the ZEPHYR Ignition Test Reactor. The magnet provides a field of 7.06 T at the center of the bore which is 1.76 m from the major toroidal axis. The ignited plasma is located at a major radius of 1.36 m where the magnetic field is 9.11 T. The plasma is moved to this final position following compression in the major radius. The horizontal bore of the magnet is 1.8 m.

  5. Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool

    NASA Astrophysics Data System (ADS)

    Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong

    2016-06-01

    The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool that can investigate strata in a relatively large region of space around the borehole. The BAAR has a modular design with a very complex structure, so a dedicated test-bench system is needed to debug each of its modules. With the test-bench system introduced in this paper, test and calibration of the BAAR can be easily achieved. The test-bench system is designed on the client/server model. The hardware mainly consists of a host computer, an embedded controller board, a bus interface board, a data acquisition board, and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data; its software is developed in VC++. The embedded controller board uses an ARM7 (Advanced RISC Machines) microcontroller, communicates with the host computer via Ethernet, and runs software based on the uClinux operating system. The bus interface, data acquisition, and telemetry communication boards are designed around field-programmable gate arrays (FPGAs) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was used to test a BAAR; analysis of the results revealed an unqualified channel in the electronic receiving cabin. The test-bench system can thus quickly determine the working condition of BAAR sub-modules, which is of great significance for improving production efficiency and accelerating industrial production of the logging tool.

  6. Molecular Modeling in Drug Design for the Development of Organophosphorus Antidotes/Prophylactics.

    DTIC Science & Technology

    1986-06-01

    multidimensional statistical QSAR analysis techniques to suggest new structures for synthesis and evaluation. C. Application of quantum chemical techniques to...compounds for synthesis and testing for antidotal potency. E. Use of computer-assisted methods to determine the steric constraints at the active site...modeling techniques to model the enzyme acetylcholinesterase. H. Suggestion of some novel compounds for synthesis and testing for reactivating

  7. A simple and fast heuristic for protein structure comparison

    PubMed Central

    Pelta, David A; González, Juan R; Moreno Vega, Marcos

    2008-01-01

    Background Protein structure comparison is a key problem in bioinformatics. Several methods exist for protein comparison, one of the available alternatives being the solution of the Maximum Contact Map Overlap (MAX-CMO) problem. Although this problem may be solved using exact algorithms, researchers require approximate algorithms that obtain good-quality solutions using fewer computational resources. Results We propose a variable neighborhood search metaheuristic for solving MAX-CMO and analyze this strategy in two respects: 1) from an optimization point of view, the strategy is tested on two different datasets, obtaining errors of 3.5% (over 2702 pairs) and 1.7% (over 161 pairs) with respect to optimal values, thus yielding highly accurate solutions in a simpler and less expensive way than exact algorithms; 2) in terms of protein structure classification, we conduct experiments on three datasets and show that it is feasible to detect structural similarities at SCOP's family and CATH's architecture levels using normalized overlap values. Some limitations and the role of normalization are outlined for classification at SCOP's fold level. Conclusion We designed, implemented, and tested a new tool for solving MAX-CMO, based on a well-known metaheuristic technique. The good balance between solution quality and computational effort makes it a valuable tool. Moreover, to the best of our knowledge, this is the first time the MAX-CMO measure has been tested at SCOP's fold and CATH's architecture levels, with encouraging results. Software is available for download at . PMID:18366735
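The variable neighborhood search strategy can be sketched generically: shake the incumbent in a size-k neighborhood, apply local search, move and reset k on improvement, otherwise enlarge k. A toy maximization over bit vectors (the objective is a stand-in, not a contact-map overlap):

```python
import random

def vns_maximize(score, n_bits, k_max=4, iters=100, seed=0):
    """Basic Variable Neighborhood Search over binary vectors."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]

    def local_search(x):
        improved = True
        while improved:                      # single-bit-flip hill climbing
            improved = False
            for i in range(n_bits):
                y = x.copy()
                y[i] ^= 1
                if score(y) > score(x):
                    x, improved = y, True
        return x

    x = local_search(x)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            y = x.copy()
            for i in rng.sample(range(n_bits), k):   # shake: flip k bits
                y[i] ^= 1
            y = local_search(y)
            if score(y) > score(x):
                x, k = y, 1                  # move and restart neighborhoods
            else:
                k += 1                       # try a larger neighborhood
    return x

target = [1, 0] * 10                         # hidden optimum (toy objective)
score = lambda x: sum(a == b for a, b in zip(x, target))
best = vns_maximize(score, n_bits=20)
print(score(best))  # → 20
```

For MAX-CMO the solution encoding is an alignment between two contact maps and the neighborhoods perturb that alignment, but the shake/local-search/k-escalation skeleton is the same.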

  8. A Monte Carlo approach applied to ultrasonic non-destructive testing

    NASA Astrophysics Data System (ADS)

    Mosca, I.; Bilgili, F.; Meier, T.; Sigloch, K.

    2012-04-01

    Non-destructive testing based on ultrasound allows us to detect, characterize and size discrete flaws in geotechnical and architectural structures and materials. This information is needed to determine whether such flaws can be tolerated in future service. In typical ultrasonic experiments, only the first-arriving P-wave is interpreted, and the remainder of the recorded waveform is neglected. Our work aims at understanding surface waves, which are strong signals in the later wave train, with the ultimate goal of full waveform tomography. At present, even the structural estimation of layered media is still challenging because material properties of the samples can vary widely, and good initial models for inversion do not often exist. The aim of the present study is to combine non-destructive testing with a theoretical data analysis and hence to contribute to conservation strategies of archaeological and architectural structures. We analyze ultrasonic waveforms measured at the surface of a variety of samples, and define the behaviour of surface waves in structures of increasing complexity. The tremendous potential of ultrasonic surface waves becomes an advantage only if numerical forward modelling tools are available to describe the waveforms accurately. We compute synthetic full seismograms as well as group and phase velocities for the data. We invert them for the elastic properties of the sample via a global search of the parameter space, using the Neighbourhood Algorithm. Such a Monte Carlo approach allows us to perform a complete uncertainty and resolution analysis, but the computational cost is high and increases quickly with the number of model parameters. Therefore it is practical only for defining the seismic properties of media with a limited number of degrees of freedom, such as layered structures. We have applied this approach to both synthetic layered structures and real samples. 
The former helped to benchmark the propagation of ultrasonic surface waves in typical materials tested with non-destructive techniques (e.g., marble, unweathered and weathered concrete, and natural stone).
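In its simplest form, the global search amounts to sampling layered-model parameters at random, running a forward model, and keeping the best fits; the Neighbourhood Algorithm then resamples preferentially around the good samples. A plain Monte Carlo sketch with an illustrative closed-form stand-in for the dispersion forward model (not a real dispersion solver):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(vs, h, freqs):
    """Stand-in dispersion 'forward model' for a layer over a half-space:
    phase velocity blends from the layer Vs (high frequency) to the
    half-space Vs (low frequency). Illustrative only."""
    depth = 1.0 / (freqs + 1e-9)            # pseudo penetration depth
    w = np.exp(-depth / h)
    return w * vs[0] + (1.0 - w) * vs[1]

freqs = np.linspace(0.5, 20.0, 30)
true_vs, true_h = np.array([1.2, 2.5]), 0.8   # synthetic "observed" model
data = forward(true_vs, true_h, freqs)

# Plain Monte Carlo search over (vs_layer, vs_halfspace, h)
best_misfit, best_model = np.inf, None
for _ in range(5000):
    vs = rng.uniform([0.5, 1.0], [2.0, 4.0])
    h = rng.uniform(0.1, 2.0)
    misfit = np.sqrt(np.mean((forward(vs, h, freqs) - data)**2))
    if misfit < best_misfit:
        best_misfit, best_model = misfit, (vs, h)
```

The ensemble of all sampled models and misfits, rather than only the best one, is what supports the uncertainty and resolution analysis mentioned above.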

  9. ProSelection: A Novel Algorithm to Select Proper Protein Structure Subsets for in Silico Target Identification and Drug Discovery Research.

    PubMed

    Wang, Nanyi; Wang, Lirong; Xie, Xiang-Qun

    2017-11-27

    Molecular docking is widely applied to computer-aided drug design and has become relatively mature in recent decades. Applications of docking in modeling vary from single lead compound optimization to large-scale virtual screening. The performance of molecular docking is highly dependent on the protein structures selected, and selection is especially challenging for large-scale target prediction research when multiple structures are available for a single target. Therefore, we have established ProSelection, a docking-preferred protein selection algorithm, to generate proper structure subsets. The ProSelection algorithm filters out protein structures that are "weak selectors" and keeps structures that are "strong selectors." Specifically, a structure with a good statistical performance in distinguishing active from inactive ligands is defined as a strong selector. In this study, 249 protein structures of 14 autophagy-related targets were investigated. Surflex-Dock was used as the docking engine to distinguish active and inactive compounds against these protein structures. Both the t test and the Mann-Whitney U test were used to distinguish strong from weak selectors, based on the normality of the docking score distribution. The suggested docking score threshold for active ligands (SDA) was generated for each strong-selector structure according to the receiver operating characteristic (ROC) curve. The performance of ProSelection was further validated by predicting the potential off-targets of 43 U.S. Food and Drug Administration approved small-molecule antineoplastic drugs. Overall, ProSelection will accelerate the computational work of protein structure selection and could be a useful tool for molecular docking, target prediction, and protein-chemical database research.
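The strong/weak selector decision and the SDA threshold described above can be sketched with scipy and numpy (synthetic docking scores; ProSelection's actual criteria are more involved):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic docking scores (higher = better) against one protein structure
active_scores = rng.normal(7.5, 1.0, 80)
inactive_scores = rng.normal(6.0, 1.2, 300)

# Non-parametric test: does this structure separate actives from inactives?
u_stat, p_value = stats.mannwhitneyu(active_scores, inactive_scores,
                                     alternative="greater")
is_strong_selector = p_value < 0.05

# ROC-style threshold: pick the score maximizing TPR - FPR (Youden's J)
thresholds = np.unique(np.concatenate([active_scores, inactive_scores]))
tpr = np.array([(active_scores >= t).mean() for t in thresholds])
fpr = np.array([(inactive_scores >= t).mean() for t in thresholds])
sda = thresholds[int(np.argmax(tpr - fpr))]   # suggested active threshold
```

A t test would replace the Mann-Whitney U here when the score distributions pass a normality check, mirroring the choice described in the abstract.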

  10. Predicting organ toxicity using in vitro bioactivity data and chemical structure

    EPA Science Inventory

    Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches together with high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a superv...

  11. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  12. 49 CFR 229.211 - Processing of petitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to the U.S. Department of Transportation Docket Operations (M-30), West Building Ground Floor, Room... requires additional information to appropriately consider the petition, FRA will conduct a hearing on the..., or both which may include validated computer modeling, structural crush analysis, component testing...

  15. High performance transcription factor-DNA docking with GPU computing

    PubMed Central

    2012-01-01

    Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is computationally demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the docking algorithm, not only to reduce computing time but also to improve docking quality. Methods In an attempt to accelerate the sampling process and improve docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte Carlo simulation and simulated annealing to search the conformational space. Algorithmic techniques were developed to improve computational efficiency and scalability on GPU-based high-performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrate that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improve the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research will require effort on two integral fronts: computational efficiency and energy function design. Conclusions We present a high-performance computing approach for improving the prediction accuracy of protein-DNA docking. The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first effort to apply GPUs or GPU clusters to the protein-DNA docking problem. PMID:22759575
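    The Monte Carlo/simulated-annealing search at the heart of the algorithm, minus the GPU parallelism and the protein-DNA energy function, can be illustrated with a generic sketch. The toy one-dimensional energy landscape and the geometric cooling schedule below are stand-ins, not the paper's actual potential or schedule.

```python
import math, random

def simulated_annealing(energy, perturb, x0, t_start=5.0, t_end=0.01,
                        steps=5000, seed=42):
    """Metropolis-style annealing: always accept downhill moves, accept
    uphill moves with probability exp(-dE / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / (steps - 1))
        cand = perturb(x, rng)
        de = energy(cand) - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# toy 1-D "binding energy" landscape with its minimum at x = 3
energy = lambda x: (x - 3.0) ** 2
perturb = lambda x, rng: x + rng.uniform(-0.5, 0.5)
xmin, emin = simulated_annealing(energy, perturb, x0=-10.0)
```

    In the real algorithm each Monte Carlo move is a rigid-body or conformational perturbation of the protein-DNA pair, and many such chains run concurrently on the GPU.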

  16. Greater power and computational efficiency for kernel-based association testing of sets of genetic variants

    PubMed Central

    Lippert, Christoph; Xiang, Jing; Horta, Danilo; Widmer, Christian; Kadie, Carl; Heckerman, David; Listgarten, Jennifer

    2014-01-01

    Motivation: Set-based variance component tests have been identified as a way to increase power in association studies by aggregating weak individual effects. However, the choice of test statistic has been largely ignored even though it may play an important role in obtaining optimal power. We compared a standard statistical test—a score test—with a recently developed likelihood ratio (LR) test. Further, when correction for hidden structure is needed, or gene–gene interactions are sought, state-of-the-art algorithms for both the score and LR tests can be computationally impractical. Thus we develop new computationally efficient methods. Results: After reviewing theoretical differences in performance between the score and LR tests, we find empirically on real data that the LR test generally has more power. In particular, on 15 of 17 real datasets, the LR test yielded at least as many associations as the score test—up to 23 more associations—whereas the score test yielded at most one more association than the LR test in the two remaining datasets. On synthetic data, we find that the LR test yielded up to 12% more associations, consistent with our results on real data, but also observe a regime of extremely small signal where the score test yielded up to 25% more associations than the LR test, consistent with theory. Finally, our computational speedups now enable (i) efficient LR testing when the background kernel is full rank, and (ii) efficient score testing when the background kernel changes with each test, as for gene–gene interaction tests. The latter yielded a factor of 2000 speedup on a cohort of size 13 500. Availability: Software available at http://research.microsoft.com/en-us/um/redmond/projects/MSCompBio/Fastlmm/. Contact: heckerma@microsoft.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25075117
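    The variance-component machinery compared here is far more involved, but the basic shape of a likelihood-ratio test on nested models can be shown with a toy example: testing H0: mu = 0 against a free mean for unit-variance Gaussian data, where -2 log(L0/L1) reduces to n times the squared sample mean and is compared against a chi-square(1) quantile. The data and threshold below are illustrative only.

```python
def lr_stat_mean_zero(xs):
    """-2 log(L0 / L1) for H0: mu = 0 vs H1: mu free, unit-variance Gaussian.
    The statistic reduces to n * xbar**2, asymptotically chi-square(1) under H0."""
    xbar = sum(xs) / len(xs)
    return len(xs) * xbar * xbar

CHI2_1_95 = 3.841  # 95th percentile of the chi-square(1) distribution

sample = [0.9, 1.1, 1.3, 0.7, 1.0]      # synthetic data with a clear signal
stat = lr_stat_mean_zero(sample)
reject_h0 = stat > CHI2_1_95
```

    The score test for this toy model happens to coincide with the LR test; the paper's point is that for variance components the two diverge, and the LR version tends to be more powerful.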

  17. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight tests. The new CSI design methodology is centered on interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools, and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  18. Blind test of physics-based prediction of protein structures.

    PubMed

    Shell, M Scott; Ozkan, S Banu; Voelz, Vincent; Wu, Guohong Albert; Dill, Ken A

    2009-02-01

    We report here a multiprotein blind test of a computer method to predict native protein structures based solely on an all-atom physics-based force field. We use the AMBER 96 potential function with an implicit (GB/SA) model of solvation, combined with replica-exchange molecular-dynamics simulations. Coarse conformational sampling is performed using the zipping and assembly method (ZAM), an approach that is designed to mimic the putative physical routes of protein folding. ZAM was applied to the folding of six proteins, from 76 to 112 monomers in length, in CASP7, a community-wide blind test of protein structure prediction. Because these predictions have about the same level of accuracy as typical bioinformatics methods, and do not utilize information from databases of known native structures, this work opens up the possibility of predicting the structures of membrane proteins, synthetic peptides, or other foldable polymers, for which there is little prior knowledge of native structures. This approach may also be useful for predicting physical protein folding routes, non-native conformations, and other physical properties from amino acid sequences.
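    Replica-exchange molecular dynamics, which the method combines with ZAM, periodically swaps configurations between temperature replicas using a Metropolis criterion. A minimal sketch of that acceptance rule follows; the reduced-unit default kB = 1 is an illustrative choice, not a detail from the paper.

```python
import math

def swap_probability(e_i, e_j, t_i, t_j, k_b=1.0):
    """Replica-exchange acceptance probability min(1, exp(d_beta * d_E)),
    where beta = 1 / (kB * T). Swapping a high-energy configuration down to
    the colder replica is penalized; the reverse is always accepted."""
    d_beta = 1.0 / (k_b * t_i) - 1.0 / (k_b * t_j)
    d_e = e_i - e_j
    return min(1.0, math.exp(d_beta * d_e))
```

    For example, a cold replica (T = 1) holding a higher-energy state than its hot neighbor (T = 2) swaps with certainty, while the opposite arrangement is accepted only with probability exp(-1) or so, which is what keeps each replica sampling its own temperature's ensemble.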

  20. Correlation of AH-1G airframe test data with a NASTRAN mathematical model

    NASA Technical Reports Server (NTRS)

    Cronkhite, J. D.; Berry, V. L.

    1976-01-01

    Test data was provided for evaluating a mathematical vibration model of the Bell AH-1G helicopter airframe. The math model was developed and analyzed using the NASTRAN structural analysis computer program. Data from static and dynamic tests were used for comparison with the math model. Static tests of the fuselage and tailboom were conducted to verify the stiffness representation of the NASTRAN model. Dynamic test data were obtained from shake tests of the airframe and were used to evaluate the NASTRAN model for representing the low frequency (below 30 Hz) vibration response of the airframe.

  1. A homogenization-based quasi-discrete method for the fracture of heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Berke, P. Z.; Peerlings, R. H. J.; Massart, T. J.; Geers, M. G. D.

    2014-05-01

    The understanding and the prediction of the failure behaviour of materials with pronounced microstructural effects is of crucial importance. This paper presents a novel computational methodology for the handling of fracture on the basis of the microscale behaviour. The basic principles presented here allow the incorporation of an adaptive discretization scheme of the structure as a function of the evolution of strain localization in the underlying microstructure. The proposed quasi-discrete methodology bridges two scales: the scale of the material microstructure, modelled with a continuum type description; and the structural scale, where a discrete description of the material is adopted. The damaging material at the structural scale is divided into unit volumes, called cells, which are represented as a discrete network of points. The scale transition is inspired by computational homogenization techniques; however it does not rely on classical averaging theorems. The structural discrete equilibrium problem is formulated in terms of the underlying fine scale computations. Particular boundary conditions are developed on the scale of the material microstructure to address damage localization problems. The performance of this quasi-discrete method with the enhanced boundary conditions is assessed using different computational test cases. The predictions of the quasi-discrete scheme agree well with reference solutions obtained through direct numerical simulations, both in terms of crack patterns and load versus displacement responses.

  2. Modifications to the streamtube curvature program. Volume 1: Program modifications and user's manual. [user manuals (computer programs) for transonic flow of nacelles and intake systems of turbofan engines

    NASA Technical Reports Server (NTRS)

    Ferguson, D. R.; Keith, J. S.

    1975-01-01

    The improvements which have been incorporated in the Streamtube Curvature Program to enhance both its computational and diagnostic capabilities are described. Detailed descriptions are given of the revisions incorporated to more reliably handle the jet stream-external flow interaction at trailing edges. Also presented are the augmented boundary layer procedures and a variety of other program changes relating to program diagnostics and extended solution capabilities. An updated User's Manual, that includes information on the computer program operation, usage, and logical structure, is presented. User documentation includes an outline of the general logical flow of the program and detailed instructions for program usage and operation. From the standpoint of the programmer, the overlay structure is described. The input data, output formats, and diagnostic printouts are covered in detail and illustrated with three typical test cases.

  3. Bonded composite to metal scarf joint performance in an aircraft landing gear drag strut. [for Boeing 747 aircraft

    NASA Technical Reports Server (NTRS)

    Howell, W. E.

    1974-01-01

    The structural performance of a boron-epoxy reinforced titanium drag strut, which contains a bonded scarf joint and was designed to the criteria of the Boeing 747 transport, was evaluated. An experimental and analytical investigation was conducted. The strut was exposed to two lifetimes of spectrum loading and was statically loaded to the tensile and compressive design ultimate loads. Throughout the test program no evidence of any damage in the drag strut was detected by strain gage measurements, ultrasonic inspection, or visual observation. An analytical study of the bonded joint was made using the NASA structural analysis computer program NASTRAN. A comparison of the strains predicted by the NASTRAN computer program with the experimentally determined values shows excellent agreement. The NASTRAN computer program is a viable tool for studying, in detail, the stresses and strains induced in a bonded joint.

  4. Cyclic structural analyses of anisotropic turbine blades for reusable space propulsion systems. [ssme fuel turbopump

    NASA Technical Reports Server (NTRS)

    Manderscheid, J. M.; Kaufman, A.

    1985-01-01

    Turbine blades for reusable space propulsion systems are subject to severe thermomechanical loading cycles that result in large inelastic strains and very short lives. These components require the use of anisotropic high-temperature alloys to meet the safety and durability requirements of such systems. To assess the effects on blade life of material anisotropy, cyclic structural analyses are being performed for the first stage high-pressure fuel turbopump blade of the space shuttle main engine. The blade alloy is directionally solidified MAR-M 246 alloy. The analyses are based on a typical test stand engine cycle. Stress-strain histories at the airfoil critical location are computed using the MARC nonlinear finite-element computer code. The MARC solutions are compared to cyclic response predictions from a simplified structural analysis procedure developed at the NASA Lewis Research Center.

  5. The Structure and Properties of Silica Glass Nanostructures using Novel Computational Systems

    NASA Astrophysics Data System (ADS)

    Doblack, Benjamin N.

    The structure and properties of silica glass nanostructures are examined using computational methods in this work. Standard synthesis methods of silica and its associated material properties are first discussed in brief. A review of prior experiments on this amorphous material is also presented. Background and methodology for the simulation of mechanical tests on amorphous bulk silica and nanostructures are later presented. A new computational system for the accurate and fast simulation of silica glass is also presented, using an appropriate interatomic potential for this material within the open-source molecular dynamics computer program LAMMPS. This alternative computational method uses modern graphics processors, Nvidia CUDA technology, and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model select materials, this enhancement allows the addition of accelerated molecular dynamics simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling, and analysis. The research goal of this project is to investigate the structure and size-dependent mechanical properties of silica glass nanohelical structures under tensile molecular dynamics (MD) conditions using the innovative computational system. Specifically, silica nanoribbons and nanosprings were evaluated, revealing unique size-dependent elastic moduli relative to the bulk material. For the nanoribbons, the tensile behavior differed widely between the models simulated, with distinct characteristic extended elastic regions. For the nanosprings, clearer trends are observed: larger nanospring wire cross-sectional radii (r) lead to larger Young's moduli, while larger helical diameters (2R) result in smaller Young's moduli. Structural transformations and theoretical models are also analyzed to identify possible factors which might affect the mechanical response of silica nanostructures under tension. The work presented outlines an innovative simulation methodology and discusses how results can be validated against prior experimental and simulation findings. The ultimate goal is to develop new computational methods for the study of nanostructures which will make the field of materials science more accessible, cost-effective, and efficient.
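    The size-dependent Young's moduli reported in such studies are, in essence, slopes of the linear elastic region of simulated stress-strain curves. A minimal least-squares sketch of that extraction step follows; the 70 GPa modulus in the synthetic data is an illustrative round number, not a value from the thesis.

```python
def youngs_modulus(strains, stresses):
    """Least-squares slope of the (assumed linear) elastic region of a
    stress-strain curve; the slope is the Young's modulus E."""
    n = len(strains)
    sx, sy = sum(strains), sum(stresses)
    sxx = sum(e * e for e in strains)
    sxy = sum(e * s for e, s in zip(strains, stresses))
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

# synthetic elastic response sigma = E * eps with E = 70e9 Pa (illustrative)
strains = [0.000, 0.001, 0.002, 0.003, 0.004]
stresses = [70.0e9 * e for e in strains]
E = youngs_modulus(strains, stresses)
```

    In an MD tensile test the (strain, stress) pairs would come from the virial stress averaged over the nanostructure at each imposed elongation, with the fit restricted to small strains before nonlinearity sets in.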

  6. Non-Invasive Tension Measurement Devices for Parachute Cordage

    NASA Technical Reports Server (NTRS)

    Litteken, Douglas A.; Daum, Jared S.

    2016-01-01

    The need for lightweight and non-intrusive tension measurements has arisen alongside the development of high-fidelity computer models of textile and fluid dynamics. In order to validate these computer models, data must be gathered in the operational environment without altering the design, construction, or performance of the test article. Current measurement device designs rely on severing a cord and breaking the load path to introduce a load cell. These load cells are very reliable, but introduce an area of high stiffness in the load path, directly affecting the structural response, adding excessive weight, and possibly altering the dynamics of the parachute during a test. To capture the required data for analysis validation without affecting the response of the system, non-invasive measurement devices have been developed and tested by NASA. These tension measurement devices offer minimal impact to the mass, form, fit, and function of the test article, while providing reliable, axial tension measurements for parachute cordage.

  7. Benchmarking optimization software with COPS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolan, E.D.; More, J.J.

    2001-01-08

    The COPS test set provides a modest selection of difficult nonlinearly constrained optimization problems from applications in optimal design, fluid dynamics, parameter estimation, and optimal control. In this report we describe version 2.0 of the COPS problems. The formulation and discretization of the original problems have been streamlined and improved. We have also added new problems. The presentation of COPS follows the original report, but the description of the problems has been streamlined. For each problem we discuss the formulation and present structural data in Table 0.1. The aim of presenting this data is to provide an approximate idea of the size and sparsity of the problem. We also include the results of computational experiments with the LANCELOT, LOQO, MINOS, and SNOPT solvers. These computational experiments differ from the original results in that we have deleted problems that were considered to be too easy. Moreover, in the current version of the computational experiments, each problem is tested with four variations. An important difference between this report and the original report is that the tables that present the computational experiments are generated automatically from the testing script. This is explained in more detail in the report.

  8. Extending the Capabilities of Closed-loop Distributed Engine Control Simulations Using LAN Communication

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia Mae; Culley, Dennis E.

    2014-01-01

    Distributed Engine Control (DEC) is an enabling technology that has the potential to advance the state-of-the-art in gas turbine engine control. To analyze the capabilities that DEC offers, a Hardware-In-the-Loop (HIL) test bed is being developed at NASA Glenn Research Center. This test bed will support a systems-level analysis of control capabilities in closed-loop engine simulations. The structure of the HIL emulates a virtual test cell by implementing the operator functions, control system, and engine on three separate computers. This implementation increases the flexibility and extensibility of the HIL. Here, a method is discussed for implementing these interfaces by connecting the three platforms over a dedicated Local Area Network (LAN). This approach is verified using the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k), which is typically implemented on one computer. There are only marginal differences between the results of the typical single-computer implementation and the three-computer implementation. Additional analysis of the LAN network, including characterization of network load, packet drop, and latency, is presented. The three-computer setup supports the incorporation of complex control models and proprietary engine models into the HIL framework.
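    The abstract does not specify the LAN protocol or message format; as a rough illustration of the pattern, the sketch below runs a stand-in "engine model" and "control computer" as two endpoints exchanging JSON sensor packets over local TCP. All names and packet fields here are hypothetical, not C-MAPSS40k interfaces.

```python
import json, socket, threading

def engine_loop(srv, readings):
    """Stand-in engine model: answers each controller poll with one JSON packet."""
    conn, _ = srv.accept()
    with conn, srv:
        for r in readings:
            conn.recv(64)                              # wait for a poll request
            conn.sendall((json.dumps(r) + "\n").encode())

def controller_poll(port, n):
    """Stand-in control computer: polls the engine model n times over TCP."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        f = s.makefile("r")
        packets = []
        for _ in range(n):
            s.sendall(b"poll")
            packets.append(json.loads(f.readline()))
        return packets

# hypothetical sensor packets (spool speed, exhaust gas temperature)
readings = [{"n1_rpm": 8000 + 10 * k, "egt_k": 900.0} for k in range(3)]
srv = socket.create_server(("127.0.0.1", 0))           # port 0: OS picks a free port
engine = threading.Thread(target=engine_loop, args=(srv, readings))
engine.start()
received = controller_poll(srv.getsockname()[1], 3)
engine.join()
```

    A real HIL loop would run at a fixed control rate and measure exactly the quantities the paper characterizes: network load, packet drop, and round-trip latency.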

  9. The effect of training methodology on knowledge representation in categorization.

    PubMed

    Hélie, Sébastien; Shamloo, Farzin; Ell, Shawn W

    2017-01-01

    Category representations can be broadly classified as containing within-category information or between-category information. Although such representational differences can have a profound impact on decision-making, relatively little is known about the factors contributing to the development and generalizability of different types of category representations. These issues are addressed by investigating the impact of training methodology and category structures using a traditional empirical approach as well as the novel adaptation of computational modeling techniques from the machine learning literature. Experiment 1 focused on rule-based (RB) category structures thought to promote between-category representations. Participants learned two sets of two categories during training and were subsequently tested on a novel categorization problem using the training categories. Classification training resulted in a bias toward between-category representations whereas concept training resulted in a bias toward within-category representations. Experiment 2 focused on information-integration (II) category structures thought to promote within-category representations. With II structures, there was a bias toward within-category representations regardless of training methodology. Furthermore, in both experiments, computational modeling suggests that only within-category representations could support generalization during the test phase. These data suggest that within-category representations may be dominant and more robust in supporting the reconfiguration of current knowledge for generalization.

  11. A Historical Perspective on Dynamics Testing at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Kvaternik, Raymond G.; Hanks, Brantley R.

    2000-01-01

    The experience and advancement of structural dynamics testing for space system applications at the Langley Research Center of the National Aeronautics and Space Administration (NASA) over the past four decades is reviewed. This experience began in the 1960s with the development of a technology base using a variety of physical models to explore dynamic phenomena and to develop reliable analytical modeling capability for space systems. It continued through the 1970s and 80s with the development of rapid, computer-aided test techniques, the testing of low-natural-frequency, gravity-sensitive systems, the testing of integrated structures with active flexible motion control, and orbital flight measurements. It extended into the 1990s, when advanced computerized system identification methods were developed for estimating the dynamic states of complex, lightweight, flexible aerospace systems. The scope of discussion in this paper includes ground and flight tests and summarizes lessons learned in both successes and failures.

  12. Computational method to predict thermodynamic, transport, and flow properties for the modified Langley 8-foot high-temperature tunnel

    NASA Technical Reports Server (NTRS)

    Venkateswaran, S.; Hunt, L. Roane; Prabhu, Ramadas K.

    1992-01-01

    The Langley 8 foot high temperature tunnel (8 ft HTT) is used to test components of hypersonic vehicles for aerothermal loads definition and structural component verification. The test medium of the 8 ft HTT is obtained by burning a mixture of methane and air under high pressure; the combustion products are expanded through an axisymmetric conical contoured nozzle to simulate atmospheric flight at Mach 7. This facility was modified to raise the oxygen content of the test medium to match that of air and to add Mach 4 and Mach 5 capabilities. These modifications will facilitate the testing of hypersonic air-breathing propulsion systems over a wide range of flight conditions. A computational method to predict the thermodynamic, transport, and flow properties of the equilibrium, chemically reacting, oxygen-enriched methane-air combustion products was implemented in a computer code. This code calculates the fuel, air, and oxygen mass flow rates and test section flow properties for the Mach 7, 5, and 4 nozzle configurations for given combustor and mixer conditions. Salient features of the 8 ft HTT are described, and some of the predicted tunnel operational characteristics are presented in carpet plots to assist users in preparing test plans.
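    The actual code handles equilibrium, chemically reacting combustion products with composition-dependent properties; as a much-simplified illustration of the flow-property relations involved, a perfect-gas isentropic sketch (constant gamma = 1.4, which the real vitiated test medium does not satisfy) is:

```python
def isentropic_ratios(mach, gamma=1.4):
    """Total-to-static temperature and pressure ratios for isentropic,
    calorically perfect flow at a given Mach number."""
    t_ratio = 1 + (gamma - 1) / 2 * mach ** 2        # T0 / T
    p_ratio = t_ratio ** (gamma / (gamma - 1))       # p0 / p
    return t_ratio, p_ratio

# nozzle exit conditions for the facility's three nominal Mach numbers
ratios = {m: isentropic_ratios(m) for m in (4.0, 5.0, 7.0)}
```

    The real calculation replaces these closed-form ratios with equilibrium thermodynamic tables for the methane-air products, but the same combustor-to-test-section expansion logic applies.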

  13. XTALOPT: An open-source evolutionary algorithm for crystal structure prediction

    NASA Astrophysics Data System (ADS)

    Lonie, David C.; Zurek, Eva

    2011-02-01

    The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface.

    Program summary
    Program title: XTALOPT
    Catalogue identifier: AEGX_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GPL v2.1 or later [1]
    No. of lines in distributed program, including test data, etc.: 36 849
    No. of bytes in distributed program, including test data, etc.: 1 149 399
    Distribution format: tar.gz
    Programming language: C++
    Computer: PCs, workstations, or clusters
    Operating system: Linux
    Classification: 7.7
    External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]
    Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics.
    Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. Our evolutionary algorithm, XTALOPT, is freely available to the scientific community for use and collaboration under the GNU Public License.
    Running time: User dependent. The program runs until stopped by the user.
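    XTALOPT itself is C++ and operates on periodic crystal structures, but the generic evolutionary loop it builds on (selection, crossover, mutation, and the duplicate filtering the abstract highlights) can be sketched on a toy continuous "energy" function. Every detail here (tournament size, mutation width, rounding-based duplicate key) is an illustrative choice, not XTALOPT's implementation.

```python
import random

def evolve(energy, dim, pop_size=20, gens=60, seed=7):
    """Toy real-valued evolutionary search: tournament selection, one-point
    crossover, Gaussian mutation, and duplicate filtering via rounded genomes."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        seen, nxt = set(), []
        while len(nxt) < pop_size:
            # two parents, each the best of a random tournament of three
            a, b = (min(rng.sample(pop, 3), key=energy) for _ in range(2))
            cut = rng.randrange(1, dim) if dim > 1 else 0
            child = a[:cut] + b[cut:]
            child = [g + rng.gauss(0, 0.1) for g in child]   # mutation
            key = tuple(round(g, 2) for g in child)          # crude duplicate filter
            if key not in seen:
                seen.add(key)
                nxt.append(child)
        pop = nxt
    return min(pop, key=energy)

# toy "lattice energy" with its minimum at the origin
energy = lambda xs: sum(x * x for x in xs)
best = evolve(energy, dim=3)
```

    In XTALOPT the genome is a unit cell plus atomic coordinates, the energy comes from an external code such as VASP, PWSCF, or GULP, and duplicate detection uses structural comparison rather than rounding.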

  14. Energy Finite Element Analysis for Computing the High Frequency Vibration of the Aluminum Testbed Cylinder and Correlating the Results to Test Data

    NASA Technical Reports Server (NTRS)

    Vlahopoulos, Nickolas

    2005-01-01

    The Energy Finite Element Analysis (EFEA) is a finite element based computational method for high frequency vibration and acoustic analysis. The EFEA solves with finite elements governing differential equations for energy variables. These equations are developed from wave equations. Recently, an EFEA method for computing high frequency vibration of structures either in vacuum or in contact with a dense fluid has been presented. The presence of fluid loading has been considered through added mass and radiation damping. The EFEA developments were validated by comparing EFEA results to solutions obtained by very dense conventional finite element models and solutions from classical techniques such as statistical energy analysis (SEA) and the modal decomposition method for bodies of revolution. EFEA results have also been compared favorably with test data for the vibration and the radiated noise generated by a large scale submersible vehicle. The primary variable in EFEA is the energy density, time-averaged over a period and space-averaged over a wavelength. A joint matrix computed from the power transmission coefficients is utilized for coupling the energy density variables across any discontinuities, such as changes of plate thickness, plate/stiffener junctions, etc. When considering the high frequency vibration of a periodically stiffened plate or cylinder, the flexural wavelength is smaller than the interval length between two periodic stiffeners; therefore, the stiffener stiffness cannot be smeared by computing an equivalent rigidity for the plate or cylinder. The periodic stiffeners must instead be regarded as coupling components between periodic units. In this paper, Periodic Structure (PS) theory is utilized for computing the coupling joint matrix and for accounting for the periodicity characteristics.

  15. Hypersonic Research Vehicle (HRV) real-time flight test support feasibility and requirements study. Part 1: Real-time flight experiment support

    NASA Technical Reports Server (NTRS)

    Rediess, Herman A.; Ramnath, Rudrapatna V.; Vrable, Daniel L.; Hirvo, David H.; Mcmillen, Lowell D.; Osofsky, Irving B.

    1991-01-01

    The results are presented of a study to identify potential real time remote computational applications to support monitoring HRV flight test experiments along with definitions of preliminary requirements. A major expansion of the support capability available at Ames-Dryden was considered. The focus is on the use of extensive computation and data bases together with real time flight data to generate and present high level information to those monitoring the flight. Six examples were considered: (1) boundary layer transition location; (2) shock wave position estimation; (3) performance estimation; (4) surface temperature estimation; (5) critical structural stress estimation; and (6) stability estimation.

  16. Geometric modeling of subcellular structures, organelles, and multiprotein complexes

    PubMed Central

    Feng, Xin; Xia, Kelin; Tong, Yiying; Wei, Guo-Wei

    2013-01-01

    Recently, the structure, function, stability, and dynamics of subcellular structures, organelles, and multi-protein complexes have emerged as a leading interest in structural biology. Geometric modeling not only provides visualizations of shapes for large biomolecular complexes but also fills the gap between structural information and theoretical modeling, and enables the understanding of function, stability, and dynamics. This paper introduces a suite of computational tools for volumetric data processing, information extraction, surface mesh rendering, geometric measurement, and curvature estimation of biomolecular complexes. Particular emphasis is given to the modeling of cryo-electron microscopy data. Lagrangian-triangle meshes are employed for the surface representation. On the basis of this representation, algorithms are developed for surface area and surface-enclosed volume calculation, and curvature estimation. Methods for volumetric meshing are also presented. Because technological development in computer science and mathematics has led to multiple choices at each stage of geometric modeling, we discuss the rationales in the design and selection of various algorithms. Analytical models are designed to test the computational accuracy and convergence of the proposed algorithms. Finally, we select a set of six cryo-electron microscopy data sets representing typical subcellular complexes to demonstrate the efficacy of the proposed algorithms in handling biomolecular surfaces and explore their capability of geometric characterization of binding targets. This paper offers a comprehensive protocol for the geometric modeling of subcellular structures, organelles, and multiprotein complexes. PMID:23212797

  17. The importance of employing computational resources for the automation of drug discovery.

    PubMed

    Rosales-Hernández, Martha Cecilia; Correa-Basurto, José

    2015-03-01

    The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automation for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET) properties, as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds at a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that, after submission of one target, this software will be capable of suggesting potential compounds along with ways to synthesize them, and of presenting biological models for testing.

  18. Preparation of Morpheus Vehicle for Vacuum Environment Testing

    NASA Technical Reports Server (NTRS)

    Sandoval, Armando

    2016-01-01

    The main objective for this summer 2016 tour was to prepare the Morpheus vehicle for its upcoming test inside Plum Brook's vacuum chamber at NASA John H. Glenn Research Center. My contributions to this project were mostly analytical in nature: providing numerical models to validate test data, generating computer-aided analyses for the structural support of the vehicle's engine, and designing a vacuum can to protect the high-speed camera used during testing. Furthermore, I was also tasked with designing a toroidal spray bar system for the tank.

  19. Creating a Structured AOP Knowledgebase via Ontology-Based Annotations

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework is increasingly used to integrate data from traditional and emerging toxicity testing paradigms. As the number of AOP descriptions has increased, so has the need to define the AOP in terms that can be interpreted computationally. We wil...

  20. An alternative approach for computing seismic response with accidental eccentricity

    NASA Astrophysics Data System (ADS)

    Fan, Xuanhua; Yin, Jiacong; Sun, Shuli; Chen, Pu

    2014-09-01

    Accidental eccentricity is a non-standard assumption in the seismic design of tall buildings. Taking it into consideration requires reanalysis of seismic resistance, which in turn requires either time-consuming computation of the natural vibration of eccentric structures or a static displacement solution obtained by applying an approximate equivalent torsional moment for each eccentric case. This study proposes an alternative modal response spectrum analysis (MRSA) approach to calculate seismic responses with accidental eccentricity. The proposed approach, called Rayleigh-Ritz Projection-MRSA (RRP-MRSA), is developed from MRSA and two strategies: (a) an RRP method for fast calculation of approximate modes of eccentric structures; and (b) an approach to assemble the mass matrices of eccentric structures. The efficiency of RRP-MRSA is tested on engineering examples and compared with the standard MRSA (ST-MRSA) and one approximate method, the equivalent torsional moment hybrid MRSA (ETM-MRSA). Numerical results show that RRP-MRSA not only achieves almost the same precision as ST-MRSA, and much better accuracy than ETM-MRSA, but is also more economical. RRP-MRSA can thus be used in place of current accidental eccentricity computations in seismic design.
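
    The Rayleigh-Ritz projection underlying RRP-MRSA can be illustrated with a toy model. The sketch below is not the authors' implementation: it assumes a hypothetical 3-DOF structure (two translations, one rotation) whose nominal modes are the unit vectors, projects an eccentric mass matrix onto two retained nominal modes, and solves the resulting 2x2 reduced generalized eigenproblem in closed form.

```python
import math

def project(A, basis):
    # Reduce a full matrix onto a set of Ritz vectors: (B^T A B)_ij.
    n = len(A)
    return [[sum(bi[r] * A[r][c] * bj[c] for r in range(n) for c in range(n))
             for bj in basis] for bi in basis]

# Nominal model: diagonal K and M, so the nominal modes are unit vectors.
K = [[4.0, 0.0, 0.0], [0.0, 9.0, 0.0], [0.0, 0.0, 16.0]]
e = 0.05  # accidental mass eccentricity coupling DOF 0 and DOF 2 (invented value)
M_ecc = [[1.0, 0.0, e], [0.0, 1.0, 0.0], [e, 0.0, 1.0]]

# Retain two nominal modes as the Ritz basis for fast approximate reanalysis.
basis = [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
Kr = project(K, basis)      # reduced stiffness: [[4, 0], [0, 16]]
Mr = project(M_ecc, basis)  # reduced mass:      [[1, e], [e, 1]]

# Solve det(Kr - lam*Mr) = 0, a quadratic in lam = omega^2.
a = Mr[0][0] * Mr[1][1] - Mr[0][1] * Mr[1][0]
b = -(Kr[0][0] * Mr[1][1] + Kr[1][1] * Mr[0][0]
      - Kr[0][1] * Mr[1][0] - Kr[1][0] * Mr[0][1])
c = Kr[0][0] * Kr[1][1] - Kr[0][1] * Kr[1][0]
disc = math.sqrt(b * b - 4 * a * c)
lam = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
print(lam)  # eigenvalues shifted slightly from the nominal 4.0 and 16.0
```

    Each eccentric case only requires reassembling the small reduced mass matrix and re-solving the reduced eigenproblem, which is the source of the economy claimed for the method.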

  1. StrBioLib: a Java library for development of custom computational structural biology applications.

    PubMed

    Chandonia, John-Marc

    2007-08-01

    StrBioLib is a library of Java classes useful for developing software for computational structural biology research. StrBioLib contains classes to represent and manipulate protein structures, biopolymer sequences, sets of biopolymer sequences, and alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with commonly used bioinformatics applications, including PSI-BLAST, MODELLER, MUSCLE, and Primer3, and tools are provided to read and write many file formats used to represent bioinformatic data. The library includes a general-purpose neural network object with multiple training algorithms, the Hooke and Jeeves non-linear optimization algorithm, and tools for efficient C-style string parsing and formatting. StrBioLib is the basis for the Pred2ary secondary structure prediction program, is used to build the ASTRAL compendium for sequence and structure analysis, and has been extensively tested through use in many smaller projects. Examples and documentation are available at the site below. StrBioLib may be obtained under the terms of the GNU LGPL license from http://strbio.sourceforge.net/

  2. Concept Mapping Assessment of Media Assisted Learning in Interdisciplinary Science Education

    NASA Astrophysics Data System (ADS)

    Schaal, Steffen; Bogner, Franz X.; Girwidz, Raimund

    2010-05-01

    Acquisition of conceptual knowledge is a central aim in science education. In this study we monitored an interdisciplinary hypermedia-assisted learning unit on hibernation and thermodynamics based on cooperative learning. We used concept mapping for the assessment, applying a pre-test/post-test design. In our study, 106 9th graders cooperated by working in pairs (n = 53) for six lessons. Because an interdisciplinary learning activity in such complex knowledge domains has to combine many different aspects, we focused on long-term knowledge. Learners working cooperatively in dyads constructed computer-supported concept maps, which were analysed by specific software. The data analysis encompassed structural aspects of the knowledge corresponding to a target reference map. After the learning unit, the results showed the acquisition of higher-order, domain-specific knowledge structures, which indicates successful interdisciplinary learning through the hypermedia learning environment. The benefit of using a computer-assisted concept mapping assessment for research in science education, and in science classrooms, is considered.

  3. Analysis of sintered polymer scaffolds using concomitant synchrotron computed tomography and in situ mechanical testing.

    PubMed

    Dhillon, A; Schneider, P; Kuhn, G; Reinwald, Y; White, L J; Levchuk, A; Rose, F R A J; Müller, R; Shakesheff, K M; Rahman, C V

    2011-12-01

    The mechanical behaviour of polymer scaffolds plays a vital role in their successful use in bone tissue engineering. The present study utilised novel sintered polymer scaffolds prepared using temperature-sensitive poly(DL-lactic acid-co-glycolic acid)/poly(ethylene glycol) particles. The microstructure of these scaffolds was monitored under compressive strain by image-guided failure assessment (IGFA), which combined synchrotron radiation computed tomography (SR CT) and in situ micro-compression. Three-dimensional CT data sets of scaffolds subjected to a strain rate of 0.01%/s illustrated particle movement within the scaffolds with no deformation or cracking. When compressed using a higher strain rate of 0.02%/s particle movement was more pronounced and cracks between sintered particles were observed. The results from this study demonstrate that IGFA based on simultaneous SR CT imaging and micro-compression testing is a useful tool for assessing structural and mechanical scaffold properties, leading to further insight into structure-function relationships in scaffolds for bone tissue engineering applications.

  4. Computer-Based Learning: Graphical Integration of Whole and Sectional Neuroanatomy Improves Long-Term Retention

    PubMed Central

    Naaz, Farah; Chariker, Julia H.; Pani, John R.

    2013-01-01

    A study was conducted to test the hypothesis that instruction with graphically integrated representations of whole and sectional neuroanatomy is especially effective for learning to recognize neural structures in sectional imagery (such as MRI images). Neuroanatomy was taught to two groups of participants using computer graphical models of the human brain. Both groups learned whole anatomy first with a three-dimensional model of the brain. One group then learned sectional anatomy using two-dimensional sectional representations, with the expectation that there would be transfer of learning from whole to sectional anatomy. The second group learned sectional anatomy by moving a virtual cutting plane through the three-dimensional model. In tests of long-term retention of sectional neuroanatomy, the group with graphically integrated representation recognized more neural structures that were known to be challenging to learn. This study demonstrates the use of graphical representation to facilitate a more elaborated (deeper) understanding of complex spatial relations. PMID:24563579

  5. Rapid and Accurate Machine Learning Recognition of High Performing Metal Organic Frameworks for CO2 Capture.

    PubMed

    Fernandez, Michael; Boyd, Peter G; Daff, Thomas D; Aghaji, Mohammad Zein; Woo, Tom K

    2014-09-04

    In this work, we have developed quantitative structure-property relationship (QSPR) models using advanced machine learning algorithms that can rapidly and accurately recognize high-performing metal organic framework (MOF) materials for CO2 capture. More specifically, QSPR classifiers have been developed that can, in a fraction of a second, identify candidate MOFs with enhanced CO2 adsorption capacity (>1 mmol/g at 0.15 bar and >4 mmol/g at 1 bar). The models were tested on a large set of 292 050 MOFs that were not part of the training set. The QSPR classifier could recover 945 of the top 1000 MOFs in the test set while flagging only 10% of the whole library for compute-intensive screening. Thus, using the machine learning classifiers as part of a high-throughput screening protocol would result in an order of magnitude reduction in compute time and allow intractably large structure libraries and search spaces to be screened.
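
    The screen-then-simulate workflow described above can be sketched in miniature. This is not the authors' model (they trained advanced ML classifiers on geometric descriptors of real MOF structures); it is a toy nearest-centroid classifier on two invented descriptors, shown only to illustrate how a cheap QSPR filter flags a small fraction of a library for expensive screening.

```python
# Toy QSPR-style pre-screen: label hypothetical MOFs high/low CO2 uptake
# from (void fraction, surface area) using a nearest-centroid rule.

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(x, centroids):
    # Pick the label whose class centroid is closest in descriptor space.
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(x, centroids[lab])))

# (void fraction, surface area in 1000 m^2/g) -- invented training values
train = {
    "high": [(0.85, 4.1), (0.80, 3.8), (0.88, 4.5)],
    "low":  [(0.35, 1.2), (0.40, 0.9), (0.30, 1.5)],
}
cents = {lab: centroid(pts) for lab, pts in train.items()}

library = [(0.82, 4.0), (0.33, 1.1), (0.60, 2.6)]
flagged = [m for m in library if classify(m, cents) == "high"]
print(flagged)  # only flagged candidates proceed to compute-intensive screening
```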

  6. A Nonlinear Modal Aeroelastic Solver for FUN3D

    NASA Technical Reports Server (NTRS)

    Goldman, Benjamin D.; Bartels, Robert E.; Biedron, Robert T.; Scott, Robert C.

    2016-01-01

    A nonlinear structural solver has been implemented internally within the NASA FUN3D computational fluid dynamics code, allowing for some new aeroelastic capabilities. Using a modal representation of the structure, a set of differential or differential-algebraic equations are derived for general thin structures with geometric nonlinearities. ODEPACK and LAPACK routines are linked with FUN3D, and the nonlinear equations are solved at each CFD time step. The existing predictor-corrector method is retained, whereby the structural solution is updated after mesh deformation. The nonlinear solver is validated using a test case for a flexible aeroshell at transonic, supersonic, and hypersonic flow conditions. Agreement with linear theory is seen for the static aeroelastic solutions at relatively low dynamic pressures, but structural nonlinearities limit deformation amplitudes at high dynamic pressures. No flutter was found at any of the tested trajectory points, though LCO may be possible in the transonic regime.

  7. The temperature dependence of inelastic light scattering from small particles for use in combustion diagnostic instrumentation

    NASA Technical Reports Server (NTRS)

    Cloud, Stanley D.

    1987-01-01

    A computer calculation of the expected angular distribution of coherent anti-Stokes Raman scattering (CARS) from micrometer size polystyrene spheres based on a Mie-type model, and a pilot experiment to test the feasibility of measuring CARS angular distributions from micrometer size polystyrene spheres by simply suspending them in water are discussed. The computer calculations predict a very interesting structure in the angular distributions that depends strongly on the size and relative refractive index of the spheres.

  8. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0037: Prognosis-Based Control Reconfiguration for an Aircraft with Faulty Actuator to Enable Performance in a Degraded State

    DTIC Science & Technology

    2010-12-01

    computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g. suspensions, bodies) with hydraulic or...complex, comprehensive mechanical systems can be simulated in real-time by parallel computers; examples include multibody systems, brake systems...hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in

  9. The non-independence discussion about cycle structure in the computer language: the final simplification of computer language in the structural design

    NASA Astrophysics Data System (ADS)

    Yang, Peilu

    2013-03-01

    The article first discusses the theory, content, development of, and open questions about structured program design. Extending this foundation, it examines the common position that the control structures of computer languages (sequence, branch, and loop) are mutually independent. Through further research, the author finds that they are not independent, and reaches a final simplification of computer language design: the author proposes the linear language structure (I structure) and the curvilinear structure (Y structure), which give a computer language high efficiency through simplification during program development. The research in this article is consistent with the dualistic structure widely used in the computer field and could greatly promote the evolution of computer languages.

  10. Neural Mechanisms Underlying the Computation of Hierarchical Tree Structures in Mathematics

    PubMed Central

    Nakai, Tomoya; Sakai, Kuniyoshi L.

    2014-01-01

    Whether mathematical and linguistic processes share the same neural mechanisms has been a matter of controversy. By examining various sentence structures, we recently demonstrated that activations in the left inferior frontal gyrus (L. IFG) and left supramarginal gyrus (L. SMG) were modulated by the Degree of Merger (DoM), a measure for the complexity of tree structures. In the present study, we hypothesize that the DoM is also critical in mathematical calculations, and clarify whether the DoM in the hierarchical tree structures modulates activations in these regions. We tested an arithmetic task that involved linear and quadratic sequences with recursive computation. Using functional magnetic resonance imaging, we found significant activation in the L. IFG, L. SMG, bilateral intraparietal sulcus (IPS), and precuneus selectively among the tested conditions. We also confirmed that activations in the L. IFG and L. SMG were free from memory-related factors, and that activations in the bilateral IPS and precuneus were independent from other possible factors. Moreover, by fitting parametric models of eight factors, we found that the model of DoM in the hierarchical tree structures was the best to explain the modulation of activations in these five regions. Using dynamic causal modeling, we showed that the model with a modulatory effect for the connection from the L. IPS to the L. IFG, and with driving inputs into the L. IFG, was highly probable. The intrinsic, i.e., task-independent, connection from the L. IFG to the L. IPS, as well as that from the L. IPS to the R. IPS, would provide a feedforward signal, together with negative feedback connections. We indicate that mathematics and language share the network of the L. IFG and L. IPS/SMG for the computation of hierarchical tree structures, and that mathematics recruits the additional network of the L. IPS and R. IPS. PMID:25379713

  11. Geographically distributed hybrid testing & collaboration between geotechnical centrifuge and structures laboratories

    NASA Astrophysics Data System (ADS)

    Ojaghi, Mobin; Martínez, Ignacio Lamata; Dietz, Matt S.; Williams, Martin S.; Blakeborough, Anthony; Crewe, Adam J.; Taylor, Colin A.; Madabhushi, S. P. Gopal; Haigh, Stuart K.

    2018-01-01

    Distributed Hybrid Testing (DHT) is an experimental technique designed to capitalise on advances in modern networking infrastructure to overcome traditional laboratory capacity limitations. By coupling the heterogeneous test apparatus and computational resources of geographically distributed laboratories, DHT provides the means to take on complex, multi-disciplinary challenges with new forms of communication and collaboration. To introduce the opportunity and practicability afforded by DHT, here an exemplar multi-site test is addressed in which a dedicated fibre network and suite of custom software is used to connect the geotechnical centrifuge at the University of Cambridge with a variety of structural dynamics loading apparatus at the University of Oxford and the University of Bristol. While centrifuge time-scaling prevents real-time rates of loading in this test, such experiments may be used to gain valuable insights into physical phenomena, test procedure and accuracy. These and other related experiments have led to the development of the real-time DHT technique and the creation of a flexible framework that aims to facilitate future distributed tests within the UK and beyond. As a further example, a real-time DHT experiment between structural labs using this framework for testing across the Internet is also presented.

  12. A 2-year study of Gram stain competency assessment in 40 clinical laboratories.

    PubMed

    Goodyear, Nancy; Kim, Sara; Reeves, Mary; Astion, Michael L

    2006-01-01

    We used a computer-based competency assessment tool for Gram stain interpretation to assess the performance of 278 laboratory staff from 40 laboratories on 40 multiple-choice questions. We report test reliability, mean scores, median, item difficulty, discrimination, and analysis of the highest- and lowest-scoring questions. The questions were reliable (KR-20 coefficient, 0.80). The overall mean score was 88% (range, 63%-98%). When categorized by cell type, the means were: host cells, 93%; other cells (eg, yeast), 92%; gram-positive, 90%; and gram-negative, 88%. When categorized by type of interpretation, the means were: other (eg, underdecolorization), 92%; identify by structure (eg, bacterial morphologic features), 91%; and identify by name (eg, genus and species), 87%. Of the 6 highest-scoring questions (mean scores ≥99%), 5 were identify by structure and 1 was identify by name. Of the 6 lowest-scoring questions (mean scores <75%), 5 were gram-negative and 1 was host cells; by type of interpretation, 2 were identify by structure and 4 were identify by name. Computer-based Gram stain competency assessment examinations are reliable. Our analysis helps laboratories identify areas for continuing education in Gram stain interpretation and will direct future revisions of the tests.
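
    The KR-20 reliability coefficient reported above (0.80) is computed from the examinees' right/wrong response matrix. A minimal sketch of the standard formula follows, applied to a small made-up matrix (the study's actual data are not reproduced here):

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for a 0/1 response matrix (rows = examinees)."""
    k = len(responses[0])   # number of items
    n = len(responses)      # number of examinees
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    sum_pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n  # proportion correct on item j
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)

# Made-up 4-examinee x 5-item matrix, purely for illustration.
matrix = [
    [1, 1, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 1, 1, 0, 0],
]
print(kr20(matrix))  # → 0.625
```

    Values near 1 indicate that items consistently rank the same examinees high or low; the study's 0.80 is in the range usually considered acceptable for competency testing.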

  13. Computer-Based Readability Testing of Information Booklets for German Cancer Patients.

    PubMed

    Keinki, Christian; Zowalla, Richard; Pobiruchin, Monika; Huebner, Jutta; Wiesner, Martin

    2018-04-12

    Understandable health information is essential for treatment adherence and improved health outcomes. For readability testing, several instruments analyze the complexity of sentence structures, e.g., the Flesch Reading Ease (FRE) or the Vienna formula (WSTF). Moreover, the vocabulary is of high relevance for readers. The aim of this study is to investigate the agreement between sentence structure-based and vocabulary-based (SVM) instruments. A total of 52 freely available German patient information booklets on cancer were collected from the Internet. The mean understandability level L was computed for 51 booklets. The resulting values of FRE, WSTF, and SVM were assessed pairwise for agreement with Bland-Altman plots and two-sided, paired t tests. For the pairwise comparison, the mean L values are L_FRE = 6.81, L_WSTF = 7.39, and L_SVM = 5.09. The sentence structure-based metrics gave significantly different scores (P < 0.001) for all assessed booklets, as confirmed by the Bland-Altman analysis. The study findings suggest that vocabulary-based instruments cannot be interchanged with FRE/WSTF. However, both analytical aspects should be considered and checked by authors to linguistically refine texts with respect to the individual target group. Authors of health information can be supported by automated readability analysis. Health professionals can benefit from direct booklet comparisons, allowing for time-effective selection of suitable booklets for patients.
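
    To make the sentence-structure side of this comparison concrete, here is a sketch of the classic (English) Flesch Reading Ease formula with a naive vowel-group syllable counter. Note the caveats: the study worked on German texts, for which adapted coefficients (e.g., the Amstad variant) and the WSTF are used, and real implementations need proper sentence and syllable segmentation.

```python
import re

def flesch_reading_ease(text):
    """Classic English FRE = 206.835 - 1.015*ASL - 84.6*ASW.
    ASL = avg words per sentence, ASW = avg syllables per word.
    Syllables are approximated by counting groups of consecutive vowels."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)

    def syllables(word):
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    asl = len(words) / len(sentences)
    asw = sum(syllables(w) for w in words) / len(words)
    return 206.835 - 1.015 * asl - 84.6 * asw

score = flesch_reading_ease("The cat sat on the mat. It was happy.")
print(score)  # short sentences and short words give a high (easy) score
```

    A vocabulary-based instrument, by contrast, scores the text by how familiar its words are to the target readership, which is why the two families of metrics can disagree, as the study found.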

  14. Shielding requirements for the Space Station habitability modules

    NASA Technical Reports Server (NTRS)

    Avans, Sherman L.; Horn, Jennifer R.; Williamsen, Joel E.

    1990-01-01

    The design, analysis, development, and tests of the total meteoroid/debris protection system for the Space Station Freedom habitability modules, such as the habitation module, the laboratory module, and the node structures, are described. Design requirements are discussed along with development efforts, including a combination of hypervelocity testing and analyses. Computer hydrocode analysis of hypervelocity impact phenomena associated with Space Station habitability structures is covered and the use of optimization techniques, engineering models, and parametric analyses is assessed. Explosive rail gun development efforts and protective capability and damage tolerance of multilayer insulation due to meteoroid/debris impact are considered. It is concluded that anticipated changes in the debris environment definition and requirements will require rescoping the tests and analysis required to develop a protection system.

  15. A millimeter-wave tunneladder TWT

    NASA Technical Reports Server (NTRS)

    Wilson, D.

    1988-01-01

    A millimeter-wave traveling wave tube (TWT) was developed using a dispersive, high-impedance forward-wave interaction structure based on a ladder, with non-space-harmonic interaction, for a tube with high gain per inch and high efficiency. The 'TunneLadder' interaction structure combines ladder properties modified to accommodate Pierce gun beam optics in a radially magnetized PM focusing structure. The development involved the fabrication of chemically milled, shaped ladders diffusion brazed to diamond cubes, which are in turn diffusion brazed to each ridge of a doubly ridged waveguide. Cold-test data, representing the omega-beta (dispersion) and impedance characteristics of the modified ladder circuit, were used in small- and large-signal computer programs to predict TWT gain and efficiency. The structural design emphasizes ruggedness and reliability. Actual data from tested tubes verify the predicted performance while providing broader bandwidth than expected.

  16. Acoustic environmental accuracy requirements for response determination

    NASA Technical Reports Server (NTRS)

    Pettitt, M. R.

    1983-01-01

    A general purpose computer program was developed for the prediction of vehicle interior noise. This program, named VIN, has both modal and statistical energy analysis capabilities for structural/acoustic interaction analysis. The analytic models and their computer implementation were verified through simple test cases with well-defined experimental results. The model was also applied in a space shuttle payload bay launch acoustics prediction study. The computer program processes large and small problems with equal efficiency because all arrays are dynamically sized by program input variables at run time. A data base is built and easily accessed for design studies. The data base significantly reduces the computational costs of such studies by allowing the reuse of the still-valid calculated parameters of previous iterations.

  17. Computational structural mechanics engine structures computational simulator

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1989-01-01

    The Computational Structural Mechanics (CSM) program at Lewis encompasses: (1) fundamental aspects for formulating and solving structural mechanics problems, and (2) development of integrated software systems to computationally simulate the performance/durability/life of engine structures.

  18. STEP and STEPSPL: Computer programs for aerodynamic model structure determination and parameter estimation

    NASA Technical Reports Server (NTRS)

    Batterson, J. G.

    1986-01-01

    The successful parametric modeling of the aerodynamics of an airplane operating at high angles of attack or sideslip is performed in two phases. First, the aerodynamic model structure must be determined; second, the associated aerodynamic parameters (stability and control derivatives) must be estimated for that model. The purpose of this paper is to document two versions of a stepwise regression computer program which were developed for the determination of airplane aerodynamic model structure, and to provide two examples of their use on computer-generated data. References are provided for the application of the programs to real flight data. The two computer programs that are the subject of this report, STEP and STEPSPL, are written in FORTRAN IV (ANSI 1966) compatible with a CDC FTN4 compiler. Both programs are adaptations of a standard forward stepwise regression algorithm. The purpose of the adaptation is to facilitate the selection of an adequate mathematical model of the aerodynamic force and moment coefficients of an airplane from flight test data. The major difference between STEP and STEPSPL is in the basis for the model. The basis for the model in STEP is the standard polynomial Taylor series expansion of the aerodynamic function about some steady-state trim condition. Program STEPSPL instead utilizes a set of spline basis functions.
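
    The forward stepwise idea behind STEP can be sketched as follows. This is not the FORTRAN program itself: it is a minimal greedy selector over invented polynomial candidate terms, fitted to noise-free synthetic data, and it omits the statistical entry/exit criteria (e.g., F-tests) that a real model-structure-determination program applies.

```python
def ols(cols, y):
    """Least squares via normal equations, solved by Gaussian elimination
    with partial pivoting; returns (coefficients, error sum of squares)."""
    p, n = len(cols), len(y)
    A = [[sum(cols[i][k] * cols[j][k] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(cols[i][k] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in range(p - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    sse = sum((y[k] - sum(coef[i] * cols[i][k] for i in range(p))) ** 2
              for k in range(n))
    return coef, sse

# Noise-free synthetic "flight data": y = 2*x1 + 0.5*x1*x2 (invented).
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [2, 1, 4, 3, 6, 5, 8, 7]
y = [2 * a + 0.5 * a * b for a, b in zip(x1, x2)]
terms = {"x1": x1, "x2": x2,
         "x1*x2": [a * b for a, b in zip(x1, x2)],
         "x1^2": [a * a for a in x1]}

selected, remaining = [], set(terms)
sse = sum(v * v for v in y)
while remaining and sse > 1e-10:
    # Greedily add the candidate term that reduces the fit error most.
    name = min(remaining, key=lambda t: ols([terms[s] for s in selected + [t]], y)[1])
    selected.append(name)
    remaining.remove(name)
    sse = ols([terms[s] for s in selected], y)[1]
print(selected, sse)  # the two true terms are recovered with near-zero error
```

    STEPSPL would differ only in the candidate set: spline basis functions in place of the polynomial Taylor-series terms used here.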

  19. A first-principle calculation of the XANES spectrum of Cu2+ in water

    NASA Astrophysics Data System (ADS)

    La Penna, G.; Minicozzi, V.; Morante, S.; Rossi, G. C.; Stellato, F.

    2015-09-01

    The progress in high performance computing we are witnessing today offers the possibility of accurate electron density calculations of systems in realistic physico-chemical conditions. In this paper, we present a strategy aimed at performing a first-principle computation of the low-energy part of the X-ray Absorption Spectroscopy (XAS) spectrum based on the density functional theory calculation of the electronic potential. To test its effectiveness, we apply the method to the computation of the X-ray absorption near edge structure part of the XAS spectrum in the paradigmatic, but simple, case of Cu2+ in water. To take into account the effect of metal site structure fluctuations in determining the experimental signal, the theoretical spectrum is evaluated as the average over the computed spectra of a statistically significant number of simulated metal site configurations. The comparison of experimental data with theoretical calculations suggests that Cu2+ lives preferentially in a square-pyramidal geometry. The remarkable success of this approach in the interpretation of XAS data makes us optimistic about the possibility of extending the computational strategy we have outlined to the more interesting case of molecules of biological relevance bound to transition metal ions.
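
    The configurational averaging step can be shown schematically. The sketch below is not the authors' DFT pipeline: the per-snapshot "spectra" are invented Gaussians on a toy energy grid, standing in for the computed spectrum of each simulated metal-site configuration; the point is only that the theoretical signal is the pointwise mean over snapshots.

```python
import math

energies = [i * 0.5 for i in range(41)]  # toy grid, 0-20 eV above the edge

def snapshot_spectrum(peak):
    # Invented stand-in for one configuration's computed absorption spectrum.
    return [math.exp(-((e - peak) ** 2) / 8.0) for e in energies]

# Site fluctuations shift the peak from snapshot to snapshot (invented values).
snapshots = [snapshot_spectrum(p) for p in (9.0, 10.0, 11.0)]

# The theoretical spectrum is the pointwise average over all configurations.
average = [sum(s[i] for s in snapshots) / len(snapshots)
           for i in range(len(energies))]
peak_e = energies[max(range(len(energies)), key=lambda i: average[i])]
print(peak_e)  # the averaged feature sits between the fluctuating snapshot peaks
```

    Averaging broadens and smooths the sharp single-configuration features, which is what allows the computed ensemble spectrum to be compared with the experimentally measured one.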

  20. Design of Test Articles and Monitoring System for the Characterization of HIRF Effects on a Fault-Tolerant Computer Communication System

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo; Malekpour, Mahyar R.; Miner, Paul S.; Koppen, Sandra V.

    2008-01-01

    This report describes the design of the test articles and monitoring systems developed to characterize the response of a fault-tolerant computer communication system when stressed beyond the theoretical limits for guaranteed correct performance. A high-intensity radiated electromagnetic field (HIRF) environment was selected as the means of injecting faults, as such environments are known to have the potential to cause arbitrary and coincident common-mode fault manifestations that can overwhelm redundancy management mechanisms. The monitors generate stimuli for the systems-under-test (SUTs) and collect data in real-time on the internal state and the response at the external interfaces. A real-time health assessment capability was developed to support the automation of the test. A detailed description of the nature and structure of the collected data is included. The goal of the report is to provide insight into the design and operation of these systems, and to serve as a reference document for use in post-test analyses.

  1. Ground test for vibration control demonstrator

    NASA Astrophysics Data System (ADS)

    Meyer, C.; Prodigue, J.; Broux, G.; Cantinaud, O.; Poussot-Vassal, C.

    2016-09-01

    With the objective of maximizing comfort in Falcon jets, Dassault Aviation is developing an innovative vibration control technology. Vibrations of the structure are measured at several locations and sent to a dedicated high-performance vibration control computer. Control laws are implemented in this computer to analyse the vibrations in real time and then compute commands that are sent to the existing control surfaces to counteract the vibrations. After detailing the technology principles, this paper focuses on the vibration control ground demonstration that was performed by Dassault Aviation in May 2015 on a Falcon 7X business jet. The goal of this test was to attenuate vibrations resulting from fixed forced excitation delivered by shakers. The ground test demonstrated the capability to implement an efficient closed-loop vibration control with a significant vibration level reduction and validated the vibration control law design methodology. This successful ground test was a prerequisite before the flight test demonstration that is now being prepared. This study has been partly supported by the JTI CleanSky SFWA-ITD.

  2. Reduced-Order Models for the Aeroelastic Analysis of Ares Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Vatsa, Veer N.; Biedron, Robert T.

    2010-01-01

    This document presents the development and application of unsteady aerodynamic, structural dynamic, and aeroelastic reduced-order models (ROMs) for the ascent aeroelastic analysis of the Ares I-X flight test and Ares I crew launch vehicles using the unstructured-grid, aeroelastic FUN3D computational fluid dynamics (CFD) code. The purpose of this work is to perform computationally-efficient aeroelastic response calculations that would be prohibitively expensive via computation of multiple full-order aeroelastic FUN3D solutions. These efficient aeroelastic ROM solutions provide valuable insight regarding the aeroelastic sensitivity of the vehicles to various parameters over a range of dynamic pressures.

  3. Bifilar analysis users manual, volume 2

    NASA Technical Reports Server (NTRS)

    Cassarino, S. J.

    1980-01-01

    The digital computer program developed to study the vibration response of a coupled rotor/bifilar/airframe system is described. The theoretical development of the rotor/airframe system equations of motion is provided. The fuselage and bifilar absorber equations of motion are discussed. The modular block approach used in the make-up of this computer program is described. The input data needed to run the rotor and bifilar absorber analyses are described. Sample output formats are presented and discussed. The results for four test cases, which use the major logic paths of the computer program, are presented. The overall program structure is discussed in detail. The FORTRAN subroutines are described in detail.

  4. Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.

    PubMed

    Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen

    2017-12-01

    In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Unlike existing tests that rely heavily on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the power of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may assist in detecting disease-associated gene-sets. The proposed methods have been implemented in an R-package HDtest and are available on CRAN. © 2017, The International Biometric Society.
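
    The combination of a max-type statistic with bootstrap critical values can be sketched as follows. This is a minimal one-sample illustration of the general idea, not the HDtest implementation; the multiplier-bootstrap form and all names here are assumptions.

```python
import numpy as np

def max_type_test(X, n_boot=2000, alpha=0.05, seed=0):
    """One-sample max-type test of H0: mean = 0, with a multiplier-
    bootstrap critical value that adapts to the unknown covariance."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    xbar = X.mean(axis=0)
    sd = X.std(axis=0, ddof=1)
    # Test statistic: largest standardized coordinate of the sample mean.
    T = np.max(np.abs(np.sqrt(n) * xbar / sd))
    # Multiplier bootstrap: perturb the centered rows with N(0,1)
    # weights, which preserves the covariance structure of the data
    # without imposing any structural assumptions on it.
    Xc = X - xbar
    T_boot = np.empty(n_boot)
    for b in range(n_boot):
        w = rng.standard_normal(n)
        mb = (w @ Xc) / n
        T_boot[b] = np.max(np.abs(np.sqrt(n) * mb / sd))
    crit = np.quantile(T_boot, 1.0 - alpha)
    return T, crit, bool(T > crit)

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 200))
X[:, 0] += 2.0                      # a single strong signal coordinate
T, crit, reject = max_type_test(X)  # the sparse alternative is detected
```

    A preliminary screening step, as in the two-step procedures described above, would simply restrict the coordinate-wise maximum to a pre-selected subset of features.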

  5. Global identifiability of linear compartmental models--a computer algebra algorithm.

    PubMed

    Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C

    1998-01-01

    A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is, thus, a prerequisite for parameter estimation of biological dynamic models. Global identifiability is however difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability) is presented, which combines the topological transfer function method with the Buchberger algorithm, to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general structure compartmental models from general multi input-multi output experiments. Examples of usage of GLOBI to analyze a priori global identifiability of some complex biological compartmental models are provided.

  6. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is critically important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computing is essential. As is well known, an arbitrary quantum computation in the circuit model can be performed using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We show how the unique properties of quantum systems translate into the computational properties of the simulation algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to poor computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual information graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for researching and testing development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used to improve algorithms in quantum information science.
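
    As a concrete illustration of the gate-simulation kernel and the memory-access pattern it induces (a generic state-vector sketch, not the authors' code):

```python
import numpy as np

def apply_single_qubit_gate(state, gate, k, n):
    """Apply a 2x2 unitary to qubit k of an n-qubit state vector.
    The reshape isolates qubit k as an axis of stride 2**k; this
    stride is exactly what determines the locality (and hence the
    parallelizability) of the simulation: small k means contiguous
    pairs, large k means far-apart memory accesses."""
    psi = state.reshape(2 ** (n - k - 1), 2, 2 ** k)
    return np.einsum('ab,ibj->iaj', gate, psi).reshape(-1)

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # Hadamard
n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                                   # |000>
state = apply_single_qubit_gate(state, H, 0, n)  # (|000> + |001>)/sqrt(2)
```

    Two-qubit gates follow the same pattern with two isolated axes, which is where entanglement forces non-local access across the state vector.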

  7. Creating a Structured Adverse Outcome Pathway Knowledgebase via Ontology-Based Annotations

    EPA Science Inventory

    The Adverse Outcome Pathway (AOP) framework is increasingly used to integrate data based on traditional and emerging toxicity testing paradigms. As the number of AOP descriptions has increased, so has the need to define the AOP in computable terms. Herein, we present a comprehens...

  8. Computer-Aided Techniques for Providing Operator Performance Measures.

    ERIC Educational Resources Information Center

    Connelly, Edward M.; And Others

    This report documents the theory, structure, and implementation of a performance processor (written in FORTRAN IV) that can accept performance demonstration data representing various levels of operator's skill and, under user control, analyze data to provide candidate performance measures and validation test results. The processor accepts two…

  9. PREDICTION OF THE VAPOR PRESSURE, BOILING POINT, HEAT OF VAPORIZATION AND DIFFUSION COEFFICIENT OF ORGANIC COMPOUNDS

    EPA Science Inventory

    The prototype computer program SPARC has been under development for several years to estimate physical properties and chemical reactivity parameters of organic compounds strictly from molecular structure. SPARC solute-solute physical process models have been developed and tested...

  10. A new model to compute the desired steering torque for steer-by-wire vehicles and driving simulators

    NASA Astrophysics Data System (ADS)

    Fankem, Steve; Müller, Steffen

    2014-05-01

    This paper deals with the control of the hand wheel actuator in steer-by-wire (SbW) vehicles and driving simulators (DSs). A novel model for the computation of the desired steering torque is presented. The introduced steering torque computation aims not only to generate a realistic steering feel, meaning that the driver should not miss the basic steering functionality of a modern conventional steering system, such as an electric power steering (EPS) or hydraulic power steering (HPS), in any driving situation. In addition, the modular structure of the steering torque computation, combined with suitably selected tuning parameters, has the objective of offering a high degree of customisability of the steering feel and thus providing each driver with his preferred steering feel in a very intuitive manner. The task and the tuning of each module are first described. Then, the steering torque computation is parameterised such that the steering feel of a series EPS system is reproduced. For this purpose, experiments are conducted in a hardware-in-the-loop environment, where a test EPS is mounted on a steering test bench coupled with a vehicle simulator, and parameter identification techniques are applied. Subsequently, how closely the steering torque computation mimics the test EPS system is objectively evaluated with respect to criteria concerning the steering torque level and gradient, the feedback behaviour and the steering return ability. Finally, the intuitive tuning of the modular steering torque computation is demonstrated by deriving a sportier steering feel configuration.

  11. Hardware math for the 6502 microprocessor

    NASA Technical Reports Server (NTRS)

    Kissel, R.; Currie, J.

    1985-01-01

    A floating-point arithmetic unit is described which is being used in the Ground Facility of Large Space Structures Control Verification (GF/LSSCV). The experiment uses two complete inertial measurement units and a set of three gimbal torquers in a closed loop to control the structural vibrations in a flexible test article (beam). A 6502 (8-bit) microprocessor controls four AMD 9511A floating-point arithmetic units to do all the computation in 20 milliseconds.

  12. Dynamic Identification for Control of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Ibrahim, S. R.

    1985-01-01

    This is a compilation of reports by one author on one subject. It consists of the following five journal articles: (1) A Parametric Study of the Ibrahim Time Domain Modal Identification Algorithm; (2) Large Modal Survey Testing Using the Ibrahim Time Domain Identification Technique; (3) Computation of Normal Modes from Identified Complex Modes; (4) Dynamic Modeling of Structures from Measured Complex Modes; and (5) Time Domain Quasi-Linear Identification of Nonlinear Dynamic Systems.

  13. Blast Load Simulator Experiments for Computational Model Validation: Report 1

    DTIC Science & Technology

    2016-08-01

    involving the inclusion of non-responding box-type structures in a BLS simulated blast environment. The BLS is a highly tunable compressed-gas-driven ... Blast Load Simulator (BLS) to evaluate its suitability for a future effort involving the inclusion of non-responding box-type structures located in ... Recommendations: Preliminary testing indicated that inclusion of the grill and diaphragm striker resulted in a decrease in peak pressure of about 12

  14. Partitioning Rectangular and Structurally Nonsymmetric Sparse Matrices for Parallel Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    B. Hendrickson; T.G. Kolda

    1998-09-01

    A common operation in scientific computing is the multiplication of a sparse, rectangular or structurally nonsymmetric matrix and a vector. In many applications the matrix- transpose-vector product is also required. This paper addresses the efficient parallelization of these operations. We show that the problem can be expressed in terms of partitioning bipartite graphs. We then introduce several algorithms for this partitioning problem and compare their performance on a set of test matrices.
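
    The bipartite-graph formulation described above can be illustrated with a small sketch: one vertex per matrix row, one per column, and one edge per nonzero, so that a vertex partition assigns rows and columns to processors for both y = A x and x = A^T y. The helper names and the hand-chosen partition are hypothetical; a real partitioner would use a graph-partitioning heuristic.

```python
def bipartite_graph(nnz, m, n):
    """Model an m x n sparse matrix as a bipartite graph:
    row vertices ('r', i), column vertices ('c', j), and one
    edge per nonzero entry (i, j)."""
    adj = {('r', i): set() for i in range(m)}
    adj.update({('c', j): set() for j in range(n)})
    for i, j in nnz:
        adj[('r', i)].add(('c', j))
        adj[('c', j)].add(('r', i))
    return adj

def cut_edges(adj, part):
    """Count nonzeros whose row and column vertices live on
    different processors -- a proxy for communication volume in
    the matrix-vector and matrix-transpose-vector products."""
    return sum(1 for u in adj for v in adj[u]
               if u[0] == 'r' and part[u] != part[v])

# A 3x3 matrix with 5 nonzeros, split across 2 processors.
nnz = [(0, 0), (0, 2), (1, 1), (2, 1), (2, 2)]
adj = bipartite_graph(nnz, 3, 3)
part = {('r', 0): 0, ('r', 1): 1, ('r', 2): 1,
        ('c', 0): 0, ('c', 1): 1, ('c', 2): 1}
```

    Here only the nonzero (0, 2) crosses the partition, so a single value must be communicated per product.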

  15. SMS crew station (C and D panels and forward structures). CEI part 1: Detail specification, type 1 data

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Established are the requirements for performance, design, test and qualification of one type of equipment identified as SMS C&D panels and forward structures. This CEI is used to provide all hardware and wiring necessary for the C&D panels to be properly interfaced with the computer complex/signal conversion equipment (SCE), crew station, and software requirements as defined in other CEI specifications.

  16. On Structure and Properties of Amorphous Materials

    PubMed Central

    Stachurski, Zbigniew H.

    2011-01-01

    Mechanical, optical, magnetic and electronic properties of amorphous materials hold great promise towards current and emergent technologies. We distinguish at least four categories of amorphous (glassy) materials: (i) metallic; (ii) thin films; (iii) organic and inorganic thermoplastics; and (iv) amorphous permanent networks. Some fundamental questions about the atomic arrangements remain unresolved. This paper focuses on the models of atomic arrangements in amorphous materials. The earliest ideas of Bernal on the structure of liquids were followed by experiments and computer models for the packing of spheres. The modern approach is to carry out computer simulations with predictions that can be tested by experiments. A geometrical concept of an ideal amorphous solid is presented as a novel contribution to the understanding of atomic arrangements in amorphous solids. PMID:28824158

  17. Configuration and Sizing of a Test Fixture for Panels Under Combined Loads

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.

    2006-01-01

    Future air and space structures are expected to utilize composite panels that are subjected to combined mechanical loads, such as bi-axial compression/tension, shear and pressure. Therefore, the ability to accurately predict the buckling and strength failures of such panels is important. While computational analysis can provide tremendous insight into panel response, experimental results are necessary to verify predicted performances of these panels to judge the accuracy of computational methods. However, application of combined loads is an extremely difficult task due to the complex test fixtures and set-up required. Presented herein is a comparison of several test set-ups capable of testing panels under combined loads. Configurations compared include a D-box, a segmented cylinder and a single panel set-up. The study primarily focuses on the preliminary sizing of a single panel test configuration capable of testing flat panels under combined in-plane mechanical loads. This single panel set-up appears to be best suited to the testing of both strength critical and buckling critical panels. Required actuator loads and strokes are provided for various square, flat panels.

  18. Integrated Vehicle Ground Vibration Testing of Manned Spacecraft: Historical Precedent

    NASA Technical Reports Server (NTRS)

    Lemke, Paul R.; Tuma, Margaret L.; Askins, Bruce R.

    2008-01-01

    For the first time in nearly 30 years, NASA is developing a new manned space flight launch system. The Ares I will carry crew and cargo to not only the International Space Station, but onward for the future exploration of the Moon and Mars. The Ares I control system and structural designs use complex computer models for their development. An Integrated Vehicle Ground Vibration Test (IVGVT) will validate the efficacy of these computer models. The IVGVT will reduce the technical risk of unexpected conditions that could place the vehicle or crew in jeopardy. The Ares Project Office's Flight and Integrated Test Office commissioned a study to determine how historical programs, such as Saturn and Space Shuttle, validated the structural dynamics of an integrated flight vehicle. The study methodology was to examine the historical record and seek out members of the engineering community who recall the development of historic manned launch vehicles. These records and interviews provided insight into the best practices and lessons learned from these historic development programs. The information that was gathered allowed the creation of timelines of the historic development programs. The timelines trace the programs from the development of test articles through test preparation, test operations, and test data reduction efforts. These timelines also demonstrate how the historical tests fit within their overall vehicle development programs. Finally, the study was able to quantify approximate staffing levels during historic development programs. Using this study, the Flight and Integrated Test Office was able to evaluate the Ares I Integrated Vehicle Ground Vibration Test schedule and workforce budgets in light of the historical precedents to determine if the test had schedule or cost risks associated with it.

  19. Development and application of hybrid structure based method for efficient screening of ligands binding to G-protein coupled receptors

    NASA Astrophysics Data System (ADS)

    Kortagere, Sandhya; Welsh, William J.

    2006-12-01

    G-protein coupled receptors (GPCRs) comprise a large superfamily of proteins that are targets for nearly 50% of drugs in clinical use today. In the past, the use of structure-based drug design strategies to develop better drug candidates has been severely hampered due to the absence of the receptor's three-dimensional structure. However, with recent advances in molecular modeling techniques and better computing power, atomic level details of these receptors can be derived from computationally derived molecular models. Using information from these models coupled with experimental evidence, it has become feasible to build receptor pharmacophores. In this study, we demonstrate that the Hybrid Structure Based (HSB) method can be used effectively to screen and identify prospective ligands that bind to GPCRs. Essentially, this multi-step method combines ligand-based methods for building enriched libraries of small molecules and structure-based methods for screening molecules against the GPCR target. The HSB method was validated to identify retinal and its analogues from a random dataset of ˜300,000 molecules. The results from this study showed that the 9 top-ranking molecules are indeed analogues of retinal. The method was also tested to identify analogues of dopamine binding to the dopamine D2 receptor. Six of the ten top-ranking molecules are known analogues of dopamine including a prodrug, while the other thirty-four molecules are currently being tested for their activity against all dopamine receptors. The results from both these test cases have proved that the HSB method provides a realistic solution to bridge the gap between the ever-increasing demand for new drugs to treat psychiatric disorders and the lack of efficient screening methods for GPCRs.

  20. Boeing Smart Rotor Full-scale Wind Tunnel Test Data Report

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Hagerty, Brandon; Salazar, Denise

    2016-01-01

    A full-scale helicopter smart material actuated rotor technology (SMART) rotor test was conducted in the USAF National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel at NASA Ames. The SMART rotor system is a five-bladed MD 902 bearingless rotor with active trailing-edge flaps. The flaps are actuated using piezoelectric actuators. Rotor performance, structural loads, and acoustic data were obtained over a wide range of rotor shaft angles of attack, thrust, and airspeeds. The primary test objective was to acquire unique validation data for the high-performance computing analyses developed under the Defense Advanced Research Project Agency (DARPA) Helicopter Quieting Program (HQP). Other research objectives included quantifying the ability of the on-blade flaps to achieve vibration reduction, rotor smoothing, and performance improvements. This data set of rotor performance and structural loads can be used for analytical and experimental comparison studies with other full-scale rotor systems and for analytical validation of computer simulation models. The purpose of this final data report is to document a comprehensive, high-quality data set that includes only data points where the flap was actively controlled and each of the five flaps behaved in a similar manner.

  1. Sizing Single Cantilever Beam Specimens for Characterizing Facesheet/Core Peel Debonding in Sandwich Structure

    NASA Technical Reports Server (NTRS)

    Ratcliffe, James G.

    2010-01-01

    This paper details part of an effort focused on the development of a standardized facesheet/core peel debonding test procedure. The purpose of the test is to characterize facesheet/core peel in sandwich structure, accomplished through the measurement of the critical strain energy release rate associated with the debonding process. The specific test method selected for the standardized test procedure utilizes a single cantilever beam (SCB) specimen configuration. The objective of the current work is to develop a method for establishing SCB specimen dimensions. This is achieved by imposing specific limitations on specimen dimensions, with the objectives of promoting a linear elastic specimen response, and simplifying the data reduction method required for computing the critical strain energy release rate associated with debonding. The sizing method is also designed to be suitable for incorporation into a standardized test protocol. Preliminary application of the resulting sizing method yields practical specimen dimensions.
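
    A common data-reduction approach for beam-type fracture specimens is compliance calibration, G = P^2/(2b) * dC/da. The sketch below is illustrative only; the standardized SCB data-reduction method is the subject of the report, and the numbers here are synthetic.

```python
import numpy as np

def critical_G(P_crit, b, a, C, deg=3):
    """Critical strain energy release rate by compliance calibration:
    G = P^2 / (2 b) * dC/da, with dC/da obtained from a polynomial fit
    of measured compliance C against debond length a, evaluated at the
    final (critical) debond length."""
    dCda = np.polyder(np.polyfit(a, C, deg))
    return P_crit ** 2 / (2.0 * b) * np.polyval(dCda, a[-1])

# Synthetic check: C = C0 + c*a^3 gives dC/da = 3*c*a^2 exactly.
a = np.linspace(20.0, 50.0, 7)       # debond lengths, mm
C = 1.0e-6 + 2.0e-9 * a ** 3         # compliance, mm/N
G = critical_G(100.0, 25.0, a, C)    # P = 100 N, specimen width b = 25 mm
```

    Keeping the specimen response linear elastic, as the sizing method above requires, is precisely what makes such a simple compliance-based reduction valid.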

  2. Experimental and finite element investigation of the buckling characteristics of a beaded skin panel for a hypersonic aircraft. Ph.D. Thesis. Final Report

    NASA Technical Reports Server (NTRS)

    Siegel, W. H.

    1978-01-01

    As part of NASA's continuing research into hypersonics, an 85 square foot hypersonic wing test section of a proposed hypersonic research airplane was laboratory tested. The project reported on in this paper has carried the hypersonic wing test structure project one step further by testing a single beaded panel to failure. The primary interest was focused upon the buckling characteristics of the panel under pure compression with boundary conditions similar to those found in a wing mounted condition. Three primary phases of analysis are included in the report: experimental testing of the beaded panel to failure; finite element structural analysis of the beaded panel with the computer program NASTRAN; and a summary of the semiclassical buckling equations for the beaded panel under purely compressive loads. Comparisons between each of the analysis methods are also included.

  3. Fitting Multimeric Protein Complexes into Electron Microscopy Maps Using 3D Zernike Descriptors

    PubMed Central

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2012-01-01

    A novel computational method for fitting high-resolution structures of multiple proteins into a cryoelectron microscopy map is presented. The method named EMLZerD generates a pool of candidate multiple protein docking conformations of component proteins, which are later compared with a provided electron microscopy (EM) density map to select the ones that fit well into the EM map. The comparison of docking conformations and the EM map is performed using the 3D Zernike descriptor (3DZD), a mathematical series expansion of three-dimensional functions. The 3DZD provides a unified representation of the surface shape of multimeric protein complex models and EM maps, which allows a convenient, fast quantitative comparison of the three dimensional structural data. Out of 19 multimeric complexes tested, near native complex structures with a root mean square deviation of less than 2.5 Å were obtained for 14 cases while medium range resolution structures with correct topology were computed for the additional 5 cases. PMID:22417139

  4. Fitting multimeric protein complexes into electron microscopy maps using 3D Zernike descriptors.

    PubMed

    Esquivel-Rodríguez, Juan; Kihara, Daisuke

    2012-06-14

    A novel computational method for fitting high-resolution structures of multiple proteins into a cryoelectron microscopy map is presented. The method named EMLZerD generates a pool of candidate multiple protein docking conformations of component proteins, which are later compared with a provided electron microscopy (EM) density map to select the ones that fit well into the EM map. The comparison of docking conformations and the EM map is performed using the 3D Zernike descriptor (3DZD), a mathematical series expansion of three-dimensional functions. The 3DZD provides a unified representation of the surface shape of multimeric protein complex models and EM maps, which allows a convenient, fast quantitative comparison of the three-dimensional structural data. Out of 19 multimeric complexes tested, near native complex structures with a root-mean-square deviation of less than 2.5 Å were obtained for 14 cases while medium range resolution structures with correct topology were computed for the additional 5 cases.

  5. Load monitoring of aerospace structures utilizing micro-electro-mechanical systems for static and quasi-static loading conditions

    NASA Astrophysics Data System (ADS)

    Martinez, M.; Rocha, B.; Li, M.; Shi, G.; Beltempo, A.; Rutledge, R.; Yanishevsky, M.

    2012-11-01

    The National Research Council Canada (NRC) has worked on the development of structural health monitoring (SHM) test platforms for assessing the performance of sensor systems for load monitoring applications. The first SHM platform consists of a 5.5 m cantilever aluminum beam that provides an optimal scenario for evaluating the ability of a load monitoring system to measure bending, torsion and shear loads. The second SHM platform contains an added level of structural complexity, by consisting of aluminum skins with bonded/riveted stringers, typical of an aircraft lower wing structure. These two load monitoring platforms are well characterized and documented, providing loading conditions similar to those encountered during service. In this study, a micro-electro-mechanical system (MEMS) for acquiring data from triads of gyroscopes, accelerometers and magnetometers is described. The system was used to compute changes in angles at discrete stations along the platforms. The angles obtained from the MEMS were used to compute a second, third or fourth order degree polynomial surface from which displacements at every point could be computed. The use of a new Kalman filter was evaluated for angle estimation, from which displacements in the structure were computed. The outputs of the newly developed algorithms were then compared to the displacements obtained from the linear variable displacement transducers connected to the platforms. The displacement curves were subsequently post-processed either analytically, or with the help of a finite element model of the structure, to estimate strains and loads. The estimated strains were compared with baseline strain gauge instrumentation installed on the platforms. This new approach for load monitoring was able to provide accurate estimates of applied strains and shear loads.
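
    The angle-to-displacement step described above can be sketched in a simplified 2D form: fit a polynomial to the slopes measured at discrete stations, then integrate to obtain the deflection curve. The numbers are hypothetical, and the platforms used full gyro/accelerometer/magnetometer triads and surface (not line) fits.

```python
import numpy as np

def deflection_from_angles(x, theta, w0=0.0, deg=3):
    """Reconstruct a beam deflection curve from slope angles measured
    at discrete stations: fit a polynomial to tan(theta) = dw/dx and
    integrate it analytically, fixing w(0) = w0 at the root."""
    slope = np.polyfit(x, np.tan(theta), deg)
    w = np.polyint(slope, k=w0)   # k is the integration constant w(0)
    return np.polyval(w, x)

# Synthetic cantilever: w(x) = c*x^2, so theta(x) = arctan(2*c*x).
c = 1.0e-4
x = np.linspace(0.0, 5.0, 6)      # station positions along the beam, m
theta = np.arctan(2.0 * c * x)    # "measured" slope angles, rad
w = deflection_from_angles(x, theta)
```

    The resulting displacement curve can then be differentiated twice, analytically or via a finite element model, to estimate strains and loads, as described above.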

  6. DL_MG: A Parallel Multigrid Poisson and Poisson-Boltzmann Solver for Electronic Structure Calculations in Vacuum and Solution.

    PubMed

    Womack, James C; Anton, Lucian; Dziedzic, Jacek; Hasnip, Phil J; Probert, Matt I J; Skylaris, Chris-Kriton

    2018-03-13

    The solution of the Poisson equation is a crucial step in electronic structure calculations, yielding the electrostatic potential, a key component of the quantum mechanical Hamiltonian. In recent decades, theoretical advances and increases in computer performance have made it possible to simulate the electronic structure of extended systems in complex environments. This requires the solution of more complicated variants of the Poisson equation, featuring nonhomogeneous dielectric permittivities, ionic concentrations with nonlinear dependencies, and diverse boundary conditions. The analytic solutions generally used to solve the Poisson equation in vacuum (or with homogeneous permittivity) are not applicable in these circumstances, and numerical methods must be used. In this work, we present DL_MG, a flexible, scalable, and accurate solver library, developed specifically to tackle the challenges of solving the Poisson equation in modern large-scale electronic structure calculations on parallel computers. Our solver is based on the multigrid approach and uses an iterative high-order defect correction method to improve the accuracy of solutions. Using two chemically relevant model systems, we tested the accuracy and computational performance of DL_MG when solving the generalized Poisson and Poisson-Boltzmann equations, demonstrating excellent agreement with analytic solutions and efficient scaling to ~10^9 unknowns and hundreds of CPU cores. We also applied DL_MG in actual large-scale electronic structure calculations, using the ONETEP linear-scaling electronic structure package to study a 2615 atom protein-ligand complex with routinely available computational resources. In these calculations, the overall execution time with DL_MG was not significantly greater than the time required for calculations using a conventional FFT-based solver.
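
    The core multigrid idea (smooth on the fine grid, solve the residual equation on a coarser grid, correct) can be illustrated on the 1D Poisson problem. This is a minimal two-grid sketch under simplifying assumptions, nothing like DL_MG's high-order, parallel implementation:

```python
import numpy as np

def two_grid_poisson(f, n_cycles=50):
    """Two-grid solve of -u'' = f on (0,1) with u(0) = u(1) = 0:
    damped-Jacobi smoothing, full-weighting restriction of the
    residual, an exact coarse solve, and linear prolongation."""
    n = f.size               # interior points; n odd so the grids nest
    h = 1.0 / (n + 1)
    u = np.zeros(n)

    def residual(u):
        Au = (2.0 * u - np.r_[u[1:], 0.0] - np.r_[0.0, u[:-1]]) / h ** 2
        return f - Au

    for _ in range(n_cycles):
        u += (2.0 / 3.0) * (h ** 2 / 2.0) * residual(u)  # damped Jacobi
        r = residual(u)
        rc = 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])  # restrict
        nc, hc = rc.size, 2.0 * h
        Ac = (2.0 * np.eye(nc) - np.eye(nc, k=1)
              - np.eye(nc, k=-1)) / hc ** 2
        ec = np.linalg.solve(Ac, rc)                     # coarse solve
        e = np.zeros(n)
        e[1::2] = ec                                     # prolongation
        e[0::2] = 0.5 * (np.r_[0.0, ec] + np.r_[ec, 0.0])
        u += e
    return u

n = 31
x = np.linspace(0.0, 1.0, n + 2)[1:-1]
u = two_grid_poisson(np.pi ** 2 * np.sin(np.pi * x))  # exact u = sin(pi x)
```

    A full multigrid solver applies this correction recursively instead of solving the coarse problem directly, and a defect-correction loop, as in DL_MG, wraps such a solve to recover high-order accuracy from a low-order operator.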

  7. Computer assisted Objective structured clinical examination versus Objective structured clinical examination in assessment of Dermatology undergraduate students.

    PubMed

    Chaudhary, Richa; Grover, Chander; Bhattacharya, S N; Sharma, Arun

    2017-01-01

    The assessment of dermatology undergraduates has been done through computer assisted objective structured clinical examination at our institution for the last 4 years. We attempted to compare objective structured clinical examination (OSCE) and computer assisted objective structured clinical examination (CA-OSCE) as assessment tools. To assess the relative effectiveness of CA-OSCE and OSCE as assessment tools for undergraduate dermatology trainees. Students underwent CA-OSCE as well as OSCE-based evaluation of equal weightage as an end of posting assessment. The attendance as well as the marks in both the examination formats were meticulously recorded and statistically analyzed using SPSS version 20.0. Intercooled Stata V9.0 was used to assess the reliability and internal consistency of the examinations conducted. Feedback from both students and examiners was also recorded. The mean attendance for the study group was 77% ± 12.0%. The average score on CA-OSCE and OSCE was 47.4% ± 19.8% and 53.5% ± 18%, respectively. These scores showed a mutually positive correlation, with Spearman's coefficient being 0.593. Spearman's rank correlation coefficient between attendance scores and assessment score was 0.485 for OSCE and 0.451 for CA-OSCE. The Cronbach's alpha coefficient for all the tests ranged from 0.76 to 0.87, indicating high reliability. The comparison was based on a single batch of 139 students. Such an evaluation on more students in larger number of batches over successive years could help throw more light on the subject. Computer assisted objective structured clinical examination was found to be a valid, reliable and effective format for dermatology assessment, being rated as the preferred format by examiners.
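
    Cronbach's alpha, the reliability coefficient reported above, has a simple closed form. A minimal sketch with made-up scores (illustrative only, not the Stata computation used in the study):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent items (identical columns) give alpha = 1.
col = np.array([3.0, 5.0, 4.0, 2.0, 5.0])
alpha = cronbach_alpha(np.column_stack([col, col, col]))
```

    Values in the 0.76 to 0.87 range, as reported for these examinations, are conventionally read as good internal consistency.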

  8. Parallel Finite Element Domain Decomposition for Structural/Acoustic Analysis

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Tungkahotara, Siroj; Watson, Willie R.; Rajan, Subramaniam D.

    2005-01-01

    A domain decomposition (DD) formulation for solving sparse linear systems of equations resulting from finite element analysis is presented. The formulation incorporates mixed direct and iterative equation-solving strategies and other novel algorithmic ideas that are optimized to take advantage of sparsity and to exploit modern computer architecture, such as memory hierarchies and parallel computing. The most time-consuming part of the formulation is identified, and the critical roles of direct sparse and iterative solvers within the framework of the formulation are discussed. Experiments on several computer platforms using several complex test matrices are conducted using software based on the formulation. Small-scale structural examples are used to validate the steps in the formulation, and large-scale (1,000,000+ unknowns) duct acoustic examples are used to evaluate performance on ORIGIN 2000 processors and a cluster of 6 PCs (running under the Windows environment). Statistics show that the formulation is efficient in both sequential and parallel computing environments, and that it is significantly faster and consumes less memory than one based on one of the best available commercial parallel sparse solvers.

  9. Toward structure prediction of cyclic peptides.

    PubMed

    Yu, Hongtao; Lin, Yu-Shan

    2015-02-14

    Cyclic peptides are a promising class of molecules that can be used to target specific protein-protein interactions. A computational method to accurately predict their structures would substantially advance the development of cyclic peptides as modulators of protein-protein interactions. Here, we develop a computational method that integrates bias-exchange metadynamics simulations, a Boltzmann reweighting scheme, dihedral principal component analysis and a modified density peak-based cluster analysis to provide a converged structural description for cyclic peptides. Using this method, we evaluate the performance of a number of popular protein force fields on a model cyclic peptide. All the tested force fields seem to over-stabilize the α-helix and PPII/β regions in the Ramachandran plot, commonly populated by linear peptides and proteins. Our findings suggest that re-parameterization of a force field that well describes the full Ramachandran plot is necessary to accurately model cyclic peptides.

  10. Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.

    PubMed

    Bauer, Markus; Klau, Gunnar W; Reinert, Knut

    2007-07-27

    The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested: This is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.

  11. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.

  12. A curve fitting method for solving the flutter equation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Cooper, J. L.

    1972-01-01

    A curve fitting approach was developed to solve the flutter equation for the critical flutter velocity. The psi versus nu curves are approximated by cubic and quadratic equations. The curve fitting technique utilized the first and second derivatives of psi with respect to nu. The method was tested for two structures, one structure being six times the total mass of the other structure. The algorithm never showed any tendency to diverge from the solution. The average time for the computation of a flutter velocity was 3.91 seconds on an IBM Model 50 computer for an accuracy of five per cent. For values of nu close to the critical root of the flutter equation the algorithm converged on the first attempt. The maximum number of iterations for convergence to the critical flutter velocity was five with an assumed value of nu relatively distant from the actual crossover.
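    The iteration described in the abstract, fitting a low-order polynomial in nu using the first and second derivatives of psi and stepping to its root, can be illustrated as follows. This is a simplified quadratic-fit variant with numerical derivatives, not the thesis algorithm; `psi` stands for any damping-versus-velocity function:

```python
def flutter_crossing(psi, nu0, h=1e-4, tol=1e-8, max_iter=50):
    # Fit a local quadratic psi(nu + s) ~ f + d1*s + (d2/2)*s^2 using
    # numerical first and second derivatives, step to its nearest root,
    # and repeat until psi ~ 0 (the flutter crossing).
    nu = nu0
    for _ in range(max_iter):
        f = psi(nu)
        if abs(f) < tol:
            return nu
        d1 = (psi(nu + h) - psi(nu - h)) / (2 * h)
        d2 = (psi(nu + h) - 2 * f + psi(nu - h)) / h**2
        disc = d1 * d1 - 2 * f * d2
        if d2 != 0 and disc >= 0:
            r1 = (-d1 + disc ** 0.5) / d2
            r2 = (-d1 - disc ** 0.5) / d2
            step = r1 if abs(r1) < abs(r2) else r2  # smaller step of the two roots
        else:
            step = -f / d1  # fall back to a Newton step
        nu += step
    return nu
```

For a quadratic psi the local fit is exact, so the crossing is found essentially in one step; for general curves the scheme converges from starting points reasonably close to the crossing, consistent with the behaviour reported above.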

  13. A three-dimensional structured/unstructured hybrid Navier-Stokes method for turbine blade rows

    NASA Technical Reports Server (NTRS)

    Tsung, F.-L.; Loellbach, J.; Kwon, O.; Hah, C.

    1994-01-01

    A three-dimensional viscous structured/unstructured hybrid scheme has been developed for numerical computation of high Reynolds number turbomachinery flows. The procedure allows an efficient structured solver to be employed in the densely clustered, high aspect-ratio grid around the viscous regions near solid surfaces, while employing an unstructured solver elsewhere in the flow domain to add flexibility in mesh generation. Test results for an inviscid flow over an external transonic wing and a Navier-Stokes flow for an internal annular cascade are presented.

  14. A Big Data Approach to Analyzing Market Volatility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Bethel, E. Wes; Gu, Ming

    2013-06-05

    Understanding the microstructure of the financial market requires the processing of a vast amount of data related to individual trades, and sometimes even multiple levels of quotes. Analyzing such a large volume of data requires tremendous computing power that is not easily available to financial academics and regulators. Fortunately, publicly funded High Performance Computing (HPC) power is widely available at the National Laboratories in the US. In this paper we demonstrate that HPC resources and the techniques of data-intensive science can be used to greatly accelerate the computation of an early warning indicator called Volume-synchronized Probability of Informed Trading (VPIN). The test data used in this study contains five and a half years' worth of trading data for about 100 of the most liquid futures contracts, includes about 3 billion trades, and takes 140 GB as text files. By using (1) a more efficient file format for storing the trading records, (2) more effective data structures and algorithms, and (3) parallelizing the computations, we are able to explore 16,000 different ways of computing VPIN in less than 20 hours on a 32-core IBM DataPlex machine. Our test demonstrates that a modest computer is sufficient to monitor a vast number of trading activities in real time, an ability that could be valuable to regulators. Our test results also confirm that VPIN is a strong predictor of liquidity-induced volatility. With appropriate parameter choices, the false positive rates are about 7%, averaged over all the futures contracts in the test data set. More specifically, when VPIN values rise above a threshold (CDF > 0.99), the volatility in the subsequent time windows is higher than the average in 93% of the cases.
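    The VPIN indicator itself is simple once trades are grouped into equal-volume buckets. A minimal sketch, assuming tick-rule classification of buy/sell volume (the paper explores many classification schemes and parameter settings; this shows only one basic variant):

```python
def vpin(trades, bucket_volume, window):
    # trades: list of (price, volume) in time order.
    # Tick rule: price uptick -> buy volume, downtick -> sell volume.
    # VPIN = mean of |buy - sell| / bucket_volume over the last `window`
    # equal-volume buckets.
    buckets = []
    buy = sell = filled = 0.0
    last_price = trades[0][0]
    sign = 1
    for price, vol in trades:
        if price > last_price:
            sign = 1
        elif price < last_price:
            sign = -1
        last_price = price
        while vol > 0:
            take = min(vol, bucket_volume - filled)
            if sign > 0:
                buy += take
            else:
                sell += take
            filled += take
            vol -= take
            if filled >= bucket_volume:
                buckets.append(abs(buy - sell) / bucket_volume)
                buy = sell = filled = 0.0
    recent = buckets[-window:]
    return sum(recent) / len(recent) if recent else None
```

A one-sided market (all upticks) yields maximal order-flow imbalance (VPIN = 1), while alternating buckets of pure buys and pure sells average out, which matches the indicator's intent as an imbalance measure.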

  15. Preliminary Computational Analysis of the (HIRENASD) Configuration in Preparation for the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Florance, Jennifer P.; Heeg, Jennifer; Wieseman, Carol D.; Perry, Boyd P.

    2011-01-01

    This paper presents preliminary computational aeroelastic analysis results generated in preparation for the first Aeroelastic Prediction Workshop (AePW). These results were produced using FUN3D software developed at NASA Langley and are compared against the experimental data generated during the HIgh REynolds Number Aero-Structural Dynamics (HIRENASD) Project. The HIRENASD wind-tunnel model was tested in the European Transonic Windtunnel in 2006 by Aachen University's Department of Mechanics with funding from the German Research Foundation. The computational effort discussed here was performed (1) to obtain a preliminary assessment of the ability of the FUN3D code to accurately compute physical quantities experimentally measured on the HIRENASD model and (2) to translate the lessons learned from the FUN3D analysis of HIRENASD into a set of initial guidelines for the first AePW, which includes test cases for the HIRENASD model and its experimental data set. This paper compares the computational and experimental results obtained at Mach 0.8 for a Reynolds number of 7 million based on chord, corresponding to HIRENASD test conditions No. 132 and No. 159. Aerodynamic loads and static aeroelastic displacements are compared at two levels of grid resolution. Harmonic perturbation numerical results are compared with the experimental data using the magnitude and phase relationship between pressure coefficients and displacement. A dynamic aeroelastic numerical calculation is presented at one wind-tunnel condition in the form of the time history of the generalized displacements. Additional FUN3D validation results are also presented for the AGARD 445.6 wing data set. This wing was tested in the Transonic Dynamics Tunnel and is commonly used in the preliminary benchmarking of computational aeroelastic software.

  16. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphic techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code, evaluating the RCS of arbitrarily shaped metallic objects that are computer aided design (CAD) generated, and its validation with measurements carried out using ALENIA RCS test facilities are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  17. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. 
The results suggest that classroom structures that incorporate an open laboratory setting are just as effective on student achievement and attitudes as classroom structures that incorporate a closed laboratory setting. The results also suggest that math background is a strong predictor of student achievement in CS 1.

  18. From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models

    PubMed Central

    Zhu, Hao

    2017-01-01

    Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach for exploring the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols have been pursued by computational toxicologists, building on the rapidly increasing toxicity testing data of recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the fields of computational drug discovery and environmental chemical risk assessment. PMID:23086837

  19. Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance

    NASA Technical Reports Server (NTRS)

    Ricci, Stefano; Peeters, Bart; Fetter, Rebecca; Boland, Doug; Debille, Jan

    2008-01-01

    In the field of vibration testing, the interaction between the structure being tested and the instrumentation hardware used to perform the test is a critical issue. This is particularly true when testing massive structures (e.g. satellites), because due to physical design and manufacturing limits, the dynamics of the testing facility often couples with that of the test specimen in the frequency range of interest. A further issue in this field is the standard use of a closed-loop real-time vibration control scheme, which could potentially shift poles and change damping of the aforementioned coupled system. Virtual shaker testing is a novel approach to deal with these issues. It means performing a simulation which closely represents the real vibration test on the specific facility by taking into account all parameters which might impact the dynamic behavior of the specimen. In this paper, such a virtual shaker testing approach is developed. It consists of the following components: (1) Either a physical-based or an equation-based coupled electro-mechanical lumped parameter shaker model is created. The model parameters are obtained from manufacturer's specifications or by carrying out dedicated experiments; (2) Existing real-time vibration control algorithms are ported to the virtual simulation environment; and (3) A structural model of the test object is created and, after defining proper interface conditions, structural modes are computed by means of the well-established Craig-Bampton CMS technique. At this stage, a virtual shaker test has been run by coupling the three described models (shaker, control loop, structure) in a co-simulation routine. Numerical results have eventually been correlated with experimental ones in order to assess the robustness of the proposed methodology.

  20. Sequential Test Strategies for Multiple Fault Isolation

    NASA Technical Reports Server (NTRS)

    Shakeri, M.; Pattipati, Krishna R.; Raghavan, V.; Patterson-Hine, Ann; Kell, T.

    1997-01-01

    In this paper, we consider the problem of constructing near optimal test sequencing algorithms for diagnosing multiple faults in redundant (fault-tolerant) systems. The computational complexity of solving the optimal multiple-fault isolation problem is super-exponential, that is, it is much more difficult than the single-fault isolation problem, which, by itself, is NP-hard. By employing concepts from information theory and Lagrangian relaxation, we present several static and dynamic (on-line or interactive) test sequencing algorithms for the multiple fault isolation problem that provide a trade-off between the degree of suboptimality and computational complexity. Furthermore, we present novel diagnostic strategies that generate a static diagnostic directed graph (digraph), instead of a static diagnostic tree, for multiple fault diagnosis. Using this approach, the storage complexity of the overall diagnostic strategy reduces substantially. Computational results based on real-world systems indicate that the size of a static multiple fault strategy is strictly related to the structure of the system, and that the use of an on-line multiple fault strategy can diagnose faults in systems with as many as 10,000 failure sources.
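    The information-theoretic heuristic behind such test sequencing is easiest to see in the single-fault case, which the paper's multiple-fault algorithms generalize. A sketch with a hypothetical fault-test dependency matrix (an entry of 1 means the fault causes the test to fail); the greedy rule picks the test whose outcome splits the remaining candidates most evenly:

```python
from math import log2

def next_test(candidates, dmatrix, tests):
    # One-step information heuristic: maximize the entropy of the
    # pass/fail split over the remaining candidate faults (uniform prior).
    def gain(t):
        fail = sum(1 for f in candidates if dmatrix[f][t])
        p = fail / len(candidates)
        if p in (0.0, 1.0):
            return 0.0
        return -(p * log2(p) + (1 - p) * log2(1 - p))
    return max(tests, key=gain)

def isolate(fault, dmatrix, tests):
    # Apply tests sequentially, discarding faults inconsistent with each
    # (simulated) outcome, until one candidate remains.
    candidates = set(dmatrix)
    seq = []
    while len(candidates) > 1:
        t = next_test(candidates, dmatrix, tests)
        outcome = dmatrix[fault][t]
        candidates = {f for f in candidates if dmatrix[f][t] == outcome}
        seq.append(t)
    return candidates.pop(), seq
```

With a complete, distinguishing dependency matrix this isolates any single fault in roughly log2(number of faults) tests; the multiple-fault setting replaces the candidate set with sets of fault combinations, which is what drives the super-exponential complexity noted above.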

  1. Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program

    NASA Technical Reports Server (NTRS)

    Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.

    2010-01-01

    The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and their effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that such aeroelastic data sets often focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include omission of relevant data, such as flutter frequency, and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. 
Researchers within the Aeroelasticity Branch will examine other experimental efforts within the Subsonic Fixed Wing (SFW) program (such as testing of the NASA Common Research Model (CRM)) and other NASA programs and assess aeroelasticity issues and research topics.

  2. Prediction of beta-turns in proteins using the first-order Markov models.

    PubMed

    Lin, Thy-Hou; Wang, Ging-Ming; Wang, Yen-Tseng

    2002-01-01

    We present a method based on the first-order Markov models for predicting simple beta-turns and loops containing multiple turns in proteins. Sequences of 338 proteins in a database are divided using the published turn criteria into the following three regions, namely, the turn, the boundary, and the nonturn ones. A transition probability matrix is constructed for either the turn or the nonturn region using the weighted transition probabilities computed for dipeptides identified from each region. There are two such matrices constructed for the boundary region since the transition probabilities for dipeptides immediately preceding or following a turn are different. The window used for scanning a protein sequence from amino (N-) to carboxyl (C-) terminal is a hexapeptide since the transition probability computed for a turn tetrapeptide is capped at both the N- and C- termini with a boundary transition probability indexed respectively from the two boundary transition matrices. A sum of the averaged product of the transition probabilities of all the hexapeptides involving each residue is computed. This is then weighted with a probability computed from assuming that all the hexapeptides are from the nonturn region to give the final prediction quantity. Both simple beta-turns and loops containing multiple turns in a protein are then identified by the rising of the prediction quantity computed. The performance of the prediction scheme or the percentage (%) of correct prediction is evaluated through computation of Matthews correlation coefficients for each protein predicted. It is found that the prediction method is capable of giving prediction results with better correlation between the percent of correct prediction and the Matthews correlation coefficients for a group of test proteins as compared with those predicted using some secondary structural prediction methods. 
The prediction accuracy for about 40% of proteins in the database, or 50% of proteins in the test set, is better than 70%. Such a percentage for the test set is reduced to 30% if the structures of all the proteins in the set are treated as unknown.
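    The core of such a first-order Markov scan is the product of dipeptide transition probabilities under a turn model versus a non-turn model. A simplified sketch using a plain tetrapeptide window (the paper's scheme caps the turn tetrapeptide inside a hexapeptide with two boundary matrices; the probability tables here are toy values, not the published ones):

```python
def window_score(seq, turn_p, nonturn_p):
    # Slide a tetrapeptide window over the sequence; each window is scored
    # by the product of its dipeptide transition probabilities under the
    # turn model, normalized against the non-turn model.
    # turn_p / nonturn_p: dict mapping (aa1, aa2) -> transition probability.
    scores = []
    w = 4
    for i in range(len(seq) - w + 1):
        t = nt = 1.0
        for j in range(i, i + w - 1):
            pair = (seq[j], seq[j + 1])
            t *= turn_p.get(pair, 1e-6)      # small floor for unseen dipeptides
            nt *= nonturn_p.get(pair, 1e-6)
        scores.append(t / (t + nt))
    return scores
```

Positions where the normalized score rises well above 0.5 are the predicted turn regions, mirroring the "rising of the prediction quantity" criterion described above.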

  3. Functional connectivity dynamically evolves on multiple time-scales over a static structural connectome: Models and mechanisms.

    PubMed

    Cabral, Joana; Kringelbach, Morten L; Deco, Gustavo

    2017-10-15

    Over the last decade, we have observed a revolution in brain structural and functional connectomics. On one hand, we have an ever-more detailed characterization of the brain's white matter structural connectome. On the other, we have a repertoire of consistent functional networks that form and dissipate over time during rest. Despite the evident spatial similarities between structural and functional connectivity, understanding how different time-evolving functional networks spontaneously emerge from a single structural network requires analyzing the problem from the perspective of complex network dynamics and dynamical systems theory. In that direction, bottom-up computational models are useful tools to test theoretical scenarios and depict the mechanisms at the genesis of resting-state activity. Here, we provide an overview of the different mechanistic scenarios proposed over the last decade via computational models. Importantly, we highlight the need to incorporate additional model constraints, considering the properties observed at finer temporal scales with MEG and the dynamical properties of FC, in order to refresh the list of candidate scenarios.

  4. Computational design of thermostabilizing point mutations for G protein-coupled receptors

    PubMed Central

    Popov, Petr; Peng, Yao; Shen, Ling; Stevens, Raymond C; Cherezov, Vadim; Liu, Zhi-Jie

    2018-01-01

    Engineering of GPCR constructs with improved thermostability is key for successful structural and biochemical studies of this transmembrane protein family, targeted by 40% of all therapeutic drugs. Here we introduce a comprehensive computational approach to effective prediction of stabilizing mutations in GPCRs, named CompoMug, which employs sequence-based analysis, structural information, and a derived machine learning predictor. Tested experimentally on the serotonin 5-HT2C receptor target, CompoMug predictions resulted in 10 new stabilizing mutations, with an apparent thermostability gain of ~8.8°C for the best single mutation and ~13°C for a triple mutant. Binding of antagonists confers further stabilization for the triple mutant receptor, with total gains of ~21°C as compared to wild type apo 5-HT2C. The predicted mutations enabled crystallization and structure determination for the 5-HT2C receptor complexes in inactive and active-like states. While CompoMug already shows a high 25% hit rate and utility in GPCR structural studies, further improvements are expected with the accumulation of structural and mutation data. PMID:29927385

  5. Joint nonlinearity effects in the design of a flexible truss structure control system

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1986-01-01

    Nonlinear effects are introduced in the dynamics of large space truss structures by the connecting joints which are designed with rather important tolerances to facilitate the assembly of the structures in space. The purpose was to develop means to investigate the nonlinear dynamics of the structures, particularly the limit cycles that might occur when active control is applied to the structures. An analytical method was sought and derived to predict the occurrence of limit cycles and to determine their stability. This method is mainly based on the quasi-linearization of every joint using describing functions. This approach was proven successful when simple dynamical systems were tested. Its applicability to larger systems depends on the amount of computations it requires, and estimates of the computational task tend to indicate that the number of individual sources of nonlinearity should be limited. Alternate analytical approaches, which do not account for every single nonlinearity, or the simulation of a simplified model of the dynamical system should, therefore, be investigated to determine a more effective way to predict limit cycles in large dynamical systems with an important number of distributed nonlinearities.
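    The describing-function quasi-linearization mentioned above replaces each odd static nonlinearity with an amplitude-dependent equivalent gain: the first Fourier coefficient of its response to a sinusoid, divided by the input amplitude. A minimal numerical sketch (for a sinusoidal-input describing function of a memoryless odd nonlinearity; joint nonlinearities with hysteresis would also need an out-of-phase term):

```python
from math import pi, sin

def describing_function(nonlinearity, amplitude, n=1000):
    # Equivalent gain N(A): first in-phase Fourier coefficient of the
    # nonlinearity's response to A*sin(wt), computed by discrete quadrature
    # over one period, divided by the amplitude A.
    acc = 0.0
    for k in range(n):
        t = 2 * pi * k / n
        acc += nonlinearity(amplitude * sin(t)) * sin(t)
    return (2.0 / n) * acc / amplitude
```

A linear element returns gain 1 at any amplitude, while a saturation driven far beyond its limit returns a much smaller equivalent gain; predicted limit cycles correspond to amplitudes where the loop gain with N(A) satisfies the oscillation condition.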

  6. A LEAST ABSOLUTE SHRINKAGE AND SELECTION OPERATOR (LASSO) FOR NONLINEAR SYSTEM IDENTIFICATION

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.; Lofberg, Johan; Brenner, Martin J.

    2006-01-01

    Identification of parametric nonlinear models involves estimating unknown parameters and detecting its underlying structure. Structure computation is concerned with selecting a subset of parameters to give a parsimonious description of the system which may afford greater insight into the functionality of the system or a simpler controller design. In this study, a least absolute shrinkage and selection operator (LASSO) technique is investigated for computing efficient model descriptions of nonlinear systems. The LASSO minimises the residual sum of squares by the addition of an ℓ1 penalty term on the parameter vector of the traditional ℓ2 minimisation problem. Its use for structure detection is a natural extension of this constrained minimisation approach to pseudolinear regression problems which produces some model parameters that are exactly zero and, therefore, yields a parsimonious system description. The performance of this LASSO structure detection method was evaluated by using it to estimate the structure of a nonlinear polynomial model. Applicability of the method to more complex systems such as those encountered in aerospace applications was shown by identifying a parsimonious system description of the F/A-18 Active Aeroelastic Wing using flight test data.
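    The mechanism by which the ℓ1 penalty produces exact zeros (and hence structure detection) is visible in a small coordinate-descent solver. A minimal pure-Python sketch of the LASSO objective min ½‖y − Xb‖² + λ‖b‖₁ (a generic solver, not the constrained formulation used in the paper):

```python
def soft_threshold(z, g):
    # Shrink z toward zero by g; values within [-g, g] become exactly 0.
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso(X, y, lam, iters=200):
    # Cyclic coordinate descent for min 0.5*||y - X b||^2 + lam*||b||_1.
    # The soft-threshold step is what drives weak coordinates exactly to
    # zero, yielding the parsimonious (sparse) parameter vector.
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Residual excluding coordinate j.
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            b[j] = soft_threshold(rho, lam) / norm if norm else 0.0
    return b
```

On an orthogonal design the solution is exactly the soft-thresholded least-squares estimate, so a regressor whose correlation with the residual falls below λ is eliminated from the model, which is the structure-detection behaviour exploited above.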

  7. Artificial intelligence approach to planning the robotic assembly of large tetrahedral truss structures

    NASA Technical Reports Server (NTRS)

    Homemdemello, Luiz S.

    1992-01-01

    An assembly planner for tetrahedral truss structures is presented. To overcome the difficulties due to the large number of parts, the planner exploits the simplicity and uniformity of the shapes of the parts and the regularity of their interconnection. The planning automation is based on the computational formalism known as production system. The global data base consists of a hexagonal grid representation of the truss structure. This representation captures the regularity of tetrahedral truss structures and their multiple hierarchies. It maps into quadratic grids and can be implemented in a computer by using a two-dimensional array data structure. By maintaining the multiple hierarchies explicitly in the model, the choice of a particular hierarchy is only made when needed, thus allowing a more informed decision. Furthermore, testing the preconditions of the production rules is simple because the patterned way in which the struts are interconnected is incorporated into the topology of the hexagonal grid. A directed graph representation of assembly sequences allows the use of both graph search and backtracking control strategies.
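    The mapping of the hexagonal-grid global data base onto a two-dimensional array can be sketched with an axial-coordinate neighbor function; this is a generic hex-grid idiom, the offsets and names are illustrative rather than taken from the planner:

```python
# Six neighbor offsets of a hexagonal grid in axial coordinates,
# stored in an ordinary 2D (row, column) array.
HEX_NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def neighbors(r, c, grid):
    # grid: list of lists holding per-node state; the hexagonal topology
    # is captured entirely by the six offset pairs above, so precondition
    # tests over interconnected struts reduce to fixed index arithmetic.
    out = []
    for dr, dc in HEX_NEIGHBORS:
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]):
            out.append((nr, nc))
    return out
```

Interior nodes see all six neighbors while boundary nodes see fewer, which is how the regular interconnection pattern becomes cheap to query from a plain array.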

  8. Parallel hyperbolic PDE simulation on clusters: Cell versus GPU

    NASA Astrophysics Data System (ADS)

    Rostrup, Scott; De Sterck, Hans

    2010-12-01

    Increasingly, high-performance computing is looking towards data-parallel computational devices to enhance computational performance. Two technologies that have received significant attention are IBM's Cell Processor and NVIDIA's CUDA programming model for graphics processing unit (GPU) computing. In this paper we investigate the acceleration of parallel hyperbolic partial differential equation simulation on structured grids with explicit time integration on clusters with Cell and GPU backends. The message passing interface (MPI) is used for communication between nodes at the coarsest level of parallelism. Optimizations of the simulation code at the several finer levels of parallelism that the data-parallel devices provide are described in terms of data layout, data flow and data-parallel instructions. Optimized Cell and GPU performance are compared with reference code performance on a single x86 central processing unit (CPU) core in single and double precision. We further compare the CPU, Cell and GPU platforms on a chip-to-chip basis, and compare performance on single cluster nodes with two CPUs, two Cell processors or two GPUs in a shared memory configuration (without MPI). We finally compare performance on clusters with 32 CPUs, 32 Cell processors, and 32 GPUs using MPI. Our GPU cluster results use NVIDIA Tesla GPUs with GT200 architecture, but some preliminary results on recently introduced NVIDIA GPUs with the next-generation Fermi architecture are also included. This paper provides computational scientists and engineers who are considering porting their codes to accelerator environments with insight into how structured grid based explicit algorithms can be optimized for clusters with Cell and GPU accelerators. It also provides insight into the speed-up that may be gained on current and future accelerator architectures for this class of applications. 
Program summary
Program title: SWsolver
Catalogue identifier: AEGY_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGY_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPL v3
No. of lines in distributed program, including test data, etc.: 59 168
No. of bytes in distributed program, including test data, etc.: 453 409
Distribution format: tar.gz
Programming language: C, CUDA
Computer: Parallel computing clusters. Individual compute nodes may consist of x86 CPU, Cell processor, or x86 CPU with attached NVIDIA GPU accelerator.
Operating system: Linux
Has the code been vectorised or parallelized?: Yes. Tested on 1-128 x86 CPU cores, 1-32 Cell processors, and 1-32 NVIDIA GPUs.
RAM: Tested on problems requiring up to 4 GB per compute node.
Classification: 12
External routines: MPI, CUDA, IBM Cell SDK
Nature of problem: MPI-parallel simulation of the shallow water equations using a high-resolution 2D hyperbolic equation solver on regular Cartesian grids for x86 CPU, Cell processor, and NVIDIA GPU using CUDA.
Solution method: SWsolver provides three implementations of a high-resolution 2D shallow water equation solver on regular Cartesian grids, for CPU, Cell processor, and NVIDIA GPU. Each implementation uses MPI to divide work across a parallel computing cluster.
Additional comments: Sub-program numdiff is used for the test run.
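The explicit structured-grid update that SWsolver parallelizes can be sketched compactly. Below is a serial Python illustration only, not the distributed C/CUDA code: one first-order Lax-Friedrichs step of the 1D shallow water equations on a regular grid (the function name and the reduction to 1D are my own simplifications).

```python
import numpy as np

def lax_friedrichs_step(h, hu, dx, dt, g=9.81):
    """One explicit Lax-Friedrichs update of the 1D shallow water equations
    on a regular grid (a toy stand-in for SWsolver's high-resolution 2D scheme)."""
    U = np.stack([h, hu])                               # conserved variables
    F = np.stack([hu, hu**2 / h + 0.5 * g * h**2])      # physical fluxes
    Unew = U.copy()
    # explicit stencil: neighbour average corrected by the flux difference
    Unew[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) \
        - dt / (2.0 * dx) * (F[:, 2:] - F[:, :-2])
    Unew[:, 0], Unew[:, -1] = Unew[:, 1], Unew[:, -2]   # crude outflow boundaries
    return Unew[0], Unew[1]
```

In the real solver each MPI rank owns a sub-block of the grid and exchanges a layer of ghost cells with its neighbours before every such update; on Cell and GPU backends the same stencil is mapped onto the device's finer levels of parallelism.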

  9. Process Integrated Mechanism for Human-Computer Collaboration and Coordination

    DTIC Science & Technology

    2012-09-12

    system we implemented the TAFLib library that provides the communication with TAF. The data received from the TAF server is collected in a data structure...send new commands and flight plans for the UAVs to the TAF server. Test scenarios Several scenarios have been implemented to test and prove our...areas. Shooting Enemies The basic scenario proved the successful integration of PIM and the TAF simulation environment. Subsequently we improved the CP

  10. Health workers’ knowledge of and attitudes towards computer applications in rural African health facilities

    PubMed Central

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E.; Blank, Antje

    2014-01-01

    Background The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. Objective To report an assessment of health providers’ computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. Design A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA were used to describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. Results A total of 108 providers responded; 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p<0.01). Most (95.3%) had positive attitudes towards computers – average score (±SD) of 37.2 (±4.9). Females had significantly lower scores than males. Interviews and group discussions showed that although most lacked computer knowledge and experience, they were optimistic about overcoming challenges associated with the introduction of computers in their workplace. Conclusions Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology. PMID:25361721
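The chi-squared association tests mentioned above can be computed directly from a contingency table. A minimal sketch (the function name and toy tables are illustrative, not from the study):

```python
import numpy as np

def chi_squared_stat(table):
    """Pearson chi-squared statistic for a contingency table, the kind of
    test used to relate computer knowledge to provider characteristics."""
    t = np.asarray(table, dtype=float)
    # expected counts under independence: outer product of the marginals
    expected = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
    return float(((t - expected) ** 2 / expected).sum())
```

Comparing the statistic against a chi-squared distribution with (rows-1)(cols-1) degrees of freedom yields p-values like the study's p<0.01.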

  11. Health workers' knowledge of and attitudes towards computer applications in rural African health facilities.

    PubMed

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E; Blank, Antje

    2014-01-01

    The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. To report an assessment of health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA were used to describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. A total of 108 providers responded; 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p<0.01). Most (95.3%) had positive attitudes towards computers - average score (±SD) of 37.2 (±4.9). Females had significantly lower scores than males. Interviews and group discussions showed that although most lacked computer knowledge and experience, they were optimistic about overcoming challenges associated with the introduction of computers in their workplace. Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology.

  12. Use of 13Cα Chemical-Shifts in Protein Structure Determination

    PubMed Central

    Vila, Jorge A.; Ripoll, Daniel R.; Scheraga, Harold A.

    2008-01-01

    A physics-based method, aimed at determining protein structures by using NOE-derived distances together with observed and computed 13C chemical shifts, is proposed. The approach makes use of 13Cα chemical shifts, computed at the density functional level of theory, to obtain torsional constraints for all backbone and side-chain torsional angles without making a priori use of the occupancy of any region of the Ramachandran map by the amino acid residues. The torsional constraints are not fixed but are changed dynamically in each step of the procedure, following an iterative self-consistent approach intended to identify a set of conformations for which the computed 13Cα chemical shifts match the experimental ones. A test is carried out on a 76-amino acid all-α-helical protein, namely the B. subtilis acyl carrier protein. It is shown that, starting from randomly generated conformations, the final protein models are more accurate than an existing NMR-derived structure model of this protein, in terms of both the agreement between predicted and observed 13Cα chemical shifts and some stereochemical quality indicators, and are of similar accuracy to one of the protein models solved at a high level of resolution. The results provide evidence that this methodology can be used not only for structure determination but also for additional protein structure refinement of NMR-derived models deposited in the Protein Data Bank. PMID:17516673
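The self-consistent step, retaining conformations whose computed 13Cα shifts best match experiment, reduces to a simple agreement score. A hedged sketch (the function names and dictionary layout are assumptions, not the authors' code):

```python
import numpy as np

def shift_rmsd(computed, observed):
    """Root-mean-square deviation (ppm) between computed and observed 13Ca shifts."""
    d = np.asarray(computed, dtype=float) - np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

def best_conformer(conformers, observed):
    """Pick the conformer whose computed shifts agree best with experiment."""
    return min(conformers, key=lambda c: shift_rmsd(c["shifts"], observed))
```

In the actual procedure this selection feeds back into new torsional constraints, and the cycle repeats until the computed and experimental shifts agree.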

  13. Modeling Code Is Helping Cleveland Develop New Products

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.

  14. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  15. Computer Modeling of the Earliest Cellular Structures and Functions

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

    In the absence of extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures and developing designs for molecules that perform proto-cellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment are carried out by proteins bound to membranes. We will discuss a series of large-scale, molecular-level computer simulations which demonstrate (a) how small proteins (peptides) organize themselves into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g., channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10^6-10^8 time steps.
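The molecular dynamics method described, iteratively solving Newton's equations of motion, is commonly implemented with the velocity Verlet integrator. A minimal single-particle sketch (a generic illustration, not the production MD code used in these simulations):

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Iteratively integrate Newton's equations of motion, as an MD code does:
    update position, recompute the force, then update velocity."""
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt ** 2   # position update
        a_new = force(x) / mass              # force at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity update (averaged accel.)
        a = a_new
    return x, v
```

For a harmonic spring this reproduces the analytic oscillation to second order in the time step; a production code applies the same update to every atom per step, for the 10^6-10^8 steps the abstract cites.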

  16. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  17. Vibration-based health monitoring and model refinement of civil engineering structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrar, C.R.; Doebling, S.W.

    1997-10-01

    Damage or fault detection, as determined by changes in the dynamic properties of structures, is a subject that has received considerable attention in the technical literature beginning approximately 30 years ago. The basic idea is that changes in the structure's properties, primarily stiffness, will alter the dynamic properties of the structure such as resonant frequencies and mode shapes, and properties derived from these quantities such as modal-based flexibility. Recently, this technology has been investigated for applications to health monitoring of large civil engineering structures. This presentation will discuss such a study undertaken by engineers from New Mexico State University, Sandia National Laboratories, and Los Alamos National Laboratory. Experimental modal analyses were performed on an undamaged interstate highway bridge and immediately after four successively more severe damage cases were inflicted on the main girder of the structure. Results of these tests provide insight into the abilities of modal-based damage ID methods to identify damage and the current limitations of this technology. Closely related topics that will be discussed are the use of modal properties to validate computer models of the structure, the use of these computer models in the damage detection process, and the general lack of experimental investigation of large civil engineering structures.
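The premise that stiffness loss shifts resonant frequencies can be illustrated with the undamped generalized eigenproblem K v = w^2 M v. A small sketch on a 2-DOF toy model (the function name and matrices are illustrative, not the bridge):

```python
import numpy as np

def natural_frequencies(K, M):
    """Undamped modal frequencies (Hz) from stiffness and mass matrices via
    the generalized eigenproblem K v = w^2 M v -- the quantities whose
    downward shifts signal damage."""
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.abs(w2.real))) / (2.0 * np.pi)
```

A reduction in K (simulated girder damage) lowers every modal frequency, which is exactly the signature modal-based damage identification looks for.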

  18. DACS II - A distributed thermal/mechanical loads data acquisition and control system

    NASA Technical Reports Server (NTRS)

    Zamanzadeh, Behzad; Trover, William F.; Anderson, Karl F.

    1987-01-01

    A distributed data acquisition and control system has been developed for the NASA Flight Loads Research Facility. The DACS II system is composed of seven computer systems and four array processors configured as a main computer system, three satellite computer systems, and 13 analog input/output systems interconnected through three independent data networks. Up to three independent heating and loading tests can be run concurrently on different test articles, or the entire system can be used on a single large test such as a full-scale hypersonic aircraft. Thermal tests can include up to 512 independent adaptive closed loop control channels. The control system can apply up to 20 MW of heating to a test specimen while simultaneously applying independent mechanical loads. Each thermal control loop is capable of heating a structure at rates of up to 150 F per second over a temperature range of -300 to +2500 F. Up to 64 independent mechanical load profiles can be commanded along with thermal control. Up to 1280 analog inputs monitor temperature, load, displacement and strain on the test specimens with real time data displayed on up to 15 terminals as color plots and tabular data displays. System setup and operation are accomplished with interactive menu-driven displays with extensive facilities to assist the users in all phases of system operation.
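Conceptually, each closed-loop thermal channel resembles the toy loop below: a PI (proportional-integral) controller drives a first-order thermal plant toward its setpoint. The abstract does not specify DACS II's control law, so the gains, plant constants, and function name are illustrative assumptions:

```python
def run_loop(setpoint, steps, kp=2.0, ki=0.5, dt=0.01):
    """Toy closed-loop heater channel: a PI controller regulates a
    first-order thermal plant toward its setpoint temperature."""
    temp, integral = 0.0, 0.0
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt                # accumulate error (integral term)
        power = kp * err + ki * integral    # PI control law
        # first-order plant: applied heating minus losses to ambient
        temp += dt * (power - 0.1 * temp)
    return temp
```

The integral term is what removes steady-state error, so the loop settles on the setpoint rather than just near it; an adaptive channel would additionally retune the gains as the structure's thermal response changes.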

  19. Variability in the Propagation Phase of CFD-Based Noise Prediction: Summary of Results From Category 8 of the BANC-III Workshop

    NASA Technical Reports Server (NTRS)

    Lopes, Leonard; Redonnet, Stephane; Imamura, Taro; Ikeda, Tomoaki; Zawodny, Nikolas; Cunha, Guilherme

    2015-01-01

    The usage of Computational Fluid Dynamics (CFD) in noise prediction typically has been a two part process: accurately predicting the flow conditions in the near-field and then propagating the noise from the near-field to the observer. Due to the increase in computing power and the cost benefit when weighed against wind tunnel testing, the usage of CFD to estimate the local flow field of complex geometrical structures has become more routine. Recently, the Benchmark problems in Airframe Noise Computation (BANC) workshops have provided a community focus on accurately simulating the local flow field near the body with various CFD approaches. However, to date, little effort has been devoted to assessing the impact of the propagation phase of noise prediction. This paper includes results from the BANC-III workshop which explores variability in the propagation phase of CFD-based noise prediction. This includes two test cases: an analytical solution of a quadrupole source near a sphere and a computational solution around a nose landing gear. Agreement between three codes was very good for the analytic test case, but CFD-based noise predictions indicate that the propagation phase can introduce 3 dB or more of variability in noise predictions.

  20. Computer Simulation of Microwave Devices

    NASA Technical Reports Server (NTRS)

    Kory, Carol L.

    1997-01-01

    The accurate simulation of cold-test results including dispersion, on-axis beam interaction impedance, and attenuation of a helix traveling-wave tube (TWT) slow-wave circuit using the three-dimensional code MAFIA (Maxwell's Equations Solved by the Finite Integration Algorithm) was demonstrated for the first time. Obtaining these results is a critical step in the design of TWT's. A well-established procedure to acquire these parameters is to actually build and test a model or a scale model of the circuit. However, this procedure is time-consuming and expensive, and it limits freedom to examine new variations to the basic circuit. These limitations make the need for computational methods crucial since they can lower costs, reduce tube development time, and lessen limitations on novel designs. Computer simulation has been used to accurately obtain cold-test parameters for several slow-wave circuits. Although the helix slow-wave circuit remains the mainstay of the TWT industry because of its exceptionally wide bandwidth, until recently it has been impossible to accurately analyze a helical TWT using its exact dimensions because of the complexity of its geometrical structure. A new computer modeling technique developed at the NASA Lewis Research Center overcomes these difficulties. The MAFIA three-dimensional mesh for a C-band helix slow-wave circuit is shown.

  1. Rotor Airloads Prediction Using Unstructured Meshes and Loose CFD/CSD Coupling

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Lee-Rausch, Elizabeth M.

    2008-01-01

    The FUN3D unsteady Reynolds-averaged Navier-Stokes solver for unstructured grids has been modified to allow prediction of trimmed rotorcraft airloads. The trim of the rotorcraft and the aeroelastic deformation of the rotor blades are accounted for via loose coupling with the CAMRAD II rotorcraft computational structural dynamics code. The set of codes is used to analyze the HART-II Baseline, Minimum Noise and Minimum Vibration test conditions. The loose coupling approach is found to be stable and convergent for the cases considered. Comparison of the resulting airloads and structural deformations with experimentally measured data is presented. The effect of grid resolution and temporal accuracy is examined. Rotorcraft airloads prediction presents a very substantial challenge for Computational Fluid Dynamics (CFD). Not only must the unsteady nature of the flow be accurately modeled, but since most rotorcraft blades are not structurally stiff, an accurate simulation must account for the blade structural dynamics. In addition, trim of the rotorcraft to desired thrust and moment targets depends on both aerodynamic loads and structural deformation, and vice versa. Further, interaction of the fuselage with the rotor flow field can be important, so that relative motion between the blades and the fuselage must be accommodated. Thus a complete simulation requires coupled aerodynamics, structures and trim, with the ability to model geometrically complex configurations. NASA has recently initiated a Subsonic Rotary Wing (SRW) Project under the overall Fundamental Aeronautics Program. Within the context of SRW are efforts aimed at furthering the state of the art of high-fidelity rotorcraft flow simulations, using both structured and unstructured meshes. 
Structured-mesh solvers have an advantage in computation speed, but even though remarkably complex configurations may be accommodated using the overset grid approach, generation of complex structured-mesh systems can require months to set up. As a result, many rotorcraft simulations using structured-grid CFD neglect the fuselage. On the other hand, unstructured-mesh solvers are easily able to handle complex geometries, but suffer from slower execution speed. However, advances in both computer hardware and CFD algorithms have made previously state-of-the-art computations routine for unstructured-mesh solvers, so that rotorcraft simulations using unstructured grids are now viable. The aim of the present work is to develop a first-principles rotorcraft simulation tool based on an unstructured CFD solver.
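The loose CFD/CSD coupling described above is a fixed-point iteration: the aerodynamic solve and the structural solve are alternated until the blade deformation stops changing. A schematic sketch (the callables stand in for solvers like FUN3D and CAMRAD II; the scalar state and function names are simplifications of mine):

```python
def loose_coupling(aero_solve, struct_solve, tol=1e-8, max_iter=100):
    """Alternate aerodynamic and structural solves until the deformation
    converges, mirroring a loose CFD/CSD coupling loop."""
    deform = 0.0
    for _ in range(max_iter):
        loads = aero_solve(deform)          # airloads for current blade shape
        new_deform = struct_solve(loads)    # aeroelastic response to airloads
        if abs(new_deform - deform) < tol:  # converged: shape stopped changing
            return new_deform
        deform = new_deform
    raise RuntimeError("coupling did not converge")
```

Convergence of this outer loop is what the paper reports as the coupling being "stable and convergent"; trim targets would add a third solve to the same cycle.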

  2. Probabilistic sampling of protein conformations: new hope for brute force?

    PubMed

    Feldman, Howard J; Hogue, Christopher W V

    2002-01-01

    Protein structure prediction from sequence alone by "brute force" random methods is a computationally expensive problem. Estimates have suggested that it could take all the computers in the world longer than the age of the universe to compute the structure of a single 200-residue protein. Here we investigate the use of a faster version of our FOLDTRAJ probabilistic all-atom protein-structure-sampling algorithm. We have improved the method so that it is now over twenty times faster than originally reported, and capable of rapidly sampling conformational space without lattices. It uses geometrical constraints and a Lennard-Jones-type potential for self-avoidance. We have also implemented a novel method to add secondary structure-prediction information to produce protein-like amounts of secondary structure in sampled structures. In a set of 100,000 probabilistic conformers of 1VII, 1ENH, and 1PMC generated, the structures with smallest Cα RMSD from the native structure are 3.95, 5.12, and 5.95 Å, respectively. Expanding this test to a set of 17 distinct protein folds, we find that all-helical structures are "hit" by brute force more frequently than beta or mixed structures. For small helical proteins or very small non-helical ones, this approach should have a "hit" close enough to detect with a good scoring function in a pool of several million conformers. By fitting the distribution of RMSDs from the native state of each of the 17 sets of conformers to the extreme value distribution, we are able to estimate the size of conformational space for each. With a 0.5 Å RMSD cutoff, the number of conformers is roughly 2^N, where N is the number of residues in the protein. This is smaller than previous estimates, indicating an average of only two possible conformations per residue when sterics are accounted for.
Our method reduces the effective number of conformations available at each residue by probabilistic bias, without requiring any particular discretization of residue conformational space, and is the fastest method of its kind. With computer speeds doubling every 18 months and parallel and distributed computing becoming more practical, the brute force approach to protein structure prediction may yet have some hope in the near future. Copyright 2001 Wiley-Liss, Inc.
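The extreme value (Gumbel) fit used above to estimate conformational-space size can be done by the method of moments; the paper does not state which estimator the authors used, so this is a generic sketch:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(samples):
    """Method-of-moments fit of a Gumbel (extreme value) distribution, the
    family used to model the distribution of RMSDs from the native state."""
    s = np.asarray(samples, dtype=float)
    beta = s.std() * np.sqrt(6.0) / np.pi   # scale from the sample std. dev.
    mu = s.mean() - EULER_GAMMA * beta      # location from the sample mean
    return mu, beta
```

From the fitted tail one can extrapolate how many random conformers would be needed to reach a given RMSD cutoff, which is how an estimate like 2^N conformations arises.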

  3. Development history of the Hybrid Test Vehicle

    NASA Technical Reports Server (NTRS)

    Trummel, M. C.; Burke, A. F.

    1983-01-01

    Phase I of a joint Department of Energy/Jet Propulsion Laboratory Program undertook the development of the Hybrid Test Vehicle (HTV), which has subsequently progressed through design, fabrication, and testing and evaluation phases. Attention is presently given to the design and test experience gained during the HTV development program, and a discussion is presented of the design features and performance capabilities of the various 'mule' vehicles, devoted to the separate development of engine microprocessor control, vehicle structure, and mechanical components, whose elements were incorporated into the final HTV design. Computer projections of the HTV's performance are given.

  4. Proceedings of Damping Volume 1 of 3

    DTIC Science & Technology

    1993-06-01

    paper. This work will present a passive piezoelectric damping implementation on ASTREX, a large space structure. The motivation behind this research is...Presented at Damping '93, San Francisco, CA, February 24-26, 1993. Motivation: • Accurate design of precision structures • Computer modelling - Design...(KI f(0)/Fl,.) From equations (3) and (6), Young's modulus of the rubber specimen is written as; L Ea-K (15) A E - EJ(I+ PS4 ) (16) NONRESONANT TEST

  5. Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.

    PubMed

    Birkett, N J

    1988-03-01

    Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.
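The two features credited with reducing errors, entry-time range checks and automatic skip-patterns, can be captured in a few lines. A hypothetical sketch (the item schema, field names, and three-item instrument are invented for illustration):

```python
def run_interview(items, answers):
    """Toy CAPI engine: every response is range-checked the moment it is
    entered, and skip-pattern rules route the interview automatically."""
    record, i = {}, 0
    while i < len(items):
        item = items[i]
        value = answers[item["id"]]
        if not item["low"] <= value <= item["high"]:   # immediate error check
            raise ValueError(f"out-of-range answer for {item['id']}: {value}")
        record[item["id"]] = value
        # a skip rule returns the index of the next item to ask
        i = item["skip"](value) if "skip" in item else i + 1
    return record

# hypothetical three-item instrument: non-smokers skip the follow-up question
ITEMS = [
    {"id": "smokes", "low": 0, "high": 1, "skip": lambda v: 1 if v else 2},
    {"id": "per_day", "low": 0, "high": 100},
    {"id": "age", "low": 18, "high": 120},
]
```

Because the record is built directly at entry time, no transcription or keypunching step remains between interview and analysis, which is the workflow change the study evaluates.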

  6. Finite element analysis simulations for ultrasonic array NDE inspections

    NASA Astrophysics Data System (ADS)

    Dobson, Jeff; Tweedie, Andrew; Harvey, Gerald; O'Leary, Richard; Mulholland, Anthony; Tant, Katherine; Gachagan, Anthony

    2016-02-01

    Advances in manufacturing techniques and materials have led to an increase in the demand for reliable and robust inspection techniques to maintain safety critical features. The application of modelling methods to develop and evaluate inspections is becoming an essential tool for the NDE community. Current analytical methods are inadequate for simulation of arbitrary components and heterogeneous materials, such as anisotropic welds or composite structures. Finite element analysis (FEA) software, such as PZFlex, can simulate the inspection of these arrangements, making it economical to prototype and evaluate improved NDE methods. FEA is often seen as computationally expensive for ultrasound problems; however, advances in computing power have made it a more viable tool. This paper aims to illustrate the capability of appropriate FEA to produce accurate simulations of ultrasonic array inspections, minimizing the requirement for expensive test-piece fabrication. Validation is afforded via corroboration of the FE-derived and experimentally generated data sets for a test-block comprising 1D and 2D defects. The modelling approach is extended to consider the more troublesome aspects of heterogeneous materials where defect dimensions can be of the same length scale as the grain structure. The model is used to facilitate the implementation of new ultrasonic array inspection methods for such materials. This is exemplified by considering the simulation of ultrasonic NDE in a weld structure in order to assess new approaches to imaging such structures.
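Time-domain ultrasound simulation of the kind discussed rests on explicit wave-equation updates. A minimal 1D finite-difference analogue (a scalar toy, not PZFlex's scheme; the function name is mine):

```python
import numpy as np

def wave_step(u_prev, u, c, dx, dt):
    """One explicit finite-difference step of the 1D scalar wave equation,
    the kind of time-domain update a numerical ultrasound solver performs.
    Stable for the Courant number c*dt/dx <= 1."""
    r2 = (c * dt / dx) ** 2
    u_next = np.empty_like(u)
    # leapfrog in time, centred second difference in space
    u_next[1:-1] = 2 * u[1:-1] - u_prev[1:-1] \
        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u_next[0] = u_next[-1] = 0.0   # rigid (fixed) boundaries
    return u_next
```

A heterogeneous medium such as a coarse-grained weld is modelled by letting the wave speed c vary element by element, which is where FEA earns its keep over analytical methods.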

  7. Computation of statistical secondary structure of nucleic acids.

    PubMed Central

    Yamamoto, K; Kitamura, Y; Yoshikura, H

    1984-01-01

    This paper presents a computer analysis of statistical secondary structure of nucleic acids. For a given single-stranded nucleic acid, we generated a "structure map" which included all the annealing structures in the sequence. The map was transformed into an "energy map" by rough approximation; here, the energy level of every pairing structure consisting of more than 2 successive nucleic acid pairs was calculated. By using the "energy map", the probability of occurrence of each annealed structure was computed, i.e., the structure was computed statistically. The computation was based on the 8-queens problem from chess. The validity of our computer programme was checked by computing the tRNA structure, which has been well established. Successful application of this programme to small nuclear RNAs of various origins is demonstrated. PMID:6198622
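The 8-queens problem the authors borrow is a classic backtracking search over mutually non-conflicting placements, the same exhaustive-enumeration pattern needed to collect compatible base pairings from a structure map. A standard sketch of that search (illustrative only, not the paper's programme):

```python
def n_queens(n):
    """Count non-conflicting queen placements by backtracking: extend a
    partial solution row by row, pruning any square already attacked."""
    def place(row, cols, diag1, diag2):
        if row == n:
            return 1                      # a complete conflict-free placement
        total = 0
        for c in range(n):
            if c not in cols and (row - c) not in diag1 and (row + c) not in diag2:
                total += place(row + 1, cols | {c},
                               diag1 | {row - c}, diag2 | {row + c})
        return total
    return place(0, set(), set(), set())
```

In the nucleic acid setting, "rows" become candidate helices and the conflict sets become overlap constraints between pairings, but the prune-and-recurse structure is identical.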

  8. Cellular Ti-6Al-4V structures with interconnected macro porosity for bone implants fabricated by selective electron beam melting.

    PubMed

    Heinl, Peter; Müller, Lenka; Körner, Carolin; Singer, Robert F; Müller, Frank A

    2008-09-01

    Selective electron beam melting (SEBM) was successfully used to fabricate novel cellular Ti-6Al-4V structures for orthopaedic applications. Micro computer tomography (microCT) analysis demonstrated the capability to fabricate three-dimensional structures with an interconnected porosity and pore sizes suitable for tissue ingrowth and vascularization. Mechanical properties, such as compressive strength and elastic modulus, of the tested structures were similar to those of human bone. Thus, stress-shielding effects after implantation might be avoided due to a reduced stiffness mismatch between implant and bone. A chemical surface modification using HCl and NaOH induced apatite formation during in vitro bioactivity tests in simulated body fluid under dynamic conditions. The modified bioactive surface is expected to enhance the fixation of the implant in the surrounding bone as well as to improve its long-term stability.

  9. Finite element modelling of crash response of composite aerospace sub-floor structures

    NASA Astrophysics Data System (ADS)

    McCarthy, M. A.; Harte, C. G.; Wiggenraad, J. F. M.; Michielsen, A. L. P. J.; Kohlgrüber, D.; Kamoulakos, A.

    Composite energy-absorbing structures for use in aircraft are being studied within a European Commission research programme (CRASURV - Design for Crash Survivability). One of the aims of the project is to evaluate the current capabilities of crashworthiness simulation codes for composites modelling. This paper focuses on the computational analysis, using explicit finite element analysis, of a number of quasi-static and dynamic tests carried out within the programme. It describes the design of the structures, the analysis techniques used, and the results of the analyses in comparison to the experimental test results. It has been found that current multi-ply shell models are capable of modelling the main energy-absorbing processes at work in such structures. However, some deficiencies exist, particularly in modelling fabric composites. Developments within the finite element code are taking place as a result of this work, which will enable better representation of composite fabrics.

  10. Internal consistency and stability of the CANTAB neuropsychological test battery in children.

    PubMed

    Syväoja, Heidi J; Tammelin, Tuija H; Ahonen, Timo; Räsänen, Pekka; Tolvanen, Asko; Kankaanpää, Anna; Kantomaa, Marko T

    2015-06-01

    The Cambridge Neuropsychological Test Automated Battery (CANTAB) is a computer-administered test battery widely used in different populations. The internal consistency and 1-year stability of CANTAB tests were examined in school-age children. Two hundred thirty children (57% girls) from five schools in the Jyväskylä school district in Finland participated in the study in spring 2011. The children completed the following CANTAB tests: (a) visual memory (pattern recognition memory [PRM] and spatial recognition memory [SRM]), (b) executive function (spatial span [SSP], Stockings of Cambridge [SOC], and intra-extra dimensional set shift [IED]), and (c) attention (reaction time [RTI] and rapid visual information processing [RVP]). Seventy-four children (64% girls) participated in the follow-up measurements in spring 2012. Cronbach's alpha was used to estimate the internal consistency of the tests, and structural equation models were applied to examine their stability. The reliability and stability could not be determined for IED or SSP because of the nature of these tests. Internal consistency was acceptable only for the RTI task. The 1-year stability was moderate to good for the PRM, RTI, and RVP. The SSP and IED showed a moderate correlation between the two measurement points. The SRM and SOC tasks were not reliable or stable measures in this study population. For research purposes, we recommend using structural equation modeling to improve reliability. The results suggest that the reliability and stability of computer-based test batteries should be confirmed in the target population before using them for clinical or research purposes.
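
    Cronbach's alpha, used above for internal consistency, is simple to compute from a respondents-by-items score matrix. A minimal NumPy sketch, with made-up scores rather than actual CANTAB data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical scores: 6 children x 4 test items (illustrative only)
scores = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
    [4, 4, 5, 5],
])
alpha = cronbach_alpha(scores)
```

    Highly correlated items drive alpha toward 1; duplicated items give exactly 1.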

  11. Quantitative structure-activation barrier relationship modeling for Diels-Alder ligations utilizing quantum chemical structural descriptors.

    PubMed

    Nandi, Sisir; Monesi, Alessandro; Drgan, Viktor; Merzel, Franci; Novič, Marjana

    2013-10-30

    In the present study, we show the correlation of quantum chemical structural descriptors with the activation barriers of Diels-Alder ligations. A set of 72 non-catalysed Diels-Alder reactions was subjected to quantitative structure-activation barrier relationship (QSABR) modelling under the framework of theoretical quantum chemical descriptors calculated solely from the structures of the diene and dienophile reactants. Experimental activation barrier data were obtained from the literature. Descriptors were computed using Hartree-Fock theory with the 6-31G(d) basis set as implemented in the Gaussian 09 software. Variable selection and model development were carried out by stepwise multiple linear regression. The predictive performance of the QSABR model was assessed using the training and test set concept and by calculating the leave-one-out cross-validated Q2 and predictive R2 values. The QSABR model can explain and predict 86.5% and 80%, respectively, of the variance in the activation barrier data. Alternatively, a neural network model based on back-propagation of errors was developed to assess the nonlinearity of the sought correlations between theoretical descriptors and experimental reaction barriers. A reasonable predictability for the activation barriers of the test set reactions was obtained, which enabled an exploration and interpretation of the significant variables responsible for the Diels-Alder interaction between dienes and dienophiles. Thus, studies in the direction of QSABR modelling that provide efficient and fast prediction of activation barriers of Diels-Alder reactions turn out to be a meaningful alternative to transition state theory based computation.
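
    The leave-one-out cross-validated Q2 used above can be sketched with ordinary least squares in plain NumPy; the descriptors and responses below are synthetic stand-ins, not the actual quantum chemical data:

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_tot."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # leave sample i out
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        press += (y[i] - X[i] @ beta) ** 2     # prediction error on i
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - press / ss_tot

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))                   # 3 synthetic descriptors
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=30)
q2 = loo_q2(X, y)                              # near 1 for low-noise data
```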

  12. Computational analysis of an aortic valve jet

    NASA Astrophysics Data System (ADS)

    Shadden, Shawn C.; Astorino, Matteo; Gerbeau, Jean-Frédéric

    2009-11-01

    In this work we employ a coupled fluid-structure interaction (FSI) scheme using an immersed boundary method to simulate flow through a realistic, deformable, 3D aortic valve model. The resulting data were used to compute Lagrangian coherent structures (LCS), which revealed flow separation from the valve leaflets during systole and, correspondingly, the boundary between the jet of ejected fluid and the regions of separated, recirculating flow. The advantages of computing LCS in multi-dimensional FSI models of the aortic valve are twofold. First, the quality and effectiveness of existing clinical indices used to measure aortic jet size can be tested by taking advantage of the accurate measure of jet area derived from LCS. Second, as an ultimate goal, a reliable computational framework for the assessment of aortic valve stenosis could be developed.

  13. Characterization of structural connections using free and forced response test data

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Huckelbridge, Arthur A.

    1989-01-01

    The accurate prediction of system dynamic response often has been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components often are mechanically complex and difficult to model accurately by analysis. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed, then verified using a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems and does not require that the test data be measured directly at the connection locations.

  14. The Purpose of Generating Fatigue Crack Growth Threshold Data

    NASA Technical Reports Server (NTRS)

    Forth, Scott

    2006-01-01

    Test data show that C(T), M(T), and ESE(T) specimens of different widths and thicknesses generate different thresholds. Structures designed for "infinite life" are being re-evaluated: (a) the threshold changes from 6 to 3 ksi·in^(1/2); (b) the computed life changes from infinite to 4 missions. Multi-million dollar test programs are required to substantiate operation. Using ASTM E647 as standard guidance to generate threshold data is not practical. A threshold test approach needs to be standardized that will provide positive margin for high-cycle fatigue applications.

  15. Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models

    NASA Astrophysics Data System (ADS)

    Zang, Tianwu

    Predicting the three-dimensional structure of proteins has been a major interest in modern computational biology. While many successful methods can generate models with 3-5 Å root-mean-square deviation (RMSD) from the solution, progress in refining these models has been slow. It is therefore urgently necessary to develop effective methods to bring low-quality models into higher-accuracy ranges (e.g., less than 2 Å RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate molecular dynamics (MD) simulation. Second, two energy-biasing methods, the Structure-Based Model (SBM) and the Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to achieve significant refinement of low-quality models without any knowledge of the solution. Their effectiveness is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in MD simulations of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, the refinement test of two CASP10 targets using the PCST-EBM method indicates that EBM may bring the initial model to even higher quality levels. Furthermore, a multi-round refinement protocol with PCST-SBM improves the model quality of a protein to a level sufficiently high for molecular replacement in X-ray crystallography. Our results confirm the crucial role of enhanced sampling in protein structure prediction and demonstrate that considerable improvement of low-accuracy structures is still achievable with current force fields.
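
    The RMSD quality measure above is conventionally computed after optimal superposition of the model onto the reference; a minimal Kabsch-alignment sketch in NumPy (an illustrative sketch, not the thesis's code):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between point sets P, Q (N x 3) after optimal rotation of P onto Q."""
    P = P - P.mean(axis=0)              # remove translation
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                          # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))   # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt  # optimal rotation
    diff = P @ R - Q
    return np.sqrt((diff ** 2).sum() / len(P))

rng = np.random.default_rng(1)
P = rng.normal(size=(20, 3))             # hypothetical C-alpha coordinates
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
Q = P @ rot + np.array([1.0, 2.0, 3.0])  # rigidly transformed copy
rmsd = kabsch_rmsd(P, Q)                 # ~0 for a rigid transform
```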

  16. Computational approaches for drug discovery.

    PubMed

    Hung, Che-Lun; Chen, Chi-Chun

    2014-09-01

    Cellular proteins are the mediators of multiple organism functions, being involved in physiological mechanisms and disease. By discovering lead compounds that affect the function of target proteins, the target diseases or physiological mechanisms can be modulated. Based on knowledge of the ligand-receptor interaction, the chemical structures of leads can be modified to improve efficacy and selectivity and to reduce side effects. One rational drug design technology, which enables drug discovery based on knowledge of target structures, functional properties and mechanisms, is computer-aided drug design (CADD). The application of CADD can be cost-effective, using experiments to compare predicted and actual drug activity, the results from which can be used iteratively to improve compound properties. The two major CADD-based approaches are structure-based drug design, where protein structures are required, and ligand-based drug design, where ligands and their activities can be used to design compounds interacting with the protein structure. Approaches in structure-based drug design include docking, de novo design, fragment-based drug discovery and structure-based pharmacophore modeling. Approaches in ligand-based drug design include quantitative structure-affinity relationships and pharmacophore modeling based on ligand properties. Based on whether the structure of the receptor and its interaction with the ligand are known, different design strategies can be chosen. After lead compounds are generated, the rule of five can be used to assess whether they have drug-like properties. Several quality validation methods, such as cost function analysis, Fisher's cross-validation analysis and the goodness-of-hit test, can be used to estimate the metrics of different drug design strategies. To further improve CADD performance, multiple computers and graphics processing units may be applied to reduce costs. © 2014 Wiley Periodicals, Inc.
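
    The rule of five mentioned above is a screen of four property cutoffs, with at most one violation conventionally allowed; a sketch with illustrative property values for a hypothetical compound:

```python
def passes_rule_of_five(mol_weight, logp, h_donors, h_acceptors):
    """Lipinski's rule of five: a compound is more likely to be orally
    bioavailable if it violates at most one of these four cutoffs."""
    violations = sum([
        mol_weight > 500,   # molecular weight (Da)
        logp > 5,           # octanol-water partition coefficient
        h_donors > 5,       # hydrogen-bond donors
        h_acceptors > 10,   # hydrogen-bond acceptors
    ])
    return violations <= 1

# Hypothetical lead compound (values are illustrative)
ok = passes_rule_of_five(mol_weight=342.4, logp=2.1, h_donors=2, h_acceptors=5)
```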

  17. An impulsive receptance technique for the time domain computation of the vibration of a whole aero-engine model with nonlinear bearings

    NASA Astrophysics Data System (ADS)

    Hai, Pham Minh; Bonello, Philip

    2008-12-01

    The direct study of the vibration of real engine structures with nonlinear bearings, particularly aero-engines, has been severely limited by the fact that current nonlinear computational techniques are not well-suited for complex large-order systems. This paper introduces a novel implicit "impulsive receptance method" (IRM) for the time domain analysis of such structures. The IRM's computational efficiency is largely immune to the number of modes used and dependent only on the number of nonlinear elements. This means that, apart from retaining numerical accuracy, a much more physically accurate solution is achievable within a short timeframe. Simulation tests on a realistically sized representative twin-spool aero-engine showed that the new method was around 40 times faster than a conventional implicit integration scheme. Preliminary results for a given rotor unbalance distribution revealed the varying degree of journal lift, orbit size and shape at the example engine's squeeze-film damper bearings, and the effect of end-sealing at these bearings.

  18. High resolution spectroscopic mapping imaging applied in situ to multilayer structures for stratigraphic identification of painted art objects

    NASA Astrophysics Data System (ADS)

    Karagiannis, Georgios Th.

    2016-04-01

    The development of non-destructive techniques is a reality in the field of conservation science. These techniques are usually less accurate than analytical micro-sampling techniques; however, the proper development of soft-computing techniques can improve their accuracy. In this work, we propose a real-time, fast-acquisition spectroscopic mapping imaging system that operates from the ultraviolet to the mid-infrared (UV/Vis/nIR/mIR) region of the electromagnetic spectrum and is supported by a set of soft-computing methods to identify the materials present in a stratigraphic structure of paint layers. Specifically, the system acquires spectra in diffuse-reflectance mode, scanning a region of interest (ROI) over the wavelength range from 200 up to 5000 nm. A fuzzy c-means clustering algorithm, the soft-computing method employed, produces the mapping images. The method was evaluated on a Byzantine painted icon.
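
    The fuzzy c-means step can be sketched in a few lines of NumPy; the 2-D points below stand in for the real per-pixel spectral feature vectors:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Basic fuzzy c-means: returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1
    for _ in range(iters):
        w = u ** m                                 # fuzzified weights
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))         # inverse-distance update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])     # two well-separated blobs
centers, u = fuzzy_c_means(X, c=2)
```

    For well-separated data the centers converge to the blob means regardless of the random initialization.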

  19. VMOMS — A computer code for finding moment solutions to the Grad-Shafranov equation

    NASA Astrophysics Data System (ADS)

    Lao, L. L.; Wieland, R. M.; Houlberg, W. A.; Hirshman, S. P.

    1982-08-01

    Title of program: VMOMS
    Catalogue number: ABSH
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland (see application form in this issue)
    Computer: PDP-10/KL10; Installation: ORNL Fusion Energy Division, Oak Ridge National Laboratory, Oak Ridge, TN 37830, USA
    Operating system: TOPS-10
    Programming language used: FORTRAN
    High-speed storage required: 9000 words
    No. of bits in a word: 36
    Overlay structure: none
    Peripherals used: line printer, disk drive
    No. of cards in combined program and test deck: 2839
    Card punching code: ASCII

  20. Remote sensing of land-based voids using computer enhanced infrared thermography

    NASA Astrophysics Data System (ADS)

    Weil, Gary J.

    1989-10-01

    Experiments are described in which computer-enhanced infrared thermography techniques are used to detect and describe subsurface land-based voids, such as voids surrounding buried utility pipes, voids in concrete structures such as airport taxiways, abandoned buried utility storage tanks, and caves and underground shelters. Infrared thermography also helps to evaluate bridge deck systems, highway pavements, and garage concrete. The IR thermography techniques make it possible to survey large areas quickly and efficiently. The paper also surveys the advantages and limitations of thermographic testing in comparison with other forms of NDT.

  1. Mission definition study for Stanford relativity satellite. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    1971-01-01

    An analysis is presented for the cost of the mission as a function of the following variables: the amount of redundancy in the spacecraft, the amount of care taken in building the spacecraft (functional and environmental tests, screening of components, quality control, etc.), and the number of flights necessary to accomplish the mission. Thermal analyses and mathematical models for the experimental components are presented. The results of computer structural and stress analyses for supports and cylinders are discussed. Reliability, quality control, and control system simulation by computer are also considered.

  2. Translational Genomics Research Institute: Identification of Pathways Enriched with Condition-Specific Statistical Dependencies Across Four Subtypes of Glioblastoma Multiforme | Office of Cancer Genomics

    Cancer.gov

    Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.  
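
    The label-permutation scheme used above to assess significance can be illustrated generically; the statistic below is a simple difference of means, not EDDY's divergence between dependency-network distributions:

```python
import numpy as np

def permutation_pvalue(a, b, n_perm=2000, seed=0):
    """Two-sample permutation test on the absolute difference of means."""
    rng = np.random.default_rng(seed)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                         # random relabeling
        diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)               # add-one avoids p = 0

rng = np.random.default_rng(2)
p_diff = permutation_pvalue(rng.normal(0.0, 1.0, 40), rng.normal(1.5, 1.0, 40))
p_same = permutation_pvalue(rng.normal(0.0, 1.0, 40), rng.normal(0.0, 1.0, 40))
```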

  4. Computational test bench and flow chart for wavefront sensors

    NASA Astrophysics Data System (ADS)

    Abecassis, Úrsula V.; de Lima Monteiro, Davies W.; Salles, Luciana P.; Stanigher, Rafaela; Borges, Euller

    2014-05-01

    The wavefront reconstruction diagram was conceived to fill a need in the literature for a broader view of the many methods and optoelectronic devices used for wavefront reconstruction, and to show the interactions among them. A computational platform has been developed, guided by the diagram, to support decisions about the best technique and the photosensitive and electronic structures to be implemented. This work is directed toward an ophthalmological application: the development of an instrument to aid in the diagnosis of optical aberrations of the human eye.

  5. JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning

    NASA Astrophysics Data System (ADS)

    Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro

    2015-12-01

    We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.

  6. An Application Development Platform for Neuromorphic Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dean, Mark; Chan, Jason; Daffron, Christopher

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  7. Approximated maximum likelihood estimation in multifractal random walks

    NASA Astrophysics Data System (ADS)

    Løvsletten, O.; Rypdal, M.

    2012-04-01

    We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices.

  8. NASA Aeronautics: Research and Technology Program Highlights

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This report contains numerous color illustrations describing NASA's programs in aeronautics. The basic ideas involved are explained in brief paragraphs. The seven chapters deal with subsonic aircraft, high-speed transport, high-performance military aircraft, hypersonic/transatmospheric vehicles, critical disciplines, national facilities, and organizations and installations. Some individual aircraft discussed are: the SR-71 aircraft, aerospace planes, the high-speed civil transport (HSCT), the X-29 forward-swept wing research aircraft, and the X-31 aircraft. Critical disciplines discussed include numerical aerodynamic simulation, computational fluid dynamics, computational structural dynamics, and new experimental testing techniques.

  9. Pretest predictions of the Fast Flux Test Facility Passive Safety Test Phase IIB transients using United States derived computer codes and methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heard, F.J.; Harris, R.A.; Padilla, A.

    The SASSYS/SAS4A systems analysis code was used to simulate a series of unprotected loss of flow (ULOF) tests planned at the Fast Flux Test Facility (FFTF). The subject tests were designed to investigate the transient performance of the FFTF during various ULOF scenarios for two different loading patterns designed to produce extremes in the assembly load pad clearance and the direction of the initial assembly bows. The tests are part of an international program designed to extend the existing database on the performance of liquid metal reactors (LMRs). The analyses demonstrate that a wide range of power-to-flow ratios can be reached during the transients and, therefore, will yield valuable data on the dynamic character of the structural feedbacks in LMRs. These analyses will be repeated once the actual FFTF core loadings for the tests are available. These predictions, similar ones obtained by other international participants in the FFTF program, and post-test analyses will be used to upgrade and further verify the computer codes used to predict the behavior of LMRs.

  10. Data processing device test apparatus and method therefor

    DOEpatents

    Wilcox, Richard Jacob; Mulig, Jason D.; Eppes, David; Bruce, Michael R.; Bruce, Victoria J.; Ring, Rosalinda M.; Cole, Jr., Edward I.; Tangyunyong, Paiboon; Hawkins, Charles F.; Louie, Arnold Y.

    2003-04-08

    A method and apparatus for testing data processing devices are implemented. The test mechanism isolates critical paths by correlating a scanning microscope image with a selected speed-path failure. A trigger signal having a preselected value is generated at the start of each pattern vector. The sweep of the scanning microscope is controlled by a computer, which also receives and processes the image signals returned from the microscope. The value of the trigger signal is correlated with a set of pattern lines being driven on the device under test (DUT). The trigger is either asserted or negated depending on the detection of a pattern line failure and the particular line that failed. In response to the detection of the particular speed-path failure being characterized, and to the trigger signal, the control computer overlays a mask on the image of the DUT. The overlaid image provides a visual correlation of the failure with the structural elements of the DUT at the level of resolution of the microscope itself.

  11. Prediction and verification of creep behavior in metallic materials and components for the space shuttle thermal protection system

    NASA Technical Reports Server (NTRS)

    Davis, J. W.; Cramer, B. A.

    1976-01-01

    A method of analysis was developed for predicting permanent cyclic creep deflections in stiffened panel structures. This method uses creep equations based on cyclic tensile creep tests and a computer program to predict panel deflections as a function of mission cycle. Four materials were investigated - a titanium alloy (Ti-6Al-4V), a cobalt alloy (L605), and two nickel alloys (Rene'41 and TDNiCr). Steady-state and cyclic creep response data were obtained by testing tensile specimens fabricated from thin gage sheet (0.025 and 0.63 cm nominal). Steady-state and cyclic creep equations were developed which describe creep as a function of time, temperature and load. Tests were also performed on subsize (6.35 x 30.5 cm) rib and corrugation stiffened panels. These tests were used to correlate creep responses between elemental specimens and panels. The panel response was analyzed by use of a specially written computer program.
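
    Creep equations of the kind described, strain as a function of time, temperature, and load, commonly take a power-law/Arrhenius form; a sketch with placeholder constants, not the fitted values from this study:

```python
import math

def creep_strain(stress_mpa, temp_k, time_h,
                 A=1e-10, n=4.0, m=0.4, Q=250e3, R=8.314):
    """Illustrative power-law creep: eps = A * sigma^n * t^m * exp(-Q/(R*T)).

    All constants here are placeholders, not fitted material data.
    """
    return A * stress_mpa ** n * time_h ** m * math.exp(-Q / (R * temp_k))

# Creep strain grows with temperature and stress, as expected
eps_hot = creep_strain(stress_mpa=100.0, temp_k=1100.0, time_h=100.0)
eps_cold = creep_strain(stress_mpa=100.0, temp_k=900.0, time_h=100.0)
```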

  12. Prediction of protein structural classes by Chou's pseudo amino acid composition: approached using continuous wavelet transform and principal component analysis.

    PubMed

    Li, Zhan-Chao; Zhou, Xi-Bin; Dai, Zong; Zou, Xiao-Yong

    2009-07-01

    Prior knowledge of a protein's structural class can provide useful information about its overall structure, so quick and accurate computational determination of protein structural class is very important in protein science. One key to such computational methods is an accurate representation of the protein sample. Here, based on the concept of Chou's pseudo-amino acid composition (AAC; Chou, Proteins: Structure, Function, and Genetics, 43:246-255, 2001), a novel feature-extraction method combining the continuous wavelet transform (CWT) with principal component analysis (PCA) was introduced for the prediction of protein structural classes. First, a digital signal was obtained by mapping each amino acid according to various physicochemical properties. Second, CWT was utilized to extract a new feature vector based on the wavelet power spectrum (WPS), which contains more abundant sequence-order information in both the frequency and time domains, and PCA was then used to reorganize the feature vector to decrease information redundancy and computational complexity. Finally, a pseudo-amino acid composition feature vector was formed to represent the primary sequence by coupling the AAC vector with the set of new WPS feature vectors in an orthogonal space obtained by PCA. As a showcase, a rigorous jackknife cross-validation test was performed on the working datasets. The results indicated that prediction quality was improved, and the current approach to protein representation may serve as a useful complementary vehicle in classifying other attributes of proteins, such as enzyme family class, subcellular localization, membrane protein type, and protein secondary structure.
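
    The CWT-plus-PCA pipeline can be sketched with a hand-rolled Ricker wavelet and SVD-based PCA; the hydrophobicity scale excerpt and toy sequences below are illustrative, not the paper's exact property mappings:

```python
import numpy as np

# Illustrative Kyte-Doolittle-style hydrophobicity values (subset)
HYDRO = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'L': 3.8,
         'K': -3.9, 'G': -0.4, 'V': 4.2, 'S': -0.8, 'E': -3.5}

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet of width a."""
    t = np.arange(points) - (points - 1) / 2.0
    norm = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return norm * (1.0 - (t / a) ** 2) * np.exp(-t ** 2 / (2.0 * a ** 2))

def wavelet_power_features(seq, widths=(1, 2, 4, 8)):
    """Map a sequence to a property signal, convolve with wavelets at
    several widths, and return the mean scalogram power per width."""
    signal = np.array([HYDRO[ch] for ch in seq])
    rows = [np.convolve(signal, ricker(len(signal), a), mode='same')
            for a in widths]
    return (np.array(rows) ** 2).mean(axis=1)   # power per scale

def pca(X, k):
    """Project rows of X onto the top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

seqs = ['ARNDLKGVSE', 'AAVVLLGGSS', 'RNDEKSRNDE', 'VLAVLAVLAV']
features = np.array([wavelet_power_features(s) for s in seqs])
projected = pca(features, k=2)   # 4 sequences x 2 components
```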

  13. Algorithm for repairing the damaged images of grain structures obtained from the cellular automata and measurement of grain size

    NASA Astrophysics Data System (ADS)

    Ramírez-López, A.; Romero-Romo, M. A.; Muñoz-Negron, D.; López-Ramírez, S.; Escarela-Pérez, R.; Duran-Valencia, C.

    2012-10-01

    Computational models are developed to create grain structures using mathematical algorithms based on chaos theory, such as cellular automata, geometrical models, fractals, and stochastic methods. Because of the chaotic nature of grain structures, some of the most popular routines are based on the Monte Carlo method, statistical distributions, and random-walk methods, which can be easily programmed and included in nested loops. Nevertheless, grain structures are sometimes not well defined, as a result of computational errors and numerical inconsistencies in the mathematical methods. Due to the finite precision of numbers and the numerical restrictions during the simulation of solidification, damaged images appear on the screen. These images must be repaired to obtain a good measurement of grain geometrical properties. In the present work, mathematical algorithms were developed to repair, measure, and characterize grain structures obtained from cellular automata. Appropriate measurement of grain size and correct identification of interfaces and their lengths are very important topics in materials science because they are the representation and validation of mathematical models against real samples. The developed algorithms are tested and shown to be appropriate and efficient for eliminating the errors and characterizing the grain structures.
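
    One simple repair step of this kind reassigns each damaged cell (marked 0 here) to the majority grain label among its eight neighbors; a toy NumPy sketch, not the authors' algorithm:

```python
import numpy as np
from collections import Counter

def repair_grain_image(labels):
    """Fill damaged cells (label 0) with the most common nonzero
    label among the up-to-8 surrounding neighbors."""
    fixed = labels.copy()
    rows, cols = labels.shape
    for i, j in zip(*np.nonzero(labels == 0)):
        neigh = [labels[r, c]
                 for r in range(max(i - 1, 0), min(i + 2, rows))
                 for c in range(max(j - 1, 0), min(j + 2, cols))
                 if (r, c) != (i, j) and labels[r, c] != 0]
        if neigh:
            fixed[i, j] = Counter(neigh).most_common(1)[0][0]
    return fixed

grain = np.array([[1, 1, 2, 2],
                  [1, 0, 2, 2],   # damaged pixel inside grain 1
                  [1, 1, 2, 2]])
repaired = repair_grain_image(grain)
```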

  14. Solving complex band structure problems with the FEAST eigenvalue algorithm

    NASA Astrophysics Data System (ADS)

    Laux, S. E.

    2012-08-01

    With straightforward extension, the FEAST eigenvalue algorithm [Polizzi, Phys. Rev. B 79, 115112 (2009)] is capable of solving the generalized eigenvalue problems representing traveling-wave problems—as exemplified by the complex band-structure problem—even though the matrices involved are complex, non-Hermitian, and singular, and hence outside the originally stated range of applicability of the algorithm. The obtained eigenvalues/eigenvectors, however, contain spurious solutions which must be detected and removed. The efficiency and parallel structure of the original algorithm are unaltered. The complex band structures of Si layers of varying thicknesses and InAs nanowires of varying radii are computed as test problems.
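
    The core difficulty described above, a generalized eigenproblem A x = λ B x with singular B whose spurious (infinite) solutions must be filtered, can be illustrated on a toy 3x3 pencil with SciPy (not an actual band-structure matrix):

```python
import numpy as np
from scipy.linalg import eig

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])
B = np.diag([1.0, 1.0, 0.0])     # singular B -> one infinite eigenvalue

w, v = eig(A, B)                 # generalized eigenvalues; some inf/nan
finite = np.sort(w[np.isfinite(w)].real)   # keep only physical solutions
```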

  15. Thermography Inspection for Early Detection of Composite Damage in Structures During Fatigue Loading

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Burke, Eric R.; Parker, F. Raymond; Seebo, Jeffrey P.; Wright, Christopher W.; Bly, James B.

    2012-01-01

    Advanced composite structures are commonly tested under controlled loading. Understanding the initiation and progression of composite damage under load is critical for validating design concepts and structural analysis tools. Thermal nondestructive evaluation (NDE) is used to detect and characterize damage in composite structures during fatigue loading. A difference-image processing algorithm is demonstrated to enhance damage detection and characterization by removing thermal variations not associated with defects. In addition, a one-dimensional multilayered thermal model is used to characterize damage. Lastly, the thermography results are compared with other inspections, such as non-immersion ultrasonic inspection and X-ray computed tomography.
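
    The difference-image step amounts to subtracting a reference frame so that static thermal background cancels and only load-induced changes remain; a toy NumPy sketch with a hypothetical hot spot:

```python
import numpy as np

def difference_image(frame, reference):
    """Subtract a reference thermal frame to suppress static background."""
    return frame.astype(float) - reference.astype(float)

reference = np.full((8, 8), 20.0)   # uniform background temperature (C)
frame = reference.copy()
frame[3:5, 3:5] += 2.5              # hypothetical hot spot over damage
diff = difference_image(frame, reference)   # nonzero only at the defect
```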

  16. Solving protein structures using short-distance cross-linking constraints as a guide for discrete molecular dynamics simulations

    PubMed Central

    Brodie, Nicholas I.; Popov, Konstantin I.; Petrotchenko, Evgeniy V.; Dokholyan, Nikolay V.; Borchers, Christoph H.

    2017-01-01

    We present an integrated experimental and computational approach for de novo protein structure determination in which short-distance cross-linking data are incorporated into rapid discrete molecular dynamics (DMD) simulations as constraints, reducing the conformational space and achieving the correct protein folding on practical time scales. We tested our approach on myoglobin and FK506 binding protein—models for α helix–rich and β sheet–rich proteins, respectively—and found that the lowest-energy structures obtained were in agreement with the crystal structure, hydrogen-deuterium exchange, surface modification, and long-distance cross-linking validation data. Our approach is readily applicable to other proteins with unknown structures. PMID:28695211
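
    A short-distance cross-link constraint of the kind fed to the DMD simulations can be checked directly from coordinates; the toy coordinates and the 12 Å cutoff below are illustrative assumptions:

```python
import numpy as np

def violated_crosslinks(coords, pairs, max_dist=12.0):
    """Return cross-linked residue pairs whose inter-residue distance
    exceeds the short-distance cutoff (Angstroms, illustrative)."""
    coords = np.asarray(coords, dtype=float)
    bad = []
    for i, j in pairs:
        if np.linalg.norm(coords[i] - coords[j]) > max_dist:
            bad.append((i, j))
    return bad

# Hypothetical C-alpha coordinates for a 4-residue toy chain
coords = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0), (30.0, 0.0, 0.0)]
pairs = [(0, 2), (0, 3)]
bad = violated_crosslinks(coords, pairs)   # only (0, 3) is too far apart
```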

  17. Solving protein structures using short-distance cross-linking constraints as a guide for discrete molecular dynamics simulations.

    PubMed

    Brodie, Nicholas I; Popov, Konstantin I; Petrotchenko, Evgeniy V; Dokholyan, Nikolay V; Borchers, Christoph H

    2017-07-01

    We present an integrated experimental and computational approach for de novo protein structure determination in which short-distance cross-linking data are incorporated into rapid discrete molecular dynamics (DMD) simulations as constraints, reducing the conformational space and achieving the correct protein folding on practical time scales. We tested our approach on myoglobin and FK506 binding protein-models for α helix-rich and β sheet-rich proteins, respectively-and found that the lowest-energy structures obtained were in agreement with the crystal structure, hydrogen-deuterium exchange, surface modification, and long-distance cross-linking validation data. Our approach is readily applicable to other proteins with unknown structures.

  18. Implementation and extension of the impulse transfer function method for future application to the space shuttle project. Volume 2: Program description and user's guide

    NASA Technical Reports Server (NTRS)

    Patterson, G.

    1973-01-01

    Data processing procedures and computer programs were developed to predict structural responses using the Impulse Transfer Function (ITF) method. There are three major steps in the process: (1) analog-to-digital (A-D) conversion of the test data to produce Phase I digital tapes, (2) processing of the Phase I digital tapes to extract ITFs and store them in a permanent data bank, and (3) prediction of structural responses to a set of applied loads. The analog-to-digital conversion is performed by a standard package, which is described later in terms of the contents of the resulting Phase I digital tape. Two separate computer programs were developed to perform the digital processing.

  19. Crashworthiness: Planes, trains, and automobiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, R.W.; Tokarz, F.J.; Whirley, R.G.

    A powerful DYNA3D computer code simulates the dynamic effects of stress traveling through structures. It is the most advanced modeling tool available to study crashworthiness problems and to analyze impacts. Now used by some 1000 companies, government research laboratories, and universities in the U.S. and abroad, DYNA3D is also a preeminent example of successful technology transfer. The initial interest in such a code was to simulate the structural response of weapons systems. The need was to model not the explosive or nuclear events themselves but rather the impacts of weapons systems with the ground, tracking the stress waves as they move through the object. This type of computer simulation augmented or, in certain cases, reduced the need for expensive and time-consuming crash testing.

  20. Normalized Cut Algorithm for Automated Assignment of Protein Domains

    NASA Technical Reports Server (NTRS)

    Samanta, M. P.; Liang, S.; Zha, H.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a novel computational method for automatic assignment of protein domains from structural data. At the core of our algorithm lies a recently proposed clustering technique that has been very successful for image-partitioning applications. This graph-theory-based clustering method uses the notion of a normalized cut to partition an undirected graph into its strongly connected components. A computer implementation of our method, tested on the standard comparison set of proteins from the literature, shows a high success rate (84%), better than most existing alternatives. In addition, several other features of our algorithm, such as reliance on few adjustable parameters, linear run time with respect to the size of the protein, and reduced complexity compared to other graph-theory-based algorithms, make it an attractive tool for structural biologists.
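
    A minimal version of the normalized-cut idea can be sketched as a spectral bipartition: form the normalized graph Laplacian and split vertices by the sign of its Fiedler vector. This is the generic technique, not the authors' full domain-assignment pipeline; the graph below is a toy example:

```python
import numpy as np

def normalized_cut_bipartition(W):
    """Split an undirected weighted graph into two groups using the sign
    of the Fiedler vector of the symmetric normalized Laplacian (sketch)."""
    d = W.sum(axis=1)
    d_isqrt = 1.0 / np.sqrt(d)
    # L_sym = I - D^{-1/2} W D^{-1/2}
    L = np.eye(len(W)) - d_isqrt[:, None] * W * d_isqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = eigvecs[:, 1]                # 2nd-smallest eigenvalue's vector
    return fiedler >= 0                    # boolean group labels

# Two 3-node cliques joined by one weak edge: the cut separates the cliques.
W = np.array([
    [0.0, 1.0, 1.0, 0.1, 0.0, 0.0],
    [1.0, 0.0, 1.0, 0.0, 0.0, 0.0],
    [1.0, 1.0, 0.0, 0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0, 0.0, 1.0],
    [0.0, 0.0, 0.0, 1.0, 1.0, 0.0],
])
labels = normalized_cut_bipartition(W)
print(labels[:3], labels[3:])   # one clique per group
```

    For protein domains, the vertices would be residues and the edge weights contact strengths; the same spectral split then separates weakly coupled structural units.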

  1. Space Shuttle Main Engine structural analysis and data reduction/evaluation. Volume 6: Primary nozzle diffuser analysis

    NASA Technical Reports Server (NTRS)

    Foley, Michael J.

    1989-01-01

    The primary nozzle diffuser routes fuel from the main fuel valve on the Space Shuttle Main Engine (SSME) to the nozzle coolant inlet manifold, main combustion chamber coolant inlet manifold, chamber coolant valve, and the augmented spark igniters. The diffuser also includes the fuel system purge check valve connection. A static stress analysis was performed on the diffuser because no detailed analysis was done on this part in the past. Structural concerns were in the area of the welds because approximately 10 percent are in areas inaccessible by X-ray testing devices. Flow dynamics and thermodynamics were not included in the analysis load case. Constant internal pressure at maximum SSME power was used instead. A three-dimensional finite element model was generated using ANSYS version 4.3A on the Lockheed VAX 11/785 computer to perform the stress computations. IDEAS Supertab on a Sun 3/60 computer was used to create the finite element model. Rocketdyne drawing number RS009156 was used for the model interpretation. The flight diffuser is denoted as -101. A description of the model, boundary conditions/load case, material properties, structural analysis/results, and a summary are included for documentation.

  2. Development of a thermal and structural analysis procedure for cooled radial turbines

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Deanna, Russell G.

    1988-01-01

    A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi three-dimensional code computes the external free stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code for the momentum and energy equation. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.
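
    The chaining of convective boundary conditions into the conduction step can be illustrated with the simplest 1-D steady case: a plane wall between hot mainstream gas and coolant, modeled as thermal resistances in series. All numerical values below are illustrative, not from the report:

```python
def wall_heat_flux(t_gas, t_cool, h_gas, h_cool, thickness, k):
    """Steady 1-D heat flux (W/m^2) through a plane wall with convective
    boundary conditions on both faces: series thermal resistances."""
    resistance = 1.0 / h_gas + thickness / k + 1.0 / h_cool
    return (t_gas - t_cool) / resistance

# Illustrative values: hot gas at 1400 K, coolant at 600 K.
q = wall_heat_flux(t_gas=1400.0, t_cool=600.0,
                   h_gas=2000.0, h_cool=5000.0,  # film coefficients, W/(m^2 K)
                   thickness=0.002, k=20.0)      # wall: 2 mm, W/(m K)
wall_temp_hot = 1400.0 - q / 2000.0              # gas-side metal temperature
print(round(q), round(wall_temp_hot))            # 1000000 900
```

    The external and internal flow codes in the procedure supply the two film coefficients and fluid temperatures; the conduction code then resolves the metal temperature field, here collapsed to a single algebraic step.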

  3. Two-Level Weld-Material Homogenization for Efficient Computational Analysis of Welded Structure Blast-Survivability

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.

    2012-06-01

    The introduction of newer joining technologies like the so-called friction-stir welding (FSW) into automotive engineering entails the knowledge of the joint-material microstructure and properties. Since, the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of the computational engineering analyses (CEA), robust high-fidelity material models are needed for the FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage computational cost and computer storage requirements for such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.

  4. Development of a thermal and structural analysis procedure for cooled radial turbines

    NASA Technical Reports Server (NTRS)

    Kumar, Ganesh N.; Deanna, Russell G.

    1988-01-01

    A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi-three-dimensional code computes the external free stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous three-dimensional internal flow code for the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for the calculation of rotor temperatures. The rotor stress distribution may be determined for the given thermal, pressure and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results are given.

  5. Hierarchical pictorial structures for simultaneously localizing multiple organs in volumetric pre-scan CT

    NASA Astrophysics Data System (ADS)

    Montillo, Albert; Song, Qi; Das, Bipul; Yin, Zhye

    2015-03-01

    Parsing volumetric computed tomography (CT) into 10 or more salient organs simultaneously is a challenging task with many applications such as personalized scan planning and dose reporting. In the clinic, pre-scan data can come in the form of very low dose volumes acquired just prior to the primary scan or from an existing primary scan. To localize organs in such diverse data, we propose a new learning-based framework that we call hierarchical pictorial structures (HPS), which builds multiple levels of models in a tree-like hierarchy that mirrors the natural decomposition of human anatomy from gross structures to finer structures. Each node of our hierarchical model learns (1) the local appearance and shape of structures, and (2) a generative global model that learns probabilistic structural arrangement. Our main contribution is twofold. First, we embed the pictorial structures approach in a hierarchical framework, which reduces test-time image interpretation and allows for the incorporation of additional geometric constraints that robustly guide model fitting in the presence of noise. Second, we guide our HPS framework with probabilistic cost maps extracted by random decision forests from volumetric 3D HOG features, which makes our model fast to train, fast to apply to novel test data, and highly invariant to shape distortion and imaging artifacts. All steps require approximately 3 minutes to compute, and all organs are located with suitably high accuracy for our clinical applications such as personalized scan planning for radiation dose reduction. We assess our method using a database of volumetric CT scans from 81 subjects with widely varying age and pathology and with simulated ultra-low dose cadaver pre-scan data.

  6. NASTRAN Analysis Comparison to Shock Tube Tests Used to Simulate Nuclear Overpressures

    NASA Technical Reports Server (NTRS)

    Wheless, T. K.

    1985-01-01

    This report presents a study of the effectiveness of the NASTRAN computer code for predicting structural response to nuclear blast overpressures. NASTRAN's effectiveness is determined by comparing results against shock tube tests used to simulate nuclear overpressures. Seven panels of various configurations are compared in this study. Panel deflections are the criteria used to measure NASTRAN's effectiveness. This study is the result of the need for improved survivability/vulnerability analyses of structures subjected to nuclear blast.

  7. Initial validation of a web-based self-administered neuropsychological test battery for older adults and seniors.

    PubMed

    Hansen, Tor Ivar; Haferstrom, Elise Christina D; Brunner, Jan F; Lehn, Hanne; Håberg, Asta Kristine

    2015-01-01

    Computerized neuropsychological tests are effective in assessing different cognitive domains, but are often limited by the need for proprietary hardware and technical staff. Web-based tests can be more accessible and flexible. We aimed to investigate validity, effects of computer familiarity, education, and age, and the feasibility of a new web-based self-administered neuropsychological test battery (Memoro) in older adults and seniors. A total of 62 (37 female) participants (mean age 60.7 years) completed the Memoro web-based neuropsychological test battery and a traditional battery composed of similar tests intended to measure the same cognitive constructs. Participants were assessed on computer familiarity and how they experienced the two batteries. To properly test the factor structure of Memoro, an additional factor analysis in 218 individuals from the HUNT population was performed. Comparing Memoro to traditional tests, we observed good concurrent validity (r = .49-.63). The performance on the traditional and Memoro test batteries was consistent, but differences in raw scores were observed, with higher scores in verbal memory and lower scores in spatial memory in Memoro. Factor analysis indicated two factors: verbal and spatial memory. There were no correlations between test performance and computer familiarity after adjustment for age or age and education. Subjects reported that they preferred web-based testing as it allowed them to set their own pace, and they did not feel scrutinized by an administrator. Memoro showed good concurrent validity compared to neuropsychological tests measuring similar cognitive constructs. Based on the current results, Memoro appears to be a tool that can be used to assess cognitive function in older and senior adults. Further work is necessary to ascertain its validity and reliability.

  8. Initial validation of a web-based self-administered neuropsychological test battery for older adults and seniors

    PubMed Central

    Hansen, Tor Ivar; Haferstrom, Elise Christina D.; Brunner, Jan F.; Lehn, Hanne; Håberg, Asta Kristine

    2015-01-01

    Introduction: Computerized neuropsychological tests are effective in assessing different cognitive domains, but are often limited by the need for proprietary hardware and technical staff. Web-based tests can be more accessible and flexible. We aimed to investigate validity, effects of computer familiarity, education, and age, and the feasibility of a new web-based self-administered neuropsychological test battery (Memoro) in older adults and seniors. Method: A total of 62 (37 female) participants (mean age 60.7 years) completed the Memoro web-based neuropsychological test battery and a traditional battery composed of similar tests intended to measure the same cognitive constructs. Participants were assessed on computer familiarity and how they experienced the two batteries. To properly test the factor structure of Memoro, an additional factor analysis in 218 individuals from the HUNT population was performed. Results: Comparing Memoro to traditional tests, we observed good concurrent validity (r = .49–.63). The performance on the traditional and Memoro test batteries was consistent, but differences in raw scores were observed, with higher scores in verbal memory and lower scores in spatial memory in Memoro. Factor analysis indicated two factors: verbal and spatial memory. There were no correlations between test performance and computer familiarity after adjustment for age or age and education. Subjects reported that they preferred web-based testing as it allowed them to set their own pace, and they did not feel scrutinized by an administrator. Conclusions: Memoro showed good concurrent validity compared to neuropsychological tests measuring similar cognitive constructs. Based on the current results, Memoro appears to be a tool that can be used to assess cognitive function in older and senior adults. Further work is necessary to ascertain its validity and reliability. PMID:26009791

  9. An influence coefficient method for the application of the modal technique to wing flutter suppression of the DAST ARW-1 wing

    NASA Technical Reports Server (NTRS)

    Pines, S.

    1981-01-01

    The methods used to compute the mass, structural stiffness, and aerodynamic forces in the form of influence coefficient matrices are described, as applied to a flutter analysis of the Drones for Aerodynamic and Structural Testing (DAST) Aeroelastic Research Wing. The DAST wing was chosen because wind tunnel flutter test data and zero-speed vibration data of the modes and frequencies exist and are available for comparison. A derivation of the equations of motion that can be used to apply the modal method for flutter suppression is included. A comparison of the open loop flutter predictions with both wind tunnel data and other analytical methods is presented.

  10. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models, optimizing their designs for minimum weight subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying parts of the code.
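
    The sizing loop in such a weight-minimization setup can be illustrated with the classic fully-stressed-design update, used here as a generic stand-in for the optimizer (not OPTNAST's actual algorithm): each member's area is set so its axial stress reaches the allowable. The truss data below are hypothetical:

```python
def fully_stressed_areas(forces, sigma_allow, a_min=1e-4):
    """Size each truss member so |F|/A equals the allowable stress,
    subject to a minimum gage area (illustrative sizing heuristic)."""
    return [max(abs(f) / sigma_allow, a_min) for f in forces]

def truss_weight(areas, lengths, density):
    """Total weight as the sum of member volume times material density."""
    return sum(density * a, * (l,)) if False else sum(
        density * a * l for a, l in zip(areas, lengths))

# Two-bar illustrative truss: member forces in N, allowable 250 MPa, steel.
forces = [50_000.0, -30_000.0]
areas = fully_stressed_areas(forces, sigma_allow=250e6)
weight = truss_weight(areas, lengths=[1.0, 1.5], density=7850.0)
print(areas)             # [0.0002, 0.00012] m^2
print(round(weight, 3))  # 2.983 kg
```

    For statically determinate trusses one pass suffices; with multiple load cases or displacement constraints, an optimizer such as the one OPTNAST links in must iterate between the sizing step and a NASTRAN reanalysis.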

  11. Advanced Computing Methods for Knowledge Discovery and Prognosis in Acoustic Emission Monitoring

    ERIC Educational Resources Information Center

    Mejia, Felipe

    2012-01-01

    Structural health monitoring (SHM) has gained significant popularity in the last decade. This growing interest, coupled with new sensing technologies, has resulted in an overwhelming amount of data in need of management and useful interpretation. Acoustic emission (AE) testing has been particularly fraught by the problem of growing data and is…

  12. CHINESE GRAMMARS AND THE COMPUTER AT THE OHIO STATE UNIVERSITY. PRELIMINARY REPORT.

    ERIC Educational Resources Information Center

    MEYERS, L.F.; YANG, J.

    SAMPLE OUTPUT SENTENCES OF VARIOUS COMIT AND SNOBOL PROGRAMS FOR TESTING A CHINESE GENERATIVE GRAMMAR ARE PRESENTED. THE GRAMMAR CHOSEN FOR EXPERIMENTATION IS A PRELIMINARY VERSION OF A TRANSFORMATIONAL GRAMMAR. ALL OF THE COMIT PROGRAMS AND ONE OF THE SNOBOL PROGRAMS USE A LINEARIZED REPRESENTATION OF TREE STRUCTURES, WITH ADDITIONAL NUMERICAL…

  13. Attending to Structural Programming Features Predicts Differences in Learning and Motivation

    ERIC Educational Resources Information Center

    Witherspoon, Eben B.; Schunn, Christian D.; Higashi, Ross M.; Shoop, Robin

    2018-01-01

    Educational robotics programs offer an engaging opportunity to potentially teach core computer science concepts and practices in K-12 classrooms. Here, we test the effects of units with different programming content within a virtual robotics context on both learning gains and motivational changes in middle school (6th-8th grade) robotics…

  14. Mechanical Engineering at KSC: 'How I spend My Hours from 9 to 5 and Draw a Paycheck'

    NASA Technical Reports Server (NTRS)

    Randazzo, John; Steinrock, Todd (Technical Monitor)

    2003-01-01

    This viewgraph presentation provides an overview of a senior mechanical engineer's role in designing and testing sensors to fly aboard the shuttle Discovery during STS-95 and STS-98. Topics covered include: software development tools, computational fluid dynamics, structural analysis, housing design, and systems integration.

  15. NASTRAN applications to aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    White, J. L.; Beste, D. L.

    1975-01-01

    The use of NASTRAN in propulsion system structural integration analysis is described. Computer support programs for modeling, substructuring, and plotting analysis results are discussed. Requirements on interface information and data exchange by participants in a NASTRAN substructure analysis are given. Static and normal modes vibration analysis results are given with comparison to test and other analytical results.

  16. High-Dimensional Semantic Space Accounts of Priming

    ERIC Educational Resources Information Center

    Jones, Michael N.; Kintsch, Walter; Mewhort, Douglas J. K.

    2006-01-01

    A broad range of priming data has been used to explore the structure of semantic memory and to test between models of word representation. In this paper, we examine the computational mechanisms required to learn distributed semantic representations for words directly from unsupervised experience with language. To best account for the variety of…

  17. Computer-Based Learning: Graphical Integration of Whole and Sectional Neuroanatomy Improves Long-Term Retention

    ERIC Educational Resources Information Center

    Naaz, Farah; Chariker, Julia H.; Pani, John R.

    2014-01-01

    A study was conducted to test the hypothesis that instruction with graphically integrated representations of whole and sectional neuroanatomy is especially effective for learning to recognize neural structures in sectional imagery (such as magnetic resonance imaging [MRI]). Neuroanatomy was taught to two groups of participants using computer…

  18. Testing the Relation between Fidelity of Implementation and Student Outcomes in Math

    ERIC Educational Resources Information Center

    Crawford, Lindy; Carpenter, Dick M., II; Wilson, Mary T.; Schmeister, Megan; McDonald, Marilee

    2012-01-01

    The relation between fidelity of implementation and student outcomes in a computer-based middle school mathematics curriculum was measured empirically. Participants included 485 students and 23 teachers from 11 public middle schools across seven states. Implementation fidelity was defined using two constructs: fidelity to structure and fidelity to…

  19. A cross-sectional evaluation of computer literacy among medical students at a tertiary care teaching hospital in Mumbai, Bombay.

    PubMed

    Panchabhai, T S; Dangayach, N S; Mehta, V S; Patankar, C V; Rege, N N

    2011-01-01

    Computer usage capabilities of medical students for introduction of computer-aided learning have not been adequately assessed. Cross-sectional study to evaluate computer literacy among medical students. Tertiary care teaching hospital in Mumbai, India. Participants were administered a 52-question questionnaire, designed to study their background, computer resources, computer usage, activities enhancing computer skills, and attitudes toward computer-aided learning (CAL). The data were classified on the basis of sex, native place, and year of medical school, and the computer resources were compared. The computer usage and attitudes toward computer-based learning were assessed on a five-point Likert scale, to calculate a computer usage score (CUS - maximum 55, minimum 11) and an attitude score (AS - maximum 60, minimum 12). The quartile distribution among the groups with respect to the CUS and AS was compared by chi-squared tests. The correlation between CUS and AS was then tested. Eight hundred and seventy-five students agreed to participate in the study and 832 completed the questionnaire. One hundred and twenty-eight questionnaires were excluded and 704 were analyzed. Outstation students had significantly fewer computer resources as compared to local students (P<0.0001). The mean CUS for local students (27.0±9.2, mean±SD) was significantly higher than for outstation students (23.2±9.05). No such difference was observed for the AS. The means of CUS and AS did not differ between males and females. The CUS and AS had positive but weak correlations for all subgroups. The weak correlation between AS and CUS for all students could be explained by the lack of computer resources or inadequate training to use computers for learning. Providing additional resources would benefit the subset of outstation students with fewer computer resources. This weak correlation between the attitudes and practices of all students needs to be investigated. We believe that this gap can be bridged with a structured computer learning program.
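
    The headline statistic, the correlation between the computer usage score (CUS) and the attitude score (AS), is an ordinary Pearson coefficient over per-student Likert sums. A generic sketch with made-up scores (the numbers below are illustrative, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-student sums: CUS on 11-55, AS on 12-60.
cus = [27, 35, 19, 42, 30, 23, 38, 25]
att = [40, 44, 35, 50, 41, 37, 46, 39]
r = pearson_r(cus, att)
print(round(r, 3))
```

    In the study itself the observed correlations were positive but weak; the toy data above are deliberately strongly correlated just to exercise the formula.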

  20. Exact calculation of distributions on integers, with application to sequence alignment.

    PubMed

    Newberg, Lee A; Lawrence, Charles E

    2009-01-01

    Computational biology is replete with high-dimensional discrete prediction and inference problems. Dynamic programming recursions can be applied to several of the most important of these, including sequence alignment, RNA secondary-structure prediction, phylogenetic inference, and motif finding. In these problems, attention is frequently focused on some scalar quantity of interest, a score, such as an alignment score or the free energy of an RNA secondary structure. In many cases, score is naturally defined on integers, such as a count of the number of pairing differences between two sequence alignments, or else an integer score has been adopted for computational reasons, such as in the test of significance of motif scores. The probability distribution of the score under an appropriate probabilistic model is of interest, such as in tests of significance of motif scores, or in calculation of Bayesian confidence limits around an alignment. Here we present three algorithms for calculating the exact distribution of a score of this type; then, in the context of pairwise local sequence alignments, we apply the approach so as to find the alignment score distribution and Bayesian confidence limits.
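
    The core technique, exact distributions of integer scores, can be illustrated in its simplest form: the distribution of a sum of independent integer-valued terms is obtained by exact convolution of per-term distributions rather than by sampling. This generic sketch is not the authors' alignment recursion:

```python
def convolve(p, q):
    """Exact distribution of X + Y given the distributions of X and Y,
    each represented as {integer value: probability}."""
    out = {}
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] = out.get(x + y, 0.0) + px * qy
    return out

# Exact distribution of the total of two fair six-sided dice.
die = {v: 1 / 6 for v in range(1, 7)}
total = convolve(die, die)
print(total[7])   # probability of a total of 7: 6/36
```

    In a dynamic-programming setting, the same convolution is applied at each recursion step, so the full distribution of the final score, and hence exact p-values or confidence limits, comes out of a single pass.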
