Sample records for element analysis tools

  1. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary-layer coupling methods and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined: one uses multi-block patched grids, the other overset chimera grids. Turbulence and transition modeling will be discussed.

  2. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
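The spring-to-stress post-processing step described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the article's actual tool; the function name, forces, and tributary areas are assumptions.

```python
# Sketch: recover adhesive stresses from spring forces in a bonded-joint FE
# model. Each spring carries a force; dividing by its tributary bond area
# gives an average stress. All numbers below are illustrative.

def adhesive_stresses(spring_forces, tributary_areas):
    """Convert spring forces (N) to stresses (Pa), sorted in descending order."""
    stresses = [f / a for f, a in zip(spring_forces, tributary_areas)]
    # Descending order puts the most critical bond locations first, which is
    # what the trade-off studies in the abstract need.
    return sorted(stresses, reverse=True)

shear_forces = [120.0, 45.0, 300.0]   # N per spring (illustrative)
areas = [1e-4, 1e-4, 1e-4]            # m^2 bond area per spring (illustrative)
peak_first = adhesive_stresses(shear_forces, areas)
print(peak_first[0])  # largest average shear stress, Pa
```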

  3. ElemeNT: a computational tool for detecting core promoter elements.

    PubMed

    Sloutskin, Anna; Danino, Yehuda M; Orenstein, Yaron; Zehavi, Yonathan; Doniger, Tirza; Shamir, Ron; Juven-Gershon, Tamar

    2015-01-01

Core promoter elements play a pivotal role in the transcriptional output, yet they are often detected manually within sequences of interest. Here, we present two contributions to the detection and curation of core promoter elements within given sequences. First, the Elements Navigation Tool (ElemeNT) is a user-friendly web-based, interactive tool for prediction and display of putative core promoter elements and their biologically-relevant combinations. Second, the CORE database summarizes ElemeNT-predicted core promoter elements near CAGE and RNA-seq-defined Drosophila melanogaster transcription start sites (TSSs). ElemeNT's predictions are based on biologically-functional core promoter elements, and can be used to infer core promoter compositions. ElemeNT does not assume prior knowledge of the actual TSS position, and can therefore assist in annotation of any given sequence. These resources, freely accessible at http://lifefaculty.biu.ac.il/gershon-tamar/index.php/resources, facilitate the identification of core promoter elements as active contributors to gene expression.

  4. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

Regulatory Sequence Analysis Tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
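The background-model idea mentioned in the abstract can be illustrated briefly: a Bernoulli model draws bases independently, while a Markov model conditions each base on its predecessor. This is a generic sketch, not RSAT's implementation; the transition probabilities are invented.

```python
# Generate a random control sequence from a first-order Markov background
# model (illustrative probabilities, not RSAT's actual estimates).
import random

def markov_sequence(length, initial, transitions, rng):
    """Draw a DNA sequence: first base from `initial`, rest from `transitions`."""
    seq = [rng.choices("ACGT", weights=initial)[0]]
    while len(seq) < length:
        seq.append(rng.choices("ACGT", weights=transitions[seq[-1]])[0])
    return "".join(seq)

uniform = [0.25, 0.25, 0.25, 0.25]      # Bernoulli-like start distribution
trans = {"A": [0.3, 0.2, 0.2, 0.3], "T": [0.3, 0.2, 0.2, 0.3],
         "G": [0.2, 0.3, 0.3, 0.2], "C": [0.2, 0.3, 0.3, 0.2]}
rng = random.Random(0)
print(markov_sequence(20, uniform, trans, rng))
```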

  5. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool allows structural analysts to interrogate a structural design based on a mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem chosen for the evaluation is a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
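The Monte Carlo scatter study described here can be illustrated with a toy response function. Everything below (the response, the input distributions, the correlation ranking) is an assumed stand-in, not MSC.Robust Design's actual workflow or API.

```python
# Toy Monte Carlo scatter study: sample the inputs, evaluate a stand-in
# response, and rank inputs by correlation with the output.
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def response(thickness, modulus):           # stand-in for a full FE solve
    return modulus * thickness ** 3 / 12.0  # bending stiffness of a unit strip

rng = random.Random(1)
samples = [(rng.gauss(2.0, 0.1), rng.gauss(70e9, 2e9)) for _ in range(2000)]
outputs = [response(t, e) for t, e in samples]

corrs = {"thickness": pearson([s[0] for s in samples], outputs),
         "modulus": pearson([s[1] for s in samples], outputs)}
print(corrs)  # thickness dominates: its 5% scatter enters the response cubed
```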

  6. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    PubMed

    Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A

    2018-03-01

Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.

  7. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules

    PubMed Central

    Solernou, Albert

    2018-01-01

Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package. PMID:29570700

  8. Finite Element Analysis of Surface Residual Stress in Functionally Gradient Cemented Carbide Tool

    NASA Astrophysics Data System (ADS)

    Su, Chuangnan; Liu, Deshun; Tang, Siwen; Li, Pengnan; Qiu, Xinyi

    2018-03-01

    A component distribution model is proposed for three-component functionally gradient cemented carbide (FGCC) based on electron probe microanalysis results obtained for gradient layer thickness, microstructure, and elemental distribution. The residual surface stress of FGCC-T5 tools occurring during the fabrication process is analyzed using an ANSYS-implemented finite element method (FEM) and X-ray diffraction. A comparison of the experimental and calculated values verifies the feasibility of using FEM to analyze the residual surface stress in FGCC-T5 tools. The effects of the distribution index, geometrical shape, substrate thickness, gradient layer thickness, and position of the cobalt-rich layer on residual surface stress are studied in detail.

  9. Toward automatic finite element analysis

    NASA Technical Reports Server (NTRS)

    Kela, Ajay; Perucchio, Renato; Voelcker, Herbert

    1987-01-01

Two problems must be solved if the finite element method is to become a reliable and affordable black-box engineering tool. Finite element meshes must be generated automatically from computer-aided design databases, and mesh analysis must be made self-adaptive. The experimental system described solves both problems in 2-D through spatial and analytical substructuring techniques that are now being extended into 3-D.

  10. Tool Releases Optical Elements From Spring Brackets

    NASA Technical Reports Server (NTRS)

    Gum, J. S.

    1984-01-01

    Threaded hooks retract bracket arms holding element. Tool uses three hooks with threaded shanks mounted in ring-shaped holder to pull on tabs to release optical element. One person can easily insert or remove optical element (such as prism or lens) from spring holder or bracket with minimal risk of damage.

  11. Intra-regional classification of grape seeds produced in Mendoza province (Argentina) by multi-elemental analysis and chemometrics tools.

    PubMed

    Canizo, Brenda V; Escudero, Leticia B; Pérez, María B; Pellerano, Roberto G; Wuilloud, Rodolfo G

    2018-03-01

The feasibility of applying chemometric techniques associated with multi-element analysis for the classification of grape seeds according to the soil of their vineyard of provenance was investigated. Grape seed samples from different localities of Mendoza province (Argentina) were evaluated. Inductively coupled plasma mass spectrometry (ICP-MS) was used for the determination of twenty-nine elements (Ag, As, Ce, Co, Cs, Cu, Eu, Fe, Ga, Gd, La, Lu, Mn, Mo, Nb, Nd, Ni, Pr, Rb, Sm, Te, Ti, Tl, Tm, U, V, Y, Zn and Zr). Once the analytical data were collected, supervised pattern recognition techniques such as linear discriminant analysis (LDA), partial least squares discriminant analysis (PLS-DA), k-nearest neighbors (k-NN), support vector machine (SVM) and Random Forest (RF) were applied to construct classification/discrimination rules. The results indicated that the nonlinear methods, RF and SVM, performed best, with accuracy rates of up to 98% and 93%, respectively, and are therefore excellent tools for the classification of grapes. Copyright © 2017 Elsevier Ltd. All rights reserved.
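The supervised-classification step can be illustrated with the simplest of the listed methods, k-nearest neighbors. The real study used 29 ICP-MS elements and methods such as SVM and Random Forest; the element concentrations and locality labels below are invented for demonstration only.

```python
# Tiny k-NN classifier on mock (Fe, Zn, Rb) concentrations.

def knn_predict(train, query, k=3):
    """train: list of (features, label); returns majority label of k nearest."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda row: dist(row[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Mock concentrations, mg/kg, for two hypothetical localities:
train = [((40, 12, 3), "north"), ((42, 11, 4), "north"), ((41, 13, 3), "north"),
         ((25, 20, 9), "south"), ((27, 22, 8), "south"), ((26, 21, 9), "south")]
print(knn_predict(train, (39, 12, 4)))  # -> north
```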

  12. Recommendations for tool-handle material choice based on finite element analysis.

    PubMed

    Harih, Gregor; Dolšak, Bojan

    2014-05-01

A great deal of work is still done manually and requires the use of various powered and non-powered hand tools. In order to increase user performance and satisfaction, and to lower the risk of acute and cumulative trauma disorders, several researchers have investigated the sizes and shapes of tool-handles. However, only a few authors have investigated tool-handle materials with a view to optimising them further. Therefore, as presented in this paper, we have utilised a finite-element method for simulating the human fingertip whilst grasping tool-handles. We modelled and simulated steel and ethylene propylene diene monomer (EPDM) rubber as homogeneous tool-handle materials, and two composites, one consisting of EPDM rubber and EPDM foam and the other of EPDM rubber and PU foam. The simulated finger force was set to obtain characteristic contact pressures of 20 kPa, 40 kPa, 80 kPa, and 100 kPa. Numerical tests have shown that EPDM rubber lowers the contact pressure only slightly. On the other hand, both composites showed a significant reduction in contact pressure that could lower the risks of acute and cumulative trauma disorders, which are pressure-dependent. Based on the results, it is also evident that the composite containing PU foam, with its more pronounced and flat plateau, deformed less at lower strain rates and deformed more once the plateau was reached, in comparison to the composite with EPDM foam. It was shown that hyper-elastic foam materials, which take into account the non-linear behaviour of fingertip soft tissue, can lower the contact pressure whilst maintaining a low deformation rate of the tool-handle material, thus preserving sufficient stability of the hand tool in the hand. Lower contact pressure also lowers the risk of acute and cumulative trauma disorders and increases comfort whilst maintaining performance. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. An Analysis of the Elements of Collaboration Associated with Top Collaborative Tools

    DTIC Science & Technology

    2010-03-01

    lets you access your e-mail, calendar, and files from any web browser anywhere in the world. Web based www.hotoffice.com Noodle Vialect’s (parent...www.taroby.org Yuuguu Yuuguu is an instant screen sharing, web conferencing, remote support, desktop remote control and messaging tool. Client...Office, Noodle , Novlet, Revizr, Taroby, and Yuuguu) received all seven NS ratings (see Table 20 below). The overall ratings for the major elements

  14. The Applications of Finite Element Analysis in Proximal Humeral Fractures.

    PubMed

    Ye, Yongyu; You, Wei; Zhu, Weimin; Cui, Jiaming; Chen, Kang; Wang, Daping

    2017-01-01

Proximal humeral fractures are common and among the most challenging to treat, owing to the complexity of the glenohumeral joint, especially in the geriatric population with impacted fractures; implant development continues because the problems with implant fixation remain unsolved. Pre-, intra-, and postoperative assessments are crucial in the management of these patients. Finite element analysis, as one of the valuable tools, has been implemented as an effective and noninvasive method to analyze proximal humeral fractures, providing solid evidence for the management of troublesome patients. However, no review article about the applications and effects of finite element analysis in assessing proximal humeral fractures has been published yet. This review article summarizes the applications, contributions, and clinical significance of finite element analysis in assessing proximal humeral fractures. Furthermore, the limitations of finite element analysis, the difficulties of more realistic simulation, and the validation and creation of validated FE models are discussed. We conclude that although some advances in proximal humeral fracture research have been made using finite element analysis, the use of this powerful tool for routine clinical management and adequate simulation requires more state-of-the-art studies to provide evidence and bases.

  15. Finite element analysis as a design tool for thermoplastic vulcanizate glazing seals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gase, K.M.; Hudacek, L.L.; Pesevski, G.T.

    1998-12-31

There are three materials that are commonly used in commercial glazing seals: EPDM, silicone and thermoplastic vulcanizates (TPVs). TPVs are a high performance class of thermoplastic elastomers (TPEs), where TPEs have elastomeric properties with thermoplastic processability. TPVs have emerged as materials well suited for use in glazing seals due to ease of processing, economics and part design flexibility. The part design and development process is critical to ensure that the chosen TPV provides economics, quality and function in demanding environments. In the design and development process, there is great value in utilizing dual durometer systems to capitalize on the benefits of soft and rigid materials. Computer-aided design tools, such as Finite Element Analysis (FEA), are effective in minimizing development time and predicting system performance. Examples of TPV glazing seals will illustrate the benefits of utilizing FEA to take full advantage of the material characteristics, which results in functional performance and quality while reducing development iterations. FEA will be performed on two glazing seal profiles to confirm optimum geometry.

  16. The Mobile Element Locator Tool (MELT): population-scale mobile element discovery and biology

    PubMed Central

    Gardner, Eugene J.; Lam, Vincent K.; Harris, Daniel N.; Chuang, Nelson T.; Scott, Emma C.; Pittard, W. Stephen; Mills, Ryan E.; Devine, Scott E.

    2017-01-01

    Mobile element insertions (MEIs) represent ∼25% of all structural variants in human genomes. Moreover, when they disrupt genes, MEIs can influence human traits and diseases. Therefore, MEIs should be fully discovered along with other forms of genetic variation in whole genome sequencing (WGS) projects involving population genetics, human diseases, and clinical genomics. Here, we describe the Mobile Element Locator Tool (MELT), which was developed as part of the 1000 Genomes Project to perform MEI discovery on a population scale. Using both Illumina WGS data and simulations, we demonstrate that MELT outperforms existing MEI discovery tools in terms of speed, scalability, specificity, and sensitivity, while also detecting a broader spectrum of MEI-associated features. Several run modes were developed to perform MEI discovery on local and cloud systems. In addition to using MELT to discover MEIs in modern humans as part of the 1000 Genomes Project, we also used it to discover MEIs in chimpanzees and ancient (Neanderthal and Denisovan) hominids. We detected diverse patterns of MEI stratification across these populations that likely were caused by (1) diverse rates of MEI production from source elements, (2) diverse patterns of MEI inheritance, and (3) the introgression of ancient MEIs into modern human genomes. Overall, our study provides the most comprehensive map of MEIs to date spanning chimpanzees, ancient hominids, and modern humans and reveals new aspects of MEI biology in these lineages. We also demonstrate that MELT is a robust platform for MEI discovery and analysis in a variety of experimental settings. PMID:28855259

  17. ELM - A SIMPLE TOOL FOR THERMAL-HYDRAULIC ANALYSIS OF SOLID-CORE NUCLEAR ROCKET FUEL ELEMENTS

    NASA Technical Reports Server (NTRS)

    Walton, J. T.

    1994-01-01

    ELM is a simple computational tool for modeling the steady-state thermal-hydraulics of propellant flow through fuel element coolant channels in nuclear thermal rockets. Written for the nuclear propulsion project of the Space Exploration Initiative, ELM evaluates the various heat transfer coefficient and friction factor correlations available for turbulent pipe flow with heat addition. In the past, these correlations were found in different reactor analysis codes, but now comparisons are possible within one program. The logic of ELM is based on the one-dimensional conservation of energy in combination with Newton's Law of Cooling to determine the bulk flow temperature and the wall temperature across a control volume. Since the control volume is an incremental length of tube, the corresponding pressure drop is determined by application of the Law of Conservation of Momentum. The size, speed, and accuracy of ELM make it a simple tool for use in fuel element parametric studies. ELM is a machine independent program written in FORTRAN 77. It has been successfully compiled on an IBM PC compatible running MS-DOS using Lahey FORTRAN 77, a DEC VAX series computer running VMS, and a Sun4 series computer running SunOS UNIX. ELM requires 565K of RAM under SunOS 4.1, 360K of RAM under VMS 5.4, and 406K of RAM under MS-DOS. Because this program is machine independent, no executable is provided on the distribution media. The standard distribution medium for ELM is one 5.25 inch 360K MS-DOS format diskette. ELM was developed in 1991. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. Sun4 and SunOS are trademarks of Sun Microsystems, Inc. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation.
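The control-volume logic summarized above can be sketched as a simple marching loop. This is a reconstruction from the abstract, not ELM's FORTRAN source; all property values, correlations, and inlet conditions below are illustrative placeholders.

```python
# March along a heated coolant channel: update bulk temperature from an
# energy balance, wall temperature from Newton's law of cooling, and
# pressure from a Darcy friction loss over each control volume.
import math

def channel_march(n, length, diam, mdot, cp, rho, q_flux, h, f):
    dx = length / n
    area_flow = math.pi * diam ** 2 / 4
    vel = mdot / (rho * area_flow)           # bulk velocity, m/s
    t_bulk, pressure = 300.0, 5.0e6          # inlet state (illustrative)
    for _ in range(n):
        dq = q_flux * math.pi * diam * dx    # heat added over the segment, W
        t_bulk += dq / (mdot * cp)           # conservation of energy
        t_wall = t_bulk + q_flux / h         # Newton's law of cooling
        pressure -= f * (dx / diam) * 0.5 * rho * vel ** 2  # momentum loss
    return t_bulk, t_wall, pressure

print(channel_march(n=50, length=1.3, diam=0.003, mdot=0.01, cp=14300.0,
                    rho=5.0, q_flux=2.0e6, h=4.0e4, f=0.02))
```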

  18. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. 
Pitfalls to be avoided and trends for emerging analysis tools

  19. Analysis of Brick Masonry Wall using Applied Element Method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure as in the case of Finite Element Method (FEM). In AEM, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry wall can be effectively analyzed in the frame of AEM. The composite nature of masonry wall can be easily modelled using springs. The brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and failure load is determined for different loading cases. The results were used to find the best aspect ratio of brick to strengthen brick masonry wall.
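The series-spring assumption in the abstract amounts to one line of algebra: for a brick spring and a mortar spring in series, 1/k_eq = 1/k_brick + 1/k_mortar, with k = E·A/L for each layer. The material values below are typical textbook figures, not the paper's.

```python
# Equivalent stiffness of a brick spring and a mortar spring in series.

def axial_stiffness(E, area, length):
    """Axial spring stiffness k = E*A/L, in N/m."""
    return E * area / length

def series(k1, k2):
    """Two springs in series: 1/k_eq = 1/k1 + 1/k2."""
    return 1.0 / (1.0 / k1 + 1.0 / k2)

k_brick = axial_stiffness(E=5.0e9, area=0.01, length=0.075)   # N/m
k_mortar = axial_stiffness(E=1.0e9, area=0.01, length=0.010)  # N/m
k_eq = series(k_brick, k_mortar)
print(f"{k_eq:.3e} N/m")  # series stiffness is below that of either spring
```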

  20. A finite element analysis modeling tool for solid oxide fuel cell development: coupled electrochemistry, thermal and flow analysis in MARC®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar

    2004-05-03

A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristics of the simulation tool.
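The loosely coupled scheme has a simple fixed-point structure: each solver takes the other's latest field as input, and the pair is iterated to self-consistency. The toy "solvers" below are scalar stand-ins for the temperature field and heat-generation profile; the real tool calls MARC and the EC module.

```python
# Schematic fixed-point iteration between a thermal solver and an
# electrochemistry module (structure only; coefficients are illustrative).

def thermal_solver(heat_gen):          # stand-in for the MARC thermal/flow solve
    return 800.0 + 0.05 * heat_gen     # temperature rises with heat generation

def ec_module(temperature):            # stand-in for the electrochemical module
    return 2000.0 - 1.0 * temperature  # generation falls as the cell heats up

heat = 1000.0                          # initial guess, W (illustrative)
for it in range(100):
    temp = thermal_solver(heat)        # MARC step: temperature from heat
    new_heat = ec_module(temp)         # EC step: heat from temperature
    if abs(new_heat - heat) < 1e-6:    # self-consistency reached
        break
    heat = new_heat
print(round(temp, 1), round(heat, 1))
```

With these gentle coefficients the loop contracts quickly; a stiffer coupling would need under-relaxation, which is one reason the loosely coupled scheme is noted for its numerical stability.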

  1. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  2. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
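The first of the two problem types (response uncertainty in terms of means and variances) can be illustrated with a first-order second-moment estimate, which is in the spirit of PFEM though much simpler than it. The beam-tip-deflection response and the input statistics below are invented stand-ins for an FE solution.

```python
# First-order second-moment (FOSM) propagation: response mean is evaluated at
# the input means; response variance is assembled from finite-difference
# sensitivities times input variances.

def tip_deflection(load, modulus):          # u = P L^3 / (3 E I), L and I fixed
    L, I = 2.0, 8.0e-6
    return load * L ** 3 / (3.0 * modulus * I)

def fosm(func, means, stdevs, h=1e-6):
    """First-order estimate of (mean, variance) of func under input scatter."""
    mean = func(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stdevs)):
        shifted = list(means)
        shifted[i] = m * (1 + h)                  # small relative perturbation
        dfdx = (func(*shifted) - mean) / (m * h)  # forward-difference slope
        var += (dfdx * s) ** 2
    return mean, var

mu, var = fosm(tip_deflection, means=(1000.0, 70e9), stdevs=(100.0, 3.5e9))
print(mu, var ** 0.5)  # mean deflection (m) and its standard deviation
```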

  3. Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.

    PubMed

    Luo, Yunhua; Ahmed, Sharif; Leslie, William D

    2018-03-01

Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them has been routinely used in the clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can use them effectively. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening for osteoporosis and for assessing hip fracture risk. Accuracy was measured by the area under the ROC curve (AUC) and the odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work is a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a web site as an internet application. Copyright © 2017 Elsevier B.V. All rights reserved.
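For reference, an AUC like the ones reported above can be computed directly from per-subject risk scores with the standard rank statistic: the AUC is the probability that a randomly chosen fracture case scores higher than a randomly chosen control. The scores below are invented for illustration, not the study's data.

```python
# AUC as a rank statistic over all case/control pairs (ties count half).

def auc(case_scores, control_scores):
    wins = sum(1.0 if c > n else 0.5 if c == n else 0.0
               for c in case_scores for n in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# Illustrative risk scores (higher = predicted higher fracture risk):
cases = [0.9, 0.8, 0.6, 0.55]
controls = [0.7, 0.5, 0.4, 0.3]
print(auc(cases, controls))  # -> 0.875
```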

  4. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  5. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    NASA Astrophysics Data System (ADS)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so are safety and security demands. New engineering tools are needed for safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows predicting several parameters relevant for estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.
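The alpha-shape step mentioned above can be sketched concretely: a candidate triangle built from the particle cloud is accepted into the mesh only if its circumradius does not exceed a multiple of the mean particle spacing. A minimal, hypothetical version of that acceptance test (the threshold convention varies between PFEM implementations):

```python
import math

def circumradius(p1, p2, p3):
    """Circumradius R = abc / (4K) of the triangle with vertices p1..p3,
    where K is the triangle area (Heron's formula)."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    s = 0.5 * (a + b + c)
    area = math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))
    if area == 0.0:
        return float("inf")  # degenerate (collinear) triangle: always rejected
    return a * b * c / (4.0 * area)

def keep_triangle(p1, p2, p3, h_mean, alpha=1.2):
    """Alpha-shape test: accept the triangle only if R <= alpha * h_mean,
    which discards the long slivers spanning non-contacting surfaces."""
    return circumradius(p1, p2, p3) <= alpha * h_mean

print(keep_triangle((0, 0), (1, 0), (0, 1), h_mean=1.0))      # compact: kept
print(keep_triangle((0, 0), (1, 0), (0.5, 0.01), h_mean=1.0))  # sliver: dropped
```

Triangles rejected by this test leave a gap in the mesh, which is exactly how the method detects free surfaces and contact fronts after each remeshing.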

  6. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.
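The automated execution and data transfer between disciplines described here can be sketched as a pipeline in which each discipline reads from and writes to a shared state; the discipline names, fields, and formulas below are illustrative placeholders, not the actual tool's API:

```python
def mass_sizing(state):
    # illustrative sizing rule: entry mass scales with payload mass
    state["entry_mass_kg"] = 3.0 * state["payload_mass_kg"]

def aerodynamics(state):
    # illustrative: ballistic coefficient from mass, drag coefficient, area
    state["beta_kg_m2"] = state["entry_mass_kg"] / (
        state["cd"] * state["ref_area_m2"])

def run_pipeline(disciplines, state):
    """Execute each discipline in order; the shared dict is the
    consistent data-transfer mechanism between them."""
    for step in disciplines:
        step(state)
    return state

state = run_pipeline(
    [mass_sizing, aerodynamics],
    {"payload_mass_kg": 10.0, "cd": 1.05, "ref_area_m2": 0.2},
)
print(state["entry_mass_kg"])  # 30.0
```

Ordering the discipline functions explicitly is what makes the execution process repeatable, which is the integration property the paper emphasizes.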

  7. Behavioral Health and Performance Element: Tools and Technologies

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.

    2009-01-01

    This slide presentation reviews research into the Behavioral Health and Performance (BHP) element of the Human Research Program. The program element goal is to identify, characterize, and prevent or reduce behavioral health and performance risks associated with space travel, exploration, and return to terrestrial life. To accomplish this goal, the program focuses on applied research designed to yield deliverables that reduce risk. Several areas are of particular interest: behavioral medicine, sleep, team composition, and teamwork. To help assure mission success, the Human Research Program develops and validates standards for each of these areas. The impact on BHP during long-duration missions is discussed, and the prospective tools created to address BHP concerns are reviewed.

  8. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes.

    PubMed

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D; Day, Michele E; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort.
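One family of ensemble methods of the kind described here is simple voting over the concept spans extracted by each tool. A minimal sketch (the span representation and vote threshold are illustrative, not the paper's pipeline):

```python
from collections import Counter

def vote_ensemble(tool_outputs, min_votes):
    """Keep a concept span if at least `min_votes` tools extracted it.
    Each tool output is a set of (start, end, label) tuples."""
    counts = Counter()
    for spans in tool_outputs:
        counts.update(set(spans))  # one vote per tool per span
    return {span for span, n in counts.items() if n >= min_votes}

t1 = {(0, 4, "DRUG"), (10, 18, "DOSE")}
t2 = {(0, 4, "DRUG")}
t3 = {(0, 4, "DRUG"), (22, 30, "FREQ")}
print(vote_ensemble([t1, t2, t3], min_votes=2))  # {(0, 4, 'DRUG')}
```

Raising `min_votes` trades recall for precision, which is one source of the cohort-dependent variability the abstract reports.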

  9. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632
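Motif scanning of the kind these tools perform reduces to sliding a position weight matrix over a sequence and reporting windows whose log-odds score clears a threshold. A toy sketch (the matrix values are invented for illustration; this is not an RSAT API):

```python
# toy log-odds PWM for a 3-bp motif "TAT" against a uniform background
pwm = [
    {"A": -2.0, "C": -2.0, "G": -2.0, "T": 1.0},
    {"A": 1.0,  "C": -2.0, "G": -2.0, "T": -2.0},
    {"A": -2.0, "C": -2.0, "G": -2.0, "T": 1.0},
]

def scan(sequence, pwm, threshold):
    """Return (offset, window, score) for every window scoring >= threshold."""
    width, hits = len(pwm), []
    for i in range(len(sequence) - width + 1):
        window = sequence[i:i + width]
        score = sum(col[base] for col, base in zip(pwm, window))
        if score >= threshold:
            hits.append((i, window, score))
    return hits

print(scan("GGTATCC", pwm, threshold=2.0))  # [(2, 'TAT', 3.0)]
```

Tools such as variation-scan extend this idea by scoring both alleles of a variant and reporting the score difference.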

  10. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of the structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model.
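A linear conduction network of the kind described reduces, at steady state, to assembling a conductance matrix and solving K·T = q. A minimal sketch (not IMOS code) with a tiny dense solver, for a two-element rod with one fixed-temperature node and one heat load:

```python
def solve(A, b):
    """Tiny dense Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def steady_state(n_nodes, conductances, fixed, loads):
    """Assemble K*T = q for a linear conduction network.
    conductances: list of (i, j, G) element links; fixed: {node: temperature};
    loads: {node: heat input in W}. Dirichlet BCs via row replacement."""
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    q = [loads.get(i, 0.0) for i in range(n_nodes)]
    for i, j, G in conductances:
        K[i][i] += G; K[j][j] += G
        K[i][j] -= G; K[j][i] -= G
    for node, T in fixed.items():
        K[node] = [1.0 if c == node else 0.0 for c in range(n_nodes)]
        q[node] = T
    return solve(K, q)

# two equal elements in series: 100 C held at node 0, 10 W injected at node 2
T = steady_state(3, [(0, 1, 2.0), (1, 2, 2.0)], {0: 100.0}, {2: 10.0})
print([round(t, 1) for t in T])  # [100.0, 105.0, 110.0]
```

Each conductance G plays the role of k·A/L for a rod or plate strip, which is exactly the quantity a structural finite element model already carries.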

  11. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes

    PubMed Central

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947

  12. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
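The Modal Assurance Criterion named above is the core numerical indicator and is simple to state: MAC is the normalized squared inner product of two mode-shape vectors, and tracking pairs each reference mode with its best-matching new mode. A minimal sketch (real-valued shapes; the adaptive logic of the actual tool is not reproduced):

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors:
    MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a)(phi_b . phi_b)).
    1.0 means identical shape (up to scale), 0.0 means no correlation."""
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    return dot(phi_a, phi_b) ** 2 / (dot(phi_a, phi_a) * dot(phi_b, phi_b))

def track_modes(modes_ref, modes_new):
    """Pair each reference mode with the new mode of highest MAC."""
    return [max(range(len(modes_new)), key=lambda j: mac(ref, modes_new[j]))
            for ref in modes_ref]

ref = [[1.0, 0.0], [0.0, 1.0]]
new = [[0.1, 0.9], [1.0, 0.1]]  # same modes, swapped order, slightly perturbed
print(track_modes(ref, new))     # [1, 0]
```

Because MAC is scale-invariant, it pairs modes correctly even when the two models normalize their eigenvectors differently; the energy-distribution indicators mentioned in the abstract supplement it when several modes have similar shapes.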

  13. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    1993-04-01

    An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.
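The reliability index and the corresponding probability of failure mentioned here are linked, in the first-order approximation, by Pf = Φ(−β), with Φ the standard normal CDF; for a limit-state function g with known first two moments, the simplest index is β = μ_g/σ_g. A sketch of just that relationship (not the PFEM machinery itself):

```python
import math

def failure_probability(beta):
    """First-order estimate: Pf = Phi(-beta), Phi the standard normal CDF,
    expressed via the error function available in the stdlib."""
    return 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

def reliability_index(mean_g, std_g):
    """First-order second-moment index for limit state g (failure if g < 0)."""
    return mean_g / std_g

beta = reliability_index(mean_g=3.0, std_g=1.0)
print(beta, round(failure_probability(beta), 5))  # 3.0 0.00135
```

The PFEM's role in the paper is to supply μ_g and σ_g for realistic fracture limit states, where g depends on random load, material properties, and crack geometry.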

  14. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1993-01-01

    An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.

  15. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    NASA Astrophysics Data System (ADS)

    Dahing, Lahasen@Normanshah; Yahya, Redzuan; Yahya, Roslan; Hassan, Hearie

    2014-09-01

    In this study, the principle of prompt gamma neutron activation analysis was used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source (Cf-252) with an HPGe detector and a multichannel analyzer (MCA). Concrete blocks of 10×10×10 cm3 and 15×15×15 cm3 were analysed as samples. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, can be determined by analysing their respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma analysis as a tool for qualitative multi-elemental analysis, identifying the elements present in a concrete sample, are discussed.
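Qualitative identification of this kind amounts to matching measured peak energies against a table of characteristic capture lines within a detector-resolution tolerance. A minimal sketch; the 2223 keV hydrogen capture line is the classic signature, while the other tabulated energies below are representative values for illustration only:

```python
# illustrative prompt-gamma capture lines (keV); values for this sketch only
PROMPT_LINES = {
    "H": [2223.3],
    "Fe": [7631.1, 7645.5],
    "Si": [3539.0, 4934.0],
    "Ca": [1942.7],
    "Cl": [786.3, 1951.1],
}

def identify(peak_kev, tolerance=2.0):
    """Return elements with a tabulated line within `tolerance` keV of the
    measured peak; tolerance reflects the HPGe energy resolution."""
    return sorted(
        element
        for element, lines in PROMPT_LINES.items()
        if any(abs(peak_kev - line) <= tolerance for line in lines)
    )

print(identify(2223.0))  # ['H']
print(identify(7630.0))  # ['Fe']
```

In practice, line interferences mean a confident identification requires several lines per element, which is why the study cross-checks against NAA and XRF.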

  16. Finite-element analysis of NiTi wire deflection during orthodontic levelling treatment

    NASA Astrophysics Data System (ADS)

    Razali, M. F.; Mahmud, A. S.; Mokhtar, N.; Abdullah, J.

    2016-02-01

    Finite-element analysis is an important product development tool in the medical devices industry for design and failure analysis of devices. This tool helps device designers to quickly explore various design options, optimize specific designs, and gain deeper insight into how a device actually performs. In this study, three-dimensional finite-element models of a superelastic nickel-titanium arch wire engaged in a three-bracket system were developed. The aim was to measure the effect of binding friction developed at the wire-bracket interaction on the remaining recovery force available for tooth movement. Uniaxial and three-bracket bending tests were modelled and validated against experimental work. The predictions made by the three-bracket bending models show good agreement with the experimental results.

  17. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Finite element analysis on a medical implant.

    PubMed

    Semenescu, Augustin; Radu-Ioniță, Florentina; Mateș, Ileana Mariana; Bădică, Petre; Batalu, Nicolae Dan; Negoita, Olivia Doina; Purcarea, Victor Lorin

    2016-01-01

    Several studies have shown a tight connection between several ocular pathologies and an increased risk of hip fractures due to falling, especially among elderly patients. Total replacement of the hip joint is a major surgical intervention that aims to restore the function of a hip affected by various factors, such as arthritis and injuries. A corkscrew-like femoral stem was designed in order to preserve the bone stock and to prevent the occurrence of iatrogenic fractures during the hammering of the implant. In this paper, finite element analysis of the proposed design was performed, considering different loads and three types of materials. Finite element analysis is a powerful tool to simulate, optimize, and design new medical implants and to select suitable materials for them. The results showed that the best scenario was for the Ti6Al4V alloy, although Ti and 316L stainless steel also had reasonably high safety factors.

  19. Organic Elemental Analysis.

    ERIC Educational Resources Information Center

    Ma, T. S.; Gutterson, Milton

    1980-01-01

    Reviews general developments in computerization and data processing of organic elemental analyses; carbon, hydrogen, and nitrogen analyzers; procedures for determining oxygen, sulfur, and halogens, as well as other nometallic elements and organometallics. Selected papers on trace analysis of nonmetals and determination of metallic elements are…

  20. RSAT 2018: regulatory sequence analysis tools 20th anniversary.

    PubMed

    Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane

    2018-05-02

    RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  1. The effects of dissecting tools on the trace element concentrations of fish and mussel tissues.

    PubMed

    Heit, M; Klusek, C S

    1982-06-01

    A comparison of the effects of dissecting tools composed of various materials on the trace element content of the muscle of the marine bluefish, Pomatomus saltatrix, and the soft tissues of freshwater mussels, Eliptio complanatus and Lampsilus radiata, is presented. The fish were dissected with blades made of stainless steel, Lexan plastic, titanium, and Teflon-coated stainless steel. The mussels were dissected with stainless and Teflon tools only. Elements measured included As, Cd, Cr, Cu, Hg, Ni, Pb, Se, Sn, Te, V, and Zn. Significant concentration differences (P = 0.01) were not found for any element in fish or mussel samples dissected by the different tools.

  2. Analysis of the electromagnetic wave resistivity tool in deviated well drilling

    NASA Astrophysics Data System (ADS)

    Zhang, Yumei; Xu, Lijun; Cao, Zhang

    2014-04-01

    Electromagnetic wave resistivity (EWR) tools are used to provide real-time measurements of resistivity in the formation around the tool in Logging While Drilling (LWD). In this paper, the acquired resistivity information is analyzed to extract more information, including the dipping angle and azimuth direction of the drill. A finite element (FE) model of an EWR tool working in layered earth formations is established. Numerical analysis and FE simulations are employed to analyze the amplitude ratio and phase difference between the voltages measured at the two receivers of the EWR tool in deviated well drilling.
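The two raw quantities named here, amplitude ratio and phase difference, come directly from the complex receiver voltages. A minimal sketch of that conversion (illustrative phasor values; the mapping from these quantities to resistivity requires the tool's calibration charts and is not shown):

```python
import cmath
import math

def amplitude_ratio_phase(v_near, v_far):
    """Attenuation (dB) and phase difference (degrees) between two complex
    receiver voltages, the raw measurements an EWR tool converts to resistivity."""
    ratio_db = 20.0 * math.log10(abs(v_far) / abs(v_near))
    phase_deg = math.degrees(cmath.phase(v_far / v_near))
    return ratio_db, phase_deg

att, dphi = amplitude_ratio_phase(1.0 + 0.0j, 0.5 + 0.5j)
print(round(att, 2), round(dphi, 1))  # -3.01 45.0
```

Because attenuation and phase sample different volumes of formation, the two derived resistivities diverge near bed boundaries, which is the information the paper exploits for dip estimation.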

  3. Finite Element Modeling Techniques for Analysis of VIIP

    NASA Technical Reports Server (NTRS)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  4. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
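The second-moment statistics mentioned above follow from first-order perturbation: for a response y = f(x) with a random input of mean μ and standard deviation σ, the mean is approximately f(μ) and the variance approximately (f′(μ))²σ². A one-variable sketch of that propagation (the PFEM applies the same idea with sensitivities of the full finite element system):

```python
def second_moment(f, mu, sigma, h=1e-6):
    """First-order second-moment statistics of y = f(x) for x ~ (mu, sigma):
    mean ~ f(mu), variance ~ (f'(mu))^2 * sigma^2,
    with the sensitivity f'(mu) taken by central difference."""
    dfdx = (f(mu + h) - f(mu - h)) / (2.0 * h)
    return f(mu), (dfdx * sigma) ** 2

mean_y, var_y = second_moment(lambda x: x ** 2, mu=3.0, sigma=0.1)
print(mean_y, round(var_y, 4))  # 9.0 0.36
```

With several independent random inputs the variance contributions simply add, which is why the method scales to randomness in load, material properties, and crack geometry simultaneously.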

  5. A three-dimensional inverse finite element analysis of the heel pad.

    PubMed

    Chokhandre, Snehal; Halloran, Jason P; van den Bogert, Antonie J; Erdemir, Ahmet

    2012-03-01

    Quantification of plantar tissue behavior of the heel pad is essential in developing computational models for predictive analysis of preventive treatment options such as footwear for patients with diabetes. Simulation based studies in the past have generally adopted heel pad properties from the literature, in effect combining heel-specific geometry with material properties of a different heel. In exceptional cases, patient-specific material characterization was performed with simplified two-dimensional models, without further evaluation of the heel-specific response under different loading conditions. The aim of this study was to conduct an inverse finite element analysis of the heel in order to calculate heel-specific material properties in situ. Multidimensional experimental data available from a previous cadaver study by Erdemir et al. ("An Elaborate Data Set Characterizing the Mechanical Response of the Foot," ASME J. Biomech. Eng., 131(9), pp. 094502) was used for model development, optimization, and evaluation of material properties. A specimen-specific three-dimensional finite element representation was developed. Heel pad material properties were determined using inverse finite element analysis by fitting the model behavior to the experimental data. Compression dominant loading, applied using a spherical indenter, was used for optimization of the material properties. The optimized material properties were evaluated through simulations representative of a combined loading scenario (compression and anterior-posterior shear) with a spherical indenter and also of a compression dominant loading applied using an elevated platform. Optimized heel pad material coefficients were 0.001084 MPa (μ) and 9.780 (α) (with an effective Poisson's ratio (ν) of 0.475) for a first-order nearly incompressible Ogden material model. The model-predicted structural response of the heel pad was in good agreement for both the optimization (<1.05% maximum tool force, 0.9% maximum tool
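The inverse-fitting idea can be sketched with the uniaxial response of a first-order incompressible Ogden solid and a coarse parameter search over synthetic "experimental" data. The stress expression below uses one common convention (Abaqus-style, P = (2μ/α)(λ^(α−1) − λ^(−α/2−1))); conventions differ between codes, and the real study fit a full 3-D model rather than this closed form:

```python
def ogden_uniaxial(lam, mu, alpha):
    """Nominal uniaxial stress of an incompressible first-order Ogden solid
    (one convention among several) at stretch lam."""
    return (2.0 * mu / alpha) * (lam ** (alpha - 1.0) - lam ** (-alpha / 2.0 - 1.0))

def grid_fit(stretches, stresses, mus, alphas):
    """Coarse inverse fit: pick the (mu, alpha) pair minimizing squared misfit
    between model stress and 'measured' stress."""
    def misfit(mu, alpha):
        return sum((ogden_uniaxial(l, mu, alpha) - s) ** 2
                   for l, s in zip(stretches, stresses))
    return min(((mu, a) for mu in mus for a in alphas),
               key=lambda p: misfit(*p))

# synthetic "experiment" generated from known parameters, then recovered
true_mu, true_alpha = 0.001, 9.8
lams = [0.80, 0.85, 0.90, 0.95]  # compression-dominant loading
data = [ogden_uniaxial(l, true_mu, true_alpha) for l in lams]
print(grid_fit(lams, data,
               mus=[0.0005, 0.001, 0.002],
               alphas=[8.0, 9.8, 12.0]))  # (0.001, 9.8)
```

A gradient-based optimizer replaces the grid search in practice; the structure of the problem, simulate, compare, update parameters, is the same.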

  6. Determination of elements in hospital waste with neutron activation analysis method

    NASA Astrophysics Data System (ADS)

    Dwijananti, P.; Astuti, B.; Alwiyah; Fianti

    2018-03-01

    Hospitals are among the biggest producers of B3 (hazardous and toxic) waste, which comes from medical and laboratory activities. The purpose of this study is to determine the elements contained in liquid waste from a hospital and to calculate the levels of these elements. The research used neutron activation analysis conducted at BATAN Yogyakarta. The neutron activation analysis is divided into two stages: activation of the samples using the neutron source of the Kartini reactor, followed by counting with a gamma spectrometer with an HPGe detector. Qualitative and quantitative analyses were done by matching the gamma spectrum peaks to the Neutron Activation Table. The samples were taken from four points of the liquid waste treatment plant (WWTP) of the Bhakti Wira Tamtama Semarang hospital. The results showed that the samples contained the elements Cr, Zn, Fe, Co, and Na, with levels of Cr (0.033 - 0.075) mg/L, Zn (0.090 - 1.048) mg/L, Fe (2.937-37.743) mg/L, Co (0.005-0.023) mg/L, and Na (61.088-116.330) mg/L. Compared with the standard values, the liquid waste is safe for the environment.
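Quantitative NAA is often done by the relative (comparator) method: with identical irradiation and counting geometry, the concentration of an element scales with the specific count rate of its gamma line relative to a co-irradiated standard. A minimal sketch with invented numbers (not the study's data):

```python
def concentration(count_rate_sample, count_rate_standard,
                  conc_standard, mass_sample, mass_standard):
    """Relative (comparator) NAA: concentrations scale with specific
    count rates (counts per second per unit mass) of the same gamma line."""
    specific_sample = count_rate_sample / mass_sample
    specific_standard = count_rate_standard / mass_standard
    return conc_standard * specific_sample / specific_standard

# invented example: sample counts 1200 cps from 2 g, standard 400 cps from
# 1 g at a known 5.0 mg/L -> inferred sample concentration
print(concentration(1200.0, 400.0, 5.0, 2.0, 1.0))  # 7.5
```

Corrections for decay between irradiation and counting, and for neutron flux differences, are applied on top of this ratio in real analyses.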

  7. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which when applied to SPF provides a host of benefits including accurate prediction of strain levels in a part, presence of wrinkles and predicting pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied and continue to do so to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  8. Ablative Thermal Response Analysis Using the Finite Element Method

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2009-01-01

    A review of the classic techniques used to solve ablative thermal response problems is presented. The advantages and disadvantages of both the finite element and finite difference methods are described. As a first step in developing a three dimensional finite element based ablative thermal response capability, a one dimensional computer tool has been developed. The finite element method is used to discretize the governing differential equations and Galerkin's method of weighted residuals is used to derive the element equations. A code to code comparison between the current 1-D tool and the 1-D Fully Implicit Ablation and Thermal Response Program (FIAT) has been performed.
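For the steady conduction limit of the problem above, Galerkin's method with linear elements yields the familiar element matrix (k/h)·[[1, −1], [−1, 1]]. A self-contained 1-D sketch (steady conduction only, no ablation terms, with an inlined dense solver) that assembles these element equations and recovers the exact linear profile:

```python
def fe_heat_1d(n_el, length, k, t_left, t_right):
    """Steady 1-D conduction with linear Galerkin elements: each element
    contributes (k/h) * [[1, -1], [-1, 1]] to the global matrix."""
    n = n_el + 1
    h = length / n_el
    K = [[0.0] * n for _ in range(n)]
    f = [0.0] * n
    for e in range(n_el):  # assemble element stiffness contributions
        K[e][e] += k / h;     K[e + 1][e + 1] += k / h
        K[e][e + 1] -= k / h; K[e + 1][e] -= k / h
    for node, T in ((0, t_left), (n - 1, t_right)):  # Dirichlet rows
        K[node] = [1.0 if c == node else 0.0 for c in range(n)]
        f[node] = T
    # tiny Gaussian elimination with partial pivoting
    M = [row[:] + [f[i]] for i, row in enumerate(K)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    T = [0.0] * n
    for r in range(n - 1, -1, -1):
        T[r] = (M[r][n] - sum(M[r][c] * T[c] for c in range(r + 1, n))) / M[r][r]
    return T

print([round(t, 1) for t in fe_heat_1d(4, 1.0, 1.0, 100.0, 0.0)])
# [100.0, 75.0, 50.0, 25.0, 0.0]
```

An ablation solver adds a capacitance matrix, pyrolysis source terms, and a moving boundary on top of this same element structure, which is what makes the finite element route attractive for the 3-D extension.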

  9. Design selection of an innovative tool holder for ultrasonic vibration assisted turning (IN-UVAT) using finite element analysis simulation

    NASA Astrophysics Data System (ADS)

    Rachmat, Haris; Ibrahim, M. Rasidi; Hasan, Sulaiman bin

    2017-04-01

    One of the advanced technologies in machining is ultrasonic vibration assisted turning. The design of the tool holder is a crucial step, to make sure the tool holder is strong enough to handle all forces in the turning process. Because a direct experimental approach is expensive, this paper studies the feasibility of the tool holder by predicting its displacement and effective stress computationally with finite element simulation. SS201 and AISI 1045 materials were considered, with sharp-corner and ramp-corner flexure hinges in the design. The results show that AISI 1045 with a ramp-corner flexure hinge was the best choice to be produced. The displacement is around 11.3 micron, the effective stress is 1.71e+008 N/m2, and the factor of safety is 3.10.
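The reported factor of safety follows from dividing the material's yield strength by the peak effective stress. A sketch of that check, assuming a yield strength of roughly 530 MPa for AISI 1045 (handbook values vary with heat treatment; the abstract does not state the value used):

```python
def factor_of_safety(yield_strength_pa, effective_stress_pa):
    """Static factor of safety: yield strength over peak effective stress."""
    return yield_strength_pa / effective_stress_pa

# assumed yield strength ~530 MPa for AISI 1045; effective stress from the study
fos = factor_of_safety(5.3e8, 1.71e8)
print(round(fos, 2))  # 3.1
```

A value above roughly 2-3 is a common acceptance criterion for machine-tool components, consistent with the study's conclusion that the AISI 1045 design is safe to produce.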

  10. Data and Tools | Energy Analysis | NREL

    Science.gov Websites

    NREL develops energy analysis data and tools. Collections include: data products; technology and performance analysis tools; energy systems analysis tools; and economic and financial analysis tools.

  11. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  12. Challenges in Integrating Nondestructive Evaluation and Finite Element Methods for Realistic Structural Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.; Zagidulin, Dmitri; Rauser, Richard W.

    2000-01-01

    Capabilities and expertise related to the development of links between nondestructive evaluation (NDE) and finite element analysis (FEA) at Glenn Research Center (GRC) are demonstrated. Current tools to analyze data produced by computed tomography (CT) scans are exercised to help assess the damage state in high temperature structural composite materials. A utility translator was written to convert STL data files from Velocity (an image processing software) to a suitable CAD-FEA file format. Finite element analyses are carried out with MARC, a commercial nonlinear finite element code, and the analytical results are discussed. The modeling approach was verified by building an MSC/Patran (a pre- and post-processing finite element package) generated model and comparing it to a model generated by Velocity in conjunction with MSC/Patran Graphics. Modeling issues and results are discussed in this paper. The entire process that ties the data extracted via NDE to the finite element modeling and analysis is fully described.

  13. Comparison of Damage Path Predictions for Composite Laminates by Explicit and Standard Finite Element Analysis Tools

    NASA Technical Reports Server (NTRS)

    Bogert, Philip B.; Satyanarayana, Arunkumar; Chunchu, Prasad B.

    2006-01-01

    Splitting, ultimate failure load and the damage path in center notched composite specimens subjected to in-plane tension loading are predicted using progressive failure analysis methodology. A 2-D Hashin-Rotem failure criterion is used in determining intra-laminar fiber and matrix failures. This progressive failure methodology has been implemented in the Abaqus/Explicit and Abaqus/Standard finite element codes through the user-written subroutines "VUMAT" and "USDFLD", respectively. A 2-D finite element model is used for predicting the intra-laminar damages. Analysis results obtained from the Abaqus/Explicit and Abaqus/Standard codes show good agreement with experimental results. The importance of modeling delamination in progressive failure analysis methodology is recognized for future studies. The use of an explicit integration dynamics code for simple specimen geometry and static loading establishes a foundation for future analyses where complex loading and nonlinear dynamic interactions of damage and structure will necessitate it.

  14. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    NASA Astrophysics Data System (ADS)

    Hussain, T.; Gondal, M. A.

    2013-06-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses ablate the sample, resulting in vaporization and ionization of the sample in a hot plasma, which is then analyzed by a spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples. The developed system was applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance, such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron and zinc, in these samples were determined. Optimal experimental conditions were evaluated for improving the sensitivity of the developed LIBS system through a parametric dependence study. The LIBS results were compared with results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of our LIBS system were also estimated for the above-mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open pit waste.
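    A limit of detection like the one estimated here is commonly computed from a calibration curve as LOD = 3·σ_blank / m, where m is the calibration slope and σ_blank the standard deviation of blank measurements. The numbers below are illustrative, not from the paper:

```python
import numpy as np

# Illustrative calibration data: known concentrations (ppm) vs. measured
# LIBS line intensities (arbitrary units).  Not from the paper.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
intensity = np.array([10.2, 60.5, 110.1, 261.0, 510.8])

# Least-squares calibration line: intensity = m * conc + b
m, b = np.polyfit(conc, intensity, 1)

# Standard deviation of repeated blank measurements (illustrative)
blank = np.array([9.8, 10.4, 10.1, 10.6, 10.0])
sigma_blank = blank.std(ddof=1)

lod = 3.0 * sigma_blank / m  # limit of detection, ppm
```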

  15. SINEBase: a database and tool for SINE analysis.

    PubMed

    Vassetzky, Nikita S; Kramerov, Dmitri A

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis.

  16. SINEBase: a database and tool for SINE analysis

    PubMed Central

    Vassetzky, Nikita S.; Kramerov, Dmitri A.

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis. PMID:23203982

  17. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.
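    A first-order relation used in SAW design tools of this kind is the synchronous frequency f0 = v/λ, where v is the substrate's acoustic velocity and λ the wavelength set by the interdigital transducer geometry. The values below (YZ lithium niobate, v ≈ 3488 m/s, and a 4 µm finger pitch) are typical textbook numbers, not taken from the paper:

```python
# Synchronous frequency of a SAW interdigital transducer: f0 = v / lam,
# with lam = 2 * (finger pitch) for a single-electrode IDT.
v = 3488.0           # m/s, SAW velocity on YZ lithium niobate (typical value)
finger_pitch = 4e-6  # m, center-to-center electrode spacing (illustrative)
lam = 2.0 * finger_pitch
f0 = v / lam         # Hz; operating frequency of the device
```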

  18. Analysis of Tire Tractive Performance on Deformable Terrain by Finite Element-Discrete Element Method

    NASA Astrophysics Data System (ADS)

    Nakashima, Hiroshi; Takatsu, Yuzuru

    The goal of this study is to develop a practical and fast simulation tool for soil-tire interaction analysis, where finite element method (FEM) and discrete element method (DEM) are coupled together, and which can be realized on a desktop PC. We have extended our formerly proposed dynamic FE-DE method (FE-DEM) to include practical soil-tire system interaction, where not only the vertical sinkage of a tire, but also the travel of a driven tire was considered. Numerical simulation by FE-DEM is stable, and the relationships between variables, such as load-sinkage and sinkage-travel distance, and the gross tractive effort and running resistance characteristics, are obtained. Moreover, the simulation result is accurate enough to predict the maximum drawbar pull for a given tire, once the appropriate parameter values are provided. Therefore, the developed FE-DEM program can be applied with sufficient accuracy to interaction problems in soil-tire systems.
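    The discrete element side of a coupled FE-DEM scheme like this typically models soil grains with a spring-dashpot normal contact law. The sketch below (linear contact with illustrative parameters, not the paper's formulation) shows the force calculation for two overlapping circular particles:

```python
import math

def dem_normal_force(x1, x2, r1, r2, v_rel_n, kn=1.0e5, cn=50.0):
    """Linear spring-dashpot normal contact force between two circular
    particles (2-D DEM).  Returns 0 when the particles do not overlap.
    kn: normal stiffness (N/m), cn: normal damping (N*s/m) -- illustrative."""
    dx = x2[0] - x1[0]
    dy = x2[1] - x1[1]
    dist = math.hypot(dx, dy)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return 0.0
    # Repulsive spring force plus viscous damping on the approach velocity
    return kn * overlap + cn * v_rel_n

# Two particles of radius 0.01 m whose centers are 0.018 m apart
f = dem_normal_force((0.0, 0.0), (0.018, 0.0), 0.01, 0.01, v_rel_n=0.0)
```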

  19. Enhanced terahertz imaging system performance analysis and design tool for concealed weapon identification

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Franck, Charmaine C.; Espinola, Richard L.; Petkie, Douglas T.; De Lucia, Frank C.; Jacobs, Eddie L.

    2011-11-01

    The U.S. Army Research Laboratory (ARL) and the U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) have developed a terahertz-band imaging system performance model/tool for detection and identification of concealed weaponry. The details of the MATLAB-based model which accounts for the effects of all critical sensor and display components, and for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security & Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). This paper will provide a comprehensive review of an enhanced, user-friendly, Windows-executable, terahertz-band imaging system performance analysis and design tool that now includes additional features such as a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures. This newly enhanced THz imaging system design tool is an extension of the advanced THz imaging system performance model that was developed under the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. This paper will also provide example system component (active-illumination source and detector) trade-study analyses using the new features of this user-friendly THz imaging system performance analysis and design tool.

  20. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
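    The frequency-and-damping identification mentioned above can be illustrated on a synthetic ringdown using the logarithmic decrement, a simplification of the mode-estimation methods a production tool would use; all parameters are illustrative:

```python
import math

# Synthetic oscillation mode: amplitude decays as exp(-zeta*wn*t) at a
# damped frequency fd.  Parameters are illustrative, not from OBAT.
fd = 0.8           # Hz, a typical inter-area oscillation frequency
zeta_true = 0.05   # true damping ratio
wn = 2.0 * math.pi * fd / math.sqrt(1.0 - zeta_true**2)

# Peak amplitudes measured one period apart
t1, t2 = 1.0, 1.0 + 1.0 / fd
a1 = math.exp(-zeta_true * wn * t1)
a2 = math.exp(-zeta_true * wn * t2)

# Logarithmic decrement -> damping ratio estimate
delta = math.log(a1 / a2)
zeta_est = delta / math.sqrt(4.0 * math.pi**2 + delta**2)
```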

  1. The p-version of the finite element method in incremental elasto-plastic analysis

    NASA Technical Reports Server (NTRS)

    Holzer, Stefan M.; Yosibash, Zohar

    1993-01-01

    Whereas the higher-order versions of the finite element method (the p- and hp-versions) are fairly well established as highly efficient methods for monitoring and controlling the discretization error in linear problems, little has been done to exploit their benefits in elasto-plastic structural analysis. Aspects of incremental elasto-plastic finite element analysis that are particularly amenable to improvement by the p-version are discussed. These theoretical considerations are supported by several numerical experiments. First, an example for which an analytical solution is available is studied. It is demonstrated that the p-version performs very well even in cycles of elasto-plastic loading and unloading, not only compared with the traditional h-version but also with respect to the exact solution. Finally, an example of considerable practical importance - the analysis of a cold-worked lug - is presented, which demonstrates how the modeling tools offered by higher-order finite element techniques can contribute to an improved approximation of practical problems.

  2. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation

    PubMed Central

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840
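    Non-intrusive propagation of the kind reviewed here can be sketched with plain Monte Carlo sampling. The "model" below is a closed-form cantilever tip deflection standing in for a full finite element solve, and all parameter distributions are illustrative:

```python
import numpy as np

# Non-intrusive (sampling-based) uncertainty propagation.  The closed-form
# tip deflection delta = F*L^3 / (3*E*I) stands in for an FE model.
rng = np.random.default_rng(0)
n_samples = 20000

E = rng.normal(200e9, 10e9, n_samples)   # Young's modulus (Pa), ~5% spread
F = rng.normal(1000.0, 50.0, n_samples)  # tip load (N), ~5% spread
L, I = 1.0, 8.33e-6                      # length (m), second moment of area (m^4)

delta = F * L**3 / (3.0 * E * I)         # tip deflection (m), one per sample

mean_mm = 1e3 * delta.mean()             # output statistics in millimetres
std_mm = 1e3 * delta.std()
```

    The same sampling loop works unchanged around any black-box solver, which is exactly why non-intrusive methods are popular in biomedical FE studies where the solver cannot be modified.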

  3. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    PubMed

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  4. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission s lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  5. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  6. Design Optimization Tool for Synthetic Jet Actuators Using Lumped Element Modeling

    NASA Technical Reports Server (NTRS)

    Gallas, Quentin; Sheplak, Mark; Cattafesta, Louis N., III; Gorton, Susan A. (Technical Monitor)

    2005-01-01

    The performance specifications of any actuator are quantified in terms of an exhaustive list of parameters such as bandwidth, output control authority, etc. Flow-control applications benefit from a known actuator frequency response function that relates the input voltage to the output property of interest (e.g., maximum velocity, volumetric flow rate, momentum flux, etc.). Clearly, the required performance metrics are application specific, and methods are needed to achieve the optimal design of these devices. Design and optimization studies have been conducted for piezoelectric cantilever-type flow control actuators, but the modeling issues are simpler compared to synthetic jets. Here, lumped element modeling (LEM) is combined with equivalent circuit representations to estimate the nonlinear dynamic response of a synthetic jet as a function of device dimensions, material properties, and external flow conditions. These models provide reasonable agreement between predicted and measured frequency response functions and thus are suitable for use as design tools. In this work, we have developed a Matlab-based design optimization tool for piezoelectric synthetic jet actuators based on the lumped element models mentioned above. Significant improvements were achieved by optimizing the piezoceramic diaphragm dimensions. Synthetic-jet actuators were fabricated and benchtop tested to fully document their behavior and validate a companion optimization effort. It is hoped that the tool developed from this investigation will assist in the design and deployment of these actuators.
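    In lumped element modeling, the synthetic jet's cavity and orifice map to an acoustic compliance and an acoustic mass in the equivalent circuit; their resonance is the familiar Helmholtz frequency, computed below with illustrative dimensions (the actual device parameters are not given in the abstract):

```python
import math

# Helmholtz resonance of a cavity + orifice, the acoustic part of a
# synthetic jet's lumped element model.  Dimensions are illustrative.
c = 343.0             # m/s, speed of sound in air
V = 2.0e-6            # m^3, cavity volume
d = 1.0e-3            # m, orifice diameter
t = 1.0e-3            # m, orifice (neck) length
A = math.pi * (d / 2.0) ** 2
L_eff = t + 0.85 * d  # a common end-correction for the neck length

f_helmholtz = (c / (2.0 * math.pi)) * math.sqrt(A / (V * L_eff))  # Hz
```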

  7. ISPAN (Interactive Stiffened Panel Analysis): A tool for quick concept evaluation and design trade studies

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.

    1993-01-01

    Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy-to-use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by users without finite element expertise. The results of a parametric study of a blade stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. Also, a non-linear analysis of a test panel was conducted and the results compared to measured data and previous correlation analysis.

  8. Power flow as a complement to statistical energy analysis and finite element analysis

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    Present methods for analyzing structural response and structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average response level. In this mid-frequency range, a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique extends from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies, providing a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
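    The mobility formulation described here gives the time-averaged input power as P = ½|F|²·Re{Y(ω)}, where Y is the driving-point mobility. The sketch below evaluates this for a damped single-degree-of-freedom system with illustrative parameters:

```python
import math

# Time-averaged power injected by a harmonic point force into a structure:
#   P = 0.5 * |F|^2 * Re{Y(omega)},  Y = v/F  (driving-point mobility).
# Here Y is that of a damped SDOF oscillator; parameters are illustrative.
m, k, c = 1.0, 1.0e4, 5.0   # mass (kg), stiffness (N/m), damping (N*s/m)
F = 10.0                    # force amplitude (N)
omega = math.sqrt(k / m)    # drive at resonance, rad/s

# Mobility Y = j*omega / (k - omega^2*m + j*omega*c); at resonance the
# stiffness and mass terms cancel, leaving Y = 1/c (purely real).
denom_re = k - omega**2 * m
denom_im = omega * c
re_Y = (omega * denom_im) / (denom_re**2 + denom_im**2)  # Re{Y(omega)}

P_in = 0.5 * F**2 * re_Y    # watts
```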

  9. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  10. Finite element analysis of helicopter structures

    NASA Technical Reports Server (NTRS)

    Rich, M. J.

    1978-01-01

    Application of the finite element analysis is now being expanded to three dimensional analysis of mechanical components. Examples are presented for airframe, mechanical components, and composite structure calculations. Data are detailed on the increase of model size, computer usage, and the effect on reducing stress analysis costs. Future applications for use of finite element analysis for helicopter structures are projected.

  11. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  12. Stress analysis and design considerations for Shuttle pointed autonomous research tool for astronomy /SPARTAN/

    NASA Technical Reports Server (NTRS)

    Ferragut, N. J.

    1982-01-01

    The Shuttle Pointed Autonomous Research Tool for Astronomy (SPARTAN) family of spacecraft is intended to operate with minimum interfaces with the U.S. Space Shuttle in order to increase flight opportunities. The SPARTAN I spacecraft was designed to enhance structural capabilities and increase reliability. The approach followed results from work experience that evolved from sounding rocket projects. Structural models were developed to perform the analyses necessary to satisfy safety requirements for Shuttle hardware; a loads analysis must also be performed. Stress analysis calculations are performed on the main structural elements and subcomponents. Attention is given to design considerations and program definition, the schematic representation of a finite element model used for the SPARTAN I spacecraft, details of the loads analysis, the stress analysis, and fracture mechanics plan implications.

  13. Optimum element density studies for finite-element thermal analysis of hypersonic aircraft structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Olona, Timothy; Muramoto, Kyle M.

    1990-01-01

    Different finite element models previously set up for thermal analysis of the space shuttle orbiter structure are discussed and their shortcomings identified. Element density criteria are established for finite element thermal modeling of space shuttle orbiter-type large, hypersonic aircraft structures. These criteria are based on rigorous studies of solution accuracy using different finite element models, with different element densities, set up for one cell of the orbiter wing. Also, a method for optimizing the transient thermal analysis computer central processing unit (CPU) time is discussed. Based on the newly established element density criteria, the orbiter wing midspan segment was modeled to examine thermal analysis solution accuracy and the extent of the computation CPU time requirements. The results showed that the distributions of structural temperatures and thermal stresses obtained from this wing segment model were satisfactory and the computation CPU time was at an acceptable level. The studies offer the hope that modeling large, hypersonic aircraft structures using high-density elements for transient thermal analysis is possible if a CPU optimization technique is used.

  14. The application of finite element analysis in the skull biomechanics and dentistry.

    PubMed

    Prado, Felippe Bevilacqua; Rossi, Ana Cláudia; Freire, Alexandre Rodrigues; Ferreira Caria, Paulo Henrique

    2014-01-01

    Empirical concepts describe the direction of masticatory stress dissipation in the skull. Scientific evidence of the trajectories and magnitude of stress dissipation can help in the diagnosis of masticatory alterations and the planning of oral rehabilitation in the different areas of Dentistry. Finite Element Analysis (FEA) is a tool that can reproduce complex structures with the irregular geometries of natural and artificial tissues of the human body, because it uses mathematical functions that enable the understanding of craniofacial biomechanics. The aim of this study was to review the literature on the advantages and limitations of FEA in the study of skull biomechanics and Dentistry. The keywords of the selected original research articles were: finite element analysis, biomechanics, skull, Dentistry, teeth, and implant. The literature review was performed in the PubMed, MEDLINE and Scopus databases. The selected books and articles were published between 1928 and 2010. FEA is an assessment tool whose application in different areas of Dentistry has gradually increased over the past 10 years, but its application in the analysis of skull biomechanics is scarce. The main advantages of FEA are its realistic mode of approach and the possibility of basing results on the analysis of only one model. On the other hand, the main limitations of FEA studies are the lack of anatomical detail in the modeling phase of the craniofacial structures and the lack of information about material properties.

  15. Finite element analysis (FEA) analysis of the preflex beam

    NASA Astrophysics Data System (ADS)

    Wan, Lijuan; Gao, Qilang

    2017-10-01

    The development of finite element analysis (FEA) is relatively mature, and FEA is one of the important means of structural analysis. The method removes the former need to investigate complex structures through large numbers of experiments. Through the finite element method, numerical simulation of a structure can provide a variety of static and dynamic analyses of mechanical problems, and it is also convenient for parametric studies of the structure. Combined with a limited number of experiments to verify the simulation model, it can meet needs that previously required extensive experimental research. Here, the nonlinear finite element method is used to simulate the flexural behavior of prestressed composite beams with corrugated steel webs. The finite element analysis is used to understand the mechanical properties of the structure under bending load.

  16. Elemental analysis of scorpion venoms.

    PubMed

    Al-Asmari, AbdulRahman K; Kunnathodi, Faisal; Al Saadon, Khalid; Idris, Mohammed M

    2016-01-01

    Scorpion venom is a rich source of biomolecules, which can perturb the physiological activity of the host on envenomation and may also have therapeutic potential. Scorpion venoms, produced by the columnar cells of the venom gland, are complex mixtures of mucopolysaccharides, neurotoxic peptides and other components. This study was aimed at cataloguing the elemental composition of venoms obtained from medically important scorpions found in the Arabian peninsula. The global elemental composition of the crude venom obtained from Androctonus bicolor, Androctonus crassicauda and Leiurus quinquestriatus scorpions was estimated using an ICP-MS analyzer. The study catalogued several chemical elements present in the scorpion venom using ICP-MS total quant analysis, and nine elements were quantified exclusively using appropriate standards. Fifteen chemical elements, including sodium, potassium and calcium, were found abundantly in the scorpion venom at PPM concentrations. Thirty-six chemical elements of different mass ranges were detected in the venom at PPB levels. Quantitative analysis of the venoms revealed copper to be the most abundant element in Androctonus sp. venom but present at a lower level in Leiurus quinquestriatus venom, whereas zinc and manganese were found at higher levels in Leiurus sp. venom but at lower levels in Androctonus sp. venom. These data and the concentrations of the other elements present in the various venoms are likely to increase our understanding of the mechanisms of venom activity and their pharmacological potential.

  17. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov Websites

    Economic and Financial Analysis Tools. Use these economic and financial analysis tools from NREL's Energy Analysis program. Job and Economic Development Impact (JEDI) Model: easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants.

  18. The development of a tool for assessing the quality of closed circuit camera footage for use in forensic gait analysis.

    PubMed

    Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai

    2013-10-01

    Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well-established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage is associated with a range of issues, the combination of which can often render the footage unsuitable for gait analysis. The aim of this work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to calculate automatically an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. The development of this tool could offer a method of achieving this goal, and help to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd.

  19. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedently large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  1. Elements and elasmobranchs: hypotheses, assumptions and limitations of elemental analysis.

    PubMed

    McMillan, M N; Izzo, C; Wade, B; Gillanders, B M

    2017-02-01

    Quantifying the elemental composition of elasmobranch calcified cartilage (hard parts) has the potential to answer a range of ecological and biological questions, at both the individual and population level. Few studies, however, have employed elemental analyses of elasmobranch hard parts. This paper provides an overview of the range of applications of elemental analysis in elasmobranchs, discussing the assumptions and potential limitations in cartilaginous fishes. It also reviews the available information on biotic and abiotic factors influencing patterns of elemental incorporation into hard parts of elasmobranchs and provides some comparative elemental assays and mapping in an attempt to fill knowledge gaps. Directions for future experimental research are highlighted to better understand fundamental elemental dynamics in elasmobranch hard parts. © 2016 The Fisheries Society of the British Isles.

  2. Analysis of design tool attributes with regards to sustainability benefits

    NASA Astrophysics Data System (ADS)

    Zain, S.; Ismail, A. F.; Ahmad, Z.; Adesta, E. Y. T.

    2018-01-01

    The trend of global manufacturing competitiveness has shown a significant shift from profit- and customer-driven business to a more harmonious sustainability paradigm. This new direction, which emphasises the interests of the three pillars of sustainability, i.e., the social, economic and environmental dimensions, has changed the way products are designed. As a result, the roles of design tools in the product development stage of manufacturing in adapting to the new strategy are vital and increasingly challenging. The aim of this paper is to review the literature on the attributes of design tools with regard to the sustainability perspective. Four well-established design tools are selected, namely Quality Function Deployment (QFD), Failure Mode and Effects Analysis (FMEA), Design for Six Sigma (DFSS) and Design for Environment (DfE). By analysing previous studies, the main attributes of each design tool and its benefits with respect to each sustainability dimension throughout four stages of the product lifecycle are discussed. From this study, it is learnt that each of the design tools contributes to the three pillars of sustainability either directly or indirectly, but in an unbalanced and non-holistic way. Therefore, the prospect of improving and optimising the design tools is projected, and the possibility of collaboration between the different tools is discussed.

  3. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  4. A finite element head and neck model as a supportive tool for deformable image registration.

    PubMed

    Kim, Jihun; Saitou, Kazuhiro; Matuszak, Martha M; Balter, James M

    2016-07-01

    A finite element (FE) head and neck model was developed as a tool to aid investigations and development of deformable image registration and patient modeling in radiation oncology. Useful aspects of an FE model for these purposes include the ability to produce realistic deformations (similar to those seen in patients over the course of treatment) and a rational means of generating new configurations, e.g., via the application of force and/or displacement boundary conditions. The model was constructed based on a cone-beam computed tomography image of a head and neck cancer patient. The three-node triangular surface meshes created for the bony elements (skull, mandible, and cervical spine) and joint elements were integrated into a skeletal system and combined with the exterior surface. Nodes were additionally created inside the surface structures composed of the three-node triangular surface meshes, so that four-node tetrahedral FE elements were created over the whole region of the model. The bony elements were modeled as a homogeneous linear elastic material connected by intervertebral disks. The surrounding tissues were modeled as a homogeneous linear elastic material. Under force or displacement boundary conditions, FE analysis on the model calculates approximate solutions of the displacement vector field. An FE head and neck model was thus constructed in which the skull, mandible, and cervical vertebrae were mechanically connected by disks. The developed FE model is capable of generating realistic deformations that are strain-free for the bony elements and of creating new configurations of the skeletal system with the surrounding tissues reasonably deformed. The FE model can generate realistic deformations for skeletal elements. In addition, the model provides a way of evaluating the accuracy of image alignment methods by producing a ground truth deformation and correspondingly simulated images. The ability to combine force and displacement conditions provides

  5. Examining the Impact of Culture and Human Elements on OLAP Tools Usefulness

    ERIC Educational Resources Information Center

    Sharoupim, Magdy S.

    2010-01-01

    The purpose of the present study was to examine the impact of culture and human-related elements on the On-line Analytical Processing (OLAP) usability in generating decision-making information. The use of OLAP technology has evolved rapidly and gained momentum, mainly due to the ability of OLAP tools to examine and query large amounts of data sets…

  6. Non-destructive elemental analysis of a carbonaceous chondrite with direct current Muon beam at MuSIC.

    PubMed

    Terada, K; Sato, A; Ninomiya, K; Kawashima, Y; Shimomura, K; Yoshida, G; Kawai, Y; Osawa, T; Tachibana, S

    2017-11-13

    Electron- or X-ray-induced characteristic X-ray analysis has been widely used to determine the chemical compositions of materials in vast research fields. In recent years, analysis of characteristic X-rays from muonic atoms, in which a muon is captured, has attracted attention because both a muon beam and a muon-induced characteristic X-ray have high transmission abilities. Here we report the first non-destructive elemental analysis of a carbonaceous chondrite using one of the world's leading intense direct current muon beam sources (MuSIC; MUon Science Innovative Channel). We successfully detected characteristic muonic X-rays of Mg, Si, Fe, O, S and C from the Jbilet Winselwan CM chondrite, whose carbon content is about 2 wt%, and the obtained elemental abundance pattern was consistent with that of CM chondrites. Because of its high sensitivity to carbon, non-destructive elemental analysis with a muon beam can be a novel, powerful tool to characterize future returned samples from carbonaceous asteroids.

  7. Elemental analysis with external-beam PIXE

    NASA Astrophysics Data System (ADS)

    Lin, E. K.; Wang, C. W.; Teng, P. K.; Huang, Y. M.; Chen, C. Y.

    1992-05-01

    A beamline system and experimental setup has been established for elemental analysis using PIXE with an external beam. Experiments for the study of the elemental composition of ancient Chinese potsherds (the Min and Ching ages) were performed. Continuum X-ray spectra from the samples bombarded by 3 MeV protons have been measured with a Si(Li) detector. From the analysis of PIXE data, the concentration of the main elements (Al, Si, K, and Ca) and of more than ten trace elements in the matrices and glazed surfaces were determined. Results for two different potsherds are presented, and those obtained from the glaze colorants are compared with the results of measurements on a Ching blue-and-white porcelain vase.

  8. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
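    As a hedged illustration of the classical local mode (Fourier) analysis mentioned above, the sketch below computes the smoothing factor of weighted Jacobi relaxation for the 1D Poisson problem. This is a textbook example under standard assumptions, not part of the pilot multigrid solver the abstract describes.

    ```python
    import math

    def jacobi_smoothing_factor(omega, n=1000):
        """Max error amplification of weighted Jacobi over the high-frequency
        modes theta in [pi/2, pi] for the 1D Poisson stencil; the amplification
        of mode theta is (1 - omega) + omega * cos(theta)."""
        worst = 0.0
        for k in range(n + 1):
            theta = math.pi / 2 + (math.pi / 2) * k / n
            g = abs((1 - omega) + omega * math.cos(theta))
            worst = max(worst, g)
        return worst

    print(round(jacobi_smoothing_factor(2 / 3), 3))  # 0.333, the classic result
    print(round(jacobi_smoothing_factor(1.0), 3))    # 1.0: unweighted Jacobi does not smooth
    ```

    The contrast between the two values is the kind of prediction local mode analysis makes; as the abstract notes, such predictions can fail for realistic flow solvers, motivating the two-grid Idealized Coarse Grid and Idealized Relaxation tools.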

  9. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
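    At its core, this kind of report generation is bookkeeping over a fault-to-test detection mapping. The sketch below shows the idea with an entirely hypothetical mapping; the fault names, test names, and data structures are assumptions for illustration, not the ETA Tool's or TEAMS Designer's actual formats.

    ```python
    # Hypothetical fault-to-test bookkeeping of the kind behind a
    # Detectability Report and a Test Utilization Report.

    detects = {                      # test name -> failure modes it detects
        "T1": {"pump_leak", "valve_stuck"},
        "T2": {"valve_stuck"},
        "T3": {"sensor_bias"},
    }
    modes = ["pump_leak", "valve_stuck", "sensor_bias", "fan_fail"]

    def detectability(modes, detects):
        """Map each failure mode to the sorted tests that detect it."""
        return {m: sorted(t for t, s in detects.items() if m in s) for m in modes}

    def undetected(modes, detects):
        """Failure modes no test covers -- candidates for added sensors."""
        covered = set().union(*detects.values())
        return sorted(set(modes) - covered)

    print(detectability(modes, detects))
    print(undetected(modes, detects))   # ['fan_fail']
    ```

    Removing a test from `detects` and re-running mimics the sensor-loss variation studies the abstract mentions.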

  10. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  11. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    PubMed

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality makes 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
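    Tools of this kind typically map CT values to apparent density via a scanner calibration, then density to Young's modulus via a power law. The sketch below illustrates that pipeline; the calibration coefficients, power-law constants, and per-element averaging order are assumptions chosen for the example, not the values or integration scheme used by Bonemat or py_bonemat_abaqus.

    ```python
    # Illustrative CT -> density -> modulus assignment; all constants are
    # hypothetical placeholders, not any published calibration.

    def ct_to_density(hu, slope=0.0007, intercept=1.0):
        """Assumed linear CT (Hounsfield units) -> apparent density (g/cm^3)."""
        return slope * hu + intercept

    def density_to_modulus(rho, a=6.95, b=1.49):
        """Assumed power-law density -> Young's modulus (GPa): E = a * rho**b."""
        return a * rho ** b

    def assign_modulus(element_hu):
        """Average an element's sampled HU values, then convert to a modulus."""
        rho = sum(ct_to_density(h) for h in element_hu) / len(element_hu)
        return density_to_modulus(rho)

    E = assign_modulus([400, 450, 500])   # hypothetical sampled HU for one element
    print(E)
    ```

    Whether one averages HU, density, or modulus over the element changes the result because the power law is nonlinear; this integration-order choice is exactly the kind of detail where the version differences reported in the abstract arose.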

  12. Generic element processor (application to nonlinear analysis)

    NASA Technical Reports Server (NTRS)

    Stanley, Gary

    1989-01-01

    The focus here is on one aspect of the Computational Structural Mechanics (CSM) Testbed: finite element technology. The approach involves a Generic Element Processor: a command-driven, database-oriented software shell that facilitates introduction of new elements into the testbed. This shell features an element-independent corotational capability that upgrades linear elements to geometrically nonlinear analysis, and corrects the rigid-body errors that plague many contemporary plate and shell elements. Specific elements that have been implemented in the Testbed via this mechanism include the Assumed Natural-Coordinate Strain (ANS) shell elements, developed with Professor K. C. Park (University of Colorado, Boulder), a new class of curved hybrid shell elements, developed by Dr. David Kang of LPARL (formerly a student of Professor T. Pian), other shell and solid hybrid elements developed by NASA personnel, and recently a repackaged version of the workhorse shell element used in the traditional STAGS nonlinear shell analysis code. The presentation covers: (1) user and developer interfaces to the generic element processor, (2) an explanation of the built-in corotational option, (3) a description of some of the shell-elements currently implemented, and (4) application to sample nonlinear shell postbuckling problems.

  13. Using experimental modal analysis to assess the behaviour of timber elements

    NASA Astrophysics Data System (ADS)

    Kouroussis, Georges; Fekih, Lassaad Ben; Descamps, Thierry

    2018-03-01

    Timber frameworks are one of the most important and widespread types of structures. Their configurations and joints are usually complex and require a high level of craftsmanship to assemble. In the field of restoration, a good understanding of the structural behaviour is necessary and is often based on assessment techniques dedicated to wood characterisation. This paper presents the use of experimental modal analysis for finite element updating. To do this, several timber beams in a freely supported condition were analysed in order to extract their natural bending characteristics (frequency, damping and mode shapes). Corresponding ABAQUS finite element models were derived which included the effects of local defects (holes, cracks and knots), moisture and structural decay. To achieve the modal updating, additional simulations were performed in order to study the sensitivity of the mechanical parameters. With the intent to estimate their mechanical properties, a procedure of modal updating was carried out in MATLAB with a Python script. This script was created to extract the modal information from the ABAQUS modal analysis results to be compared with the experimental results. The updating was based on minimizing an unconstrained multivariable function using a derivative-free method. The objective function was selected from the conventional comparison tools (absolute or relative frequency difference, and/or modal assurance criterion). This testing technique was used to determine the dynamic mechanical properties of timber beams, such as the anisotropic Young's moduli and damping ratio. To verify the modulus, a series of static 4-point bending tests and STS04 classifications were conducted. The results also revealed that local defects have a negligible influence on natural frequencies. The results demonstrate that this assessment tool offers an effective method to obtain the mechanical properties of timber elements, especially when on-site and non-destructive techniques are required.
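    One of the comparison tools named in the abstract, the modal assurance criterion (MAC), is simple to state: it is the normalized squared inner product of two mode-shape vectors. Below is a minimal sketch with hypothetical mode-shape data; the shapes are invented for illustration, not the timber-beam measurements.

    ```python
    # Modal Assurance Criterion: 1 = identical mode shape, 0 = orthogonal shapes.

    def mac(phi_a, phi_b):
        """MAC(a, b) = (a . b)^2 / ((a . a) * (b . b))."""
        dot = sum(a * b for a, b in zip(phi_a, phi_b))
        return dot * dot / (
            sum(a * a for a in phi_a) * sum(b * b for b in phi_b)
        )

    measured = [0.0, 0.31, 0.59, 0.81, 0.95, 1.0]   # hypothetical measured shape
    simulated = [0.0, 0.30, 0.60, 0.80, 0.95, 1.0]  # hypothetical FE prediction
    print(round(mac(measured, simulated), 4))        # close to 1 for matching modes
    ```

    In an updating loop, 1 - MAC (optionally combined with relative frequency differences) serves as the objective to be minimized by a derivative-free method.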

  14. ImageParser: a tool for finite element generation from three-dimensional medical images

    PubMed Central

    Yin, HM; Sun, LZ; Wang, G; Yamada, T; Wang, J; Vannier, MW

    2004-01-01

    Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image, partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information. PMID:15461787

  15. Channel CAT: A Tactical Link Analysis Tool

    DTIC Science & Technology

    1997-09-01

    Naval Postgraduate School, Monterey, California. Master's thesis: "Channel CAT: A Tactical Link Analysis Tool," by Michael Glenn Coleman, September 1997. The thesis presents the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client ...

  16. Studies of finite element analysis of composite material structures

    NASA Technical Reports Server (NTRS)

    Douglas, D. O.; Holzmacher, D. E.; Lane, Z. C.; Thornton, E. A.

    1975-01-01

    Research in the area of finite element analysis is summarized. Topics discussed include finite element analysis of a picture frame shear test, BANSAP (a bandwidth reduction program for SAP IV), FEMESH (a finite element mesh generation program based on isoparametric zones), and finite element analysis of composite bolted joint specimens.

  17. Macro Analysis Tool - MAT

    EPA Science Inventory

    This product is an easy-to-use Excel-based macro analysis tool (MAT) for performing comparisons of air sensor data with reference data and interpreting the results. This tool tackles one of the biggest hurdles in citizen-led community air monitoring projects – working with ...

  18. Improved accuracy for finite element structural analysis via an integrated force method

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Hopkins, D. A.; Aiello, R. A.; Berke, L.

    1992-01-01

    A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation for the mixed method; and GIFT for the integrated force methods. The results indicate that on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which bestow simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.

  19. Elemental distribution analysis of urinary crystals.

    PubMed

    Fazil Marickar, Y M; Lekshmi, P R; Varma, Luxmi; Koshy, Peter

    2009-10-01

    Various crystals are seen in human urine. Some of them, particularly calcium oxalate dihydrate, are seen normally. Pathological crystals indicate crystal formation initiating urinary stones. Unfortunately, many of the relevant crystals are not recognized in the light microscopic analysis of urinary deposits performed in most clinical laboratories, and many crystals are not clearly identifiable under ordinary light microscopy. The objective of the present study was to perform scanning electron microscopic (SEM) assessment of various urinary deposits and to confirm their identity by elemental distribution analysis (EDAX). 50 samples of urinary deposits were collected from the urinary stone clinic. Deposits containing significant crystalluria (more than 10 per HPF) were collected under liquid paraffin in special containers and taken up for SEM studies. The deposited crystals were retrieved with appropriate Pasteur pipettes and placed on micropore filter paper discs. The fluid was absorbed by thicker layers of filter paper underneath, and the discs were fixed to brass studs. They were then gold sputtered to 100 Å and examined under SEM (JEOL JSM 35C microscope). When crystals were seen, their morphology was recorded by taking photographs at different angles. At appropriate magnification, the EDAX probe was pointed at the crystals under study and the wave patterns analyzed. Components of the crystals were recognized by utilizing these data. All the samples analyzed contained a significant number of crystals, and all contained more than one type of crystal. The commonest crystals encountered included calcium oxalate monohydrate (whewellite, 22%), calcium oxalate dihydrate (weddellite, 32%), uric acid (10%) and calcium phosphates, namely apatite (4%), brushite (6%), struvite (6%) and octocalcium phosphate (2%). The morphological appearances of the urinary crystals described were correlated with the wavelengths obtained through elemental distribution analysis.

  20. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions.Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
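
    The weighting of resources against mission-specific event probabilities described above can be sketched as a simple utility ranking. The resource names, criticality scores, and probabilities below are invented placeholders, not data from the project's repository.

```python
def rank_resources(resources, event_probs):
    """Rank medical resources by weighted utility: for each resource,
    sum over conditions of (probability the condition occurs on the
    mission) * (criticality score of the resource for that condition)."""
    utility = {name: sum(event_probs.get(cond, 0.0) * crit
                         for cond, crit in uses.items())
               for name, uses in resources.items()}
    return sorted(utility.items(), key=lambda kv: kv[1], reverse=True)

# Invented example: criticality scores per condition and per-mission
# event probabilities (not data from the actual repository).
resources = {
    "ultrasound": {"renal_stone": 4, "trauma": 5},
    "suture_kit": {"laceration": 5},
}
event_probs = {"renal_stone": 0.02, "trauma": 0.01, "laceration": 0.30}
ranked = rank_resources(resources, event_probs)
# A cheap resource for a likely condition can outrank a critical
# resource for rare ones.
```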

  1. An Analysis of the Effects of Chip-groove Geometry on Machining Performance Using Finite Element Methods

    NASA Astrophysics Data System (ADS)

    Ee, K. C.; Dillon, O. W.; Jawahir, I. S.

    2004-06-01

    This paper discusses the influence of major chip-groove parameters of a cutting tool on the chip formation process in orthogonal machining using finite element (FE) methods. In the FE formulation, a thermal elastic-viscoplastic material model is used together with a modified Johnson-Cook material law for the flow stress. The chip back-flow angle and the chip up-curl radius are calculated for a range of cutting conditions by varying the chip-groove parameters. The analysis provides greater understanding of the effectiveness of chip-groove configurations and points a way to correlate cutting conditions with tool-wear when machining with a grooved cutting tool.
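
    The abstract cites a modified Johnson-Cook law whose exact form is not given; as a reference point, a sketch of the standard Johnson-Cook flow stress follows, with illustrative parameter values rather than those used in the paper.

```python
import math

def johnson_cook_stress(strain, strain_rate, T,
                        A=880e6, B=793e6, n=0.386, C=0.014, m=0.71,
                        eps0=1.0, T_ref=293.0, T_melt=1793.0):
    """Standard Johnson-Cook flow stress (Pa):
    sigma = (A + B*eps^n) * (1 + C*ln(epsdot/eps0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_ref)/(T_melt - T_ref).
    Parameter values here are illustrative, not those of the paper."""
    T_star = (T - T_ref) / (T_melt - T_ref)
    return (A + B * strain ** n) * (1.0 + C * math.log(strain_rate / eps0)) \
           * (1.0 - T_star ** m)

# Flow stress hardens with strain and softens with temperature:
cold = johnson_cook_stress(0.2, 1e3, 400.0)
hot = johnson_cook_stress(0.2, 1e3, 1200.0)
```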

  2. The Model Experiments and Finite Element Analysis on Deformation and Failure by Excavation of Grounds in Foregoing-roof Method

    NASA Astrophysics Data System (ADS)

    Sotokoba, Yasumasa; Okajima, Kenji; Iida, Toshiaki; Tanaka, Tadatsugu

    We propose the trenchless box culvert construction method to construct box culverts under shallow soil cover while keeping roads or tracks open. When this construction method is used, it is necessary to clarify the deformation and shear failure of the ground caused by excavation. In order to investigate the soil behavior, model experiments and elasto-plastic finite element analysis were performed. In the model experiments, it was shown that shear failure developed from the end of the roof to the toe of the boundary surface. In the finite element analysis, a shear band effect was introduced. Comparing the observed shear bands in the model experiments with the computed maximum shear strain contours, it was found that the observed direction of the shear band could be simulated reasonably by the finite element analysis. We may say that the finite element method used in this study is a useful tool for this construction method.

  3. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

  4. Static aeroelastic analysis and tailoring of a single-element racing car wing

    NASA Astrophysics Data System (ADS)

    Sadd, Christopher James

    This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid, and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity, and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-averaged Navier-Stokes CFD analysis method with a finite element structural analysis method using an iterative scheme. Development of this method has included assessment of the CFD and finite element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation, and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce coefficient of C_l = -1.377 in comparison to C_l = -1.265 for the original wing, and that the drag-reducing wing has a drag coefficient of C_d = 0.115 in comparison to C_d = 0.143 for the original wing.
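
    The iterative CFD-FE coupling scheme described above can be caricatured with a one-degree-of-freedom fixed-point sketch in which the aerodynamic load depends on the structural deflection and vice versa. All numbers are illustrative; the real analysis exchanges full surface loads and mesh deflections.

```python
def aeroelastic_fixed_point(q=1000.0, k=5000.0, a0=-1.3, da_dtheta=2.0,
                            area=0.5, tol=1e-8, max_iter=100):
    """One-DOF caricature of the iterative coupling: the 'CFD' step
    computes a load from the current twist, the 'FE' step computes the
    twist from that load, and the loop repeats until the deflected
    shape stops changing. All values are illustrative."""
    theta = 0.0
    load = q * area * a0
    for _ in range(max_iter):
        load = q * area * (a0 + da_dtheta * theta)   # aerodynamic load
        theta_new = load / k                         # structural twist
        if abs(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta, load

theta, load = aeroelastic_fixed_point()
# Converges because the coupling factor q*area*da_dtheta/k = 0.2 < 1;
# stiffer structures (larger k) converge faster.
```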

  5. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  6. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  7. Patient-specific finite element modeling of bones.

    PubMed

    Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A

    2013-04-01

    Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of application of patient-specific finite element modeling as a part of clinical orthopedic routine. It is concluded that further development of certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.

  8. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
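
    The core keep-out check the tool performs can be sketched as follows; the event names, data layout, and padding value are invented stand-ins, not the actual MRO/MER file formats.

```python
from datetime import datetime

def conflicting_events(mro_events, mer_events, pad_s=60.0):
    """Flag MER events that fall inside a padded keep-out window around
    any MRO event. Times are assumed already converted to a common
    reference (the real tool shifts both to Earth Transmit Time)."""
    conflicts = []
    for mer_name, mer_t in mer_events:
        for mro_name, mro_t in mro_events:
            if abs((mer_t - mro_t).total_seconds()) <= pad_s:
                conflicts.append((mer_name, mro_name))
    return conflicts

# Invented event names and times:
mro = [("MRO_OCC_START", datetime(2008, 1, 1, 12, 0, 0))]
mer = [("MER_UPLINK", datetime(2008, 1, 1, 12, 0, 30)),
       ("MER_DOWNLINK", datetime(2008, 1, 1, 14, 0, 0))]
hits = conflicting_events(mro, mer)
# Only the 12:00:30 uplink lies within the +/-60 s keep-out window.
```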

  9. Finite element modeling and analysis of tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.

    1983-01-01

    Predicting the response of tires under various loading conditions using finite element technology is addressed. Some of the recent advances in finite element technology which have high potential for application to tire modeling problems are reviewed. The analysis and modeling needs for tires are identified. Reduction methods for large-scale nonlinear analysis, with particular emphasis on treatment of combined loads, displacement-dependent and nonconservative loadings; development of simple and efficient mixed finite element models for shell analysis, identification of equivalent mixed and purely displacement models, and determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation are included.

  10. Finite Element Simulations of Micro Turning of Ti-6Al-4V using PCD and Coated Carbide tools

    NASA Astrophysics Data System (ADS)

    Jagadesh, Thangavel; Samuel, G. L.

    2017-02-01

    The demand for manufacturing axisymmetric Ti-6Al-4V implants, which involves a micro turning process, is increasing in biomedical applications. To understand the micro turning process, in this work a 3D finite element model has been developed for predicting the tool-chip interface temperature and the cutting, thrust and axial forces. A strain gradient effect has been included in the Johnson-Cook material model to represent the flow stress of the work material. To verify the simulation results, experiments have been conducted at four different feed rates and three different cutting speeds. Since the titanium alloy has a low Young's modulus, the spring-back effect is predominant for the coated carbide tool with its larger edge radius, which leads to an increase in the forces. The polycrystalline diamond (PCD) tool, in contrast, has a smaller edge radius, which leads to lower forces and a decrease in the tool-chip interface temperature due to its high thermal conductivity. The tool-chip interface temperature increases with cutting speed, although the increase is smaller for the PCD tool than for the coated carbide tool. When the uncut chip thickness decreases, the specific cutting energy increases due to material strengthening effects. Surface roughness is higher for the coated carbide tool than for the PCD tool due to the ploughing effect. The average prediction errors of the finite element model for the cutting and thrust forces are 11.45% and 14.87%, respectively.

  11. Integrated transient thermal-structural finite element analysis

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.; Dechaumphai, P.; Wieting, A. R.; Tamma, K. K.

    1981-01-01

    An integrated thermal structural finite element approach for efficient coupling of transient thermal and structural analysis is presented. Integrated thermal structural rod and one dimensional axisymmetric elements considering conduction and convection are developed and used in transient thermal structural applications. The improved accuracy of the integrated approach is illustrated by comparisons with exact transient heat conduction elasticity solutions and conventional finite element thermal finite element structural analyses.

  12. Improved accuracy for finite element structural analysis via a new integrated force method

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Aiello, Robert A.; Berke, Laszlo

    1992-01-01

    A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation method for the mixed method; and GIFT for the integrated force methods. The results indicate that on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which bestow simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.

  13. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  14. Regulatory sequence analysis tools.

    PubMed

    van Helden, Jacques

    2003-07-01

    The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
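
    Pattern matching against a position-specific scoring matrix, one of the motif representations RSAT supports, can be sketched as a log-likelihood scan. The 3-bp matrix below is a toy example, not an RSAT motif.

```python
import math

def pssm_scan(seq, pssm, background=0.25):
    """Score every window of a DNA sequence against a position-specific
    scoring matrix as a log2 likelihood ratio vs a uniform background.
    pssm is a list of dicts mapping base -> probability per position."""
    w = len(pssm)
    return [(i, sum(math.log2(pssm[j][seq[i + j]] / background)
                    for j in range(w)))
            for i in range(len(seq) - w + 1)]

# Toy 3-bp motif strongly preferring "TAT" (not an RSAT motif):
motif = [{"A": 0.05, "C": 0.05, "G": 0.05, "T": 0.85},
         {"A": 0.85, "C": 0.05, "G": 0.05, "T": 0.05},
         {"A": 0.05, "C": 0.05, "G": 0.05, "T": 0.85}]
best = max(pssm_scan("GGTATGG", motif), key=lambda t: t[1])
# The best-scoring window starts at position 2, over "TAT".
```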

  15. Simulating muscular thin films using thermal contraction capabilities in finite element analysis tools.

    PubMed

    Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D

    2016-10-01

    In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
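
    The classical Stoney approximation referenced above relates the stress in a thin contracting layer to the curvature it induces in a thicker substrate; a sketch follows with illustrative (not experimental) numbers.

```python
def stoney_film_stress(E_s, nu_s, t_s, t_f, R):
    """Classical Stoney approximation: the stress in a thin film (here,
    the contracting cell layer) inferred from the curvature radius R it
    induces in a much thicker substrate. Valid for t_f << t_s:
    sigma_f = E_s * t_s**2 / (6 * (1 - nu_s) * t_f * R)."""
    return E_s * t_s ** 2 / (6.0 * (1.0 - nu_s) * t_f * R)

# Illustrative PDMS-like numbers (Pa, m), not experimental MTF data:
sigma = stoney_film_stress(E_s=1.5e6, nu_s=0.49, t_s=14e-6,
                           t_f=5e-6, R=2e-3)
# sigma comes out on the order of 10 kPa for these inputs.
```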

  16. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  17. Tools for Data Analysis in the Middle School Classroom: A Teacher Professional Development Program

    NASA Astrophysics Data System (ADS)

    Ledley, T. S.; Haddad, N.; McAuliffe, C.; Dahlman, L.

    2006-12-01

    In order for students to learn how to engage with scientific data to answer questions about the real world, it is imperative that their teachers are 1) comfortable with the data and the tools used to analyze it, and 2) feel prepared to support their students in this complex endeavor. TERC's Tools for Data Analysis in the Middle School Classroom (DataTools) professional development program, funded by NSF's ITEST program, prepares middle school teachers to integrate Web-based scientific data and analysis tools into their existing curricula. This 13-month program supports teachers in using a set of freely or commonly available tools with a wide range of data. It also gives them an opportunity to practice teaching these skills to students before teaching in their own classrooms. The ultimate goal of the program is to increase the number of middle school students who work directly with scientific data, who use the tools of technology to import, manipulate, visualize and analyze the data, who come to understand the power of data-based arguments, and who will consider pursuing a career in technical and scientific fields. In this session, we will describe the elements of the DataTools program and the Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet), a Web-based resource that supports Earth system education for teachers and students in grades 6 through 16. The EET provides essential support to DataTools teachers as they use it to learn to locate and download Web-based data and use data analysis tools. We will also share what we have learned during the first year of this three-year program.

  18. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, such as the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models of different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. As one result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
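
    Zernike decomposition of a deformation map, as performed by the ANSYS evaluation toolbox described above, amounts to a least-squares fit; a small pure-Python sketch with the first four modes follows (a real optical code would use many more modes and a dedicated solver).

```python
def zernike_basis(x, y):
    """First four Zernike polynomials on the unit disk:
    piston, x-tilt, y-tilt, defocus."""
    r2 = x * x + y * y
    return [1.0, x, y, 2.0 * r2 - 1.0]

def solve(A, b):
    """Tiny Gauss-Jordan elimination for the 4x4 normal equations."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def zernike_fit(points, w):
    """Least-squares Zernike coefficients for deformations w sampled at
    (x, y) points inside the unit disk (normal-equations sketch)."""
    n = 4
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for (x, y), wi in zip(points, w):
        z = zernike_basis(x, y)
        for i in range(n):
            b[i] += z[i] * wi
            for j in range(n):
                A[i][j] += z[i] * z[j]
    return solve(A, b)

# Recover known coefficients from a synthetic deformation map:
pts = [(0.1 * i, 0.1 * j) for i in range(-9, 10) for j in range(-9, 10)
       if (0.1 * i) ** 2 + (0.1 * j) ** 2 <= 1.0]
truth = [0.5, -0.2, 0.1, 0.05]
w = [sum(c * z for c, z in zip(truth, zernike_basis(x, y))) for x, y in pts]
coeffs = zernike_fit(pts, w)
```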

  19. Two dimensional finite element thermal model of laser surface glazing for H13 tool steel

    NASA Astrophysics Data System (ADS)

    Kabir, I. R.; Yin, D.; Naher, S.

    2016-10-01

    A two-dimensional (2D) transient thermal model with a line heat source was developed by the Finite Element Method (FEM) for laser surface glazing of H13 tool steel using the commercial software ANSYS 15. The geometry of the model was taken as a transverse circular cross-section of a cylindrical specimen. Two different power levels (300 W, 200 W) were used with a 0.2 mm laser beam width and 0.15 ms exposure time. The temperature distribution, heating and cooling rates, and the dimensions of the modified surface were analysed. The maximum temperatures achieved were 2532 K (2259°C) and 1592 K (1319°C) for laser powers of 300 W and 200 W respectively. The maximum cooling rates were 4.2×10^7 K/s for 300 W and 2×10^7 K/s for 200 W. The depth of the modified zone increased with increasing laser power. From this analysis, it can be predicted that for a 0.2 mm beam width and 0.15 ms exposure time, the melting temperature of H13 tool steel is reached within the 200-300 W laser power range.
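
    A one-dimensional explicit finite-difference analogue of the transient model conveys the heat-then-cool behaviour; the material and process numbers below are illustrative, not the calibrated H13/ANSYS values from the paper.

```python
def explicit_1d_conduction(n=40, dx=2e-5, alpha=5.7e-6, T0=293.0,
                           T_surface=1600.0, t_heat=1.5e-4, t_total=6e-4):
    """Explicit finite-difference march of the 1D heat equation: the
    surface node is held at the glazing temperature for the exposure
    time, then treated as insulated while the pulse diffuses inward.
    Returns (time, near-surface temperature) samples."""
    dt = 0.4 * dx * dx / alpha        # stable: needs dt <= dx^2/(2*alpha)
    T = [T0] * n
    t, history = 0.0, []
    while t < t_total:
        T[0] = T_surface if t < t_heat else T[1]   # heat, then insulate
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = Tn
        history.append((t, T[1]))
        t += dt
    return history

history = explicit_1d_conduction()
peak = max(temp for _, temp in history)
# The near-surface node heats toward the source temperature during the
# pulse and cools once the exposure ends.
```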

  20. Medea selfish genetic elements as tools for altering traits of wild populations: a theoretical analysis.

    PubMed

    Ward, Catherine M; Su, Jessica T; Huang, Yunxin; Lloyd, Alun L; Gould, Fred; Hay, Bruce A

    2011-04-01

    One strategy for controlling transmission of insect-borne disease involves replacing the native insect population with transgenic animals unable to transmit disease. Population replacement requires a drive mechanism to ensure the rapid spread of linked transgenes, the presence of which may result in a fitness cost to carriers. Medea selfish genetic elements have the feature that when present in a female, only offspring that inherit the element survive, a behavior that can lead to spread. Here, we derive equations that describe the conditions under which Medea elements with a fitness cost will spread and the equilibrium allele frequencies achieved. Of particular importance, we show that whenever Medea spreads, the non-Medea genotype is driven out of the population, and we estimate the number of generations required to achieve this goal for Medea elements with different fitness costs and male-only introduction frequencies. Finally, we characterize two contexts in which Medea elements with fitness costs drive the non-Medea allele from the population: an autosomal element in which not all Medea-bearing progeny of a Medea-bearing mother survive, and an X-linked element in species in which X/Y individuals are male. Our results suggest that Medea elements can drive population replacement under a wide range of conditions. © 2010 The Author(s). Evolution © 2010 The Society for the Study of Evolution.
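
    The qualitative behaviour derived in the paper can be reproduced with a simplified discrete-generation recursion: random mating, maternal-effect killing of non-Medea homozygous offspring of Medea-bearing mothers, and a multiplicative fitness cost on carriers. This is an illustrative model, not the paper's equations.

```python
def medea_generation(freqs, cost=0.1):
    """One generation: random mating among MM/Mm/mm adults,
    maternal-effect killing of mm offspring of Medea-bearing mothers,
    then a multiplicative fitness cost on element carriers."""
    P, H, Q = freqs
    f = {"MM": P, "Mm": H, "mm": Q}
    gametes = {"MM": {"M": 1.0}, "Mm": {"M": 0.5, "m": 0.5}, "mm": {"m": 1.0}}
    off = {"MM": 0.0, "Mm": 0.0, "mm": 0.0}
    for mom, fm in f.items():
        for dad, fd in f.items():
            for gm, pm in gametes[mom].items():
                for gd, pd in gametes[dad].items():
                    geno = "".join(sorted(gm + gd))   # 'M' sorts before 'm'
                    w = fm * fd * pm * pd
                    if geno == "mm" and mom != "mm":
                        w = 0.0                       # Medea killing
                    off[geno] += w
    off["MM"] *= 1.0 - cost
    off["Mm"] *= 1.0 - cost
    total = sum(off.values())
    return off["MM"] / total, off["Mm"] / total, off["mm"] / total

# A 40% MM introduction with a 10% fitness cost still spreads:
freqs = (0.4, 0.0, 0.6)
for _ in range(100):
    freqs = medea_generation(freqs)
# The non-Medea homozygote is driven essentially out of the population.
```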

  1. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  2. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali

    The determination of the accurate material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-probe instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different material detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y and Zr, and those identified by Neutron Activation Analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb and Zn. This thesis presents a new approach for accurate detection of the material composition of kidney stones using XRF and NAA instrumental activation analysis techniques.
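
    A minimal sketch of one step in such an analysis, peak finding on a synthetic multi-channel spectrum (illustrative only; the thesis used the open-source Python Multi-Channel Analyzer for the actual unfolding):

```python
import numpy as np

# Synthetic XRF-like spectrum: Gaussian peaks on a smooth continuum background
# (a stand-in for real detector data; channel positions and amplitudes are made up).
channels = np.arange(1024)

def line(c0, amp, sigma=6.0):
    return amp * np.exp(-0.5 * ((channels - c0) / sigma) ** 2)

spectrum = 50.0 * np.exp(-channels / 400.0)            # continuum background
for c0, amp in [(210, 400.0), (455, 250.0), (730, 120.0)]:
    spectrum += line(c0, amp)

# Crude unfolding step: estimate the background with a wide moving average,
# then keep local maxima that rise well above it.
background = np.convolve(spectrum, np.ones(101) / 101, mode="same")
net = spectrum - background
peaks = [i for i in range(1, len(net) - 1)
         if net[i] > 50.0 and net[i] >= net[i - 1] and net[i] > net[i + 1]]
print(peaks)
```

    Each detected channel would then be mapped to an energy and matched against characteristic X-ray lines to name the element.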

  3. Analysis of concrete beams using applied element method

    NASA Astrophysics Data System (ADS)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, as in FEM, the structure is analysed by dividing it into several elements; but in AEM the elements are connected by springs instead of nodes. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate its application, AEM is used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs has little influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
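
    The spring idea can be made concrete with the standard AEM stiffness formulas k_n = E·d·t/a and k_s = G·d·t/a, where d is the tributary face length per spring, t the thickness, and a the element size (the numbers below are assumed; the paper derives its own equations):

```python
# Illustrative AEM spring stiffnesses (standard formulas from the AEM literature).
E, nu = 30e9, 0.2                # concrete: Young's modulus (Pa), Poisson's ratio (assumed)
G = E / (2 * (1 + nu))           # shear modulus
a, t = 0.05, 0.2                 # element size and thickness, m
n_springs = 10                   # springs distributed over each shared face
d = a / n_springs                # tributary face length per spring

k_n = E * d * t / a              # normal spring stiffness, N/m
k_s = G * d * t / a              # shear spring stiffness, N/m

# Sanity check: a 1 m bar of 20 elements joined by normal springs should
# approach the exact axial stiffness EA/L as the mesh is refined.
n_elem, L = 20, 1.0
k_face = n_springs * k_n         # springs at one interface act in parallel
k_bar = k_face / (n_elem - 1)    # interfaces act in series
exact = E * (a * t) / L
print(abs(k_bar - exact) / exact)
```

    The sanity check shows the interface springs in series recover the axial stiffness EA/L of a bar to within a discretisation error that shrinks as more elements are used.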

  4. A meta-model analysis of a finite element simulation for defining poroelastic properties of intervertebral discs.

    PubMed

    Nikkhoo, Mohammad; Hsu, Yu-Chun; Haghpanahi, Mohammad; Parnianpour, Mohamad; Wang, Jaw-Lin

    2013-06-01

    Finite element analysis is an effective tool to evaluate the material properties of living tissue. For an interactive optimization procedure, finite element analysis usually needs many simulations to reach a reasonable solution. Meta-model analysis of finite element simulation can be used to reduce the computation for a structure with complex geometry or a material with composite constitutive equations. The intervertebral disc is a complex, heterogeneous, and hydrated porous structure. A poroelastic finite element model can be used to observe fluid transfer, pressure variation, and other properties within the disc. Defining reasonable poroelastic material properties of the anulus fibrosus and nucleus pulposus is critical for the quality of the simulation. We developed a material property updating protocol, which is essentially a fitting algorithm consisting of finite element simulations and a quadratic response surface regression. This protocol was used to find material properties, such as the hydraulic permeability, elastic modulus, and Poisson's ratio, of intact and degenerated porcine discs. The results showed that the in vitro disc experimental deformations were well fitted with a limited number of finite element simulations and a quadratic response surface regression. The comparison of material properties of intact and degenerated discs showed that the hydraulic permeability significantly decreased while Poisson's ratio significantly increased for the degenerated discs. This study shows that the developed protocol is efficient and effective in defining material properties of a complex structure such as the intervertebral disc.
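
    The second ingredient of the protocol, a quadratic response surface, can be sketched as an ordinary least-squares fit over a handful of simulation outputs; here a cheap analytic function stands in for the FE solver, and all parameter names and ranges are hypothetical:

```python
import numpy as np

# Sample two scaled "material parameters" and evaluate a fake FE response.
rng = np.random.default_rng(0)
k = rng.uniform(0.1, 1.0, 30)        # e.g. hydraulic permeability (scaled, assumed)
E = rng.uniform(1.0, 10.0, 30)       # e.g. elastic modulus (scaled, assumed)

def fem_stand_in(k, E):              # stand-in for a real FE simulation
    return 2.0 + 0.5 * k - 0.3 * E + 0.8 * k**2 + 0.02 * E**2 + 0.1 * k * E

y = fem_stand_in(k, E)

# Quadratic response surface: y ≈ X·b with all terms up to second order.
X = np.column_stack([np.ones_like(k), k, E, k**2, E**2, k * E])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 3))                # recovers the coefficients of the true surface
```

    Once fitted, the surrogate is evaluated instead of the FE model inside the optimisation loop, which is where the computational saving comes from.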

  5. Transient analysis using conical shell elements

    NASA Technical Reports Server (NTRS)

    Yang, J. C. S.; Goeller, J. E.; Messick, W. T.

    1973-01-01

    The use of the NASTRAN conical shell element in static, eigenvalue, and direct transient analyses is demonstrated. The results of a NASTRAN static solution of an externally pressurized ring-stiffened cylinder agree well with a theoretical discontinuity analysis. Good agreement is also obtained between the NASTRAN direct transient response of a uniform cylinder to a dynamic end load and one-dimensional solutions obtained using a method of characteristics stress wave code and a standing wave solution. Finally, a NASTRAN eigenvalue analysis is performed on a hydroballistic model idealized with conical shell elements.

  6. Dynamic analysis and vibration testing of CFRP drive-line system used in heavy-duty machine tool

    NASA Astrophysics Data System (ADS)

    Yang, Mo; Gui, Lin; Hu, Yefa; Ding, Guoping; Song, Chunsheng

    2018-03-01

    Low critical rotary speed and large vibration in the metal drive-line system of a heavy-duty machine tool seriously affect machining precision. Replacing the metal drive-line with a CFRP drive-line can effectively solve this problem. Based on composite laminate theory and the transfer matrix method (TMM), this paper puts forward a modified TMM to analyze the dynamic characteristics of a CFRP drive-line system. With this modified TMM, the CFRP drive-line of a heavy vertical miller is analyzed, and a finite element modal analysis model of the shafting is established. The results of the modified TMM and finite element analysis (FEA) show that the modified TMM can effectively predict the critical rotary speed of the CFRP drive-line, and that the critical rotary speed of the CFRP drive-line is 20% higher than that of the original metal drive-line. The vibration of the CFRP and the metal drive-lines was then tested. The test results show that application of the CFRP drive shaft in the drive-line can effectively reduce the vibration of the heavy-duty machine tool.
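
    The physical driver of such a gain is specific stiffness: for a simply supported uniform shaft the first critical speed is ω = (π/L)²·√(EI/ρA), so for fixed geometry it scales as √(E/ρ). A back-of-envelope comparison with assumed (not the paper's) material data:

```python
import math

def critical_speed_rpm(E, rho, d_out, d_in, L):
    """First bending critical speed of a simply supported hollow shaft."""
    I = math.pi * (d_out**4 - d_in**4) / 64      # area moment of inertia, m⁴
    A = math.pi * (d_out**2 - d_in**2) / 4       # cross-sectional area, m²
    omega = (math.pi / L) ** 2 * math.sqrt(E * I / (rho * A))
    return omega * 60.0 / (2.0 * math.pi)        # rad/s → rpm

# Hypothetical shaft geometry and material data (not from the paper).
steel = critical_speed_rpm(E=210e9, rho=7850, d_out=0.10, d_in=0.08, L=3.0)
cfrp = critical_speed_rpm(E=120e9, rho=1600, d_out=0.10, d_in=0.08, L=3.0)
ratio = cfrp / steel
print(round(ratio, 2))
```

    The actual improvement for a laminated CFRP shaft depends on the layup, which is what the modified TMM in the paper accounts for.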

  7. The Complexity Analysis Tool

    DTIC Science & Technology

    1988-10-01

    This report provides an overview of the Complexity Analysis Tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), and Ada (subset, UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths.

  8. Single cell elemental analysis using nuclear microscopy

    NASA Astrophysics Data System (ADS)

    Ren, M. Q.; Thong, P. S. P.; Kara, U.; Watt, F.

    1999-04-01

    The use of Particle Induced X-ray Emission (PIXE), Rutherford Backscattering Spectrometry (RBS) and Scanning Transmission Ion Microscopy (STIM) to provide quantitative elemental analysis of single cells is an area of high potential, particularly when trace elements such as Ca, Fe, Zn and Cu can be monitored. We describe the methodology of sample preparation for two cell types, the procedures of cell imaging using STIM, and the quantitative elemental analysis of single cells using RBS and PIXE. Recent work on single cells at the Nuclear Microscopy Research Centre, National University of Singapore has centred around two research areas: (a) apoptosis (programmed cell death), which has recently been implicated in a wide range of pathological conditions such as cancer and Parkinson's disease, and (b) malaria (infection of red blood cells by the malaria parasite). Firstly we present results on the elemental analysis of human Chang liver cells (ATCC CCL 13), where vanadium ions were used to trigger apoptosis, and demonstrate that nuclear microscopy has the capability of monitoring vanadium loading within individual cells. Secondly we present results on the elemental changes taking place in individual mouse red blood cells which have been infected with the malaria parasite and treated with the anti-malaria drug Qinghaosu (QHS).

  9. Multi-mission space vehicle subsystem analysis tools

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Wood, E.

    2003-01-01

    Spacecraft engineers often rely on specialized simulation tools to facilitate the analysis, design and operation of space systems. Unfortunately, these tools are often designed for one phase of a single mission and cannot be easily adapted to other phases or other missions. The Multi-Mission Space Vehicle Subsystem Analysis Tools are designed to provide a solution to this problem.

  10. Paediatric Automatic Phonological Analysis Tools (APAT).

    PubMed

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps to fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.
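
    The simplest of the reported reliability figures, point-by-point percentage agreement between two judges, can be computed directly (hypothetical transcription decisions; the study also used ICC for content validity):

```python
# Two judges' decisions on the same ten segments (hypothetical data).
judge_a = ["p", "t", "k", "s", "ʃ", "t", "p", "k", "s", "t"]
judge_b = ["p", "t", "k", "s", "s", "t", "p", "k", "s", "t"]

# Percentage agreement: matching decisions over total decisions.
agreement = sum(a == b for a, b in zip(judge_a, judge_b)) / len(judge_a)
print(f"{agreement:.0%}")   # → 90%
```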

  11. Finite element analysis of osteoporosis models based on synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study the characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the trabecular structure in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. That led to an increase of the trabecular separation (from 45.05 μm to 97.09 μm) and a reduction of the trabecular number (from 7.99 mm⁻¹ to 5.97 mm⁻¹). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' resistance to compression, bending and torsion gradually became weaker. The compression stiffness of femurs decreased from 1770.96 F μm⁻¹ to 697.41 F μm⁻¹, and the bending and torsion stiffness decreased from 1390.80 F μm⁻¹ to 566.11 F μm⁻¹ and from 2957.28 N·m/° to 691.31 N·m/° respectively, indicating the decrease of bone strength and matching the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.
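
    The first of the reported indices, bone volume fraction BV/TV, is straightforward to compute from a segmented volume; the sketch below uses a synthetic random volume rather than real synchrotron data:

```python
import numpy as np

# Synthetic segmented micro-CT volume: True voxels represent bone.
# (~69% bone, mimicking the healthy BV/TV reported above; illustrative only.)
rng = np.random.default_rng(1)
volume = rng.random((64, 64, 64)) < 0.69

# Bone volume fraction: bone voxels over total voxels.
bv_tv = volume.sum() / volume.size
print(round(float(bv_tv), 2))
```

    On real data the segmentation step (thresholding the reconstructed greyscale volume) dominates the uncertainty of this index.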

  12. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.

  13. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide Large Hadron Collider Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Good CPU efficiency for the end users' analysis jobs requires that the performance of the storage system is able to scale with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the performance due to site configuration changes, since the analysis workflow is kept the same for all sites and for months in time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and by improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done on other CMS Tier sites, since on average the CPU efficiency for CMSSW jobs has increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I
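
    The CPU efficiency quoted above is conventionally the ratio of CPU time to wall-clock time summed over jobs; a toy computation with hypothetical job records:

```python
# (cpu_seconds, wall_seconds) per job -- hypothetical records, not HIP data.
jobs = [(3500.0, 3900.0), (3400.0, 3600.0), (3450.0, 3500.0)]

# Aggregate efficiency: total CPU time over total wall-clock time.
# Jobs stalled on storage I/O accumulate wall time but not CPU time,
# which is why SE tuning shows up directly in this metric.
eff = sum(cpu for cpu, wall in jobs) / sum(wall for cpu, wall in jobs)
print(f"{eff:.1%}")
```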

  14. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  15. An interactive graphics system to facilitate finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Burk, R. C.; Held, F. H.

    1973-01-01

    The characteristics of an interactive graphics system to facilitate the finite element method of structural analysis are described. The finite element model analysis consists of three phases: (1) preprocessing (model generation), (2) problem solution, and (3) postprocessing (interpretation of results). The advantages of interactive graphics to finite element structural analysis are defined.

  16. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules, which uses residence time to determine if molecules stick and in which particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  17. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long, and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regard to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond-LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first-order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond-LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond-LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew

  18. Analysis and design of friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Jagadeesha, C. B.

    2016-12-01

    Since its inception, no formal analysis and design of the FSW tool has been reported; initial dimensions of an FSW tool are decided by educated guess. Optimum stresses on the tool pin were determined at optimized parameters for bead-on-plate welding on an AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; its life was determined to be 2.66×10⁵ cycles or revolutions. One can conclude that an arbitrarily dimensioned FSW tool generally has finite life and cannot be assumed to last indefinitely. In general, one can determine in advance the suitability of a tool and its material for FSW of the given workpiece materials by this analysis, in terms of the fatigue life of the tool.

  19. Toward transient finite element simulation of thermal deformation of machine tools in real-time

    NASA Astrophysics Data System (ADS)

    Naumann, Andreas; Ruprecht, Daniel; Wensch, Joerg

    2018-01-01

    Finite element models without simplifying assumptions can accurately describe the spatial and temporal distribution of heat in machine tools as well as the resulting deformation. In principle, this allows to correct for displacements of the Tool Centre Point and enables high precision manufacturing. However, the computational cost of FE models and restriction to generic algorithms in commercial tools like ANSYS prevents their operational use since simulations have to run faster than real-time. For the case where heat diffusion is slow compared to machine movement, we introduce a tailored implicit-explicit multi-rate time stepping method of higher order based on spectral deferred corrections. Using the open-source FEM library DUNE, we show that fully coupled simulations of the temperature field are possible in real-time for a machine consisting of a stock sliding up and down on rails attached to a stand.

  20. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  1. Prediction of Thermal Fatigue in Tooling for Die-casting Copper via Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Sakhuja, Amit; Brevick, Jerald R.

    2004-06-01

    Recent research by the Copper Development Association (CDA) has demonstrated the feasibility of die-casting electric motor rotors using copper. Electric motors using copper rotors are significantly more energy efficient relative to motors using aluminum rotors. However, one of the challenges in copper rotor die-casting is low tool life. Experiments have shown that the higher molten metal temperature of copper (1085 °C), as compared to aluminum (660 °C) accelerates the onset of thermal fatigue or heat checking in traditional H-13 tool steel. This happens primarily because the mechanical properties of H-13 tool steel decrease significantly above 650 °C. Potential approaches to mitigate the heat checking problem include: 1) identification of potential tool materials having better high temperature mechanical properties than H-13, and 2) reduction of the magnitude of cyclic thermal excursions experienced by the tooling by increasing the bulk die temperature. A preliminary assessment of alternative tool materials has led to the selection of nickel-based alloys Haynes 230 and Inconel 617 as potential candidates. These alloys were selected based on their elevated temperature physical and mechanical properties. Therefore, the overall objective of this research work was to predict the number of copper rotor die-casting cycles to the onset of heat checking (tool life) as a function of bulk die temperature (up to 650 °C) for Haynes 230 and Inconel 617 alloys. To achieve these goals, a 2D thermo-mechanical FEA was performed to evaluate strain ranges on selected die surfaces. The method of Universal Slopes (Strain Life Method) was then employed for thermal fatigue life predictions.
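
    The Method of Universal Slopes referred to above estimates fatigue life from tensile properties via the strain-life relation Δε = 3.5(σu/E)·N^−0.12 + D^0.6·N^−0.6. A sketch of inverting this relation for cycles to failure, with placeholder properties (not Haynes 230 or Inconel 617 data):

```python
import math

# Placeholder tensile properties at the bulk die temperature (assumed values).
sigma_u = 800e6        # ultimate tensile strength, Pa
E = 200e9              # elastic modulus, Pa
D = 0.4                # true ductility, ln(1/(1 - RA))

def strain_range(N):
    """Manson's universal slopes: total strain range at N cycles to failure."""
    return 3.5 * (sigma_u / E) * N ** -0.12 + D ** 0.6 * N ** -0.6

def cycles_to_failure(d_eps, lo=1.0, hi=1e9):
    """Invert the strain-life curve by bisection in log space
    (Δε is monotonically decreasing in N, so the bracket shrinks safely)."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if strain_range(mid) > d_eps:
            lo = mid
        else:
            hi = mid
    return mid

# Hypothetical strain range taken from a thermo-mechanical FEA of the die surface.
N = cycles_to_failure(0.004)
print(f"{N:.3g}")
```

    In the study's workflow, the strain ranges come from the 2D thermo-mechanical FEA at each candidate bulk die temperature, and the predicted N gives the cycles to the onset of heat checking.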

  2. Contact Stress Analysis of Spiral Bevel Gears Using Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A; Reddy, S.; Handschuh, R.

    1995-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  3. Finite element analysis of human joints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossart, P.L.; Hollerbach, K.

    1996-09-01

    Our work focuses on the development of finite element models (FEMs) that describe the biomechanics of human joints. Finite element modeling is becoming a standard tool in industrial applications. In highly complex problems such as those found in biomechanics research, however, the full potential of FEMs is just beginning to be explored, due to the absence of precise, high resolution medical data and the difficulties encountered in converting these enormous datasets into a form that is usable in FEMs. With increasing computing speed and memory available, it is now feasible to address these challenges. We address the first by acquiring data with a high resolution X-ray CT scanner, and the latter by developing semi-automated methods for generating the volumetric meshes used in the FEMs. Issues related to tomographic reconstruction, volume segmentation, the use of extracted surfaces to generate volumetric hexahedral meshes, and applications of the FEM are described.

  4. Design sensitivity analysis of boundary element substructures

    NASA Technical Reports Server (NTRS)

    Kane, James H.; Saigal, Sunil; Gallagher, Richard H.

    1989-01-01

    The ability to reduce or condense a three-dimensional model exactly, and then iterate on this reduced-size model representing the parts of the design that are allowed to change in an optimization loop, is discussed. The discussion presents the results obtained from an ongoing research effort to exploit the concept of substructuring within the structural shape optimization context using a Boundary Element Analysis (BEA) formulation. The first part contains a formulation for the exact condensation of portions of the overall boundary element model designated as substructures. The use of reduced boundary element models in shape optimization requires that structural sensitivity analysis can be performed. A reduced sensitivity analysis formulation is then presented that allows for the calculation of structural response sensitivities of both the substructured (reduced) and unsubstructured parts of the model. It is shown that this approach produces significant computational economy in the design sensitivity analysis and reanalysis process by facilitating the block triangular factorization and forward reduction and backward substitution of smaller matrices. The implementation of this formulation is discussed, and timings and accuracies of representative test cases are presented.

  5. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was developed originally at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate finite element models of grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM & diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  6. Parallel processing in finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1987-01-01

    A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).

  7. Two-Dimensional Nonlinear Finite Element Analysis of CMC Microstructures

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Goldberg, Robert K.; Bonacuse, Peter J.

    2012-01-01

    A research program has been developed to quantify the effects of the microstructure of a woven ceramic matrix composite and its variability on the effective properties and response of the material. In order to characterize and quantify the variations in the microstructure of a five harness satin weave, chemical vapor infiltrated (CVI) SiC/SiC composite material, specimens were serially sectioned and polished to capture images that detailed the fiber tows, matrix, and porosity. Open source quantitative image analysis tools were then used to isolate the constituents, from which two dimensional finite element models were generated which approximated the actual specimen section geometry. A simplified elastic-plastic model, wherein all stress above yield is redistributed to lower stress regions, is used to approximate the progressive damage behavior for each of the composite constituents. Finite element analyses under in-plane tensile loading were performed to examine how the variability in the local microstructure affected the macroscopic stress-strain response of the material as well as the local initiation and progression of damage. The macroscopic stress-strain response appeared to be minimally affected by the variation in local microstructure, but the locations where damage initiated and propagated appeared to be linked to specific aspects of the local microstructure.
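    The redistribution rule described above (stress above yield shed onto regions still below yield, preserving the total load) can be caricatured on a 1D stress array. This sketch is illustrative only and is not the authors' code.

```python
import numpy as np

def redistribute(stress, yield_stress, max_iter=100, tol=1e-12):
    """Cap element stresses at yield and shed the excess uniformly onto
    sub-yield regions, preserving the total (toy 1D version)."""
    s = np.asarray(stress, dtype=float).copy()
    for _ in range(max_iter):
        over = s > yield_stress
        excess = float(np.sum(s[over] - yield_stress))
        if excess <= tol:
            break                      # nothing left above yield
        s[over] = yield_stress
        under = ~over
        if not under.any():
            break                      # section fully yielded
        s[under] += excess / np.count_nonzero(under)
    return s
```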

  8. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and in regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  9. How Do Tissues Respond and Adapt to Stresses Around a Prosthesis? A Primer on Finite Element Stress Analysis for Orthopaedic Surgeons

    PubMed Central

    Brand, Richard A; Stanford, Clark M; Swan, Colby C

    2003-01-01

    Joint implant design clearly affects long-term outcome. While many implant designs have been empirically based, finite element analysis has the potential to identify beneficial and deleterious features prior to clinical trials. Finite element analysis is a powerful analytic tool allowing computation of the stress and strain distribution throughout an implant construct. Whether it is useful depends upon many assumptions and details of the model; chief among them is whether the stresses or strains computed under a limited set of loading conditions relate to outcome, since ultimate failure is related to biological factors in addition to mechanical ones, and since the mechanical causes of failure are related to load history rather than to a few loading conditions. Newer approaches can minimize this and the many other model limitations. If the surgeon is to critically and properly interpret the results in scientific articles and sales literature, he or she must have a fundamental understanding of finite element analysis. We outline here the major capabilities of finite element analysis, as well as its assumptions and limitations. PMID:14575244

  10. Analysis Tools (AT)

    Treesearch

    Larry J. Gangi

    2006-01-01

    The FIREMON Analysis Tools program is designed to let the user perform grouped or ungrouped summary calculations of single measurement plot data, or statistical comparisons of grouped or ungrouped plot data taken at different sampling periods. The program allows the user to create reports and graphs, save and print them, or cut and paste them into a word processor....

  11. eShadow: A tool for comparing closely related sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovcharenko, Ivan; Boffelli, Dario; Loots, Gabriela G.

    2004-01-15

    Primate sequence comparisons are difficult to interpret due to the high degree of sequence similarity shared between such closely related species. Recently, a novel method, phylogenetic shadowing, has been pioneered for predicting functional elements in the human genome through the analysis of multiple primate sequence alignments. We have expanded this theoretical approach to create a computational tool, eShadow, for the identification of elements under selective pressure in multiple sequence alignments of closely related genomes, such as in comparisons of human to primate or mouse to rat DNA. This tool integrates two different statistical methods and allows for the dynamic visualization of the resulting conservation profile. eShadow also includes a versatile optimization module capable of training the underlying Hidden Markov Model to differentially predict functional sequences. This module grants the tool high flexibility in the analysis of multiple sequence alignments and in comparing sequences with different divergence rates. Here, we describe the eShadow comparative tool and its potential uses for analyzing both multiple nucleotide and protein alignments to predict putative functional elements. The eShadow tool is publicly available at http://eshadow.dcode.org/
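    The shadowing idea of labeling alignment columns as conserved (slow) versus neutral (fast) with a hidden Markov model can be illustrated with a toy two-state Viterbi decoder. The states, probabilities, and binary match/mismatch encoding below are invented for illustration and are not eShadow's actual model.

```python
import numpy as np

states = ("conserved", "neutral")
log_trans = np.log(np.array([[0.9, 0.1],
                             [0.1, 0.9]]))   # row: from-state, col: to-state
log_emit = np.log(np.array([[0.95, 0.05],    # conserved: P(match), P(mismatch)
                            [0.60, 0.40]]))  # neutral

def viterbi(obs):
    """obs: sequence of 0 (column matches across species) / 1 (mismatch).
    Returns the most likely state path."""
    obs = list(obs)
    v = log_emit[:, obs[0]] + np.log(0.5)    # uniform start distribution
    back = []
    for o in obs[1:]:
        scores = v[:, None] + log_trans      # scores[i, j]: best path via i -> j
        back.append(scores.argmax(axis=0))
        v = scores.max(axis=0) + log_emit[:, o]
    path = [int(v.argmax())]
    for b in reversed(back):                 # trace pointers backwards
        path.append(int(b[path[-1]]))
    path.reverse()
    return [states[i] for i in path]
```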

  12. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed, as are future directions in the study of microwear on stone and bone tools.

  13. Finite Element Analysis of Flexural Vibrations in Hard Disk Drive Spindle Systems

    NASA Astrophysics Data System (ADS)

    LIM, SEUNGCHUL

    2000-06-01

    This paper is concerned with the flexural vibration analysis of the hard disk drive (HDD) spindle system by means of the finite element method. In contrast to previous research, every system component is here analytically modelled taking into account its structural flexibility and also the centrifugal effect particularly on the disk. To prove the effectiveness and accuracy of the formulated models, commercial HDD systems with two and three identical disks are selected as examples. Then their major natural modes are computed with only a small number of element meshes as the shaft rotational speed is varied, and subsequently compared with the existing numerical results obtained using other methods and newly acquired experimental ones. Based on such a series of studies, the proposed method can be concluded as a very promising tool for the design of HDDs and various other high-performance computer disk drives such as floppy disk drives, CD ROM drives, and their variations having spindle mechanisms similar to those of HDDs.

  14. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  15. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, platform-independent open-source software, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.
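    The module-based, object-oriented pattern the abstract describes can be sketched as a common interface plus a driver that chains disciplines over shared state. The class names, fields, and toy formulas below are hypothetical illustrations, not SAPE's actual API.

```python
class AnalysisModule:
    """Common interface shared by all discipline modules (illustrative)."""
    name = "base"
    def run(self, state):
        raise NotImplementedError

class Aerodynamics(AnalysisModule):
    name = "aerodynamics"
    def run(self, state):
        # toy drag estimate from dynamic pressure
        state["drag_n"] = state["q_pa"] * state["area_m2"] * state["cd"]
        return state

class StructuralSizing(AnalysisModule):
    name = "structures"
    def run(self, state):
        # size the structure to the drag load with a 1.5 safety factor
        state["design_load_n"] = 1.5 * state["drag_n"]
        return state

def run_pipeline(modules, state):
    """Drive the modules in order, threading one shared state dict through."""
    for m in modules:
        state = m.run(state)
    return state
```

    Swapping a module for a higher-fidelity implementation of the same interface is what makes a variable-fidelity framework of this shape work.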

  16. Finite Element Modelling and Analysis of Conventional Pultrusion Processes

    NASA Astrophysics Data System (ADS)

    Akishin, P.; Barkanov, E.; Bondarchuk, A.

    2015-11-01

    Pultrusion is one of many composite manufacturing techniques and one of the most efficient methods for producing fiber reinforced polymer composite parts with a constant cross-section. Numerical simulation is helpful for understanding the manufacturing process and developing scientific means for the pultrusion tooling design. A numerical technique based on the finite element method has been developed for the simulation of pultrusion processes. It uses the general-purpose finite element software ANSYS Mechanical. It is shown that the developed technique predicts the temperature and cure profiles, which are in good agreement with those published in the open literature.
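    The cure profile such simulations predict comes from integrating a resin cure-kinetics law along the die. As a minimal sketch (not the paper's ANSYS model), the degree of cure can be advanced with an nth-order Arrhenius rate at a prescribed die temperature; all constants below are illustrative.

```python
import math

# Illustrative nth-order Arrhenius cure kinetics: d(alpha)/dt = A*exp(-E/RT)*(1-alpha)^n
A, E, R, n = 1.0e5, 6.0e4, 8.314, 1.0   # 1/s, J/mol, J/(mol K), reaction order

def cure_profile(T_kelvin, t_end=600.0, dt=0.1):
    """Explicit-Euler integration of the degree of cure at constant temperature."""
    alpha, t, out = 0.0, 0.0, []
    while t <= t_end:
        rate = A * math.exp(-E / (R * T_kelvin)) * (1.0 - alpha) ** n
        alpha = min(1.0, alpha + dt * rate)   # degree of cure cannot exceed 1
        t += dt
        out.append(alpha)
    return out
```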

  17. Debugging and Performance Analysis Software Tools for Peregrine System

    Science.gov Websites

    Learn about the debugging and performance analysis software tools available for use with NREL's Peregrine high-performance computing system, such as Allinea.

  18. Comparative trace elemental analysis of cancerous and non-cancerous tissues of rectal cancer patients using PIXE

    NASA Astrophysics Data System (ADS)

    Naga Raju, G. J.; Sarita, P.; Murthy, K. S. R.

    2017-08-01

    Particle Induced X-ray Emission (PIXE), an accelerator-based analytical technique, has been employed in this work for the analysis of trace elements in the cancerous and non-cancerous tissues of rectal cancer patients. A beam of 3 MeV protons generated from the 3 MV Pelletron accelerator at the Ion Beam Laboratory of the Institute of Physics, Bhubaneswar, India was used as the projectile to excite the atoms present in the tissue samples. PIXE, with its capability to detect several elements simultaneously at very low concentrations, offers an excellent tool for trace element analysis. The characteristic X-rays emitted by the samples were recorded by a high-resolution Si (Li) detector. On the basis of the PIXE spectrum obtained for each sample, the elements Cl, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, and Br were identified and their relative concentrations were estimated in the cancerous and non-cancerous tissues of the rectum. The levels of Mn, Fe, Co, Cu, Zn, and As were higher (p < 0.005) while the levels of Ca, Cr and Ni were lower (p < 0.005) in the cancer tissues relative to the normal tissues. The alterations in the levels of the trace elements observed in the present work are discussed with respect to their potential role in the initiation, promotion and inhibition of cancer of the rectum.
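    Group comparisons of this kind (cancerous versus non-cancerous concentrations of one element) are commonly summarized with a two-sample statistic. A minimal Welch t-statistic on synthetic numbers is sketched below; the data are invented and this is not the authors' analysis code.

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    va = stdev(a) ** 2 / len(a)
    vb = stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)
```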

  19. Finite Element Analysis (FEA) in Design and Production.

    ERIC Educational Resources Information Center

    Waggoner, Todd C.; And Others

    1995-01-01

    Finite element analysis (FEA) enables industrial designers to analyze complex components by dividing them into smaller elements, then assessing stress and strain characteristics. Traditionally mainframe based, FEA is being increasingly used in microcomputers. (SK)

  20. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large...was examined by SOAP to confirm the conjunction. STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK ...run with each tool. When attempting to perform the seven day all vs all analysis with STK Advanced CAT, the program consistently crashed during report

  1. TECHNICAL NOTE: Direct finite-element analysis of the frequency response of a Y-Z lithium niobate SAW filter

    NASA Astrophysics Data System (ADS)

    Xu, Guanshui

    2000-12-01

    A direct finite-element model is developed for the full-scale analysis of the electromechanical phenomena involved in surface acoustic wave (SAW) devices. The equations of wave propagation in piezoelectric materials are discretized using the Galerkin method, in which an implicit algorithm of the Newmark family with unconditional stability is implemented. The Rayleigh damping coefficients are included in the elements near the boundary to reduce the influence of the reflection of waves. The performance of the model is demonstrated by the analysis of the frequency response of a Y-Z lithium niobate filter with two uniform ports, with emphasis on the influence of the number of electrodes. The frequency response of the filter is obtained through the Fourier transform of the impulse response, which is solved directly from the finite-element simulation. It shows that the finite-element results are in good agreement with the characteristic frequency response of the filter predicted by the simple phase-matching argument. The ability of the method to evaluate the influence of the bulk waves at the high-frequency end of the filter passband and the influence of the number of electrodes on insertion loss is noteworthy. We conclude that the direct finite-element analysis of SAW devices can be used as an effective tool for the design of high-performance SAW devices. Some practical computational challenges of finite-element modeling of SAW devices are discussed.
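    The post-processing step described, obtaining the frequency response as the Fourier transform of the simulated impulse response, can be sketched with a synthetic signal standing in for the FEM output. The sample rate, resonance frequency, and damping below are illustrative, not the paper's values.

```python
import numpy as np

fs = 1.0e9                          # sample rate, Hz (illustrative)
t = np.arange(4096) / fs
f0 = 50e6                           # toy resonance standing in for the SAW passband
impulse_response = np.exp(-t * 2e6) * np.sin(2 * np.pi * f0 * t)

# Frequency response = Fourier transform of the impulse response
H = np.fft.rfft(impulse_response)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
peak = freqs[np.argmax(np.abs(H))]  # response peaks near the resonance
```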

  2. Organic Elemental Analysis.

    ERIC Educational Resources Information Center

    Ma, T. S.; Wang, C. Y.

    1984-01-01

    Presents a literature review on methods used to analyze organic elements. Topic areas include methods for: (1) analyzing carbon, hydrogen, and nitrogen; (2) analyzing oxygen, sulfur, and halogens; (3) analyzing other elements; (4) simultaneously determining several elements; and (5) determining trace elements. (JN)

  3. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  4. Prediction and phylogenetic analysis of mammalian short interspersed elements (SINEs).

    PubMed

    Rogozin, I B; Mayorov, V I; Lavrentieva, M V; Milanesi, L; Adkison, L R

    2000-09-01

    The presence of repetitive elements can create serious problems for sequence analysis, especially in the case of homology searches in nucleotide sequence databases. Repetitive elements should be treated carefully by using special programs and databases. In this paper, various aspects of SINE (short interspersed repetitive element) identification, analysis and evolution are discussed.

  5. MutAIT: an online genetic toxicology data portal and analysis tools.

    PubMed

    Avancini, Daniele; Menzies, Georgina E; Morgan, Claire; Wills, John; Johnson, George E; White, Paul A; Lewis, Paul D

    2016-05-01

    Assessment of genetic toxicity and/or carcinogenic activity is an essential element of chemical screening programs employed to protect human health. Dose-response and gene mutation data are frequently analysed by industry, academia and governmental agencies for regulatory evaluations and decision making. Over the years, a number of efforts at different institutions have led to the creation and curation of databases to house genetic toxicology data, largely, with the aim of providing public access to facilitate research and regulatory assessments. This article provides a brief introduction to a new genetic toxicology portal called Mutation Analysis Informatics Tools (MutAIT) (www.mutait.org) that provides easy access to two of the largest genetic toxicology databases, the Mammalian Gene Mutation Database (MGMD) and TransgenicDB. TransgenicDB is a comprehensive collection of transgenic rodent mutation data initially compiled and collated by Health Canada. The updated MGMD contains approximately 50 000 individual mutation spectral records from the published literature. The portal not only gives access to an enormous quantity of genetic toxicology data, but also provides statistical tools for dose-response analysis and calculation of benchmark dose. Two important R packages for dose-response analysis are provided as web-distributed applications with user-friendly graphical interfaces: the 'drsmooth' package performs dose-response shape analysis and determines various points of departure (PoD) metrics, and the 'PROAST' package provides algorithms for dose-response modelling. The MutAIT statistical tools, which are currently being enhanced, provide users with an efficient and comprehensive platform to conduct quantitative dose-response analyses and determine PoD values that can then be used to calculate human exposure limits or margins of exposure. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved.
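    To make the benchmark-dose idea concrete, here is a toy calculation under an assumed exponential dose-response model; this is not how drsmooth or PROAST work internally, and the model form and numbers are hypothetical.

```python
import math

def bmd(b, bmr=0.10):
    """Benchmark dose for the assumed model f(d) = a * exp(b * d): the dose
    giving a (1 + bmr)-fold increase over background. The background level a
    cancels out:  a*exp(b*d) = a*(1+bmr)  =>  d = ln(1+bmr) / b."""
    return math.log(1.0 + bmr) / b
```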

  6. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  7. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) refers to a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce large amounts of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering may be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their annotation in Gene Ontology using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to benefit from efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
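    The core filtering step the abstract describes, keeping only the MS protein hits that carry a given Gene Ontology annotation after merging with a public-annotation lookup, reduces to a small join. All identifiers and the in-memory "database" below are made up for illustration.

```python
# Mock public-annotation lookup: protein accession -> set of GO terms
annotations = {
    "P12345": {"GO:0005739", "GO:0006096"},
    "Q67890": {"GO:0005634"},
    "A11111": {"GO:0005739"},
}

def filter_by_go(hits, go_term):
    """Keep experimental hits annotated with go_term; unknown ids drop out."""
    return [p for p in hits if go_term in annotations.get(p, set())]
```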

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
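    The simulation loop described, executing time-ordered events until the event queue empties, with events able to schedule follow-on events, is the classic discrete-event pattern. A minimal sketch with a priority queue follows; the event names are invented and this is not the patented tool's implementation.

```python
import heapq

def simulate(initial_events):
    """Pop (time, name) events in time order until the queue is empty.
    One event type schedules a follow-on to show derived behavior."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, name = heapq.heappop(queue)
        log.append((t, name))
        if name == "valve_open":               # an event may enqueue effects
            heapq.heappush(queue, (t + 5.0, "flow_steady"))
    return log
```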

  9. Dcode.org anthology of comparative genomic tools.

    PubMed

    Loots, Gabriela G; Ovcharenko, Ivan

    2005-07-01

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.

  10. Design, analysis and testing of a new piezoelectric tool actuator for elliptical vibration turning

    NASA Astrophysics Data System (ADS)

    Lin, Jieqiong; Han, Jinguo; Lu, Mingming; Yu, Baojun; Gu, Yan

    2017-08-01

    A new piezoelectric tool actuator (PETA) for elliptical vibration turning has been developed based on a hybrid flexure hinge connection. Two double parallel four-bar linkage mechanisms and two right circular flexure hinges were chosen to guide the motion. The two input displacement directional stiffness were modeled according to the principle of virtual work modeling method and the kinematic analysis was conducted theoretically. Finite element analysis was used to carry out static and dynamic analyses. To evaluate the performance of the developed PETA, off-line experimental tests were carried out to investigate the step responses, motion strokes, resolutions, parasitic motions, and natural frequencies of the PETA along the two input directions. The relationship between input displacement and output displacement, as well as the tool tip’s elliptical trajectory in different phase shifts was analyzed. By using the developed PETA mechanism, micro-dimple patterns were generated as the preliminary application to demonstrate the feasibility and efficiency of PETA for elliptical vibration turning.

  11. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    function (Womer); unit cost as a function of learning and rate; learning with forgetting (Benkard), in which learning depreciates over time; discretionary... Analytical Tools for Affordability Analysis, David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882.

  12. Alphavirus replicon approach to promoterless analysis of IRES elements.

    PubMed

    Kamrud, K I; Custer, M; Dudek, J M; Owens, G; Alterson, K D; Lee, J S; Groebner, J L; Smith, J F

    2007-04-10

    Here we describe a system for promoterless analysis of putative internal ribosome entry site (IRES) elements using an alphavirus (family Togaviridae) replicon vector. The system uses the alphavirus subgenomic promoter to produce transcripts that, when modified to contain a spacer region upstream of an IRES element, allow analysis of cap-independent translation of genes of interest (GOI). If the IRES element is removed, translation of the subgenomic transcript can be reduced >95% compared to the same transcript containing a functional IRES element. Alphavirus replicons, used in this manner, offer an alternative to standard dicistronic DNA vectors or in vitro translation systems currently used to analyze putative IRES elements. In addition, protein expression levels varied depending on the spacer element located upstream of each IRES. The ability to modulate the level of expression from alphavirus vectors should extend the utility of these vectors in vaccine development.

  13. Alphavirus Replicon Approach to Promoterless Analysis of IRES Elements

    PubMed Central

    Kamrud, K.I.; Custer, M.; Dudek, J.M.; Owens, G.; Alterson, K.D.; Lee, J.S.; Groebner, J.L.; Smith, J.F.

    2007-01-01

    Here we describe a system for promoterless analysis of putative internal ribosome entry site (IRES) elements using an alphavirus (Family Togaviridae) replicon vector. The system uses the alphavirus subgenomic promoter to produce transcripts that, when modified to contain a spacer region upstream of an IRES element, allow analysis of cap-independent translation of genes of interest (GOI). If the IRES element is removed, translation of the subgenomic transcript can be reduced > 95 % compared to the same transcript containing a functional IRES element. Alphavirus replicons, used in this manner, offer an alternative to standard dicistronic DNA vectors or in-vitro translation systems currently used to analyze putative IRES elements. In addition, protein expression levels varied depending on the spacer element located upstream of each IRES. The ability to modulate the level of expression from alphavirus vectors should extend the utility of these vectors in vaccine development. PMID:17156813

  14. RADC SCAT automated sneak circuit analysis tool

    NASA Astrophysics Data System (ADS)

    Depalma, Edward L.

    The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst, so that prior experience with sneak analysis is not required. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

  15. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools offer various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  16. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation (also provided with the ETA Tool software release package) that were used to generate the reports presented in the manual.

  17. Finite element analysis of auditory characteristics in patients with middle ear diseases.

    PubMed

    Tu, Bo; Li, Xiaoping; Nie, Zhenhua; Shi, Changzheng; Li, Hengguo

    2017-07-01

    This study validates that a finite element model of the human ossicular chain and tympanic membrane can be used as an effective surgical assessment tool in clinics. The present study was performed to investigate the application of a finite element model of ossicular chain and tympanic membrane for fabrication of individualized artificial ossicles. Twenty patients (20 ears) who underwent surgery for middle ear disease and 10 healthy controls (10 ears) were enrolled in the hospital. Computed tomography (CT) and pure tone audiometry were performed before and after surgery. A finite element model was developed using CT scans, and correlation analysis was conducted between stapes displacement and surgical methods. An audiometric test was also performed for 14 patients before and after surgery. Stapes displacement in the healthy group (average = 3.31 × 10(-5) mm) was significantly greater than that in the impaired group (average = 1.41 × 10(-6) mm) prior to surgery. After surgery, the average displacement in the impaired group was 2.55 × 10(-6) mm, which represented a significant improvement. For the patients who underwent the audiometric test, 10 showed improved hearing after surgery, and stapes displacement increased in nine of these 10 patients.

  18. Biomechanical effects of maxillary expansion on a patient with cleft palate: A finite element analysis

    PubMed Central

    Lee, Haofu; Nguyen, Alan; Hong, Christine; Hoang, Paul; Pham, John; Ting, Kang

    2017-01-01

    Introduction: The aims of this study were to evaluate the effects of rapid palatal expansion on the craniofacial skeleton of a patient with unilateral cleft lip and palate (UCLP) and to predict the points of force application for optimal expansion using a 3-dimensional finite element model. Methods: A 3-dimensional finite element model of the craniofacial complex with UCLP was generated from spiral computed tomographic scans with imaging software (Mimics, version 13.1; Materialise, Leuven, Belgium). This model was imported into the finite element solver (version 12.0; ANSYS, Canonsburg, Pa) to evaluate transverse expansion forces from rapid palatal expansion. Finite element analysis was performed with transverse expansion to achieve 5 mm of anterolateral expansion of the collapsed minor segment to simulate correction of the anterior crossbite in a patient with UCLP. Results: High-stress concentrations were observed at the body of the sphenoid, medial to the orbit, and at the inferior area of the zygomatic process of the maxilla. The craniofacial stress distribution was asymmetric, with higher stress levels on the cleft side. When forces were applied more anteriorly on the collapsed minor segment and more posteriorly on the major segment, there was greater expansion of the anterior region of the minor segment with minimal expansion of the major segment. Conclusions: The transverse expansion forces from rapid palatal expansion are distributed to the 3 maxillary buttresses. Finite element analysis is an appropriate tool to study and predict the points of force application for better controlled expansion in patients with UCLP. PMID:27476365

  19. Biomechanical effects of maxillary expansion on a patient with cleft palate: A finite element analysis.

    PubMed

    Lee, Haofu; Nguyen, Alan; Hong, Christine; Hoang, Paul; Pham, John; Ting, Kang

    2016-08-01

    The aims of this study were to evaluate the effects of rapid palatal expansion on the craniofacial skeleton of a patient with unilateral cleft lip and palate (UCLP) and to predict the points of force application for optimal expansion using a 3-dimensional finite element model. A 3-dimensional finite element model of the craniofacial complex with UCLP was generated from spiral computed tomographic scans with imaging software (Mimics, version 13.1; Materialise, Leuven, Belgium). This model was imported into the finite element solver (version 12.0; ANSYS, Canonsburg, Pa) to evaluate transverse expansion forces from rapid palatal expansion. Finite element analysis was performed with transverse expansion to achieve 5 mm of anterolateral expansion of the collapsed minor segment to simulate correction of the anterior crossbite in a patient with UCLP. High-stress concentrations were observed at the body of the sphenoid, medial to the orbit, and at the inferior area of the zygomatic process of the maxilla. The craniofacial stress distribution was asymmetric, with higher stress levels on the cleft side. When forces were applied more anteriorly on the collapsed minor segment and more posteriorly on the major segment, there was greater expansion of the anterior region of the minor segment with minimal expansion of the major segment. The transverse expansion forces from rapid palatal expansion are distributed to the 3 maxillary buttresses. Finite element analysis is an appropriate tool to study and predict the points of force application for better controlled expansion in patients with UCLP. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  20. Nonlinear Finite Element Analysis of Shells with Large Aspect Ratio

    NASA Technical Reports Server (NTRS)

    Chang, T. Y.; Sawamiphakdi, K.

    1984-01-01

    A higher order degenerated shell element with nine nodes was selected for large deformation and post-buckling analysis of thick or thin shells. Elastic-plastic material properties are also included. The post-buckling analysis algorithm is given. Using a square plate, it was demonstrated that the nine-node element does not exhibit shear locking even when its aspect ratio is increased to the order of 10 to the 8th power. Two sample problems are given to illustrate the analysis capability of the shell element.

  1. Intervertebral disc biomechanical analysis using the finite element modeling based on medical images.

    PubMed

    Li, Haiyun; Wang, Zheng

    2006-01-01

    In this paper, a 3D geometric model of the lumbar intervertebral disc is presented, which integrates anatomical structure derived from spine CT and MRI data. Based on the geometric model, a 3D finite element model of an L1-L2 segment was created. Loads, which simulate the pressure from above, were applied to the FEM, while a boundary condition describing the relative L1-L2 displacement is imposed on the FEM to account for 3D physiological states. The simulation calculation illustrates the stress and strain distribution and deformation of the spine. The method has two characteristics compared to previous studies: first, the finite element model of the lumbar spine is based on data derived directly from medical images such as CTs and MRIs; second, the results of the analysis are more accurate than those obtained from generic geometric parameters. The FEM provides a promising tool in clinical diagnosis and for optimizing individual therapy for intervertebral disc herniation.

  2. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    PubMed

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    The availability of a deep well that penetrates deep into the Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study the metamorphic rocks. One such borehole, the Chinese Continental Scientific Drilling Main Hole, is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of eastern China. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ, gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are accurate and adequate, and can be tremendously useful in the analysis of UHP metamorphic rocks. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives, respectively. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
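    The forward-mode differentiation that a tool like ADIFOR performs (there by Fortran source transformation) can be sketched with dual numbers; the toy response function below is invented for illustration and is not from the report:

    ```python
    # Minimal forward-mode automatic differentiation with dual numbers.
    # Sketches how exact first-order derivatives of a response with
    # respect to a design parameter are obtained (illustration only;
    # ADIFOR itself transforms Fortran source code).

    class Dual:
        """Number carrying a value and its derivative d(value)/d(param)."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.dot * o.val + self.val * o.dot)  # product rule
        __rmul__ = __mul__

    def response(t):
        # Toy "system response", e.g. a deflection vs. a thickness t:
        # u(t) = 3*t^2 + 2*t  ->  du/dt = 6*t + 2
        return 3 * t * t + 2 * t

    # Seed dot = 1 to differentiate with respect to t, evaluated at t = 2.
    u = response(Dual(2.0, 1.0))
    print(u.val)   # 16.0
    print(u.dot)   # 14.0
    ```

    Second-order derivatives follow the same pattern with a second derivative slot, which is one reason memory use grows with derivative order.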

  4. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.

  5. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  6. Minos as a novel Tc1/mariner-type transposable element for functional genomic analysis in Aspergillus nidulans.

    PubMed

    Evangelinos, Minoas; Anagnostopoulos, Gerasimos; Karvela-Kalogeraki, Iliana; Stathopoulou, Panagiota M; Scazzocchio, Claudio; Diallinas, George

    2015-08-01

    Transposons constitute powerful genetic tools for gene inactivation, exon or promoter trapping and genome analyses. The Minos element from Drosophila hydei, a Tc1/mariner-like transposon, has proved to be a very efficient tool for heterologous transposition in several metazoa. In filamentous fungi, only a handful of fungal-specific transposable elements have been exploited as genetic tools, with the impala Tc1/mariner element from Fusarium oxysporum being the most successful. Here, we developed a two-component transposition system to manipulate Minos transposition in Aspergillus nidulans (AnMinos). Our system allows direct selection of transposition events based on re-activation of niaD, a gene necessary for growth on nitrate as a nitrogen source. On average, among 10(8) conidiospores, we obtain up to ∼0.8×10(2) transposition events leading to the expected revertant phenotype (niaD(+)), while ∼16% of excision events lead to AnMinos loss. Characterized excision footprints consisted of the four terminal bases of the transposon flanked by the TA target duplication and led to no major DNA rearrangements. AnMinos transposition depends on the presence of its homologous transposase. Its frequency was not significantly affected by temperature, UV irradiation or the transcription status of the original integration locus (niaD). Importantly, transposition is dependent on nkuA, encoding an enzyme essential for non-homologous end joining of DNA in double-strand break repair. AnMinos proved to be an efficient tool for functional analysis, as it seems to transpose into different genomic loci on all chromosomes, including a high proportion of integration events within or close to genes. We have used Minos to obtain morphological and toxic analogue resistant mutants. Interestingly, among morphological mutants some seem to be due to Minos-elicited over-expression of specific genes, rather than gene inactivation. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Nitinol Embolic Protection Filters: Design Investigation by Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Conti, Michele; de Beule, Matthieu; Mortier, Peter; van Loo, Denis; Verdonck, Pascal; Vermassen, Frank; Segers, Patrick; Auricchio, Ferdinando; Verhegghe, Benedict

    2009-08-01

    The widespread acceptance of carotid artery stenting (CAS) to treat carotid artery stenosis and its effectiveness compared with its surgical counterpart, carotid endarterectomy (CEA), is still a matter of debate. Transient or permanent neurological deficits may develop in patients undergoing CAS due to distal embolization or hemodynamic changes. Design, development, and usage of embolic protection devices (EPDs), such as embolic protection filters, appear to have a significant impact on the success of CAS. Unfortunately, some drawbacks, such as filtering failure, inability to cross tortuous high-grade stenoses, malpositioning and vessel injury, still remain and require design improvement. Currently, many different designs of such devices are available on the rapidly growing dedicated market. In spite of such a growing commercial interest, there is a significant need for design tools as well as for careful engineering investigations and design analyses of such nitinol devices. The present study aims to investigate the embolic protection filter design by finite element analysis. We first developed a parametrical computer-aided design model of an embolic filter based on micro-CT scans of the Angioguard™ XP (Cordis Endovascular, FL) EPD by means of the open source pyFormex software. Subsequently, we used the finite element method to simulate the deployment of the nitinol filter as it exits the delivery sheath. Micro-CT images of the real device exiting the catheter showed excellent correspondence with our simulations. Finally, we evaluated circumferential basket-vessel wall apposition of a 4 mm size filter in a straight vessel of different sizes and shape. We conclude that the proposed methodology offers a useful tool to evaluate and to compare current or new designs of EPDs. Further simulations will investigate vessel wall apposition in a realistic tortuous anatomy.

  8. Analysis of random structure-acoustic interaction problems using coupled boundary element and finite element methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Pates, Carl S., III

    1994-01-01

    A coupled boundary element (BEM)-finite element (FEM) approach is presented to accurately model structure-acoustic interaction systems. The boundary element method is first applied to interior, two and three-dimensional acoustic domains with complex geometry configurations. Boundary element results are very accurate when compared with limited exact solutions. Structure-interaction problems are then analyzed with the coupled FEM-BEM method, where the finite element method models the structure and the boundary element method models the interior acoustic domain. The coupled analysis is compared with exact and experimental results for a simplistic model. Composite panels are analyzed and compared with isotropic results. The coupled method is then extended for random excitation. Random excitation results are compared with uncoupled results for isotropic and composite panels.

  9. Physical Education Curriculum Analysis Tool (PECAT)

    ERIC Educational Resources Information Center

    Lee, Sarah M.; Wechsler, Howell

    2006-01-01

    The Physical Education Curriculum Analysis Tool (PECAT) will help school districts conduct a clear, complete, and consistent analysis of written physical education curricula, based upon national physical education standards. The PECAT is customizable to include local standards. The results from the analysis can help school districts enhance…

  10. [Principal component analysis and cluster analysis of inorganic elements in sea cucumber Apostichopus japonicus].

    PubMed

    Liu, Xiao-Fang; Xue, Chang-Hu; Wang, Yu-Ming; Li, Zhao-Jie; Xue, Yong; Xu, Jie

    2011-11-01

    The present study investigates the feasibility of multi-element analysis in the determination of the geographical origin of sea cucumber Apostichopus japonicus, and identifies effective tracers for sea cucumber Apostichopus japonicus geographical origin assessment. The content of the elements Al, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Mo, Cd, Hg and Pb in sea cucumber Apostichopus japonicus samples from seven places of geographical origin was determined by means of ICP-MS. The results were used for the development of an elements database. Cluster analysis (CA) and principal component analysis (PCA) were applied to differentiate the sea cucumber Apostichopus japonicus geographical origin. Three principal components which accounted for over 89% of the total variance were extracted from the standardized data. The results of Q-type cluster analysis showed that the 26 samples could be clustered reasonably into five groups, and the classification results were significantly associated with the marine distribution of the sea cucumber Apostichopus japonicus samples. The CA and PCA were effective methods for elements analysis of sea cucumber Apostichopus japonicus samples. The contents of the mineral elements in sea cucumber Apostichopus japonicus samples were good chemical descriptors for differentiating their geographical origins.
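    The PCA step described above can be sketched in a few lines of NumPy; the data below are synthetic stand-ins for the ICP-MS element concentrations (26 samples × 15 elements), not the study's measurements:

    ```python
    import numpy as np

    # Synthetic element-concentration table: rows = samples, columns =
    # elements (Al, V, Cr, ...). Two latent "origin" factors plus noise,
    # invented purely to illustrate the PCA workflow.
    rng = np.random.default_rng(0)
    n = 26                                  # samples, as in the study
    base = rng.normal(size=(n, 2))          # latent origin factors
    loadings = rng.normal(size=(2, 15))     # how factors map to elements
    X = base @ loadings + 0.1 * rng.normal(size=(n, 15))

    # Standardize each element, then diagonalize the covariance of the
    # standardized data (approximately the correlation matrix).
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Z, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)      # eigh returns ascending order
    evals = evals[::-1]                     # largest eigenvalue first
    explained = evals / evals.sum()

    scores = Z @ evecs[:, ::-1]             # principal component scores
    print(f"variance explained by first 3 PCs: {explained[:3].sum():.2%}")
    ```

    Cluster analysis would then group the rows of `scores` (e.g. hierarchically), mirroring the Q-type clustering of the 26 samples.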

  11. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
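    The multiple-branch event-tree idea can be sketched as a small recursive evaluation; the branches and probabilities below are invented for illustration and are not EXPLORIS values:

    ```python
    # Minimal event-tree evaluation: an internal node is (name, branches),
    # where branches maps a label to (probability, subtree); a leaf is an
    # outcome name. Tree shape and numbers are hypothetical.
    tree = ("unrest", {
        "eruption":    (0.3, ("style", {
            "explosive": (0.4, "pyroclastic_flow_risk"),
            "effusive":  (0.6, "lava_flow_risk"),
        })),
        "no_eruption": (0.7, "quiescence"),
    })

    def outcome_probs(node, p=1.0, acc=None):
        """Accumulate outcome probabilities by multiplying along paths."""
        acc = {} if acc is None else acc
        if isinstance(node, str):               # leaf outcome
            acc[node] = acc.get(node, 0.0) + p
            return acc
        _name, branches = node
        for prob, sub in branches.values():
            outcome_probs(sub, p * prob, acc)
        return acc

    probs = outcome_probs(tree)
    print(probs)
    ```

    With these numbers the outcome probabilities are 0.3×0.4, 0.3×0.6 and 0.7, and they sum to 1, a useful sanity check when probabilities are elicited branch by branch.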

  12. Integral finite element analysis of turntable bearing with flexible rings

    NASA Astrophysics Data System (ADS)

    Deng, Biao; Liu, Yunfei; Guo, Yuan; Tang, Shengjin; Su, Wenbin; Lei, Zhufeng; Wang, Pengcheng

    2018-03-01

    This paper suggests a method to calculate the internal load distribution and contact stress of the thrust angular contact ball turntable bearing by FEA. The influence of the stiffness of the bearing structure and the plastic deformation of the contact area on the internal load distribution and contact stress of the bearing is considered. In this method, the load-deformation relationship of the rolling elements is determined by finite element contact analysis of a single rolling element and the raceway. Based on this, the nonlinear contact between the rolling elements and the inner and outer ring raceways is treated as a nonlinear compression spring, and an integral finite element model of the bearing, including its support structure, was established. The effects of structural deformation and plastic deformation on the internal stress distribution of the slewing bearing are investigated by comparing the finite element results for load distribution, inner and outer ring stress, contact stress and other quantities with traditional bearing theory; the comparison provides guidance for improving the design of slewing bearings.
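    The replacement of each ball-raceway contact by a nonlinear spring can be sketched with a Hertz-type load-deflection law; the stiffness and load values below are assumptions for illustration (the paper derives the actual relationship from finite element contact analysis). Newton iteration is shown because it also applies when the relationship is tabulated rather than closed-form:

    ```python
    # Hertz-type point-contact law for one rolling element:
    #   Q = K * delta**1.5   (ball load vs. contact deflection)
    # K is an illustrative stiffness, not a value from the paper.
    K = 8.0e5              # N / mm^1.5  (assumed)

    def load(delta):
        """Nonlinear spring: compressive only, zero load in separation."""
        return K * delta ** 1.5 if delta > 0 else 0.0

    def deflection(Q, tol=1e-12):
        """Invert the contact law for a given ball load via Newton."""
        d = (Q / K) ** (2.0 / 3.0)   # good start (exact for this law)
        for _ in range(50):
            f = K * d ** 1.5 - Q      # residual
            df = 1.5 * K * d ** 0.5   # stiffness (slope of the law)
            step = f / df
            d -= step
            if abs(step) < tol:
                break
        return d

    Q = 1500.0                        # N, assumed ball load
    d = deflection(Q)
    print(f"contact deflection = {d:.6e} mm")
    ```

    In the bearing model, one such spring per ball couples the rings, and the FE solver iterates the same kind of nonlinear balance over all rolling elements simultaneously.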

  13. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  14. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is inherently a multidisciplinary activity. SAPE improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.

  15. Failure environment analysis tool applications

    NASA Astrophysics Data System (ADS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-02-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior, by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.
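    FEAT's two basic queries (what effects a failure produces, and what causes could explain an observed effect) amount to forward and backward reachability over a directed failure-propagation model. The sketch below uses a hypothetical graph with invented node names, not an actual FEAT model:

    ```python
    from collections import deque

    # Hypothetical failure-propagation digraph: edge A -> B means
    # "failure A can cause effect B". Node names are illustrative only.
    edges = {
        "pump_fail":   ["low_flow"],
        "valve_stuck": ["low_flow"],
        "low_flow":    ["overheat"],
        "overheat":    ["shutdown"],
    }

    def reachable(graph, start):
        """All nodes reachable from start via breadth-first search."""
        seen, queue = set(), deque([start])
        while queue:
            node = queue.popleft()
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def reverse(graph):
        """Flip every edge so causes can be traced from an effect."""
        rev = {}
        for src, dsts in graph.items():
            for dst in dsts:
                rev.setdefault(dst, []).append(src)
        return rev

    effects = reachable(edges, "pump_fail")          # downstream effects
    causes = reachable(reverse(edges), "overheat")   # candidate causes
    print(sorted(effects))  # ['low_flow', 'overheat', 'shutdown']
    print(sorted(causes))   # ['low_flow', 'pump_fail', 'valve_stuck']
    ```

    A common-cause query is then the intersection of the backward sets of several observed effects.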

  16. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior, by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  17. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts conduct risk assessment and failure analysis more thoroughly and reliably. FEAT accomplishes this by answering questions about what might have caused a particular failure or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures, and it can help determine the vulnerability of a system to failures in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems: it facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  18. Structural Model Tuning Capability in an Object-Oriented Multidisciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2008-01-01

    Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the dynamic properties of the structure. Accurate rigid-body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize an objective function, subject to constraints, so that the mass properties, natural frequencies, and mode shapes match the target data while the mass matrix remains orthogonalized.
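
    The tuning step described above is, at heart, a constrained minimization. The sketch below is an illustration only, not the MDAO tool's actual formulation: it tunes the two stiffnesses of a hypothetical two-degree-of-freedom spring-mass chain so that its natural frequencies match a set of target ("measured") frequencies, using `scipy.optimize.minimize`. All model values here are made up for the example.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

def natural_freqs(k, m=(1.0, 1.0)):
    """Natural frequencies (rad/s) of a 2-DOF chain: wall-k1-m1-k2-m2."""
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])  # stiffness matrix
    M = np.diag(m)                             # mass matrix
    evals = eigh(K, M, eigvals_only=True)      # generalized eigenproblem
    return np.sqrt(evals)

# Target (stand-in "measured") frequencies the tuned model should reproduce
target = natural_freqs([120.0, 80.0])

def objective(k):
    # Sum-of-squares mismatch between analytical and target frequencies
    return np.sum((natural_freqs(k) - target) ** 2)

# Tune stiffnesses, constrained to stay positive
res = minimize(objective, x0=[100.0, 100.0], bounds=[(1.0, None)] * 2)
</antml;fence>

After convergence the tuned frequencies agree with the targets; in a real model-updating problem the design variables would be many more, and mode-shape and mass-property constraints would enter the objective as well.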

  19. Structural Model Tuning Capability in an Object-Oriented Multidisciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2008-01-01

    Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the dynamic properties of the structure. Accurate rigid-body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize an objective function, subject to constraints, so that the mass properties, natural frequencies, and mode shapes match the target data while the mass matrix remains orthogonalized.

  20. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code; the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
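
    The kind of sorting such a tool performs can be pictured with a small sketch. The data, limits, and metric names below are illustrative assumptions, not Orion values: scan a matrix of Monte Carlo runs for limit violations and rank the worst cases with NumPy.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical Monte Carlo output: 10_000 runs x 3 metrics
# (e.g. touchdown speed, attitude error, fuel remaining)
runs = rng.normal([2.0, 1.0, 50.0], [0.5, 0.4, 5.0], size=(10_000, 3))

# A run fails if any metric exceeds its limit (inf = unconstrained)
limits = np.array([3.5, 2.0, np.inf])

violations = runs > limits                 # boolean mask per run and metric
failed = violations.any(axis=1)            # runs with at least one violation
worst = np.argsort(runs[:, 0])[::-1][:10]  # 10 runs with highest first metric
</antml;fence>

Pointing an analyst at `worst` and at the metrics flagged in `violations` is the serial-code version of the idea; the paper's contribution is doing this at scale in parallel on a GPU.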

  1. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I; four tools received a grade II: the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST). The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  2. Design and Analysis of Bionic Cutting Blades Using Finite Element Method.

    PubMed

    Li, Mo; Yang, Yuwang; Guo, Li; Chen, Donghui; Sun, Hongliang; Tong, Jin

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, thanks to a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia are both armed with a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold on each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometrical characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a toothed profile were designed in this work. Two different sizes of tooth structure and arrangement were used on the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional and bionic blades, 3D finite element simulation analysis and experimental measurements were performed. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency.

  3. Design and Analysis of Bionic Cutting Blades Using Finite Element Method

    PubMed Central

    Li, Mo; Yang, Yuwang; Guo, Li; Chen, Donghui; Sun, Hongliang; Tong, Jin

    2015-01-01

    The praying mantis is one of the most efficient predators in the insect world, thanks to a pair of powerful tools: two sharp and strong forelegs. Its femur and tibia are both armed with a double row of strong spines along their posterior edges, which can firmly grasp the prey when the femur and tibia fold on each other during capture. These spines are so sharp that they can easily and quickly cut into the prey. The geometrical characteristics of the praying mantis's foreleg, especially its tibia, have important reference value for the design of agricultural soil-cutting tools. Learning from the profile and arrangement of these spines, cutting blades with a toothed profile were designed in this work. Two different sizes of tooth structure and arrangement were used on the cutting edge. A conventional smooth-edge blade was used for comparison with the bionic serrate-edge blades. To compare the working efficiency of the conventional and bionic blades, 3D finite element simulation analysis and experimental measurements were performed. Both the simulation and experimental results indicated that the bionic serrate-edge blades showed better performance in cutting efficiency. PMID:27019583

  4. Completely non-destructive elemental analysis of bulky samples by PGAA

    NASA Astrophysics Data System (ADS)

    Oura, Y.; Nakahara, H.; Sueki, K.; Sato, W.; Saito, A.; Tomizawa, T.; Nishikawa, T.

    1999-01-01

    NBAA (neutron beam activation analysis), which combines PGAA and INAA in a single neutron irradiation using an internal monostandard method, is proposed as a unique and promising method for the elemental analysis of voluminous and invaluable archaeological samples that do not allow even a scraping of the surface. It was applied to chinawares, Sueki ware, and bronze mirrors, and proved to be a very effective method for nondestructive analysis not only of major elements but also of some minor elements, such as boron, that help solve archaeological problems concerning the eras and sites of their production.

  5. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  6. Interchange Safety Analysis Tool (ISAT) : user manual

    DOT National Transportation Integrated Search

    2007-06-01

    This User Manual describes the usage and operation of the spreadsheet-based Interchange Safety Analysis Tool (ISAT). ISAT provides design and safety engineers with an automated tool for assessing the safety effects of geometric design and traffic con...

  7. An exploration of inter-organisational partnership assessment tools in the context of Australian Aboriginal-mainstream partnerships: a scoping review of the literature.

    PubMed

    Tsou, Christina; Haynes, Emma; Warner, Wayne D; Gray, Gordon; Thompson, Sandra C

    2015-04-23

    The need for better partnerships between Aboriginal organisations and mainstream agencies demands attention to the process and relational elements of these partnerships, and to improving partnership functioning through transformative or iterative evaluation procedures. This paper presents the findings of a literature review which examines the usefulness of existing partnership tools in the Australian Aboriginal-mainstream partnership (AMP) context. Three sets of best practice principles for successful AMP were selected based on the authors' knowledge and experience. Items in each set of principles were separated into process and relational elements and used to guide the analysis of partnership assessment tools. The review and analysis of partnership assessment tools were conducted in three distinct but related parts: part 1, identify and select reviews of partnership tools; part 2, identify and select partnership self-assessment tools; part 3, analyse the selected tools using the AMP principles. The focus on relational and process elements in the partnership tools reviewed is consistent with the focus of Australian AMP principles by reconciliation advocates; however, historical context, lived experience, cultural context and approaches of Australian Aboriginal people represent key deficiencies in the tools reviewed. The overall assessment indicated that the New York Partnership Self-Assessment Tool and the VicHealth Partnership Analysis Tools reflect the greatest number of AMP principles, followed by the Nuffield Partnership Assessment Tool. The New York PSAT has the strongest alignment with the relational elements, while the VicHealth and Nuffield tools showed the greatest alignment with the process elements in the chosen AMP principles. Partnership tools offer opportunities for providing evidence-based support to partnership development.
    The multiplicity of tools in existence and the reported uniqueness of each partnership mean the development of a generic partnership analysis for AMP

  8. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  9. Sustainability Tools Inventory - Initial Gaps Analysis

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  10. Contact stress analysis of spiral bevel gears using nonlinear finite element static analysis

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.

    1993-01-01

    A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.

  11. FIESTA ROC: A new finite element analysis program for solar cell simulation

    NASA Technical Reports Server (NTRS)

    Clark, Ralph O.

    1991-01-01

    The Finite Element Semiconductor Three-dimensional Analyzer by Ralph O. Clark (FIESTA ROC) is a computational tool for investigating in detail the performance of arbitrary solar cell structures. As its name indicates, it uses the finite element technique to solve the fundamental semiconductor equations in the cell. It may be used for predicting the performance (thereby dictating the design parameters) of a proposed cell or for investigating the limiting factors in an established design.

  12. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies produce a set of identified features. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
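
    As an illustration of the basic statistical layer such a suite provides, and only as a sketch with synthetic data rather than the SECIMTools API, per-feature t-tests and Kruskal-Wallis comparisons between two groups can be run directly with SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic feature table: 50 metabolite features x 10 samples per group
group_a = rng.normal(10.0, 1.0, size=(50, 10))
group_b = rng.normal(10.0, 1.0, size=(50, 10))
group_b[:5] += 3.0  # first five features truly differ between groups

# Per-feature two-sample t-test, vectorized across features
t_p = stats.ttest_ind(group_a, group_b, axis=1).pvalue

# Per-feature Kruskal-Wallis non-parametric test
kw_p = np.array([stats.kruskal(a, b).pvalue
                 for a, b in zip(group_a, group_b)])

hits = np.where(t_p < 0.01)[0]  # features flagged as differing
</antml;fence>

A real pipeline would add multiple-testing correction and the quality-control and visualization steps the abstract lists; this only shows the group-comparison building block.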

  13. Validation of a Pressure-Based Combustion Simulation Tool Using a Single Element Injector Test Problem

    NASA Technical Reports Server (NTRS)

    Thakur, Siddarth; Wright, Jeffrey

    2006-01-01

    The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study which involves a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. 
A refined grid, which is clustered in the vicinity of

  14. Post-Flight Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  15. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter, including the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  16. Elemental analysis of cotton by laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schenk, Emily R.; Almirall, Jose R.

    Laser-induced breakdown spectroscopy (LIBS) has been applied to the elemental characterization of unprocessed cotton. This research is important in forensic and fraud detection applications to establish an elemental fingerprint of U.S. cotton by region, which can be used to determine the source of the cotton. To the best of our knowledge, this is the first report of a LIBS method for the elemental analysis of cotton. The experimental setup consists of a Nd:YAG laser that operates at the fundamental wavelength as the LIBS excitation source and an echelle spectrometer equipped with an intensified CCD camera. The relative concentrations of the elements Al, Ba, Ca, Cr, Cu, Fe, Mg, and Sr from both nutrients and environmental contributions were determined by LIBS. Principal component analysis was used to visualize the differences between cotton samples based on the elemental composition by region in the U.S. Linear discriminant analysis of the LIBS data resulted in the correct classification of >97% of the cotton samples by U.S. region and >81% correct classification by state of origin.
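
    The chemometric workflow described here, PCA for visualization followed by linear discriminant analysis for classification, can be sketched with scikit-learn. The element concentrations below are synthetic stand-ins for the LIBS measurements, and the three "regions" are hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic concentrations of 8 elements (standing in for Al, Ba, Ca, Cr,
# Cu, Fe, Mg, Sr) for 30 samples from each of three hypothetical regions
means = rng.uniform(1.0, 10.0, size=(3, 8))
X = np.vstack([m + 0.3 * rng.standard_normal((30, 8)) for m in means])
y = np.repeat([0, 1, 2], 30)

# PCA to visualize region separation in two components
scores = PCA(n_components=2).fit_transform(X)

# Cross-validated LDA classification accuracy by region
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
</antml;fence>

With well-separated synthetic regions the cross-validated accuracy is high; the published >97% regional classification rate refers to the real LIBS data, not to this toy example.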

  17. Finite element analysis of a composite wheelchair wheel design

    NASA Technical Reports Server (NTRS)

    Ortega, Rene

    1994-01-01

    The finite element analysis of a composite wheelchair wheel design is presented. The design is the result of a technology utilization request. The designer's intent is to soften the riding feeling by incorporating a mechanism attaching the wheel rim to the spokes that would allow considerable deflection upon compressive loads. A finite element analysis was conducted to verify proper structural function. Displacement and stress results are presented and conclusions are provided.

  18. Multi-element fingerprinting as a tool in origin authentication of four east China marine species.

    PubMed

    Guo, Lipan; Gong, Like; Yu, Yanlei; Zhang, Hong

    2013-12-01

    The contents of 25 elements in 4 types of commercial marine species from the East China Sea were determined by inductively coupled plasma mass spectrometry and atomic absorption spectrometry. The elemental composition was used to differentiate marine species according to geographical origin by multivariate statistical analysis. The results showed that principal component analysis could distinguish samples from different areas and reveal the elements which played the most important role in origin diversity. The established models by partial least squares discriminant analysis (PLS-DA) and by probabilistic neural network (PNN) can both precisely predict the origin of the marine species. Further study indicated that PLS-DA and PNN were efficacious in regional discrimination. The models from these 2 statistical methods, with an accuracy of 97.92% and 100%, respectively, could both distinguish samples from different areas without the need for species differentiation. © 2013 Institute of Food Technologists®

  19. Analysis of elemental concentration censored distributions in breast malignant and breast benign neoplasm tissues

    NASA Astrophysics Data System (ADS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Góźdź, S.; Majewska, U.; Pajek, M.

    2007-07-01

    The total reflection X-ray fluorescence method was applied to study trace element concentrations in human breast malignant and breast benign neoplasm tissues taken from women who were patients of the Holycross Cancer Centre in Kielce (Poland). These investigations were mainly focused on the development of new possibilities for cancer diagnosis and therapy monitoring. This systematic comparative study was based on a relatively large population (~100 samples), namely 26 samples of breast malignant and 68 samples of breast benign neoplasm tissue. The concentrations, ranging from a few ppb to 0.1%, were determined for thirteen elements (from P to Pb). The results were carefully analysed to investigate the concentration distribution of trace elements in the studied samples. The measurement of trace element concentrations by total reflection X-ray fluorescence was limited, however, by the detection limit of the method. It was observed that for more than 50% of the elements determined, the concentrations could not be measured in all samples. These incomplete measurements were treated within the statistical concept called left-random censoring; for the estimation of the mean value and median of the censored concentration distributions, the Kaplan-Meier estimator was used. For comparison of concentrations in two populations, the log-rank test was applied, which allows comparison of the censored total reflection X-ray fluorescence data. The statistically significant differences found are discussed in more detail. It is noted that the described data analysis procedures should be a standard tool for analyzing censored concentrations of trace elements measured by X-ray fluorescence methods.
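
    A standard way to apply the Kaplan-Meier estimator to left-censored (below-detection-limit) concentrations is the flip trick: transforming x to M - x for some constant M above all values turns left-censoring into right-censoring. The sketch below implements a minimal Kaplan-Meier estimator and applies the flip; the concentrations and detection-limit flags are made-up illustration values, not the paper's data:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate for right-censored data.
    times: event/censoring times; observed: True where the event was seen.
    Returns a list of (time, S(time)) pairs at observed-event times."""
    order = np.argsort(times)
    t, d = np.asarray(times)[order], np.asarray(observed)[order]
    surv, s = [], 1.0
    for u in np.unique(t[d]):           # distinct observed-event times
        at_risk = np.sum(t >= u)        # still at risk just before u
        events = np.sum((t == u) & d)
        s *= 1.0 - events / at_risk     # product-limit update
        surv.append((u, s))
    return surv

# Hypothetical concentrations; below_dl marks values under the detection
# limit (left-censored). Flip x -> M - x to get a right-censored problem.
conc = np.array([2.0, 3.5, 5.0, 1.0, 4.2, 0.8])
below_dl = np.array([False, False, False, True, False, True])
M = conc.max() + 1.0
flipped = kaplan_meier(M - conc, ~below_dl)
</antml;fence>

Survival probabilities of the flipped variable at M - x give the distribution function of the original concentrations, which is how censored means and medians are then estimated.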

  20. Evaluation of the finite element fuel rod analysis code (FRANCO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, K.; Feltus, M.A.

    1994-12-31

    Knowledge of the temperature distribution in a nuclear fuel rod is required to predict the behavior of fuel elements during operating conditions. The thermal and mechanical properties and performance characteristics are strongly dependent on the temperature, which can vary greatly inside the fuel rod. A detailed model of fuel rod behavior can be described by various numerical methods, including the finite element approach. The finite element method has been successfully used in many engineering applications, including nuclear piping and reactor component analysis. However, fuel pin analysis has traditionally been carried out with finite difference codes, with the exception of the Electric Power Research Institute's FREY code, which was developed for mainframe execution. This report describes FRANCO, a finite element fuel rod analysis code capable of computing the temperature distribution and mechanical deformation of a single light water reactor fuel rod.

  1. Binary tree eigen solver in finite element analysis

    NASA Technical Reports Server (NTRS)

    Akl, F. A.; Janetzke, D. C.; Kiraly, L. J.

    1993-01-01

    This paper presents a transputer-based binary tree eigensolver for the solution of the generalized eigenproblem in linear elastic finite element analysis. The algorithm is based on the method of recursive doubling, in which the parallel implementation of a number of associative operations on an arbitrary set having N elements is of the order of O(log2 N), compared to (N-1) steps if implemented sequentially. The hardware used in the implementation of the binary tree consists of 32 transputers. The algorithm is written in OCCAM, a high-level language developed with the transputer to address parallel programming constructs and to provide the communications between processors. The algorithm can be replicated to match the size of the binary tree transputer network. Parallel and sequential finite element analysis programs have been developed to solve for the set of least-order eigenpairs using the modified subspace method. The speed-up obtained for a typical analysis problem indicates close agreement with the theoretical prediction given by the method of recursive doubling.
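
    The recursive-doubling idea can be sketched serially as a toy illustration of the O(log2 N) step count (this is not the OCCAM transputer code): each pass combines elements a power-of-two stride apart, so an inclusive scan over N values finishes in ceil(log2 N) passes instead of N-1 sequential steps.

```python
def recursive_doubling_scan(values, op):
    """Inclusive scan by recursive doubling: after the pass with stride
    2**k, element i holds the combination of elements
    max(0, i - 2**(k+1) + 1) .. i under the associative operation op."""
    x = list(values)
    n = len(x)
    step = 1
    while step < n:
        # On a transputer network, each element's update below would run
        # on its own processor in parallel; here we simulate one pass.
        x = [op(x[i - step], x[i]) if i >= step else x[i] for i in range(n)]
        step *= 2  # double the stride each pass: ceil(log2 n) passes total
    return x

# Running sums of 8 values in 3 passes instead of 7 sequential additions
totals = recursive_doubling_scan([1, 2, 3, 4, 5, 6, 7, 8], lambda a, b: a + b)
</antml;fence>

The same pattern applies to any associative operation (max, matrix products, the reductions inside an eigensolver), which is what makes it a general parallel primitive.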

  2. Neutron Activation Analysis of the Rare Earth Elements (REE) - With Emphasis on Geological Materials

    NASA Astrophysics Data System (ADS)

    Stosch, Heinz-Günter

    2016-08-01

    Neutron activation analysis (NAA) has been the analytical method of choice for rare earth element (REE) analysis from the early 1960s through the 1980s. At that time, irradiation facilities were widely available and fairly easily accessible. The development of high-resolution gamma-ray detectors in the mid-1960s eliminated, for many applications, the need for chemical separation of the REE from the matrix material, making NAA a reliable and effective analytical tool. While not as precise as isotope dilution mass spectrometry, NAA was competitive by being sensitive for the analysis of about half of the rare earths (La, Ce, Nd, Sm, Eu, Tb, Yb, Lu). The development of inductively coupled plasma mass spectrometry since the 1980s, together with the decommissioning of research reactors and the lack of installation of new ones in Europe and North America, has led to the rapid decline of NAA.

  3. Multi-element analysis of emeralds and associated rocks by k0 neutron activation analysis

    PubMed

    Acharya; Mondal; Burte; Nair; Reddy; Reddy; Reddy; Manohar

    2000-12-01

    Multi-element analysis was carried out in natural emeralds, their associated rocks and one sample of beryl obtained from Rajasthan, India. The concentrations of 21 elements were assayed by Instrumental Neutron Activation Analysis using the k0 method (k0 INAA method) and high-resolution gamma ray spectrometry. The data reveal the segregation of some elements from associated (trapped and host) rocks to the mineral beryl forming the gemstones. A reference rock standard of the US Geological Survey (USGS BCR-1) was also analysed as a control of the method.

  4. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET helps a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display, and an 80286, 386, or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K
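The hierarchical decomposition TARGET captures can be illustrated with a minimal sketch (not TARGET's actual C implementation; class and method names here are hypothetical): each task node can be split into sub-steps, and the tree can be rendered as the indented "task hierarchy" text view the abstract describes.

```python
# Sketch of a task-decomposition tree: each step can be broken into
# smaller action kernels, and the whole tree prints as a text outline.

class TaskNode:
    def __init__(self, label):
        self.label = label
        self.children = []

    def decompose(self, *labels):
        """Split this step into smaller action kernels."""
        self.children = [TaskNode(lbl) for lbl in labels]
        return self.children

    def outline(self, depth=0):
        lines = ["  " * depth + self.label]
        for child in self.children:
            lines.extend(child.outline(depth + 1))
        return lines

root = TaskNode("Replace pump seal")
drain, swap = root.decompose("Drain system", "Swap seal")
swap.decompose("Remove housing", "Fit new seal", "Refit housing")
print("\n".join(root.outline()))
```

Re-running `decompose` on a node mirrors the iterative refinement the abstract describes: sub-steps are replaced as the expert's procedure becomes clearer.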

  5. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package based on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) predictions and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.
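The 100 Hz to 4000 Hz frequency region in the study corresponds to 17 one-third-octave bands. As a cross-check, their exact base-10 centre frequencies (the IEC 61260 convention, f = 10^(b/10) for band index b) can be computed directly; the band indices below are the standard ones, not values from the paper.

```python
# Exact base-ten one-third-octave band centre frequencies.
# Band indices 20..36 cover the 100 Hz - 4000 Hz range of the study.

def third_octave_centres(b_lo=20, b_hi=36):
    return [10 ** (b / 10) for b in range(b_lo, b_hi + 1)]

centres = third_octave_centres()
print(len(centres))        # 17 bands
print(round(centres[0]))   # 100
print(round(centres[-1]))  # 3981 (nominal 4000 Hz band)
```

The nominal labels (100, 125, 160, ..., 4000 Hz) are rounded versions of these exact centres.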

  6. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL's vehicle technology simulation and analysis tools evaluate vehicle technologies with the potential to achieve significant fuel savings and emission reductions. Among these, the Automotive Deployment Options Projection Tool (ADOPT) estimates vehicle technology

  7. Finite element analysis simulations for ultrasonic array NDE inspections

    NASA Astrophysics Data System (ADS)

    Dobson, Jeff; Tweedie, Andrew; Harvey, Gerald; O'Leary, Richard; Mulholland, Anthony; Tant, Katherine; Gachagan, Anthony

    2016-02-01

    Advances in manufacturing techniques and materials have led to an increase in the demand for reliable and robust inspection techniques to maintain safety critical features. The application of modelling methods to develop and evaluate inspections is becoming an essential tool for the NDE community. Current analytical methods are inadequate for simulation of arbitrary components and heterogeneous materials, such as anisotropic welds or composite structures. Finite element analysis (FEA) software, such as PZFlex, can simulate the inspection of these arrangements, providing the ability to economically prototype and evaluate improved NDE methods. FEA is often seen as computationally expensive for ultrasound problems; however, advances in computing power have made it a more viable tool. This paper aims to illustrate the capability of appropriate FEA to produce accurate simulations of ultrasonic array inspections - minimizing the requirement for expensive test-piece fabrication. Validation is afforded via corroboration of the FE-derived and experimentally generated data sets for a test block comprising 1D and 2D defects. The modelling approach is extended to consider the more troublesome aspects of heterogeneous materials, where defect dimensions can be of the same length scale as the grain structure. The model is used to facilitate the implementation of new ultrasonic array inspection methods for such materials. This is exemplified by considering the simulation of ultrasonic NDE in a weld structure in order to assess new approaches to imaging such structures.

  8. CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.

    PubMed

    Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee

    2018-04-20

    The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. There are existing tools for analyzing and visualizing alternative mRNA splicing patterns in large-scale RNA-seq experiments. However, these existing web-based tools are limited to the analysis of individual TCGA data sets at a time, such as transcriptomic information only. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratios to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns with clinical data to identify potential transcripts associated with different survival outcomes for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.
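An illustrative sketch (not CAS-viewer's implementation) of the kind of "transcript expression ratio" such splicing tools work with: the percent-spliced-in (PSI) value of a cassette exon, computed from read counts supporting inclusion versus exclusion. The read counts below are hypothetical.

```python
# PSI (percent spliced in) from junction read counts: the fraction of
# transcripts that include a given exon.

def psi(inclusion_reads, exclusion_reads):
    total = inclusion_reads + exclusion_reads
    return inclusion_reads / total if total else float("nan")

# Compare two groups (e.g. tumour vs. normal) for one cassette exon:
tumour_psi = psi(inclusion_reads=30, exclusion_reads=70)  # 0.30
normal_psi = psi(inclusion_reads=80, exclusion_reads=20)  # 0.80
delta_psi = tumour_psi - normal_psi                       # -0.50
print(delta_psi)
```

A large |ΔPSI| between groups is what would then be tested for association with methylation, miRNA, or survival data.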

  9. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the trace of the stiffness matrix (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.
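The h-refinement stopping criterion based on strain energy can be sketched on a toy problem (a 1-D bar with E = A = L = 1, load q(x) = x, fixed at x = 0, standing in for the paper's aircraft-lug model; exact strain energy is 1/15): halve the element size until the strain-energy change between successive meshes drops below a tolerance.

```python
import numpy as np

def strain_energy(n):
    """Strain energy of the toy bar model on a uniform mesh of n elements."""
    h = 1.0 / n
    K = np.zeros((n + 1, n + 1))
    f = np.zeros(n + 1)
    for e in range(n):
        K[e:e + 2, e:e + 2] += (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        xa = e * h
        f[e] += h * (2 * xa + (xa + h)) / 6      # consistent load for q(x) = x
        f[e + 1] += h * (xa + 2 * (xa + h)) / 6
    u = np.zeros(n + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])    # clamp node 0
    return 0.5 * f @ u

# h-refinement loop: stop when the relative energy change is small.
n, U_prev = 2, strain_energy(2)
while True:
    n *= 2
    U = strain_energy(n)
    if abs(U - U_prev) < 1e-5 * abs(U):
        break
    U_prev = U
print(n, U)   # converges toward 1/15 ~= 0.066667
```

The same monitored-quantity idea applies to displacements and stresses at critical points, as the abstract notes.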

  10. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.

  11. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  12. Scalable Implementation of Finite Elements by NASA - Implicit (ScIFEi)

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Bomarito, Geoffrey F.; Heber, Gerd; Hochhalter, Jacob D.

    2016-01-01

    Scalable Implementation of Finite Elements by NASA (ScIFEN) is a parallel finite element analysis code written in C++. ScIFEN is designed to provide scalable solutions to computational mechanics problems. It supports a variety of finite element types, nonlinear material models, and boundary conditions. This report provides an overview of ScIFEi ("Sci-Fi"), the implicit solid mechanics driver within ScIFEN. A description of ScIFEi's capabilities is provided, including an overview of the tools and features that accompany the software, as well as a description of the input and output file formats. Results from several problems are included, demonstrating the efficiency and scalability of ScIFEi by comparison with finite element analysis using a commercial code.

  13. A Variational Formulation for the Finite Element Analysis of Sound Wave Propagation in a Spherical Shell

    NASA Technical Reports Server (NTRS)

    Lebiedzik, Catherine

    1995-01-01

    Development of design tools to furnish optimal acoustic environments for lightweight aircraft demands the ability to simulate the acoustic system on a workstation. In order to form an effective mathematical model of the phenomena at hand, we have begun by studying the propagation of acoustic waves inside closed spherical shells. Using a fully-coupled fluid-structure interaction model based upon variational principles, we have written a finite element analysis program and are in the process of examining several test cases. Future investigations are planned to increase model accuracy by incorporating non-linear and viscous effects.

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
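The core idea of identifying driving design variables can be sketched as follows (an illustration of the concept, not TRAM itself; the input names and failure model are made up): given Monte Carlo input dispersions and a per-case pass/fail flag, rank the inputs whose variation best separates failing cases from passing ones.

```python
import numpy as np

# Rank Monte Carlo input variables by how strongly they correlate with
# a binary failure flag (point-biserial correlation).

rng = np.random.default_rng(0)
n_cases = 5000
inputs = {
    "mass_error": rng.normal(0, 1, n_cases),
    "thrust_error": rng.normal(0, 1, n_cases),
    "wind_gust": rng.normal(0, 1, n_cases),
}
# Hypothetical failure mode driven mostly by thrust dispersion:
failed = (inputs["thrust_error"] + 0.1 * inputs["wind_gust"]) > 2.0

ranking = sorted(
    inputs,
    key=lambda name: abs(np.corrcoef(inputs[name], failed)[0, 1]),
    reverse=True,
)
print(ranking[0])   # thrust_error tops the list
```

In practice a tool like this would also report where in the flight envelope the failing cases cluster, not just which variable drives them.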

  15. Nonlinear micromechanics-based finite element analysis of the interfacial behaviour of FRP-strengthened reinforced concrete beams

    NASA Astrophysics Data System (ADS)

    Abd El Baky, Hussien

    This research work is devoted to theoretical and numerical studies on the flexural behaviour of FRP-strengthened concrete beams. The objectives of this research are to extend and generalize the results of simple experiments, to recommend new design guidelines based on accurate numerical tools, and to enhance our comprehension of the bond performance of such beams. These numerical tools can be exploited to bridge the existing gaps in the development of analysis and modelling approaches that can predict the behaviour of FRP-strengthened concrete beams. The research effort here begins with the formulation of a concrete model and development of FRP/concrete interface constitutive laws, followed by finite element simulations for beams strengthened in flexure. Finally, a statistical analysis is carried out, taking advantage of the aforesaid numerical tools, to propose design guidelines. In this dissertation, an alternative incremental formulation of the M4 microplane model is proposed to overcome the computational complexities associated with the original formulation. Through a number of numerical applications, this incremental formulation is shown to be equivalent to the original M4 model. To assess the computational efficiency of the incremental formulation, the "arc-length" numerical technique is also considered and implemented in the original Bazant et al. [2000] M4 formulation. Finally, the M4 microplane concrete model is coded in FORTRAN and implemented as a user-defined subroutine into the commercial software package ADINA, Version 8.4. This subroutine is then used with the finite element package to analyze various applications involving FRP strengthening. In the first application, a nonlinear micromechanics-based finite element analysis is performed to investigate the interfacial behaviour of FRP/concrete joints subjected to direct shear loadings. The intention of this part is to develop a reliable bond-slip model for the FRP/concrete interface. The bond

  16. Transmission Planning Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects 6 countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge, without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses of many scenarios of study, and arranges them in a systematic way to aid power system planning engineers or transmission operators in an effective decision-making process or in the off-line study environment.
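The grouping step can be sketched generically (a hedged illustration, not the disclosed PSS/E-based code; the scenario and element names are hypothetical): collect per-scenario contingency violations and rank network elements by how often they appear, flagging weak points.

```python
from collections import Counter

# Hypothetical violation records: (scenario, contingency, element, kind)
violations = [
    ("peak_load", "line_12_out", "bus_7", "low_voltage"),
    ("peak_load", "line_34_out", "bus_7", "low_voltage"),
    ("light_load", "line_12_out", "bus_7", "low_voltage"),
    ("peak_load", "line_12_out", "line_56", "overload"),
]

# Rank elements by how often they violate limits across all scenarios.
weak_points = Counter(element for _, _, element, _ in violations)
for element, count in weak_points.most_common():
    print(element, count)
```

An element that violates limits under many contingencies and scenarios (here `bus_7`) is a candidate for reinforcement.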

  17. Predicting Radiated Noise With Power Flow Finite Element Analysis

    DTIC Science & Technology

    2007-02-01

    Defence R&D Canada – Atlantic. Brennan, D.P.; Koko, T.S.; Jiang, L.; Wallace, J.C. (Martec Limited). 2007. Predicting Radiated Noise With Power Flow Finite Element Analysis. "... model- or full-scale data before it is available for general use."

  18. Structural weights analysis of advanced aerospace vehicles using finite element analysis

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.; Lentz, Christopher A.; Rehder, John J.; Naftel, J. Chris; Cerro, Jeffrey A.

    1989-01-01

    A conceptual/preliminary level structural design system has been developed for structural integrity analysis and weight estimation of advanced space transportation vehicles. The system includes a three-dimensional interactive geometry modeler, a finite element pre- and post-processor, a finite element analyzer, and a structural sizing program. Inputs to the system include the geometry, surface temperature, material constants, construction methods, and aerodynamic and inertial loads. The results are a sized vehicle structure capable of withstanding the static loads incurred during assembly, transportation, operations, and missions, and a corresponding structural weight. An analysis of the Space Shuttle external tank is included in this paper as a validation and benchmark case of the system.

  19. Scanning Electron Microscope-Cathodoluminescence Analysis of Rare-Earth Elements in Magnets.

    PubMed

    Imashuku, Susumu; Wagatsuma, Kazuaki; Kawai, Jun

    2016-02-01

    Scanning electron microscope-cathodoluminescence (SEM-CL) analysis was performed for neodymium-iron-boron (NdFeB) and samarium-cobalt (Sm-Co) magnets to analyze the rare-earth elements present in the magnets. We examined the advantages of SEM-CL analysis over conventional analytical methods such as SEM-energy-dispersive X-ray (EDX) spectroscopy and SEM-wavelength-dispersive X-ray (WDX) spectroscopy for elemental analysis of rare-earth elements in NdFeB magnets. Luminescence spectra of chloride compounds of elements in the magnets were measured by the SEM-CL method. Chloride compounds were obtained by the dropwise addition of hydrochloric acid on the magnets followed by drying in vacuum. Neodymium, praseodymium, terbium, and dysprosium were separately detected in the NdFeB magnets, and samarium was detected in the Sm-Co magnet by the SEM-CL method. In contrast, it was difficult to distinguish terbium and dysprosium in the NdFeB magnet with a dysprosium concentration of 1.05 wt% by conventional SEM-EDX analysis. Terbium with a concentration of 0.02 wt% in an NdFeB magnet was detected by SEM-CL analysis, but not by conventional SEM-WDX analysis. SEM-CL analysis is advantageous over conventional SEM-EDX and SEM-WDX analyses for detecting trace rare-earth elements in NdFeB magnets, particularly dysprosium and terbium.

  20. Probabilistic finite elements for transient analysis in nonlinear continua

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Mani, A.

    1985-01-01

    The probabilistic finite element method (PFEM), which is a combination of finite element methods and second-moment analysis, is formulated for linear and nonlinear continua with inhomogeneous random fields. Analogous to the discretization of the displacement field in finite element methods, the random field is also discretized. The formulation is simplified by transforming the correlated variables to a set of uncorrelated variables through an eigenvalue orthogonalization. Furthermore, it is shown that a reduced set of the uncorrelated variables is sufficient for the second-moment analysis. Based on the linear formulation of the PFEM, the method is then extended to transient analysis in nonlinear continua. The accuracy and efficiency of the method are demonstrated by application to a one-dimensional, elastic/plastic wave propagation problem. The moments calculated compare favorably with those obtained by Monte Carlo simulation. Also, the procedure is amenable to implementation in deterministic FEM-based computer programs.
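The eigenvalue orthogonalization described above can be sketched numerically (a toy 1-D exponential covariance, not the paper's wave-propagation model): diagonalize the covariance matrix of the discretized random field so the correlated nodal variables become uncorrelated modal ones, then keep only the dominant modes.

```python
import numpy as np

# Discretized random field on 50 points with exponential covariance.
n = 50
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)  # covariance matrix

# Eigenvalue orthogonalization: C = V diag(w) V^T, sorted descending.
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the fewest uncorrelated modes capturing 95% of the total variance.
cum = np.cumsum(eigvals) / eigvals.sum()
m = int(np.searchsorted(cum, 0.95)) + 1
print(m, n)   # far fewer uncorrelated variables than nodal ones
```

This is the "reduced set of the uncorrelated variables" the abstract refers to: the second-moment analysis is then carried out in the truncated modal basis.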

  1. An enhanced MMW and SMMW/THz imaging system performance prediction and analysis tool for concealed weapon detection and pilotage obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.

    2015-10-01

    The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model, which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool that includes a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for post-capture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.

  2. A study on using pre-forming blank in single point incremental forming process by finite element analysis

    NASA Astrophysics Data System (ADS)

    Abass, K. I.

    2016-11-01

    Single Point Incremental Forming (SPIF) is a forming technique for sheet material based on layered manufacturing principles. The edges of the sheet material are clamped while the forming tool is moved along the tool path; a CNC milling machine is used to manufacture the product. SPIF involves extensive plastic deformation, and the description of the process is further complicated by highly nonlinear boundary conditions, namely contact and frictional effects. Due to the complex nature of these models, numerical approaches dominated by Finite Element Analysis (FEA) are now in widespread use. The paper presents the data and main results of a study, through FEA, of the effect of using a pre-formed blank in SPIF. The considered SPIF process has been studied under certain process conditions referring to the test workpiece, tool, etc., applying ANSYS 11. The results show that the simulation model can predict an ideal profile of the processing track, the behaviour of the tool-workpiece contact, and the product accuracy by evaluating its thickness, surface strain, and the stress distribution along the deformed blank section during the deformation stages.
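A first-order thickness estimate commonly quoted alongside SPIF simulations is the "sine law" (a hedged back-of-envelope check, not part of the paper's ANSYS model): the wall thickness after forming at wall angle α, measured from the initial sheet plane, is roughly t = t0·sin(90° − α).

```python
import math

# Sine-law wall thickness estimate for single point incremental forming.
def sine_law_thickness(t0_mm, wall_angle_deg):
    return t0_mm * math.sin(math.radians(90.0 - wall_angle_deg))

t = sine_law_thickness(t0_mm=1.0, wall_angle_deg=60.0)
print(round(t, 3))   # 0.5 mm wall from a 1.0 mm blank at 60 degrees
```

Simulated thickness distributions are typically compared against this estimate; pre-forming the blank is one way to mitigate the thinning it predicts at steep wall angles.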

  3. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.
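The pattern a workflow engine automates can be sketched minimally (Kepler itself is a Java-based actor framework; this is only an illustration of the pipeline idea, with made-up step names and toy data): independent analysis steps are wired together so each step's output feeds the next.

```python
# Three toy analysis "actors" chained into a pipeline.

def normalize(samples):
    total = sum(samples.values())
    return {gene: v / total for gene, v in samples.items()}

def fold_change(case, control):
    return {gene: case[gene] / control[gene] for gene in case}

def top_genes(fc, threshold=2.0):
    return sorted(g for g, v in fc.items() if v >= threshold)

# Wire the steps together and run the workflow end to end:
case = normalize({"TP53": 400, "MYC": 100})
control = normalize({"TP53": 100, "MYC": 400})
print(top_genes(fold_change(case, control)))
```

A system like Kepler adds what this sketch lacks: a graphical wiring interface, typed ports between actors, provenance tracking, and execution scheduling.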

  4. Error analysis and correction of discrete solutions from finite element codes

    NASA Technical Reports Server (NTRS)

    Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.

    1984-01-01

    Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.
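The relaxation methods suggested above can be illustrated with a minimal stand-in (a Gauss-Seidel sweep on a small diagonally dominant system; this is a generic illustration, not the shell-theory correction procedure of the paper):

```python
import numpy as np

# Gauss-Seidel relaxation: repeatedly sweep through the unknowns,
# updating each one from the latest values of the others.
def gauss_seidel(A, b, x0, sweeps=50):
    x = x0.copy()
    n = len(b)
    for _ in range(sweeps):
        for i in range(n):
            s = b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]
            x[i] = s / A[i, i]
    return x

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([3.0, 2.0, 3.0])
x = gauss_seidel(A, b, np.zeros(3))
print(x)                      # approaches the exact solution (1, 1, 1)
print(np.allclose(A @ x, b))  # True
```

In the correction setting the paper describes, the starting iterate would be the discrete finite element solution rather than zero, and the sweeps would drive it toward consistency with the shell-theory equations.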

  5. Elemental misinterpretation in automated analysis of LIBS spectra.

    PubMed

    Hübert, Waldemar; Ankerhold, Georg

    2011-07-01

    In this work, the Stark effect is shown to be mainly responsible for wrong elemental allocation by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line positions of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target, and furthermore on a brass sample, in air at atmospheric pressure. After laser pulse excitation, we have measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s¹S-3p¹P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral line positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, a time-resolving software analysis can lead to an elemental misinterpretation. To avoid a wrong interpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays, incorporating Stark broadening parameters, and using a range of tolerance which is non-symmetric around the measured line center. These suggestions may help to improve time-resolving LIBS software, promising a smaller probability of wrong elemental identification and making LIBS more attractive for industrial applications.
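The quoted 130 pm / 490 GHz equivalence follows from the standard conversion Δν ≈ c·Δλ/λ², which can be checked directly:

```python
# Convert the reported Stark wavelength shift to a frequency shift.
c = 299792458.0     # speed of light, m/s
lam = 281.6e-9      # Al(II) line position, m
dlam = 130e-12      # reported red shift, m

dnu = c * dlam / lam**2
print(round(dnu / 1e9))   # ~491 GHz, matching the reported 490 GHz
```

The same conversion sets the frequency-domain tolerance window an automated line-matching routine would need at a given wavelength.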

  6. Localized Overheating Phenomena and Optimization of Spark-Plasma Sintering Tooling Design

    PubMed Central

    Giuntini, Diletta; Olevsky, Eugene A.; Garcia-Cardona, Cristina; Maximenko, Andrey L.; Yurlova, Maria S.; Haines, Christopher D.; Martin, Darold G.; Kapoor, Deepak

    2013-01-01

    The present paper shows the application of a three-dimensional coupled electrical, thermal, mechanical finite element macro-scale modeling framework of Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating, encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shape spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations, intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration with the avoidance of the overheating as a final aim. Electric-current-based Joule heating, heat transfer, mechanical conditions, and densification are embedded in the model, utilizing the finite-element software COMSOL™, which possesses the distinguishing ability of coupling multiple physics. Thereby the implementation of a finite element method applicable to a broad range of SPS procedures is carried out, together with the more specific optimization of the SPS tooling design when dealing with excessive heating phenomena. PMID:28811398

  7. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: 1. The eigenvalue problem for the source-free case. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low-frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low-frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low-frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  8. Evaluation of cryoanalysis as a tool for analyzing elemental distribution in "live" tardigrades using micro-PIXE

    NASA Astrophysics Data System (ADS)

    Nilsson, E. J. C.; Pallon, J.; Przybylowicz, W. J.; Wang, Y. D.; Jönsson, K. I.

    2014-08-01

    Although labor- and equipment-intensive, and thus not often applied, cryoanalysis of frozen hydrated biological specimens can provide information that better reflects the living state of the organism, compared with analysis in the freeze-dried state. In this paper we report a study in which the cryoanalysis facility with cryosectioning capabilities at the Materials Research Department, iThemba LABS, South Africa was employed to evaluate the usefulness of combining three ion beam analytical methods (μPIXE, RBS and STIM) to analyze a biological target for which a better elemental compositional description is needed - the tardigrade. Imaging as well as quantification results are of interest. In a previous study, the element composition and redistribution of elements in the desiccated and active states of two tardigrade species were investigated. This study included analysis of both whole and sectioned tardigrades, and the aim was to analyze each specimen twice: first frozen hydrated, and later freeze-dried. The combination of the three analytical techniques proved useful: elements from C to Rb in the tardigrades could be determined, and certain differences in the distribution of elements between the frozen hydrated and the freeze-dried states were observed. RBS on frozen hydrated specimens provided knowledge of matrix elements.

  9. Elemental analysis of granite by instrumental neutron activation analysis (INAA) and X-ray fluorescence analysis (XRF).

    PubMed

    El-Taher, A

    2012-01-01

    The instrumental neutron activation analysis (INAA) technique was used for qualitative and quantitative analysis of granite samples collected from four locations in the Aswan area in South Egypt. The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10^11 n/cm^2·s in the TRIGA Mainz research reactor. Gamma-ray spectra from a hyper-pure germanium detector were analyzed. The present study provides basic data on the elemental concentrations of granite rocks. The following elements were determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U. X-ray fluorescence (XRF) was used for comparison and to detect elements that can be detected only by XRF, such as F, S, Cl, Co, Cu, Mo, Ni, Pb, Se and V. The data presented here are our contribution to understanding the elemental composition of granite rocks. Because there is no existing database for the elemental analysis of granite, our results are a start toward establishing one for Egyptian granite. It is hoped that the data presented here will be useful to those dealing with geochemistry, granite chemistry and related fields. Copyright © 2011 Elsevier Ltd. All rights reserved.
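    Irradiating samples together with standards, as described above, implies the relative (comparator) method of INAA: the standard's known concentration is scaled by the ratio of measured peak activities per unit mass. A simplified sketch (the function name, the optional decay correction, and all numbers are illustrative, not from the paper):

```python
import math

def concentration_by_comparator(a_sample, a_std, c_std,
                                m_sample, m_std,
                                decay_const=0.0, dt=0.0):
    """Relative INAA: C_sam = C_std * (A_sam/m_sam) / (A_std/m_std),
    with an optional decay correction exp(lambda * dt) if the sample
    was counted dt later than the standard."""
    ratio = (a_sample / m_sample) / (a_std / m_std)
    return c_std * ratio * math.exp(decay_const * dt)

# Hypothetical numbers: equal masses, sample peak twice the standard's.
c = concentration_by_comparator(2000.0, 1000.0, 5.0, 0.5, 0.5)
# → 10.0 (same units as the standard's concentration)
```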

  10. Downhole Elemental Analysis with LIBS

    NASA Technical Reports Server (NTRS)

    Moreschini, Paolo; Zacny, Kris; Rickman, Doug

    2011-01-01

    In this paper we discuss a novel instrument, currently under development at Honeybee Robotics with SBIR funding from NASA. The device is designed to characterize elemental composition as a function of depth in non-terrestrial geological formations. The instrument consists of a miniaturized laser-induced breakdown spectrometer (LIBS) analyzer integrated in a 2" diameter drill string. While the drill provides subsurface access, the LIBS analyzer provides information on the elemental composition of the borehole wall. This instrument has a variety of space applications ranging from exploration of the Moon, for which it was originally designed, to Mars, as well as a variety of terrestrial applications. Subsurface analysis is usually performed by sample acquisition through a drill or excavator, followed by sample preparation and subsequent sample presentation to an instrument or suite of instruments. An alternative approach, bringing a miniaturized version of the instrument to the sample, has many advantages over the traditional methodology, as it allows faster response, reduced probability of cross-contamination, and a simplification of the sampling mechanisms. LIBS functions by focusing a high-energy laser on a material, inducing a plasma consisting of a small fraction of the material under analysis. Optical emission from the plasma, analyzed by a spectrometer, can be used to determine elemental composition. A triangulation sensor located in the sensor head determines the distance of the sensor from the borehole wall. An actuator modifies the position of the sensor accordingly, in order to compensate for changes due to the profile of the borehole walls. This is necessary because LIBS measurements are negatively affected by changes in the relative position of the focus of the laser with respect to the position of the sample (commonly referred to as the "lens to sample distance"). Profiling the borehole is done by adjusting the position of the sensor with a

  11. Accelerator-based chemical and elemental analysis of atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Mentes, Besim

    Aerosol particles have always been present in the atmosphere, arising from natural sources. It was not until recently, when emissions from anthropogenic (man-made) sources began to dominate, that atmospheric aerosols came into focus and aerosol science in the environmental perspective started to grow. These sources emit or produce particles with different elemental and chemical compositions, as well as different sizes of the individual aerosols. The effects of increased pollution of the atmosphere are many, and have different time scales. One of the effects known today is acid rain, which causes problems for vegetation. Pollution is also a direct human health risk; in many cities, traffic driven by combustion engines is forbidden at certain times when the meteorological conditions are unfavourable. Aerosols play an important role in the climate, and may have both direct and indirect effects which cause cooling of the planet surface, in contrast to the so-called greenhouse gases. During this work, a technique for chemical and elemental analysis of atmospheric aerosols and an elemental analysis methodology for upper tropospheric aerosols have been developed. The elemental analysis is performed by the ion beam analysis (IBA) techniques PIXE (elements heavier than Al), PESA (C, N and O), cPESA (H) and pNRA (Mg and Na). The chemical speciation of atmospheric aerosols is obtained by ion beam thermography (IBT). During thermography the sample temperature is stepwise increased and the IBA techniques are used to continuously monitor the elemental concentration. A thermogram is obtained for each element. The vaporisation of the compounds in the sample appears as a concentration decrease in the thermograms at characteristic vaporisation temperatures (CVTs). Different aspects of IBT have been examined in Papers I to IV. The features of IBT are: almost total elemental speciation of the aerosol mass, chemical speciation of the inorganic compounds, carbon content

  12. Finite element dynamic analysis on CDC STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
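    The central difference explicit scheme mentioned above can be sketched for a single degree of freedom (a minimal illustration of the time-stepping recurrence, not the paper's vectorized STAR-100 implementation):

```python
def central_difference(m, k, u0, v0, dt, steps):
    """Explicit central-difference integration of m*u'' + k*u = 0:
    u_{n+1} = 2*u_n - u_{n-1} - dt**2 * (k/m) * u_n.
    Stable for dt < 2/omega, where omega = sqrt(k/m)."""
    a0 = -(k / m) * u0                          # initial acceleration
    u_prev = u0 - dt * v0 + 0.5 * dt * dt * a0  # fictitious u_{-1}
    u = u0
    history = [u0]
    for _ in range(steps):
        u_next = 2.0 * u - u_prev - dt * dt * (k / m) * u
        u_prev, u = u, u_next
        history.append(u)
    return history

# Unit oscillator with u(t) = cos(t); 314 steps of 0.01 reach t ≈ pi.
u_hist = central_difference(1.0, 1.0, 1.0, 0.0, 0.01, 314)
```

    Each step needs only the two previous displacement values, which is why the explicit scheme maps so well onto pipelined vector hardware like the STAR-100.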

  13. Geodetic Strain Analysis Tool

    NASA Technical Reports Server (NTRS)

    Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

    2011-01-01

    A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and to create models of crustal deformation using a graphical interface. Users can take any geodetic measurements of ground motion and derive the 2D crustal strain interactively. The software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and its data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as they become available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them; to aid understanding of the strengths and weaknesses of strain-field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings; to converge on a "best practices" strain-derivation strategy for the Solid Earth Science ESDR System (SESES) project, given the CGPS station distribution in the western U.S.; and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
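    The four data products named above follow directly from the components of a 2D strain tensor; a minimal sketch of those relations (function and variable names are mine, not the tool's, and the input values are hypothetical):

```python
import math

def strain_products(exx, eyy, exy):
    """Scalar products of a 2D infinitesimal strain tensor:
    dilatation, maximum shear, principal-axis angle, principal strains."""
    dilatation = exx + eyy
    max_shear = math.hypot((exx - eyy) / 2.0, exy)
    # Orientation of the principal strain axes relative to the x axis.
    angle = 0.5 * math.atan2(2.0 * exy, exx - eyy)
    mean = (exx + eyy) / 2.0
    principal = (mean + max_shear, mean - max_shear)
    return dilatation, max_shear, angle, principal

# Hypothetical dimensionless strain components:
d, g, a, p = strain_products(3.0e-7, 1.0e-7, 1.0e-7)
```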

  14. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of today's early-career engineers are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient at building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of commonly encountered situations are presented. To counter the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  15. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that the program does not become a "black box" for the analyst. Built-in databases reduce the legwork required by the analyst. Each step is clearly identified, and results are provided in numerical format as well as in color-coded, spelled-out words to draw the user's attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
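    Two of the calculations a tool of this kind automates are the torque-to-preload conversion and the margin-of-safety check. A hedged sketch using the common short-form nut-factor relation T = K·D·F (textbook relation and typical values; not comBAT's internal equations):

```python
def preload_from_torque(torque_nm, nut_factor, diameter_m):
    """Short-form torque relation T = K * D * F  =>  F = T / (K * D)."""
    return torque_nm / (nut_factor * diameter_m)

def margin_of_safety(allowable_n, applied_n, factor_of_safety=1.0):
    """MS = allowable / (FS * applied) - 1; a joint passes when MS >= 0."""
    return allowable_n / (factor_of_safety * applied_n) - 1.0

# Hypothetical M6 fastener (D = 6 mm), nut factor K = 0.2, 5 N*m torque.
preload = preload_from_torque(5.0, 0.2, 0.006)
ms = margin_of_safety(9000.0, preload, factor_of_safety=1.25)
```

    The nut factor K lumps thread and bearing friction, which is why measured preloads scatter widely around the short-form prediction.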

  16. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  17. Finite Element Analysis of Particle Ionization within Carbon Nanotube Ion Micro Thruster

    DTIC Science & Technology

    2017-12-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Master's thesis: "Finite Element Analysis of Particle Ionization within Carbon Nanotube Ion Micro Thruster." Approved for public release; distribution is unlimited. Keywords: simulation, carbon nanotube simulation, microsatellite, finite element analysis, electric field, particle tracing. 55 pages.

  18. VStar: Variable star data visualization and analysis tool

    NASA Astrophysics Data System (ADS)

    VStar Team

    2014-07-01

    VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and performs time-frequency analysis with the Weighted Wavelet Z-Transform. It offers tools for period analysis, filtering, and other functions.
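    A phase plot like the one described above is produced by folding the observation times by a trial period. A minimal sketch of that folding step (illustrative; VStar itself is a Java application and this is not its code):

```python
def fold(times, period, epoch=0.0):
    """Fold observation times into phases in [0, 1):
    phase = frac((t - epoch) / period)."""
    return [((t - epoch) / period) % 1.0 for t in times]

# Hypothetical observations of a P = 2.5 d variable; phases repeat each cycle.
phases = fold([0.0, 1.25, 2.5, 3.75], period=2.5)
# → [0.0, 0.5, 0.0, 0.5]
```

    Plotting magnitude against these phases stacks all cycles on top of one another, which is what makes the periodic light-curve shape visible.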

  19. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  20. High-Fidelity Buckling Analysis of Composite Cylinders Using the STAGS Finite Element Code

    NASA Technical Reports Server (NTRS)

    Hilburger, Mark W.

    2014-01-01

    Results from previous shell buckling studies are presented that illustrate some of the unique and powerful capabilities in the STAGS finite element analysis code that have made it an indispensable tool in structures research at NASA over the past few decades. In particular, prototypical results from the development and validation of high-fidelity buckling simulations are presented for several unstiffened thin-walled compression-loaded graphite-epoxy cylindrical shells along with a discussion on the specific methods and user-defined subroutines in STAGS that are used to carry out the high-fidelity simulations. These simulations accurately account for the effects of geometric shell-wall imperfections, shell-wall thickness variations, local shell-wall ply-gaps associated with the fabrication process, shell-end geometric imperfections, nonuniform applied end loads, and elastic boundary conditions. The analysis procedure uses a combination of nonlinear quasi-static and transient dynamic solution algorithms to predict the prebuckling and unstable collapse response characteristics of the cylinders. Finally, the use of high-fidelity models in the development of analysis-based shell-buckling knockdown (design) factors is demonstrated.

  1. Guidelines and Recommendations on the Use of Higher Order Finite Elements for Bending Analysis of Plates

    NASA Astrophysics Data System (ADS)

    Carrera, E.; Miglioretti, F.; Petrolo, M.

    2011-11-01

    This paper compares and evaluates various plate finite elements to analyse the static response of thick and thin plates subjected to different loading and boundary conditions. Plate elements are based on different assumptions for the displacement distribution along the thickness direction. Classical (Kirchhoff and Reissner-Mindlin), refined (Reddy and Kant), and other higher-order displacement fields are implemented, up to fourth-order expansion. The Unified Formulation (UF) by the first author is used to derive finite element matrices in terms of fundamental nuclei which consist of 3×3 arrays. The MITC4 shear-locking-free formulation is used for the FE approximation. Accuracy of a given plate element is established in terms of the error vs. the thickness-to-length parameter. A significant number of finite elements for plates are implemented and compared using displacement and stress variables for various plate problems. Reduced models that are able to detect the 3D solution are built, and a Best Plate Diagram (BPD) is introduced to give guidelines for the construction of plate theories based on a given accuracy and number of terms. It is concluded that the UF is a valuable tool to establish, for a given plate problem, the most accurate FE able to furnish results within a certain accuracy range. This allows us to obtain guidelines and recommendations for building refined elements for the bending analysis of plates for various geometries, loadings, and boundary conditions.

  2. Tools for Large-Scale Mobile Malware Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bierma, Michael

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems) [1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  3. An analysis of cross-coupling of a multicomponent jet engine test stand using finite element modeling techniques

    NASA Technical Reports Server (NTRS)

    Schweikhard, W. G.; Singnoi, W. N.

    1985-01-01

    A two-axis thrust measuring system was analyzed by using a finite element computer program to determine the sensitivities of the thrust vectoring nozzle system to misalignment of the load cells and applied loads, and to the stiffness of the structural members. Three models were evaluated: (1) the basic measuring element and its internal calibration load cells; (2) the basic measuring element and its external load calibration equipment; and (3) the basic measuring element, external calibration load frame, and the altitude facility support structure. Alignment of calibration loads was the greatest source of error for multiaxis thrust measuring systems. Uniform increases or decreases in the stiffness of the members, which might be caused by the selection of the materials, have little effect on the accuracy of the measurements. It is found that the POLO-FINITE program is a viable tool for designing and analyzing multiaxis thrust measurement systems. The response of the test stand to step inputs that might be encountered with thrust vectoring tests was determined. The dynamic analysis shows a potential problem for measuring the dynamic response characteristics of thrust vectoring systems because of the inherently light damping of the test stand.

  4. Environmental influence on trace element levels in human hair

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Limic, N.; Valkovic, V.

    1986-12-01

    Trace element content of human hair depends on many factors. It has been shown by a large number of investigators that environmental factors play an important role. Elements from air particulates, water, shampoo or other media get incorporated into the hair structure. Here a model is proposed in which different contributions to trace element levels in human hair are factorized and the environmental contribution to the radial and longitudinal concentration profiles can be calculated. With the proper understanding of environmental contamination, hair analysis has better chances of being used as a diagnostic tool.

  5. Capillary Optics Based X-Ray Micro-Imaging Elemental Analysis

    NASA Astrophysics Data System (ADS)

    Hampai, D.; Dabagov, S. B.; Cappuccio, G.; Longoni, A.; Frizzi, T.; Cibin, G.

    2010-04-01

    Micro-X-ray fluorescence spectrometry (μXRF), rapidly developed during the last few years, is a promising multi-elemental technique for non-destructive analysis. Typically it is rather hard to perform laboratory μXRF analysis because of the difficulty of producing a small-size X-ray beam as well as focusing it. Polycapillary optics, recently developed for X-ray beam focusing, makes laboratory X-ray microprobes possible. The combination of a polycapillary lens and a fine-focused micro X-ray tube can provide the high-intensity radiation flux on a sample that is necessary in order to perform elemental analysis. In comparison to a pinhole, an optimized "X-ray source-optics" system can result in a radiation density gain of more than 3 orders of magnitude. The most advanced way to get that result is to use a confocal configuration based on two X-ray lenses, one for the fluorescence excitation and the other for the detection of secondary emission from the sample studied. In the case of X-ray capillary microfocusing, a μXRF instrument designed in the confocal scheme allows us to obtain 3D elemental mapping. In this work we show preliminary results obtained with our prototype, a portable X-ray microscope for both X-ray imaging and fluorescence analysis; it enables μXRF elemental mapping simultaneously with X-ray imaging. A prototype of a compact XRF spectrometer with a spatial resolution of less than 100 μm has been designed.

  6. A Six-Node Curved Triangular Element and a Four-Node Quadrilateral Element for Analysis of Laminated Composite Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Martin, C. Wayne; Breiner, David M.; Gupta, Kajal K. (Technical Monitor)

    2004-01-01

    Mathematical development and some computed results are presented for Mindlin plate and shell elements, suitable for analysis of laminated composite and sandwich structures. These elements use the conventional 3 (plate) or 5 (shell) nodal degrees of freedom, have no communicable mechanisms, have no spurious shear energy (no shear locking), have no spurious membrane energy (no membrane locking) and do not require arbitrary reduction of out-of-plane shear moduli or under-integration. Artificial out-of-plane rotational stiffnesses are added at the element level to avoid convergence problems or singularity due to flat spots in shells. This report discusses a 6-node curved triangular element and a 4-node quadrilateral element. Findings show that in regular rectangular meshes, the Martin-Breiner 6-node triangular curved shell (MB6) is approximately equivalent to the conventional 8-node quadrilateral with integration. The 4-node quadrilateral (MB4) has very good accuracy for a 4-node element, and may be preferred in vibration analysis because of narrower bandwidth. The mathematical developments used in these elements, those discussed in the seven appendices, have been applied to elements with 3, 4, 6, and 10 nodes and can be applied to other nodal configurations.

  7. Modeling Intracochlear Magnetic Stimulation: A Finite-Element Analysis.

    PubMed

    Mukesh, S; Blake, D T; McKinnon, B J; Bhatti, P T

    2017-08-01

    This study models induced electric fields, and their gradient, produced by pulsatile current stimulation of submillimeter inductors for cochlear implantation. Using finite-element analysis, the lower chamber of the cochlea, scala tympani, is modeled as a cylindrical structure filled with perilymph bounded by tissue, bone, and cochlear neural elements. Single inductors as well as an array of inductors are modeled. The coil strength (~100 nH) and excitation parameters (peak current of 1-5 A, voltages of 16-20 V) are based on a formative feasibility study conducted by our group. In that study, intracochlear micromagnetic stimulation achieved auditory activation as measured through the auditory brainstem response in a feline model. With respect to the finite element simulations, axial symmetry of the inductor geometry is exploited to improve computation time. It is verified that the inductor coil orientation greatly affects the strength of the induced electric field and thereby the ability to affect the transmembrane potential of nearby neural elements. Furthermore, upon comparing an array of micro-inductors with a typical multi-site electrode array, magnetically excited arrays retain greater focus in terms of the gradient of induced electric fields. Once combined with further in vivo analysis, this modeling study may enable further exploration of the mechanism of magnetically induced, and focused neural stimulation.

  8. A Study on Urban Road Traffic Safety Based on Matter Element Analysis

    PubMed Central

    Hu, Qizhou; Zhou, Zhuping; Sun, Xu

    2014-01-01

    This paper examines a new evaluation of urban road traffic safety based on matter element analysis, avoiding the difficulties found in other traffic safety evaluations. The issue of urban road traffic safety has been investigated through matter element analysis theory. The chief aim of the present work is to investigate the features of urban road traffic safety. Emphasis was placed on the construction of a criterion function by which traffic safety achieved a hierarchical system of objectives to be evaluated. Matter element analysis theory was used to create a comprehensive appraisal model of urban road traffic safety, employing a newly developed and versatile matter element analysis algorithm. The matter element matrix resolves the uncertainty and incompatibility of the factors used to assess urban road traffic safety. The application results showed the superiority of the evaluation model, and a didactic example is included to illustrate the computational procedure. PMID:25587267
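    The criterion function at the heart of matter element (extenics) evaluation is usually the elementary correlation function built from the extension distance to a classical domain and a larger joint domain. A sketch of that textbook form (standard extenics definitions with hypothetical safety-grade intervals; not necessarily the paper's exact algorithm):

```python
def rho(x, a, b):
    """Extension distance from point x to the interval [a, b]."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def correlation(x, classical, joint):
    """Textbook matter-element correlation function K(x):
    K > 0 inside the classical domain, -1 < K < 0 between the
    classical and joint domains, K < -1 outside the joint domain."""
    d0 = rho(x, *classical)
    if d0 <= 0:                       # x lies in the classical domain
        return -d0 / (classical[1] - classical[0])
    d = rho(x, *joint)
    return d0 / (d - d0)

# Hypothetical grade: "safe" classical domain (0, 10) within joint (0, 100).
k_safe = correlation(5.0, (0.0, 10.0), (0.0, 100.0))
k_marginal = correlation(50.0, (0.0, 10.0), (0.0, 100.0))
```

    Each evaluated factor gets such a K value per grade, and the grade with the largest (often weight-averaged) K is assigned; this is how the method handles factors that are incompatible on their raw scales.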

  9. Trace element analysis of coal by neutron activation.

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.

    1973-01-01

    The irradiation, counting, and data reduction scheme is described for an analysis capability of 1000 samples per year. Up to 56 elements are reported on each sample. The precision and accuracy of the method are shown for 25 elements designated as hazardous by the Environmental Protection Agency (EPA). The interference corrections for selenium and ytterbium on mercury and ytterbium on selenium are described. The effect of bromine and antimony on the determination of arsenic is also mentioned. The use of factorial design techniques to evaluate interferences in the determination of mercury, selenium, and arsenic is shown. Some typical trace element results for coal, fly ash, and bottom ash are given.

  11. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  12. Rolling-Element Fatigue Testing and Data Analysis - A Tutorial

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Zaretsky, Erwin V.

    2011-01-01

    In order to rank bearing materials, lubricants and other design variables using rolling-element bench type fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs with those results obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis as well as making recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
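    At its simplest, the Weibull/Johnson procedure described above amounts to ranking the failure times, assigning median ranks, and fitting a straight line on Weibull probability axes. An illustrative sketch using Benard's median-rank approximation and least squares (a common textbook shortcut with made-up failure times, not the tutorial's exact method):

```python
import math

def weibull_fit(failure_times):
    """Least-squares fit of ln(-ln(1 - F)) vs ln(t) using Benard's
    median-rank approximation F_i = (i - 0.3) / (n + 0.4).
    Returns (shape beta, scale eta)."""
    ts = sorted(failure_times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4)))
          for i in range(1, n + 1)]
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)
    return beta, eta

def l10_life(beta, eta):
    """Life at 10% failure probability: t = eta * (-ln 0.9)**(1/beta)."""
    return eta * (-math.log(0.9)) ** (1.0 / beta)

# Hypothetical fatigue lives (millions of revolutions) from one test rig.
beta, eta = weibull_fit([61.0, 88.0, 112.0, 135.0, 160.0, 195.0])
```

    The L10 life, the life that 90 percent of specimens are expected to survive, is the standard figure of merit used to rank bearing materials and lubricants.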

  13. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techiques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  14. Finite Element Analysis of a Dynamically Loaded Flat Laminated Plate

    DTIC Science & Technology

    1980-07-01

    The elements are stacked in the thickness direction to represent various material layers. This analysis allows for orthotropic, elastic-plastic or elastic-viscoplastic material behavior; the response varies with time, consequently the materials are assumed to be represented by elastic-plastic and elastic-viscoplastic models. (Table-of-contents fragments: Strain Increments; V. Plasticity: Orthotropic Elastic-Plastic Yielding, Orthotropic Elastic-Viscoplastic Yielding; VI. Element Equilibrium.)

  15. Tools for Understanding Identity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael

    to take into account the difficulty of the inferences, allowing the user to consider different scenarios depending on the perceived resources of the attacker, or to prioritize lines of investigation. It also has a number of interesting visualizations that are designed to aid the user in understanding the model. The tool works by considering the inferences as a graph and runs various graph-theoretic algorithms, with some novel adaptations, in order to deduce various properties. To help investigators exploit the model to perform identity attribution, we have developed the Identity Map visualization. For a user-provided set of known starting elements and a set of desired target elements for a given identity, the Identity Map generates investigative workflows as paths through the model. Each path consists of a series of elements and inferences between them that connect the input and output elements. Each path also has an associated confidence level that estimates the reliability of the resulting attribution. Identity Map can help investigators understand the possible ways to make an identification decision and guide them toward the data-collection or analysis steps required to reach that decision.

  16. Biomechanical analysis comparing natural and alloplastic temporomandibular joint replacement using a finite element model.

    PubMed

    Mesnard, Michel; Ramos, Antonio; Ballu, Alex; Morlier, Julien; Cid, M; Simoes, J A

    2011-04-01

    Prosthetic materials and bone present quite different mechanical properties. Consequently, mandible reconstruction with metallic materials (or a mandible condyle implant) modifies the physiologic behavior of the mandible (stress, strain patterns, and condyle displacements). The change in bone strain distribution results in an adaptation of the temporomandibular joint, including articular contacts. Using a validated finite element model, the natural mandible strains and condyle displacements were evaluated. Modifications of strains and displacements were then assessed for 2 different temporomandibular joint implants. Because materials and geometry play key roles, mechanical properties of cortical bone were taken into account in the models used in finite element analysis. The finite element model allowed verification of the worst loading configuration of the mandibular condyle. Replacing the natural condyle with 1 of the 2 tested implants, the results also show the importance of implant geometry for biomechanical mandibular behavior. The implant geometry and stiffness mainly influenced strain distribution. The different forces applied to the mandible by the elevator muscles, teeth, and joint loads indicate that the finite element model is a relevant tool to optimize implant geometry or, in a subsequent study, to choose a more suitable distribution of the screws. Bone screws (number and position) have a significant influence on mandibular behavior and on the implant stress pattern. Stress concentration and implant fracture must be avoided. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
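
    The b-value monitoring described above can be sketched with the classical Aki maximum-likelihood estimator; the catalog here is synthetic, generated with a known b-value so the estimate can be checked, and is not real data.

```python
import numpy as np

# Synthetic catalog: Gutenberg-Richter magnitudes above completeness Mc are
# exponentially distributed with rate b*ln(10); b_true is known so the
# estimator can be sanity-checked.
rng = np.random.default_rng(0)
Mc, b_true = 2.0, 1.0
mags = Mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

# Aki maximum-likelihood estimate of the b-value.
b_hat = np.log10(np.e) / (mags.mean() - Mc)
print(f"estimated b-value: {b_hat:.2f}")
```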

  18. Elemental content of Vietnamese rice. Part 2. Multivariate data analysis.

    PubMed

    Kokot, S; Phuong, T D

    1999-04-01

    Rice samples were obtained from the Red River region and some other parts of Vietnam as well as from Yanco, Australia. These samples were analysed for 14 elements (P, K, Mg, Ca, Mn, Zn, Fe, Cu, Al, Na, Ni, As, Mo and Cd) by ICP-AES, ICP-MS and FAAS as described in Part 1. This data matrix was then submitted to multivariate data analysis by principal component analysis to investigate the influences of environmental and crop cultivation variables on the elemental content of rice. Results revealed that geographical location, grain variety, seasons and soil conditions are the most likely significant factors causing changes in the elemental content between the rice samples. To assess rice quality according to its elemental content and physio-biological properties, a multicriteria decision making method (PROMETHEE) was applied. With the Vietnamese rice, the sticky rice appeared to contain somewhat higher levels of nutritionally significant elements such as P, K and Mg than the non-sticky rice. Also, rice samples grown during the wet season have better levels of nutritionally significant mineral elements than those of the dry season, but in general, the wet season seemed to provide better overall elemental and physio-biological rice quality.
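
    A minimal sketch of the principal component analysis step described above, applied to a toy matrix of element concentrations (rows are samples, columns are elements; all numbers invented, not the Vietnamese rice data):

```python
import numpy as np

# Toy data: rows = rice samples, columns = element concentrations
# (say P, K, Mg, Zn); all numbers invented for illustration.
X = np.array([[3200., 2400., 1100., 18.],
              [3400., 2500., 1150., 20.],
              [2100., 1800.,  800., 12.],
              [2000., 1750.,  790., 11.],
              [3300., 2450., 1120., 19.]])

# Autoscale (mean-center, unit variance) so no element dominates by units.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: scores project the samples onto the principal components.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * S
explained = S**2 / np.sum(S**2)
print("variance explained by PC1:", round(float(explained[0]), 3))
```

    With strongly correlated columns like these, the first component captures nearly all the variance, which is what makes PCA useful for spotting sample groupings such as season or location.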

  19. Three-dimensional evidence network plot system: covariate imbalances and effects in network meta-analysis explored using a new software tool.

    PubMed

    Batson, Sarah; Score, Robert; Sutton, Alex J

    2017-06-01

    The aim of the study was to develop the three-dimensional (3D) evidence network plot system-a novel web-based interactive 3D tool to facilitate the visualization and exploration of covariate distributions and imbalances across evidence networks for network meta-analysis (NMA). We developed the 3D evidence network plot system within an AngularJS environment using a third party JavaScript library (Three.js) to create the 3D element of the application. Data used to enable the creation of the 3D element for a particular topic are inputted via a Microsoft Excel template spreadsheet that has been specifically formatted to hold these data. We display and discuss the findings of applying the tool to two NMA examples considering multiple covariates. These two examples have been previously identified as having potentially important covariate effects and allow us to document the various features of the tool while illustrating how it can be used. The 3D evidence network plot system provides an immediate, intuitive, and accessible way to assess the similarity and differences between the values of covariates for individual studies within and between each treatment contrast in an evidence network. In this way, differences between the studies, which may invalidate the usual assumptions of an NMA, can be identified for further scrutiny. Hence, the tool facilitates NMA feasibility/validity assessments and aids in the interpretation of NMA results. The 3D evidence network plot system is the first tool designed specifically to visualize covariate distributions and imbalances across evidence networks in 3D. This will be of primary interest to systematic review and meta-analysis researchers and, more generally, those assessing the validity and robustness of an NMA to inform reimbursement decisions. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models,more » using human hand and knee examples, and will demonstrate their software tools.« less

  1. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  2. Elemental analysis of glass by laser ablation inductively coupled plasma optical emission spectrometry (LA-ICP-OES).

    PubMed

    Schenk, Emily R; Almirall, José R

    2012-04-10

    The elemental analysis of glass evidence has been established as a powerful discrimination tool for forensic analysts. Laser ablation inductively coupled plasma optical emission spectrometry (LA-ICP-OES) has been compared to laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and energy dispersive micro X-ray fluorescence spectroscopy (μXRF/EDS) as competing instrumentation for the elemental analysis of glass. The development of a method for the forensic analysis of glass coupling laser ablation to ICP-OES is presented for the first time. LA-ICP-OES has demonstrated comparable analytical performance to LA-ICP-MS based on the use of the element menu, Al (Al I 396.15 nm), Ba (Ba II 455.40 nm), Ca (Ca II 315.88 nm), Fe (Fe II 238.20 nm), Li (Li I 670.78 nm), Mg (Mg I 285.21 nm), Sr (Sr II 407.77 nm), Ti (Ti II 368.51 nm), and Zr (Zr II 343.82 nm). The relevant figures of merit, such as precision, accuracy and sensitivity, are presented and compared to LA-ICP-MS. A set of 41 glass samples was used to assess the discrimination power of the LA-ICP-OES method in comparison to other elemental analysis techniques. This sample set consisted of several vehicle glass samples that originated from the same source (inside and outside windshield panes) and several glass samples that originated from different vehicles. Different match criteria were used and compared to determine the potential for Type I and Type II errors. It was determined that a broader match criterion is more applicable to forensic glass comparison because it can reduce the effect that micro-heterogeneity inherent in the glass fragments and a less-than-ideal sampling strategy can have on the interpretation of the results. Based on the test set reported here, a plus or minus four standard deviation (± 4s) match criterion yielded the lowest possibility of Type I and Type II errors. The developed LA-ICP-OES method has been shown to perform similarly to LA-ICP-MS in the…
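
    The ±4s match criterion discussed above can be sketched as an interval test; the replicate values below are invented, and the function is a simplified stand-in for the full forensic comparison protocol.

```python
import numpy as np

def match(known, questioned, k=4.0):
    """Declare a 'match' if the questioned fragment's mean concentration lies
    within mean +/- k*s of the known-source replicates for every element;
    k=4 mirrors the +/-4s criterion discussed in the abstract."""
    mu = known.mean(axis=0)
    s = known.std(axis=0, ddof=1)
    return bool(np.all(np.abs(questioned.mean(axis=0) - mu) <= k * s))

# Invented replicate measurements (rows) for three elements (columns).
known = np.array([[101., 55., 9.8],
                  [ 99., 54., 10.1],
                  [100., 56., 10.0]])
same_source = np.array([[100.5, 55.2, 9.9]])
diff_source = np.array([[130.0, 70.0, 14.0]])
print(match(known, same_source), match(known, diff_source))   # True False
```

    Widening k trades Type I errors (false exclusions) against Type II errors (false inclusions), which is the trade-off the abstract evaluates.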

  3. General Mission Analysis Tool (GMAT) Mathematical Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  4. Determination of minor and trace elements in kidney stones by x-ray fluorescence analysis

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali; Heisinger, Brianne J.; Sinha, Vaibhav; Lee, Hyong-Koo; Liu, Xin; Qu, Mingliang; Duan, Xinhui; Leng, Shuai; McCollough, Cynthia H.

    2014-03-01

    The accurate determination of the material composition of a kidney stone is crucial for understanding its formation as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identifying the materials present in a kidney stone. In particular, x-ray fluorescence (XRF) can be very useful for the determination of minor and trace materials in the kidney stone. The X-ray fluorescence measurements were performed at the Radiation Measurements and Spectroscopy Laboratory (RMSL) of the department of nuclear engineering of the Missouri University of Science and Technology, and the kidney stones were acquired from the Mayo Clinic, Rochester, Minnesota. Experimental studies in conjunction with analytical techniques were used to determine the composition of the kidney stones. A new type of experimental set-up was developed and utilized for XRF analysis of the kidney stones. The correlation of applied radiation source intensity, the X-ray spectra emitted by the elements involved, and absorption coefficient characteristics was analyzed. To verify the experimental results against analytical calculation, several sets of kidney stones were analyzed using the XRF technique. The elements identified with this technique are silver (Ag), arsenic (As), bromine (Br), chromium (Cr), copper (Cu), gallium (Ga), germanium (Ge), molybdenum (Mo), niobium (Nb), rubidium (Rb), selenium (Se), strontium (Sr), yttrium (Y), and zirconium (Zr). This paper presents a new approach for accurate determination of the material composition of kidney stones using the XRF instrumental activation analysis technique.
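
    A toy sketch of the element-identification step: matching detected XRF peak energies against approximate Kα line energies. The energy table holds textbook values rounded to two decimals; a real analysis would use a complete emission-line database and consider Kβ and L lines as well.

```python
# Approximate K-alpha emission energies in keV (textbook values, rounded);
# a real analysis would use a full emission-line database.
KALPHA = {"Cu": 8.05, "Zn": 8.64, "As": 10.54, "Se": 11.22, "Br": 11.92,
          "Rb": 13.40, "Sr": 14.17, "Zr": 15.78, "Mo": 17.48, "Ag": 22.16}

def identify(peaks_keV, tol=0.15):
    """Assign each detected peak to the nearest tabulated K-alpha line,
    accepting the match only if it falls within the tolerance."""
    found = []
    for e in peaks_keV:
        elem, line = min(KALPHA.items(), key=lambda kv: abs(kv[1] - e))
        if abs(line - e) <= tol:
            found.append(elem)
    return found

print(identify([8.06, 14.20, 17.45]))   # ['Cu', 'Sr', 'Mo']
```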

  5. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  6. Considerations for Reporting Finite Element Analysis Studies in Biomechanics

    PubMed Central

    Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason; Tadepalli, Srinivas C.; Morrison, Tina M.

    2012-01-01

    Simulation-based medicine and the development of complex computer models of biological structures is becoming ubiquitous for advancing biomedical engineering and clinical research. Finite element analysis (FEA) has been widely used in the last few decades to understand and predict biomechanical phenomena. Modeling and simulation approaches in biomechanics are highly interdisciplinary, involving novice and skilled developers in all areas of biomedical engineering and biology. While recent advances in model development and simulation platforms offer a wide range of tools to investigators, the decision making process during modeling and simulation has become more opaque. Hence, reliability of such models used for medical decision making and for driving multiscale analysis comes into question. Establishing guidelines for model development and dissemination is a daunting task, particularly with the complex and convoluted models used in FEA. Nonetheless, if better reporting can be established, researchers will have a better understanding of a model’s value and the potential for reusability through sharing will be bolstered. Thus, the goal of this document is to identify resources and considerations for reporting parameters for FEA studies in biomechanics. These entail various levels of reporting parameters for model identification, model structure, simulation structure, verification, validation, and availability. While we recognize that it may not be possible to provide and detail all of the reporting considerations presented, it is possible to establish a level of confidence with selective use of these parameters. More detailed reporting, however, can establish an explicit outline of the decision-making process in simulation-based analysis for enhanced reproducibility, reusability, and sharing. PMID:22236526

  7. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  8. DCODE.ORG Anthology of Comparative Genomic Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loots, G G; Ovcharenko, I

    2005-01-11

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the noncoding encryption of gene regulation across genomes. To facilitate the application of comparative genomics to practical problems in genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow, for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, CREME; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here we briefly describe each one of these tools and provide specific examples of their practical applications. All the tools are publicly available at the http://www.dcode.org/ web site.

  9. Finite element stress, vibration, and buckling analysis of laminated beams with the use of refined elements

    NASA Astrophysics Data System (ADS)

    Borovkov, Alexei I.; Avdeev, Ilya V.; Artemyev, A.

    1999-05-01

    In the present work, finite element analysis of the stress, vibration, and buckling of laminated beams is performed. The equivalent single-layer (ESL) laminate theories are reviewed. Finite element algorithms and procedures, integrated into the original FEA program system and based on the classical laminated plate theory (CLPT), the first-order shear deformation theory (FSDT), the third-order theory of Reddy (TSDT-R), and the third-order theory of Kant (TSDT-K), with the Lanczos method used for solving the eigenproblem, are developed. Several numerical tests and examples of bending, free vibration, and buckling of multilayered and sandwich beams with various material properties, geometries, and boundary conditions are solved. A new, effective higher-order hierarchical element for the accurate calculation of transverse shear stress is proposed. A comparative analysis of the results obtained with the considered models against solutions of 2D problems of heterogeneous anisotropic elasticity is performed.
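
    As a point of reference for the laminate theories surveyed above, the classical laminated plate theory (CLPT) stiffness matrices A, B, and D can be assembled in a few lines; the material constants and layup below are generic illustrative values, not those of the paper.

```python
import numpy as np

# Generic carbon/epoxy ply constants (illustrative, not from the paper).
E1, E2, nu12, G12 = 140e9, 10e9, 0.3, 5e9
nu21 = nu12 * E2 / E1
den = 1.0 - nu12 * nu21
Q0 = np.array([[E1 / den, nu12 * E2 / den, 0.0],
               [nu12 * E2 / den, E2 / den, 0.0],
               [0.0, 0.0, G12]])           # reduced stiffness of a 0-degree ply
Q90 = Q0.copy()
Q90[0, 0], Q90[1, 1] = Q0[1, 1], Q0[0, 0]  # 90-degree ply: swap 11 and 22

t = 0.125e-3                               # ply thickness [m]
plies = [Q0, Q90, Q90, Q0]                 # symmetric [0/90]s layup
z = np.linspace(-2 * t, 2 * t, len(plies) + 1)   # interface coordinates

# CLPT laminate stiffnesses: membrane (A), coupling (B), bending (D).
A = sum(Q * (z[k + 1] - z[k]) for k, Q in enumerate(plies))
B = sum(Q * (z[k + 1]**2 - z[k]**2) / 2.0 for k, Q in enumerate(plies))
D = sum(Q * (z[k + 1]**3 - z[k]**3) / 3.0 for k, Q in enumerate(plies))
print("membrane-bending coupling vanishes for a symmetric layup:",
      np.allclose(B, 0.0))
```

    The higher-order theories in the abstract (FSDT, TSDT) enrich the through-thickness kinematics beyond this CLPT baseline to capture transverse shear.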

  10. Comparison of hexahedral and tetrahedral elements in finite element analysis of the foot and footwear.

    PubMed

    Tadepalli, Srinivas C; Erdemir, Ahmet; Cavanagh, Peter R

    2011-08-11

    Finite element analysis has been widely used in the field of foot and footwear biomechanics to determine plantar pressures as well as stresses and strains within soft tissue and footwear materials. When dealing with anatomical structures such as the foot, hexahedral mesh generation accounts for most of the model development time due to geometric complexities imposed by branching and embedded structures. Tetrahedral meshing, which can be more easily automated, has been the approach of choice to date in foot and footwear biomechanics. Here we use the nonlinear finite element program Abaqus (Simulia, Providence, RI) to examine the advantages and disadvantages of tetrahedral and hexahedral elements under compression and shear loading, material incompressibility, and frictional contact conditions, which are commonly seen in foot and footwear biomechanics. This study demonstrated that for a range of simulation conditions, hybrid hexahedral elements (Abaqus C3D8H) consistently performed well while hybrid linear tetrahedral elements (Abaqus C3D4H) performed poorly. On the other hand, enhanced quadratic tetrahedral elements with improved stress visualization (Abaqus C3D10I) performed as well as the hybrid hexahedral elements in terms of contact pressure and contact shear stress predictions. Although the enhanced quadratic tetrahedral element simulations were computationally expensive compared to hexahedral element simulations in both barefoot and footwear conditions, the enhanced quadratic tetrahedral element formulation seems to be very promising for foot and footwear applications as a result of decreased labor and expedited model development, all related to facilitated mesh generation. Copyright © 2011. Published by Elsevier Ltd.

  11. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano-processes at the research and development level by accelerating access to knowledge and hence speeding up implementation in product lines.
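
    As a toy illustration of the kind of metrology these tools automate, a critical dimension (line width) can be read from a 1-D intensity profile by thresholding at 50% between background and line levels; the profile below is synthetic and the method deliberately simplistic.

```python
import numpy as np

# Synthetic 1-D line-scan intensity profile over a 100 nm field of view.
x = np.linspace(0.0, 100.0, 1001)                    # position [nm]
profile = np.where((x > 40.0) & (x < 60.0), 1.0, 0.1)

# 50% threshold between background and line levels; the width of the
# above-threshold region is the measured critical dimension (CD).
half = 0.5 * (profile.min() + profile.max())
above = x[profile > half]
cd = above.max() - above.min()
print(f"measured line width: {cd:.1f} nm")
```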

  12. Tools4miRs - one place to gather all the tools for miRNA analysis.

    PubMed

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-09-01

    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that currently gathers more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. piotr@ibb.waw.pl Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  13. Probabilistic finite elements for fracture and fatigue analysis

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure or performance function is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in the element is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack-tip. Performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
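
    The failure criterion stated above (fatigue life must exceed service life) lends itself to a Monte-Carlo sketch; the lognormal life distribution and fixed service life below are assumptions for illustration, not the PFEM/reliability method of the paper.

```python
import numpy as np

# Assumed distributions, purely for illustration: lognormal fatigue life,
# deterministic service life. Failure occurs when fatigue life falls short.
rng = np.random.default_rng(1)
n = 200_000
fatigue_life = rng.lognormal(mean=np.log(1.0e6), sigma=0.4, size=n)  # cycles
service_life = 4.0e5                                                 # cycles

g = fatigue_life - service_life    # performance function: g < 0 means failure
p_fail = float(np.mean(g < 0))
print(f"estimated probability of fatigue failure: {p_fail:.4f}")
```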

  14. A new methodology for free wake analysis using curved vortex elements

    NASA Technical Reports Server (NTRS)

    Bliss, Donald B.; Teske, Milton E.; Quackenbush, Todd R.

    1987-01-01

    A method using curved vortex elements was developed for helicopter rotor free wake calculations. The Basic Curved Vortex Element (BCVE) is derived from the approximate Biot-Savart integration for a parabolic arc filament. When used in conjunction with a scheme to fit the elements along a vortex filament contour, this method has a significant advantage in overall accuracy and efficiency when compared to the traditional straight-line element approach. A theoretical and numerical analysis shows that free wake flows involving close interactions between filaments should utilize curved vortex elements in order to guarantee a consistent level of accuracy. The curved element method was implemented into a forward flight free wake analysis, featuring an adaptive far wake model that utilizes free wake information to extend the vortex filaments beyond the free wake regions. The curved vortex element free wake, coupled with this far wake model, exhibited rapid convergence, even in regions where the free wake and far wake turns are interlaced. Sample calculations are presented for tip vortex motion at various advance ratios for single and multiple blade rotors. Cross-flow plots reveal that the overall downstream wake flow resembles a trailing vortex pair. A preliminary assessment shows that the rotor downwash field is insensitive to element size, even for relatively large curved elements.
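
    For contrast with the curved elements described above, the classical straight-line vortex element evaluates the Biot-Savart law over a straight segment; the sketch below uses a standard de-singularized form with an assumed core parameter, not the paper's parabolic-arc formulation.

```python
import numpy as np

def segment_velocity(p, a, b, gamma=1.0, core=1e-6):
    """Velocity induced at point p by a straight vortex segment from a to b
    (Biot-Savart law), with a small assumed core term to de-singularize the
    field near the filament."""
    r1, r2 = p - a, p - b
    cr = np.cross(r1, r2)
    denom = np.dot(cr, cr) + core**2
    r0 = b - a
    k = gamma / (4.0 * np.pi * denom) * (
        np.dot(r0, r1) / np.linalg.norm(r1) - np.dot(r0, r2) / np.linalg.norm(r2))
    return k * cr

# Unit-strength segment along the x-axis; evaluation point above its midpoint.
v = segment_velocity(np.array([0.5, 1.0, 0.0]),
                     np.array([0.0, 0.0, 0.0]),
                     np.array([1.0, 0.0, 0.0]))
print(v)   # ~ [0, 0, 0.0712]: along +z by the right-hand rule
```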

  15. Automated Finite Element Analysis of Elastically-Tailored Plates

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Tatting, Brian F.; Guerdal, Zafer

    2003-01-01

    A procedure for analyzing and designing elastically tailored composite laminates using the STAGS finite element solver has been presented. The methodology used to produce the elastic tailoring, namely computer-controlled steering of unidirectionally reinforced composite material tows, has been reduced to a handful of design parameters along with a selection of construction methods. The generality of the tow-steered ply definition provides the user a wide variety of options for laminate design, which can be automatically incorporated with any finite element model that is composed of STAGS shell elements. Furthermore, the variable stiffness parameterization is formulated so that manufacturability can be assessed during the design process, plus new ideas using tow steering concepts can be easily integrated within the general framework of the elastic tailoring definitions. Details of the necessary implementation of the tow-steering definitions within the STAGS hierarchy are provided, and the format of the ply definitions is discussed in detail to provide easy access to the elastic tailoring choices. Integration of the automated STAGS solver with laminate design software has been demonstrated, so that the large design space generated by the tow-steering options can be traversed effectively. Several design problems are presented which confirm the usefulness of the design tool as well as further establish the potential of tow-steered plies for laminate design.

  16. Analysis Tools in Geant4 10.2 and 10.3

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Barrand, G.

    2017-10-01

    A new analysis category based on g4tools was added in Geant4 release 9.5 (2011). The aim was to provide users with a lightweight analysis tool available as part of the Geant4 installation without the need to link to an external analysis package. It has progressively been included in all Geant4 examples. Frequent questions in the Geant4 users forum show its increasing popularity in the Geant4 users community. In this presentation, we will give a brief overview of g4tools and the analysis category. We report on new developments since our CHEP 2013 contribution as well as mention upcoming new features.

  17. A 2-D Interface Element for Coupled Analysis of Independently Modeled 3-D Finite Element Subdomains

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.

    1998-01-01

    Over the past few years, the development of the interface technology has provided an analysis framework for embedding detailed finite element models within finite element models which are less refined. This development has enabled the use of cascading substructure domains without the constraint of coincident nodes along substructure boundaries. The approach used for the interface element is based on an alternate variational principle often used in deriving hybrid finite elements. The resulting system of equations exhibits a high degree of sparsity but gives rise to a non-positive definite system which causes difficulties with many of the equation solvers in general-purpose finite element codes. Hence the global system of equations is generally solved using a decomposition procedure with pivoting. The research reported to date for the interface element includes the one-dimensional line interface element and the two-dimensional surface interface element. Several large-scale simulations, including geometrically nonlinear problems, have been reported using the one-dimensional interface element technology; however, only limited applications are available for the surface interface element. In the applications reported to date, the geometries of the interfaced domains exactly match each other even though the spatial discretization within each domain may be different. As such, the spatial modeling of each domain, the interface elements and the assembled system is still laborious. The present research is focused on developing a rapid modeling procedure based on a parametric interface representation of independently defined subdomains which are also independently discretized.
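
    The non-positive-definite system mentioned above can be illustrated with a tiny saddle-point example: coupling degrees of freedom through Lagrange multipliers produces a symmetric indefinite matrix that Cholesky cannot factor but that LU with partial pivoting solves directly. The matrices are small stand-ins, not from any real mesh.

```python
import numpy as np

# Coupling two sets of degrees of freedom with a Lagrange-multiplier
# constraint yields the symmetric indefinite "saddle-point" system
#   [ K  C^T ] [ u      ]   [ f ]
#   [ C   0  ] [ lambda ] = [ 0 ]
# K, C, f below are small stand-ins, not from any real mesh.
K = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])       # positive-definite stiffness block
C = np.array([[1.0, -1.0, 0.0]])      # constraint: u0 - u1 = 0
f = np.array([1.0, 0.0, 0.0])

n, m = K.shape[0], C.shape[0]
A = np.block([[K, C.T], [C, np.zeros((m, m))]])
rhs = np.concatenate([f, np.zeros(m)])

# A is indefinite, so Cholesky fails; LU with partial pivoting (what
# numpy.linalg.solve uses via LAPACK) handles it directly.
sol = np.linalg.solve(A, rhs)
u, lam = sol[:n], sol[n:]
print("constraint satisfied:", bool(np.isclose(C @ u, 0.0).all()))
```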

  18. [Short interspersed repetitive sequences (SINEs) and their use as a phylogenetic tool].

    PubMed

    Kramerov, D A; Vasetskiĭ, N S

    2009-01-01

    The data on one of the most common repetitive elements of eukaryotic genomes, short interspersed elements (SINEs), are reviewed. Their structure, origin, and functioning in the genome are discussed. The variation and abundance of these neutral genomic markers make them a convenient and reliable tool for phylogenetic analysis. The main methods of such analysis are presented, and the potential and limitations of this approach are discussed using specific examples.

  19. Nondestructive Evaluation Correlated with Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Azid, Ali; Baaklini, George Y.

    1999-01-01

    Advanced materials are being developed for use in high-temperature gas turbine applications. For these new materials to be fully utilized, their deformation properties, their nondestructive evaluation (NDE) quality and material durability, and their creep and fatigue fracture characteristics need to be determined by suitable experiments. The experimental findings must be analyzed, characterized, modeled, and translated into constitutive equations for stress analysis and life prediction. Only when these ingredients - together with the appropriate computational tools - are available can durability analysis be performed in the design stage, long before the component is built. One of the many structural components being evaluated by the NDE group at the NASA Lewis Research Center is the flywheel system. It is being considered as an energy storage device for advanced space vehicles. Such devices offer advantages over electrochemical batteries in situations demanding high power delivery and high energy storage per unit weight. In addition, flywheels have potentially higher efficiency and longer lifetimes with proper motor-generator and rotor design. Flywheels made of fiber-reinforced polymer composite material show great promise for energy applications because of the high energy and power densities that they can achieve, along with a burst failure mode that is relatively benign in comparison to those of flywheels made of metallic materials. Therefore, to help improve durability and reduce structural uncertainties, we are developing a comprehensive analytical approach to predict the reliability and life of these components under these harsh loading conditions. The combination of NDE and two- and three-dimensional finite element analyses (e.g., stress analyses and fracture mechanics) is expected to set a standardized procedure to accurately assess the applicability of using various composite materials to design a suitable rotor/flywheel assembly.

  20. Method of holding optical elements without deformation during their fabrication

    DOEpatents

    Hed, P. Paul

    1997-01-01

    An improved method for securing and removing an optical element to and from a blocking tool without causing deformation of the optical element. A lens tissue is placed on the top surface of the blocking tool. Dots of UV cement are applied to the lens tissue without any of the dots contacting each other. An optical element is placed on top of the blocking tool with the lens tissue sandwiched therebetween. The UV cement is then cured. After subsequent fabrication steps, the bonded blocking tool, lens tissue, and optical element are placed in a debonding solution to soften the UV cement. The optical element is then removed from the blocking tool.

  1. Mineral elements and essential trace elements in blood of seals of the North Sea measured by total-reflection X-ray fluorescence analysis

    NASA Astrophysics Data System (ADS)

    Griesel, S.; Mundry, R.; Kakuschke, A.; Fonfara, S.; Siebert, U.; Prange, A.

    2006-11-01

    Mineral and essential trace elements are involved in numerous physiological processes in mammals, and diseases are often associated with an imbalance of electrolyte homeostasis. In this study, the concentrations of mineral elements (P, S, K, Ca) and essential trace elements (Fe, Cu, Zn, Se, Rb, Sr) in whole blood of harbor seals (Phoca vitulina) were determined using total-reflection X-ray fluorescence spectrometry (TXRF). Samples from 81 free-ranging harbor seals from the North Sea and two captive seals were collected during 2003-2005. Reference ranges and element correlations for health status determination were derived for the P, S, K, Ca, Fe, Cu, and Zn levels in whole blood. The element concentrations were also compared among seals grouped by age, gender, and sampling location. The blood of two captive seals with signs of disease and of four free-ranging seals showed reduced levels of P, S, and Ca, and differences in the electrolyte correlations were ascertained. Thus, simultaneous measurement of several elements in only 500 μL of whole blood provides information on both the electrolyte balance and the hydration status of the seals. The method could therefore serve as an additional biomonitoring tool for health assessment.
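
    The reference-range idea above can be sketched numerically. This is a minimal illustration with synthetic data: the Zn concentrations and the mean ± 1.96 SD interval are assumptions for the sketch, not the study's actual values or statistical method.

```python
# Sketch: deriving an element reference range from blood measurements of
# presumed-healthy animals (synthetic data, not from the study).
import numpy as np

rng = np.random.default_rng(0)
zn_ug_per_ml = rng.normal(loc=3.1, scale=0.4, size=81)  # hypothetical Zn levels

mean, sd = zn_ug_per_ml.mean(), zn_ug_per_ml.std(ddof=1)
low, high = mean - 1.96 * sd, mean + 1.96 * sd   # ~95% reference interval
print(f"Zn reference range: {low:.2f}-{high:.2f} ug/mL")
```

    An individual falling outside the interval would then be flagged for closer health assessment.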

  2. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural, and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  3. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme along with vector-unrolling techniques is used to enhance vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
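
    The skyline (variable-band) storage idea mentioned above can be illustrated compactly: for each column of a symmetric stiffness matrix, only the entries from the first nonzero row down to the diagonal are kept, concatenated into one vector. The 4x4 matrix below is invented for the sketch; MPFEA's actual block-skyline layout is more elaborate.

```python
# Sketch of skyline storage: keep each column from its first nonzero row
# down to the diagonal (symmetric matrix, upper triangle shown via columns).
import numpy as np

K = np.array([[4., 1., 0., 0.],
              [1., 5., 2., 0.],
              [0., 2., 6., 3.],
              [0., 0., 3., 7.]])

first_row = [int(np.flatnonzero(K[:, j])[0]) for j in range(K.shape[0])]
skyline = np.concatenate([K[first_row[j]:j + 1, j] for j in range(K.shape[0])])
print(skyline)  # [4. 1. 5. 2. 6. 3. 7.]
```

    Compared with full storage (16 entries here), only the 7 entries under the "skyline" are stored, and a factorization can be confined to that profile.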

  4. Finite element analysis of unnotched charpy impact tests

    DOT National Transportation Integrated Search

    2008-10-01

    This paper describes nonlinear finite element analysis (FEA) to examine the energy to fracture unnotched Charpy specimens under pendulum impact loading. An oversized, nonstandard pendulum impactor, called the Bulk Fracture Charpy Machine (BFCM), ...

  5. Multi-element compound specific stable isotope analysis of chlorinated aliphatic contaminants derived from chlorinated pitches.

    PubMed

    Filippini, Maria; Nijenhuis, Ivonne; Kümmel, Steffen; Chiarini, Veronica; Crosta, Giovanni; Richnow, Hans H; Gargini, Alessandro

    2018-05-30

    Tetrachloroethene and trichloroethene are typical by-products of the industrial production of chloromethanes. These by-products are known as "chlorinated pitches" and were often dumped in un-contained waste disposal sites, causing groundwater contamination. Previous research showed that a strongly depleted stable carbon isotope signature characterizes chlorinated compounds associated with chlorinated pitches, whereas manufactured commercial compounds have more enriched carbon isotope ratios. The findings were restricted to a single case study and one element (i.e., carbon). This paper presents a multi-element Compound-Specific Stable Isotope Analysis (CSIA, including carbon, chlorine and hydrogen) of chlorinated aliphatic contaminants originated from chlorinated pitches at two sites with different hydrogeology and different producers of chloromethanes. The results show strongly depleted carbon signatures at both sites, whereas the chlorine and hydrogen signatures are comparable to those presented in the literature for manufactured commercial compounds. Multi-element CSIA allowed the identification of sources and site-specific processes affecting chloroethene transformation in groundwater as a result of emergency remediation measures. CSIA turned out to be an effective forensic tool to address the liability for the contamination, leading to a conviction for the crimes of unintentional aggravated public water supply poisoning and environmental disaster. Copyright © 2018 Elsevier B.V. All rights reserved.
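
    The "depleted" versus "enriched" signatures compared in CSIA are expressed in delta notation relative to an international standard. A minimal sketch of that conversion follows; the sample ratio is invented, and the VPDB ¹³C/¹²C ratio used is the commonly cited value, so treat both as assumptions.

```python
# Sketch: converting a measured 13C/12C ratio to delta notation (permil),
# the quantity compared in CSIA source forensics.
R_VPDB = 0.0111802                      # commonly cited 13C/12C of VPDB

def delta13C(r_sample: float) -> float:
    """delta-13C in permil relative to the VPDB standard."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# A strongly 13C-depleted signature (hypothetical sample ratio):
print(round(delta13C(0.0108), 1))       # -34.0
```

    Multi-element CSIA simply repeats this comparison for chlorine and hydrogen isotope ratios against their respective standards.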

  6. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic Atmospheric Agency (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  7. A finite element analysis of viscoelastically damped sandwich plates

    NASA Astrophysics Data System (ADS)

    Ma, B.-A.; He, J.-F.

    1992-01-01

    A finite element analysis associated with an asymptotic solution method for the harmonic flexural vibration of viscoelastically damped unsymmetrical sandwich plates is given. The element formulation is based on generalization of the discrete Kirchhoff theory (DKT) element formulation. The results obtained with the first order approximation of the asymptotic solution presented here are the same as those obtained by means of the modal strain energy (MSE) method. By taking more terms of the asymptotic solution, with successive calculations and use of the Padé approximants method, accuracy can be improved. The finite element computation has been verified by comparison with an analytical exact solution for rectangular plates with simply supported edges. Results for the same plates with clamped edges are also presented.
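
    The Padé-approximant acceleration mentioned above can be demonstrated on a familiar series. This is a generic sketch of the device, applied to exp(x) rather than to the paper's asymptotic solution.

```python
# Sketch: turning truncated Taylor coefficients into a rational [2/2]
# Pade approximant, the same acceleration device used on the asymptotic
# solution in the paper (demonstrated here on exp(x)).
from scipy.interpolate import pade

taylor = [1, 1, 1 / 2, 1 / 6, 1 / 24]   # Taylor coefficients of exp(x)
p, q = pade(taylor, 2)                  # numerator, denominator as poly1d
print(round(p(1.0) / q(1.0), 4))        # 2.7143, close to e = 2.71828
```

    The rational form typically extends the useful range of a truncated series well beyond its radius of accuracy.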

  8. Finite Element Analysis of the LOLA Receiver Telescope Lens

    NASA Technical Reports Server (NTRS)

    Matzinger, Elizabeth

    2007-01-01

    This paper presents the finite element stress and distortion analysis completed on the Receiver Telescope lens of the Lunar Orbiter Laser Altimeter (LOLA). LOLA is one of six instruments on the Lunar Reconnaissance Orbiter (LRO), scheduled to launch in 2008. LOLA's main objective is to produce a high-resolution global lunar topographic model to aid in safe landings and enhance surface mobility in future exploration missions. The Receiver Telescope captures the laser pulses transmitted through a diffractive optical element (DOE) and reflected off the lunar surface. The largest lens of the Receiver Telescope, Lens 1, is a 150 mm diameter aspheric lens originally designed to be made of BK7 glass. The finite element model of the Receiver Telescope Lens 1 is comprised of solid elements and constrained in a manner consistent with the behavior of the mounting configuration of the Receiver Telescope tube. Twenty-one temperature load cases were mapped to the nodes based on thermal analysis completed by LOLA's lead thermal analyst, and loads were applied to simulate the preload applied from the ring flexure. The thermal environment of the baseline design (uncoated BK7 lens with no baffle) produces large radial and axial gradients in the lens. These large gradients create internal stresses that may lead to part failure, as well as significant bending that degrades optical performance. The high stresses and large distortions shown in the analysis precipitated a design change from BK7 glass to sapphire.

  9. Elastic-plastic mixed-iterative finite element analysis: Implementation and performance assessment

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    An elastic-plastic algorithm based on Von Mises and associative flow criteria is implemented in MHOST, a mixed-iterative finite element analysis computer program developed by NASA Lewis Research Center. The performance of the resulting elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors of 4-node quadrilateral shell finite elements are tested for elastic-plastic performance. Generally, the membrane results are excellent, indicating that the implementation of elastic-plastic mixed-iterative analysis is appropriate.
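
    The Von Mises / associative-flow update at each integration point is commonly implemented as a return-mapping (radial return) step. The one-dimensional sketch below, with linear isotropic hardening and invented material values, illustrates the idea; it is not MHOST's actual algorithm.

```python
# Sketch of a 1-D return-mapping step for Von Mises plasticity with
# linear isotropic hardening (illustrative values, not MHOST code).
def radial_return(stress_trial, sigma_y, E, H):
    """Return (updated stress, plastic multiplier) after the yield check."""
    f = abs(stress_trial) - sigma_y          # trial yield function
    if f <= 0.0:
        return stress_trial, 0.0             # elastic step, no correction
    dgamma = f / (E + H)                     # enforce consistency f = 0
    sign = 1.0 if stress_trial > 0 else -1.0
    return stress_trial - E * dgamma * sign, dgamma

# Trial stress 300 MPa against a 250 MPa yield stress:
stress, dg = radial_return(300.0, sigma_y=250.0, E=200e3, H=10e3)
print(stress, dg)
```

    The returned stress sits exactly on the hardened yield surface, i.e. stress = sigma_y + H * dgamma, which is the consistency condition the solver enforces at every iteration.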

  10. Tools for T-RFLP data analysis using Excel.

    PubMed

    Fredriksson, Nils Johan; Hermansson, Malte; Wilén, Britt-Marie

    2014-11-08

    Terminal restriction fragment length polymorphism (T-RFLP) analysis is a DNA-fingerprinting method that can be used for comparisons of the microbial community composition in a large number of samples. There is no consensus on how T-RFLP data should be treated and analyzed before comparisons between samples are made, and several different approaches have been proposed in the literature. The analysis of T-RFLP data can be cumbersome and time-consuming, and for large datasets manual data analysis is not feasible. The currently available tools for automated T-RFLP analysis, although valuable, offer little flexibility, and few, if any, options regarding what methods to use. To enable comparisons and combinations of different data treatment methods an analysis template and an extensive collection of macros for T-RFLP data analysis using Microsoft Excel were developed. The Tools for T-RFLP data analysis template provides procedures for the analysis of large T-RFLP datasets including application of a noise baseline threshold and setting of the analysis range, normalization and alignment of replicate profiles, generation of consensus profiles, normalization and alignment of consensus profiles and final analysis of the samples including calculation of association coefficients and diversity index. The procedures are designed so that in all analysis steps, from the initial preparation of the data to the final comparison of the samples, there are various different options available. The parameters regarding analysis range, noise baseline, T-RF alignment and generation of consensus profiles are all given by the user and several different methods are available for normalization of the T-RF profiles. In each step, the user can also choose to base the calculations on either peak height data or peak area data. The Tools for T-RFLP data analysis template enables an objective and flexible analysis of large T-RFLP datasets in a widely used spreadsheet application.
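
    Two of the treatment steps the template automates, applying a noise baseline and normalizing a T-RF profile to relative abundance, can be sketched briefly. The peak heights and threshold below are invented; the template itself offers several alternative normalization methods.

```python
# Sketch: noise-baseline thresholding followed by normalization of a
# T-RF peak-height profile to relative abundance (synthetic values).
import numpy as np

peak_heights = np.array([5.0, 120.0, 40.0, 3.0, 200.0])
baseline = 10.0                                  # user-chosen noise threshold

kept = np.where(peak_heights >= baseline, peak_heights, 0.0)
relative = kept / kept.sum()                     # profile now sums to 1
print(relative.round(3))
```

    After this treatment, profiles from different samples can be aligned and compared with association coefficients or diversity indices, as the template does.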

  11. Element-by-element Solution Procedures for Nonlinear Structural Analysis

    NASA Technical Reports Server (NTRS)

    Hughes, T. J. R.; Winget, J. M.; Levit, I.

    1984-01-01

    Element-by-element approximate factorization procedures are proposed for solving the large finite element equation systems which arise in nonlinear structural mechanics. Architectural and data base advantages of the present algorithms over traditional direct elimination schemes are noted. Results of calculations suggest considerable potential for the methods described.
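
    The architectural appeal of element-by-element methods is that the global matrix never needs to be assembled or stored: an iterative solver only requires matrix-vector products, which can be summed element by element. A minimal matrix-free sketch for a 1-D chain of identical two-node elements (with a hypothetical ground spring as the boundary condition) follows; the paper's approximate factorization preconditioners are more sophisticated than this plain CG.

```python
# Sketch: matrix-free element-by-element matvec driving conjugate
# gradients, so the global stiffness matrix is never assembled.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n_el = 9
n = n_el + 1
ke = np.array([[1.0, -1.0], [-1.0, 1.0]])    # 2x2 element stiffness

def matvec(u):
    out = np.zeros(n)
    out[0] += u[0]                           # unit ground spring at node 0
    for e in range(n_el):                    # sum element contributions
        out[e:e + 2] += ke @ u[e:e + 2]
    return out

A = LinearOperator((n, n), matvec=matvec, dtype=float)
b = np.zeros(n)
b[-1] = 1.0                                  # unit load at the free end
u, info = cg(A, b)
print(info)                                  # 0 means CG converged
```

    For this chain the exact displacements are 1, 2, ..., 10, which CG recovers without ever forming the 10x10 matrix.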

  12. An emulator for minimizing finite element analysis implementation resources

    NASA Technical Reports Server (NTRS)

    Melosh, R. J.; Utku, S.; Salama, M.; Islam, M.

    1982-01-01

    A finite element analysis emulator providing a basis for efficiently establishing an optimum computer implementation strategy when many calculations are involved is described. The SCOPE emulator determines the computer resources required as a function of the structural model, structural load-deflection equation characteristics, the storage allocation plan, and computer hardware capabilities. Thereby, it provides data for trading off analysis implementation options to arrive at a best strategy. The models contained in SCOPE lead to micro-operation computer counts of each finite element operation as well as overall computer resource cost estimates. Application of SCOPE to the Memphis-Arkansas bridge analysis provides measures of the accuracy of resource assessments. Data indicate that predictions are within 17.3 percent for calculation times and within 3.2 percent for peripheral storage resources for the ELAS code.

  13. Method of holding optical elements without deformation during their fabrication

    DOEpatents

    Hed, P.P.

    1997-04-29

    An improved method for securing and removing an optical element to and from a blocking tool without causing deformation of the optical element is disclosed. A lens tissue is placed on the top surface of the blocking tool. Dots of UV cement are applied to the lens tissue without any of the dots contacting each other. An optical element is placed on top of the blocking tool with the lens tissue sandwiched therebetween. The UV cement is then cured. After subsequent fabrication steps, the bonded blocking tool, lens tissue, and optical element are placed in a debonding solution to soften the UV cement. The optical element is then removed from the blocking tool. 16 figs.

  14. Finite-element reentry heat-transfer analysis of space shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Quinn, Robert D.; Gong, Leslie

    1986-01-01

    A structural performance and resizing (SPAR) finite-element thermal analysis computer program was used in the heat-transfer analysis of the space shuttle orbiter subjected to reentry aerodynamic heating. Three wing cross sections and one midfuselage cross section were selected for the thermal analysis. The predicted thermal protection system temperatures were found to agree well with flight-measured temperatures. The calculated aluminum structural temperatures also agreed reasonably well with the flight data from reentry to touchdown. The effects of internal radiation and of internal convection were found to be significant. The SPAR finite-element solutions agreed reasonably well with those obtained from the conventional finite-difference method.

  15. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  16. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
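
    Of the proactive tools listed, FMEA is the most readily sketched in code: each failure mode is scored for severity, occurrence, and detectability, and the product (the risk priority number, RPN) ranks where to act first. The failure modes and scores below are invented for illustration.

```python
# Sketch: ranking failure modes by risk priority number (RPN) as in an
# FMEA. Tuples are (name, severity, occurrence, detectability), each
# scored 1-10; all entries are hypothetical.
failure_modes = [
    ("wrong patient setup", 9, 3, 4),
    ("beam data entry error", 10, 2, 2),
    ("worn couch motor", 4, 6, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {s * o * d}")
```

    A team would then direct QA effort at the highest-RPN modes first, which is exactly the proactive prioritization the article advocates.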

  17. Competencies in Organizational E-Learning: Concepts and Tools

    ERIC Educational Resources Information Center

    Sicilia, Miguel-Angel, Ed.

    2007-01-01

    "Competencies in Organizational E-Learning: Concepts and Tools" provides a comprehensive view of the way competencies can be used to drive organizational e-learning, including the main conceptual elements, competency gap analysis, advanced related computing topics, the application of semantic Web technologies, and the integration of competencies…

  18. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  19. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis, which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
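
    The geometric quantity behind PAEA, the principal angles between two subspaces, can be computed directly. The sketch below uses random matrices standing in for an expression-signature subspace and a gene-set subspace that share one direction; it illustrates the measure, not PAEA's full enrichment pipeline.

```python
# Sketch: principal angles between two subspaces, the geometric core of
# principal-angle enrichment analysis (random stand-in data).
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))                         # subspace 1
B = np.hstack([A[:, :1], rng.standard_normal((50, 2))])  # shares one column

angles = subspace_angles(A, B)        # radians, largest angle first
print(np.rad2deg(angles).round(1))    # smallest angle ~0: shared direction
```

    Smaller principal angles mean the gene set's subspace lies closer to the differential-expression signature, i.e. stronger enrichment.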

  20. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  1. Fourier analysis of finite element preconditioned collocation schemes

    NASA Technical Reports Server (NTRS)

    Deville, Michel O.; Mund, Ernest H.

    1990-01-01

    The spectrum of the iteration operator of some finite element preconditioned Fourier collocation schemes is investigated. The first part of the paper analyses one-dimensional elliptic and hyperbolic model problems and the advection-diffusion equation. Analytical expressions of the eigenvalues are obtained with use of symbolic computation. The second part of the paper considers the set of one-dimensional differential equations resulting from Fourier analysis (in the transverse direction) of the 2-D Stokes problem. All results agree with previous conclusions on the numerical efficiency of finite element preconditioning schemes.

  2. Interactive Finite Elements for General Engine Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1984-01-01

    General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implantation into a representative current-generation code. The ADINA code was selected because of prior experience with it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general-purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.

  3. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
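
    The DTA/DD combination being validated above is the gamma index: for each reference point, the minimum over evaluated points of the combined distance and dose metric. The one-dimensional, globally normalized toy below (Gaussian profiles, 3%/3 mm) illustrates the calculation only; it is not clinical gamma software.

```python
# Sketch of a 1-D global gamma calculation (toy profiles, not clinical).
import numpy as np

def gamma_1d(x, ref, ev, dd=0.03, dta=3.0):
    """Global gamma: dd as a fraction of max reference dose, dta in mm."""
    norm = dd * ref.max()
    g = np.empty_like(ref)
    for i, (xi, di) in enumerate(zip(x, ref)):
        # min over evaluated points of the combined DD/DTA metric
        g[i] = np.sqrt(((x - xi) / dta) ** 2 + ((ev - di) / norm) ** 2).min()
    return g

x = np.arange(0.0, 50.0, 0.1)                # position in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2)      # reference dose profile
ev = np.exp(-((x - 25.5) / 10.0) ** 2)       # evaluated, shifted 0.5 mm
gamma = gamma_1d(x, ref, ev)
print(f"pass rate: {100 * (gamma <= 1).mean():.1f}%")
```

    A 0.5 mm shift is well inside the 3 mm DTA tolerance, so every point passes; note that coarsening the sampling grid changes the result, which is exactly the resolution effect the study investigates.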

  4. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  5. Product ion isotopologue pattern: A tool to improve the reliability of elemental composition elucidations of unknown compounds in complex matrices.

    PubMed

    Kaufmann, A; Walker, S; Mol, G

    2016-04-15

    Elucidation of the elemental compositions of unknown compounds (e.g., in metabolomics) generally relies on the availability of accurate masses and isotopic ratios. This study focuses on the information provided by the abundance ratio within a product ion pair (monoisotopic versus the first isotopic peak) when isolating and fragmenting the first isotopic ion (first isotopic mass spectrum) of the precursor. This process relies on the capability of the quadrupole within the Q Orbitrap instrument to isolate a very narrow mass window. Selecting only the first isotopic peak (first isotopic mass spectrum) leads to the observation of a unique product ion pair. The lighter ion within such an isotopologue pair is monoisotopic, while the heavier ion contains a single carbon isotope. The observed abundance ratio is governed by the percentage of carbon atoms lost during the fragmentation and can be described by a hypergeometric distribution. The observed carbon isotopologue abundance ratio (product ion isotopologue pattern) gives reliable information regarding the percentage of carbon atoms lost in the fragmentation process. It therefore facilitates the elucidation of the involved precursor and product ions. Unlike conventional isotopic abundances, the product ion isotopologue pattern is hardly affected by isobaric interferences. Furthermore, the appearance of these pairs greatly aids in cleaning up a 'matrix-contaminated' product ion spectrum. The product ion isotopologue pattern is a valuable tool for structural elucidation. It increases confidence in results and permits structural elucidations for heavier ions. This tool is also very useful in elucidating the elemental composition of product ions. Such information is highly valued in the field of multi-residue analysis, where the accurate mass of product ions is required for the confirmation process. Copyright © 2016 John Wiley & Sons, Ltd.
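
    The hypergeometric picture above reduces to a simple statement: the selected precursor carries exactly one ¹³C among its N carbons, so a fragment that keeps n of them retains the ¹³C with probability n/N. A sketch with invented carbon counts:

```python
# Sketch: probability that a fragment keeping n of N precursor carbons
# retains the single 13C of a first-isotopic-peak precursor (hypothetical
# carbon counts; one "marked" atom, n draws without replacement).
from scipy.stats import hypergeom

N, n = 20, 12                              # precursor carbons / carbons kept
p_heavy = hypergeom(N, 1, n).pmf(1)        # P(the one 13C is among the n)
print(p_heavy)                             # equals n/N
```

    The heavy-to-light abundance ratio of the product ion pair is then n/(N - n), which is why the observed ratio reports the fraction of carbons lost in fragmentation.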

  6. A Powerful Molecular Engineering Tool Provided Efficient Chlamydomonas Mutants as Bio-Sensing Elements for Herbicides Detection

    PubMed Central

    Lambreva, Maya D.; Giardi, Maria Teresa; Rambaldi, Irene; Antonacci, Amina; Pastorelli, Sandro; Bertalan, Ivo; Husu, Ivan; Johanningmeier, Udo; Rea, Giuseppina

    2013-01-01

    This study was prompted by increasing concerns about the ecological damage and human health threats posed by persistent contamination of water and soil with herbicides, and by the emergence of bio-sensing technology as a powerful, fast and efficient tool for identifying such hazards. This work aims at overcoming the principal limitations on the performance of whole-cell-based biosensors caused by inadequate stability and sensitivity of the bio-recognition element. Novel bio-sensing elements for the detection of herbicides were generated by exploiting the power of molecular engineering to improve the performance of photosynthetic complexes. The new phenotypes were produced by an in vitro directed evolution strategy targeted at the photosystem II (PSII) D1 protein of Chlamydomonas reinhardtii, using exposures to radical-generating ionizing radiation as selection pressure. These tools proved successful in identifying D1 mutations conferring enhanced stability, tolerance to free-radical-associated stress and competence for herbicide perception. Long-term stability tests of PSII performance revealed the mutants' capability to deal with oxidative stress-related conditions. Furthermore, dose-response experiments identified strains with increased sensitivity or resistance to triazine- and urea-type herbicides, with I50 values ranging from 6×10−8 M to 2×10−6 M. Besides stressing the relevance of several amino acids for PSII photochemistry and herbicide sensing, the possibility of improving the specificity of whole-cell-based biosensors by coupling herbicide-sensitive with herbicide-resistant strains was verified. PMID:23613953

  7. Tools4miRs – one place to gather all the tools for miRNA analysis

    PubMed Central

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which greatly differ in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that gathers currently more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  8. Finite element analysis of wrinkling membranes

    NASA Technical Reports Server (NTRS)

    Miller, R. K.; Hedgepeth, J. M.; Weingarten, V. I.; Das, P.; Kahyai, S.

    1984-01-01

    The development of a nonlinear numerical algorithm for the analysis of stresses and displacements in partly wrinkled flat membranes, and its implementation on the SAP VII finite-element code are described. A comparison of numerical results with exact solutions of two benchmark problems reveals excellent agreement, with good convergence of the required iterative procedure. An exact solution of a problem involving axisymmetric deformations of a partly wrinkled shallow curved membrane is also reported.

  9. PIXE analysis of caries related trace elements in tooth enamel

    NASA Astrophysics Data System (ADS)

    Annegarn, H. J.; Jodaikin, A.; Cleaton-Jones, P. E.; Sellschop, J. P. F.; Madiba, C. C. P.; Bibby, D.

    1981-03-01

    PIXE analysis has been applied to a set of twenty human teeth to determine trace element concentrations in enamel from areas susceptible to dental caries (mesial and distal contact points) and in areas less susceptible to the disease (buccal surfaces), with the aim of determining the possible roles of trace elements in the carious process. The samples were caries-free anterior incisors extracted for periodontal reasons from subjects 10-30 years of age. Prior to extraction of the sample teeth, a detailed dental history and examination was carried out on each individual. PIXE analysis, using a 3 MeV proton beam of 1 mm diameter, allowed the determination of Ca, Mn, Fe, Cu, Zn, Sr and Pb above detection limits. As demonstrated in this work, the enhanced sensitivity of PIXE analysis over electron microprobe analysis, and the capability of localised surface analysis compared with the pooled samples required for neutron activation analysis, make it a powerful and useful technique in dental analysis.

  10. Finite element analysis of chip formation using ALE method

    NASA Astrophysics Data System (ADS)

    Jayaprakash, V.

    2017-05-01

    In recent times, many FEM studies have addressed the formulation of plain isotropic metal plates. Stress analysis plays a significant role in structural safety and system stability, and estimates of stress and distortion are very helpful for designing and manufacturing a product well. The residual stress and plastic strain usually determine the fatigue life of a structure, and they also play a significant role in design and material selection. As the load magnitude increases, cracks start to form; decreasing the work load and the residual stress reduces the damage to the metal. The manufacturing process is a key parameter in processing and forming the parts of any system. However, machining operations involve complex phenomena such as heat development and changing material properties, with estimates based on the evolution of plastic strain and residual stress. Reducing residual stress is a complex aspect of finite element study. This paper deals with a manufacturing process that produces less residual stress and strain. The results show that applying the ALE method in machining reduces the load on the workpiece, so the life of the workpiece can be increased. We also investigate cutting tool wear and efficiency, since the cutting tool is an essential machine member in fabrication technology. The ABAQUS platform was used to solve the machining operation.

  11. Porcupine: A visual pipeline tool for neuroimaging analysis

    PubMed Central

    Snoek, Lukas; Knapen, Tomas

    2018-01-01

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one’s analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one’s analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0. PMID:29746461

  12. Energy Finite Element Analysis Developments for Vibration Analysis of Composite Aircraft Structures

    NASA Technical Reports Server (NTRS)

    Vlahopoulos, Nickolas; Schiller, Noah H.

    2011-01-01

    The Energy Finite Element Analysis (EFEA) has been utilized successfully for modeling complex structural-acoustic systems with isotropic structural material properties. In this paper, a formulation for modeling structures made out of composite materials is presented. An approach based on spectral finite element analysis is utilized first for developing the equivalent material properties for the composite material. These equivalent properties are employed in the EFEA governing differential equations for representing the composite materials and deriving the element level matrices. The power transmission characteristics at connections between members made out of non-isotropic composite material are considered for deriving suitable power transmission coefficients at junctions of interconnected members. These coefficients are utilized for computing the joint matrix that is needed to assemble the global system of EFEA equations. The global system of EFEA equations is solved numerically and the vibration levels within the entire system can be computed. The new EFEA formulation for modeling composite laminate structures is validated through comparison to test data collected from a representative composite aircraft fuselage that is made out of a composite outer shell and composite frames and stiffeners. NASA Langley constructed the composite cylinder and conducted the test measurements utilized in this work.

  13. Impeller deflection and modal finite element analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Nathan A.

    2013-10-01

    Deflections of an impeller due to centripetal forces are calculated using finite element analysis. The lateral, or out of plane, deflections are an important design consideration for this particular impeller because it incorporates an air bearing with critical gap tolerances. The target gap distance is approximately 10 microns at a rotational velocity of 2500 rpm. The centripetal forces acting on the impeller cause it to deflect in a concave fashion, decreasing the initial gap distance as a function of radial position. This deflection is characterized for a previous and updated impeller design for comparative purposes. The impact of design options such as material selection, geometry dimensions, and operating rotational velocity are also explored, followed by a sensitivity study with these parameters bounded by specific design values. A modal analysis is also performed to calculate the impeller's natural frequencies, which are desired to be avoided during operation. The finite element modeling techniques continue to be exercised by the impeller design team to address specific questions and evaluate conceptual designs, some of which are included in the Appendix.

  14. Verification of finite element analysis of fixed partial denture with in vitro electronic strain measurement.

    PubMed

    Wang, Gaoqi; Zhang, Song; Bian, Cuirong; Kong, Hui

    2016-01-01

    The purpose of the study was to verify the finite element analysis model of a three-unit fixed partial denture with in vitro electronic strain analysis and analyze the clinical situation with the verified model. First, strain gauges were attached to the critical areas of a three-unit fixed partial denture. Strain values were measured under a 300 N load perpendicular to the occlusal plane. Secondly, a three-dimensional finite element model in accordance with the electronic strain analysis experiment was constructed from the scanning data. The strain values obtained by finite element analysis and by in vitro measurement were then compared. Finally, the clinical destruction of the fixed partial denture was evaluated with the verified finite element analysis model. There was a mutual agreement and consistency between the finite element analysis results and experimental data. The finite element analysis revealed that failure will occur in the veneer layer on the buccal surface of the connector under an occlusal force of 570 N. The results indicate that electronic strain analysis is an appropriate and cost-saving method to verify the finite element model. The veneer layer on the buccal surface of the connector is the weakest area in the fixed partial denture. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
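On the measurement side, electronic strain gauges report strain through the standard gauge-factor relation ΔR/R = GF·ε. A minimal sketch of that conversion (the default gauge factor of 2.0 is a typical value for metal-foil gauges, assumed here, not a value from this study):

```python
def strain_from_gauge(delta_r: float, r0: float, gauge_factor: float = 2.0) -> float:
    """Convert a measured resistance change into strain using the
    gauge-factor relation delta_R / R0 = GF * strain."""
    return (delta_r / r0) / gauge_factor
```

For example, a 0.002 Ω change on a 120 Ω gauge with GF = 2.0 corresponds to a strain of about 8.3 microstrain; such values are then compared point-by-point against the FE model's surface strains at the gauge locations.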

  15. Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms

    NASA Technical Reports Server (NTRS)

    Kurdila, Andrew J.; Sharpley, Robert C.

    1999-01-01

    This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) wavelet-based methodologies for the compression, transmission, decoding, and visualization of three-dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) methodologies for interactive algorithm steering for the acceleration of large-scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet-based Particle Image Velocimetry algorithms and reduced order input-output models for nonlinear systems by utilizing wavelet approximations.

  16. Tool Efficiency Analysis model research in SEMI industry

    NASA Astrophysics Data System (ADS)

    Lei, Ma; Nana, Zhang; Zhongqiu, Zhang

    2018-06-01

    One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. Based on SEMI standards for semiconductor equipment control, this paper defines the transaction rules between different tool states and presents a TEA (Tool Efficiency Analysis) system model that analyses tool performance automatically based on a finite state machine. The system was applied to fab tools, its effectiveness was verified successfully, and the parameter values used to measure equipment performance were obtained, along with advice for improvement.
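The finite-state-machine idea behind such a tool-efficiency model can be sketched as follows. The state names loosely follow SEMI E10 equipment states, but the transition rules and utilization metric here are illustrative assumptions, not the paper's actual TEA transaction rules:

```python
# Allowed transitions between tool states (illustrative, not SEMI-normative).
ALLOWED = {
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
}

class ToolStateMachine:
    """Track time spent in each tool state and reject illegal transitions."""

    def __init__(self, state: str = "STANDBY"):
        self.state = state
        self.time_in = {s: 0.0 for s in ALLOWED}

    def dwell(self, hours: float) -> None:
        """Accumulate time in the current state."""
        self.time_in[self.state] += hours

    def transition(self, new_state: str) -> None:
        """Move to a new state if the transition rule allows it."""
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

    def utilization(self) -> float:
        """Fraction of tracked time spent in the PRODUCTIVE state."""
        total = sum(self.time_in.values())
        return self.time_in["PRODUCTIVE"] / total if total else 0.0
```

Replaying a tool's event log through such a machine yields both the per-state time breakdown and a utilization figure, which is the kind of performance parameter the TEA system reports.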

  17. Biomechanical investigation of naso-orbitoethmoid trauma by finite element analysis.

    PubMed

    Huempfner-Hierl, Heike; Schaller, Andreas; Hemprich, Alexander; Hierl, Thomas

    2014-11-01

    Naso-orbitoethmoid fractures account for 5% of all facial fractures. We used data derived from a white 34-year-old man to make a transient dynamic finite element model, which consisted of about 740 000 elements, to simulate fist-like impacts to this anatomically complex area. Finite element analysis showed a pattern of von Mises stresses beyond the yield criterion of bone that corresponded with fractures commonly seen clinically. Finite element models can be used to simulate injuries to the human skull, and provide information about the pathogenesis of different types of fracture. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
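The von Mises stress used above as the failure indicator has a standard closed form in terms of the deviatoric part of the Cauchy stress tensor. A minimal sketch of that textbook formula (generic, not tied to the authors' model):

```python
import numpy as np

def von_mises(stress: np.ndarray) -> float:
    """Von Mises equivalent stress from a 3x3 Cauchy stress tensor:
    sigma_vm = sqrt(3/2 * s:s), where s is the deviatoric stress."""
    s = np.asarray(stress, dtype=float)
    dev = s - (np.trace(s) / 3.0) * np.eye(3)  # deviatoric part
    return float(np.sqrt(1.5 * np.sum(dev * dev)))
```

Postprocessors evaluate this scalar at each element and flag regions where it exceeds the yield criterion of bone; in the study, those regions matched clinically observed fracture patterns.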

  18. Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.

    PubMed

    Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M

    2012-04-01

    We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.
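The core processing steps described, drift-corrected accumulation of scanned planes and ROI-based isotope ratios, can be sketched in a few lines. This is an illustrative reimplementation, not Look@NanoSIMS code, and integer-pixel drift correction via np.roll is a simplification:

```python
import numpy as np

def accumulate(planes, drifts):
    """Sum scanned planes after correcting each by its estimated
    integer-pixel drift offset (dy, dx). Wrap-around at the edges is
    a simplification of proper drift correction."""
    acc = np.zeros_like(planes[0], dtype=float)
    for plane, (dy, dx) in zip(planes, drifts):
        acc += np.roll(plane, shift=(-dy, -dx), axis=(0, 1))
    return acc

def roi_ratio(minor, major, mask):
    """Isotope ratio (minor/major ion counts) inside a boolean ROI mask."""
    return float(minor[mask].sum() / major[mask].sum())
```

The ROI mask itself would come from interactive drawing, automated segmentation, or an external label image (e.g. a FISH image), as the abstract describes.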

  19. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    1998-07-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to give guidance to their selection of seasonal palettes for use in production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  20. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    2001-01-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  1. Failure location prediction by finite element analysis for an additive manufactured mandible implant.

    PubMed

    Huo, Jinxing; Dérand, Per; Rännar, Lars-Erik; Hirsch, Jan-Michaél; Gamstedt, E Kristofer

    2015-09-01

    In order to reconstruct a patient with a bone defect in the mandible, a porous scaffold attached to a plate, both in a titanium alloy, was designed and manufactured using additive manufacturing. Regrettably, the implant fractured in vivo several months after surgery. The aim of this study was to investigate the failure of the implant and show a way of predicting the mechanical properties of the implant before surgery. All computed tomography data of the patient were preprocessed to remove metallic artefacts with metal deletion technique before mandible geometry reconstruction. The three-dimensional geometry of the patient's mandible was also reconstructed, and the implant was fixed to the bone model with screws in Mimics medical imaging software. A finite element model was established from the assembly of the mandible and the implant to study stresses developed during mastication. The stress distribution in the load-bearing plate was computed, and the location of main stress concentration in the plate was determined. Comparison between the fracture region and the location of the stress concentration shows that finite element analysis could serve as a tool for optimizing the design of mandible implants. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  2. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    NASA Astrophysics Data System (ADS)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.
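A tool that maps anomalies and means for user-defined periods rests on a simple climatology calculation: subtract from each observation the long-term mean for the same calendar month. A hedged sketch (the function is hypothetical and the 1981-2010 baseline is a common convention assumed here, not necessarily NSIDC's):

```python
import numpy as np

def monthly_anomalies(years, months, extents, base=(1981, 2010)):
    """Anomaly of each sea ice extent value relative to the mean of the
    same calendar month over the baseline years. Months without baseline
    data would raise a KeyError in this simplified sketch."""
    years, months, extents = map(np.asarray, (years, months, extents))
    clim = {}
    for m in range(1, 13):
        sel = (months == m) & (years >= base[0]) & (years <= base[1])
        if sel.any():
            clim[m] = extents[sel].mean()
    return np.array([e - clim[m] for e, m in zip(extents, months)])
```

The same climatology dictionary also supplies the per-month means that such tools plot alongside the daily or monthly extent curves.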

  3. Finite Element Analysis of Adaptive-Stiffening and Shape-Control SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Gao, Xiujie; Burton, Deborah; Turner, Travis L.; Brinson, Catherine

    2005-01-01

    Shape memory alloy hybrid composites with adaptive-stiffening or morphing functions are simulated using finite element analysis. The composite structure is a laminated fiber-polymer composite beam with embedded SMA ribbons at various positions with respect to the neutral axis of the beam. Adaptive stiffening or morphing is activated via selective resistance heating of the SMA ribbons or uniform thermal loads on the beam. The thermomechanical behavior of these composites was simulated in ABAQUS using user-defined SMA elements. The examples demonstrate the usefulness of the methods for the design and simulation of SMA hybrid composites. Keywords: shape memory alloys, Nitinol, ABAQUS, finite element analysis, post-buckling control, shape control, deflection control, adaptive stiffening, morphing, constitutive modeling, user element

  4. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  5. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  6. Multivariate analysis of elemental chemistry as a robust biosignature

    NASA Astrophysics Data System (ADS)

    Storrie-Lombardi, M.; Nealson, K.

    2003-04-01

    The robotic detection of life in extraterrestrial settings (i.e., Mars, Europa, etc.) would be greatly simplified if analysis could be accomplished in the absence of direct mechanical manipulation of a sample. It would also be preferable to employ a fundamental physico-chemical phenomenon as a biosignature and depend less on the particular manifestations of life on Earth (i.e. to employ non-earthcentric methods). One such approach, which we put forward here, is that of elemental composition, a reflection of the use of specific chemical elements for the construction of living systems. Using appropriate analyses (over the proper spatial scales), it should be possible to see deviations from the geological background (mineral and geochemical composition of the crust), and identify anomalies that would indicate sufficient deviation from the norm as to indicate a possible living system. To this end, over the past four decades elemental distributions have been determined for the sun, the interstellar medium, seawater, the crust of the Earth, carbonaceous chondrite meteorites, bacteria, plants, animals, and human beings. Such data can be relatively easily obtained for samples of a variety of types using a technique known as laser-induced breakdown spectroscopy (LIBS), which employs a high energy laser to ablate a portion of a sample, and then determine elemental composition using remote optical spectroscopy. However, the elements commonly associated with living systems (H, C, O, and N), while useful for detecting extant life, are relatively volatile and are not easily constrained across geological time scales. This minimizes their utility as fossil markers of ancient life. We have investigated the possibility of distinguishing the distributions of less volatile elements in a variety of biological materials from the distributions found in carbonaceous chondrites and the Earth’s crust using principal component analysis (PCA), a classical multivariate analysis technique
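The PCA step can be sketched directly: rows are samples, columns are element abundances, and the principal components come from an SVD of the mean-centred matrix. An illustrative implementation (not the authors' code):

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the mean-centred data matrix.
    X: (n_samples, n_elements) array of element abundances.
    Returns (scores, loadings): projected samples and component axes."""
    Xc = X - X.mean(axis=0)                      # centre each element column
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return scores, Vt[:n_components]
```

In the biosignature setting, LIBS-derived abundance vectors from biological and geological samples would be the rows of X, and separation of the two groups in the leading score dimensions is what signals a deviation from the crustal background.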

  7. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  8. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
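Two of the analyses LCAT offers, trend analysis and compositing, reduce to very small computations. A hedged sketch (the function names are hypothetical, not LCAT's API):

```python
import numpy as np

def linear_trend(t, y):
    """Least-squares slope and intercept of a time series
    (slope is in units of y per unit of t)."""
    slope, intercept = np.polyfit(np.asarray(t, float), np.asarray(y, float), 1)
    return slope, intercept

def composite_mean(values, labels, group):
    """Mean of values in the subset where labels == group,
    e.g. compositing temperature over El Nino years."""
    v = np.asarray(values, float)
    l = np.asarray(labels)
    return float(v[l == group].mean())
```

Correlation and regression against an index series follow the same pattern, which is why such tools can expose many techniques through one data-selection interface.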

  9. DIY Solar Market Analysis Webinar Series: Top Solar Tools | State, Local, and Tribal Governments | NREL

    Science.gov Websites

    Wednesday, May 14, 2014: As part of a Do-It-Yourself Solar Market Analysis summer series, NREL's Solar Technical Assistance Team (STAT) presented a webinar on top solar tools.

  10. Exploring the single-cell RNA-seq analysis landscape with the scRNA-tools database.

    PubMed

    Zappia, Luke; Phipson, Belinda; Oshlack, Alicia

    2018-06-25

    As single-cell RNA-sequencing (scRNA-seq) datasets have become more widespread the number of tools designed to analyse these data has dramatically increased. Navigating the vast sea of tools now available is becoming increasingly challenging for researchers. In order to better facilitate selection of appropriate analysis tools we have created the scRNA-tools database (www.scRNA-tools.org) to catalogue and curate analysis tools as they become available. Our database collects a range of information on each scRNA-seq analysis tool and categorises them according to the analysis tasks they perform. Exploration of this database gives insights into the areas of rapid development of analysis methods for scRNA-seq data. We see that many tools perform tasks specific to scRNA-seq analysis, particularly clustering and ordering of cells. We also find that the scRNA-seq community embraces an open-source and open-science approach, with most tools available under open-source licenses and preprints being extensively used as a means to describe methods. The scRNA-tools database provides a valuable resource for researchers embarking on scRNA-seq analysis and records the growth of the field over time.

  11. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
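
    The dummy-regression sensitivity approach described above can be illustrated on synthetic data; the model form, coefficients, and variable ranges below are invented for the sketch and are not taken from the KORA/ALOHA runs.

```python
import numpy as np

# Synthetic sensitivity sketch: encode atmospheric stability as a dummy
# variable and fit ordinary least squares, as a stand-in for the SPSS
# dummy regression used in the study. All numbers here are invented.
rng = np.random.default_rng(0)
n = 200
temp = rng.uniform(5, 35, n)       # air temperature, deg C
wind = rng.uniform(1, 10, n)       # wind speed, m/s
stable = rng.integers(0, 2, n)     # dummy: 1 = stable atmosphere

# Assumed ground truth: impact distance dominated by stability.
dist = 100 + 2.0 * temp - 1.0 * wind + 400.0 * stable + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), temp, wind, stable])
beta, *_ = np.linalg.lstsq(X, dist, rcond=None)
print(np.round(beta, 1))  # recovers roughly [100, 2, -1, 400]
```

    The relative size of the fitted coefficients (the stability dummy dwarfing temperature and wind) is what a sensitivity ranking like the paper's would read off.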

  12. Finite element analysis of the high strain rate testing of polymeric materials

    NASA Astrophysics Data System (ADS)

    Gorwade, C. V.; Alghamdi, A. S.; Ashcroft, I. A.; Silberschmidt, V. V.; Song, M.

    2012-08-01

    Advanced polymer materials are finding an increasing range of industrial and defence applications. Ultra-high molecular weight polyethylene (UHMWPE) is already used in lightweight body armour because of its good impact resistance at low weight. However, broader use of such materials is limited by the complexity of the manufacturing processes and the lack of experimental data on their behaviour and failure evolution under high-strain-rate loading conditions. The current study investigates the internal heat generated during tensile testing of UHMWPE. A 3D finite element (FE) model of the tensile test is developed and validated against experimental work. An elastic-plastic material model is used with adiabatic heat generation. The temperatures and stresses obtained with the FE analysis are found to be in good agreement with the experimental results. The model can be used as a simple and cost-effective tool to predict the thermo-mechanical behaviour of a UHMWPE part under various loading conditions.
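
    The adiabatic heat generation assumption can be made concrete with the standard plastic-work estimate dT = β·σ·Δεp/(ρ·cp); the material constants below are rough literature-style values assumed for illustration, not the ones used in the paper.

```python
# Back-of-the-envelope adiabatic temperature rise from plastic work.
# All constants are assumed, order-of-magnitude values for UHMWPE.
beta = 0.9      # Taylor-Quinney coefficient (fraction of work -> heat)
sigma = 30e6    # flow stress, Pa (assumed)
d_eps = 1.0     # plastic strain increment
rho = 940.0     # density, kg/m^3
c_p = 1800.0    # specific heat, J/(kg K)

dT = beta * sigma * d_eps / (rho * c_p)
print(round(dT, 1))  # → 16.0 (kelvin)
```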

  13. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis, which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach to gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
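
    The geometric core of a principal-angle method can be sketched in a few lines: orthonormalize two subspaces and read the angles off an SVD. The example subspaces are invented; this is a sketch of the underlying linear algebra, not the PAEA implementation itself.

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles between the column spaces of A and B.

    Orthonormalize each subspace with QR; the singular values of
    Qa^T Qb are the cosines of the principal angles.
    """
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Two planes in R^3 sharing the x-axis, tilted 45 degrees apart.
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
angles = principal_angles(A, B)
print(np.round(np.degrees(angles), 1))  # → [ 0. 45.]
```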

  14. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  15. Dynamic Analysis of Geared Rotors by Finite Elements

    NASA Technical Reports Server (NTRS)

    Kahraman, A.; Ozguven, H. Nevzat; Houser, D. R.; Zakrajsek, J. J.

    1992-01-01

    A finite element model of a geared rotor system on flexible bearings has been developed. The model includes the rotary inertia of the shaft elements, the axial loading on the shafts, the flexibility and damping of the bearings, the material damping of the shafts, and the stiffness and damping of the gear mesh. The coupling between the torsional and transverse vibrations of the gears is considered in the model, and a constant mesh stiffness is assumed. The analysis procedure can be used for forced vibration analysis of geared rotors by calculating the critical speeds and determining the response of any point on the shafts to mass unbalances, geometric eccentricities of the gears, and displacement transmission-error excitation at the mesh point. The dynamic mesh forces due to these excitations can also be calculated. The model has been applied to several systems to demonstrate its accuracy and to study the effect of bearing compliances on system dynamics.
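
    Critical-speed calculation of the kind described reduces to a generalized eigenproblem K x = ω² M x; the 2-DOF mass and stiffness matrices below are illustrative stand-ins, not the geared-rotor model of the paper.

```python
import numpy as np

# Natural frequencies of a toy 2-DOF system: solve eig(M^-1 K) for
# omega^2, then convert to Hz. Matrices are invented for illustration.
M = np.diag([2.0, 1.0])                 # mass matrix, kg
K = np.array([[600.0, -200.0],
              [-200.0, 200.0]])         # stiffness matrix, N/m

w2 = np.linalg.eigvals(np.linalg.inv(M) @ K)
freqs_hz = np.sort(np.sqrt(w2.real)) / (2 * np.pi)
print(np.round(freqs_hz, 2))  # → [1.59 3.18]
```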

  16. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    NASA Technical Reports Server (NTRS)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees, and (2) provides an all-in-one package for various analysis capabilities that would normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool is responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA allows users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime; augmentation of network asset pre-service configuration time; augmentation of Brent's method of root finding; augmentation of network asset FOV restrictions; augmentation of mission lifetimes; and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared with STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
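
    Contact-window prediction of this kind rests on root finding (Brent's method, per the abstract); the sketch below uses plain bisection as a simpler stand-in, on an invented sinusoidal elevation model rather than real SCaN asset geometry.

```python
import math

def elevation_deg(t_minutes):
    # Hypothetical elevation model: 30-deg peak, 90-minute period,
    # minus a 15-deg visibility mask. Not a real orbit propagation.
    return 30.0 * math.cos(2 * math.pi * t_minutes / 90.0) - 15.0

def bisect(f, a, b, tol=1e-10):
    """Bracketing root finder (bisection stand-in for Brent's method)."""
    fa = f(a)
    assert fa * f(b) < 0, "root must be bracketed"
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Contact ends where elevation crosses the mask in [0, 45] minutes.
t_loss = bisect(elevation_deg, 0.0, 45.0)
print(round(t_loss, 2))  # → 15.0
```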

  17. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.
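
    Assigning elastic moduli from CT bone-density data, as described above, is commonly done with a power law E = a·ρ^b; the coefficients and function name below are illustrative assumptions, not the values used by the software in the study.

```python
# Density-to-modulus mapping sketch. The power-law coefficients are
# assumed, literature-style values for illustration only.
def youngs_modulus_mpa(rho_g_cm3, a=6850.0, b=1.49):
    """Map apparent density (g/cm^3) to Young's modulus (MPa)."""
    return a * rho_g_cm3 ** b

# Denser voxels are assigned stiffer elements.
for rho in (0.5, 1.0, 1.8):
    print(rho, round(youngs_modulus_mpa(rho)))
```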

  18. Measuring Surface Bulk Elemental Composition on Venus

    NASA Technical Reports Server (NTRS)

    Schweitzer, Jeffrey S.; Parsons, Ann M.; Grau, Jim; Lawrence, David J.; McClanahan, Timothy P.; Miles, Jeffrey; Peplowski, Patrick; Perkins, Luke; Starr, Richard

    2017-01-01

    The extreme surface environment of Venus (462 °C, 93 bar pressure) makes subsurface measurements of its bulk elemental composition extremely challenging. Instruments landed on the surface of Venus must be enclosed in a pressure vessel. The high surface temperatures also require a thermal control system to keep the instrumentation within its operational temperature range for as long as possible. Since Venus surface probes can currently operate for only a few hours, it is crucial that the lander instrumentation be able to make statistically significant measurements in a short time. An instrument is described that can achieve such a measurement over a volume of thousands of cubic centimeters of material by using high-energy penetrating neutron and gamma radiation. The instrument consists of a Pulsed Neutron Generator (PNG) and a Gamma-Ray Spectrometer (GRS). The PNG emits isotropic pulses of 14.1 MeV neutrons that penetrate the pressure vessel walls, the dense atmosphere, and the surface rock. The neutrons induce nuclear reactions in the rock to produce gamma rays with energies specific to the element and nuclear process involved. Thus the energies of the detected gamma rays identify the elements present, and their intensities provide the abundance of each element. The GRS spectra are analyzed to determine the Venus elemental composition from the spectral signatures of individual major, minor, and trace radioactive elements. As a test of such an instrument, a Schlumberger Litho Scanner oil well logging tool was used in a series of experiments at NASA's Goddard Space Flight Center. The Litho Scanner tool was mounted above large (1.8 m × 1.8 m × 0.9 m) granite and basalt monuments and made a series of one-hour elemental composition measurements in a planar geometry more similar to a planetary lander measurement. Initial analysis of the results shows good agreement with target elemental assays.

  19. Combining the Finite Element Method with Structural Connectome-based Analysis for Modeling Neurotrauma: Connectome Neurotrauma Mechanics

    PubMed Central

    Kraft, Reuben H.; Mckee, Phillip Justin; Dagro, Amy M.; Grafton, Scott T.

    2012-01-01

    This article presents the integration of brain injury biomechanics and graph theoretical analysis of neuronal connections, or connectomics, to form a neurocomputational model that captures spatiotemporal characteristics of trauma. We relate localized mechanical brain damage predicted from biofidelic finite element simulations of the human head subjected to impact with degradation in the structural connectome for a single individual. The finite element model incorporates various length scales into the full head simulations by including anisotropic constitutive laws informed by diffusion tensor imaging. Coupling between the finite element analysis and network-based tools is established through experimentally-based cellular injury thresholds for white matter regions. Once edges are degraded, graph theoretical measures are computed on the “damaged” network. For a frontal impact, the simulations predict that the temporal and occipital regions undergo the most axonal strain and strain rate at short times (less than 24 hrs), leading to the initiation of cellular death; the resulting damage depends on the angle of impact and the underlying microstructure of the brain tissue. The monotonic cellular death relationships predict a spatiotemporal change of structural damage. Interestingly, at 96 hrs post-impact, the computations predict that no network nodes were completely disconnected from the network, despite significant damage to network edges. At early times, network measures of global and local efficiency were degraded little; however, as time increased to 96 hrs the network properties were significantly reduced. In the future, this computational framework could help inform functional networks from physics-based structural brain biomechanics to obtain not only a biomechanics-based understanding of injury, but also neurophysiological insight. PMID:22915997
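
    Network measures of the kind mentioned (e.g., global efficiency) can be computed on a toy graph before and after "damage" to an edge; the 4-node ring below is invented for illustration and is far smaller than a real connectome.

```python
from collections import deque

# Minimal global-efficiency computation via breadth-first search.
def shortest_paths(adj, src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Average of 1/d(u, v) over all ordered node pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for u in nodes:
        d = shortest_paths(adj, u)
        total += sum(1.0 / d[v] for v in d if v != u)
    return total / (n * (n - 1))

ring = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
eff_before = global_efficiency(ring)
# "Damage": remove the 0-1 edge, lengthening several paths.
damaged = {0: {3}, 1: {2}, 2: {1, 3}, 3: {0, 2}}
eff_after = global_efficiency(damaged)
print(round(eff_before, 3), round(eff_after, 3))  # → 0.833 0.722
```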

  20. CMM Data Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order to not only track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program as well as graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting towards an out-of-tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.

  1. Elementary Mode Analysis: A Useful Metabolic Pathway Analysis Tool for Characterizing Cellular Metabolism

    PubMed Central

    Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich

    2010-01-01

    Elementary Mode Analysis is a useful metabolic pathway analysis tool for identifying the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose an intricate metabolic network, comprised of highly interconnected reactions, into uniquely organized pathways. These pathways, consisting of a minimal set of enzymes that can support steady-state operation of cellular metabolism, represent independent cellular physiological states. Such a pathway definition provides a rigorous basis for systematically characterizing cellular phenotypes, metabolic network regulation, robustness, and fragility, facilitating the understanding of cell physiology and the implementation of metabolic engineering strategies. This mini-review aims to give an overview of the development and application of elementary mode analysis as a metabolic pathway analysis tool for studying cell physiology and as a basis of metabolic engineering. PMID:19015845
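
    The steady-state condition at the heart of elementary mode analysis is S·v = 0 for the stoichiometric matrix S. The sketch below computes the linear flux subspace for a toy pathway; true elementary modes additionally impose irreversibility (vᵢ ≥ 0) and non-decomposability constraints, which this sketch omits.

```python
import numpy as np

def null_space(S, tol=1e-10):
    """Columns spanning {v : S v = 0}, via SVD rank detection."""
    _, s, Vt = np.linalg.svd(S)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

# Toy linear pathway: uptake -> A, A -> B, B -> excretion.
# Rows are metabolites A and B; columns are the three reactions.
S = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])
N = null_space(S)
print(N.shape[1])  # → 1 (a single through-pathway flux mode)
```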

  2. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.

  3. Study on Edge Thickening Flow Forming Using the Finite Elements Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Young Jin; Park, Jin Sung; Cho, Chongdu

    2011-08-01

    This study examines the forming features of the flow stress property and an incremental forming method for increasing the thickness of the material. Recently, optimized forming methods have been widely studied through finite element analysis to optimize forming process conditions in many different forming fields. The optimal forming method should be adopted to meet geometric requirements such as the reduction in volume per unit length of material in forging, rolling, spinning, etc. However, conventional studies have not dealt with this issue regarding volume per unit length. For this study we use the finite element method and model a gear part of an automotive engine flywheel, which is a weld assembly of a plate and a gear of different thicknesses. In the simulation of the present study, an optimized forming condition for gear machining, considering the thickness of the outer edge of the flywheel, is studied using finite element analysis of the thickness-increasing forming method. It is concluded from the study that the forming method for increasing the thickness per unit length for gear machining is reasonable, based on the finite element analysis and forming tests.

  4. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E₀ ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental mineral monazite sample from Malaysia, the sensitivity of REIXE is compared to well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules picked up from the Pacific deep sea is discussed, which showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for traces of Pt did not show any measurable content within an upper limit of 250 ppm.

  5. Elements of healthy death: a thematic analysis.

    PubMed

    Estebsari, Fatemeh; Taghdisi, Mohammad Hossein; Mostafaei, Davood; Rahimi, Zahra

    2017-01-01

    Background: Death is a natural and frightening phenomenon, which is inevitable. Previous studies on death, which presented a negative and tedious image of this process, are now being revised and directed towards acceptable death and good death. One of the proposed terms about death and dying is "healthy death", which encourages dealing with death positively and leading a lively and happy life until the last moment. This study aimed to explain the views of Iranians about the elements of healthy death. Methods: This qualitative study was conducted for 12 months in two general hospitals in Tehran (capital of Iran), using the thematic analysis method. After conducting 23 in-depth interviews with 21 participants, transcription of content, and data immersion and analysis, themes, as the smallest meaningful units were extracted, encoded and classified. Results: One main category of healthy death with 10 subthemes, including dying at the right time, dying without hassle, dying without cost, dying without dependency and control, peaceful death, not having difficulty at dying, not dying alone and dying at home, inspired death, preplanned death, and presence of a clergyman or a priest, were extracted as the elements of healthy death from the perspective of the participants in this study. Conclusion: The study findings well explained the elements of healthy death. Paying attention to the conditions and factors causing healthy death by professionals and providing and facilitating quality services for patients in the end stage of life make it possible for patients to experience a healthy death.

  6. SATRAT: Staphylococcus aureus transcript regulatory network analysis tool.

    PubMed

    Gopal, Tamilselvi; Nagarajan, Vijayaraj; Elasri, Mohamed O

    2015-01-01

    Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to these data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool, SATRAT (the S. aureus transcript regulatory network analysis tool), based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool will help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource are open source, enabling researchers to build similar resources for other bacterial systems.

  7. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap, or to early tool replacements that waste still-serviceable tools. ISO 3685 provides the flank-wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. To aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a finite element model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results specific to a given level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  8. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  9. Advanced Software for Analysis of High-Speed Rolling-Element Bearings

    NASA Technical Reports Server (NTRS)

    Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.

    2003-01-01

    COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses: thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.

  10. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
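
    A minimal example of the kind of Markov reliability model such a tool would construct automatically: a one-of-two (duplex) system without repair, integrated numerically and checked against the closed form. The failure rate and time horizon are assumed for illustration.

```python
import math

# States: both units up -> one up -> system failed; per-unit failure
# rate lam, no repair. Forward-Euler integration of the Markov chain.
lam = 1e-3    # failures per hour (assumed)
t_end = 100.0
dt = 0.01

p = [1.0, 0.0, 0.0]   # P(2 up), P(1 up), P(failed)
for _ in range(int(t_end / dt)):
    d0 = -2 * lam * p[0]
    d1 = 2 * lam * p[0] - lam * p[1]
    d2 = lam * p[1]
    p = [p[0] + dt * d0, p[1] + dt * d1, p[2] + dt * d2]

reliability = p[0] + p[1]
# Closed-form check: R(t) = 2 e^{-lam t} - e^{-2 lam t}.
closed_form = 2 * math.exp(-lam * t_end) - math.exp(-2 * lam * t_end)
print(round(reliability, 6), round(closed_form, 6))
```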

  11. DIY Solar Market Analysis Webinar Series: Community Solar Scenario Tool |

    Science.gov Websites

    State, Local, and Tribal Governments | NREL. As part of the DIY Solar Market Analysis webinar series, NREL presented a live webinar on Wednesday, August 13, 2014, titled "Community Solar Scenario Tool: Planning for a fruitful solar garden."

  12. A reliability analysis tool for SpaceWire network

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and a basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power, and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis for SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task populates a system reliability matrix, and the reliability of the network system is deduced by integrating all of the reliability indexes in that matrix. Using this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and for multi-path-task reliability are also implemented. With this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool should therefore have a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
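
    The core arithmetic behind a task-based comparison of basic versus redundant architectures is standard series/parallel reliability algebra. The sketch below uses invented element reliabilities and an invented path layout purely to illustrate why the dual-redundancy scheme scores higher.

```python
from functools import reduce

# Hypothetical element reliabilities for one communication task routed
# through a SpaceWire-style network (all names and numbers illustrative).

def series(rels):
    """Task fails if any element on the path fails."""
    return reduce(lambda a, b: a * b, rels, 1.0)

def parallel(rels):
    """Task survives if at least one redundant path works."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), rels, 1.0)

node, link, router = 0.999, 0.995, 0.998
# Basic architecture: single path node-link-router-link-node.
basic = series([node, link, router, link, node])
# Dual redundancy: the link-router-link segment is duplicated.
dual = series([node, parallel([series([link, router, link])] * 2), node])
print(basic < dual)  # → True: redundancy improves task reliability
```

    A full tool repeats this per task and collects the results into the reliability matrix described in the abstract.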

  13. Deliberate teaching tools for clinical teaching encounters: A critical scoping review and thematic analysis to establish definitional clarity.

    PubMed

    Sidhu, Navdeep S; Edwards, Morgan

    2018-04-27

    We conducted a scoping review of tools designed to add structure to clinical teaching, with a thematic analysis to establish definitional clarity. A total of 6,049 citations were screened, 434 were reviewed for eligibility, and 230 were identified as meeting the study inclusion criteria. Eighty-nine names and 51 definitions were identified. Based on a post facto thematic analysis, we propose that these tools be named "deliberate teaching tools" (DTTs) and defined as "frameworks that enable clinicians to have a purposeful and considered approach to teaching encounters by incorporating elements identified with good teaching practice." We identified 46 DTTs in the literature, with 38 (82.6%) originally described for the medical setting. Forty justification articles consisted of 16 feedback surveys, 13 controlled trials, seven pre-post intervention studies with no control group, and four observation studies. Current evidence of efficacy is not entirely conclusive, and many studies contain methodological flaws. Forty-nine clarification articles comprised 12 systematic reviews and 37 narrative reviews. The largest number of DTTs described by any single review was four. A common design theme was identified in approximately three-quarters of DTTs. The applicability of DTTs to specific alternate settings should be considered in context, and appropriately designed justification studies are warranted to demonstrate efficacy.

  14. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or the tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers use in analyzing the costs and benefits of all projects, but they cannot modify the criteria by which projects are analyzed: that access is limited to managers according to their level of administrative privilege. PAT affords flexibility to modify criteria for particular "focus areas" so as to standardize criteria among similar projects, making it possible to improve assessments without rewriting computer code or rehiring experts, thereby further reducing the cost of maintaining and upgrading the software. Information in the PAT database and the results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.
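
    The abstract does not publish PAT's scoring model, but the criteria-and-privilege scheme it describes suggests a weighted-criteria ranking. The sketch below is a minimal, hypothetical illustration of that pattern; the criteria, weights, and project ratings are all invented.

```python
# Illustrative weighted-criteria scoring (not PAT's actual model):
# each project is rated against criteria whose weights a manager controls,
# and the weighted sums rank the portfolio.

criteria = {"strategic_fit": 0.5, "cost_risk": 0.3, "schedule_risk": 0.2}

projects = {
    "Project A": {"strategic_fit": 8, "cost_risk": 6, "schedule_risk": 7},
    "Project B": {"strategic_fit": 6, "cost_risk": 9, "schedule_risk": 8},
}

def weighted_score(ratings, weights):
    """Sum of each criterion rating times its weight."""
    return sum(weights[c] * ratings[c] for c in weights)

ranking = sorted(projects,
                 key=lambda p: weighted_score(projects[p], criteria),
                 reverse=True)
for name in ranking:
    print(name, round(weighted_score(projects[name], criteria), 2))
```

    Standardizing the `criteria` table per focus area, as the abstract describes, is what lets assessments be compared across similar projects without code changes.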

  15. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of a metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials for two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. The tutorials explain the computational steps for the integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and their stratification into different phenotypes. The presented workflow supports the integrative analysis of multiple omics data sets. Importantly, all of the analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells, and offer a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  16. The Use of Finite Element Analysis to Enhance Research and Clinical Practice in Orthopedics.

    PubMed

    Pfeiffer, Ferris M

    2016-02-01

    Finite element analysis (FEA) is a very powerful tool for the evaluation of biomechanics in orthopedics. Finite element (FE) simulations can effectively and efficiently evaluate thousands of variables (such as implant variation, surgical techniques, and various pathologies) to optimize design, screening, prediction, and treatment in orthopedics. Additionally, FEA can be used to retrospectively evaluate and troubleshoot complications or failures to prevent similar future occurrences. Finally, FE simulations are used to evaluate implants, procedures, and techniques in a time- and cost-effective manner. In this work, an overview of the development of FE models is provided and an example application is presented to simulate knee biomechanics for a specimen with medial meniscus insufficiency. FE models require the development of the geometry of interest, determination of the material properties of the tissues simulated, and an accurate application of a numerical solver to produce an accurate solution and representation of the field variables. The objectives of this work are to introduce the reader to the application of FEA in orthopedic analysis of the knee joint. A brief description of the model development process as well as a specific application to the investigation of knee joint stability in geometries with normal or compromised medial meniscal attachment is included. Significant increases in stretch of the anterior cruciate ligament were predicted in specimens with medial meniscus insufficiency (such behavior was confirmed in corresponding biomechanical testing). It can be concluded from this work that FE analysis of the knee can provide significant new information with which more effective clinical decisions can be made.

  17. A preliminary analysis of trace-elemental signatures in statoliths of different spawning cohorts for Dosidicus gigas off EEZ waters of Chile

    NASA Astrophysics Data System (ADS)

    Liu, Bilin; Chen, Xinjun; Fang, Zhou; Hu, Song; Song, Qian

    2015-12-01

    We applied a solution-based ICP-MS method to quantify trace-elemental signatures in statoliths of the jumbo flying squid, Dosidicus gigas, collected from the waters off northern and central Chile during scientific surveys carried out by Chinese squid-jigging vessels in 2007 and 2008. The age and spawning date of the squid were back-calculated from daily increments in the statoliths. Eight elemental ratios (Sr/Ca, Ba/Ca, Mg/Ca, Mn/Ca, Na/Ca, Fe/Ca, Cu/Ca and Zn/Ca) were analyzed. Sr was found to be the second most abundant element after Ca, followed by Na, Fe, Mg, Zn, Cu, Ba and Mn. There was no significant relationship between element/Ca ratios and sea surface temperature (SST) or sea surface salinity (SSS), although weak negative or positive tendencies were found. MANOVA showed that the multivariate elemental signatures did not differ among the cohorts spawned in spring, autumn and winter, and no significant difference was found between the northern and central sampling locations. Classification results showed that all individuals of each spawning cohort were correctly classified. This study demonstrates that the elemental signatures in D. gigas statoliths are potentially a useful tool for improving our understanding of its population structure and habitat environment.

  18. Finite element methodology for integrated flow-thermal-structural analysis

    NASA Technical Reports Server (NTRS)

    Thornton, Earl A.; Ramakrishnan, R.; Vemaganti, G. R.

    1988-01-01

    The papers "An Adaptive Finite Element Procedure for Compressible Flows and Strong Viscous-Inviscid Interactions" and "An Adaptive Remeshing Method for Finite Element Thermal Analysis" were presented at the AIAA Thermophysics, Plasma Dynamics and Lasers Conference, San Antonio, Texas, June 27-29, 1988. The papers describe research supported under NASA/Langley Research Grant NsG-1321 and are submitted in fulfillment of the progress-report requirement on the grant for the period ending February 29, 1988.

  19. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  20. Nonlinear solid finite element analysis of mitral valves with heterogeneous leaflet layers

    NASA Astrophysics Data System (ADS)

    Prot, V.; Skallerud, B.

    2009-02-01

    An incompressible transversely isotropic hyperelastic material for solid finite element analysis of a porcine mitral valve response is described. The material model implementation is checked in single element tests and compared with a membrane implementation in an out-of-plane loading test to study how the layered structures modify the stress response for a simple geometry. Three different collagen layer arrangements are used in finite element analysis of the mitral valve. When the leaflets are arranged in two layers with the collagen on the ventricular side, the stress in the fibre direction through the thickness in the central part of the anterior leaflet is homogenized and the peak stress is reduced. A simulation using membrane elements is also carried out for comparison with the solid finite element results. Compared to echocardiographic measurements, the finite element models bulge too much in the left atrium. This may be due to evidence of active muscle fibres in some parts of the anterior leaflet, whereas our constitutive modelling is based on passive material.

  1. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, cover the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithms, and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  2. A novel hAT element in Bombyx mori and Rhodnius prolixus: its relationship with miniature inverted repeat transposable elements (MITEs) and horizontal transfer.

    PubMed

    Zhang, H-H; Shen, Y-H; Xu, H-E; Liang, H-Y; Han, M-J; Zhang, Z

    2013-10-01

    Comparative analysis of transposable elements (TEs) from different species can make it possible to reconstruct their history over evolutionary time. In this study, we identified a novel hAT element in Bombyx mori and Rhodnius prolixus with characteristic GGGCGGCA repeats in its subterminal region. Meanwhile, phylogenetic analysis demonstrated that the elements in these two species might represent a separate cluster of the hAT superfamily. Strikingly, a previously identified miniature inverted repeat transposable element (MITE) shared high identity with this autonomous element across the entire length, supporting the hypothesis that MITEs are derived from the internal deletion of DNA transposons. Interestingly, identity of the consensus sequences of this novel hAT element between B. mori and R. prolixus, which diverged about 370 million years ago, was as high as 96.5% over their full length (about 3.6 kb) at the nucleotide level. The patchy distribution amongst species, coupled with overall lack of intense purifying selection acting on this element, suggest that this novel hAT element might have experienced horizontal transfer between the ancestors of B. mori and R. prolixus. Our results highlight that this novel hAT element could be used as a potential tool for germline transformation of R. prolixus to control the transmission of Trypanosoma cruzi, which causes Chagas disease. © 2013 Royal Entomological Society.

  3. Finite element analysis of hysteresis effects in piezoelectric transducers

    NASA Astrophysics Data System (ADS)

    Simkovics, Reinhard; Landes, Hermann; Kaltenbacher, Manfred; Hoffelner, Johann; Lerch, Reinhard

    2000-06-01

    The design of ultrasonic transducers for high-power applications, e.g. in medical therapy or production engineering, calls for effective computer-aided design tools to analyze the nonlinear effects that occur. In this paper the finite-element/boundary-element package CAPA is presented, which allows modeling of different types of electromechanical sensors and actuators. These transducers are based on various physical coupling effects, such as piezoelectricity or magneto-mechanical interactions. Their computer modeling requires the numerical solution of a multifield problem, such as coupled electric-mechanical, magnetic-mechanical, or mechanical-acoustic fields. With the reported software environment we are able to compute the dynamic behavior of electromechanical sensors and actuators, taking into account geometric nonlinearities, nonlinear wave propagation, and ferroelectric as well as magnetic material nonlinearities. After a short introduction to the basic theory of the numerical calculation schemes, two practical examples demonstrate the applicability of the numerical simulation tool. The first example is an ultrasonic thickness-mode transducer made of a piezoceramic material and used for high-power ultrasound production. Due to ferroelectric hysteresis, higher-order harmonics can be detected in the actuator's input current. Under electrical and mechanical prestressing a resonance-frequency shift also occurs, caused by ferroelectric hysteresis and the nonlinear dependence of the material coefficients on electric field and mechanical stress. The second example is a power ultrasound transducer used in HIFU (high-intensity focused ultrasound) therapy. Due to the compressibility of, and losses in, the propagating fluid, nonlinear shock-wave generation can be observed. For both examples good agreement between numerical simulation and experimental data has been achieved.

  4. Single cell versus large population analysis: cell variability in elemental intracellular concentration and distribution.

    PubMed

    Malucelli, Emil; Procopio, Alessandra; Fratini, Michela; Gianoncelli, Alessandra; Notargiacomo, Andrea; Merolle, Lucia; Sargenti, Azzurra; Castiglioni, Sara; Cappadone, Concettina; Farruggia, Giovanna; Lombardo, Marco; Lagomarsino, Stefano; Maier, Jeanette A; Iotti, Stefano

    2018-01-01

    The quantification of elemental concentrations in cells is usually performed by analytical assays on large populations, missing peculiar but important rare cells. This article compares elemental quantification in single cells and in cell populations for three different cell types, using a new approach for single-cell elemental analysis performed at sub-micrometer scale that combines X-ray fluorescence microscopy and atomic force microscopy. Attention is focused on the light element Mg, exploiting the opportunity to compare single-cell quantification with cell-population analysis carried out by a highly Mg-selective fluorescent chemosensor. The results show that single-cell analysis reveals the same Mg differences found in large populations of the different cell strains studied. However, in one of the cell strains, single-cell analysis reveals two cells with an exceptionally high intracellular Mg content compared with the other cells of the same strain. Single-cell analysis allows mapping Mg and other light elements in whole cells at sub-micrometer scale. A detailed intensity-correlation analysis of the two cells with the highest Mg content reveals that the subcellular localization of Mg correlates with oxygen in a different fashion than in the other sister cells of the same strain. Graphical abstract: single-cell or large-population analysis, that is the question!

  5. Statistical methods for the forensic analysis of striated tool marks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoeksema, Amy Beth

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research into more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark and a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and to allow angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
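
    The comparison step underlying such tests can be sketched with a lag-searched correlation score between two depth profiles. This toy example simulates traces from the same and from different tools (all data invented); it is not the authors' likelihood-ratio method, only the kind of similarity score such a method would compare between known-match and questioned pairs.

```python
import math, random

def corr(x, y):
    """Pearson correlation of two equal-length depth profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def best_corr(x, y, max_lag=20):
    """Maximum correlation over relative shifts of the two traces."""
    return max(corr(x[l:l + 200], y[:200]) for l in range(max_lag))

random.seed(0)
tool = [random.gauss(0, 1) for _ in range(260)]       # tool "signature"
mark1 = [v + random.gauss(0, 0.3) for v in tool]      # lab mark
mark2 = [v + random.gauss(0, 0.3) for v in tool[7:]]  # shifted field mark
other = [random.gauss(0, 1.05) for _ in range(260)]   # different tool

print(best_corr(mark1[7:], mark2) > best_corr(other, mark2))  # → True
```

    A likelihood-ratio test would then contrast the distribution of such scores for lab-vs-lab comparisons against lab-vs-field comparisons.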

  6. [Content of mineral elements of Gastrodia elata by principal components analysis].

    PubMed

    Li, Jin-ling; Zhao, Zhi; Liu, Hong-chang; Luo, Chun-li; Huang, Ming-jin; Luo, Fu-lai; Wang, Hua-lei

    2015-03-01

    To study the content of mineral elements and the principal components in Gastrodia elata, mineral elements were determined by ICP and the data were analyzed with SPSS. K had the highest content, with an average of 15.31 g x kg(-1); N was next, with an average content of 8.99 g x kg(-1). The coefficients of variation of K and N were small, while that of Mn was the largest, at 51.39%. Highly significant positive correlations were found among N, P and K. Three principal components were selected by principal component analysis to evaluate the quality of G. elata; P, B, N, K, Cu, Mn, Fe and Mg were the characteristic elements of G. elata. The contents of K and N were higher and relatively stable, while the variation of Mn content was the largest. From the perspective of mineral elements, the quality of G. elata from Guizhou and Yunnan was better.
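
    The PCA step used in such quality evaluations can be illustrated from first principles: standardize the element columns, form their correlation matrix, and extract the leading principal component. The element data below are invented for illustration (not the paper's measurements), and only the first component is computed, by power iteration.

```python
import math

# Hypothetical samples x elements (columns: K, N, P, in g/kg); invented data.
data = [[15.1, 8.8, 2.1], [15.6, 9.2, 2.3], [14.9, 8.7, 2.0], [15.7, 9.3, 2.2]]

def standardize(cols):
    """Zero-mean, unit-variance columns (population std)."""
    out = []
    for col in cols:
        m = sum(col) / len(col)
        s = math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
        out.append([(v - m) / s for v in col])
    return out

cols = standardize(list(map(list, zip(*data))))  # transpose to columns
n, k = len(data), len(cols)
corr = [[sum(a * b for a, b in zip(cols[i], cols[j])) / n for j in range(k)]
        for i in range(k)]

vec = [1.0] * k  # power iteration for the leading eigenvector (PC1 loadings)
for _ in range(100):
    nxt = [sum(corr[i][j] * vec[j] for j in range(k)) for i in range(k)]
    norm = math.sqrt(sum(v * v for v in nxt))
    vec = [v / norm for v in nxt]

# Rayleigh quotient = leading eigenvalue; divided by k (total variance of
# standardized data) it gives the variance fraction explained by PC1.
eigval = sum(vec[i] * sum(corr[i][j] * vec[j] for j in range(k))
             for i in range(k))
print(round(eigval / k, 2))
```

    In practice one would keep as many components as needed (three in the paper) and inspect the loadings to identify characteristic elements.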

  7. Surface Analysis Cluster Tool | Materials Science | NREL

    Science.gov Websites

    The cluster tool supports spectroscopic ellipsometry during film deposition and can be used to study the effect of various treatments prior to analysis. Here we illustrate the surface-cleaning effect of an aqueous ammonia treatment.

  8. PLANS; a finite element program for nonlinear analysis of structures. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Pifko, A.; Armen, H., Jr.; Levy, A.; Levine, H.

    1977-01-01

    The PLANS system, rather than being one comprehensive computer program, is a collection of finite element programs used for the nonlinear analysis of structures. This collection of programs evolved and is based on the organizational philosophy in which classes of analyses are treated individually based on the physical problem class to be analyzed. Each of the independent finite element computer programs of PLANS, with an associated element library, can be individually loaded and used to solve the problem class of interest. A number of programs have been developed for material nonlinear behavior alone and for combined geometric and material nonlinear behavior. The usage, capabilities, and element libraries of the current programs include: (1) plastic analysis of built-up structures where bending and membrane effects are significant, (2) three dimensional elastic-plastic analysis, (3) plastic analysis of bodies of revolution, and (4) material and geometric nonlinear analysis of built-up structures.

  9. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  10. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories and is used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, and this has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data: 8 for Microsoft Windows, 5 web-based, 9 R-based, and 5 for other platforms. The reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy-number variations, and digital PCR. We report an overview of the functionality, features, and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and the statistical methods offered. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and that only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
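
    One of the quantification strategies such tools implement is relative quantification by the 2^-ΔΔCt method. A minimal sketch, with invented Ct values, assuming ~100% amplification efficiency:

```python
# 2^-ΔΔCt relative quantification: the target gene's Ct is normalized to a
# reference gene in both conditions, then the treated/control difference
# of those deltas sets the fold change. Ct values below are invented.

def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression, assuming ~100% amplification efficiency."""
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(dct_treated - dct_control)

# Target Ct drops by 2 cycles relative to the reference gene:
fold = ddct_fold_change(22.0, 18.0, 24.0, 18.0)
print(fold)  # → 4.0
```

    More elaborate strategies (efficiency-corrected models, standard curves) relax the 100%-efficiency assumption; the surveyed tools differ chiefly in which of these they provide.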

  11. Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols

    EPA Science Inventory

    In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...
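
    The fitting problem described, modeling a measured spectrum as a linear combination of library reference spectra, reduces to least squares. The toy example below (8 invented channels, two hypothetical "Fe" and "Zn" line shapes, not real EDXRF data) solves the 2x2 normal equations and recovers the mixing coefficients; adding many more library spectra than the sample warrants is what inflates the fitted uncertainties.

```python
# Ordinary least squares fit of a measured spectrum to two reference
# spectra via the normal equations. All vectors are invented.

ref_fe = [0, 1, 4, 1, 0, 0, 0, 0]               # "Fe" line shape (hypothetical)
ref_zn = [0, 0, 0, 1, 1, 3, 1, 0]               # "Zn" line shape, overlapping Fe
measured = [0, 2.1, 8.0, 2.4, 0.5, 1.6, 0.5, 0]  # ~2x Fe + ~0.5x Zn + noise

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Normal equations: [aa ab; ab bb] [c_fe; c_zn] = [ay; by]
aa, bb, ab = dot(ref_fe, ref_fe), dot(ref_zn, ref_zn), dot(ref_fe, ref_zn)
ay, by = dot(ref_fe, measured), dot(ref_zn, measured)
det = aa * bb - ab * ab
c_fe = (ay * bb - ab * by) / det
c_zn = (aa * by - ab * ay) / det
print(round(c_fe, 2), round(c_zn, 2))
```

    With a large library, the same normal-equations matrix becomes ill-conditioned when reference spectra overlap heavily, which is why pruning undetectable elements from the library improves the fit.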

  12. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    ERIC Educational Resources Information Center

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands…

  13. The Development of a Humanitarian Health Ethics Analysis Tool.

    PubMed

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools have become increasingly prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in the decision-making process.

  14. Elemental investigation of Syrian medicinal plants using PIXE analysis

    NASA Astrophysics Data System (ADS)

    Rihawy, M. S.; Bakraji, E. H.; Aref, S.; Shaban, R.

    2010-09-01

    Particle-induced X-ray emission (PIXE) has been employed to perform elemental analysis of K, Ca, Mn, Fe, Cu, Zn, Br and Sr in Syrian medicinal plants traditionally used to enhance the body's immunity. Plant samples were prepared in a simple dried base. The results were verified by comparison with those obtained from the IAEA-359 and IAEA-V10 reference materials; relative standard deviations mostly within ±5-10% suggest good precision. A correlation between the elemental content of each medicinal plant and its traditional remedial usage is proposed. K and Ca were found to be the major elements in the samples. Fe, Mn and Zn were detected at good levels in most of these plants, clarifying their possible contribution to keeping the body's immune system in good condition. The contribution of the elements in these plants to the dietary recommended intakes (DRI) has been evaluated, and the advantages and limitations of the PIXE analytical technique in this investigation are reviewed.

  15. Design of an elemental analysis system for CELSS research

    NASA Technical Reports Server (NTRS)

    Schwartzkopf, Steven H.

    1987-01-01

    The results of experiments conducted with higher plants in tightly sealed growth chambers provide definite evidence that the physical closure of a chamber has significant effects on many aspects of a plant's biology. One of these effects is seen in changes in the rates of uptake, distribution, and re-release of nutrient elements by the plant (mass balance). Experimental data indicate that these rates differ from those recorded for plants grown in open-field agriculture or in open growth chambers. Since higher plants are a crucial component of a controlled ecological life support system (CELSS), it is important that the consequences of these rate differences be understood with regard to the growth and yield of the plants. A description is given of a system for elemental analysis that can be used to monitor the mass balance of nutrient elements in CELSS experiments. Additionally, data on the uptake of nutrient elements by higher plants grown in a growth chamber are presented.

  16. Laser-ablation ICP-MS as a tool for whole rock trace element analyses on fused powders

    NASA Astrophysics Data System (ADS)

    Girard, G.; Rooney, T. O.

    2013-12-01

    Here we present an accurate and precise technique for routine trace element analysis of geologic materials by laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). We focus on rock powders previously prepared for X-ray fluorescence by fusion in a Li2B4O7 flux and subsequently quenched in a Pt mold to form a glass disk. Our method allows for the analysis of up to 30 trace elements by LA-ICP-MS using a Photon-Machines Analyte G2 193 nm excimer laser coupled to a Thermo-Fisher Scientific ICAP Q quadrupole ICP-MS. Analyses are run as scans on the surface of the disks. Laser ablation conditions for which trace element fractionation effects are minimal have been empirically determined to be ~4 J m⁻² fluence, at 10 Hz, and 10 μm s⁻¹ scan speed, using a 110 μm laser beam size. Ablated material is carried into the ICP-MS by a He carrier gas at a rate of 0.75 L min⁻¹. Following pre-ablation to remove surface particles, samples are ablated for 200 s, of which 140 s are used for data acquisition. At the end of each scan, a gas blank is collected for 30 s. Dwell times for each element vary between 15 and 60 μs, depending on abundance and instrument sensitivity, allowing 120 readings of each element during the data acquisition time window. To correct for variations in the total volume of material extracted by the laser, three internal standards are used: Ca, Fe and Zr. These elements are routinely analyzed by X-ray fluorescence by the Geoanalytical laboratory at Michigan State University with a precision and accuracy of <5%. The availability of several internal standards allows for better correction of possible persisting laser ablation fractionation effects; for a particular trace element, we correct using the internal standard that best reproduces its ablation behavior. Our calibration is based on a combination of fused powders of US Geological Survey and Geological Survey of Japan rock standards, NIST SRM 612 glass, and US Geological Survey natural and
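    The internal-standard correction described above can be sketched in a few lines: dividing the analyte signal by the signal of an element whose concentration is independently known cancels variations in ablated volume. This is a minimal illustration of the general idea under a simple linear-sensitivity assumption; the function name, numbers, and sensitivity model are hypothetical, not the authors' implementation.

```python
# Sketch of internal-standard correction for laser-ablation signals.
# Illustrative only: a simple linear sensitivity model is assumed;
# the function and variable names are hypothetical.

def correct_with_internal_standard(analyte_counts, istd_counts,
                                   istd_known_conc, sensitivity_ratio):
    """Return an analyte concentration after normalizing the ablation
    yield against an internal standard (e.g. Ca, Fe or Zr) whose
    concentration is known independently (here, from XRF)."""
    # Dividing by the internal-standard signal removes variation in
    # the total volume of material extracted by the laser.
    normalized = analyte_counts / istd_counts
    return normalized * istd_known_conc / sensitivity_ratio

# Doubling the ablated volume doubles both signals, so the corrected
# concentration is unchanged.
c1 = correct_with_internal_standard(1000.0, 5000.0, 40.0, 2.0)
c2 = correct_with_internal_standard(2000.0, 10000.0, 40.0, 2.0)
print(c1, c2)  # 4.0 4.0
```

In practice one would pick, per element, the internal standard whose ablation behavior it best tracks, as the abstract notes.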

  17. A new discrete Kirchhoff-Mindlin element based on Mindlin-Reissner plate theory and assumed shear strain fields. I - An extended DKT element for thick-plate bending analysis. II - An extended DKQ element for thick-plate bending analysis

    NASA Astrophysics Data System (ADS)

    Katili, Irwan

    1993-06-01

    A new three-node nine-degree-of-freedom triangular plate bending element is proposed which is valid for the analysis of both thick and thin plates. The element, called the discrete Kirchhoff-Mindlin triangle (DKMT), has a proper rank, passes the patch test for thin and thick plates in an arbitrary mesh, and is free of shear locking. As an extension of the DKMT element, a four-node element with 3 degrees of freedom per node is developed. The element, referred to as DKMQ (discrete Kirchhoff-Mindlin quadrilateral) is found to provide good results for both thin and thick plates without any compatibility problems.

  18. Recurrence time statistics: versatile tools for genomic DNA sequence analysis.

    PubMed

    Cao, Yinhe; Tung, Wen-Wen; Gao, J B

    2004-01-01

    With the completion of the human and a few model organisms' genomes, and the genomes of many other organisms waiting to be sequenced, it has become increasingly important to develop faster computational tools which are capable of easily identifying the structures and extracting features from DNA sequences. Among the more important structures in a DNA sequence are repeats; often they have to be masked before protein coding regions along a DNA sequence are identified or redundant expressed sequence tags (ESTs) are sequenced. Here we report a novel recurrence-time-based method for sequence analysis. The method can conveniently study all kinds of periodicity and exhaustively find all repeat-related features in a genomic DNA sequence. An efficient codon index is also derived from the recurrence time statistics, which has the salient features of being largely species-independent and working well on very short sequences. Efficient codon indices are key elements of successful gene finding algorithms, and are particularly useful for determining whether a suspected EST belongs to a coding or non-coding region. We illustrate the power of the method by studying the genomes of E. coli, the yeast S. cerevisiae, the nematode worm C. elegans, and the human, Homo sapiens. Computationally, our method is very efficient. It allows us to carry out analysis of genomes on the whole genomic scale on a PC.
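    The core of a recurrence-time analysis can be sketched compactly: record, for every k-mer, the distances between its successive occurrences, so that repeats and periodicities show up as strongly peaked recurrence-time distributions. This is a minimal sketch of the general idea, not the authors' code; the function name and parameters are hypothetical.

```python
from collections import defaultdict

def recurrence_times(seq, k):
    """Map each k-mer to the list of distances between its successive
    occurrences in the sequence."""
    last_seen = {}            # k-mer -> position of its last occurrence
    times = defaultdict(list)
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in last_seen:
            times[kmer].append(i - last_seen[kmer])
        last_seen[kmer] = i
    return dict(times)

# A perfect period-3 repeat yields recurrence time 3 for its 3-mers.
times = recurrence_times("ATGATGATGATG", 3)
print(times["ATG"])  # [3, 3, 3]
```

A single linear pass suffices, which is consistent with the abstract's point that whole-genome analysis is feasible on a PC.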

  19. Axisymmetric analysis of a tube-type acoustic levitator by a finite element method.

    PubMed

    Hatano, H

    1994-01-01

    A finite element approach was taken for the study of the sound field and positioning force in a tube-type acoustic levitator. An axisymmetric model, where a rigid sphere is suspended on the tube axis, was introduced to model a cylindrical chamber of a levitation tube furnace. Distributions of velocity potential, magnitudes of positioning force, and resonance frequency shifts of the chamber due to the presence of the sphere were numerically estimated in relation to the sphere's position and diameter. Experiments were additionally made to compare with the simulation. The finite element method proved to be a useful tool for analyzing and designing the tube-type levitator.

  20. Material nonlinear analysis via mixed-iterative finite element method

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1992-01-01

    The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.

  1. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  2. Inferring transposons activity chronology by TRANScendence - TEs database and de-novo mining tool.

    PubMed

    Startek, Michał Piotr; Nogły, Jakub; Gromadka, Agnieszka; Grzebelus, Dariusz; Gambin, Anna

    2017-10-16

    The constant progress in sequencing technology leads to ever increasing amounts of genomic data. In the light of current evidence, transposable elements (TEs for short) are becoming useful tools for learning about the evolution of the host genome. Therefore software for genome-wide detection and analysis of TEs is of great interest. Here we describe a computational tool for mining, classifying and storing TEs from newly sequenced genomes. This is an online, web-based, user-friendly service, enabling users to upload their own genomic data and perform de-novo searches for TEs. The detected TEs are automatically analyzed, compared to reference databases, annotated, clustered into families, and stored in a TEs repository. Also, the genome-wide nesting structure of the found elements is detected and analyzed by a new method for inferring the evolutionary history of TEs. We illustrate the functionality of our tool by performing a full-scale analysis of the TE landscape in the Medicago truncatula genome. TRANScendence is an effective tool for the de-novo annotation and classification of transposable elements in newly-acquired genomes. Its streamlined interface makes it well-suited for evolutionary studies.

  3. Stability analysis using SDSA tool

    NASA Astrophysics Data System (ADS)

    Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa

    2011-11-01

    The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of an aircraft as early as the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand-alone software and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free-flight experimental results to compare and validate the presented software. Two aircraft are considered, the EADS Ranger 2000 and the Warsaw University designed PW-6 glider. For the two cases considered here the SDSA software is shown to be an excellent tool for predicting the dynamic characteristics of an aircraft.

  4. Finite element analysis of end notch flexure specimen

    NASA Technical Reports Server (NTRS)

    Mall, S.; Kochhar, N. K.

    1986-01-01

    A finite element analysis of the end notch flexure specimen for mode II interlaminar fracture toughness measurement was conducted. The effect of friction between the crack faces and large deflection on the evaluation of G sub IIc from this specimen were investigated. Results of this study are presented in this paper.

  5. Finite-element analysis of end-notch flexure specimens

    NASA Technical Reports Server (NTRS)

    Mall, S.; Kochhar, N. K.

    1986-01-01

    A finite-element analysis of the end-notch flexure specimen for Mode II interlaminar fracture toughness measurement was conducted. The effects of friction between the crack faces and large deflection on the evaluation of G(IIc) from this specimen were investigated. Results of this study are presented in this paper.

  6. Ultra-Sensitive Elemental Analysis Using Plasmas 7. Application to Criminal Investigation

    NASA Astrophysics Data System (ADS)

    Suzuki, Yasuhiro

    This paper describes the application of trace elemental analysis using ICP-AES and ICP-MS to criminal investigation. The comparison of trace elements, such as Rb, Sr, Zr, and so on, is effective for the forensic discrimination of glass fragments, which can be important physical evidence for connecting a suspect to a crime scene or to a victim. This procedure can also be applied to lead shotgun pellets by the removal of the matrix lead as a sulfate precipitate after the dissolution of a pellet sample. The determination of a toxic element in biological samples is required to prove that a victim ingested this element. Arsenous acids produced in Japan, China, Germany and Switzerland show trace-element patterns characteristic of each country.

  7. On modelling three-dimensional piezoelectric smart structures with boundary spectral element method

    NASA Astrophysics Data System (ADS)

    Zou, Fangxin; Aliabadi, M. H.

    2017-05-01

    The computational efficiency of the boundary element method in elastodynamic analysis can be significantly improved by employing high-order spectral elements for boundary discretisation. In this work, for the first time, the so-called boundary spectral element method is utilised to formulate the piezoelectric smart structures that are widely used in structural health monitoring (SHM) applications. The resultant boundary spectral element formulation has been validated by the finite element method (FEM) and physical experiments. The new formulation has demonstrated a lower demand on computational resources and a higher numerical stability than commercial FEM packages. Compared to the conventional boundary element formulation, a significant reduction in computational expenses has been achieved. In summary, the boundary spectral element formulation presented in this paper provides a highly efficient and stable mathematical tool for the development of SHM applications.

  8. Simulation model of an eyeball based on finite element analysis on a supercomputer.

    PubMed

    Uchio, E; Ohno, S; Kudoh, J; Aoki, K; Kisielewicz, L T

    1999-10-01

    A simulation model of the human eye was developed. It was applied to the determination of the physical and mechanical conditions of impacting foreign bodies causing intraocular foreign body (IOFB) injuries. Modules of Hypermesh (Altair Engineering, Tokyo, Japan) were used for solid modelling, geometric construction, and finite element mesh creation based on information obtained from cadaver eyes. The simulations were solved by a supercomputer using the finite element analysis (FEA) program PAM-CRASH (Nihon ESI, Tokyo, Japan). It was assumed that rupture occurs at a strain of 18.0% in the cornea and 6.8% in the sclera, and at a stress of 9.4 MPa for both cornea and sclera. Blunt-shaped missiles were shot and set to impact on the surface of the cornea or sclera at velocities of 30 and 60 m/s, respectively. According to the simulation, the missile sizes above which corneal rupture occurred at velocities of 30 and 60 m/s were 1.95 and 0.82 mm, respectively. The missile sizes causing scleral rupture were 0.95 and 0.75 mm at velocities of 30 and 60 m/s, respectively. These results suggest that this FEA model has potential usefulness as a simulation tool for ocular injury and may provide useful information for developing protective measures against industrial and traffic ocular injuries.
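    The rupture criteria quoted above (failure strains of 18.0% for the cornea and 6.8% for the sclera, plus a common failure stress of 9.4 MPa) can be encoded directly. A minimal sketch, assuming rupture is flagged when either threshold is reached; the function name and the "either criterion" rule are assumptions for illustration, not the PAM-CRASH implementation.

```python
# Failure thresholds quoted in the abstract.
FAILURE_STRAIN = {"cornea": 0.180, "sclera": 0.068}
FAILURE_STRESS_MPA = 9.4  # common to cornea and sclera

def ruptures(tissue, strain, stress_mpa):
    """Flag rupture when either the tissue-specific failure strain or
    the common failure stress is reached (assumed 'either' rule)."""
    return (strain >= FAILURE_STRAIN[tissue]
            or stress_mpa >= FAILURE_STRESS_MPA)

# The sclera is assumed to fail at a much lower strain than the cornea.
print(ruptures("sclera", 0.07, 1.0))  # True
print(ruptures("cornea", 0.07, 1.0))  # False
```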

  9. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  10. Stability analysis of flexible wind turbine blades using finite element method

    NASA Technical Reports Server (NTRS)

    Kamoulakos, A.

    1982-01-01

    Static vibration and flutter analysis of a straight elastic axis blade was performed based on a finite element method solution. The total potential energy functional was formulated according to linear beam theory. The inertia and aerodynamic loads were formulated according to the blade absolute acceleration and absolute velocity vectors. In vibration analysis, the direction of motion of the blade during the first out-of-plane and first in-plane modes was examined; numerical results involve NASA/DOE Mod-0, McCauley propeller, north wind turbine and flat plate behavior. In flutter analysis, comparison cases were examined involving several references. Vibration analysis of a nonstraight elastic axis blade based on a finite element method solution was performed in a similar manner to the straight elastic axis blade, since it was recognized that a curved blade can be approximated by an assembly of a sufficient number of straight blade elements at different inclinations with respect to a common system of axes. Numerical results involve a comparison between the behavior of a straight and a curved cantilever beam during the lowest two in-plane and out-of-plane modes.

  11. Finite element analysis on the bending condition of truck frame before and after opening

    NASA Astrophysics Data System (ADS)

    Cai, Kaiwu; Cheng, Wei; Lu, Jifu

    2018-05-01

    Based on the design parameters of a truck frame, the structure of the frame was designed and modelled. Based on finite element theory, the loads, fatigue type, and material parameters of the frame were combined with those of the semi-trailer. Using finite element analysis software, the truck frame was analyzed in the bending condition before and after opening a hole, and the comparison showed that the frame with the hole still meets the strength requirements under the bending condition, which is very helpful for improving the design of the truck frame.

  12. Finite element modeling and analysis of reinforced-concrete bridge.

    DOT National Transportation Integrated Search

    2000-09-01

    Despite its long history, the finite element method continues to be the predominant strategy employed by engineers to conduct structural analysis. A reliable method is needed for analyzing structures made of reinforced concrete, a complex but common ...

  13. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

  14. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  15. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    ERIC Educational Resources Information Center

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  16. [Three dimensional mathematical model of tooth for finite element analysis].

    PubMed

    Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka

    2010-01-01

    The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to the literature data on dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programmes for solid modeling. The aim of this study was to form the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object. It is suitable for modeling in the programme for solid modeling SolidWorks. After analysing the literature data about the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid,...). By connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational programme for finite element analysis. Transferring the data was done using the standard file format for transferring 3D models, ACIS SAT. Using the programme for solid modeling SolidWorks, we developed three models of the abutment of the second maxillary premolar: the model of the intact abutment, the model of the endodontically treated tooth with two remaining cavity walls, and the model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to the literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information in understanding biomechanical problems and gives guidance for clinical research.
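    The constructive approach described above, building a complex body by uniting simple primitives and subtracting others, can be illustrated on a toy voxel grid. This sketch only demonstrates the boolean-composition idea; it is unrelated to the actual SolidWorks geometry, and all shapes and numbers are invented.

```python
# Toy constructive solid geometry on a coarse voxel grid: a complex
# body is formed by uniting simple primitives and subtracting others.
# Purely illustrative; real solid modellers work on exact geometry.

def sphere(cx, cy, cz, r):
    """Voxels inside a sphere of radius r centred at (cx, cy, cz)."""
    rng = range(-6, 7)
    return {(x, y, z) for x in rng for y in rng for z in rng
            if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r}

body = sphere(0, 0, 0, 4) | sphere(0, 0, 3, 2)  # union of primitives
body -= sphere(0, 0, -3, 2)                     # subtract a cavity

print((0, 0, 0) in body, (0, 0, -3) in body)  # True False
```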

  17. Finite Element Analysis Of Influence Of Flank Wear Evolution On Forces In Orthogonal Cutting Of 42CrMo4 Steel

    NASA Astrophysics Data System (ADS)

    Madajewski, Marek; Nowakowski, Zbigniew

    2017-01-01

    This paper presents an analysis of the influence of flank wear on forces in orthogonal turning of 42CrMo4 steel and evaluates the capacity of a finite element model to provide such force values. Data about the magnitudes of the feed and cutting forces were obtained from measurements with a force tensiometer in experimental tests as well as from finite element analysis of the chip formation process in ABAQUS/Explicit software. For the studies an insert with a complex rake face was selected, and flank wear was simulated by a grinding operation on its flank face. The aim of grinding the insert surface was to obtain even flat wear along the cutting edge, which after measurement could be modeled with a CAD program and applied in FE analysis for the selected range of wear width. By comparing both sets of force values as a function of flank wear in the given cutting conditions, the FEA model was validated and it was established that it can be applied to analyze other physical aspects of machining. The force analysis found that progression of wear causes an increase in cutting force magnitude and a steep increase in feed force magnitude. Analysis of the Fc/Ff force ratio revealed that flank wear has a significant impact on the resultant force in orthogonal cutting and on the magnitudes of this force's components in the cutting and feed directions. The surge in force values can result in the transfer of substantial loads to the machine-tool interface.

  18. Wireless acceleration sensor of moving elements for condition monitoring of mechanisms

    NASA Astrophysics Data System (ADS)

    Sinitsin, Vladimir V.; Shestakov, Aleksandr L.

    2017-09-01

    Comprehensive analysis of the angular and linear accelerations of moving elements (shafts, gears) allows an increase in the quality of the condition monitoring of mechanisms. However, existing tools and methods measure either linear or angular acceleration with postprocessing. This paper suggests a new construction design of an angular acceleration sensor for moving elements. The sensor is mounted on a moving element and, among other things, the data transfer and electric power supply are carried out wirelessly. In addition, the authors introduce a method for processing the received information which makes it possible to divide the measured acceleration into the angular and linear components. The design has been validated by the results of laboratory tests of an experimental model of the sensor. The study has shown that this method provides a definite separation of the measured acceleration into linear and angular components, even in noise. This research contributes an advance in the range of methods and tools for condition monitoring of mechanisms.

  19. Sharing tools and best practice in Global Sensitivity Analysis within academia and with industry

    NASA Astrophysics Data System (ADS)

    Wagener, T.; Pianosi, F.; Noacco, V.; Sarrazin, F.

    2017-12-01

    We have spent years trying to improve the use of global sensitivity analysis (GSA) in earth and environmental modelling. Our efforts included (1) the development of tools that provide easy access to widely used GSA methods, (2) the definition of workflows so that best practice is shared in an accessible way, and (3) the development of algorithms to close gaps in available GSA methods (such as moment independent strategies) and to make GSA applications more robust (such as convergence criteria). These elements have been combined in our GSA Toolbox, called SAFE (www.safetoolbox.info), which has up to now been adopted by over 1000 (largely) academic users worldwide. However, despite growing uptake in academic circles and across a wide range of application areas, transfer to industry applications has been difficult. Initial market research regarding opportunities and barriers for uptake revealed a large potential market, but also highlighted a significant lack of knowledge regarding state-of-the-art methods and their potential value for end-users. We will present examples and discuss our experience so far in trying to overcome these problems and move beyond academia in distributing GSA tools and expertise.

  20. Comparison of Gap Elements and Contact Algorithm for 3D Contact Analysis of Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Tiku, K.; Kumar, A.; Handschuh, R.

    1994-01-01

    Three dimensional stress analysis of spiral bevel gears in mesh using the finite element method is presented. A finite element model is generated by solving equations that identify tooth surface coordinates. Contact is simulated by the automatic generation of nonpenetration constraints. This method is compared to a finite element contact analysis conducted with gap elements.

  1. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  2. A comparative analysis of Patient-Reported Expanded Disability Status Scale tools.

    PubMed

    Collins, Christian DE; Ivry, Ben; Bowen, James D; Cheng, Eric M; Dobson, Ruth; Goodin, Douglas S; Lechner-Scott, Jeannette; Kappos, Ludwig; Galea, Ian

    2016-09-01

    Patient-Reported Expanded Disability Status Scale (PREDSS) tools are an attractive alternative to the Expanded Disability Status Scale (EDSS) during long term or geographically challenging studies, or in pressured clinical service environments. Because the studies reporting these tools have used different metrics to compare the PREDSS and EDSS, we undertook an individual patient data level analysis of all available tools. Spearman's rho and the Bland-Altman method were used to assess correlation and agreement respectively. A systematic search for validated PREDSS tools covering the full EDSS range identified eight such tools. Individual patient data were available for five PREDSS tools. Excellent correlation was observed between EDSS and PREDSS with all tools. A higher level of agreement was observed with increasing levels of disability. In all tools, the 95% limits of agreement were greater than the minimum EDSS difference considered to be clinically significant. However, the intra-class coefficient was greater than that reported for EDSS raters of mixed seniority. The visual functional system was identified as the most significant predictor of the PREDSS-EDSS difference. This analysis will (1) enable researchers and service providers to make an informed choice of PREDSS tool, depending on their individual requirements, and (2) facilitate improvement of current PREDSS tools. © The Author(s), 2015.
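    The two agreement statistics named in the abstract, Spearman's rho for correlation and the Bland-Altman method for agreement, are standard; a minimal stdlib-only sketch of the Bland-Altman bias and 95% limits of agreement follows. The paired scores are invented for illustration and are not study data.

```python
import statistics

def bland_altman_limits(a, b):
    """Bias (mean difference) and 95% limits of agreement
    (bias ± 1.96 * SD of the differences) between paired ratings."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired PREDSS/EDSS scores (illustrative only).
predss = [2.5, 3.0, 4.5, 6.0, 7.0]
edss = [2.0, 3.5, 4.0, 6.0, 6.5]
bias, lo, hi = bland_altman_limits(predss, edss)
print(round(bias, 2))  # 0.2
```

Whether the resulting limits of agreement exceed the minimum clinically significant EDSS difference is exactly the comparison the analysis above reports.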

  3. Slave finite elements: The temporal element approach to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Gellin, S.

    1984-01-01

    A formulation method for finite elements in space and time incorporating nonlinear geometric and material behavior is presented. The method uses interpolation polynomials for approximating the behavior of various quantities over the element domain, and only explicit integration over space and time. While applications are general, the plate and shell elements that are currently being programmed are appropriate to model turbine blades, vanes, and combustor liners.

  4. A Finite Element Analysis of a Class of Problems in Elasto-Plasticity with Hidden Variables.

    DTIC Science & Technology

    1985-09-01

    [OCR-damaged record; recoverable details:] A Finite Element Analysis of a Class of Problems in Elasto-Plasticity with Hidden Variables. Final report, Texas Institute for Computational Mechanics, Austin; J. T. Oden. Keywords: elastoplasticity; finite deformations; non-convex analysis; finite element methods; metal forming.

  5. An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems.

    DTIC Science & Technology

    1981-03-01

    [OCR-damaged record; recoverable details:] Technical Note BN-962: An Error Analysis for the Finite Element Method Applied to Convection Diffusion Problems, by I. Babuška and W. G. Szymczak, March 1981. University of Maryland, College Park, Institute for Physical Science.

  6. Trace-Element Analysis by Use of PIXE Technique on Agricultural Products

    NASA Astrophysics Data System (ADS)

    Takagi, A.; Yokoyama, R.; Makisaka, K.; Kisamori, K.; Kuwada, Y.; Nishimura, D.; Matsumiya, R.; Fujita, Y.; Mihara, M.; Matsuta, K.; Fukuda, M.

    2009-10-01

    In order to examine whether trace-element analysis by PIXE (Particle Induced X-ray Emission) gives a clue to identifying the production area of agricultural products, we carried out a study on soy beans as an example. In the present study, a proton beam at an energy of 2.3 MeV was provided by the Van de Graaff accelerator at Osaka University. We used a Ge detector with a Be window to measure X-ray spectra. We prepared sample soy beans from China, Thailand, Taiwan, and 7 different areas in Japan. As a result of the PIXE analysis, 5 elements, potassium, iron, zinc, arsenic and rubidium, have been identified. There are clear differences in the relative amounts of trace elements between samples from different international regions. Chinese beans contain much more Rb than the others, while there are significant differences in Fe and Zn between beans of Thailand and Taiwan. There are relatively smaller differences among Japanese beans. This result shows that trace elements bring us some practical information about the region where the product was grown.

  7. Investigation of 1-Dimensional ultrasonic vibration compliance mechanism based on finite element analysis

    NASA Astrophysics Data System (ADS)

    Latif, A. Afiff; Ibrahim, M. Rasidi; Rahim, E. A.; Cheng, K.

    2017-04-01

Conventional milling has many difficulties in the processing of hard and brittle materials. Hence, ultrasonic vibration assisted milling (UVAM) was proposed to overcome this problem. The objective of this research is to study the behavior of the compliance mechanism (CM), the critical part affecting the performance of the UVAM. The design of the CM was investigated with a focus on the 1-dimensional case. Experimental results were obtained from a portable laser digital vibrometer, while 1-dimensional quantities such as the safety factor, deformation of the hinges, and stresses were obtained from finite element simulation. Finally, the findings help to identify the best design, judged by the distance travelled by the piezoelectric actuators. In addition, this paper provides a clear picture of the behavior of the CM embedded in the UVAM, which can provide good data and improve the machining by reducing tool wear, cutting force, and workpiece surface roughness.

  8. Distributed Finite Element Analysis Using a Transputer Network

    NASA Technical Reports Server (NTRS)

    Watson, James; Favenesi, James; Danial, Albert; Tombrello, Joseph; Yang, Dabby; Reynolds, Brian; Turrentine, Ronald; Shephard, Mark; Baehmann, Peggy

    1989-01-01

The principal objective of this research effort was to demonstrate the extraordinarily cost-effective acceleration of finite element structural analysis problems using a transputer-based parallel processing network. This objective was accomplished in the form of a commercially viable parallel processing workstation. The workstation is a desktop size, low-maintenance computing unit capable of supercomputer performance yet costs two orders of magnitude less. To achieve the principal research objective, a transputer-based structural analysis workstation termed XPFEM was implemented with linear static structural analysis capabilities resembling commercially available NASTRAN. Finite element model files, generated using the on-line preprocessing module or external preprocessing packages, are downloaded to a network of 32 transputers for accelerated solution. The system currently executes at about one third Cray X-MP24 speed, but additional acceleration appears likely. For the NASA-selected demonstration problem of a Space Shuttle main engine turbine blade model with about 1500 nodes and 4500 independent degrees of freedom, the Cray X-MP24 required 23.9 seconds to obtain a solution while the transputer network, operated from an IBM PC-AT compatible host computer, required 71.7 seconds. Consequently, the $80,000 transputer network demonstrated a cost-performance ratio about 60 times better than the $15,000,000 Cray X-MP24 system.
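The "about 60 times" figure follows directly from the quoted times and prices; a quick sanity check, interpreting cost-performance as solution speed per dollar:

```python
# Check the cost-performance ratio quoted for the XPFEM demonstration problem.
# Cost-performance is taken here as (1 / solution time) per dollar, so the
# ratio of the two systems reduces to (time * cost) of one over the other.
cray_time_s = 23.9          # Cray X-MP24 solution time (s)
cray_cost_usd = 15_000_000  # quoted Cray system cost
xp_time_s = 71.7            # transputer network solution time (s)
xp_cost_usd = 80_000        # quoted transputer network cost

ratio = (cray_time_s * cray_cost_usd) / (xp_time_s * xp_cost_usd)
print(f"cost-performance advantage: about {ratio:.0f}x")  # ~62, i.e. "about 60 times"
```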

  9. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools give good results on different datasets and configurations. All this variation makes it complicated for researchers to decide which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available through the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at https://gitlab.com/rki_bioinformatics.
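The co-occurrence idea behind such integration can be illustrated with a minimal sketch; this shows only the general principle, not MetaMeta's actual algorithm, and the tool names and profiles below are invented:

```python
from collections import defaultdict

def integrate_profiles(profiles, min_support=2):
    """Merge per-tool taxonomic profiles: keep only taxa reported by at
    least `min_support` tools, average their relative abundances, and
    renormalize so the integrated profile sums to 1."""
    abundances = defaultdict(list)
    for tool, profile in profiles.items():
        for taxon, abundance in profile.items():
            abundances[taxon].append(abundance)
    merged = {
        taxon: sum(vals) / len(vals)
        for taxon, vals in abundances.items()
        if len(vals) >= min_support
    }
    total = sum(merged.values())
    return {taxon: a / total for taxon, a in merged.items()}

# Hypothetical output of three profiling tools on the same sample
profiles = {
    "tool_a": {"E. coli": 0.6, "B. subtilis": 0.3, "spurious_sp": 0.1},
    "tool_b": {"E. coli": 0.5, "B. subtilis": 0.5},
    "tool_c": {"E. coli": 0.7, "B. subtilis": 0.2, "other_sp": 0.1},
}
merged = integrate_profiles(profiles)
print(merged)  # taxa reported by only one tool are dropped
```

Here taxa seen by a single tool are treated as unsupported, which is the sense in which "the presence of each organism is supported by several methods."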

  10. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

Vibration of the skull causes a hearing sensation that we call bone conduction (BC) sound. There have been several investigations of the transmission properties of bone conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials, based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions, and to solve the FE equations. The tool uses the PAK solver, open-source software implemented in the SIFEM FP7 project, to calculate the head vibration. The purpose of this tool is to show the impact of bone conduction sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  11. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white-noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
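Of the evaluations listed, the closed-loop eigenvalue check is the simplest to illustrate outside the package. A minimal numpy sketch (the plant matrices and feedback gain below are made-up placeholders, not MATRIXx code or data):

```python
import numpy as np

# Hypothetical plant: a double integrator, x' = A x + B u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
K = np.array([[2.0, 3.0]])  # made-up state-feedback gain, u = -K x

# Closed-loop dynamics x' = (A - B K) x; stability requires every
# eigenvalue to lie in the open left half-plane.
eigvals = np.linalg.eigvals(A - B @ K)
print(eigvals)
print(all(ev.real < 0 for ev in eigvals))  # True -> closed loop is stable
```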

  12. rSNPBase 3.0: an updated database of SNP-related regulatory elements, element-gene pairs and SNP-based gene regulatory networks

    PubMed Central

    2018-01-01

Abstract Here, we present the updated rSNPBase 3.0 database (http://rsnp3.psych.ac.cn), which provides human SNP-related regulatory elements, element-gene pairs and SNP-based regulatory networks. This database is the updated version of the SNP regulatory annotation databases rSNPBase and rVarBase. In comparison to the last two versions, there are both structural and data adjustments in rSNPBase 3.0: (i) the most significant new feature is the expansion of the analysis scope from SNP-related regulatory elements to include regulatory element–target gene pairs (E–G pairs), so that SNP-based gene regulatory networks can be provided; (ii) web functions were modified according to the data content, and a new network search module is provided in rSNPBase 3.0 in addition to the previous regulatory SNP (rSNP) search module. The two search modules support queries for detailed information (related elements, element-gene pairs, and other extended annotations) on specific SNPs and for SNP-related graphic networks constructed from interacting transcription factors (TFs), miRNAs and genes. (iii) The types of regulatory elements were modified and enriched. To the best of our knowledge, the updated rSNPBase 3.0 is the first data tool that supports SNP functional analysis from a regulatory network perspective; it will provide both a comprehensive understanding of and concrete guidance for SNP-related regulatory studies. PMID:29140525

  13. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  14. Finite-Element Analysis of a Mach-8 Flight Test Article Using Nonlinear Contact Elements

    NASA Technical Reports Server (NTRS)

    Richards, W. Lance

    1997-01-01

A flight test article, called a glove, is required for a Mach-8 boundary-layer experiment to be conducted on a flight mission of the air-launched Pegasus® space booster. The glove is required to provide a smooth, three-dimensional, structurally stable, aerodynamic surface and includes instrumentation to determine when and where boundary-layer transition occurs during the hypersonic flight trajectory. A restraint mechanism has been invented to attach the glove to the wing of the space booster. The restraint mechanism securely attaches the glove to the wing in directions normal to the wing/glove interface surface, but allows the glove to thermally expand and contract to alleviate stresses in directions parallel to the interface surface. A finite-element analysis has been performed using nonlinear contact elements to model the complex behavior of the sliding restraint mechanism. This paper provides an overview of the glove design and presents details of the analysis that were essential to demonstrate the flight worthiness of the wing-glove test article. Results show that all glove components are well within the allowable stress and deformation requirements to satisfy the objectives of the flight research experiment.

  15. TEtools facilitates big data expression analysis of transposable elements and reveals an antagonism between their activity and that of piRNA genes

    PubMed Central

    Lerat, Emmanuelle; Fablet, Marie; Modolo, Laurent; Lopez-Maestre, Hélène

    2017-01-01

Abstract Over recent decades, substantial efforts have been made to understand the interactions between host genomes and transposable elements (TEs). The impact of TEs on the regulation of host genes is well known, with TEs acting as platforms of regulatory sequences. Nevertheless, due to their repetitive nature, it is considerably difficult to integrate TE analysis into genome-wide studies. Here, we developed a specific tool for the analysis of TE expression: TEtools. This tool takes into account the TE sequence diversity of the genome, can be applied to unannotated or unassembled genomes, and is freely available under the GPL3 (https://github.com/l-modolo/TEtools). TEtools maps RNA-seq data obtained from classical mRNAs or small RNAs onto a list of TE sequences and performs differential expression analyses with statistical relevance. Using this tool, we analyzed TE expression in five Drosophila wild-type strains. Our data show for the first time that the activity of TEs is strictly linked to the activity of the genes implicated in Piwi-interacting RNA (piRNA) biogenesis and therefore fits an arms-race scenario between TE sequences and host control genes. PMID:28204592

  16. 3D analysis of semiconductor devices: A combination of 3D imaging and 3D elemental analysis

    NASA Astrophysics Data System (ADS)

    Fu, Bianzhu; Gribelyuk, Michael A.

    2018-04-01

3D analysis of semiconductor devices using a combination of scanning transmission electron microscopy (STEM) Z-contrast tomography and energy dispersive spectroscopy (EDS) elemental tomography is presented. 3D STEM Z-contrast tomography is useful in revealing the depth information of the sample; however, it suffers from contrast problems between materials with similar atomic numbers. Examples of EDS elemental tomography are presented using an automated EDS tomography system with batch data processing, which greatly reduces the data collection and processing time. 3D EDS elemental tomography reveals more in-depth information about the defect origin in semiconductor failure analysis. The influence of detector shadowing and X-ray absorption on the EDS tomography results is also discussed.

  17. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on the least square algorithm procedure coupled with the finite difference method adhering to the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and/or ABAQUS.

  18. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.

  19. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

Corynebacteria are used for a wide variety of industrial purposes, but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy, and may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research on corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes, which aims to provide: (1) annotated genome sequences of Corynebacterium, where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), a Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  20. Trace element analysis of soil type collected from the Manjung and central Perak

    NASA Astrophysics Data System (ADS)

    Azman, Muhammad Azfar; Hamzah, Suhaimi; Rahman, Shamsiah Abdul; Elias, Md Suhaimi; Abdullah, Nazaratul Ashifa; Hashim, Azian; Shukor, Shakirah Abd; Kamaruddin, Ahmad Hasnulhadi Che

    2015-04-01

Trace elements in soils originate primarily from their parent materials. Parent material is the underlying geological material that has undergone different types of chemical weathering and leaching processes. Soil trace-element concentrations may increase as a result of continuous input from various human activities, including power generation, agriculture, mining and manufacturing. This paper describes the Neutron Activation Analysis (NAA) method used for the determination of trace-element concentrations, in parts per million (ppm), in terrestrial soil in Perak. The data may indicate any contamination by trace elements contributed by human activities in the area. Enrichment factors were used to check whether there is any contamination due to human activities (power plants, agriculture, mining, etc.); otherwise the values serve as baseline data for future studies. The samples were collected from 27 locations of different soil series in the area at two different depths: the top soil (0-15 cm) and the sub soil (15-30 cm). The collected soil samples were air dried at 60°C and passed through a 2 µm sieve. Instrumental Neutron Activation Analysis (NAA) was used for the determination of trace elements. Samples were activated in the Nuclear Malaysia TRIGA Mark II reactor, followed by gamma spectrometric analysis. By activating the stable elements in the samples, the elements can be determined from the intensities of the gamma energies emitted by the respective radionuclides.
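The enrichment factor mentioned above is conventionally computed by double-normalizing to a conservative reference element (often Fe or Al); a small sketch with made-up concentrations, not data from this study:

```python
def enrichment_factor(c_sample, ref_sample, c_background, ref_background):
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_background.
    EF near 1 suggests a natural (crustal) origin; values well above 1
    suggest anthropogenic input."""
    return (c_sample / ref_sample) / (c_background / ref_background)

# Made-up concentrations in ppm, using Fe as the reference element
zn_topsoil, fe_topsoil = 120.0, 30_000.0   # hypothetical sample
zn_crust, fe_crust = 70.0, 35_000.0        # hypothetical background values

ef_zn = enrichment_factor(zn_topsoil, fe_topsoil, zn_crust, fe_crust)
print(f"EF(Zn) = {ef_zn:.1f}")  # EF(Zn) = 2.0 for these illustrative numbers
```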

  1. Trace element analysis of soil type collected from the Manjung and central Perak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azman, Muhammad Azfar, E-mail: m-azfar@nuclearmalaysia.gov.my; Hamzah, Suhaimi; Rahman, Shamsiah Abdul

    2015-04-29

Trace elements in soils originate primarily from their parent materials. Parent material is the underlying geological material that has undergone different types of chemical weathering and leaching processes. Soil trace-element concentrations may increase as a result of continuous input from various human activities, including power generation, agriculture, mining and manufacturing. This paper describes the Neutron Activation Analysis (NAA) method used for the determination of trace-element concentrations, in parts per million (ppm), in terrestrial soil in Perak. The data may indicate any contamination by trace elements contributed by human activities in the area. Enrichment factors were used to check whether there is any contamination due to human activities (power plants, agriculture, mining, etc.); otherwise the values serve as baseline data for future studies. The samples were collected from 27 locations of different soil series in the area at two different depths: the top soil (0-15 cm) and the sub soil (15-30 cm). The collected soil samples were air dried at 60°C and passed through a 2 µm sieve. Instrumental Neutron Activation Analysis (NAA) was used for the determination of trace elements. Samples were activated in the Nuclear Malaysia TRIGA Mark II reactor, followed by gamma spectrometric analysis. By activating the stable elements in the samples, the elements can be determined from the intensities of the gamma energies emitted by the respective radionuclides.

  2. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  3. Reliability analysis of dispersion nuclear fuel elements

    NASA Astrophysics Data System (ADS)

    Ding, Shurong; Jiang, Xin; Huo, Yongzhong; Li, Lin an

    2008-03-01

Treating a dispersion fuel element as a special particle composite, a representative volume element is chosen as the object of study. Fuel swelling is simulated through a temperature increase. A large-strain elastoplastic analysis of the mechanical behavior is carried out using FEM. The results indicate that fission swelling is simulated successfully and that the thickness increments grow linearly with burnup. With increasing burnup: (1) the first principal stresses in the fuel particles change from tensile to compressive; (2) the maximum Mises stresses in the particles move from the particle centers to locations close to the interfaces between the matrix and the particles, and their values increase with burnup. The maximum Mises stresses in the matrix occur midway between two particles near the mid-plane along the length (or width) direction, where the maximum plastic strains are also found.

  4. rSNPBase 3.0: an updated database of SNP-related regulatory elements, element-gene pairs and SNP-based gene regulatory networks.

    PubMed

    Guo, Liyuan; Wang, Jing

    2018-01-04

Here, we present the updated rSNPBase 3.0 database (http://rsnp3.psych.ac.cn), which provides human SNP-related regulatory elements, element-gene pairs and SNP-based regulatory networks. This database is the updated version of the SNP regulatory annotation databases rSNPBase and rVarBase. In comparison to the last two versions, there are both structural and data adjustments in rSNPBase 3.0: (i) the most significant new feature is the expansion of the analysis scope from SNP-related regulatory elements to include regulatory element-target gene pairs (E-G pairs), so that SNP-based gene regulatory networks can be provided; (ii) web functions were modified according to the data content, and a new network search module is provided in rSNPBase 3.0 in addition to the previous regulatory SNP (rSNP) search module. The two search modules support queries for detailed information (related elements, element-gene pairs, and other extended annotations) on specific SNPs and for SNP-related graphic networks constructed from interacting transcription factors (TFs), miRNAs and genes. (iii) The types of regulatory elements were modified and enriched. To the best of our knowledge, the updated rSNPBase 3.0 is the first data tool that supports SNP functional analysis from a regulatory network perspective; it will provide both a comprehensive understanding of and concrete guidance for SNP-related regulatory studies. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules.

    PubMed

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I; de Boer, Pascal; Hagen, Kees C W; Hoogenboom, Jacob P; Giepmans, Ben N G

    2017-04-07

Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty of interpreting grey-scale images and by the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers, and thus brings nanometer-scale 'color-EM' as a promising tool to unravel molecular (de)regulation in biomedicine.

  6. Multi-color electron microscopy by element-guided identification of cells, organelles and molecules

    PubMed Central

    Scotuzzi, Marijke; Kuipers, Jeroen; Wensveen, Dasha I.; de Boer, Pascal; Hagen, Kees (C.) W.; Hoogenboom, Jacob P.; Giepmans, Ben N. G.

    2017-01-01

Cellular complexity is unraveled at nanometer resolution using electron microscopy (EM), but interpretation of macromolecular functionality is hampered by the difficulty of interpreting grey-scale images and by the unidentified molecular content. We perform large-scale EM on mammalian tissue complemented with energy-dispersive X-ray analysis (EDX) to allow EM-data analysis based on elemental composition. Endogenous elements, labels (gold and cadmium-based nanoparticles) as well as stains are analyzed at ultrastructural resolution. This provides a wide palette of colors to paint the traditional grey-scale EM images for composition-based interpretation. Our proof-of-principle application of EM-EDX reveals that endocrine and exocrine vesicles exist in single cells in Islets of Langerhans. This highlights how elemental mapping reveals unbiased, biomedically relevant information. Broad application of EM-EDX will further allow experimental analysis on large-scale tissue using endogenous elements, multiple stains, and multiple markers, and thus brings nanometer-scale ‘color-EM’ as a promising tool to unravel molecular (de)regulation in biomedicine. PMID:28387351

  7. Parameter identification and optimization of slide guide joint of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Sun, B. B.

    2017-11-01

The joint surface has an important influence on the performance of CNC machine tools. In order to identify the dynamic parameters of the slide guide joint, a parametric finite element model of the joint is established and an optimum design method is used based on finite element simulation and modal testing. The mode that has the most influence on the dynamics of the slip joint is then found through harmonic response analysis. Taking the frequency of this mode as the objective, a sensitivity analysis of the stiffness of each joint surface is carried out using Latin hypercube sampling and Monte Carlo simulation. The result shows that the vertical stiffness of the slip joint surface formed by the bed and the slide plate has the most pronounced influence on the structure. Therefore, this stiffness is taken as the optimization variable, and the optimal value is obtained by studying the relationship between structural dynamic performance and stiffness. Substituting the stiffness values from before and after optimization into the FEM of the machine tool shows that the dynamic performance of the machine tool is improved.
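The sampling step of such a sensitivity study can be sketched as follows: a generic Latin hypercube over joint-surface stiffness ranges, with a made-up linear surrogate standing in for the modal-frequency FE model (the ranges, coefficients, and surrogate are illustrative, not the authors' data):

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """One stratified sample per interval in each dimension, randomly paired
    across dimensions (the defining property of a Latin hypercube design)."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            u = (strata[i] + rng.random()) / n_samples  # point within its stratum
            samples[i][d] = lo + u * (hi - lo)
    return samples

# Made-up stiffness ranges for three joint surfaces
bounds = [(1.0, 10.0), (1.0, 10.0), (1.0, 10.0)]
designs = latin_hypercube(200, bounds)

# Made-up surrogate: modal frequency dominated by the first stiffness
freq = [50.0 + 8.0 * k1 + 1.0 * k2 + 0.5 * k3 for k1, k2, k3 in designs]

def corr(xs, ys):
    """Pearson correlation, used here as a crude sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

for d in range(3):
    print(f"stiffness {d + 1}: corr = {corr([s[d] for s in designs], freq):.2f}")
```

In a real study each sample would be evaluated by the FE model rather than a closed-form surrogate; the stiffness with the highest correlation to the target frequency is the natural optimization variable.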

  8. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  9. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  10. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  11. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Astrophysics Data System (ADS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; McCleary, S. L.

    1991-05-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  12. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Technical Reports Server (NTRS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; Mccleary, S. L.

    1991-01-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  13. Power flows and Mechanical Intensities in structural finite element analysis

    NASA Technical Reports Server (NTRS)

    Hambric, Stephen A.

    1989-01-01

The identification of power flow paths in dynamically loaded structures is an important, but currently unavailable, capability for the finite element analyst. For this reason, methods for calculating power flows and mechanical intensities in finite element models are developed here. Formulations for calculating input and output powers, power flows, mechanical intensities, and power dissipations for beam, plate, and solid element types are derived. NASTRAN is used to calculate the required velocity, force, and stress results of an analysis, which a post-processor then uses to calculate power flow quantities. The SDRC I-deas Supertab module is used to view the final results. Test models include a simple truss and a beam-stiffened cantilever plate. Both test cases showed reasonable power flow fields over low to medium frequencies, with accurate power balances. Future work will include testing with more complex models, developing an interactive graphics program to view the analysis results easily and efficiently, applying shape optimization methods to the problem with power flow variables as design constraints, and adding the power flow capability to NASTRAN.
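The input-power quantity discussed above follows from the standard time-averaged relation P = ½ Re(F v*), where F and v are complex force and velocity phasors at the driven degree of freedom. A minimal sketch with illustrative values (this is the textbook relation, not the NASTRAN post-processor itself):

```python
import numpy as np

# Time-averaged input power at a driven node, from complex force and
# velocity phasors such as those produced by a frequency-response run.
def input_power(F, v):
    """F [N] and v [m/s] are complex phasors; returns watts."""
    return 0.5 * np.real(F * np.conj(v))

# Example: 10 N force, velocity of 0.02 m/s lagging the force by 60 deg.
F = 10.0 + 0.0j
v = 0.02 * np.exp(-1j * np.pi / 3)
P = input_power(F, v)
print(f"input power = {P:.4f} W")   # 0.5 * 10 * 0.02 * cos(60 deg) = 0.05 W
```

Summing this quantity over element faces with the corresponding stress resultants in place of F gives the mechanical-intensity fields the abstract describes.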

  14. [Proposal of new trace elements classification to be used in nutrition, oligotherapy and other therapeutics strategies].

    PubMed

    Ramírez Hernández, Javier; Bonete Pérez, María José; Martínez Espinosa, Rosa María

    2014-12-17

1) to propose a new classification of the trace elements based on a review of recently reported research; 2) to offer detailed and updated information about trace elements. Analysis of recently reported research results reveals that advances in molecular analysis techniques point out the importance of certain trace elements in human health. A detailed analysis of the catalytic function of several elements not considered essential, or only probably essential, until now is also offered. Informatics tools were used to perform an integral analysis of the enzymes containing trace elements. Updated information on the physiological role, kinetics, metabolism, dietary sources and factors promoting trace element deficiency or toxicity is also presented. Oligotherapy uses catalytically active trace elements for therapeutic purposes. The new trace element classification presented here will be of high interest for different professional sectors: doctors and other professions related to medicine, nutritionists, pharmacists, etc. Using this new classification and these approaches, new therapeutic strategies could be designed to mitigate the symptomatology of several pathologies, particularly deficiency and metabolic diseases. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  15. Determination of Specific Forces and Tool Deflections in Micro-milling of Ti-6Al-4V alloy using Finite Element Simulations and Analysis

    NASA Astrophysics Data System (ADS)

    Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul

    2011-05-01

Titanium alloys offer superb strength, corrosion resistance and biocompatibility and are commonly utilized in medical devices and implants. The micro-end milling process is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces and the resultant tool deflections, in order to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine grain solid carbide two-flute micro-end mill are investigated using DEFORM software. First, specific forces in the tangential and radial cutting directions during micro-end milling have been determined for varying feed rates and rotational speeds using designed FE simulations of the chip formation process. These forces are then applied to the micro-end mill geometry along the axial depth of cut in a 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflection and von Mises stress are determined. These analyses will aid in establishing integrated multi-physics process models for high-performance micro-end milling and represent a leap forward in process improvements.

  16. STARS: A general-purpose finite element computer program for analysis of engineering structures

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1984-01-01

    STARS (Structural Analysis Routines) is primarily an interactive, graphics-oriented, finite-element computer program for analyzing the static, stability, free vibration, and dynamic responses of damped and undamped structures, including rotating systems. The element library consists of one-dimensional (1-D) line elements, two-dimensional (2-D) triangular and quadrilateral shell elements, and three-dimensional (3-D) tetrahedral and hexahedral solid elements. These elements enable the solution of structural problems that include truss, beam, space frame, plane, plate, shell, and solid structures, or any combination thereof. Zero, finite, and interdependent deflection boundary conditions can be implemented by the program. The associated dynamic response analysis capability provides for initial deformation and velocity inputs, whereas the transient excitation may be either forces or accelerations. An effective in-core or out-of-core solution strategy is automatically employed by the program, depending on the size of the problem. Data input may be at random within a data set, and the program offers certain automatic data-generation features. Input data are formatted as an optimal combination of free and fixed formats. Interactive graphics capabilities enable convenient display of nodal deformations, mode shapes, and element stresses.

  17. Analysis and control on changeable wheel tool system of hybrid grinding and polishing machine tool for blade finishing

    NASA Astrophysics Data System (ADS)

    He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji

    2017-01-01

Blades are key components in the energy and power equipment of turbines, aircraft engines and so on. Research on the process and equipment for blade finishing has become an important and difficult topic. To precisely control the tool system of a hybrid grinding and polishing machine tool developed for blade finishing, the changeable-wheel tool system for belt polishing is analyzed in this paper. Firstly, the belt length and the wrap angle of each wheel are analyzed for different tension-wheel swing angles during the wheel-changing process. The reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion of the contact wheel are obtained. Then, a control system for the changeable-wheel tool structure is developed. Lastly, the surface roughness achieved in blade finishing is verified by experiments. Theoretical analysis and experimental results show that a reasonable belt length and wheel wrap angles can be obtained by the proposed analysis method, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of the blade after grinding meets the design requirements.
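The belt-length and wrap-angle relations mentioned above (which the authors compute in MATLAB) follow from standard two-pulley open-belt geometry. As a hedged sketch with hypothetical radii and center distance, not the paper's multi-wheel model:

```python
import math

# Exact open-belt length and wrap angles for two pulleys of radii
# r1 <= r2 at center distance C (standard belt-drive geometry).
def open_belt(r1, r2, C):
    alpha = math.asin((r2 - r1) / C)          # inclination of the tangent runs
    span = math.sqrt(C**2 - (r2 - r1)**2)     # length of each straight run
    length = 2 * span + math.pi * (r1 + r2) + 2 * alpha * (r2 - r1)
    wrap_small = math.pi - 2 * alpha          # wrap angle on the small pulley
    wrap_big = math.pi + 2 * alpha            # wrap angle on the big pulley
    return length, wrap_small, wrap_big

# Hypothetical wheels: 50 mm and 80 mm radii, 300 mm apart.
L, ws, wb = open_belt(r1=0.05, r2=0.08, C=0.30)
print(f"belt length {L:.4f} m, "
      f"wrap angles {math.degrees(ws):.1f} / {math.degrees(wb):.1f} deg")
```

Extending this pairwise relation around every wheel, as a function of the tension-wheel swing angle, is the kind of calculation the abstract describes.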

  18. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source, nor a unified sense of style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive for new researchers entering the field, and remain an obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a public-facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  19. Measuring Surface Bulk Elemental Composition on Venus

    NASA Astrophysics Data System (ADS)

    Schweitzer, Jeffrey S.; Parsons, Ann M.; Grau, Jim; Lawrence, David J.; McClanahan, Timothy P.; Miles, Jeffrey; Peplowski, Patrick; Perkins, Luke; Starr, Richard

Bulk elemental composition measurements of the subsurface of Venus are challenging because of the extreme surface environment (462 ˚C, 93 bars pressure). Instruments provided by landed probes on the surface of Venus must therefore be enclosed in a pressure vessel. The high surface temperatures require a thermal control system that keeps the instrumentation and electronics within their operating temperature range for as long as possible. Currently, Venus surface probes can operate for only a few hours. It is therefore crucial that the lander instrumentation be able to make statistically significant measurements in a short time. An instrument is described that can achieve such a measurement over a volume of thousands of cubic centimeters of material by using high-energy penetrating neutron and gamma radiation. The instrument consists of a Pulsed Neutron Generator (PNG) and a Gamma-Ray Spectrometer (GRS). The PNG emits isotropic pulses of 14.1 MeV neutrons that penetrate the pressure vessel walls, the dense atmosphere and the surface rock. The neutrons induce nuclear reactions in the rock to produce gamma rays with energies specific to the element and nuclear process involved. Thus the energies of the detected gamma rays identify the elements present and their intensities provide the abundance of each element. The GRS spectra are analyzed to determine the Venus elemental composition from the spectral signature of individual major, minor, and trace radioactive elements. As a test of such an instrument, a Schlumberger Litho Scanner oil well logging tool was used in a series of experiments at NASA's Goddard Space Flight Center. The Litho Scanner tool was mounted above large (1.8 m x 1.8 m x 0.9 m) granite and basalt monuments and made a series of one-hour elemental composition measurements in a planar geometry more similar to a planetary lander measurement. Initial analysis of the results shows good agreement with target elemental assays.

  20. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as winding flux linkages and voltages; average, cogging and ripple torques; stator core flux densities; core losses; efficiencies; and saturated machine winding inductances are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow
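The optimization layer described above can be sketched with SciPy's stock differential-evolution implementation. This is a minimal stand-in, not the dissertation's code: the finite-element machine model is replaced by a cheap analytic surrogate (a hypothetical two-variable loss vs. torque-ripple trade-off), and the two objectives are scalarized by an assumed weighted sum.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Surrogate objective standing in for the FE-based machine evaluation.
# x[0], x[1] play the role of normalized design variables; loss and
# ripple are hypothetical objective components minimized at (1.0, 2.0).
def objective(x):
    loss = (x[0] - 1.0) ** 2 + 0.10      # e.g. copper/core loss surrogate
    ripple = (x[1] - 2.0) ** 2 + 0.05    # e.g. torque-ripple surrogate
    return loss + 0.5 * ripple           # weighted-sum scalarization (assumed)

bounds = [(0.0, 3.0), (0.0, 3.0)]        # design-variable ranges (assumed)
result = differential_evolution(objective, bounds, seed=1, tol=1e-8)
print(result.x, result.fun)              # optimum near (1.0, 2.0)
```

In the dissertation's setting, each call to `objective` would instead invoke the computationally efficient finite element solver, which is why reducing per-evaluation cost matters so much for population-based methods like differential evolution.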

  1. [Distribution Characteristics and Source Analysis of Dustfall Trace Elements During Winter in Beijing].

    PubMed

    Xiong, Qiu-lin; Zhao, Wen-ji; Guo, Xiao-yu; Chen, Fan-tao; Shu, Tong-tong; Zheng, Xiao-xia; Zhao, Wen-hui

    2015-08-01

The dustfall content is one of the evaluation indexes of atmospheric pollution. Trace elements, especially heavy metals, in dustfall can pose risks to the ecological environment and human health. In order to study the distribution characteristics of trace elements, heavy metal pollution and their sources in winter atmospheric dust, 49 dustfall samples were collected in Beijing City and nearby areas during November 2013 to March 2014. The contents (mass percentages) of 40 trace elements were then measured by ELAN DRC-II inductively coupled plasma mass spectrometry (ICP-MS). Test results showed that more than half of the trace elements in the dust were below 10 mg x kg(-1); about a quarter were between 10-100 mg x kg(-1); while 7 elements (Pb, Zr, Cr, Cu, Zn, Sr and Ba) exceeded 100 mg x kg(-1). The contents of Pb, Cu, Zn, Bi, Cd and Mo in winter dustfall in Beijing City were respectively 4.18, 4.66, 5.35, 6.31, 6.62, and 8.62 times as high as those of the corresponding elements in the surface soil in the same period, exceeding the soil background values by more than 300%. The contribution of human activities to the trace heavy metal content of dustfall in Beijing City was larger than that in the surrounding region. Source analysis of dustfall and its 20 main trace elements (Cd, Mo, Nb, Ga, Co, Y, Nd, Li, La, Ni, Rb, V, Ce, Pb, Zr, Cr, Cu, Zn, Sr, Ba) was then conducted through a multi-method analysis, including Pearson correlation analysis, Kendall correlation coefficient analysis and principal component analysis. The results indicated that the sources of winter dustfall in Beijing City were mainly composed of crustal sources (including road dust, construction dust and remotely transported dust) and the burning of fossil fuels (vehicle emissions, coal combustion, biomass combustion and industrial processes).
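The correlation-plus-PCA source analysis described above can be sketched on synthetic data. Everything below is a hedged illustration: the two hidden "source" factors, their loadings and the noise level are assumptions, not the Beijing measurements, but the workflow (correlation matrix, then principal components of the standardized element-by-sample matrix) is the one the abstract names.

```python
import numpy as np

# Synthetic concentration matrix: 49 samples x 20 elements driven by two
# hidden source factors ("crustal" and "fossil-fuel"), plus small noise.
rng = np.random.default_rng(42)
n_samples, n_elements = 49, 20
crustal = rng.lognormal(0, 0.3, n_samples)
fuel = rng.lognormal(0, 0.3, n_samples)
loadings = rng.uniform(0, 1, (2, n_elements))
X = np.outer(crustal, loadings[0]) + np.outer(fuel, loadings[1])
X += 0.05 * rng.standard_normal(X.shape)

# Pearson correlation between elements (columns).
R = np.corrcoef(X, rowvar=False)

# PCA via SVD of the standardized data.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first 2 PCs:", round(float(explained[:2].sum()), 3))
```

With two underlying sources, the first two principal components capture almost all of the variance; in a real apportionment study the loadings in `Vt` are then inspected to label each component (crustal vs. combustion).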

  2. X-ray STM: Nanoscale elemental analysis & Observation of atomic track.

    PubMed

    Saito, Akira; Furudate, Y; Kusui, Y; Saito, T; Akai-Kasaya, M; Tanaka, Y; Tamasaku, K; Kohmura, Y; Ishikawa, T; Kuwahara, Y; Aono, M

    2014-11-01

Scanning tunneling microscopy (STM) combined with brilliant X-rays from synchrotron radiation (SR) can provide various possibilities for original and important applications, such as elemental analysis on solid surfaces at the atomic scale. The principle of the elemental analysis is based on the inner-shell excitation of an element-specific energy level "under STM observation". A key to obtaining atomic locality is to extract the element-specific modulation of the local tunneling current (not the emission, which can degrade the spatial resolution), which is derived from the inner-shell excitation [1]. For this purpose, we developed a special SR-STM system and smart tip. To surmount the tiny core-excitation efficiency of hard X-rays, we focused the incident beam two-dimensionally to the highest photon density at SPring-8. After successes in elemental analyses by SR-STM [1,2] on a semiconductor hetero-interface (Ge on Si) and a metal-semiconductor interface (Cu on Ge), we succeeded in obtaining elemental contrast between Co nano-islands and an Au substrate. The results on the metallic substrate suggest the generality of the method and give some important implications on the principle of contrast. For all three samples, the spatial resolution of the analysis was estimated to be ∼1 nm or less, and it is worth noting that the measured surface domains had a deposition thickness of less than one atomic layer (Fig. 1, left and center). Fig. 1. (left) Topographic image and (center) beam-induced tip current image of Ge(111)-Cu (-2V, 0.2 nA). (right) X-ray-induced atomic motion tracks on Ge(111) newly imaged by the X-ray STM. On the other hand, we found that "X-ray induced atomic motion" can be observed directly at the atomic scale using the SR-STM system under an incident photon density of ∼2 x 10(15) photons/sec/mm(2) [3]. SR-STM successfully visualized the track of the atomic motion (Fig. 1, right).

  3. Elemental Analysis of Bone, Teeth, Horn and Antler in Different Animal Species Using Non-Invasive Handheld X-Ray Fluorescence.

    PubMed

    Buddhachat, Kittisak; Klinhom, Sarisa; Siengdee, Puntita; Brown, Janine L; Nomsiri, Raksiri; Kaewmong, Patcharaporn; Thitaram, Chatchote; Mahakkanukrauh, Pasuk; Nganvongpanit, Korakot

    2016-01-01

    Mineralized tissues accumulate elements that play crucial roles in animal health. Although elemental content of bone, blood and teeth of human and some animal species have been characterized, data for many others are lacking, as well as species comparisons. Here we describe the distribution of elements in horn (Bovidae), antler (Cervidae), teeth and bone (humerus) across a number of species determined by handheld X-ray fluorescence (XRF) to better understand differences and potential biological relevance. A difference in elemental profiles between horns and antlers was observed, possibly due to the outer layer of horns being comprised of keratin, whereas antlers are true bone. Species differences in tissue elemental content may be intrinsic, but also related to feeding habits that contribute to mineral accumulation, particularly for toxic heavy metals. One significant finding was a higher level of iron (Fe) in the humerus bone of elephants compared to other species. This may be an adaptation of the hematopoietic system by distributing Fe throughout the bone rather than the marrow, as elephant humerus lacks a marrow cavity. We also conducted discriminant analysis and found XRF was capable of distinguishing samples from different species, with humerus bone being the best source for species discrimination. For example, we found a 79.2% correct prediction and success rate of 80% for classification between human and non-human humerus bone. These findings show that handheld XRF can serve as an effective tool for the biological study of elemental composition in mineralized tissue samples and may have a forensic application.
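The discriminant-analysis step described above can be sketched with scikit-learn. This is a hedged illustration on synthetic XRF-like concentrations with assumed class means, not the paper's measurements; it shows how a cross-validated classification accuracy of the kind reported (around 80%) is estimated from an element-by-sample matrix.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for humerus-bone XRF data: 6 elemental features per
# sample, with hypothetical mean concentrations per class.
rng = np.random.default_rng(7)
n, p = 60, 6
human = rng.normal(loc=[10, 2, 5, 1, 3, 0.5], scale=1.0, size=(n, p))
other = rng.normal(loc=[11, 3, 4, 2, 3, 0.8], scale=1.0, size=(n, p))
X = np.vstack([human, other])
y = np.array([0] * n + [1] * n)          # 0 = human, 1 = non-human

# 5-fold cross-validated accuracy of linear discriminant analysis.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

Cross-validation matters here: the paper's "correct prediction" and "success rate" figures are only meaningful when the discriminant function is evaluated on samples it was not fitted to.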

  4. Elemental Analysis of Bone, Teeth, Horn and Antler in Different Animal Species Using Non-Invasive Handheld X-Ray Fluorescence

    PubMed Central

    Buddhachat, Kittisak; Klinhom, Sarisa; Siengdee, Puntita; Brown, Janine L.; Nomsiri, Raksiri; Kaewmong, Patcharaporn; Thitaram, Chatchote; Mahakkanukrauh, Pasuk; Nganvongpanit, Korakot

    2016-01-01

    Mineralized tissues accumulate elements that play crucial roles in animal health. Although elemental content of bone, blood and teeth of human and some animal species have been characterized, data for many others are lacking, as well as species comparisons. Here we describe the distribution of elements in horn (Bovidae), antler (Cervidae), teeth and bone (humerus) across a number of species determined by handheld X-ray fluorescence (XRF) to better understand differences and potential biological relevance. A difference in elemental profiles between horns and antlers was observed, possibly due to the outer layer of horns being comprised of keratin, whereas antlers are true bone. Species differences in tissue elemental content may be intrinsic, but also related to feeding habits that contribute to mineral accumulation, particularly for toxic heavy metals. One significant finding was a higher level of iron (Fe) in the humerus bone of elephants compared to other species. This may be an adaptation of the hematopoietic system by distributing Fe throughout the bone rather than the marrow, as elephant humerus lacks a marrow cavity. We also conducted discriminant analysis and found XRF was capable of distinguishing samples from different species, with humerus bone being the best source for species discrimination. For example, we found a 79.2% correct prediction and success rate of 80% for classification between human and non-human humerus bone. These findings show that handheld XRF can serve as an effective tool for the biological study of elemental composition in mineralized tissue samples and may have a forensic application. PMID:27196603

  5. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

Many of the currently available, widely used tools for surface analysis are described. Those which have the highest applicability for elemental and/or compound analysis in problems of interest to tribology, and which are truly surface sensitive (that is, probing less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  6. Trace elements by instrumental neutron activation analysis for pollution monitoring

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.

    1975-01-01

    Methods and technology were developed to analyze 1000 samples/yr of coal and other pollution-related samples. The complete trace element analysis of 20-24 samples/wk averaged 3-3.5 man-hours/sample. The computerized data reduction scheme could identify and report data on as many as 56 elements. In addition to coal, samples of fly ash, bottom ash, crude oil, fuel oil, residual oil, gasoline, jet fuel, kerosene, filtered air particulates, ore, stack scrubber water, clam tissue, crab shells, river sediment and water, and corn were analyzed. Precision of the method was plus or minus 25% based on all elements reported in coal and other sample matrices. Overall accuracy was estimated at 50%.

  7. Refinement of Out of Circularity and Thickness Measurements of a Cylinder for Finite Element Analysis

    DTIC Science & Technology

    2016-09-01

...significant effect on the collapse strength and must be accurately represented in finite element analysis to obtain accurate results. Often it is necessary... to interpolate measurements from a relatively coarse grid to a refined finite element model and methods that have wide general acceptance are...

  8. Analysis of aircraft tires via semianalytic finite elements

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Kim, Kyun O.; Tanner, John A.

    1990-01-01

    A computational procedure is presented for the geometrically nonlinear analysis of aircraft tires. The tire was modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The four key elements of the procedure are: (1) semianalytic finite elements in which the shell variables are represented by Fourier series in the circumferential direction and piecewise polynomials in the meridional direction; (2) a mixed formulation with the fundamental unknowns consisting of strain parameters, stress-resultant parameters, and generalized displacements; (3) multilevel operator splitting to effect successive simplifications, and to uncouple the equations associated with different Fourier harmonics; and (4) multilevel iterative procedures and reduction techniques to generate the response of the shell.

  9. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  10. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  11. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  12. Inorganic elemental determinations of marine traditional Chinese Medicine Meretricis concha from Jiaozhou Bay: The construction of inorganic elemental fingerprint based on chemometric analysis

    NASA Astrophysics Data System (ADS)

    Shao, Mingying; Li, Xuejie; Zheng, Kang; Jiang, Man; Yan, Cuiwei; Li, Yantuan

    2016-04-01

The goal of this paper is to explore the relationship between the inorganic elemental fingerprint and the geographical origin identification of Meretricis concha, a commonly used marine traditional Chinese medicine (TCM) for the treatment of asthma and scald burns. To that end, the inorganic elemental contents of Meretricis concha from five sampling points in Jiaozhou Bay were determined by means of inductively coupled plasma optical emission spectrometry, and comparative investigations based on the contents of 14 inorganic elements (Al, As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Mo, Ni, Pb, Se and Zn) in the samples from Jiaozhou Bay and the previously reported Rushan Bay were performed. It was found that the samples from the two bays are approximately classified into two groups using hierarchical cluster analysis, that a four-factor model based on principal component analysis could explain approximately 75% of the detection data, and that linear discriminant analysis can be used to develop a prediction model distinguishing the samples from Jiaozhou Bay and Rushan Bay with an accuracy of about 93%. The results of the present investigation suggest that the inorganic elemental fingerprint, based on the combination of measured elemental content and chemometric analysis, is a promising approach for verifying the geographical origin of Meretricis concha, and this strategy should be valuable for the authenticity discrimination of some marine TCMs.
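The hierarchical-clustering step described above can be sketched with SciPy's Ward linkage. The data below are synthetic stand-ins for the two bays' 14-element fingerprints (assumed means and spreads), not the reported measurements; the point is the workflow of standardizing, building the linkage tree, and cutting it into two clusters.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic elemental fingerprints: 10 samples per "bay", 14 elements,
# with assumed mean levels that differ between the bays.
rng = np.random.default_rng(3)
bay_a = rng.normal(loc=1.0, scale=0.2, size=(10, 14))
bay_b = rng.normal(loc=1.6, scale=0.2, size=(10, 14))
X = np.vstack([bay_a, bay_b])
Z = (X - X.mean(0)) / X.std(0)        # standardize each element

tree = linkage(Z, method="ward")      # agglomerative Ward clustering
labels = fcluster(tree, t=2, criterion="maxclust")   # cut into 2 clusters
print(labels)
```

With well-separated fingerprints the two-cluster cut recovers the sampling origin exactly, which is the "approximately classified into two groups" behavior the abstract reports for the real data.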

  13. Finite element analysis of thrust angle contact ball slewing bearing

    NASA Astrophysics Data System (ADS)

    Deng, Biao; Guo, Yuan; Zhang, An; Tang, Shengjin

    2017-12-01

    Because a large, heavily loaded slewing bearing no longer follows the rigid-ring hypothesis under load, a solid finite element model of a thrust angular contact ball bearing was established using the finite element analysis software ANSYS. The boundary conditions of the model were set according to the actual operating conditions of the slewing bearing, the internal stress state of the bearing was obtained by solution and calculation, and the calculated results were compared with numerical results based on the rigid-ring assumption. The results show that more balls are loaded in the finite element solution, and the maximum contact stresses between the balls and raceways are somewhat reduced, because the finite element method treats the ring as an elastic body: the ring undergoes structural deformation in the radial plane when a heavily loaded slewing bearing is subjected to external loads. The finite element results are therefore more in line with the actual behaviour of slewing bearings in engineering practice.
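    The rigid-ring-style numbers such models are compared against come from Hertzian point-contact theory. A hand-check is sketched below; the load, ball radius, and material constants are assumed values, and treating the raceway as flat is a simplification, so the figures are only indicative:

```python
import math

# Assumed inputs (not from the paper).
F = 5000.0   # normal load per ball, N
R = 0.02     # ball radius, m
E = 210e9    # Young's modulus of bearing steel, Pa
nu = 0.3     # Poisson's ratio

# Effective contact modulus for two identical steel bodies:
# 1/E* = (1 - nu^2)/E + (1 - nu^2)/E
E_star = E / (2 * (1 - nu**2))

# Hertz contact radius and peak contact pressure for a sphere on a flat.
a = (3 * F * R / (4 * E_star)) ** (1 / 3)
p_max = 3 * F / (2 * math.pi * a**2)
print(f"contact radius a = {a * 1e3:.3f} mm")
print(f"peak contact pressure = {p_max / 1e9:.2f} GPa")
```

    Pressures on the order of a few GPa are typical for heavily loaded rolling contacts, which is why the elastic-ring relief reported by the finite element model matters.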

  14. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806

  15. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    PubMed

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.

  16. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    PubMed

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life-threatening Postoperative Apnea (POA). POA events are rare, so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical manual scoring software package that applies these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor each scorer's ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. The scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for

  17. Hair: A Diagnostic Tool to Complement Blood Serum and Urine.

    ERIC Educational Resources Information Center

    Maugh, Thomas H., II

    1978-01-01

    Trace elements and some drugs can be identified in hair, and it seems likely that other organic chemicals will be identifiable in the future. Since hair is so easily collected, stored, and analyzed, it promises to be an ideal complement to serum and urine analysis as a diagnostic tool. (BB)

  18. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the analysis of results is assumed on the part of the intended user group. The current EVAS_SAT operates within Microsoft Excel 2003 using a Visual Basic interface system.

  19. Application of artificial neural network in precise prediction of cement elements percentages based on the neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Eftekhari Zadeh, E.; Feghhi, S. A. H.; Roshani, G. H.; Rezaei, A.

    2016-05-01

    During the activation process, the neutron energy spectrum varies within the target sample, and the Compton effect causes the peaks of gamma radiations emitted from activated elements to overlap; the resulting background changes produce a complex gamma spectrum during the measurement process, which ultimately makes quantitative analysis problematic. Since there is no simple analytical correlation between peak counts and element concentrations, an artificial neural network can be a helpful tool for analyzing such spectra. This work describes a study on the application of a neural network to determine the percentages of cement elements (mainly Ca, Si, Al, and Fe), using as patterns the neutron-capture delayed gamma-ray spectra emitted by the activated nuclei, which were simulated via the Monte Carlo N-Particle transport code, version 2.7. A Radial Basis Function (RBF) network is developed, with four specific peaks related to Ca, Si, Al and Fe extracted as inputs. The proposed RBF model is developed and trained with MATLAB 7.8 software. To obtain the optimal RBF model, several structures were constructed and tested. The comparison between simulated and predicted values using the proposed RBF model shows good agreement between them.
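    The two-step RBF training the abstract relies on (place basis functions, then solve for linear output weights) can be sketched in miniature. The "peak areas" and the peak-to-percentage relation below are invented stand-ins for the simulated MCNP spectra, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: 50 spectra reduced to 4 peak areas (stand-ins
# for the Ca, Si, Al, Fe peaks), mapped to a Ca percentage via an assumed
# smooth relation.
X = rng.uniform(0.0, 1.0, size=(50, 4))
y = 40 * X[:, 0] + 5 * X[:, 1] - 3 * X[:, 2] + 2 * X[:, 3]

def rbf_design(X, centers, width):
    """Gaussian radial-basis design matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width**2))

# Use the training points themselves as RBF centers and fit the output
# weights by least squares -- the usual two-step RBF training.
width = 0.5
Phi = rbf_design(X, X, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Predict the Ca percentage for a new "spectrum".
x_new = np.array([[0.5, 0.5, 0.5, 0.5]])
pred = rbf_design(x_new, X, width) @ w
print(f"predicted Ca percentage: {pred[0]:.1f}")
```

    In practice the basis width and the number of centers are the structural choices the authors report tuning over.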

  20. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

    The accuracy and robustness of a two-dimensional smoothing methodology are examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing-element discretization, producing strains and first strain gradients of superior accuracy. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution and are subsequently employed in the integration of the equilibrium equations to obtain accurate interlaminar shear stresses. The test problem is a simply supported rectangular plate under a doubly sinusoidal load, which has an exact analytic solution that serves as a measure of the goodness of the recovered interlaminar shear stresses. The method is versatile enough to be applicable to the analysis of rather general and complex structures built of distinct components and materials, such as those found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.
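    A one-dimensional caricature of the least-squares-plus-penalty idea is sketched below: recover a smooth strain field (and its gradient) from noisy strains at discrete points by trading a data-fit term against a roughness penalty. The "strain" data and the penalty weight are invented, and this is not the paper's 2D variational formulation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Noisy discrete "strains" sampled from a smooth underlying field.
x = np.linspace(0, 1, 50)
strain_noisy = np.sin(np.pi * x) + rng.normal(0, 0.05, size=x.size)

# Second-difference matrix acting as the roughness penalty.
n = x.size
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

# Minimize ||f - data||^2 + lam * ||D f||^2, whose normal equations are
# (I + lam * D^T D) f = data.
lam = 10.0  # penalty weight; larger -> smoother field
A = np.eye(n) + lam * D.T @ D
strain_smooth = np.linalg.solve(A, strain_noisy)

# The smoothed field can now be differentiated to get strain gradients.
gradient = np.gradient(strain_smooth, x)

err_noisy = np.abs(strain_noisy - np.sin(np.pi * x)).mean()
err_smooth = np.abs(strain_smooth - np.sin(np.pi * x)).mean()
print(f"mean error: noisy {err_noisy:.4f} -> smoothed {err_smooth:.4f}")
```

    The payoff mirrors the abstract's claim: differentiating the smoothed field gives usable gradients, whereas differentiating the raw noisy data would amplify the noise.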

  1. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  2. Finite element analysis of the cyclic indentation of bilayer enamel

    NASA Astrophysics Data System (ADS)

    Jia, Yunfei; Xuan, Fu-zhen; Chen, Xiaoping; Yang, Fuqian

    2014-04-01

    Tooth enamel is subjected to repeated contact and experiences contact deformation in daily life, and its mechanical strength determines the biofunctionality of the tooth. Considering the variation of the rod arrangement between outer and inner enamel, we approximate enamel as a bilayer structure and perform finite element analysis of the cyclic indentation of that structure, to mimic the repeated contact of enamel during mastication. The dynamic deformation behaviour of both the inner enamel and the bilayer enamel is examined. The material parameters of the inner and outer enamel used in the analysis are obtained by fitting the finite element results to the experimental nanoindentation results. The penetration depth per cycle at the quasi-steady state is used to describe the depth propagation speed, which exhibits a two-stage power-law dependence on the maximum indentation load and on the amplitude of the cyclic load. The continuous penetration of the indenter reflects the propagation of the plastic zone during cyclic indentation, which is related to the energy dissipation. The outer enamel serves as a protective layer owing to its greater resistance to contact deformation compared with the inner enamel. The larger equivalent plastic strain and lower stresses in the inner enamel during cyclic indentation, as calculated from the finite element analysis, indicate better crack/fracture resistance of the inner enamel.
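    Extracting a power-law exponent of the kind the abstract describes from (load, depth-per-cycle) data is a straight-line fit in log-log coordinates. The data points below are invented for illustration, not the paper's results:

```python
import numpy as np

# Hypothetical (load, depth-per-cycle) pairs following d = a * P**m.
P = np.array([10.0, 20.0, 40.0, 80.0, 160.0])  # maximum indentation load
d = 0.02 * P**1.5                               # penetration depth per cycle

# log d = m * log P + log a, so a degree-1 polyfit recovers m and a.
m, log_a = np.polyfit(np.log(P), np.log(d), 1)
print(f"exponent m = {m:.2f}, prefactor a = {np.exp(log_a):.3f}")
```

    A "two-stage" dependence would appear as two straight segments with different slopes on the log-log plot, each fitted separately over its load range.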

  3. Trace element analysis by PIXE in several biomedical fields

    NASA Astrophysics Data System (ADS)

    Weber, G.; Robaye, G.; Bartsch, P.; Collignon, A.; Beguin, Y.; Roelandts, I.; Delbrouck, J. M.

    1984-04-01

    Since 1980, trace element analysis by PIXE has been developed at the University of Liège in several directions, among them: the elemental composition of lung parenchyma, hilar lymph nodes, and blood in hematological disorders and renal insufficiency. The trace element content of lung tumors and surrounding tissue is measured and compared to similar content previously obtained from unselected patients of comparable ages. The normalization of the bromine deficiency observed in hemodialyzed patients is achieved by using a dialyzing bath doped with NaBr so as to obtain a normal bromine level of 5.7 μg/ml. The content of Cu, Zn, Br and Se in blood serum from more than 100 patients suffering from malignant hemopathy has been measured, and the results are compared with a reference group. These oligoelements have also been measured sequentially in patients under intensive chemotherapy for acute myeloid leukemia.

  4. Laser-Induced Breakdown Spectroscopy Instrument for Element Analysis of Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Blacic, J.; Pettit, D.; Cremers, D.; Roessler, N.

    1993-01-01

    One of the most fundamental pieces of information about any planetary body is the elemental and mineralogical composition of its surface materials. We are developing an instrument to obtain such data at ranges of up to several hundreds of meters using the technique of Laser-Induced Breakdown Spectroscopy (LIBS). We envision our instrument being used from a spacecraft in close rendezvous with small bodies such as comets and asteroids, or deployed on surface-rover vehicles on large bodies such as Mars and the Moon. The elemental analysis is based on atomic emission spectroscopy of a laser-induced plasma or spark. A pulsed, diode pumped Nd:YAG laser of several hundred millijoules optical energy is used to vaporize and electronically excite the constituent elements of a rock surface remotely located from the laser. Light emitted from the excited plasma is collected and introduced to the entrance slit of a small grating spectrometer. The spectrally dispersed spark light is detected with either a linear photo diode array or area CCD array. When the latter detector is used, the optical and spectrometer components of the LIBS instrument can also be used in a passive imaging mode to collect and integrate reflected sunlight from the same rock surface. Absorption spectral analysis of this reflected light gives mineralogical information that provides a remote geochemical characterization of the rock surface. We performed laboratory calibrations in air and in vacuum on standard rock powders to quantify the LIBS analysis. We performed preliminary field tests using commercially available components to demonstrate remote LIBS analysis of terrestrial rock surfaces at ranges of over 25 m, and we have demonstrated compatibility with a six-wheeled Russian robotic rover vehicle. Based on these results, we believe that all major and most minor elements expected on planetary surfaces can be measured with absolute accuracy of 10-15 percent and much higher relative accuracy. We have

  5. LIBS: a potential tool for industrial/agricultural waste water analysis

    NASA Astrophysics Data System (ADS)

    Karpate, Tanvi; K. M., Muhammed Shameem; Nayak, Rajesh; V. K., Unnikrishnan; Santhosh, C.

    2016-04-01

    Laser-Induced Breakdown Spectroscopy (LIBS) is a multi-elemental analysis technique with various advantages, including the ability to detect any element in real time. The technique holds potential for environmental monitoring, and analyses of soil, glass, paint, water, plastic, etc. confirm its robustness for such applications. Compared to currently available water-quality monitoring methods and techniques, LIBS has several advantages: no need for sample preparation, fast and easy operation, and a chemical-free process. In LIBS, a powerful pulsed laser generates a plasma, which is then analyzed to obtain quantitative and qualitative details of the elements present in the sample. Another main advantage of LIBS is that it can be performed in standoff mode for real-time analysis. Water samples from industry and agriculture tend to contain many pollutants, making the water harmful for consumption. The emphasis of this project is to determine such harmful pollutants present in trace amounts in industrial and agricultural wastewater. When a high-intensity laser is incident on the sample, a plasma is generated which gives a multielemental emission spectrum. LIBS analysis has shown outstanding success for solid samples; for liquid samples the analysis is challenging, as the liquid may splash under the high laser energy, making it difficult to generate a plasma. The project therefore also seeks the most efficient method for testing water samples, both qualitatively and quantitatively, using LIBS.

  6. Groove refinishing tool

    DOEpatents

    Kellogg, Harvey J.; Holm, Robert O.

    1983-01-01

    A groove refinishing tool which utilizes a finishing wheel which is controlled by an air grinder motor. The air grinder motor is mounted on a main body section which is pivotally attached to a shoe element. The shoe element contains guide pins which guide the shoe element on the groove to be refinished. Application of pressure on the main body element compresses a weight counterbalance spring to extend the finishing wheel through the shoe element to refinish the groove surface. A window is provided for viewing the refinishing operation. Milling operations can also be performed by replacing the finishing wheel with a milling wheel.

  7. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
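    The variogram idea at the core of VARS can be illustrated on a toy model, without using VARS-TOOL itself: a directional variogram measures half the mean squared change in model output when one parameter is perturbed by a step h, and ranking parameters by it is a minimal caricature of the IVARS metrics (the model and step size below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model with three parameters of very different influence.
def model(x):
    return 10.0 * x[:, 0] + 1.0 * np.sin(2 * np.pi * x[:, 1]) + 0.1 * x[:, 2]

def directional_variogram(model, i, h, n=2000, dim=3):
    """gamma_i(h): half the mean squared output change when parameter i
    is perturbed by h with all other parameters held fixed."""
    x = rng.uniform(0, 1, size=(n, dim))
    x2 = x.copy()
    x2[:, i] = np.clip(x2[:, i] + h, 0, 1)
    return 0.5 * np.mean((model(x2) - model(x)) ** 2)

h = 0.1
gammas = [directional_variogram(model, i, h) for i in range(3)]
ranking = np.argsort(gammas)[::-1]
print("variogram-based sensitivity ranking:", list(ranking))
```

    VARS itself integrates such variograms across a range of scales h and, as the abstract notes, recovers variance-based and derivative-based metrics from the same sample set; this sketch only shows the single-scale building block.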

  8. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  9. Multi-national, multi-lingual, multi-professional CATs: (Curriculum Analysis Tools).

    PubMed

    Eisner, J

    1995-01-01

    A consortium of dental schools and allied dental programs was established in 1991 with the expressed purpose of creating a curriculum database program that was end-user modifiable [1]. In April of 1994, a beta version (Beta 2.5 written in FoxPro(TM) 2.5) of the software CATs, an acronym for Curriculum Analysis Tools, was released for use by over 30 of the consortium's 60 member institutions, while the remainder either waited for the Macintosh (TM) or Windows (TM) versions of the program or were simply not ready to begin an institutional curriculum analysis project. Shortly after this release, the design specifications were rewritten based on a thorough critique of the Beta 2.5 design and coding structures and user feedback. The result was Beta 3.0 which has been designed to accommodate any health professions curriculum, in any country that uses English or French as one of its languages. Given the program's extensive use of screen generation tools, it was quite easy to offer screen displays in a second language. As more languages become available as part of the Unified Medical Language System, used to document curriculum content, the program's design will allow their incorporation. When the software arrives at a new institution, the choice of language and health profession will have been preselected, leaving the Curriculum Database Manager to identify the country where the member institution is located. With these 'macro' end-user decisions completed, the database manager can turn to a more specific set of end-user questions including: 1) will the curriculum view selected for analysis be created by the course directors (provider entry of structured course outlines) or by the students (consumer entry of class session summaries)?; 2) which elements within the provided course outline or class session modules will be used?; 3) which, if any, internal curriculum validation measures will be included?; and 4) which, if any, external validation measures will be included

  10. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  11. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; ...

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  12. The MHOST finite element program: 3-D inelastic analysis methods for hot section components. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Nakazawa, Shohei

    1989-01-01

    The user options available for running the MHOST finite element analysis package are described. MHOST is a solid and structural analysis program based on mixed finite element technology, specifically designed for 3-D inelastic analysis. A family of 2- and 3-D continuum elements, along with beam and shell structural elements, can be utilized, and many options are available in the constitutive equation library, the solution algorithms, and the analysis capabilities. The solution algorithms are outlined along with the data input and output, the analysis options (including the user subroutines), and the definition of the finite elements implemented in the program package.

  13. Forensic Comparison of Soil Samples Using Nondestructive Elemental Analysis.

    PubMed

    Uitdehaag, Stefan; Wiarda, Wim; Donders, Timme; Kuiper, Irene

    2017-07-01

    Soil can play an important role in forensic cases by linking suspects or objects to a crime scene, through comparison of samples from the crime scene with samples derived from items. This study uses an adapted ED-XRF analysis (sieving instead of grinding, to prevent destruction of microfossils) to produce elemental composition data for 20 elements. Different data processing techniques and statistical distances were evaluated using data from 50 samples and the log-LR cost (Cllr). The best performing combination (Canberra distance, relative data, and square-root values) is used to construct a discriminative model. Examples of the spatial resolution of the method are shown for three crime-scene locations, and sampling strategy is discussed. Twelve test cases were analyzed, and the results showed that the method is applicable. The study shows how the combination of an analysis technique, a database, and a discriminative model can be used to compare multiple soil samples quickly. © 2016 American Academy of Forensic Sciences.
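    The best-performing combination the abstract reports (Canberra distance on relative, square-root-transformed data) can be sketched directly; the element profiles below are invented, and only 5 of the 20 elements are shown for brevity:

```python
import math

# Hypothetical elemental intensity profiles for two soil samples.
profile_a = [120.0, 35.0, 8.0, 410.0, 2.0]
profile_b = [100.0, 40.0, 6.0, 390.0, 3.0]

def preprocess(profile):
    """Relative data (fractions of the profile total), then square-root
    values -- the preprocessing combination reported in the abstract."""
    total = sum(profile)
    return [math.sqrt(v / total) for v in profile]

def canberra(u, v):
    """Canberra distance: sum over i of |u_i - v_i| / (|u_i| + |v_i|).
    Terms where both coordinates are zero are skipped by convention."""
    return sum(
        abs(a - b) / (abs(a) + abs(b))
        for a, b in zip(u, v)
        if (a, b) != (0.0, 0.0)
    )

d = canberra(preprocess(profile_a), preprocess(profile_b))
print(f"Canberra distance between samples: {d:.3f}")
```

    Because each term is normalized by the coordinate magnitudes, the Canberra distance weights trace elements and major elements comparably, which is plausibly why it outperformed unnormalized distances here.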

  14. The aggregated unfitted finite element method for elliptic problems

    NASA Astrophysics Data System (ADS)

    Badia, Santiago; Verdugo, Francesc; Martín, Alberto F.

    2018-07-01

    Unfitted finite element techniques are valuable tools in applications where the generation of body-fitted meshes is difficult. However, these techniques are prone to severe ill-conditioning problems that obstruct the efficient use of iterative Krylov methods and, in consequence, hinder the practical use of unfitted methods for realistic large-scale applications. In this work, we present a technique that addresses such conditioning problems by constructing enhanced finite element spaces based on a cell aggregation technique. The presented method, called the aggregated unfitted finite element method, is easy to implement and can be used, in contrast to previous works, in Galerkin approximations of coercive problems with conforming Lagrangian finite element spaces. The mathematical analysis of the new method shows that the condition number of the resulting linear system matrix scales as in standard finite elements for body-fitted meshes, without being affected by small cut cells, and that the method achieves the optimal finite element convergence order. These theoretical results are confirmed with 2D and 3D numerical experiments.

  15. Strength Analysis on Ship Ladder Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    The design of a ship’s structure must follow the rules of the applicable classification standards. In this case, the ladder (staircase) of a ferry ship is designed and reviewed against the loads experienced during ship operations, both while sailing and in port. The classification rules refer to the calculation of structural components described in the classification calculation method, which can be analysed using the Finite Element Method. The classification regulations applied to the ferry design are those of BKI (Biro Klasifikasi Indonesia), so the material composition and mechanical properties must conform to the vessel’s classification. The structure was analysed with a finite element software package. The structural analysis of the ladder showed that it can withstand a load of 140 kg in static, dynamic, and impact conditions. The resulting safety factors indicate that the structure is safe without being excessively strong.
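    A back-of-the-envelope sketch of the load and safety-factor arithmetic implied above; the dynamic factor and stress values are illustrative assumptions, not figures from the paper.

```python
G = 9.81  # gravitational acceleration, m/s^2

def design_load_n(mass_kg, dynamic_factor=1.0):
    # Static weight of the design mass; dynamic_factor > 1 is a crude
    # allowance for dynamic or impact loading (illustrative only).
    return mass_kg * G * dynamic_factor

def safety_factor(yield_stress_mpa, working_stress_mpa):
    # Classification rules require this ratio to exceed a prescribed minimum;
    # a very large value suggests the structure is heavier than necessary.
    return yield_stress_mpa / working_stress_mpa

print(round(design_load_n(140.0), 1))       # 1373.4 N, static 140 kg load
print(safety_factor(235.0, 100.0))          # assumed mild-steel yield / FEA stress
```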

  16. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    NASA Astrophysics Data System (ADS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-05-01

    This paper focuses on the determination of mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and a finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which makes it possible to evaluate the tube thickness and the state of stress at any point of the free bulge region. Free bulging of a 304L stainless steel tube is simulated using LS-DYNA 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters, such as: thickness variation at the pole of the free bulge region with bulge height, tube thickness variation with the axial coordinate z, and von Mises stress variation with plastic strain. Finally, the influence of deviations in the geometrical parameters on the flow stress curve is studied using the analytical model: deviations of the tube outer diameter, its initial thickness, and the bulge height measurement are taken into account to obtain the resulting error on plastic strain and von Mises stress.
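    Under the circular-arc assumption described above, the arc radius follows from the standard sagitta formula, and the membrane hoop strain at the pole is a common companion quantity. The dimensions below are hypothetical, not values from the paper.

```python
import math

def profile_radius(half_length, bulge_height):
    # Circular-arc profile: for a free bulge of height h over a free zone of
    # half-length L (chord 2L, sagitta h), the sagitta formula gives the
    # arc radius R = (L^2 + h^2) / (2 h).
    L, h = half_length, bulge_height
    return (L * L + h * h) / (2.0 * h)

def hoop_strain_at_pole(initial_radius, bulge_height):
    # Logarithmic (true) hoop strain at the pole under the membrane assumption.
    return math.log((initial_radius + bulge_height) / initial_radius)

print(profile_radius(30.0, 5.0))          # 92.5 (same length unit as inputs)
print(round(hoop_strain_at_pole(25.0, 5.0), 4))
```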

  17. Simulation model of an eyeball based on finite element analysis on a supercomputer

    PubMed Central

    Uchio, E.; Ohno, S.; Kudoh, J.; Aoki, K.; Kisielewicz, L. T.

    1999-01-01

    BACKGROUND/AIMS—A simulation model of the human eye was developed. It was applied to the determination of the physical and mechanical conditions of impacting foreign bodies causing intraocular foreign body (IOFB) injuries.
METHODS—Modules of Hypermesh (Altair Engineering, Tokyo, Japan) were used for solid modelling, geometric construction, and finite element mesh creation based on information obtained from cadaver eyes. The simulations were solved on a supercomputer using the finite element analysis (FEA) program PAM-CRASH (Nihon ESI, Tokyo, Japan). It was assumed that rupture occurs at a strain of 18.0% in the cornea and 6.8% in the sclera, and at a stress of 9.4 MPa for both cornea and sclera. Blunt-shaped missiles were set to impact the surface of the cornea or sclera at velocities of 30 and 60 m/s.
RESULTS—According to the simulation, the missile sizes above which corneal rupture occurred at velocities of 30 and 60 m/s were 1.95 and 0.82 mm, respectively. The missile sizes causing scleral rupture were 0.95 and 0.75 mm at velocities of 30 and 60 m/s, respectively.
CONCLUSIONS—These results suggest that this FEA model has potential usefulness as a simulation tool for ocular injury and it may provide useful information for developing protective measures against industrial and traffic ocular injuries.

 PMID:10502567
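    The rupture criteria quoted in the abstract can be encoded as a simple threshold check; the predicate below is an illustrative sketch of the failure criterion, not the FEA model itself.

```python
# Rupture thresholds taken from the abstract; rupture is predicted when
# either the strain or the stress criterion is reached.
THRESHOLDS = {
    "cornea": {"strain": 0.180, "stress_mpa": 9.4},
    "sclera": {"strain": 0.068, "stress_mpa": 9.4},
}

def ruptures(tissue, peak_strain, peak_stress_mpa):
    t = THRESHOLDS[tissue]
    return peak_strain >= t["strain"] or peak_stress_mpa >= t["stress_mpa"]

print(ruptures("sclera", 0.07, 5.0))   # True: scleral strain limit exceeded
print(ruptures("cornea", 0.07, 5.0))   # False: cornea tolerates up to 18% strain
```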

  18. Potential and Limitations of the Modal Characterization of a Spacecraft Bus Structure by Means of Active Structure Elements

    NASA Technical Reports Server (NTRS)

    Grillenbeck, Anton M.; Dillinger, Stephan A.; Elliott, Kenny B.

    1998-01-01

    Theoretical and experimental studies have been performed to investigate the potential and limitations of the modal characterization of a typical spacecraft bus structure by means of active structure elements. The aim of these studies has been to test and advance tools for performing accurate on-orbit modal identification, which is characterized by generally very limited test instrumentation, autonomous excitation by active structure elements, and a zero-g environment. The NASA LaRC CSI Evolutionary Testbed provided an excellent object for the experimental part of this study program. The main subjects of investigation were: (1) the selection of optimum excitation and measurement locations to unambiguously identify the modes of interest; (2) the applicability of different types of excitation means, with a focus on active structure elements; and (3) the assessment of the modal identification potential of different types of excitation functions and modal analysis tools. Conventional as well as dedicated modal analysis tools were applied to determine modal parameters and mode shapes. The results are presented and discussed based on orthogonality checks as well as on suitable indicators of the quality of the acquired modes with respect to modal purity. In particular, the suitability for modal analysis of the frequency response functions obtained by excitation with active structure elements is demonstrated with the help of reciprocity checks. Finally, the results are summarized in a procedure for performing an on-orbit modal identification, including an indication of the limitations to be observed.
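    Orthogonality checks of the kind mentioned above are commonly quantified with the Modal Assurance Criterion (MAC); a minimal sketch for real-valued mode shapes (the study itself does not specify which indicator was used):

```python
def mac(phi1, phi2):
    # Modal Assurance Criterion between two mode-shape vectors:
    # MAC = (phi1 . phi2)^2 / ((phi1 . phi1) * (phi2 . phi2));
    # 1.0 means identical shapes (up to scaling), ~0 means near-orthogonal.
    dot = sum(a * b for a, b in zip(phi1, phi2))
    return dot * dot / (sum(a * a for a in phi1) * sum(b * b for b in phi2))

print(mac([1.0, 0.0], [0.0, 1.0]))  # 0.0 for orthogonal shapes
print(mac([1.0, 2.0], [2.0, 4.0]))  # 1.0 for the same shape, rescaled
```

Low off-diagonal MAC values between identified modes are one indicator of modal purity when a mass matrix for a true orthogonality check is unavailable, as on orbit.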

  19. Chromatographic-ICPMS methods for trace element and isotope analysis of water and biogenic calcite

    NASA Astrophysics Data System (ADS)

    Klinkhammer, G. P.; Haley, B. A.; McManus, J.; Palmer, M. R.

    2003-04-01

    ICP-MS is a powerful technique because of its sensitivity and speed of analysis. This is especially true for refractory elements that are notoriously difficult to measure using TIMS and less energetic techniques. However, as ICP-MS instruments become more sensitive to the elements of interest, they also become more sensitive to interferences. This becomes a pressing issue when analyzing samples with high total dissolved solids. This paper describes two trace element methods that overcome these problems by using chromatographic techniques to precondition samples prior to analysis by ICP-MS: separation of rare earth elements (REEs) from seawater using HPLC-ICPMS, and flow-through dissolution of foraminiferal calcite. Using HPLC in combination with ICP-MS, it is possible to isolate the REEs from the matrix, from other transition elements, and from each other. This method has been developed for small-volume samples (5 ml), making it possible to analyze sediment pore waters. As another example, subjecting foram shells to flow-through reagent addition followed by time-resolved analysis in the ICP-MS allows systematic cleaning and dissolution of the shells. This method provides information about the relationship between dissolution tendency and elemental composition. Flow-through is also amenable to automation, thus yielding the high sample throughput required for paleoceanography, and produces a highly resolved elemental matrix that can be analyzed statistically.
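    At its simplest, reducing a time-resolved flow-through run means integrating each isotope's intensity trace over the dissolution and forming element ratios; the traces and the Mg/Ca ratio below are hypothetical illustrations, not data from the paper.

```python
def integrate(times, signal):
    # Trapezoidal integration of a time-resolved ICP-MS intensity trace;
    # the integrated counts per isotope characterize the dissolved fraction.
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (signal[i] + signal[i - 1]) * (times[i] - times[i - 1])
    return area

t = [0.0, 1.0, 2.0, 3.0]      # hypothetical acquisition times, s
mg = [0.0, 4.0, 4.0, 0.0]     # hypothetical Mg intensity, counts/s
ca = [0.0, 40.0, 40.0, 0.0]   # hypothetical Ca intensity, counts/s
print(integrate(t, mg) / integrate(t, ca))  # Mg/Ca signal ratio: 0.1
```

Computing such ratios over successive time windows, rather than over the whole run, is what reveals how composition varies with dissolution tendency.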

  20. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
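    The n-factor combinatorial idea can be sketched with the standard library: the code below enumerates the n-way interactions that must be covered (a real covering-array generator, as used in the tool, would pack these interactions into far fewer test runs). Parameter names and values are hypothetical.

```python
from itertools import combinations, product

def n_factor_settings(params, n=2):
    # Yield every n-way combination of parameter values from a dict of
    # {parameter_name: [candidate values]}; for n=2 this is pairwise coverage.
    names = sorted(params)
    for group in combinations(names, n):
        for values in product(*(params[g] for g in group)):
            yield dict(zip(group, values))

params = {"mass": [1, 2], "thrust": ["lo", "hi"], "mode": ["a", "b"]}
pairs = list(n_factor_settings(params, n=2))
print(len(pairs))  # 3 parameter pairs x 4 value combinations = 12 interactions
```

Free parameters in each generated case can then be filled by Monte Carlo sampling, matching the combination of techniques the abstract describes.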