Sample records for numerical analysis package

  1. Solving PDEs with Intrepid

    DOE PAGES

    Bochev, P.; Edwards, H. C.; Kirby, R. C.; ...

    2012-01-01

    Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use of Intrepid both in the context of numerical PDEs and the more general context of data analysis.

  2. Numerical analysis of laminar and turbulent incompressible flows using the finite element Fluid Dynamics Analysis Package (FIDAP)

    NASA Technical Reports Server (NTRS)

    Sohn, Jeong L.

    1988-01-01

    The purpose of the study is to evaluate the numerical accuracy of FIDAP (Fluid Dynamics Analysis Package). Accordingly, four test problems in laminar and turbulent incompressible flows are selected, and the computational results for these problems are compared with other numerical solutions and/or experimental data. These problems include: (1) 2-D laminar flow inside a wall-driven cavity; (2) 2-D laminar flow over a backward-facing step; (3) 2-D turbulent flow over a backward-facing step; and (4) 2-D turbulent flow through a turn-around duct.

  3. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…

  4. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities, in the faculties of Engineering, Sciences, Business, and Economics, as well as in higher education in Computing, calculators and computers can be used in Numerical Analysis (NA) because of the difficulty of the subject. This study discusses computer-supported learning of NA together with important usage of the…

  5. NORTICA—a new code for cyclotron analysis

    NASA Astrophysics Data System (ADS)

    Gorelov, D.; Johnson, D.; Marti, F.

    2001-12-01

    The new package NORTICA (Numerical ORbit Tracking In Cyclotrons with Analysis) of computer codes for beam dynamics simulations is under development at NSCL. The package was started as a replacement for the code MONSTER [1], developed in the laboratory in the past. The new codes are capable of beam dynamics simulations in both CCF (Coupled Cyclotron Facility) accelerators, the K500 and K1200 superconducting cyclotrons. The general purpose of the package is to assist in setting up and tuning the cyclotrons, taking into account the main field and extraction channel imperfections. The computing platform for the package is an Alpha Station running the UNIX operating system with an X-Windows graphical interface. A multiple-programming-language approach was used in order to combine the reliability of the numerical algorithms developed in the laboratory over a long period of time with the friendliness of a modern-style user interface. This paper describes the capabilities and features of the codes in their present state.

  6. Molded underfill (MUF) encapsulation for flip-chip package: A numerical investigation

    NASA Astrophysics Data System (ADS)

    Azmi, M. A.; Abdullah, M. K.; Abdullah, M. Z.; Ariff, Z. M.; Saad, Abdullah Aziz; Hamid, M. F.; Ismail, M. A.

    2017-07-01

    This paper presents a numerical simulation of epoxy molding compound (EMC) filling in multi-flip-chip packages during the encapsulation process. Both an empty package and a group of flip-chip packages in the mold cavity were considered in order to study the flow profile of the EMC. SOLIDWORKS software was used for three-dimensional modeling, and the model was then imported into the fluid analysis software ANSYS FLUENT. The volume of fluid (VOF) technique was used to capture the flow front profiles, and a power-law model was applied for the rheology. The numerical results are compared with previous experimental data and show good agreement, validating the model. The predicted flow front was observed and analyzed at different filling times. Void formation in the package is captured visually, and the number of flip chips is identified as one factor contributing to void formation.
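    For reference, the power-law (Ostwald-de Waele) rheology model mentioned above is commonly written in the general form below; the consistency index K and power-law index n are material-specific, and the study may use a temperature-dependent variant, so this is only the generic expression:

        \eta(\dot{\gamma}) = K\,\dot{\gamma}^{\,n-1}, \qquad n < 1 \text{ for a shear-thinning melt}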

  7. Language Analysis Package (L.A.P.) Version I System Design.

    ERIC Educational Resources Information Center

    Porch, Ann

    To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programming or computer operations, a language analysis package has been developed, partially based on several existing programs. An overview of the design is provided and system…

  8. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first-year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  9. Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data

    PubMed Central

    Morris, Tiffany J.; Beck, Stephan

    2015-01-01

    The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. PMID:25233806

  10. Numerical bifurcation analysis of immunological models with time delays

    NASA Astrophysics Data System (ADS)

    Luzyanina, Tatyana; Roose, Dirk; Bocharov, Gennady

    2005-12-01

    In recent years, a large number of mathematical models that are described by delay differential equations (DDEs) have appeared in the life sciences. To analyze the models' dynamics, numerical methods are necessary, since analytical studies can only give limited results. In turn, the availability of efficient numerical methods and software packages encourages the use of time delays in mathematical modelling, which may lead to more realistic models. We outline recently developed numerical methods for bifurcation analysis of DDEs and illustrate the use of these methods in the analysis of a mathematical model of human hepatitis B virus infection.
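    For reference, the delay differential equations analyzed by such bifurcation methods are typically autonomous systems with one or more discrete delays, of the general form below; the hepatitis B model studied in the paper is a specific instance, and this is only the generic notation:

        \dot{x}(t) = f\bigl(x(t),\, x(t-\tau_1), \ldots, x(t-\tau_m),\, p\bigr), \qquad \tau_i > 0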

  11. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    ERIC Educational Resources Information Center

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  12. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least-squares and maximum-likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine judiciously whether there is a need for data trimming and at which points it should be done.
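    As an illustrative sketch of ex-Gaussian fitting by maximum likelihood, the snippet below uses SciPy's exponnorm distribution (the ex-Gaussian) rather than the ExGUtils API itself; the mu/sigma/tau conversion and the synthetic data are assumptions for demonstration only.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic reaction-time-like data from an ex-Gaussian:
    # Gaussian(mu, sigma) plus Exponential(tau). Values are illustrative only.
    rng = np.random.default_rng(0)
    mu, sigma, tau = 400.0, 40.0, 100.0          # milliseconds
    rt = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

    # Maximum-likelihood fit with SciPy's exponnorm (shape K = tau / sigma).
    K, loc, scale = stats.exponnorm.fit(rt)
    mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
    print(f"mu={mu_hat:.1f}  sigma={sigma_hat:.1f}  tau={tau_hat:.1f}")

    # Goodness of fit: Kolmogorov-Smirnov statistic against the fitted distribution.
    print(stats.kstest(rt, "exponnorm", args=(K, loc, scale)))
    ```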

  13. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    PubMed Central

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least-squares and maximum-likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine judiciously whether there is a need for data trimming and at which points it should be done. PMID:29765345

  14. Analysis pipelines and packages for Infinium HumanMethylation450 BeadChip (450k) data.

    PubMed

    Morris, Tiffany J; Beck, Stephan

    2015-01-15

    The Illumina HumanMethylation450 BeadChip has become a popular platform for interrogating DNA methylation in epigenome-wide association studies (EWAS) and related projects as well as resource efforts such as the International Cancer Genome Consortium (ICGC) and the International Human Epigenome Consortium (IHEC). This has resulted in an exponential increase of 450k data in recent years and triggered the development of numerous integrated analysis pipelines and stand-alone packages. This review will introduce and discuss the currently most popular pipelines and packages and is particularly aimed at new 450k users. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. 7 CFR 51.2927 - Marking and packing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and packing requirements. The minimum size or numerical count of the apricots in any package shall be plainly labeled, stenciled, or otherwise marked on the package. (a) Numerical count. When the numerical...

  16. 7 CFR 51.2927 - Marking and packing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and packing requirements. The minimum size or numerical count of the apricots in any package shall be plainly labeled, stenciled, or otherwise marked on the package. (a) Numerical count. When the numerical...

  17. 7 CFR 51.2927 - Marking and packing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and packing requirements. The minimum size or numerical count of the apricots in any package shall be plainly labeled, stenciled, or otherwise marked on the package. (a) Numerical count. When the numerical...

  18. Analysis of Plane-Parallel Electron Beam Propagation in Different Media by Numerical Simulation Methods

    NASA Astrophysics Data System (ADS)

    Miloichikova, I. A.; Bespalov, V. I.; Krasnykh, A. A.; Stuchebrov, S. G.; Cherepennikov, Yu. M.; Dusaev, R. R.

    2018-04-01

    Simulation by the Monte Carlo method is widely used to calculate the character of ionizing radiation interaction with matter. A wide variety of programs based on this method allows users to choose the most suitable package for solving computational problems. In turn, it is important to know the exact restrictions of numerical systems in order to avoid gross errors. Results of an estimation of the feasibility of applying the program PCLab (Computer Laboratory, version 9.9) to numerical simulation of the electron energy distribution absorbed in beryllium, aluminum, gold, and water for industrial, research, and clinical beams are presented. The data obtained using the programs ITS and Geant4, the most popular software packages for solving such problems, and the program PCLab are presented in graphical form. A comparison and analysis of the results obtained demonstrate the feasibility of applying the program PCLab to simulation of the absorbed energy distribution and dose of electrons in various materials for energies in the range 1-20 MeV.

  19. New solution decomposition and minimization schemes for Poisson-Boltzmann equation in calculation of biomolecular electrostatics

    NASA Astrophysics Data System (ADS)

    Xie, Dexuan

    2014-10-01

    The Poisson-Boltzmann equation (PBE) is a widely used implicit-solvent continuum model for calculating the electrostatic potential energy of biomolecules in ionic solvent, but its numerical solution remains a challenge due to the strong singularity and nonlinearity caused by its singular distribution source terms and exponential nonlinear terms. To deal effectively with this challenge, new solution decomposition and minimization schemes are proposed in this paper, together with a new PBE analysis of solution existence and uniqueness. Moreover, a PBE finite element program package is developed in Python based on the FEniCS program library and GAMer, a molecular surface and volumetric mesh generation program package. Numerical tests on proteins and a nonlinear Born ball model with an analytical solution validate the new solution decomposition and minimization schemes, and demonstrate the effectiveness and efficiency of the new PBE finite element program package.
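    For reference, the nonlinear PBE referred to above is commonly written in the dimensionless form below; scaling constants and coefficient functions vary between formulations, so this is the generic expression rather than necessarily the exact one used in the paper:

        -\nabla \cdot \bigl(\epsilon(\mathbf{r})\,\nabla u(\mathbf{r})\bigr) + \bar{\kappa}^{2}(\mathbf{r})\,\sinh u(\mathbf{r}) = \alpha \sum_{j=1}^{n_p} z_j\,\delta(\mathbf{r}-\mathbf{r}_j),

    where u is the dimensionless electrostatic potential, \epsilon the dielectric coefficient, \bar{\kappa}^{2} the modified ionic-strength coefficient (zero inside the solute), z_j and \mathbf{r}_j the atomic partial charges and positions, and \alpha a constant fixed by the choice of units.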

  20. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and medicine product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements and defect detection in a non-destructive manner. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. The Ndynamics package—Numerical analysis of dynamical systems and the fractal dimension of boundaries

    NASA Astrophysics Data System (ADS)

    Avellar, J.; Duarte, L. G. S.; da Mota, L. A. C. P.; de Melo, N.; Skea, J. E. F.

    2012-09-01

    A set of Maple routines is presented, fully compatible with the new releases of Maple (14 and higher). The package deals with the numerical evolution of dynamical systems and provides flexible plotting of the results. The package also provides an initial-conditions generator, a numerical solver manager, and a set of routines that allow a closer analysis of the graphical display of the results. The novelty of the package, an optional C interface, is maintained. This allows for fast numerical integration, even for the totally inexperienced Maple user, without any C expertise being required. Finally, the package provides routines to calculate the fractal dimension of boundaries (via box counting).

    New version program summary
    Program Title: Ndynamics
    Licensing provisions: none.
    Programming language: Maple, C.
    Computer: Intel(R) Core(TM) i3 CPU M330 @ 2.13 GHz.
    Operating system: Windows 7.
    RAM: 3.0 GB
    Keywords: Dynamical systems, Box counting, Fractal dimension, Symbolic computation, Differential equations, Maple.
    Classification: 4.3.
    Catalogue identifier of previous version: ADKH_v1_0.
    Journal reference of previous version: Comput. Phys. Commun. 119 (1999) 256.
    Does the new version supersede the previous version?: Yes.
    Nature of problem: Computation and plotting of numerical solutions of dynamical systems and determination of the fractal dimension of the boundaries.
    Solution method: The default method of integration is a fifth-order Runge-Kutta scheme, but any method of integration present in the Maple system is available via an argument when calling the routine. A box-counting [1] method is used to calculate the fractal dimension [2] of the boundaries.
    Reasons for the new version: The Ndynamics package met a demand of our research community for a flexible and friendly environment for analyzing dynamical systems. All the user has to do is create his/her own Maple session with the system to be studied and use the commands in the package to (for instance) calculate the fractal dimension of a certain boundary, without knowing or worrying about a single line of C programming. The package thus combines the flexibility and friendliness of Maple with the fast and robust numerical integration of compiled code (for example, C). The package is old, but the problems it was designed to deal with are still present. Since Maple has evolved, the package stopped working, and we felt compelled to produce this version, fully compatible with the latest version of Maple, to make it available again to the Maple user.
    Summary of revisions: Deprecated Maple packages and commands: paraphrasing the Maple built-in help files, "Some Maple commands and packages are deprecated. A command (or package) is deprecated when its functionality has been replaced by an improved implementation. The newer command is said to supersede the older one, and use of the newer command is strongly recommended". We have therefore examined our code to see whether any of these occurrences could be dangerous for it. For example, the "readlib" command is unnecessary, and we have removed its occurrences from our code. We have checked and changed all the necessary commands so that the code is safe in this respect. Another change concerned the tools we implemented for performing the numerical integration in C, externally, via the Maple command "ssystem". In the past we had used the DJGPP system for the external C integration, but now we present the package with the (free) Borland distribution. The compilation commands are slightly changed: for example, to compile only, we had used "gcc -c"; now we use "bcc32 -c", etc. The Borland installation is explained in a "README" file submitted here to help the potential user.
    Restrictions: Besides the inherent restrictions of numerical integration methods, this version of the package only deals with systems of first-order differential equations.
    Unusual features: This package provides user-friendly software tools for analyzing the character of a dynamical system, whether it displays chaotic behaviour, and so on. Options within the package allow the user to specify characteristics that separate the trajectories into families of curves. In conjunction with the facilities for altering the user's viewpoint, this provides a graphical interface for the speedy and easy identification of regions with interesting dynamics. An unusual characteristic of the package is its interface for performing the numerical integrations in C using a fifth-order Runge-Kutta method (default). This potentially improves the speed of the numerical integration by some orders of magnitude and, in cases where it is necessary to calculate thousands of graphs in regions of difficult integration, this feature is very desirable. Besides that tool, more experienced users can produce their own C integrator and, using the commands available in the package, use it in place of the C integrator provided with the package, as long as the new integrator manages the input and output in the same format as the default one.
    Running time: This depends strongly on the dynamical system. With an Intel(R) Core(TM) i3 CPU M330 @ 2.13 GHz, the integration of 50 graphs, for a system of two first-order equations, typically takes less than a second (with the C integration interface). Without the C interface, it takes a few seconds. To calculate the fractal dimension, where we typically use 10,000 points to integrate, it takes 20 to 30 s with the C interface. Without the C interface it becomes really impractical, sometimes taking almost an hour for the same case; for some cases, it takes many hours.
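    As an illustrative sketch of the box-counting idea used by the package to estimate fractal dimensions of boundaries, the NumPy snippet below counts occupied grid cells at several box sizes and fits the slope of log N(eps) against log(1/eps); it is a generic demonstration, not the package's Maple/C implementation.

    ```python
    import numpy as np

    def box_counting_dimension(points, epsilons):
        """Estimate the box-counting (fractal) dimension of a 2-D point set.

        points   : (N, 2) array of boundary points
        epsilons : iterable of box sizes
        Returns the slope of log N(eps) versus log(1/eps).
        """
        points = np.asarray(points, dtype=float)
        counts = []
        for eps in epsilons:
            # Assign each point to a grid cell of side eps and count occupied cells.
            cells = np.floor(points / eps).astype(int)
            counts.append(len({tuple(c) for c in cells}))
        log_inv_eps = np.log(1.0 / np.asarray(epsilons))
        log_counts = np.log(counts)
        slope, _ = np.polyfit(log_inv_eps, log_counts, 1)
        return slope

    # Example: points on a straight segment should give a dimension close to 1.
    pts = np.column_stack([np.linspace(0.0, 1.0, 20000), np.linspace(0.0, 1.0, 20000)])
    print(box_counting_dimension(pts, epsilons=[0.1, 0.05, 0.02, 0.01, 0.005]))
    ```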

  2. Numeric analysis of terahertz wave propagation in familiar packaging materials

    NASA Astrophysics Data System (ADS)

    Zhang, Lihong; Yang, Guang

    2015-10-01

    To assess the potential application of terahertz waves in security examination, the transmission characteristics of terahertz waves in packaging materials should be studied. This paper simulates the propagation of terahertz waves in cloth and paper, studies the changes in the shape and position of the wave crest before and after passing through these materials, and derives the law governing these changes, which has potential applications in thickness measurement of thin insulating materials. It also gives the reflected and transmitted terahertz waves and computes the reflection and transmission coefficients, indicating the good transmission properties of these materials for terahertz waves and providing a theoretical basis for contactless security examination of packaged mail, packages, and people passing through important checkpoints (such as airports and stations).
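    For reference, the amplitude reflection and transmission coefficients at a single interface, assuming normal incidence and lossless, nonmagnetic media (a simplification of the layered, possibly absorbing samples considered in the paper), take the familiar Fresnel form:

        r = \frac{n_1 - n_2}{n_1 + n_2}, \qquad t = \frac{2 n_1}{n_1 + n_2}, \qquad R = r^2, \qquad T = 1 - R.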

  3. Groundwater Data Package for the 2004 Composite Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thorne, Paul D.

    2004-08-11

    This report presents data and information that support the groundwater module. The conceptual model of groundwater flow and transport at the Hanford Site is described, and specific information applied in the numerical implementation of the module is provided.

  4. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and has the advantage of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds for the estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users across different fields. Both the stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
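    As an illustrative sketch of the stationary building block behind such analyses, the snippet below fits a generalized extreme value (GEV) distribution to annual maxima with SciPy and computes a return level; it uses plain maximum likelihood rather than NEVA's Bayesian DE-MC sampler, and the data are synthetic placeholders.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic annual-maximum series (placeholder data, e.g. annual peak rainfall in mm).
    rng = np.random.default_rng(1)
    annual_max = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=60, random_state=rng)

    # Stationary GEV fit by maximum likelihood (note SciPy's shape sign convention).
    c, loc, scale = stats.genextreme.fit(annual_max)

    # Return level for a T-year return period: the (1 - 1/T) quantile of the fitted GEV.
    T = 100.0
    return_level = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc, scale)
    print(f"shape={c:.3f}  loc={loc:.1f}  scale={scale:.1f}  {T:.0f}-year return level={return_level:.1f}")
    ```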

  5. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management

    PubMed Central

    Katz, Jonathan E.

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date. PMID:29489825
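    AMModels itself is an R package that keeps models, data, and metadata together in a single object saved to an .RData file. As a rough, hypothetical analogue of that single-object idea in Python (not the AMModels API), one might bundle fitted models, datasets, and descriptive metadata in one serialized container:

    ```python
    import pickle
    from datetime import date

    # Hypothetical single-container analogue of the AMModels idea: models, data,
    # and descriptive metadata stored together so an analysis can be revisited later.
    am_library = {
        "metadata": {"project": "example AM program", "updated": date.today().isoformat()},
        "models": {"occupancy_2018": {"formula": "psi ~ habitat", "coefficients": [0.4, -1.2]}},
        "data": {"survey_2018": [[1, 0, 1], [0, 0, 1]]},
    }

    with open("am_library.pkl", "wb") as fh:
        pickle.dump(am_library, fh)      # analogous to saving a single .RData file

    with open("am_library.pkl", "rb") as fh:
        restored = pickle.load(fh)       # recall the stored knowledge later
    print(restored["metadata"])
    ```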

  6. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management.

    PubMed

    Donovan, Therese M; Katz, Jonathan E

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.

  7. 49 CFR 178.905 - Large Packaging identification codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... § 178.905 Large Packaging identification codes. Large packaging code designations consist of: two numerals specified in paragraph (a) of this section; followed by...

  8. 7 CFR 51.2927 - Marking and packing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Requirements § 51.2927 Marking and packing requirements. The minimum size or numerical count of the apricots in any package shall be plainly labeled, stenciled, or otherwise marked on the package. (a) Numerical count. When the numerical count is used the fruit in any sample shall not vary more than one-fourth inch...

  9. 7 CFR 51.2927 - Marking and packing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Requirements § 51.2927 Marking and packing requirements. The minimum size or numerical count of the apricots in any package shall be plainly labeled, stenciled, or otherwise marked on the package. (a) Numerical count. When the numerical count is used the fruit in any sample shall not vary more than one-fourth inch...

  10. Modeling, design, packing and experimental analysis of liquid-phase shear-horizontal surface acoustic wave sensors

    NASA Astrophysics Data System (ADS)

    Pollard, Thomas B

    Recent advances in microbiology, computational capabilities, and microelectromechanical-system fabrication techniques permit modeling, design, and fabrication of low-cost, miniature, sensitive and selective liquid-phase sensors and lab-on-a-chip systems. Such devices are expected to replace expensive, time-consuming, and bulky laboratory-based testing equipment. Potential applications for such devices include: fluid characterization for material science and industry; chemical analysis in medicine and pharmacology; study of biological processes; food analysis; chemical kinetics analysis; and environmental monitoring. When combined with liquid-phase packaging, sensors based on surface-acoustic-wave (SAW) technology are considered strong candidates. For this reason, such devices are the focus of this work, with emphasis placed on device modeling and packaging for liquid-phase operation. Regarding modeling, topics considered include mode excitation efficiency of transducers; mode sensitivity based on guiding structure materials/geometries; and use of new piezoelectric materials. On packaging, topics considered include package interfacing with SAW devices, and minimization of packaging effects on device performance. In this work novel numerical models are theoretically developed and implemented to study propagation and transduction characteristics of sensor designs using wave/constitutive equations, Green's functions, and boundary/finite element methods. Using developed simulation tools that consider finite thickness of all device electrodes, transduction efficiency for SAW transducers with neighboring uniform or periodic guiding electrodes is reported for the first time. Results indicate finite electrode thickness strongly affects efficiency. Using dense electrodes, efficiency is shown to approach 92% and 100% for uniform and periodic electrode guiding, respectively, yielding improved sensor detection limits. A numerical sensitivity analysis is presented targeting viscosity using uniform-electrode and shear-horizontal mode configurations on potassium-niobate, langasite, and quartz substrates. Optimum configurations are determined yielding maximum sensitivity. Results show mode propagation loss and sensitivity to viscosity are correlated by a factor independent of substrate material. The analysis is useful for designing devices meeting sensitivity and signal level requirements. A novel, rapid and precise microfluidic chamber alignment/bonding method was developed for SAW platforms. The package is shown to have little effect on device performance and permits simple macrofluidic interfacing. Lastly, prototypes were designed, fabricated, and tested for viscosity and biosensor applications; results show ability to detect as low as 1% glycerol in water and surface-bound DNA crosslinking.

  11. Scripting MODFLOW model development using Python and FloPy

    USGS Publications Warehouse

    Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.

    2016-01-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
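    As an illustrative FloPy sketch (the class names follow the flopy.modflow interface described by the authors, but the grid dimensions, parameter values, and MODFLOW-2005 executable are assumptions for demonstration), a minimal model might be built and run as follows:

    ```python
    import flopy

    # Minimal single-layer MODFLOW-2005 model built with FloPy (illustrative values only).
    m = flopy.modflow.Modflow("demo", exe_name="mf2005")          # requires a MODFLOW-2005 executable
    dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                   delr=100.0, delc=100.0, top=10.0, botm=0.0)
    bas = flopy.modflow.ModflowBas(m, ibound=1, strt=10.0)        # all cells active, starting head 10 m
    lpf = flopy.modflow.ModflowLpf(m, hk=10.0)                    # horizontal hydraulic conductivity
    wel = flopy.modflow.ModflowWel(m, stress_period_data={0: [[0, 5, 5, -500.0]]})  # pumping well
    pcg = flopy.modflow.ModflowPcg(m)                             # solver
    oc = flopy.modflow.ModflowOc(m)                               # output control

    m.write_input()             # write the MODFLOW input files
    success, _ = m.run_model()  # run the model and capture the screen output
    print("converged:", success)
    ```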

  12. Interactive visualization of numerical simulation results: A tool for mission planning and data analysis

    NASA Technical Reports Server (NTRS)

    Berchem, J.; Raeder, J.; Walker, R. J.; Ashour-Abdalla, M.

    1995-01-01

    We report on the development of an interactive system for visualizing and analyzing numerical simulation results. This system is based on visualization modules which use the Application Visualization System (AVS) and the NCAR graphics packages. Examples from recent simulations are presented to illustrate how these modules can be used for displaying and manipulating simulation results to facilitate their comparison with phenomenological model results and observations.

  13. Numerical flow analysis of axial flow compressor for steady and unsteady flow cases

    NASA Astrophysics Data System (ADS)

    Prabhudev, B. M.; Satish kumar, S.; Rajanna, D.

    2017-07-01

    The performance of a jet engine depends on the performance of its compressor. This paper gives a numerical study of the performance characteristics of an axial compressor. The test rig is located at the CSIR laboratory in Bangalore. The flow domains are meshed and the fluid dynamic equations are solved using the ANSYS package. Analysis is done for six different speeds and for operating conditions such as choke, maximum efficiency, and near stall. Different plots are compared and the results are discussed. Shock displacement, vortex flows, and leakage patterns are presented along with unsteady FFT and time-step plots.

  14. Application of the Finite Elemental Analysis to Modeling Temperature Change of the Vaccine in an Insulated Packaging Container during Transport.

    PubMed

    Ge, Changfeng; Cheng, Yujie; Shen, Yan

    2013-01-01

    This study demonstrates an attempt to predict the temperature of a perishable product such as vaccine inside an insulated packaging container during transport through finite element analysis (FEA) modeling. In order to use standard FEA software for the simulation, an equivalent heat conduction coefficient is proposed and calculated to describe the heat transfer of the air trapped inside the insulated packaging container. The three-dimensional insulated packaging container is regarded as a combination of six panels, and the heat flow at each side panel is treated as a one-dimensional diffusion process. A transient thermal analysis was applied to simulate the heat transfer from the ambient environment to the inside of the container. Field measurements were carried out to collect temperatures during transport, and the collected data were compared to the FEA simulation results. Insulated packaging containers are used to transport temperature-sensitive products such as vaccines and other pharmaceutical products. The container is usually made of extruded polystyrene foam filled with gel packs. World Health Organization guidelines recommend that all vaccines except oral polio vaccine be distributed in an environment where the temperature ranges between +2 and +8 °C. The primary areas of concern in designing the packaging for vaccine are how much foam thickness and how many gel packs should be used in order to keep the temperature in the desired range, and how to prevent the vaccine from exposure to freezing temperatures. This study uses numerical simulation to predict temperature change within an insulated packaging container in the vaccine cold chain. It is our hope that this simulation will provide the vaccine industry with an alternative engineering tool to validate vaccine packaging and project thermal equilibrium within the insulated packaging container.
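    As an illustrative sketch of the one-dimensional diffusion treatment of a single panel described above, the snippet below advances an explicit finite-difference conduction model through a foam wall; the material properties, boundary temperatures, and panel thickness are placeholder assumptions rather than the study's parameters, and the real analysis also accounts for the trapped air via an equivalent conduction coefficient.

    ```python
    import numpy as np

    # 1-D transient conduction through an insulating foam panel (explicit FTCS scheme).
    # Material values below are illustrative placeholders, not the study's parameters.
    L = 0.03                  # panel thickness [m]
    k = 0.03                  # thermal conductivity [W/m/K]
    rho, cp = 30.0, 1500.0    # density [kg/m^3], specific heat [J/kg/K]
    alpha = k / (rho * cp)    # thermal diffusivity [m^2/s]

    nx = 31
    dx = L / (nx - 1)
    dt = 0.4 * dx**2 / alpha  # satisfies the explicit stability limit dt <= dx^2 / (2 * alpha)

    T = np.full(nx, 5.0)      # initial panel temperature [deg C]
    T_outside, T_inside = 30.0, 5.0

    t, t_end = 0.0, 3600.0    # simulate one hour of transport
    while t < t_end:
        T[0] = T_outside      # ambient side (fixed temperature)
        T[-1] = T_inside      # payload side (idealized as held by the gel packs)
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        t += dt

    print("Temperature profile across the panel [deg C]:", np.round(T, 2))
    ```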

  15. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.

  16. Development and testing of a numerical simulation method for thermally nonequilibrium dissociating flows in ANSYS Fluent

    NASA Astrophysics Data System (ADS)

    Shoev, G. V.; Bondar, Ye. A.; Oblapenko, G. P.; Kustova, E. V.

    2016-03-01

    Various issues of numerical simulation of supersonic gas flows with allowance for thermochemical nonequilibrium on the basis of fluid dynamic equations in the two-temperature approximation are discussed. The computational tool for modeling flows with thermochemical nonequilibrium is the commercial software package ANSYS Fluent with an additional user-defined open-code module. A comparative analysis of results obtained by various models of vibration-dissociation coupling in binary gas mixtures of nitrogen and oxygen is performed. Results of numerical simulations are compared with available experimental data.

  17. AMModels: An R package for storing models, data, and metadata to facilitate adaptive management

    USGS Publications Warehouse

    Donovan, Therese M.; Katz, Jonathan

    2018-01-01

    Agencies are increasingly called upon to implement their natural resource management programs within an adaptive management (AM) framework. This article provides the background and motivation for the R package, AMModels. AMModels was developed under R version 3.2.2. The overall goal of AMModels is simple: To codify knowledge in the form of models and to store it, along with models generated from numerous analyses and datasets that may come our way, so that it can be used or recalled in the future. AMModels facilitates this process by storing all models and datasets in a single object that can be saved to an .RData file and routinely augmented to track changes in knowledge through time. Through this process, AMModels allows the capture, development, sharing, and use of knowledge that may help organizations achieve their mission. While AMModels was designed to facilitate adaptive management, its utility is far more general. Many R packages exist for creating and summarizing models, but to our knowledge, AMModels is the only package dedicated not to the mechanics of analysis but to organizing analysis inputs, analysis outputs, and preserving descriptive metadata. We anticipate that this package will assist users hoping to preserve the key elements of an analysis so they may be more confidently revisited at a later date.

  18. Electromagnetic Field Effects in Semiconductor Crystal Growth

    NASA Technical Reports Server (NTRS)

    Dulikravich, George S.

    1996-01-01

    This proposed two-year research project was to involve development of an analytical model, a numerical algorithm for its integration, and a software for the analysis of a solidification process under the influence of electric and magnetic fields in microgravity. Due to the complexity of the analytical model that was developed and its boundary conditions, only a preliminary version of the numerical algorithm was developed while the development of the software package was not completed.

  19. influx_s: increasing numerical stability and precision for metabolic flux analysis in isotope labelling experiments.

    PubMed

    Sokol, Serguei; Millard, Pierre; Portais, Jean-Charles

    2012-03-01

    The problem of stationary metabolic flux analysis based on isotope labelling experiments first appeared in the early 1950s and was basically solved in the early 2000s. Several algorithms and software packages are available for this problem. However, the generic stochastic algorithms (simulated annealing or evolutionary algorithms) currently used in these software packages require a lot of time to achieve acceptable precision. For deterministic algorithms, a common drawback is the lack of convergence stability for ill-conditioned systems or when started from a random point. In this article, we present a new deterministic algorithm with significantly increased numerical stability and accuracy of flux estimation compared with commonly used algorithms. It requires relatively short CPU time (from several seconds to several minutes with a standard PC architecture) to estimate fluxes in the central carbon metabolism network of Escherichia coli. The software package influx_s implementing this algorithm is distributed under an open-source licence at http://metasys.insa-toulouse.fr/software/influx/. Supplementary data are available at Bioinformatics online.

  20. Clustangles: An Open Library for Clustering Angular Data.

    PubMed

    Sargsyan, Karen; Hua, Yun Hao; Lim, Carmay

    2015-08-24

    Dihedral angles are good descriptors of the numerous conformations visited by large, flexible systems, but their analysis requires directional statistics. A single package including the various multivariate statistical methods for angular data that accounts for the distinct topology of such data does not exist. Here, we present a lightweight standalone, operating-system independent package called Clustangles to fill this gap. Clustangles will be useful in analyzing the ever-increasing number of structures in the Protein Data Bank and clustering the copious conformations from increasingly long molecular dynamics simulations.
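    As an illustrative sketch of why angular data need directional statistics (a generic NumPy example, not the Clustangles API), compare the arithmetic and circular means of dihedral angles near the 0/360 degree wrap-around:

    ```python
    import numpy as np

    def circular_mean(angles_deg):
        """Mean of angles in degrees, computed on the unit circle (directional statistics)."""
        a = np.radians(angles_deg)
        return np.degrees(np.arctan2(np.sin(a).mean(), np.cos(a).mean())) % 360.0

    # Dihedral angles clustered near 0/360 degrees: the arithmetic mean lands far away
    # at 180 degrees, while the circular mean correctly reports a value near 0 degrees.
    angles = np.array([2.0, 358.0, 5.0, 355.0])
    print("arithmetic mean:", angles.mean())       # 180.0, misleading for angular data
    print("circular mean:  ", circular_mean(angles))
    ```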

  1. Design and development of conformal antenna composite structure

    NASA Astrophysics Data System (ADS)

    Xie, Zonghong; Zhao, Wei; Zhang, Peng; Li, Xiang

    2017-09-01

    In the manufacturing process of a common smart-skin antenna, the adhesive covering the radiating elements of the antenna leads to a severe deviation of the resonant frequency, which degrades the electromagnetic performance of the antenna. In this paper, a new component called a package cover was adopted to prevent the adhesive from covering the radiating elements of the microstrip antenna array. The package cover and the microstrip antenna array were bonded together as a packaged antenna, which was then embedded into a composite sandwich structure to develop a new structure called the conformal antenna composite structure (CACS). The geometric parameters of the microstrip antenna array and the CACS were optimized with the commercial software CST Microwave Studio. According to the optimal results, the microstrip antenna array and the CACS were manufactured and tested. The experimental and numerical results for electromagnetic performance showed that the resonant frequency of the CACS was close to that of the microstrip antenna array (with an error of less than 1%) and that the CACS had a higher gain (about 2 dB) than the microstrip antenna array. The package system increased the electromagnetic radiating energy at the design frequency by nearly 66%. The numerical model generated by CST Microwave Studio in this study could successfully predict the electromagnetic performance of the microstrip antenna array and the CACS with relatively good accuracy. The mechanical analysis results showed that the CACS had better flexural properties than the composite sandwich structure without the embedded packaged antenna. The comparison of the electromagnetic performance of the CACS and the MECSSA showed that the package system was useful and effective.

  2. Application of Elements of Numerical Methods in the Analysis of Journal Bearings in AC Induction Motors: An Industry Case Study

    ERIC Educational Resources Information Center

    Ahrens, Fred; Mistry, Rajendra

    2005-01-01

    In product engineering there often arise design analysis problems for which a commercial software package is either unavailable or cost prohibitive. Further, these calculations often require successive iterations that can be time intensive when performed by hand, thus development of a software application is indicated. This case relates to the…

  3. TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.

    PubMed

    Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han

    2017-03-01

    High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline, and are difficult to integrate appropriately with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.

  4. methylPipe and compEpiTools: a suite of R packages for the integrative analysis of epigenomics data.

    PubMed

    Kishore, Kamal; de Pretis, Stefano; Lister, Ryan; Morelli, Marco J; Bianchi, Valerio; Amati, Bruno; Ecker, Joseph R; Pelizzola, Mattia

    2015-09-29

    Numerous methods are available to profile several epigenetic marks, providing data with different genome coverage and resolution. Large epigenomic datasets are then generated, and often combined with other high-throughput data, including RNA-seq, ChIP-seq for transcription factors (TFs) binding and DNase-seq experiments. Despite the numerous computational tools covering specific steps in the analysis of large-scale epigenomics data, comprehensive software solutions for their integrative analysis are still missing. Multiple tools must be identified and combined to jointly analyze histone marks, TFs binding and other -omics data together with DNA methylation data, complicating the analysis of these data and their integration with publicly available datasets. To overcome the burden of integrating various data types with multiple tools, we developed two companion R/Bioconductor packages. The former, methylPipe, is tailored to the analysis of high- or low-resolution DNA methylomes in several species, accommodating (hydroxy-)methyl-cytosines in both CpG and non-CpG sequence context. The analysis of multiple whole-genome bisulfite sequencing experiments is supported, while maintaining the ability of integrating targeted genomic data. The latter, compEpiTools, seamlessly incorporates the results obtained with methylPipe and supports their integration with other epigenomics data. It provides a number of methods to score these data in regions of interest, leading to the identification of enhancers, lncRNAs, and RNAPII stalling/elongation dynamics. Moreover, it allows a fast and comprehensive annotation of the resulting genomic regions, and the association of the corresponding genes with non-redundant GeneOntology terms. Finally, the package includes a flexible method based on heatmaps for the integration of various data types, combining annotation tracks with continuous or categorical data tracks. methylPipe and compEpiTools provide a comprehensive Bioconductor-compliant solution for the integrative analysis of heterogeneous epigenomics data. These packages are instrumental in providing biologists with minimal R skills a complete toolkit facilitating the analysis of their own data, or in accelerating the analyses performed by more experienced bioinformaticians.

  5. Scripting MODFLOW Model Development Using Python and FloPy.

    PubMed

    Bakker, M; Post, V; Langevin, C D; Hughes, J D; White, J T; Starn, J J; Fienen, M N

    2016-09-01

    Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy. © 2016, National Ground Water Association.

  6. MEANS: python package for Moment Expansion Approximation, iNference and Simulation

    PubMed Central

    Fan, Sisi; Geissmann, Quentin; Lakatos, Eszter; Lukauskas, Saulius; Ale, Angelique; Babtie, Ann C.; Kirk, Paul D. W.; Stumpf, Michael P. H.

    2016-01-01

    Motivation: Many biochemical systems require stochastic descriptions. Unfortunately these can only be solved for the simplest cases and their direct simulation can become prohibitively expensive, precluding thorough analysis. As an alternative, moment closure approximation methods generate equations for the time-evolution of the system’s moments and apply a closure ansatz to obtain a closed set of differential equations; that can become the basis for the deterministic analysis of the moments of the outputs of stochastic systems. Results: We present a free, user-friendly tool implementing an efficient moment expansion approximation with parametric closures that integrates well with the IPython interactive environment. Our package enables the analysis of complex stochastic systems without any constraints on the number of species and moments studied and the type of rate laws in the system. In addition to the approximation method our package provides numerous tools to help non-expert users in stochastic analysis. Availability and implementation: https://github.com/theosysbio/means Contacts: m.stumpf@imperial.ac.uk or e.lakatos13@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153663

  7. MEANS: python package for Moment Expansion Approximation, iNference and Simulation.

    PubMed

    Fan, Sisi; Geissmann, Quentin; Lakatos, Eszter; Lukauskas, Saulius; Ale, Angelique; Babtie, Ann C; Kirk, Paul D W; Stumpf, Michael P H

    2016-09-15

    Many biochemical systems require stochastic descriptions. Unfortunately these can only be solved for the simplest cases and their direct simulation can become prohibitively expensive, precluding thorough analysis. As an alternative, moment closure approximation methods generate equations for the time-evolution of the system's moments and apply a closure ansatz to obtain a closed set of differential equations; that can become the basis for the deterministic analysis of the moments of the outputs of stochastic systems. We present a free, user-friendly tool implementing an efficient moment expansion approximation with parametric closures that integrates well with the IPython interactive environment. Our package enables the analysis of complex stochastic systems without any constraints on the number of species and moments studied and the type of rate laws in the system. In addition to the approximation method our package provides numerous tools to help non-expert users in stochastic analysis. https://github.com/theosysbio/means m.stumpf@imperial.ac.uk or e.lakatos13@imperial.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  8. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  9. ATHENA, ARTEMIS, HEPHAESTUS: data analysis for X-ray absorption spectroscopy using IFEFFIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravel, B.; Newville, M.; UC)

    2010-07-20

    A software package for the analysis of X-ray absorption spectroscopy (XAS) data is presented. This package is based on the IFEFFIT library of numerical and XAS algorithms and is written in the Perl programming language using the Perl/Tk graphics toolkit. The programs described here are: (i) ATHENA, a program for XAS data processing, (ii) ARTEMIS, a program for EXAFS data analysis using theoretical standards from FEFF and (iii) HEPHAESTUS, a collection of beamline utilities based on tables of atomic absorption data. These programs enable high-quality data analysis that is accessible to novices while still powerful enough to meet the demands of an expert practitioner. The programs run on all major computer platforms and are freely available under the terms of a free software license.

  10. Development and validation of a numerical acoustic analysis program for aircraft interior noise prediction

    NASA Astrophysics Data System (ADS)

    Garcea, Ralph; Leigh, Barry; Wong, R. L. M.

    Reduction of interior noise in propeller-driven aircraft, to levels comparable with those obtained in jet transports, has become a leading factor in the early design stages of the new generation of turboprops, and may be essential if these new designs are to succeed. The need for an analytical capability to predict interior noise is accepted throughout the turboprop aircraft industry. To this end, an analytical noise prediction program, which incorporates the SYSNOISE numerical acoustic analysis software, is under development at de Havilland. The discussion contained herein looks at the development program and how it was used in a design sensitivity analysis to optimize the structural design of the aircraft cabin for the purpose of reducing interior noise levels. This report also summarizes the validation of the SYSNOISE package using numerous classical cases from the literature.

  11. CRRES microelectronics package flight data analysis

    NASA Technical Reports Server (NTRS)

    Stassinopoulos, E. G.; Brucker, G. J.; Stauffer, C. A.

    1993-01-01

    A detailed in-depth analysis was performed on the data from some of the CRRES MEP (Microelectronics Package) devices. These space flight measurements covered a period of about fourteen months of mission lifetime. Several types of invalid data were identified and corrections were made. Other problems were noted and adjustments applied, as necessary. Particularly important and surprising were observations of abnormal device behavior in many parts that could neither be explained nor correlated to causative events. Also, contrary to prevailing theory, proton effects appeared to be far more significant and numerous than cosmic ray effects. Another unexpected result was the realization that only nine out of thirty-two p-MOS dosimeters on the MEP indicated a valid operation. Comments, conclusions, and recommendations are given.

  12. Numerical nonlinear inelastic analysis of stiffened shells of revolution. Volume 4: Satellite-1P program for STARS-2P digital computer program

    NASA Technical Reports Server (NTRS)

    Svalbonas, V.; Ogilvie, P.

    1975-01-01

    A special data debugging package called SAT-1P created for the STARS-2P computer program is described. The program was written exclusively in FORTRAN 4 for the IBM 370-165 computer, and then converted to the UNIVAC 1108.

  13. A Database of Herbaceous Vegetation Responses to Elevated Atmospheric CO2 (NDP-073)

    DOE Data Explorer

    Jones, Michael H [The Ohio State Univ., Columbus, OH (United States); Curtis, Peter S [The Ohio State Univ., Columbus, OH (United States); Cushman, Robert M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brenkert, Antoinette L [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    1999-01-01

    To perform a statistically rigorous meta-analysis of research results on the response by herbaceous vegetation to increased atmospheric CO2 levels, a multiparameter database of responses was compiled from the published literature. Seventy-eight independent CO2-enrichment studies, covering 53 species and 26 response parameters, reported mean response, sample size, and variance of the response (either as standard deviation or standard error). An additional 43 studies, covering 25 species and 6 response parameters, did not report variances. This numeric data package accompanies the Carbon Dioxide Information Analysis Center's (CDIAC's) NDP-072, which provides similar information for woody vegetation. This numeric data package contains a 30-field data set of CO2-exposure experiment responses by herbaceous plants (as both a flat ASCII file and a spreadsheet file), files listing the references to the CO2-exposure experiments and specific comments relevant to the data in the data sets, and this documentation file (which includes SAS and Fortran codes to read the ASCII data file; SAS is a registered trademark of the SAS Institute, Inc., Cary, North Carolina 27511).

  14. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used in both teaching and research. These packages include proprietary (licensed) and open-source (non-proprietary) software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems), and the number of variables in linear and non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects related to method, didactics, and creative praxis in the teaching of linear equations in higher education. If implemented, this approach could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative, low-cost computing environment. In this paper, Scilab activities related to the mathematical models were proposed. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan elimination, matrix inversion, and lower-upper (LU) decomposition. The results of this study showed that routines for these numerical methods could be created and explored using Scilab procedures and then exploited as teaching material for the course.
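
    The following Python sketch mirrors the kind of exercise described above (it is illustrative only and is not the Scilab teaching material from the paper): a small linear system is solved by naive Gaussian elimination and by LU decomposition, and both answers are checked against a library solver.

        # Hedged sketch: solving A x = b with two of the four methods mentioned,
        # naive Gaussian elimination and LU decomposition, checked against
        # numpy.linalg.solve.
        import numpy as np
        from scipy.linalg import lu_factor, lu_solve

        A = np.array([[4.0, -2.0,  1.0],
                      [3.0,  6.0, -4.0],
                      [2.0,  1.0,  8.0]])
        b = np.array([12.0, -25.0, 32.0])

        def gauss_eliminate(A, b):
            """Forward elimination with partial pivoting, then back substitution."""
            A = A.astype(float).copy()
            b = b.astype(float).copy()
            n = len(b)
            for k in range(n - 1):
                p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
                A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]   # swap rows k and p
                for i in range(k + 1, n):
                    f = A[i, k] / A[k, k]
                    A[i, k:] -= f * A[k, k:]
                    b[i] -= f * b[k]
            x = np.zeros(n)
            for i in range(n - 1, -1, -1):
                x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
            return x

        x_ge = gauss_eliminate(A, b)
        x_lu = lu_solve(lu_factor(A), b)        # LU decomposition route
        assert np.allclose(x_ge, np.linalg.solve(A, b))
        assert np.allclose(x_lu, np.linalg.solve(A, b))
        print(x_ge)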

  15. The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.

    PubMed

    Mather, L E; Austin, K L

    1983-01-01

    Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.

  16. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    PubMed

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field. The software package is a step towards harmonization and standardization. Embedded functionalities make it a suitable tool for education, for research, and for obtaining remote expert opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies through selected teaching case studies. The software facilitates a better understanding by letting users explore different variables and settings and their effect on the numerical results. An effort was made to introduce quality assurance instruments at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the postmicturition studies. The user can calculate the differential renal function with two independent methods, the integral and Rutland-Patlak approaches. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and interuser reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
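
    A rough Python analogue of two of the SaSAT steps, Latin hypercube sampling of parameter space followed by a rank-correlation sensitivity measure, is sketched below (SaSAT itself is a Matlab toolbox; the parameter ranges and toy model here are assumptions for illustration).

        # Latin hypercube sampling of a 3-parameter space plus a Spearman
        # rank-correlation sensitivity measure of each parameter against the output.
        import numpy as np
        from scipy.stats import qmc, spearmanr

        sampler = qmc.LatinHypercube(d=3, seed=1)
        unit = sampler.random(n=500)                          # samples in [0, 1)^3
        lo, hi = [0.1, 0.0, 1.0], [0.5, 2.0, 5.0]             # assumed parameter ranges
        params = qmc.scale(unit, lo, hi)

        def model(p):
            """Toy model standing in for an epidemic simulator."""
            beta, gamma, n0 = p
            return n0 * beta / (gamma + 0.1)

        output = np.apply_along_axis(model, 1, params)
        for i, name in enumerate(["beta", "gamma", "n0"]):
            rho, _ = spearmanr(params[:, i], output)
            print(f"{name}: Spearman rank correlation = {rho:+.2f}")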

  18. Numerical Simulation of Cast Distortion in Gas Turbine Engine Components

    NASA Astrophysics Data System (ADS)

    Inozemtsev, A. A.; Dubrovskaya, A. S.; Dongauser, K. A.; Trufanov, N. A.

    2015-06-01

    In this paper, the manufacture of multiple airfoil vanes through investment casting is considered. A mathematical model of the full contact problem is built to determine the stress-strain state in a casting during solidification. The studies are carried out in a viscoelastoplastic formulation. Numerical simulation of the process is implemented with the ProCAST software package. The results of the simulation are compared with the real production process. By means of computer analysis, the technological process parameters are optimized in order to eliminate the defect of cast wall thickness variation.

  19. Software package for modeling spin–orbit motion in storage rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.

  20. seawaveQ: an R package providing a model and utilities for analyzing trends in chemical concentrations in streams with a seasonal wave (seawave) and adjustment for streamflow (Q) and other ancillary variables

    USGS Publications Warehouse

    Ryberg, Karen R.; Vecchia, Aldo V.

    2013-01-01

    The seawaveQ R package fits a parametric regression model (seawaveQ) to pesticide concentration data from streamwater samples to assess variability and trends. The model incorporates the strong seasonality and high degree of censoring common in pesticide data and users can incorporate numerous ancillary variables, such as streamflow anomalies. The model is fitted to pesticide data using maximum likelihood methods for censored data and is robust in terms of pesticide, stream location, and degree of censoring of the concentration data. This R package standardizes this methodology for trend analysis, documents the code, and provides help and tutorial information, as well as providing additional utility functions for plotting pesticide and other chemical concentration data.
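
    The sketch below illustrates, in simplified form, the kind of model seawaveQ fits; it is not the seawaveQ code. A seasonal sine-cosine pair plus a streamflow-anomaly term is fitted to synthetic log-concentration data by maximum likelihood, with observations below an assumed detection limit handled through a censored (Tobit-style) likelihood.

        # Simplified, hypothetical censored seasonal regression sketch.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        t = rng.uniform(0, 5, 200)                       # decimal years
        flow = rng.normal(0, 1, 200)                     # streamflow anomaly (assumed)
        true = -1.0 + 0.8*np.sin(2*np.pi*t) + 0.3*np.cos(2*np.pi*t) + 0.5*flow
        y = true + rng.normal(0, 0.4, 200)               # log concentration
        limit = -1.2                                     # detection limit (log units)
        censored = y < limit
        y_obs = np.where(censored, limit, y)             # censored values reported as the limit

        def negloglik(theta):
            b0, b1, b2, b3, log_sigma = theta
            sigma = np.exp(log_sigma)
            mu = b0 + b1*np.sin(2*np.pi*t) + b2*np.cos(2*np.pi*t) + b3*flow
            ll_unc = norm.logpdf(y_obs[~censored], mu[~censored], sigma)
            ll_cen = norm.logcdf((limit - mu[censored]) / sigma)   # P(value below the limit)
            return -(ll_unc.sum() + ll_cen.sum())

        fit = minimize(negloglik, x0=np.zeros(5), method="Nelder-Mead",
                       options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
        print("estimated coefficients:", np.round(fit.x[:4], 2))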

  1. USSAERO version D computer program development using ANSI standard FORTRAN 77 and DI-3000 graphics

    NASA Technical Reports Server (NTRS)

    Wiese, M. R.

    1986-01-01

    The D version of the Unified Subsonic Supersonic Aerodynamic Analysis (USSAERO) program is the result of numerous modifications and enhancements to the B01 version. These changes include conversion to ANSI standard FORTRAN 77; use of the DI-3000 graphics package; removal of the overlay structure; a revised input format; the addition of an input data analysis routine; and increasing the number of aeronautical components allowed.

  2. Applications of magnetic resonance image segmentation in neurology

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Lahtinen, Antti J.; Dastidar, Prasun; Ryymin, Pertti; Laarne, Paeivi; Malmivuo, Jaakko; Laasonen, Erkki; Frey, Harry; Eskola, Hannu

    1999-05-01

    After the introduction of digital imaging devices in medicine, computerized tissue recognition and classification have become important in research and clinical applications. Segmented data can be applied in numerous research fields, including volumetric analysis of particular tissues and structures, construction of anatomical models, 3D visualization, and multimodal visualization, hence making segmentation essential in modern image analysis. In this research project, several PC-based software packages were developed in order to segment medical images, to visualize raw and segmented images in 3D, and to produce EEG brain maps in which MR images and EEG signals were integrated. The software package was tested and validated in numerous clinical research projects in a hospital environment.

  3. Variational data assimilation system "INM RAS - Black Sea"

    NASA Astrophysics Data System (ADS)

    Parmuzin, Eugene; Agoshkov, Valery; Assovskiy, Maksim; Giniatulin, Sergey; Zakharova, Natalia; Kuimov, Grigory; Fomin, Vladimir

    2013-04-01

    The development of informational-computational systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results and recent developments from several disciplines: mathematical modeling; the theory of adjoint equations and optimal control; inverse problems; the theory of numerical methods; numerical algebra; and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in the context of ICS for personal computers (PC). Special problems and questions arise while effective PC versions of an ICS are being developed. They can be solved by applying modern methods of numerical mathematics and by addressing the "parallelism problem" using OpenMP technology and special linear algebra packages. In this work, results on the development of the PC-based ICS "INM RAS - Black Sea" are presented. The following problems and questions are discussed: practical problems that can be studied with the ICS; parallelism problems and their solution using OpenMP technology and the linear algebra packages used in ICS "INM - Black Sea"; and the interface of the ICS. The results of testing ICS "INM RAS - Black Sea" are presented, and the efficiency of the technologies and methods applied is discussed. The work was supported by RFBR, grants No. 13-01-00753 and 13-05-00715, and by the Ministry of Education and Science of the Russian Federation, projects 8291 and 11.519.11.1005. References: [1] V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, 5-31. [2] E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, 69-94. [3] V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, 95-111. [4] V.I. Agoshkov, S.V. Giniatulin, G.V. Kuimov, OpenMP technology and linear algebra packages in the variation data assimilation systems. Abstracts of the 1st China-Russia Conference on Numerical Algebra with Applications in Radiactive Hydrodynamics, Beijing, China, October 16-18, 2012. [5] N.B. Zakharova, V.I. Agoshkov, E.I. Parmuzin, The new method of ARGO buoys system observation data interpolation. Russian Journal of Numerical Analysis and Mathematical Modelling (2013) 28, Issue 1.

  4. Improved chip design for integrated solid-phase microextraction in on-line proteomic sample preparation.

    PubMed

    Bergkvist, Jonas; Ekström, Simon; Wallman, Lars; Löfgren, Mikael; Marko-Varga, György; Nilsson, Johan; Laurell, Thomas

    2002-04-01

    A recently introduced silicon microextraction chip (SMEC), used for on-line proteomic sample preparation, has proved to facilitate the process of protein identification by sample clean-up and enrichment of peptides. It is demonstrated that a novel grid-SMEC design improves the operating characteristics for solid-phase microextraction by reducing dispersion effects and thereby improving the sample preparation conditions. The structures investigated in this paper are treated both numerically and experimentally. The numerical approach is based on finite element analysis of the microfluidic flow in the microchip. The analysis is accomplished by use of the computational fluid dynamics module FLOTRAN in the ANSYS software package. The modeling and analysis of the previously reported weir-SMEC design indicate several severe drawbacks that can be reduced by changing the microextraction chip geometry to the grid-SMEC design. The overall analytical performance was thereby improved and also verified by experimental work. Matrix-assisted laser desorption/ionization mass spectra of model peptides extracted from both the weir-SMEC and the new grid-SMEC support the numerical analysis results. Further use of numerical modeling and analysis of the SMEC structures is also discussed and suggested in this work.

  5. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    NASA Astrophysics Data System (ADS)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.

  6. Biocidal packaging for pharmaceuticals, foods, and other perishables.

    PubMed

    Larson, Alyssa M; Klibanov, Alexander M

    2013-01-01

    Many consumer goods must be protected from bacterial and fungal colonization to ensure their integrity and safety. By making these items' packaging biocidal, the interior environment can be preserved from microbial spoilage without altering the products themselves. Herein we briefly review this concept, referred to as active packaging, and discuss existing methods for constructing active packaging systems. They are based on either packaging materials that release biocides or those that are themselves intrinsically biocidal (or biostatic), with numerous variations within each category.

  7. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).
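
    The harmonic normal coordinate analysis step can be summarized in a few lines of Python (this is a generic sketch, not PyVCI's API): mass-weight the Hessian, diagonalize it, and read harmonic frequencies off the eigenvalues. Units are left abstract here.

        import numpy as np

        def normal_modes(hessian, masses_per_dof):
            """Harmonic analysis: mass-weight the Hessian, diagonalize, return
            frequencies (square roots of the eigenvalues) and Cartesian mode vectors."""
            inv_sqrt_m = 1.0 / np.sqrt(np.asarray(masses_per_dof, dtype=float))
            h_mw = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)   # M^-1/2 H M^-1/2
            eigval, eigvec = np.linalg.eigh(h_mw)
            freqs = np.sqrt(np.clip(eigval, 0.0, None))         # clip tiny negative noise
            modes = inv_sqrt_m[:, None] * eigvec                 # back to Cartesian displacements
            return freqs, modes

        # toy 1-D "diatomic": two masses joined by a spring with force constant k.
        # One frequency is ~0 (free translation); the other is sqrt(k*(1/m1 + 1/m2)).
        k, m1, m2 = 1.0, 1.0, 2.0
        H = np.array([[ k, -k],
                      [-k,  k]])
        freqs, modes = normal_modes(H, [m1, m2])
        print("harmonic frequencies:", freqs)      # approx [0.0, 1.2247]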

  8. Meteorological data-processing package

    NASA Technical Reports Server (NTRS)

    Billingsly, J. B.; Braken, P. A.

    1979-01-01

    METPAK, a meteorological data-processing package for satellite data used to develop cloud-tracking maps, is described. The data can be used to develop and enhance numerical prediction models for mesoscale phenomena and to improve the ability to detect and predict storms.

  9. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    PubMed

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphical user interface to Bioconductor packages, none has implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis caused by the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  10. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    PubMed

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (csv) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
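
    A minimal illustration of centrality analysis in code is given below; it uses the Python networkx library rather than the centiserve R package or the CentiServer web application, and writes a CSV output similar in spirit to the one described above.

        import csv
        import networkx as nx

        G = nx.karate_club_graph()                  # example network bundled with networkx

        centralities = {
            "degree": nx.degree_centrality(G),
            "betweenness": nx.betweenness_centrality(G),
            "closeness": nx.closeness_centrality(G),
            "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
        }

        # write one row per node with the selected centrality indices
        with open("centralities.csv", "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["node"] + list(centralities))
            for node in G.nodes():
                writer.writerow([node] + [round(centralities[m][node], 4) for m in centralities])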

  11. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis

    PubMed Central

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (csv) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275

  12. Prompt gamma neutron activation analysis of toxic elements in radioactive waste packages.

    PubMed

    Ma, J-L; Carasco, C; Perot, B; Mauerhofer, E; Kettler, J; Havenith, A

    2012-07-01

    The French Alternative Energies and Atomic Energy Commission (CEA) and National Radioactive Waste Management Agency (ANDRA) are conducting an R&D program to improve the characterization of long-lived and medium activity (LL-MA) radioactive waste packages. In particular, the amount of toxic elements present in radioactive waste packages must be assessed before they can be accepted in repository facilities in order to avoid pollution of underground water reserves. To this aim, the Nuclear Measurement Laboratory of CEA-Cadarache has started to study the performances of Prompt Gamma Neutron Activation Analysis (PGNAA) for elements showing large capture cross sections such as mercury, cadmium, boron, and chromium. This paper reports a comparison between Monte Carlo calculations performed with the MCNPX computer code using the ENDF/B-VII.0 library and experimental gamma rays measured in the REGAIN PGNAA cell with small samples of nickel, lead, cadmium, arsenic, antimony, chromium, magnesium, zinc, boron, and lithium to verify the validity of a numerical model and gamma-ray production data. The measurement of a ∼20kg test sample of concrete containing toxic elements has also been performed, in collaboration with Forschungszentrum Jülich, to validate the model in view of future performance studies for dense and large LL-MA waste packages. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. The Cooperative VAS Program with the Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Menzel, W. Paul

    1988-01-01

    Work was divided between the analysis/forecast model development and evaluation of the impact of satellite data in mesoscale numerical weather prediction (NWP), development of the Multispectral Atmospheric Mapping Sensor (MAMS), and other related research. The Cooperative Institute for Meteorological Satellite Studies (CIMSS) Synoptic Scale Model (SSM) has progressed from a relatively basic analysis/forecast system to a package which includes such features as nonlinear vertical mode initialization, comprehensive Planetary Boundary Layer (PBL) physics, and the core of a fully four-dimensional data assimilation package. The MAMS effort has produced a calibrated visible and infrared sensor that produces imagery at high spatial resolution. The MAMS was developed in order to study small scale atmospheric moisture variability, to monitor and classify clouds, and to investigate the role of surface characteristics in the production of clouds, precipitation, and severe storms.

  14. CSOLNP: Numerical Optimization Engine for Solving Non-linearly Constrained Problems.

    PubMed

    Zahery, Mahsa; Maes, Hermine H; Neale, Michael C

    2017-08-01

    We introduce the optimizer CSOLNP, which is a C++ implementation of the R package RSOLNP (Ghalanos & Theussl, 2012, Rsolnp: General non-linear optimization using augmented Lagrange multiplier method. R package version, 1) alongside some improvements. CSOLNP solves non-linearly constrained optimization problems using a Sequential Quadratic Programming (SQP) algorithm. CSOLNP, NPSOL (a very popular implementation of SQP method in FORTRAN (Gill et al., 1986, User's guide for NPSOL (version 4.0): A Fortran package for nonlinear programming (No. SOL-86-2). Stanford, CA: Stanford University Systems Optimization Laboratory), and SLSQP (another SQP implementation available as part of the NLOPT collection (Johnson, 2014, The NLopt nonlinear-optimization package. Retrieved from http://ab-initio.mit.edu/nlopt)) are three optimizers available in OpenMx package. These optimizers are compared in terms of runtimes, final objective values, and memory consumption. A Monte Carlo analysis of the performance of the optimizers was performed on ordinal and continuous models with five variables and one or two factors. While the relative difference between the objective values is less than 0.5%, CSOLNP is in general faster than NPSOL and SLSQP for ordinal analysis. As for continuous data, none of the optimizers performs consistently faster than the others. In terms of memory usage, we used Valgrind's heap profiler tool, called Massif, on one-factor threshold models. CSOLNP and NPSOL consume the same amount of memory, while SLSQP uses 71 MB more memory than the other two optimizers.
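
    The flavor of a non-linearly constrained problem handled by an SQP optimizer can be shown with SciPy's SLSQP (one of the three optimizers named above); the objective and constraint below are toy examples, not an OpenMx model.

        import numpy as np
        from scipy.optimize import minimize

        def objective(x):
            # quadratic objective; the nonlinear constraint below makes the
            # unconstrained minimum (1, 2.5) infeasible
            return (x[0] - 1.0)**2 + (x[1] - 2.5)**2

        constraints = [
            # stay inside a disk of radius 2: 4 - x0^2 - x1^2 >= 0
            {"type": "ineq", "fun": lambda x: 4.0 - x[0]**2 - x[1]**2},
        ]

        res = minimize(objective, x0=[0.5, 0.5], method="SLSQP", constraints=constraints)
        print("solution:", res.x)        # lies on the circle, roughly (0.74, 1.86)
        print("objective:", res.fun)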

  15. Spectral Analysis within the Virtual Observatory: The GAVO Service TheoSSA

    NASA Astrophysics Data System (ADS)

    Ringat, E.

    2012-03-01

    In the last decade, numerous Virtual Observatory organizations were established. One of these is the German Astrophysical Virtual Observatory (GAVO) that e.g. provides access to spectral energy distributions via the service TheoSSA. In a pilot phase, these are based on the Tübingen NLTE Model-Atmosphere Package (TMAP) and suitable for hot, compact stars. We demonstrate the power of TheoSSA in an application to the sdOB primary of AA Doradus by comparison with a “classical” spectral analysis.

  16. High Resolution Aerospace Applications using the NASA Columbia Supercomputer

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.; Aftosmis, Michael J.; Berger, Marsha

    2005-01-01

    This paper focuses on the parallel performance of two high-performance aerodynamic simulation packages on the newly installed NASA Columbia supercomputer. These packages include both a high-fidelity, unstructured, Reynolds-averaged Navier-Stokes solver, and a fully-automated inviscid flow package for cut-cell Cartesian grids. The complementary combination of these two simulation codes enables high-fidelity characterization of aerospace vehicle design performance over the entire flight envelope through extensive parametric analysis and detailed simulation of critical regions of the flight envelope. Both packages are industrial-level codes designed for complex geometry and incorporate customized multigrid solution algorithms. The performance of these codes on Columbia is examined using both MPI and OpenMP and using both the NUMAlink and InfiniBand interconnect fabrics. Numerical results demonstrate good scalability on up to 2016 CPUs using the NUMAlink4 interconnect, with measured computational rates in the vicinity of 3 TFLOP/s, while InfiniBand showed some performance degradation at high CPU counts, particularly with multigrid. Nonetheless, the results are encouraging enough to indicate that larger test cases using combined MPI/OpenMP communication should scale well on even more processors.

  17. 75 FR 15440 - Guidance for Industry on Standards for Securing the Drug Supply Chain-Standardized Numerical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ...] Guidance for Industry on Standards for Securing the Drug Supply Chain--Standardized Numerical... industry entitled ``Standards for Securing the Drug Supply Chain-Standardized Numerical Identification for... the Drug Supply Chain-Standardized Numerical Identification for Prescription Drug Packages.'' In the...

  18. The Evolution of Alternative Rural Development Strategies in Ethiopia; Implications for Employment and Income Distribution. African Rural Employment Paper No. 12.

    ERIC Educational Resources Information Center

    Tecle, Tesfai

    As Ethiopia has designed and implemented numerous intensive (geographically concentrated) and minimum-package rural development programs between 1967-75, the purpose of this monograph is to: (1) trace the evolution of these package projects; (2) analyze package performances; and (3) identify the implications for Ethiopian planners and policy…

  19. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable or able to deal with case studies involving spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the propagation of uncertainty from input data and model parameters, via the environmental model, to the model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation, the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, the model parameters, and the output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied in multi-disciplinary research and model-based decision support.
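
    A non-spatial Monte Carlo propagation step analogous to the one spup performs can be sketched in a few lines (spup itself is an R package and additionally handles spatial auto-correlation; the means, standard deviations, correlation and toy model below are assumed values).

        # Two cross-correlated uncertain inputs are sampled and pushed through a model.
        import numpy as np

        rng = np.random.default_rng(42)
        n_mc = 10_000

        mean = np.array([2.0, 5.0])                 # means of the two uncertain inputs
        sd = np.array([0.3, 1.0])
        rho = 0.6                                   # assumed cross-correlation
        cov = np.array([[sd[0]**2,        rho*sd[0]*sd[1]],
                        [rho*sd[0]*sd[1], sd[1]**2       ]])

        samples = rng.multivariate_normal(mean, cov, size=n_mc)

        def model(a, b):
            """Toy environmental model standing in for the user's model."""
            return a * np.sqrt(np.clip(b, 0.0, None))

        out = model(samples[:, 0], samples[:, 1])
        print("prediction mean:", out.mean())
        print("95% interval:", np.percentile(out, [2.5, 97.5]))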

  20. Computer Modeling of the Dynamic Strength of Metal-Plastic Cylindrical Shells Under Explosive Loading

    NASA Astrophysics Data System (ADS)

    Abrosimov, N. A.; Novosel'tseva, N. A.

    2017-05-01

    A technique for numerically analyzing the dynamic strength of two-layer metal-plastic cylindrical shells under an axisymmetric internal explosive loading is developed. The kinematic deformation model of the layered package is based on a nonclassical theory of shells. The geometric relations are constructed using the relations of the simplest quadratic version of nonlinear elasticity theory. The stress and strain tensors in the composite macrolayer are related by Hooke's law for an orthotropic body, with account of the degradation of the stiffness characteristics of the multilayer package due to local failure of some of its elementary layers. The physical relations in the metal layer are formulated in terms of a differential theory of plasticity. An energy-correlated resolving system of dynamic equations for the metal-plastic cylindrical shells is derived by minimizing the functional of the total energy of the shells as three-dimensional bodies. The numerical method for solving the formulated initial boundary-value problem is based on an explicit variational-difference scheme. The reliability of the technique is verified by comparing numerical results with experimental data. An analysis of the ultimate strains and strength of one-layer basalt- and glass-fiber-reinforced plastic and two-layer metal-plastic cylindrical shells is carried out.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bastian, Mark; Trigueros, Jose V.

    Phoenix is a Java Virtual Machine (JVM) based library for performing mathematical and astrodynamics calculations. It consists of two primary sub-modules, phoenix-math and phoenix-astrodynamics. The mathematics package has a variety of mathematical classes for performing 3D transformations, geometric reasoning, and numerical analysis. The astrodynamics package has various classes and methods for computing locations, attitudes, accesses, and other values useful for general satellite modeling and simulation. Methods for computing celestial locations, such as the location of the Sun and Moon, are also included. Phoenix is meant to be used as a library within the context of a larger application. For example, it could be used for a web service, desktop client, or to compute simple values in a scripting environment.

  2. An Analysis of the Joint Modular Intermodal Distribution System

    DTIC Science & Technology

    2007-06-01

    the differing airframes. “Two methods are available to move a CROP-load of ammunition: 1. Reconfigure the load from the CROP onto multiple 463L...used among the services lack: • Transportability across different modes without re-handling/packaging • Quick reconfiguration for onward movement...numerous linkages among different channels of distribution. In the world of integrated logistics, that means that ground, rail, air, and sea modes of

  3. Comparing antibiotic consumption between two European countries: are packages an adequate surrogate for prescriptions?

    PubMed

    Watier, Laurence; Cavalié, Philippe; Coignard, Bruno; Brun-Buisson, Christian

    2017-11-01

    Defined daily doses (DDD) are the gold standard indicator for quantifying prescriptions. Since 2014, the European Centre for Disease Prevention and Control (ECDC) has also been using the number of packages per 1,000 inhabitants per day (ipd), as a surrogate for prescriptions, to report antibiotic consumption in the community and to perform comparisons between European Union (EU) countries participating in the European Surveillance of Antimicrobial Consumption Network (ESAC-Net). In 2015, consumption was reported to range across Europe from 1.0 to 4.7 packages per 1,000 ipd. Our analysis showed that consumption of antibiotics for systemic use per 1,000 ipd was on average 1.3 times greater in France than in Belgium when considering prescriptions in the numerator, 2.5 times greater when considering packages and 1.2 times greater when considering DDD. As long as the same metrics are used over time, antibiotic consumption data aggregated and disseminated by ECDC are useful for assessing temporal trends at the European level and within individual countries; these data may also be used for benchmarking across EU countries. While DDD - although imperfect - are the most widely accepted metric for this purpose, antibiotic packages do not appear suitable for comparisons between countries and may be misleading.

  4. Comparing antibiotic consumption between two European countries: are packages an adequate surrogate for prescriptions?

    PubMed Central

    Watier, Laurence; Cavalié, Philippe; Coignard, Bruno; Brun-Buisson, Christian

    2017-01-01

    Defined daily doses (DDD) are the gold standard indicator for quantifying prescriptions. Since 2014, the European Centre for Disease Prevention and Control (ECDC) has also been using the number of packages per 1,000 inhabitants per day (ipd), as a surrogate for prescriptions, to report antibiotic consumption in the community and to perform comparisons between European Union (EU) countries participating in the European Surveillance of Antimicrobial Consumption Network (ESAC-Net). In 2015, consumption was reported to range across Europe from 1.0 to 4.7 packages per 1,000 ipd. Our analysis showed that consumption of antibiotics for systemic use per 1,000 ipd was on average 1.3 times greater in France than in Belgium when considering prescriptions in the numerator, 2.5 times greater when considering packages and 1.2 times greater when considering DDD. As long as the same metrics are used over time, antibiotic consumption data aggregated and disseminated by ECDC are useful for assessing temporal trends at the European level and within individual countries; these data may also be used for benchmarking across EU countries. While DDD - although imperfect - are the most widely accepted metric for this purpose, antibiotic packages do not appear suitable for comparisons between countries and may be misleading. PMID:29162212

  5. PCIPS 2.0: Powerful multiprofile image processing implemented on PCs

    NASA Technical Reports Server (NTRS)

    Smirnov, O. M.; Piskunov, N. E.

    1992-01-01

    Over the years, the processing power of personal computers has steadily increased. Now, 386- and 486-based PCs are fast enough for many image processing applications, and inexpensive enough even for amateur astronomers. PCIPS is an image processing system based on these platforms that was designed to satisfy a broad range of data analysis needs, while requiring minimum hardware and providing maximum expandability. It will run (albeit at a slow pace) even on a 80286 with 640K memory, but will take full advantage of bigger memory and faster CPU's. Because the actual image processing is performed by external modules, the system can be easily upgraded by the user for all sorts of scientific data analysis. PCIPS supports large format 1D and 2D images in any numeric type from 8-bit integer to 64-bit floating point. The images can be displayed, overlaid, printed and any part of the data examined via an intuitive graphical user interface that employs buttons, pop-up menus, and a mouse. PCIPS automatically converts images between different types and sizes to satisfy the requirements of various applications. PCIPS features an API that lets users develop custom applications in C or FORTRAN. While doing so, a programmer can concentrate on the actual data processing, because PCIPS assumes responsibility for accessing images and interacting with the user. This also ensures that all applications, even custom ones, have a consistent and user-friendly interface. The API is compatible with factory programming, a metaphor for constructing image processing procedures that will be implemented in future versions of the system. Several application packages were created under PCIPS. The basic package includes elementary arithmetics and statistics, geometric transformations and import/export in various formats (FITS, binary, ASCII, and GIF). The CCD processing package and the spectral analysis package were successfully used to reduce spectra from the Nordic Telescope at La Palma. A photometry package is also available, and other packages are being developed. A multitasking version of PCIPS that utilizes the factory programming concept is currently under development. This version will remain compatible (on the source code level) with existing application packages and custom applications.

  6. chipPCR: an R package to pre-process raw data of amplification curves.

    PubMed

    Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter

    2015-09-01

    Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. stefan.roediger@b-tu.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
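
    The 5-point stencil mentioned above is a standard finite-difference formula; the Python sketch below (chipPCR itself is an R package) applies it to a synthetic sigmoidal amplification curve to locate the cycle of steepest increase.

        import numpy as np

        def stencil_derivative(y, h=1.0):
            """First derivative of equally spaced samples using the 5-point stencil
            f'(x) ~ (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h)."""
            y = np.asarray(y, float)
            d = np.empty_like(y)
            d[2:-2] = (-y[4:] + 8*y[3:-1] - 8*y[1:-3] + y[:-4]) / (12*h)
            d[:2] = d[2]        # crude edge handling for the first/last two points
            d[-2:] = d[-3]
            return d

        cycles = np.arange(1, 41, dtype=float)
        curve = 10.0 / (1.0 + np.exp(-(cycles - 22.0) / 1.8))   # synthetic sigmoid amplification curve
        slope = stencil_derivative(curve, h=1.0)
        print("steepest cycle (max first derivative):", cycles[np.argmax(slope)])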

  7. Flat conductor cable for electrical packaging

    NASA Technical Reports Server (NTRS)

    Angele, W.

    1972-01-01

    Flat conductor cable (FCC) is a relatively new, highly promising means for electrical packaging and system integration. FCC offers numerous desirable traits (weight, volume and cost savings, flexibility, high reliability, predictable and repeatable electrical characteristics) which make it extremely attractive as a packaging medium. FCC, today, finds wide application in everything from integration of lunar equipment to the packaging of electronics in nuclear submarines. Described are cable construction and means of termination, applicable specifications and standards, and total FCC systems. A list of additional sources of data is also included for more intensive study.

  8. Influence of metal bonding layer on strain transfer performance of FBG

    NASA Astrophysics Data System (ADS)

    Liu, Hao; Chen, Weimin; Zhang, Peng; Liu, Li; Shu, Yuejie; Wu, Jun

    2013-01-01

    The metal bonding layer strongly affects the strain transfer performance of a Fiber Bragg Grating (FBG). Based on the FBG strain transfer model, the influence of the length, thickness, Poisson's ratio, and elastic modulus of the metal bonding layer on the strain transfer coefficient of the FBG is analyzed by numerical simulation. The FBG is bonded to a steel wire using a metal bonding technique. Tensile tests with different bonding lengths and elastic moduli are carried out. The results show that the strain transfer coefficients of the FBGs are 0.9848 and 0.962, and their average strain sensitivities are 1.076 pm/μɛ and 1.099 pm/μɛ, for zinc bonding layers of 15 mm and 20 mm length, respectively. The strain transfer coefficient of an FBG packaged with a metal bonding layer is 8.9 percent higher than that of an epoxy-glue package. The preliminary experimental results show that the strain transfer coefficient increases with the length of the metal bonding layer, decreases with its thickness, and is essentially insensitive to Poisson's ratio. The experimental results are in general agreement with the analysis and provide guidance for the metal packaging of FBGs.

  9. Supercomputing '91; Proceedings of the 4th Annual Conference on High Performance Computing, Albuquerque, NM, Nov. 18-22, 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Various papers on supercomputing are presented. The general topics addressed include: program analysis/data dependence, memory access, distributed memory code generation, numerical algorithms, supercomputer benchmarks, latency tolerance, parallel programming, applications, processor design, networks, performance tools, mapping and scheduling, characterization affecting performance, parallelism packaging, computing climate change, combinatorial algorithms, hardware and software performance issues, system issues. (No individual items are abstracted in this volume)

  10. ON UPGRADING THE NUMERICS IN COMBUSTION CHEMISTRY CODES. (R824970)

    EPA Science Inventory

    A method of updating and reusing legacy FORTRAN codes for combustion simulations is presented using the DAEPACK software package. The procedure is demonstrated on two codes that come with the CHEMKIN-II package, CONP and SENKIN, for the constant-pressure batch reactor simulati...

  11. PHAST: Protein-like heteropolymer analysis by statistical thermodynamics

    NASA Astrophysics Data System (ADS)

    Frigori, Rafael B.

    2017-06-01

    PHAST is a software package written in standard Fortran, with MPI and CUDA extensions, able to efficiently perform parallel multicanonical Monte Carlo simulations of single or multiple heteropolymeric chains, as coarse-grained models for proteins. The outcome data can be straightforwardly analyzed within its microcanonical Statistical Thermodynamics module, which allows for computing the entropy, caloric curve, specific heat and free energies. As a case study, we investigate the aggregation of heteropolymers bioinspired on Aβ25-33 fragments and their cross-seeding with IAPP20-29 isoforms. Excellent parallel scaling is observed, even under numerically difficult first-order like phase transitions, which are properly described by the built-in fully reconfigurable force fields. Still, the package is free and open source, this shall motivate users to readily adapt it to specific purposes.
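
    The microcanonical analysis step can be sketched in a few lines (this is a generic illustration in Python, not the PHAST Fortran module, and the density of states below is a toy function): from ln g(E) one obtains the entropy, the caloric curve T(E), and the specific heat.

        import numpy as np

        kB = 1.0                                       # reduced units
        E = np.linspace(0.1, 4.0, 400)                 # energy grid (assumed)
        ln_g = 120.0 * np.log(E)                       # toy log density of states

        S = kB * ln_g                                  # microcanonical entropy S(E)
        T = 1.0 / np.gradient(S, E)                    # caloric curve T(E) = (dS/dE)^-1
        C = 1.0 / np.gradient(T, E)                    # specific heat C(E) = dE/dT

        print("temperature range:", T.min(), "-", T.max())
        print("specific heat (roughly constant here):", C[200])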

  12. MEGNO-analysis of the orbital evolution of GEO objects. (Russian Title: MEGNO-анализ орбитальной эволюции объектов зоны ГЕО )

    NASA Astrophysics Data System (ADS)

    Aleksandrova, A. G.; Chuvashov, I. N.; Bordovitsyna, T. V.

    2011-07-01

    The results of an investigation of the instability of orbits in the GEO region are presented. The averaged MEGNO parameter is used as the main indicator of chaoticity. The parameter is computed by combined numerical integration of the equations of motion, the variational equations, and the equations for the MEGNO parameters. The results were obtained using the software package "Numerical model of the motion of artificial satellite systems", implemented on the "Skiff Cyberia" cluster.

  13. 49 CFR 173.62 - Specific packaging requirements for explosives.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... first column in numerical sequence by their identification number (ID #), which is listed in column 4 of the § 172.101 table, of this subchapter. The second column of the Explosives Table specifies the...) Explosives must be packaged in accordance with the following table: (1) The first column lists, in...

  14. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    PubMed

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy of numerous PETquact applications was examined. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is well suited to analysis under the two Japanese guidelines and performs well in both performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  15. Comparison between Conduction and Convection Effects on Self-Heating in Doped Microcantilevers

    PubMed Central

    Ansari, Mohd Zahid; Cho, Chongdu

    2012-01-01

    The present study investigates the effects of thermal conduction and convection on the self-heating temperatures and bimetallic deflections produced in doped microcantilever sensors. These cantilevers are commonly used as sensors and actuators in microsystems. The cantilever is a monolithic, multi-layer structure with a thin U-shaped element inside. The cantilever substrates are made of silicon and silicon dioxide, respectively, and the element is p-doped silicon. A numerical analysis package (ANSYS) is used to study the effect of the cantilever substrate material, element width, applied voltage, and operating environment on the cantilever characteristics. The numerical results for temperature are compared against their analytical models. Results indicate that the numerical results are accurate to within 6% of the analytical values, and that Si/Si cantilevers are more suitable for biosensor and AFM applications, whereas Si/SiO2 cantilevers are better suited for hotplate and actuator applications. PMID:22438736

  16. Novel 18650 lithium-ion battery surrogate cell design with anisotropic thermophysical properties for studying failure events

    NASA Astrophysics Data System (ADS)

    Spinner, Neil S.; Hinnant, Katherine M.; Mazurick, Ryan; Brandon, Andrew; Rose-Pehrsson, Susan L.; Tuttle, Steven G.

    2016-04-01

    Cylindrical 18650-type surrogate cells were designed and fabricated to mimic the thermophysical properties and behavior of active lithium-ion batteries. An internal jelly roll geometry consisting of alternating stainless steel and mica layers was created, and numerous techniques were used to estimate thermophysical properties. Surrogate cell density was measured to be 1593 ± 30 kg/m3, and heat capacity was found to be 727 ± 18 J/kg-K. Axial thermal conductivity was determined to be 5.1 ± 0.6 W/m-K, which was over an order of magnitude higher than radial thermal conductivity due to jelly roll anisotropy. Radial heating experiments were combined with numerical and analytical solutions to the time-dependent, radial heat conduction equation, and from the numerical method an additional estimate for heat capacity of 805 ± 23 J/kg-K was found. Using both heat capacities and analysis techniques, values for radial thermal conductivity were between 0.120 and 0.197 W/m-K. Under normal operating conditions, relatively low radial temperature distributions were observed; however, during extreme battery failure with a hexagonal cell package, instantaneous radial temperature distributions as high as 43-71 °C were seen. For a vertical cell package, even during adjacent cell failure, similar homogeneity in internal temperatures was observed, demonstrating thermal anisotropy.
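
    For context, the time-dependent radial heat conduction equation referred to above takes the standard cylindrical form (general heat-transfer theory, not an equation quoted from the paper):

      \rho c_p\,\frac{\partial T}{\partial t} = \frac{1}{r}\,\frac{\partial}{\partial r}\!\left(k_r\, r\, \frac{\partial T}{\partial r}\right),

    where \rho is the density, c_p the heat capacity and k_r the radial thermal conductivity; fitting measured temperature histories to numerical or analytical solutions of this equation is what yields the k_r and c_p estimates quoted above.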

  17. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved great success. Obtaining synthetic waveforms through numerical simulation is receiving increasing attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computing knowledge and data-processing skills. Training users to operate the numerical packages and to access and use computational resources correctly is a demanding task. In addition, access to HPC is a common difficulty for many users. To address these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC resources and a dedicated processing pipeline form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering computational resources over the cloud, the platform allows users to customize simulations at an expert level and to submit and run jobs through it.

  18. Kinematic analysis of crank-cam mechanism of process equipment

    NASA Astrophysics Data System (ADS)

    Podgornyj, Yu I.; Skeeba, V. Yu; Martynova, T. G.; Pechorkina, N. S.; Skeeba, P. Yu

    2018-03-01

    This article discusses how to determine the kinematic parameters of a crank-cam mechanism. Based on the mechanism design, the authors have developed a calculation model and a calculation algorithm that allow the kinematic parameters of the mechanism to be determined, including crank displacements, angular velocities and accelerations, as well as the angular velocities and accelerations of the driven link (rocker arm). All calculations were performed using the Mathcad mathematical package. The results of the calculations are reported as numerical values.

  19. SUBOPT: A CAD program for suboptimal linear regulators

    NASA Technical Reports Server (NTRS)

    Fleming, P. J.

    1985-01-01

    An interactive software package which provides design solutions for both standard linear quadratic regulator (LQR) and suboptimal linear regulator problems is described. Intended for time-invariant continuous systems, the package is easily modified to include sampled-data systems. LQR designs are obtained by established techniques while the large class of suboptimal problems containing controller and/or performance index options is solved using a robust gradient minimization technique. Numerical examples demonstrate features of the package and recent developments are described.
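
    As a minimal illustration of the standard LQR problem that packages such as SUBOPT solve, the sketch below computes a continuous-time LQR gain with SciPy (this is not SUBOPT itself, and the plant matrices are invented for the example):

      # Minimal continuous-time LQR example (illustrative only; not SUBOPT).
      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Hypothetical double-integrator plant: x_dot = A x + B u
      A = np.array([[0.0, 1.0],
                    [0.0, 0.0]])
      B = np.array([[0.0],
                    [1.0]])
      Q = np.diag([1.0, 0.1])   # state weighting
      R = np.array([[1.0]])     # control weighting

      # Solve the continuous-time algebraic Riccati equation and form the gain.
      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)   # optimal state feedback u = -K x
      print("LQR gain K =", K)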

  20. Mechanism of Void Prediction in Flip Chip Packages with Molded Underfill

    NASA Astrophysics Data System (ADS)

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-08-01

    Voids are a persistent problem in the molded underfill (MUF) packaging process and need further investigation. In this study, the process was studied using the Moldex3D numerical analysis software. The effect of the gas (air vent effect) on the overall melt front was also considered. In this isothermal process involving two fluids, the gas and the melt colloid interact in the mold cavity. Simulation gave an appropriate understanding of the actual situation, and, through analysis, the void region and the exact location of voids were predicted. First, the global flow-end area was observed to predict the void movement trend, and then the local flow ends were observed to predict the location and size of voids. In the MUF 518 case study, simulations predicted the void region as well as the location and size of the voids. The void phenomenon in a flip chip ball grid array underfill is discussed as part of the study.

  1. qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2008-10-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
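
    The run-time flexibility described above can be illustrated with a short, purely hypothetical Python sketch; the names below (Qtcm, run_list, run) are illustrative placeholders and are not guaranteed to match the actual qtcm API.

      # Hypothetical sketch of mixed-language, run-time-configurable model control.
      # All names are illustrative placeholders, not the documented qtcm interface.
      class Qtcm:
          def __init__(self, **params):
              self.params = params
              # In a qtcm-like design, the heavy numerics live in compiled Fortran
              # routines wrapped (e.g. via f2py) and called from Python.
              self.run_list = ['advect', 'convect', 'radiate', 'output']

          def run(self, days):
              for day in range(days):
                  for step_name in self.run_list:   # order can be changed at run time
                      print(f"day {day}: calling wrapped routine '{step_name}'")

      model = Qtcm(dt=1200.0, title='aquaplanet test')
      model.run_list.remove('output')        # alter subroutine choice at run time
      model.run(days=2)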

  2. qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2009-02-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  3. A Python Implementation of an Intermediate-Level Tropical Circulation Model and Implications for How Modeling Science is Done

    NASA Astrophysics Data System (ADS)

    Lin, J. W. B.

    2015-12-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  4. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including to case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining how uncertainty propagates from input data and model parameters, via the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied in multi-disciplinary research and model-based decision support.
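
    A generic Monte Carlo uncertainty propagation of the kind 'spup' automates can be sketched in a few lines; the example below is plain Python/NumPy with an invented toy model and distributions, not the spup R API.

      # Toy Monte Carlo uncertainty propagation (illustrative; not the 'spup' R API).
      import numpy as np

      rng = np.random.default_rng(42)
      n_mc = 1000                                    # number of Monte Carlo realizations

      # Uncertainty models for two inputs: a normally distributed parameter and a
      # log-normally distributed attribute (spatial fields collapsed to scalars here).
      slope    = rng.normal(loc=0.8, scale=0.05, size=n_mc)
      rainfall = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n_mc)

      def runoff_model(slope, rainfall):
          """Invented environmental model: runoff as a simple function of its inputs."""
          return slope * rainfall - 5.0

      realizations = runoff_model(slope, rainfall)   # propagate uncertainty by MC
      print("mean =", realizations.mean(),
            " 95% interval =", np.percentile(realizations, [2.5, 97.5]))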

  5. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology.

  6. Limits of variation, specific infectivity, and genome packaging of massively recoded poliovirus genomes.

    PubMed

    Song, Yutong; Gorbatsevych, Oleksandr; Liu, Ying; Mugavero, JoAnn; Shen, Sam H; Ward, Charles B; Asare, Emmanuel; Jiang, Ping; Paul, Aniko V; Mueller, Steffen; Wimmer, Eckard

    2017-10-10

    Computer design and chemical synthesis generated viable variants of poliovirus type 1 (PV1), whose ORF (6,189 nucleotides) carried up to 1,297 "Max" mutations (excess of overrepresented synonymous codon pairs) or up to 2,104 "SD" mutations (randomly scrambled synonymous codons). "Min" variants (excess of underrepresented synonymous codon pairs) are nonviable except for P2-Min, a variant temperature-sensitive at 33 and 39.5 °C. Compared with WT PV1, P2-Min displayed a vastly reduced specific infectivity (si) (WT, 1 PFU/118 particles vs. P2-Min, 1 PFU/35,000 particles), a phenotype that will be discussed broadly. The si of haploid PV represents the cellular infectivity of a single genotype. We performed a comprehensive analysis of sequence and structures of the PV genome to determine if evolutionarily conserved cis-acting packaging signal(s) were preserved after recoding. We showed that conserved synonymous sites and/or local secondary structures that might play a role in determining packaging specificity do not survive codon pair recoding. This makes it unlikely that numerous "cryptic, sequence-degenerate, dispersed RNA packaging signals mapping along the entire viral genome" [Patel N, et al. (2017) Nat Microbiol 2:17098] play the critical role in poliovirus packaging specificity. Considering all available evidence, we propose a two-step assembly strategy for +ssRNA viruses: step I, acquisition of packaging specificity, either (a) by specific recognition between capsid protein(s) and replication proteins (poliovirus), or (b) by the high-affinity interaction of a single RNA packaging signal (PS) with capsid protein(s) (most +ssRNA viruses so far studied); step II, cocondensation of genome/capsid precursors in which an array of hairpin structures plays a role in virion formation.

  7. Evaluation of trade-offs in costs and environmental impacts for returnable packaging implementation

    NASA Astrophysics Data System (ADS)

    Jarupan, Lerpong; Kamarthi, Sagar V.; Gupta, Surendra M.

    2004-02-01

    The main purpose of returnable packaging today is to provide logistical services through the transportation and distribution of products while being environmentally friendly. Returnable packaging and reverse logistics concepts have converged to mitigate the adverse effect of packaging materials entering the solid waste stream. Returnable packaging must be designed by considering the trade-offs between costs and environmental impact to satisfy manufacturers and environmentalists alike. The cost of returnable packaging comprises items such as materials, manufacturing, collection, storage and disposal. Environmental impacts are explicitly linked with solid waste, air pollution, and water pollution. This paper presents a multi-criteria evaluation technique to assist decision-makers in evaluating the trade-offs between costs and environmental impact during the returnable packaging design process. The proposed evaluation technique combines multiple objective integer linear programming with the analytic hierarchy process. A numerical example is used to illustrate the methodology.
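
    One ingredient of the technique described above, the analytic hierarchy process (AHP), derives criterion weights from a pairwise-comparison matrix; a minimal sketch is shown below, with an invented comparison matrix rather than the values used in the paper's numerical example.

      # Minimal AHP weight derivation from a pairwise comparison matrix
      # (illustrative values; not taken from the paper's numerical example).
      import numpy as np

      # Comparisons among three criteria: cost, solid waste, air/water pollution.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                       # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w = w / w.sum()                                   # normalized priority weights
      ci = (eigvals.real[k] - len(A)) / (len(A) - 1)    # consistency index
      print("weights =", w, " CI =", ci)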

  8. 16 CFR 501.2 - Christmas tree ornaments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Christmas tree ornaments. 501.2 Section 501... PROHIBITIONS UNDER PART 500 § 501.2 Christmas tree ornaments. Christmas tree ornaments packaged and labeled for... expressed in terms of numerical count of the ornaments, and (b) The ornaments are so packaged that the...

  9. 16 CFR 501.2 - Christmas tree ornaments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Christmas tree ornaments. 501.2 Section 501... PROHIBITIONS UNDER PART 500 § 501.2 Christmas tree ornaments. Christmas tree ornaments packaged and labeled for... expressed in terms of numerical count of the ornaments, and (b) The ornaments are so packaged that the...

  10. 16 CFR 501.2 - Christmas tree ornaments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Christmas tree ornaments. 501.2 Section 501... PROHIBITIONS UNDER PART 500 § 501.2 Christmas tree ornaments. Christmas tree ornaments packaged and labeled for... expressed in terms of numerical count of the ornaments, and (b) The ornaments are so packaged that the...

  11. 16 CFR 501.2 - Christmas tree ornaments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Christmas tree ornaments. 501.2 Section 501... PROHIBITIONS UNDER PART 500 § 501.2 Christmas tree ornaments. Christmas tree ornaments packaged and labeled for... expressed in terms of numerical count of the ornaments, and (b) The ornaments are so packaged that the...

  12. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  13. Numerical Analysis on the High-Strength Concrete Beams Ultimate Behaviour

    NASA Astrophysics Data System (ADS)

    Smarzewski, Piotr; Stolarski, Adam

    2017-10-01

    The development of production technologies for high-strength concrete (HSC) beams, with the aim of creating a secure and durable material, is closely linked with numerical models of real objects. Three-dimensional nonlinear finite element models of reinforced high-strength concrete beams with a complex geometry have been investigated in this study. The numerical analysis is performed using the ANSYS finite element package. The arc-length (A-L) parameters and the adaptive descent (AD) parameters are used with the Newton-Raphson method to trace the complete load-deflection curves. Experimental and finite element modelling results are compared graphically and numerically. Comparison of these results indicates the correctness of the failure criteria assumed for the high-strength concrete and the steel reinforcement. The results of the numerical simulation are sensitive to the modulus of elasticity and the shear transfer coefficient for an open crack assigned to the high-strength concrete. The full nonlinear load-deflection curves at mid-span of the beams, the development of strain in the compressive concrete and the development of strain in the tensile bar are in good agreement with the experimental results. Numerical results for smeared crack patterns agree qualitatively with the test data in location, direction, and distribution. The model was capable of predicting the initiation and propagation of flexural and diagonal cracks. It was concluded that the finite element model successfully captured the inelastic flexural behaviour of the beams up to failure.

  14. Effect of load eccentricity on the buckling of thin-walled laminated C-columns

    NASA Astrophysics Data System (ADS)

    Wysmulski, Pawel; Teter, Andrzej; Debski, Hubert

    2018-01-01

    The study investigates the behaviour of short, thin-walled laminated C-columns under eccentric compression. The tested columns are simply supported. The effect of load inaccuracy on the critical and post-critical (local buckling) states is examined. A numerical analysis by the finite element method and experimental tests on a test stand are performed. The samples were produced from a carbon-epoxy prepreg by the autoclave technique. The experimental tests rest on the assumption that the compressive loads are 1.5 times higher than the theoretical critical force. Numerical modelling is performed using the commercial software package ABAQUS®. The critical load is determined by solving an eigenvalue problem using the Subspace algorithm. The experimental critical loads are determined from the post-buckling paths. The numerical and experimental results show high agreement, demonstrating a significant effect of load inaccuracy on the critical load corresponding to the column's local buckling.

  15. Mountain-Plains Master Course List. Curriculum Areas: Job Titles: Learning Activity Packages: Courses: Units.

    ERIC Educational Resources Information Center

    Mountain-Plains Education and Economic Development Program, Inc., Glasgow AFB, MT.

    The document contains a master listing of all Mountain-Plains curriculum, compiled by job title, course, unit and LAP (Learning Activity Package), and arranged in numerical order by curriculum area. Preceding each curriculum area is a page of explanatory notes describing the curriculum area and including relevant job descriptions. Where a job…

  16. Determinant Computation on the GPU using the Condensation Method

    NASA Astrophysics Data System (ADS)

    Anisul Haque, Sardar; Moreno Maza, Marc

    2012-02-01

    We report on a GPU implementation of the condensation method designed by Abdelmalek Salem and Kouachi Said for computing the determinant of a matrix. We consider two types of coefficients: modular integers and floating point numbers. We evaluate the performance of our code by measuring its effective bandwidth and argue that it is numerically stable in the floating point case. In addition, we compare our code with serial implementations of determinant computation from well-known mathematical packages. Our results suggest that a GPU implementation of the condensation method has a large potential for improving those packages in terms of running time and numerical stability.
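
    Condensation methods reduce an n×n determinant to an (n-1)×(n-1) one at each step using only 2×2 minors; the sketch below implements the classical Chió condensation on the CPU, a related but not necessarily identical variant to the Salem-Kouachi method used in the paper, and with none of the GPU machinery.

      # Chió condensation for determinants (illustrative CPU version, not the GPU code).
      import numpy as np

      def det_chio(a):
          a = np.array(a, dtype=float)
          n = len(a)
          scale = 1.0
          while n > 1:
              if a[0, 0] == 0.0:                 # pivot; a full code would swap rows
                  raise ZeroDivisionError("zero pivot, row exchange needed")
              # Each new entry is the 2x2 minor formed with the (0, 0) pivot.
              b = a[0, 0] * a[1:, 1:] - np.outer(a[1:, 0], a[0, 1:])
              scale *= a[0, 0] ** (n - 2)
              a, n = b, n - 1
          return a[0, 0] / scale

      m = [[2.0, 1.0, 3.0], [0.0, 4.0, 1.0], [5.0, 2.0, 2.0]]
      print(det_chio(m), np.linalg.det(m))       # the two values should agree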

  17. magnum.fe: A micromagnetic finite-element simulation code based on FEniCS

    NASA Astrophysics Data System (ADS)

    Abert, Claas; Exl, Lukas; Bruckner, Florian; Drews, André; Suess, Dieter

    2013-11-01

    We have developed a finite-element micromagnetic simulation code based on the FEniCS package called magnum.fe. Here we describe the numerical methods that are applied as well as their implementation with FEniCS. We apply a transformation method for the solution of the demagnetization-field problem. A semi-implicit weak formulation is used for the integration of the Landau-Lifshitz-Gilbert equation. Numerical experiments show the validity of simulation results. magnum.fe is open source and well documented. The broad feature range of the FEniCS package makes magnum.fe a good choice for the implementation of novel micromagnetic finite-element algorithms.

  18. MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.

    PubMed

    Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan

    2017-01-01

    Food webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, differing only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing rapid assessment of their dynamical and behavioural properties.
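
    A representative predation motif of the kind MI-Sim analyses can be written as a pair of ordinary differential equations; the sketch below simulates a classical Lotka-Volterra predator-prey system in Python (generic textbook equations and parameters, not MI-Sim's own growth functions or MATLAB code).

      # Generic predator-prey (predation motif) simulation; not MI-Sim's MATLAB code.
      from scipy.integrate import solve_ivp

      def predation(t, y, r=1.0, a=0.5, e=0.3, m=0.2):
          prey, pred = y
          dprey = r * prey - a * prey * pred       # prey growth minus predation losses
          dpred = e * a * prey * pred - m * pred   # prey converted to predators minus mortality
          return [dprey, dpred]

      sol = solve_ivp(predation, t_span=(0.0, 50.0), y0=[2.0, 1.0], max_step=0.1)
      print("final prey, predator:", sol.y[:, -1])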

  19. Effect of interface layer on the performance of high power diode laser arrays

    NASA Astrophysics Data System (ADS)

    Zhang, Pu; Wang, Jingwei; Xiong, Lingling; Li, Xiaoning; Hou, Dong; Liu, Xingsheng

    2015-02-01

    Packaging is an important part of high power diode laser (HPLD) development and has become one of the key factors affecting the performance of high power diode lasers. In the package structure of an HPLD, the die-bonding interface layer has significant effects on the thermal behavior of the package, and most degradations and failures in high power diode laser packages are directly related to this interface layer. In this work, the effects of the interface layer on the performance of a high power diode laser array were studied numerically, through modeling, and experimentally. First, numerical simulations using the finite element method (FEM) were conducted to analyze the effects of voids in the interface layer on the temperature rise in the active region of the diode laser array, and the correlation between junction temperature rise and voids was analyzed. According to the numerical simulation results, the local temperature rise in the active region caused by voids in the solder layer leads to a wavelength shift of some emitters. Second, the effects of the solder interface layer on the spectral properties of the high power diode laser array were studied. The spectrum of the diode laser array exhibited a "right shoulder" or multiple peaks, which were related to the voids in the solder interface layer. Finally, "void-free" techniques were developed to minimize voids in the solder interface layer and achieve high power diode lasers with better optical-electrical performance.

  20. Numerical Bifurcation Analysis of Delayed Recycle Stream in a Continuously Stirred Tank Reactor

    NASA Astrophysics Data System (ADS)

    Gangadhar, Nalwala Rohitbabu; Balasubramanian, Periyasamy

    2010-10-01

    In this paper, we present the stability analysis of delay differential equations which arise as a result of transportation lag in the CSTR-mechanical separator recycle system. A first order irreversible elementary reaction is considered to model the system and is governed by the delay differential equations. The DDE-BIFTOOL software package is used to analyze the stability of the delay system. The present analysis reveals that the system exhibits delay independent stability for isothermal operation of the CSTR. In the absence of delay, the system is dynamically unstable for non-isothermal operation of the CSTR, and as a result of delay, the system exhibits delay dependent stability.

  1. Chip Scale Package Integrity Assessment by Isothermal Aging

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    1998-01-01

    Many aspects of chip scale package (CSP) technology, with a focus on assembly reliability characteristics, are being investigated by the JPL-led consortia. Three types of test vehicles were considered for evaluation, and currently two configurations have been built to optimize attachment processes. These test vehicles use numerous package types. To understand potential failure mechanisms of the packages, particularly solder ball attachment, the grid CSPs were subjected to environmental exposure. Package I/Os ranged from 40 to nearly 300. This paper presents shear test results and photomicrographs for CSPs as assembled and after up to 1,000 hours of isothermal aging, as well as tensile test results before and after 1,500 cycles in the range of -30/100 °C. Results are compared to BGAs subjected to the same isothermal aging environmental exposures.

  2. SAHARA: A package of PC computer programs for estimating both log-hyperbolic grain-size parameters and standard moments

    NASA Astrophysics Data System (ADS)

    Christiansen, Christian; Hartmann, Daniel

    This paper documents a package of menu-driven POLYPASCAL87 computer programs for handling grouped-observation data from both sieving (increment data) and settling tube procedures (cumulative data). The package is deliberately designed for use on IBM-compatible personal computers. Two of the programs solve the numerical problem of determining estimates of the four (main) parameters of the log-hyperbolic distribution and their derivatives. The package also contains a program for determining the mean, sorting, skewness, and kurtosis according to the standard moments. Moreover, the package contains procedures for smoothing and grouping of settling tube data. A graphics part of the package plots the data in a log-log plot together with the estimated log-hyperbolic curve, and all estimated parameters are reported along with the plot. Another graphics option is a plot of the log-hyperbolic shape triangle with the (χ,ζ) position of the sample.
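
    For grouped grain-size data with class midpoints m_i (in phi units) and frequencies f_i (in percent), the standard method-of-moments measures computed by such packages are conventionally defined as follows (standard sedimentological formulas, not quoted from the paper):

      \bar{x} = \frac{\sum_i f_i m_i}{100},\qquad
      \sigma = \sqrt{\frac{\sum_i f_i (m_i-\bar{x})^2}{100}},\qquad
      Sk = \frac{\sum_i f_i (m_i-\bar{x})^3}{100\,\sigma^3},\qquad
      K = \frac{\sum_i f_i (m_i-\bar{x})^4}{100\,\sigma^4},

    giving the mean, sorting (standard deviation), skewness and kurtosis, respectively.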

  3. Development and validation of a comprehensive model for MAP of fruits based on enzyme kinetics theory and Arrhenius relation.

    PubMed

    Mangaraj, S; K Goswami, T; Mahajan, P V

    2015-07-01

    MAP is a dynamic system in which respiration of the packaged product and gas permeation through the packaging film take place simultaneously. The desired levels of O2 and CO2 in a package are achieved by matching the film permeation rates for O2 and CO2 with the respiration rate of the packaged product. A mathematical model for MAP of fresh fruits, applying an enzyme-kinetics-based respiration equation coupled with an Arrhenius-type temperature model, was developed. The model was solved numerically using a MATLAB program. The model was used to determine the time to reach the equilibrium concentration inside the MA package and the levels of O2 and CO2 at the equilibrium state. The developed model for prediction of equilibrium O2 and CO2 concentrations was validated using experimental data for MA packaging of apple, guava and litchi.
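
    The gas balance underlying such a MAP model can be sketched as two coupled ODEs, with a Michaelis-Menten (enzyme-kinetics) respiration term and lumped film-permeation terms; the parameter values below are invented placeholders, not those fitted in the paper, and the Arrhenius temperature dependence is omitted for brevity.

      # Schematic modified-atmosphere package (MAP) gas balance; all constants are
      # invented placeholders, not the parameters fitted in the paper.
      from scipy.integrate import solve_ivp

      kO2, kCO2 = 2.0e-6, 6.0e-6   # lumped film permeation coefficients, %/s per % gradient
      Vm, Km    = 4.0e-5, 4.0      # Michaelis-Menten respiration: max rate (%/s), half-saturation (%)
      RQ        = 1.0              # respiratory quotient (CO2 produced per O2 consumed)
      O2_out, CO2_out = 21.0, 0.03 # ambient gas composition, %

      def map_balance(t, y):
          O2, CO2 = y
          r = Vm * O2 / (Km + O2)                    # enzyme-kinetics respiration rate
          dO2  = kO2  * (O2_out - O2)   - r          # permeation in minus respiratory uptake
          dCO2 = kCO2 * (CO2_out - CO2) + RQ * r     # permeation out plus respiratory production
          return [dO2, dCO2]

      sol = solve_ivp(map_balance, (0.0, 5.0e5), [21.0, 0.03], max_step=1000.0)
      print("quasi-equilibrium O2, CO2 (%):", sol.y[:, -1])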

  4. A Galerkin Boundary Element Method for two-dimensional nonlinear magnetostatics

    NASA Astrophysics Data System (ADS)

    Brovont, Aaron D.

    The Boundary Element Method (BEM) is a numerical technique for solving partial differential equations that is used broadly among the engineering disciplines. The main advantage of this method is that one needs only to mesh the boundary of a solution domain. A key drawback is the myriad of integrals that must be evaluated to populate the full system matrix. To this day these integrals have been evaluated using numerical quadrature. In this research, a Galerkin formulation of the BEM is derived and implemented to solve two-dimensional magnetostatic problems with a focus on accurate, rapid computation. To this end, exact, closed-form solutions have been derived for all the integrals comprising the system matrix as well as those required to compute fields in post-processing; the need for numerical integration has been eliminated. It is shown that calculation of the system matrix elements using analytical solutions is 15-20 times faster than with numerical integration of similar accuracy. Furthermore, through the example analysis of a c-core inductor, it is demonstrated that the present BEM formulation is a competitive alternative to the Finite Element Method (FEM) for linear magnetostatic analysis. Finally, the BEM formulation is extended to analyze nonlinear magnetostatic problems via the Dual Reciprocity Method (DRBEM). It is shown that a coarse, meshless analysis using the DRBEM is able to achieve RMS error of 3-6% compared to a commercial FEM package in lightly saturated conditions.

  5. Basic linear algebra subprograms for FORTRAN usage

    NASA Technical Reports Server (NTRS)

    Lawson, C. L.; Hanson, R. J.; Kincaid, D. R.; Krogh, F. T.

    1977-01-01

    A package of 38 low level subprograms for many of the basic operations of numerical linear algebra is presented. The package is intended to be used with FORTRAN. The operations in the package are dot products, elementary vector operations, Givens transformations, vector copy and swap, vector norms, vector scaling, and the indices of components of largest magnitude. The subprograms and a test driver are available in portable FORTRAN. Versions of the subprograms are also provided in assembly language for the IBM 360/67, the CDC 6600 and CDC 7600, and the Univac 1108.
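
    For readers unfamiliar with the operations listed, the sketch below reproduces a few of them with NumPy equivalents (Python stand-ins for the original Fortran subprograms; the routine names in the comments follow the usual BLAS naming).

      # NumPy stand-ins for a few Level-1 BLAS operations (the originals are Fortran).
      import numpy as np

      x = np.array([1.0, -4.0, 2.0])
      y = np.array([0.5,  1.0, 3.0])
      a = 2.0

      dot    = x @ y                      # DDOT:   dot product
      axpy   = a * x + y                  # DAXPY:  elementary vector operation y <- a*x + y
      nrm2   = np.linalg.norm(x)          # DNRM2:  Euclidean vector norm
      scaled = a * x                      # DSCAL:  vector scaling
      imax   = int(np.argmax(np.abs(x)))  # IDAMAX: index of the component of largest magnitude
      print(dot, axpy, nrm2, scaled, imax)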

  6. Emerging Chitosan-Based Films for Food Packaging Applications.

    PubMed

    Wang, Hongxia; Qian, Jun; Ding, Fuyuan

    2018-01-17

    Recent years have witnessed great developments in bio-based polymer packaging films, driven by the serious environmental problems caused by petroleum-based, non-biodegradable packaging materials. Chitosan is one of the most abundant biopolymers after cellulose. Chitosan-based materials have been widely applied in various fields owing to their biological and physical properties of biocompatibility, biodegradability, antimicrobial ability, and easy film-forming ability. Different chitosan-based films have been fabricated and applied in the field of food packaging. Most review papers related to chitosan-based films focus on antibacterial food packaging films. Along with advances in nanotechnology and polymer science, numerous strategies, for instance direct casting, coating, dipping, layer-by-layer assembly, and extrusion, have been employed to prepare chitosan-based films with multiple functionalities. The emerging food packaging applications of chitosan-based films as antibacterial films, barrier films, and sensing films have achieved great progress. This article comprehensively reviews recent advances in the preparation and application of engineered chitosan-based films in food packaging.

  7. Reliability evaluation of hermetic dual in-line flat microcircuit packages

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.; Conaway, L. K.

    1977-01-01

    The relative strengths and weaknesses of 35 commonly used hermetic flat and dual in-line packages were determined and used to rank each of the packages according to a numerical weighting scheme for package attributes. The list of attributes included desirable features in five major areas: lead and lead seal, body construction, body materials, lid and lid seal, and marking. The metal flat pack and the multilayer integral ceramic flat pack and DIP received the highest rankings, and the soft glass Cerdip and Cerpak types received the lowest rankings. Loss of package hermeticity due to lead and lid seal problems was found to be the predominant failure mode in the literature/data search. However, environmental test results showed that lead and lid seal failures due to thermal stressing were only a problem with the hard glass (ceramic) body DIP utilizing a metal lid and/or bottom. Insufficient failure data were generated for the other package types tested to correlate test results with the package ranking.

  8. Thermal Cycle Reliability and Failure Mechanisms of CCGA and PBGA Assemblies with and without Corner Staking

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2008-01-01

    Area array packages (AAPs) with 1.27 mm pitch have been the packages of choice for commercial applications; they are now starting to be implemented for use in military and aerospace applications. The thermal cycling characteristics of plastic ball grid array (PBGA) and chip scale package assemblies, because of their wide usage in commercial applications, have been extensively reported in the literature. Thermal cycling represents the on-off environmental condition for most electronic products and is therefore a key factor that defines reliability. However, very limited data are available on the thermal cycling behavior of ceramic packages commonly used for aerospace applications. For high reliability applications, numerous AAPs are available with an identical design pattern in both ceramic and plastic packages. This paper compares the assembly reliability of ceramic and plastic packages with identical inputs/outputs (I/Os) and pattern. The ceramic package was a ceramic column grid array (CCGA) with a 560-I/O peripheral array and the identical pad design as its plastic counterpart.

  9. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  10. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  11. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  12. 49 CFR 178.502 - Identification codes for packagings.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...” means a jerrican. (iv) “4” means a box. (v) “5” means a bag. (vi) “6” means a composite packaging. (vii... natural wood. (iv) “D” means plywood. (v) “F” means reconstituted wood. (vi) “G” means fiberboard. (vii... (other than steel or aluminum). (xi) “P” means glass, porcelain or stoneware. (3) A numeral indicating...

  13. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... clearly expresses the fact that only one unit is contained in the package. Thus the unit synthetic sponge... sponge,” “one light bulb,” or “one dry cell battery.” However, there still exists the necessity to.... For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  14. Experimental and numerical investigation on laser-assisted bending of pre-loaded metal plate

    NASA Astrophysics Data System (ADS)

    Nowak, Zdzisław; Nowak, Marcin; Widłaszewski, Jacek; Kurp, Piotr

    2018-01-01

    The laser forming technique has an important disadvantage, namely the limited plastic deformation generated by a single laser beam pass. To increase the plastic deformation it is possible to apply external forces in the laser forming process. In this paper, we investigate the influence of external pre-loads on the laser bending of a steel plate. The pre-loads investigated generate bending towards the laser beam. A thermal, elastic-plastic analysis is performed using the commercial nonlinear finite element analysis package ABAQUS. The focus of the paper is to identify how this pattern of pre-load influences the final bend angle of the plate.

  15. The use of optimization techniques to design controlled diffusion compressor blading

    NASA Technical Reports Server (NTRS)

    Sanger, N. L.

    1982-01-01

    A method for automating compressor blade design using numerical optimization, applied to the design of a controlled diffusion stator blade row, is presented. A general purpose optimization procedure is employed, based on conjugate directions for locally unconstrained problems and on feasible directions for locally constrained problems. Coupled to the optimizer is an analysis package consisting of three analysis programs which calculate blade geometry, inviscid flow, and blade surface boundary layers. The optimizing concepts and the selection of design objective and constraints are described. The procedure for automating the design of a two-dimensional blade section is discussed, and design results are presented.

  16. Documentation of the seawater intrusion (SWI2) package for MODFLOW

    USGS Publications Warehouse

    Bakker, Mark; Schaars, Frans; Hughes, Joseph D.; Langevin, Christian D.; Dausman, Alyssa M.

    2013-01-01

    The SWI2 Package is the latest release of the Seawater Intrusion (SWI) Package for MODFLOW. The SWI2 Package allows three-dimensional vertically integrated variable-density groundwater flow and seawater intrusion in coastal multiaquifer systems to be simulated using MODFLOW-2005. Vertically integrated variable-density groundwater flow is based on the Dupuit approximation in which an aquifer is vertically discretized into zones of differing densities, separated from each other by defined surfaces representing interfaces or density isosurfaces. The numerical approach used in the SWI2 Package does not account for diffusion and dispersion and should not be used where these processes are important. The resulting differential equations are equivalent in form to the groundwater flow equation for uniform-density flow. The approach implemented in the SWI2 Package allows density effects to be incorporated into MODFLOW-2005 through the addition of pseudo-source terms to the groundwater flow equation without the need to solve a separate advective-dispersive transport equation. Vertical and horizontal movement of defined density surfaces is calculated separately using a combination of fluxes calculated through solution of the groundwater flow equation and a simple tip and toe tracking algorithm. Use of the SWI2 Package in MODFLOW-2005 only requires the addition of a single additional input file and modification of boundary heads to freshwater heads referenced to the top of the aquifer. Fluid density within model layers can be represented using zones of constant density (stratified flow) or continuously varying density (piecewise linear in the vertical direction) in the SWI2 Package. The main advantage of using the SWI2 Package instead of variable-density groundwater flow and dispersive solute transport codes, such as SEAWAT and SUTRA, is that fewer model cells are required for simulations using the SWI2 Package because every aquifer can be represented by a single layer of cells. This reduction in number of required model cells and the elimination of the need to solve the advective-dispersive transport equation results in substantial model run-time savings, which can be large for regional aquifers. The accuracy and use of the SWI2 Package is demonstrated through comparison with existing exact solutions and numerical solutions with SEAWAT. Results for an unconfined aquifer are also presented to demonstrate application of the SWI2 Package to a large-scale regional problem.

  17. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  18. eXtended CASA Line Analysis Software Suite (XCLASS)

    NASA Astrophysics Data System (ADS)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, whereas the finite source size and dust attenuation are considered as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, finding the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7

  19. Regularization with numerical extrapolation for finite and UV-divergent multi-loop integrals

    NASA Astrophysics Data System (ADS)

    de Doncker, E.; Yuasa, F.; Kato, K.; Ishikawa, T.; Kapenga, J.; Olagbemi, O.

    2018-03-01

    We give numerical integration results for Feynman loop diagrams such as those covered by Laporta (2000) and by Baikov and Chetyrkin (2010), which may give rise to loop integrals with UV singularities. We explore automatic adaptive integration using multivariate techniques from the PARINT package for multivariate integration, as well as iterated integration with programs from the QUADPACK package, and a trapezoidal method based on a double exponential transformation. PARINT is layered over MPI (Message Passing Interface), and incorporates advanced parallel/distributed techniques including load balancing among processes that may be distributed over a cluster or a network/grid of nodes. Results are included for 2-loop vertex and box diagrams and for sets of 2-, 3- and 4-loop self-energy diagrams with or without UV terms. Numerical regularization of integrals with singular terms is achieved by linear and non-linear extrapolation methods.
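
    The double exponential (tanh-sinh) transformation mentioned above maps a finite interval onto the real line so that the trapezoidal rule converges very rapidly, even with integrable endpoint singularities; a minimal sketch for an integral over [0, 1] is given below (a generic textbook construction, unrelated to the PARINT and QUADPACK implementations).

      # Minimal tanh-sinh (double exponential) quadrature on [0, 1]; illustrative only.
      import math

      def de_quad(f, h=0.1, n=30):
          # Substitution x = (1 + tanh((pi/2) sinh(t))) / 2 maps t in (-inf, inf) to (0, 1);
          # the truncated trapezoidal rule in t then converges double-exponentially.
          total = 0.0
          for k in range(-n, n + 1):
              t = k * h
              s = math.sinh(t)
              x = 0.5 * (1.0 + math.tanh(0.5 * math.pi * s))
              w = 0.25 * math.pi * math.cosh(t) / math.cosh(0.5 * math.pi * s) ** 2
              total += w * f(x)
          return h * total

      # Integrable endpoint singularity handled well by the DE rule:
      print(de_quad(lambda x: 1.0 / math.sqrt(x)))   # exact value is 2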

  20. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  1. Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.

    PubMed

    Robinson, Risa J; Snyder, Pam; Oldham, Michael J

    2007-05-01

    Computational fluid dynamics (CFD) modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within these airways and from the particle tracking algorithms used in CFD software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the range of dimensionless diffusion and sedimentation parameters found using the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, to the Pich (1972) and Yu (1977) analytical sedimentation solutions. The Schum and Yeh (1980) equations for sedimentation were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. Therefore, it is prudent to validate CFD predictions against analytical solutions in idealized geometry before tackling the complex geometries of the respiratory tract.

  2. Substructured multibody molecular dynamics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary Stephen; Stevens, Mark Jackson; Plimpton, Steven James

    2006-11-01

    We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) to include many new features for accelerated simulation including articulated rigid body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use new features of the LAMMPS software package to investigate rhodopsin photoisomerization, and water model surface tension and capillary waves at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.

  3. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    PubMed Central

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In the studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett’s esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760

  4. Numerical simulation of a flow past a triangular sail-type blade of a wind generator using the ANSYS FLUENT software package

    NASA Astrophysics Data System (ADS)

    Kusaiynov, K.; Tanasheva, N. K.; Min'kov, L. L.; Nusupbekov, B. R.; Stepanova, Yu. O.; Rozhkova, A. V.

    2016-02-01

    An air flow past a single triangular sail-type blade of a wind turbine is analyzed by numerical simulation for low velocities of the incoming flow. The results of the numerical simulation indicate a monotonic increase in the drag force and the lift force as functions of the incoming flow velocity; empirical dependences of these quantities are obtained.
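
    The drag and lift forces reported in such simulations are conventionally related to the incoming flow velocity through dimensionless coefficients (standard aerodynamic definitions, not values or fits from the paper):

      F_D = \tfrac{1}{2}\,\rho\, v^2 C_D A, \qquad
      F_L = \tfrac{1}{2}\,\rho\, v^2 C_L A,

    where \rho is the air density, v the incoming flow velocity, A the blade reference area, and C_D, C_L the drag and lift coefficients.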

  5. Numerics made easy: solving the Navier-Stokes equation for arbitrary channel cross-sections using Microsoft Excel.

    PubMed

    Richter, Christiane; Kotz, Frederik; Giselbrecht, Stefan; Helmer, Dorothea; Rapp, Bastian E

    2016-06-01

    The fluid mechanics of microfluidics is distinctively simpler than the fluid mechanics of macroscopic systems. In macroscopic systems effects such as non-laminar flow, convection, gravity etc. need to be accounted for all of which can usually be neglected in microfluidic systems. Still, there exists only a very limited selection of channel cross-sections for which the Navier-Stokes equation for pressure-driven Poiseuille flow can be solved analytically. From these equations, velocity profiles as well as flow rates can be calculated. However, whenever a cross-section is not highly symmetric (rectangular, elliptical or circular) the Navier-Stokes equation can usually not be solved analytically. In all of these cases, numerical methods are required. However, in many instances it is not necessary to turn to complex numerical solver packages for deriving, e.g., the velocity profile of a more complex microfluidic channel cross-section. In this paper, a simple spreadsheet analysis tool (here: Microsoft Excel) will be used to implement a simple numerical scheme which allows solving the Navier-Stokes equation for arbitrary channel cross-sections.
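
    The numerical scheme referred to amounts to iterating a finite-difference form of the Poisson equation mu * laplacian(u) = dp/dx over the channel cross-section, which is exactly what a spreadsheet cell formula can express; a short Python equivalent for a rectangular cross-section is shown below (illustrative grid and parameters, not the authors' Excel sheet).

      # Jacobi iteration for pressure-driven Poiseuille flow in a rectangular channel
      # cross-section; mirrors the spreadsheet scheme with illustrative parameters.
      import numpy as np

      mu   = 1.0e-3                   # dynamic viscosity, Pa s
      dpdx = -1.0e3                   # pressure gradient along the channel, Pa/m
      w, h = 200e-6, 100e-6           # channel width and height, m
      nx, ny = 41, 21                 # grid points
      dx, dy = w / (nx - 1), h / (ny - 1)

      u = np.zeros((ny, nx))          # axial velocity; boundaries stay zero (no slip)
      for _ in range(20000):          # fixed number of Jacobi sweeps (illustrative)
          u_new = u.copy()
          u_new[1:-1, 1:-1] = (
              dy**2 * (u[1:-1, 2:] + u[1:-1, :-2])
              + dx**2 * (u[2:, 1:-1] + u[:-2, 1:-1])
              - dx**2 * dy**2 * dpdx / mu
          ) / (2.0 * (dx**2 + dy**2))
          u = u_new

      flow_rate = u.sum() * dx * dy   # crude flow-rate estimate, m^3/s
      print("max velocity =", u.max(), "m/s, flow rate ~", flow_rate, "m^3/s")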

  6. Features in simulation of crystal growth using the hyperbolic PFC equation and the dependence of the numerical solution on the parameters of the computational grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starodumov, Ilya; Kropotin, Nikolai

    2016-08-10

    We investigate a three-dimensional mathematical model of crystal growth, the Phase Field Crystal (PFC) model, in a hyperbolic modification. This model is also called the modified PFC model (the original PFC model is formulated in parabolic form) and makes it possible to describe both slow and rapid crystallization processes on atomic length scales and on diffusive time scales. The modified PFC model is described by a partial differential equation of sixth order in space and second order in time. This equation can be solved only by numerical methods. Previously, the authors created a software package for the solution of the Phase Field Crystal problem, based on the method of isogeometric analysis (IGA) and the PetIGA program library. During further investigation it was found that the quality of the solution can depend strongly on the discretization parameters of the numerical method. In this report, we show the features that should be taken into account when constructing the computational grid for the numerical simulation.

  7. A microacoustic analysis including viscosity and thermal conductivity to model the effect of the protective cap on the acoustic response of a MEMS microphone

    PubMed Central

    Homentcovschi, D.; Miles, R. N.; Loeppert, P. V.; Zuckerwar, A. J.

    2013-01-01

    An analysis is presented of the effect of the protective cover on the acoustic response of a miniature silicon microphone. The microphone diaphragm is contained within a small rectangular enclosure and the sound enters through a small hole in the enclosure's top surface. A numerical model is presented to predict the variation in the sound field with position within the enclosure. An objective of this study is to determine the frequency up to which the pressure distribution remains sufficiently uniform so that a pressure calibration can be made in free space. The secondary motivation for this effort is to facilitate microphone design by providing a means of predicting how the placement of the microphone diaphragm in the package affects the sensitivity and frequency response. While the size of the package is typically small relative to the wavelength of the sounds of interest, because the dimensions of the package are on the order of the thickness of the viscous boundary layer, viscosity can significantly affect the distribution of sound pressure around the diaphragm. In addition to the need to consider viscous effects, it is shown here that one must also carefully account for thermal conductivity to properly represent energy dissipation at the system's primary acoustic resonance frequency. The sound field is calculated with a finite element approach, solving the linearized system consisting of the continuity equation, the Navier-Stokes equations, the equation of state, and the energy equation. The predicted spatial variation of both the amplitude and phase of the sound pressure is shown over the range of audible frequencies. Excellent agreement is shown between the predicted and measured effects of the package on the microphone's sensitivity. PMID:24701031
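
    As a back-of-the-envelope check of the boundary-layer argument above (not part of the paper's finite element model), the viscous boundary-layer thickness in air, delta = sqrt(2*nu/omega), can be evaluated across the audio band; the air properties used below are generic textbook values. The resulting tens-to-hundreds of micrometres are indeed comparable to the interior dimensions of a miniature microphone package.

```python
import numpy as np

# Viscous boundary-layer thickness in air: delta = sqrt(2 * nu / omega).
# Standard air properties near room temperature (illustrative values).
nu = 1.5e-5                      # kinematic viscosity of air [m^2/s]

for f in (100.0, 1e3, 10e3, 20e3):           # frequencies across the audio band [Hz]
    omega = 2.0 * np.pi * f
    delta = np.sqrt(2.0 * nu / omega)
    print(f"{f:8.0f} Hz : viscous boundary layer ~ {delta * 1e6:6.1f} um")
```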

  8. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape are essential factors for hydrological processes. Therefore, an adequate representation of the landscape of a catchment in hydrological models is vital. However, many such models exist, differing, among other things, in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed to large-scale application via a hierarchical multi-scale approach. The package addresses existing limitations as it is free and open source, easily extendible to other hydrological models, and the workflow can be fully automated. Moreover, it is user-friendly as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  9. Use of statistical study methods for the analysis of the results of the imitation modeling of radiation transfer

    NASA Astrophysics Data System (ADS)

    Alekseenko, M. A.; Gendrina, I. Yu.

    2017-11-01

    Recently, owing to the abundance of various types of observational data in systems of vision through the atmosphere and the need for their processing, the use of various methods of statistical research in the study of such systems, such as correlation-regression analysis, dynamic (time) series analysis, and variance analysis, has become relevant. We have attempted to apply elements of correlation-regression analysis to the study and subsequent prediction of the patterns of radiation transfer in these systems, as well as to the construction of radiation models of the atmosphere. In this paper, we present some results of statistical processing of the results of numerical simulation of the characteristics of vision systems through the atmosphere, obtained with the help of a special software package.
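
    As an illustration of the correlation-regression step mentioned above, the minimal sketch below fits a straight line between a synthetic input parameter and a synthetic simulated output and reports the correlation coefficient; the arrays are placeholders, not output of the authors' software package.

```python
import numpy as np

# Synthetic stand-ins for Monte Carlo radiative-transfer output:
# x = an atmospheric parameter (e.g. aerosol optical depth),
# y = a simulated vision-system characteristic (e.g. received radiance).
rng = np.random.default_rng(0)
x = np.linspace(0.05, 0.8, 40)
y = 1.0 - 0.9 * x + rng.normal(0.0, 0.02, x.size)   # noisy linear trend

# Correlation coefficient and least-squares regression line y = a*x + b.
r = np.corrcoef(x, y)[0, 1]
a, b = np.polyfit(x, y, 1)

print(f"correlation r = {r:+.3f}")
print(f"regression   y = {a:+.3f} * x {b:+.3f}")
print(f"prediction at x = 0.5: y = {a * 0.5 + b:.3f}")
```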

  10. Structural zooming research and development of an interactive computer graphical interface for stress analysis of cracks

    NASA Technical Reports Server (NTRS)

    Gerstle, Walter

    1989-01-01

    Engineering problems sometimes involve the numerical solution of boundary value problems over domains containing geometric features with widely varying scales. Often, a detailed solution is required at one or more of these features. Small details in large structures may have profound effects upon global performance. Conversely, large-scale conditions may affect local performance. Many man-hours and CPU-hours are currently spent in modeling such problems. With the structural zooming technique, it is now possible to design an integrated program which allows the analyst to interactively focus upon a small region of interest, to modify the local geometry, and then to obtain highly accurate responses in that region which reflect both the properties of the overall structure and the local detail. A boundary integral equation analysis program, called BOAST, was recently developed for the stress analysis of cracks. This program can accurately analyze two-dimensional linear elastic fracture mechanics problems with far less computational effort than existing finite element codes. An interactive computer graphical interface to BOAST was written. The graphical interface had several requirements: it would be menu-driven, with mouse input; all aspects of input would be entered graphically; the results of a BOAST analysis would be displayed pictorially, and the user would also be able to probe interactively to obtain numerical values of displacement and stress at desired locations within the analysis domain; the entire procedure would be integrated into a single, easy-to-use package; and it would be written using calls to the graphics package called HOOPS. The program is nearing completion. All of the preprocessing features have been debugged and are working satisfactorily. The postprocessing features are under development, and rudimentary postprocessing should be available by the end of the summer. The program was developed and run on a VAX workstation and must be ported to the SUN workstation. This activity is currently underway.

  11. Hysteretic Models Considering Axial-Shear-Flexure Interaction

    NASA Astrophysics Data System (ADS)

    Ceresa, Paola; Negrisoli, Giorgio

    2017-10-01

    Most of the existing numerical models implemented in finite element (FE) software are, at the current state of the art, not capable of describing with sufficient reliability the interaction between axial, shear and flexural actions under cyclic loading (e.g. seismic actions), neglecting effects that are crucial for predicting the nature of the collapse of reinforced concrete (RC) structural elements. Only a few existing 3D volume models or fibre beam models can lead to a reasonably accurate response, but they are still computationally inefficient for typical applications in earthquake engineering and are also characterized by very complex formulations. Thus, discrete models with lumped plasticity hinges may be the preferred choice for modelling the hysteretic behaviour due to cyclic loading conditions, in particular with reference to implementation in a commercial software package. These considerations led to this research work, which is focused on the development of a model for RC beam-column elements able to account for degradation effects and the interaction between the actions under cyclic loading conditions. In order to develop a model for a general 3D discrete hinge element able to take into account the axial-shear-flexural interaction, it is necessary to provide an implementation which involves a predictor-corrector iterative scheme. Furthermore, a reliable constitutive model based on damage plasticity theory is formulated and implemented for its numerical validation. The aim of this research work is to provide the formulation of a numerical model which will allow implementation within a FE software package for nonlinear cyclic analysis of RC structural members. The developed model accounts for stiffness degradation and stiffness recovery upon load reversal.
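
    The predictor-corrector idea referred to above can be illustrated in its simplest setting: a one-dimensional return-mapping step for rate-independent plasticity with linear kinematic hardening. This is only a generic sketch of the elastic-predictor/plastic-corrector pattern, not the axial-shear-flexure hinge model developed in the paper; the material constants are placeholders.

```python
import numpy as np

# 1D rate-independent plasticity with linear kinematic hardening
# (illustrative constants, not the RC hinge model of the paper).
E, H, sigma_y = 200e3, 10e3, 250.0      # MPa: Young's modulus, hardening modulus, yield stress

def return_map(sigma, alpha, d_eps):
    """One predictor-corrector step: elastic trial stress, then plastic correction."""
    sigma_trial = sigma + E * d_eps              # elastic predictor
    xi = sigma_trial - alpha                     # relative (shifted) stress
    f = abs(xi) - sigma_y                        # yield function
    if f <= 0.0:
        return sigma_trial, alpha                # step stays elastic
    d_gamma = f / (E + H)                        # plastic corrector (consistency condition)
    n = np.sign(xi)
    sigma = sigma_trial - E * d_gamma * n
    alpha = alpha + H * d_gamma * n
    return sigma, alpha

# Drive the model through one load reversal to produce a hysteresis branch.
eps_history = np.concatenate([np.linspace(0, 0.004, 40), np.linspace(0.004, -0.004, 80)])
sigma, alpha, prev_eps = 0.0, 0.0, 0.0
for eps in eps_history:
    sigma, alpha = return_map(sigma, alpha, eps - prev_eps)
    prev_eps = eps
print(f"stress after reversal: {sigma:.1f} MPa, back stress: {alpha:.1f} MPa")
```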

  12. [The body packer syndrome].

    PubMed

    Clément, R; Fornes, P; Lecomte, D

    2001-02-17

    Ingestion of illicit drug packages is a well-known method of transportation. These packages are prone to rupture, causing overdose. The body packer syndrome may be overlooked in medical practice, as illustrated by the following case report. A 19-year-old male had convulsions followed by cardiac arrest during a flight. He was resuscitated on the plane, but he died a few hours after admission to the intensive care unit. Chest and abdominal X-rays were considered normal. Cocaine metabolites were found in his urine. The death was considered suspicious. X-rays performed before the medicolegal autopsy showed numerous packages in his digestive tract. Thirty-six packages were found in the stomach and intestine. Two had ruptured in the stomach. The cause of death was cocaine overdose caused by package rupture. The packages are usually visible on a standard abdominal X-ray. The drug is often wrapped in latex membranes or condoms. Air is trapped between the condoms by the knots, forming two crescents visible on the X-ray. Surgery is preferred to laxatives when the packages are fragile and there is a high risk of rupture.

  13. Vibration analysis of angle-ply laminated composite plates with an embedded piezoceramic layer.

    PubMed

    Lin, Hsien-Yang; Huang, Jin-Hung; Ma, Chien-Ching

    2003-09-01

    An optical full-field technique, called amplitude-fluctuation electronic speckle pattern interferometry (AF-ESPI), is used in this study to investigate the force-induced transverse vibration of an angle-ply laminated composite embedded with a piezoceramic layer (piezolaminated plates). The piezolaminated plates are excited by applying time-harmonic voltages to the embedded piezoceramic layer. Because clear fringe patterns will appear only at resonant frequencies, both the resonant frequencies and mode shapes of the vibrating piezolaminated plates with five different fiber orientation angles are obtained by the proposed AF-ESPI method. A laser Doppler vibrometer (LDV) system, which has the advantage of high resolution and broad dynamic range, is also applied to measure the frequency response of the piezolaminated plates. In addition to the two proposed optical techniques, numerical computations based on a commercial finite element package are presented for comparison with the experimental results. Three different numerical formulations are used to evaluate the vibration characteristics of the piezolaminated plates. Good agreement between the data measured by the optical methods and the numerical results predicted by the finite element method (FEM) demonstrates that the proposed methodology is a powerful tool for the vibration analysis of piezolaminated plates.

  14. Book of Knowledge (BOK) for NASA Electronic Packaging Roadmap

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza

    2015-01-01

    The objective of this document is to update the NASA roadmap on packaging technologies (initially released in 2007) and to present the current trends toward further reducing size and increasing functionality. Due to the breadth of work being performed in the area of microelectronics packaging, this report presents only a number of key packaging technologies detailed in three industry roadmaps for conventional microelectronics and a more recently introduced roadmap for organic and printed electronics applications. The topics for each category were down-selected by reviewing the 2012 reports of the International Technology Roadmap for Semiconductors (ITRS), the 2013 roadmap reports of the International Electronics Manufacturing Initiative (iNEMI), the 2013 roadmap of IPC (the association connecting electronics industries), and the roadmap of the Organic and Printed Electronics Association (OE-A). The report also summarizes the results of numerous articles and websites specifically discussing the trends in microelectronics packaging technologies.

  15. Effectiveness of the International Phytosanitary Standard ISPM No. 15 on reducing wood borer infestation rates in wood packaging material entering the United States

    Treesearch

    Robert A. Haack; Kerry O. Britton; Eckelhard G. Brockerhoff; Joseph F. Cavey; Lynn J. Garrett; Mark Kimberley; Frank Lowenstein; Amelia Nuding; Lars J. Olson; James Tumer; Kathryn N. Vasilaky

    2014-01-01

    Numerous bark- and wood-infesting insects have been introduced to new countries by international trade where some have caused severe environmental and economic damage. Wood packaging material (WPM), such as pallets, is one of the high risk pathways for the introduction of wood pests. International recognition of this risk resulted in adoption of International Standards...

  16. Laser Welding in Electronic Packaging

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The laser has proven its worth in numerous high-reliability electronic packaging applications ranging from medical to missile electronics. In particular, the pulsed YAG laser is an extremely flexible and versatile tool capable of hermetically sealing microelectronics packages containing sensitive components without damaging them. This paper presents an overview of details that must be considered for successful use of laser welding when addressing electronic package sealing. These include metallurgical considerations such as alloy and plating selection, weld joint configuration, design of optics, use of protective gases, and control of thermal distortions. The primary limitations on the use of laser welding for electronic packaging applications are economic. The laser itself is a relatively costly device when compared to competing welding equipment. Further, the cost of consumables and repairs can be significant. These facts have relegated laser welding to use only where it presents a distinct quality or reliability advantage over other techniques of electronic package sealing. Because of the unique noncontact and low heat input characteristics of laser welding, it is an ideal candidate for sealing electronic packages containing MEMS devices (microelectromechanical systems). This paper addresses how the unique advantages of the pulsed YAG laser can be used to simplify MEMS packaging and deliver a product of improved quality.

  17. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    NASA Astrophysics Data System (ADS)

    Mauerhofer, E.; Havenith, A.; Carasco, C.; Payan, E.; Kettler, J.; Ma, J. L.; Perot, B.

    2013-04-01

    The Forschungszentrum Jülich GmbH (FZJ), together with RWTH Aachen University (Rheinisch-Westfaelische Technische Hochschule) and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache), is involved in a cooperation aimed at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA) [1]. The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new Prompt Gamma Neutron Activation Analysis (PGNAA) facility MEDINA at FZJ to assess the capture gamma-ray signatures of some elements of interest in large samples, up to waste drums with a volume of 200 liters. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and a quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high-purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and postprocessing tools) which will be used for current measurement interpretation, extrapolation of the performance to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  18. Kinematics Simulation Analysis of Packaging Robot with Joint Clearance

    NASA Astrophysics Data System (ADS)

    Zhang, Y. W.; Meng, W. J.; Wang, L. Q.; Cui, G. H.

    2018-03-01

    Considering the influence of joint clearance on motion error, repeated positioning accuracy and the overall position of the machine, this paper presents a simulation analysis of a packaging robot, a 2-degree-of-freedom (DOF) planar parallel robot, motivated by the high-precision and high-speed requirements of packaging equipment. The motion constraint equation of the mechanism is established, and the analysis and simulation of the motion error are carried out for the case of clearance in the revolute joints. The simulation results show that the size of the joint clearance affects the movement accuracy and packaging efficiency of the packaging robot. The analysis provides a reference for packaging equipment design and selection criteria and is of significance for the automation of the packaging industry.

  19. Migration and sorption phenomena in packaged foods.

    PubMed

    Gnanasekharan, V; Floros, J D

    1997-10-01

    Rapidly developing analytical capabilities and continuously evolving stringent regulations have made food/package interactions a subject of intense research. This article focuses on: (1) the migration of package components such as oligomers and monomers, processing aids, additives, and residual reactants into packaged foods, and (2) the sorption of food components such as flavors, lipids, and moisture into packages. Principles of diffusion and thermodynamics are utilized to describe the mathematics of migration and sorption. Mathematical models are developed from first principles, and their applicability is illustrated using numerical simulations and published data. Simulations indicate that available models are system (polymer-penetrant) specific. Furthermore, some models best describe the early stages of migration/sorption, whereas others should be used for the late stages of these phenomena. Migration- and/or sorption-related problems with respect to glass, metal, paper-based and polymeric packaging materials are discussed, and their importance is illustrated using published examples. The effects of migrating and absorbed components on food safety, quality, and the environment are presented for various foods and packaging materials. The impact of currently popular packaging techniques such as microwavable, ovenable, and retortable packaging on migration and sorption is discussed with examples. Analytical techniques for investigating migration and sorption phenomena in food packaging are critically reviewed, with special emphasis on the use and characteristics of food-simulating liquids (FSLs). Finally, domestic and international regulations concerning migration in packaged foods, and their impact on food packaging, are briefly presented.
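
    The diffusion-based migration modelling discussed above can be illustrated with a minimal explicit finite-difference sketch: one-dimensional Fickian transport of a migrant from a polymer film into a well-mixed food phase treated as a perfect sink. The film thickness, diffusion coefficient, and contact time below are illustrative assumptions, not values from the review.

```python
import numpy as np

# Migration of an additive from a polymer film into food, modeled as 1D Fickian
# diffusion with a perfect-sink condition at the food-contact surface.
L = 100e-6                 # film thickness [m]            (illustrative)
D = 1e-13                  # diffusion coefficient [m^2/s] (illustrative)
c0 = 1.0                   # initial concentration in the film (normalised)

n = 101
dx = L / (n - 1)
dt = 0.4 * dx**2 / D       # explicit-scheme stability requires dt <= dx^2 / (2 D)
c = np.full(n, c0)

t, t_end = 0.0, 10 * 24 * 3600.0        # simulate ten days of contact
while t < t_end:
    lap = np.empty_like(c)
    lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
    lap[0] = 2 * (c[1] - c[0])          # zero-flux (sealed) outer surface
    lap[-1] = 0.0
    c += D * dt / dx**2 * lap
    c[-1] = 0.0                         # perfect sink at the food interface
    t += dt

migrated_fraction = 1.0 - c.mean() / c0
print(f"fraction of additive migrated after 10 days: {migrated_fraction:.2%}")
```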

  20. Package-X 2.0: A Mathematica package for the analytic calculation of one-loop integrals

    NASA Astrophysics Data System (ADS)

    Patel, Hiren H.

    2017-09-01

    This article summarizes new features and enhancements of the first major update of Package-X. Package-X 2.0 can now generate analytic expressions for arbitrarily high rank dimensionally regulated tensor integrals with up to four distinct propagators, each with arbitrary integer weight, near an arbitrary even number of spacetime dimensions, giving UV divergent, IR divergent, and finite parts at (almost) any real-valued kinematic point. Additionally, it can generate multivariable Taylor series expansions of these integrals around any non-singular kinematic point to arbitrary order. All special functions and abbreviations output by Package-X 2.0 support Mathematica's arbitrary precision evaluation capabilities to deal with issues of numerical stability. Finally, tensor algebraic routines of Package-X have been polished and extended to support open fermion chains both on and off shell. The documentation (equivalent to over 100 printed pages) is accessed through Mathematica's Wolfram Documentation Center and contains information on all Package-X symbols, with over 300 basic usage examples, 3 project-scale tutorials, and instructions on linking to FEYNCALC and LOOPTOOLS.
    Program files: doi:http://dx.doi.org/10.17632/yfkwrd4d5t.1
    Licensing provisions: CC BY 4.0
    Programming language: Mathematica (Wolfram Language)
    Journal reference of previous version: H. H. Patel, Comput. Phys. Commun. 197, 276 (2015)
    Does the new version supersede the previous version?: Yes
    Summary of revisions: Extension to four-point one-loop integrals with higher powers of denominator factors, separate extraction of UV and IR divergent parts, testing for power IR divergences, construction of Taylor series expansions of one-loop integrals, numerical evaluation with arbitrary precision arithmetic, manipulation of fermion chains, improved tensor algebraic routines, and much expanded documentation.
    Nature of problem: Analytic calculation of one-loop integrals in relativistic quantum field theory.
    Solution method: Passarino-Veltman reduction formula, Denner-Dittmaier reduction formulae, and additional algorithms described in the manuscript.
    Restrictions: One-loop integrals are limited to those involving no more than four denominator factors.

  1. Numerical simulation of the hydrodynamic instabilities of Richtmyer-Meshkov and Rayleigh-Taylor

    NASA Astrophysics Data System (ADS)

    Fortova, S. V.; Shepelev, V. V.; Troshkin, O. V.; Kozlov, S. A.

    2017-09-01

    The paper presents the results of numerical simulation of the development of the Richtmyer-Meshkov and Rayleigh-Taylor hydrodynamic instabilities encountered in experiments [1-3]. For the numerical solution, the TPS (Turbulence Problem Solver) software package was used, which implements a generalized approach to constructing computer programs for a wide range of problems of hydrodynamics described by systems of equations of hyperbolic type. The numerical methods used are the large-particle method and a second-order ENO scheme with a Roe solver for the approximate solution of the Riemann problem.
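
    To illustrate the finite-volume building block behind such simulations, the sketch below solves the 1D Euler equations for the Sod shock tube. For brevity it uses a first-order scheme with the simpler Rusanov (local Lax-Friedrichs) flux in place of the Roe solver and second-order ENO reconstruction described in the abstract; it is a generic illustration, not the TPS package.

```python
import numpy as np

# First-order finite-volume solver for the 1D Euler equations (Sod shock tube).
gamma = 1.4
N = 400
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]

# Conserved variables U = [rho, rho*u, E]; Sod initial data.
rho = np.where(x < 0.5, 1.0, 0.125)
u = np.zeros(N)
p = np.where(x < 0.5, 1.0, 0.1)
U = np.array([rho, rho * u, p / (gamma - 1) + 0.5 * rho * u**2])

def flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u]), u, p

t, t_end, cfl = 0.0, 0.2, 0.45
while t < t_end:
    F, u, p = flux(U)
    a = np.sqrt(gamma * p / U[0])                    # local sound speed
    smax = np.abs(u) + a
    dt = min(cfl * dx / smax.max(), t_end - t)

    # Rusanov (local Lax-Friedrichs) numerical flux at interfaces i+1/2.
    s = np.maximum(smax[:-1], smax[1:])
    Fhat = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * s * (U[:, 1:] - U[:, :-1])

    # Conservative update of interior cells; edge cells kept at their initial state.
    U[:, 1:-1] -= dt / dx * (Fhat[:, 1:] - Fhat[:, :-1])
    t += dt

print("density at x = 0.8 at t = 0.2:", U[0, int(0.8 * N)].round(3))
```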

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  3. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data.

    PubMed

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M

    2006-10-13

    Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.

  4. RCHILD - an R-package for flexible use of the landscape evolution model CHILD

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2014-05-01

    Landscape evolution models provide powerful approaches to numerically assess earth surface processes, quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most widely used models of landscape change, at least in the context of tectonic and geomorphological process interactions. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help to use CHILD in a flexible, dynamic and user-friendly way. The included functions allow the creation of maps, real-time scenes, animations and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples illustrate the great potential of numerical modelling of geomorphological processes.

  5. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, sparse tensorization methods [2] utilizing node-nested hierarchies, and sampling methods [4] for high-dimensional random variable spaces.
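
    As a minimal illustration of the sampling branch of such a toolkit, the sketch below propagates an uncertain input through a placeholder output functional by plain Monte Carlo and reports the sample mean with a 95% confidence half-width; the model function and input distribution are assumptions made for illustration, not part of the NASA package.

```python
import numpy as np

# Plain Monte Carlo propagation of input uncertainty through an output functional.
# g() is a stand-in scalar "CFD output"; the lognormal parameter is an illustrative
# uncertain input, neither taken from the presentation nor from the NASA package.
rng = np.random.default_rng(42)

def g(nu):
    """Placeholder output quantity of interest as a function of an uncertain parameter."""
    return 1.0 / (1.0 + 10.0 * nu) + 0.05 * np.sin(50.0 * nu)

n = 20000
nu_samples = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)
q = g(nu_samples)

mean = q.mean()
half_width = 1.96 * q.std(ddof=1) / np.sqrt(n)       # ~95% confidence half-width
print(f"E[q] = {mean:.5f} +/- {half_width:.5f} (Monte Carlo, n = {n})")
```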

  6. XAPiir: A recursive digital filtering package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, D.

    1990-09-21

    XAPiir is a basic recursive digital filtering package, containing both design and implementation subroutines. XAPiir was developed for the experimental array processor (XAP) software package, and is written in FORTRAN. However, it is intended to be incorporated into any general- or special-purpose signal analysis program. It replaces the older package RECFIL, offering several enhancements. RECFIL is used in several large analysis programs developed at LLNL, including the seismic analysis package SAC, several expert systems (NORSEA and NETSEA), and two general purpose signal analysis packages (SIG and VIEW). This report is divided into two sections: the first describes the use of the subroutine package, and the second, its internal organization. In the first section, the filter design problem is briefly reviewed, along with the definitions of the filter design parameters and their relationship to the subroutine input parameters. In the second section, the internal organization is documented to simplify maintenance and extensions to the package. 5 refs., 9 figs.
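
    XAPiir itself is FORTRAN, but the design-then-apply workflow of a recursive (IIR) filtering package can be illustrated with SciPy: the sketch below designs a Butterworth band-pass and applies it as a causal recursive filter. It is an analogue for illustration only, not a call into XAPiir.

```python
import numpy as np
from scipy import signal

# Illustration of a recursive (IIR) filter design/apply workflow, analogous in spirit
# to what a package like XAPiir provides (this uses SciPy, not XAPiir itself).
fs = 100.0                                   # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)   # 1 Hz + 20 Hz

# Design a 4th-order Butterworth band-pass between 0.5 and 5 Hz...
b, a = signal.butter(4, [0.5, 5.0], btype="bandpass", fs=fs)
# ...and apply it as a recursive difference equation (single causal pass).
y = signal.lfilter(b, a, x)

print("std before filtering:", np.std(x).round(3), " after:", np.std(y).round(3))
```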

  7. Ultra high energy resolution focusing monochromator for inelastic X-ray scattering spectrometer

    DOE PAGES

    Suvorov, Alexey; Cunsolo, Alessandro; Chubar, Oleg; ...

    2015-11-25

    Further development of a focusing monochromator concept for X-ray energy resolution of 0.1 meV and below is presented. Theoretical analysis of several optical layouts based on this concept was supported by numerical simulations performed in the “Synchrotron Radiation Workshop” software package using the physical-optics approach and careful modeling of partially-coherent synchrotron (undulator) radiation. Along with the energy resolution, the spectral shape of the energy resolution function was investigated. We show that under certain conditions the decay of the resolution function tails can be faster than that of the Gaussian function.

  8. CFD analyses for advanced pump design

    NASA Technical Reports Server (NTRS)

    Dejong, F. J.; Choi, S.-K.; Govindan, T. R.

    1994-01-01

    As one of the activities of the NASA/MSFC Pump Stage Technology Team, the present effort was focused on using CFD in the design and analysis of high performance rocket engine pumps. Under this effort, a three-dimensional Navier-Stokes code was used for various inducer and impeller flow field calculations. An existing algebraic grid generation procedure was extended to allow for nonzero blade thickness, splitter blades, and hub/shroud cavities upstream or downstream of the (main) blades. This resulted in a fast, robust inducer/impeller geometry/grid generation package. Problems associated with running a compressible flow code to simulate an incompressible flow were resolved; related aspects of the numerical algorithm (viz., the matrix preconditioning, the artificial dissipation, and the treatment of low Mach number flows) were addressed. As shown by the calculations performed under the present effort, the resulting code, in conjunction with the grid generation package, is an effective tool for the rapid solution of three-dimensional viscous inducer and impeller flows.

  9. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  10. Technological developments and the need for technical competencies in food services.

    PubMed

    Rodgers, Svetlana

    2005-05-01

    The growing scale of institutional and commercial food services poses the technological challenge of producing large quantities of meals of high quality in terms of their safety, sensory and nutritional attributes. Developments in food service technology and systems (cook-freeze, cook-chill and others) allow the replacement of fast food with the service of cooked meals, which are often nutritionally superior. Reliance on equipment, packaging and technological 'know-how' makes food service operations more complex. Operators have to minimise the impact of the numerous steps in the production process, the fundamental weaknesses of cook-chill food safety design coupled with the practical limitations of Hazard Analysis Critical Control Point management, the potential unevenness of temperature distribution, and product deterioration during storage. Fundamental knowledge of food science and microbiology, engineering and packaging technologies is needed. At present, the 'high tech' options that can improve a product's nutritional value, such as natural preservation hurdles or functional meals, are not used in practice.

  11. disLocate: tools to rapidly quantify local intermolecular structure to assess two-dimensional order in self-assembled systems.

    PubMed

    Bumstead, Matt; Liang, Kunyu; Hanta, Gregory; Hui, Lok Shu; Turak, Ayse

    2018-01-24

    Order classification is particularly important in photonics, optoelectronics, nanotechnology, biology, and biomedicine, as self-assembled and living systems tend to be well ordered but not perfectly ordered. Engineering sets of experimental protocols that can accurately reproduce specific desired patterns can be a challenge when (dis)ordered outcomes look visually similar. Robust comparisons between similar samples, especially with limited data sets, need a finely tuned ensemble of accurate analysis tools. Here we introduce our numerical Mathematica package disLocate, a suite of tools to rapidly quantify the spatial structure of a two-dimensional dispersion of objects. The full range of tools available in disLocate gives different insights into the quality and type of order present in a given dispersion, accessing the translational, orientational and entropic order. The package allows researchers to extract the variation and confidence range within finite sets of data (single images) using different structure metrics to quantify local variation in disorder. Containing all metrics within one package allows researchers to easily and rapidly extract many different parameters simultaneously, allowing robust conclusions to be drawn on the order of a given system. Quantifying the experimental trends which produce desired morphologies enables the engineering of novel methods to direct self-assembly.
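
    disLocate is a Mathematica package, but one representative metric of local orientational order, the bond-orientational order parameter psi_6, can be sketched generically in Python; the point set below is a synthetic noisy triangular lattice used purely for illustration, not data from the article.

```python
import numpy as np
from scipy.spatial import cKDTree

# Local bond-orientational order parameter psi_6 for a 2D point dispersion:
# |<exp(6 i theta_jk)>| averaged over the six nearest neighbours of each particle.
rng = np.random.default_rng(1)

# Synthetic dispersion: a triangular lattice with small positional noise.
nx, ny, a = 20, 20, 1.0
pts = np.array([[i * a + 0.5 * a * (j % 2), j * a * np.sqrt(3) / 2]
                for i in range(nx) for j in range(ny)], dtype=float)
pts += rng.normal(0.0, 0.05 * a, pts.shape)

tree = cKDTree(pts)
_, idx = tree.query(pts, k=7)          # each query returns self + 6 nearest neighbours
vec = pts[idx[:, 1:]] - pts[:, None, :]
theta = np.arctan2(vec[..., 1], vec[..., 0])
psi6 = np.abs(np.exp(6j * theta).mean(axis=1))

print(f"mean |psi_6| = {psi6.mean():.3f} (1 = perfect hexagonal order)")
```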

  12. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX, a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows control over the image processing and display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many of the functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.

  13. An Improved Metal-Packaged Strain Sensor Based on A Regenerated Fiber Bragg Grating in Hydrogen-Loaded Boron–Germanium Co-Doped Photosensitive Fiber for High-Temperature Applications

    PubMed Central

    Tu, Yun; Ye, Lin; Zhou, Shao-Ping; Tu, Shan-Tung

    2017-01-01

    Local strain measurements are considered as an effective method for structural health monitoring of high-temperature components, which require accurate, reliable and durable sensors. To develop strain sensors that can be used in higher temperature environments, an improved metal-packaged strain sensor based on a regenerated fiber Bragg grating (RFBG) fabricated in hydrogen (H2)-loaded boron–germanium (B–Ge) co-doped photosensitive fiber is developed using the process of combining magnetron sputtering and electroplating, addressing the limitation of mechanical strength degradation of silica optical fibers after annealing at a high temperature for regeneration. The regeneration characteristics of the RFBGs and the strain characteristics of the sensor are evaluated. Numerical simulation of the sensor is conducted using a three-dimensional finite element model. Anomalous decay behavior of two regeneration regimes is observed for the FBGs written in H2-loaded B–Ge co-doped fiber. The strain sensor exhibits good linearity, stability and repeatability when exposed to constant high temperatures of up to 540 °C. A satisfactory agreement is obtained between the experimental and numerical results in strain sensitivity. The results demonstrate that the improved metal-packaged strain sensors based on RFBGs in H2-loaded B–Ge co-doped fiber provide great potential for high-temperature applications by addressing the issues of mechanical integrity and packaging. PMID:28241465

  14. An Improved Metal-Packaged Strain Sensor Based on A Regenerated Fiber Bragg Grating in Hydrogen-Loaded Boron-Germanium Co-Doped Photosensitive Fiber for High-Temperature Applications.

    PubMed

    Tu, Yun; Ye, Lin; Zhou, Shao-Ping; Tu, Shan-Tung

    2017-02-23

    Local strain measurements are considered as an effective method for structural health monitoring of high-temperature components, which require accurate, reliable and durable sensors. To develop strain sensors that can be used in higher temperature environments, an improved metal-packaged strain sensor based on a regenerated fiber Bragg grating (RFBG) fabricated in hydrogen (H₂)-loaded boron-germanium (B-Ge) co-doped photosensitive fiber is developed using the process of combining magnetron sputtering and electroplating, addressing the limitation of mechanical strength degradation of silica optical fibers after annealing at a high temperature for regeneration. The regeneration characteristics of the RFBGs and the strain characteristics of the sensor are evaluated. Numerical simulation of the sensor is conducted using a three-dimensional finite element model. Anomalous decay behavior of two regeneration regimes is observed for the FBGs written in H₂-loaded B-Ge co-doped fiber. The strain sensor exhibits good linearity, stability and repeatability when exposed to constant high temperatures of up to 540 °C. A satisfactory agreement is obtained between the experimental and numerical results in strain sensitivity. The results demonstrate that the improved metal-packaged strain sensors based on RFBGs in H₂-loaded B-Ge co-doped fiber provide great potential for high-temperature applications by addressing the issues of mechanical integrity and packaging.
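
    For orientation, the first-order strain response of any FBG-based sensor follows the standard relation delta_lambda_B / lambda_B = (1 - p_e) * strain. The sketch below evaluates it with typical silica-fibre values; these are generic textbook numbers, not the calibrated sensitivities reported for the metal-packaged RFBG sensor in the articles above.

```python
# Standard strain response of a fibre Bragg grating:
#   d(lambda_B) / lambda_B = (1 - p_e) * strain
# Typical silica-fibre values are used purely for illustration.
lambda_b = 1550.0      # Bragg wavelength [nm]
p_e = 0.22             # effective photo-elastic coefficient of silica (typical value)

sensitivity = (1.0 - p_e) * lambda_b * 1e-6      # wavelength shift per microstrain [nm/ue]
print(f"strain sensitivity ~ {sensitivity * 1000:.2f} pm per microstrain")

for strain_ue in (100, 500, 1000):
    shift = sensitivity * strain_ue
    print(f"{strain_ue:5d} ue  ->  Bragg wavelength shift ~ {shift:.3f} nm")
```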

  15. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme along with vector-unrolling techniques is used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.

  16. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    PubMed Central

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM

    2006-01-01

    Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis. PMID:17038197

  17. An adaptive wing for a small-aircraft application with a configuration of fibre Bragg grating sensors

    NASA Astrophysics Data System (ADS)

    Mieloszyk, M.; Krawczuk, M.; Zak, A.; Ostachowicz, W.

    2010-08-01

    In this paper a concept of an adaptive wing for small-aircraft applications with an array of fibre Bragg grating (FBG) sensors is presented and discussed. In this concept the shape of the wing can be controlled and altered thanks to the wing design and the use of integrated shape memory alloy actuators. The concept has been tested numerically by the use of the finite element method. For the numerical calculations the commercial finite element package ABAQUS® was employed. A finite element model of the wing was prepared in order to estimate the values of the wing twisting angles and the distributions of the twist for various activation scenarios. Based on the results of the numerical analysis, the locations and numbers of the FBG sensors were also determined. The results of the numerical calculations obtained by the authors confirmed the usefulness of the assumed wing control strategy. Based on these results and the adaptive wing concept developed, a wing demonstration stand was designed and built. The stand has been used to verify experimentally the performance of the adaptive wing and the usefulness of the FBG sensors for evaluation of the wing condition.

  18. Numerical integration of the extended variable generalized Langevin equation with a positive Prony representable memory kernel.

    PubMed

    Baczewski, Andrew D; Bond, Stephen D

    2013-07-28

    Generalized Langevin dynamics (GLD) arise in the modeling of a number of systems, ranging from structured fluids that exhibit a viscoelastic mechanical response, to biological systems, and other media that exhibit anomalous diffusive phenomena. Molecular dynamics (MD) simulations that include GLD in conjunction with external and/or pairwise forces require the development of numerical integrators that are efficient, stable, and have known convergence properties. In this article, we derive a family of extended variable integrators for the Generalized Langevin equation with a positive Prony series memory kernel. Using stability and error analysis, we identify a superlative choice of parameters and implement the corresponding numerical algorithm in the LAMMPS MD software package. Salient features of the algorithm include exact conservation of the first and second moments of the equilibrium velocity distribution in some important cases, stable behavior in the limit of conventional Langevin dynamics, and the use of a convolution-free formalism that obviates the need for explicit storage of the time history of particle velocities. Capability is demonstrated with respect to accuracy in numerous canonical examples, stability in certain limits, and an exemplary application in which the effect of a harmonic confining potential is mapped onto a memory kernel.
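
    The extended-variable idea can be sketched in its simplest form: each exponential (Prony) term of the memory kernel is traded for an auxiliary variable driven by an Ornstein-Uhlenbeck process, so no history of past velocities has to be stored. The code below is a plain Euler-Maruyama illustration of that mapping with placeholder parameters; it is not the integrator derived in the article nor its LAMMPS implementation.

```python
import numpy as np

# Extended-variable integration of a generalized Langevin equation with a one-term
# Prony (exponential) memory kernel K(t) = c * exp(-t / tau).  The kernel term is
# replaced by an auxiliary variable z driven by an Ornstein-Uhlenbeck process, whose
# stationary autocorrelation kT * c * exp(-t / tau) satisfies fluctuation-dissipation.
rng = np.random.default_rng(7)

m, kT = 1.0, 1.0
c, tau = 5.0, 0.5                  # Prony amplitude and relaxation time (illustrative)
k_spring = 1.0                     # external harmonic confinement F = -k x
dt, nsteps = 1e-3, 200_000

x, v, z = 0.0, 0.0, 0.0
v2_sum = 0.0
for _ in range(nsteps):
    # Auxiliary variable: dz = (-z/tau - c v) dt + sqrt(2 c kT / tau) dW
    z += (-z / tau - c * v) * dt + np.sqrt(2.0 * c * kT / tau * dt) * rng.normal()
    # Particle: m dv = (F_ext + z) dt, dx = v dt   (crude first-order update)
    v += (-k_spring * x + z) / m * dt
    x += v * dt
    v2_sum += v * v

print(f"<v^2> = {v2_sum / nsteps:.3f}  (equipartition target kT/m = {kT / m:.3f})")
```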

  19. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing

    PubMed Central

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-01-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005

  20. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing.

    PubMed

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-12-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation.

  1. The Importance of Take-Out Food Packaging Attributes: Conjoint Analysis and Quality Function Deployment Approach

    NASA Astrophysics Data System (ADS)

    Lestari Widaningrum, Dyah

    2014-03-01

    This research aims to investigate the importance of take-out food packaging attributes, using conjoint analysis and a QFD approach, among consumers of take-out food products in Jakarta, Indonesia. The conjoint results indicate that packaging material (such as paper, plastic, or polystyrene foam) plays the most important role overall in consumer perception. The clustering results show that there is strong segmentation in which take-out food packaging attributes consumers consider most important. Some consumers are mostly oriented toward the colour of the packaging, while other segments of customers focus on packaging shape and packaging information. Segmentation variables based on packaging response can provide very useful information to maximize product image through the package's impact. The results of the House of Quality development show that combining conjoint analysis with QFD is useful in product development, market segmentation, and the trade-off between customers' requirements in the early stages of the HOQ process.

  2. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open-cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e. thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes a weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.
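
    The kind of cycle state-point and thermal-efficiency bookkeeping such a code performs can be sketched for a simple open Brayton cycle with constant specific heats; the component efficiencies and temperatures below are generic assumptions, not inputs or outputs of the NASA code.

```python
# Simple open Brayton-cycle state-point / thermal-efficiency estimate with constant
# specific heats.  All numbers below are generic illustrative assumptions.
cp, gamma = 1005.0, 1.4          # air: specific heat [J/(kg K)], ratio of specific heats
T1, p_ratio = 300.0, 10.0        # compressor inlet temperature [K], pressure ratio
T3 = 1300.0                      # turbine inlet temperature [K]
eta_c, eta_t = 0.85, 0.90        # compressor / turbine isentropic efficiencies

k = (gamma - 1.0) / gamma
T2s = T1 * p_ratio**k                      # ideal compressor exit temperature
T2 = T1 + (T2s - T1) / eta_c               # actual, including compressor losses
T4s = T3 / p_ratio**k                      # ideal turbine exit temperature
T4 = T3 - eta_t * (T3 - T4s)               # actual, including turbine losses

w_net = cp * ((T3 - T4) - (T2 - T1))       # specific net work [J/kg]
q_in = cp * (T3 - T2)                      # specific heat added in the combustor [J/kg]
eta_th = w_net / q_in

print(f"state points [K]: T2 = {T2:.0f}, T4 = {T4:.0f}")
print(f"net specific work = {w_net / 1000:.1f} kJ/kg, thermal efficiency = {eta_th:.1%}")
```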

  3. Integrating opto-thermo-mechanical design tools: open engineering's project presentation

    NASA Astrophysics Data System (ADS)

    De Vincenzo, P.; Klapka, Igor

    2017-11-01

    An integrated numerical simulation package dedicated to the analysis of the coupled interactions of optical devices is presented. To reduce human interventions during data transfers, it is based on in-memory communications between the structural analysis software OOFELIE and the optical design application ZEMAX. It allows the automated enhancement of the existing optical design with information related to the deformations of optical surfaces due to thermomechanical loads. From the knowledge of these deformations, a grid of points or a decomposition based on Zernike polynomials can be generated for each surface. These data are then applied to the optical design. Finally, indicators can be retrieved from ZEMAX in order to compare the optical performances with those of the system in its nominal configuration.
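
    The grid/Zernike hand-off described above can be illustrated with a small least-squares fit of low-order Zernike terms to a sampled surface deformation. The sketch below is only a stand-in for that step, not the OOFELIE-ZEMAX interface, and the deformation data are synthetic.

      # Least-squares fit of low-order Zernike terms to a sampled surface deformation,
      # standing in for the grid / Zernike hand-off described above. The deformation
      # itself is synthetic; this is not the OOFELIE-ZEMAX interface.
      import numpy as np

      n = 64
      y, x = np.mgrid[-1.0:1.0:n * 1j, -1.0:1.0:n * 1j]    # unit-pupil sample grid
      r, theta = np.hypot(x, y), np.arctan2(y, x)
      mask = r <= 1.0

      basis = {                                            # a few low-order Zernike terms
          "piston":   np.ones_like(r),
          "tilt_x":   r * np.cos(theta),
          "tilt_y":   r * np.sin(theta),
          "defocus":  2.0 * r ** 2 - 1.0,
          "astig_0":  r ** 2 * np.cos(2.0 * theta),
          "astig_45": r ** 2 * np.sin(2.0 * theta),
      }

      rng = np.random.default_rng(0)       # synthetic deformation: defocus + tilt + noise
      w = 0.8 * basis["defocus"] + 0.1 * basis["tilt_x"] + 1.0e-3 * rng.standard_normal(r.shape)

      A = np.column_stack([term[mask] for term in basis.values()])
      coeffs, *_ = np.linalg.lstsq(A, w[mask], rcond=None)
      for name, coef in zip(basis, coeffs):
          print(f"{name:9s}: {coef:+.4f}")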

  4. Safety analysis report for packaging (onsite) steel drum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCormick, W.A.

    This Safety Analysis Report for Packaging (SARP) provides the analyses and evaluations necessary to demonstrate that the steel drum packaging system meets the transportation safety requirements of HNF-PRO-154, Responsibilities and Procedures for all Hazardous Material Shipments, for an onsite packaging containing Type B quantities of solid and liquid radioactive materials. The basic component of the steel drum packaging system is the 208 L (55-gal) steel drum.

  5. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
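
    The interface-centred design described above can be sketched, in Python rather than the project's C++, as a pair of small abstract classes plus a generic tool written only against them. All names below are hypothetical and merely illustrate the idea; they are not the NAO class library.

      # A sketch (in Python, not the project's C++) of the interface idea: tools are
      # written against small abstract interfaces, so any conforming implementation
      # can be plugged in. All class and method names here are hypothetical.
      from abc import ABC, abstractmethod
      import numpy as np


      class Function(ABC):
          """A vector-valued function of a vector argument, with a Jacobian."""

          @abstractmethod
          def value(self, x): ...

          @abstractmethod
          def jacobian(self, x): ...


      class NewtonSolver:
          """A generic tool that knows only the Function interface."""

          def __init__(self, tol=1e-10, max_iter=50):
              self.tol, self.max_iter = tol, max_iter

          def solve(self, f, x0):
              x = np.asarray(x0, dtype=float)
              for _ in range(self.max_iter):
                  r = f.value(x)
                  if np.linalg.norm(r) < self.tol:
                      break
                  x = x - np.linalg.solve(f.jacobian(x), r)
              return x


      class ComponentwiseSquare(Function):
          """One concrete implementation: f(x) = x*x - a."""

          def __init__(self, a):
              self.a = np.asarray(a, dtype=float)

          def value(self, x):
              return x * x - self.a

          def jacobian(self, x):
              return np.diag(2.0 * x)


      print(NewtonSolver().solve(ComponentwiseSquare([2.0, 9.0]), [1.0, 1.0]))  # ~[1.414, 3.0]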

  6. Skills Analysis. Workshop Package on Skills Analysis, Skills Audit and Training Needs Analysis.

    ERIC Educational Resources Information Center

    Hayton, Geoff; And Others

    This four-part package is designed to assist Australian workshop leaders running 2-day workshops on skills analysis, skills audit, and training needs analysis. Part A contains information on how to use the package and a list of workshop aims. Parts B, C, and D consist, respectively, of the workshop leader's guide; overhead transparency sheets and…

  7. Quantitative comparison between PGNAA measurements and MCNP calculations in view of the characterization of radioactive wastes in Germany and France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauerhofer, E.; Havenith, A.; Kettler, J.

    The Forschungszentrum Juelich GmbH (FZJ), together with the Aachen University Rheinisch-Westfaelische Technische Hochschule (RWTH) and the French Alternative Energies and Atomic Energy Commission (CEA Cadarache) are involved in a cooperation aiming at characterizing toxic and reactive elements in radioactive waste packages by means of Prompt Gamma Neutron Activation Analysis (PGNAA). The French and German waste management agencies have indeed defined acceptability limits concerning these elements in view of their projected geological repositories. A first measurement campaign was performed in the new Prompt Gamma Neutron Activation Analysis (PGNAA) facility called MEDINA, at FZJ, to assess the capture gamma-ray signatures of some elements of interest in large samples up to waste drums with a volume of 200 liter. MEDINA is the acronym for Multi Element Detection based on Instrumental Neutron Activation. This paper presents MCNP calculations of the MEDINA facility and quantitative comparison between measurement and simulation. Passive gamma-ray spectra acquired with a high purity germanium detector and calibration sources are used to qualify the numerical model of the crystal. Active PGNAA spectra of a sodium chloride sample measured with MEDINA then allow for qualifying the global numerical model of the measurement cell. Chlorine indeed constitutes a usual reference with reliable capture gamma-ray production data. The goal is to characterize the entire simulation protocol (geometrical model, nuclear data, and postprocessing tools) which will be used for current measurement interpretation, extrapolation of the performances to other types of waste packages or other applications, as well as for the study of future PGNAA facilities.

  8. PLATSIM: A Simulation and Analysis Package for Large-Order Flexible Systems. Version 2.0

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Kenny, Sean P.; Giesy, Daniel P.

    1997-01-01

    The software package PLATSIM provides efficient time and frequency domain analysis of large-order generic space platforms. PLATSIM can perform open-loop analysis or closed-loop analysis with linear or nonlinear control system models. PLATSIM exploits the particular form of sparsity of the plant matrices for very efficient linear and nonlinear time domain analysis, as well as frequency domain analysis. A new, original algorithm for the efficient computation of open-loop and closed-loop frequency response functions for large-order systems has been developed and is implemented within the package. Furthermore, a novel and efficient jitter analysis routine which determines jitter and stability values from time simulations in a very efficient manner has been developed and is incorporated in the PLATSIM package. In the time domain analysis, PLATSIM simulates the response of the space platform to disturbances and calculates the jitter and stability values from the response time histories. In the frequency domain analysis, PLATSIM calculates frequency response function matrices and provides the corresponding Bode plots. The PLATSIM software package is written in MATLAB script language. A graphical user interface is developed in the package to provide convenient access to its various features.
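
    The core of such a frequency-domain analysis is evaluating H(jw) = C (jwI - A)^(-1) B for a large, sparse state-space model by a sparse solve at each frequency. The sketch below shows that standard computation on a hypothetical modal model; it is not PLATSIM's own MATLAB algorithm, which the paper describes as a new, more efficient approach for large-order systems.

      # Sketch of the standard frequency-response computation for a sparse
      # state-space model: solve (jw*I - A) x = B at each frequency and form C x.
      # The modal model below is hypothetical.
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      modal_freqs_hz = [0.2, 1.0, 3.0, 7.0]          # hypothetical flexible modes
      zeta = 0.005                                    # modal damping ratio
      blocks = []
      for f in modal_freqs_hz:
          wn = 2.0 * np.pi * f
          blocks.append(np.array([[0.0, 1.0], [-wn ** 2, -2.0 * zeta * wn]]))
      A = sp.block_diag(blocks, format="csc")         # sparse block-diagonal plant
      n = A.shape[0]
      B = np.zeros((n, 1), dtype=complex); B[1::2, 0] = 1.0   # force into each mode
      C = np.zeros((1, n)); C[0, 0::2] = 1.0                   # displacement output
      I = sp.identity(n, format="csc")

      omega = 2.0 * np.pi * np.logspace(-1.0, 1.2, 400)        # rad/s evaluation grid
      H = np.empty(omega.size, dtype=complex)
      for k, w in enumerate(omega):
          lu = spla.splu((1j * w * I - A).tocsc())             # sparse LU per frequency
          H[k] = (C @ lu.solve(B)).item()

      print(f"peak response magnitude: {20.0 * np.log10(np.abs(H)).max():.1f} dB")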

  9. Multiple-Group Analysis Using the sem Package in the R System

    ERIC Educational Resources Information Center

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  10. From Global to Cloud Resolving Scale: Experiments with a Scale- and Aerosol-Aware Physics Package and Impact on Tracer Transport

    NASA Astrophysics Data System (ADS)

    Grell, G. A.; Freitas, S. R.; Olson, J.; Bela, M.

    2017-12-01

    A summary of the latest cumulus parameterization modeling efforts at NOAA's Earth System Research Laboratory (ESRL) will be presented on both regional and global scales. The physics package includes a scale-aware parameterization of subgrid cloudiness feedback to radiation (coupled PBL, microphysics, radiation, shallow and congestus type convection), the stochastic Grell-Freitas (GF) scale- and aerosol-aware convective parameterization, and an aerosol-aware microphysics package. GF is based on a stochastic approach originally implemented by Grell and Devenyi (2002) and described in more detail in Grell and Freitas (2014, ACP). It was expanded to include PDFs for vertical mass flux, as well as modifications to improve the diurnal cycle. This physics package will be used on different scales, spanning global to cloud resolving, to look at the impact on scalar transport and numerical weather prediction.

  11. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  12. User's manual for the coupled rotor/airframe vibration analysis graphic package

    NASA Technical Reports Server (NTRS)

    Studwell, R. E.

    1982-01-01

    User instructions for a graphics package for coupled rotor/airframe vibration analysis are presented. Responses to plot package messages which the user must make to activate plot package operations and options are described. Installation instructions required to set up the program on the CDC system are included. The plot package overlay structure and subroutines which have to be modified for the CDC system are also described. Operating instructions for CDC applications are included.

  13. From "farm to fork" strawberry system: current realities and potential innovative scenarios from life cycle assessment of non-renewable energy use and green house gas emissions.

    PubMed

    Girgenti, Vincenzo; Peano, Cristiana; Baudino, Claudio; Tecco, Nadia

    2014-03-01

    In this study, we analysed the environmental profile of the strawberry industry in Northern Italy. The analysis was conducted using two scenarios as reference systems: strawberry crops grown in unheated plastic tunnels using currently existing cultivation techniques, post-harvest management practices and consumption patterns (scenario 1) and the same strawberry cultivation chain in which some of the materials used were replaced with bio-based materials (scenario 2). In numerous studies, biodegradable polymers have been shown to be environmentally friendly, thus potentially reducing environmental impacts. These materials can be recycled into carbon dioxide and water through composting. Many materials, such as Mater-BI® and PLA®, are also derived from renewable resources. The methodology chosen for the environmental analysis was a life cycle assessment (LCA) based on a consequential approach developed to assess a product's overall environmental impact from the production system to its usage and disposal. In the field stage, a traditional mulching film (non-biodegradable) could be replaced with a biodegradable product. This change would result in waste production of 0 kg/ha for the bio-based product compared to 260 kg/ha of waste for polyethylene (PE). In the post-harvest stage, the issue addressed was the use and disposal of packaging materials. The innovative scenario evaluated herein pertains to the use of new packaging materials that increase the shelf life of strawberries, thereby decreasing product losses while increasing waste management efficiency at the level of a distribution platform and/or sales outlet. In the event of product deterioration or non-sale of the product, the packaging and its contents could be collected together as organic waste without any additional processes because the packaging is compostable according to EN13432. Scenario 2 would achieve reductions of 20% in the global warming potential and non-renewable energy impact categories. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Perceived impact of smaller compared with larger-sized bottles of sugar-sweetened beverages on consumption: A qualitative analysis.

    PubMed

    Mantzari, Eleni; Hollands, Gareth J; Pechey, Rachel; Jebb, Susan; Marteau, Theresa M

    2018-01-01

    Sugar-sweetened beverage (SSB) consumption increases obesity risk and is linked to adverse health consequences. Large packages increase food consumption, but most evidence comes from studies comparing larger with standard packages, resulting in uncertainty regarding the impact of smaller packages. There is also little research on beverages. This qualitative study explores the experiences of consuming cola from smaller compared with larger bottles, to inform intervention strategies. Sixteen households in Cambridge, England, participating in a feasibility study assessing the impact of bottle size on in-home SSB consumption, received a set amount of cola each week for four weeks in one of four bottle sizes: 1500 ml, 1000 ml, 500 ml, or 250 ml, in random order. At the study end, household representatives were interviewed about their experiences of using each bottle, including perceptions of i) consumption level; ii) consumption-related behaviours; and iii) factors affecting consumption. Interviews were semi-structured and data analysed using the Framework approach. The present analysis focuses specifically on experiences relating to use of the smaller bottles. The smallest bottles were described as increasing drinking occasion frequency and encouraging consumption of numerous bottles in succession. Factors described as facilitating their consumption were: i) convenience and portability; ii) greater numbers of bottles available, which hindered consumption monitoring and control; iii) perceived insufficient quantity per bottle; and iv) positive attitudes. In a minority of cases the smallest bottles were perceived to have reduced consumption, but this was related to practical issues with the bottles that resulted in dislike. The perception of greater consumption and qualitative reports of drinking habits associated with the smallest bottles raise the possibility that the 'portion size effect' has a lower threshold, beyond which smaller portions and packages may increase consumption. This reinforces the need for empirical evidence to assess the in-home impact of smaller bottles on SSB consumption. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Functional analysis of the bacteriophage T4 DNA-packaging ATPase motor.

    PubMed

    Mitchell, Michael S; Rao, Venigalla B

    2006-01-06

    Packaging of double-stranded DNA into bacteriophage capsids is driven by one of the most powerful force-generating motors reported to date. The phage T4 motor is constituted by gene product 16 (gp16) (18 kDa; small terminase), gp17 (70 kDa; large terminase), and gp20 (61 kDa; dodecameric portal). Extensive sequence alignments revealed that numerous phage and viral large terminases encode a common Walker-B motif in the N-terminal ATPase domain. The gp17 motif consists of a highly conserved aspartate (Asp255) preceded by four hydrophobic residues (251MIYI254), which are predicted to form a beta-strand. Combinatorial mutagenesis demonstrated that mutations that compromised hydrophobicity, or integrity of the beta-strand, resulted in a null phenotype, whereas certain changes in hydrophobicity resulted in cs/ts phenotypes. No substitutions, including a highly conservative glutamate, are tolerated at the conserved aspartate. Biochemical analyses revealed that the Asp255 mutants showed no detectable in vitro DNA packaging activity. The purified D255E, D255N, D255T, D255V, and D255E/E256D mutant proteins exhibited defective ATP binding and very low or no gp16-stimulated ATPase activity. The nuclease activity of gp17 is, however, retained, albeit at a greatly reduced level. These data define the N-terminal ATPase center in terminases and show for the first time that subtle defects in the ATP-Mg complex formation at this center lead to a profound loss of phage DNA packaging.

  16. Nondestructive determination of activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chabalier, B.

    1996-08-01

    Characterization and appraisal tests include the measurement of activity in raw waste and waste packages. After conditioning, variations in density, matrix composition, and geometry make it nearly impossible to evaluate the radionuclide activity of a package destined for storage with low uncertainty unless measurements are made. Various nondestructive measuring techniques that use ionizing radiation are employed to characterize waste packages and raw waste. Gamma spectrometry is the most widely used technique because of its simple operation and low cost. This technique is used to quantify the beta-gamma and alpha activity of gamma-emitting radionuclides as well as to check the radioactive homogeneity of the waste packages. Numerous systems for directly measuring waste packages have been developed. Two types of methods may be distinguished, depending on whether results that come from the measurements are weighted by an experimentally determined corrective term or by calculation. Through the MARCO and CARACO measuring systems, a method is described that allows one to quantify the activity of the beta-gamma and alpha radionuclides contained in either a waste package or raw waste whose geometries and material compositions are more or less accurately known. This method is based on (a) measurement by gamma spectrometry of the beta-gamma and alpha activity of the gamma-emitting radionuclides contained in the waste package and (b) the application of calculated corrections; thus, the limitations imposed by reference package geometry and matrix are avoided.
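
    The basic quantification step behind such gamma-spectrometry assays can be illustrated as follows: a net full-energy-peak count rate is converted to activity using the detection efficiency, the gamma emission probability, and a calculated matrix/geometry correction. All numbers below are hypothetical, and the MARCO/CARACO correction terms are far more detailed than this single factor.

      # Basic quantification step: activity from a net full-energy-peak area, the
      # detection efficiency, the gamma emission probability, the counting live time,
      # and a calculated matrix/geometry correction factor. All values hypothetical.
      net_counts = 12500.0       # net counts in the 661.7 keV peak (Cs-137)
      live_time_s = 3600.0       # counting live time (s)
      efficiency = 2.1e-3        # full-energy-peak efficiency at 661.7 keV
      emission_prob = 0.851      # gamma emission probability per decay
      f_correction = 1.8         # calculated attenuation/geometry correction

      activity_bq = net_counts / live_time_s * f_correction / (efficiency * emission_prob)
      print(f"apparent Cs-137 activity: {activity_bq:.3e} Bq")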

  17. Influence of the piezoelectric parameters on the dynamics of an active rotor

    NASA Astrophysics Data System (ADS)

    Gawryluk, Jarosław; Mitura, Andrzej; Teter, Andrzej

    2018-01-01

    The main aim of this paper is an experimental and numerical analysis of the dynamic behavior of an active rotor with three composite blades. The study focuses on developing an effective FE modeling technique of a macro fiber composite element (denoted as MFC or active element) for the dynamic tests of active structures. The active rotor under consideration consists of a hub with a drive shaft, three grips and three glass-epoxy laminate blades with embedded active elements. A simplified FE model of the macro fiber composite element exhibiting the d33 piezoelectric effect is developed using the Abaqus software package. The discussed transducer is modeled as quasi-homogeneous piezoelectric material, and voltage is applied to the opposite faces of the element. In this case, the effective (equivalent) piezoelectric constant d33* is specified. Both static and dynamic tests are performed to verify the proposed model. First, static deflections of the active blade caused by the voltage signal are determined by numerical and experimental analyses. Next, a numerical modal analysis of the active rotor is performed. The eigenmodes and corresponding eigenfrequencies are determined by the Lanczos method. The influence of the model parameters (i.e., the effective piezoelectric constant d33 *, voltage signal, angular velocity) on the dynamics of the active rotor is examined. Finally, selected numerical results are validated in experimental tests. The experimental findings demonstrate that the structural stiffening effect caused by the active element strongly depends on the value of the effective piezoelectric constant.
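
    The numerical modal-analysis step (eigenmodes and eigenfrequencies via the Lanczos method) can be sketched on a much simpler structure than the rotor: the generalized eigenproblem K phi = omega^2 M phi of a fixed-free bar, solved with SciPy's Lanczos-type sparse eigensolver. This only illustrates the computation; the actual rotor model is built and solved in Abaqus.

      # Generalized eigenproblem K*phi = omega^2 * M*phi for a fixed-free axial bar,
      # assembled from 2-node elements and solved with SciPy's Lanczos-type sparse
      # eigensolver in shift-invert mode. Illustration only; not the Abaqus rotor model.
      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import eigsh

      n_el, L = 200, 1.0                      # elements and bar length (m)
      EA, rhoA = 1.0e6, 10.0                  # axial stiffness (N) and mass/length (kg/m)
      h = L / n_el
      k_e = EA / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
      m_e = rhoA * h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])

      K = sp.lil_matrix((n_el + 1, n_el + 1))
      M = sp.lil_matrix((n_el + 1, n_el + 1))
      for e in range(n_el):                   # assemble element matrices
          for a in range(2):
              for b in range(2):
                  K[e + a, e + b] += k_e[a, b]
                  M[e + a, e + b] += m_e[a, b]

      K = K.tocsc()[1:, 1:]                   # clamp the first node (fixed end)
      M = M.tocsc()[1:, 1:]

      vals, _ = eigsh(K, k=4, M=M, sigma=0.0, which="LM")   # Lanczos, shift-invert
      freqs_hz = np.sqrt(vals) / (2.0 * np.pi)
      print("first natural frequencies [Hz]:", np.round(freqs_hz, 2))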

  18. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.

  19. clusterProfiler: an R package for comparing biological themes among gene clusters.

    PubMed

    Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu

    2012-05-01

    Increasing quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species, including humans, mice, and yeast. Methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under Artistic-2.0 License within Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.
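
    The statistic underlying this style of enrichment analysis is a hypergeometric (over-representation) test: does a gene cluster contain more members of an annotation term than expected by chance? The sketch below shows that test with made-up counts; clusterProfiler itself is an R/Bioconductor package with far richer functionality.

      # Over-representation (hypergeometric) test with made-up counts: is an
      # annotation term enriched in a gene cluster relative to the background?
      from scipy.stats import hypergeom

      universe = 20000       # annotated genes in the background
      term_genes = 150       # background genes carrying the term
      cluster = 300          # genes in the cluster of interest
      overlap = 12           # cluster genes carrying the term

      p_value = hypergeom.sf(overlap - 1, universe, term_genes, cluster)   # P(X >= overlap)
      fold = (overlap / cluster) / (term_genes / universe)
      print(f"fold enrichment = {fold:.2f}, p = {p_value:.3e}")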

  20. Nanocellulose in green food packaging.

    PubMed

    Vilarinho, Fernanda; Sanches Silva, Ana; Vaz, M Fátima; Farinha, José Paulo

    2018-06-13

    The development of packaging materials with new functionalities and lower environmental impact is now an urgent need of our society. On one hand, the shelf-life extension of packaged products can be an answer to the exponential increase of worldwide demand for food. On the other hand, uncertainty of crude oil prices and reserves has imposed the necessity to find raw materials to replace oil-derived polymers. Additionally, consumers' awareness toward environmental issues increasingly pushes industries to look with renewed interest to "green" solutions. In response to these issues, numerous polymers have been exploited to develop biodegradable food packaging materials. Although the use of biopolymers has been limited due to their poor mechanical and barrier properties, these can be enhanced by adding reinforcing nanosized components to form nanocomposites. Cellulose is probably the most used and well-known renewable and sustainable raw material. The mechanical properties, reinforcing capabilities, abundance, low density, and biodegradability of nanosized cellulose make it an ideal candidate for polymer nanocomposites processing. Here we review the potential applications of cellulose based nanocomposites in food packaging materials, highlighting the several types of biopolymers with nanocellulose fillers that have been used to form bio-nanocomposite materials. The trends in nanocellulose packaging applications are also addressed.

  1. Poppr: an R package for genetic analysis of populations with mixed (clonal/sexual) reproduction

    USDA-ARS?s Scientific Manuscript database

    Poppr is an R package for analysis of population genetic data. It extends the adegenet package and provides several novel tools, particularly with regard to analysis of data from admixed, clonal, and/or sexual populations. Currently, poppr can be used for dominant/codominant and haploid/diploid gene...

  2. BasinVis 1.0: A MATLAB®-based program for sedimentary basin subsidence analysis and visualization

    NASA Astrophysics Data System (ADS)

    Lee, Eun Young; Novotny, Johannes; Wagreich, Michael

    2016-06-01

    Stratigraphic and structural mapping is important to understand the internal structure of sedimentary basins. Subsidence analysis provides significant insights for basin evolution. We designed a new software package to process and visualize stratigraphic setting and subsidence evolution of sedimentary basins from well data. BasinVis 1.0 is implemented in MATLAB®, a multi-paradigm numerical computing environment, and employs two numerical methods: interpolation and subsidence analysis. Five different interpolation methods (linear, natural, cubic spline, Kriging, and thin-plate spline) are provided in this program for surface modeling. The subsidence analysis consists of decompaction and backstripping techniques. BasinVis 1.0 incorporates five main processing steps: (1) setup (study area and stratigraphic units), (2) loading well data, (3) stratigraphic setting visualization, (4) subsidence parameter input, and (5) subsidence analysis and visualization. For in-depth analysis, our software provides cross-section and dip-slip fault backstripping tools. The graphical user interface guides users through the workflow and provides tools to analyze and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created by using MATLAB plotting functions, which enables users to fine-tune the results using the full range of available plot options in MATLAB. We demonstrate all functions in a case study of Miocene sediment in the central Vienna Basin.
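
    The decompaction and backstripping techniques mentioned above follow standard relations: an exponential porosity-depth law, an iterative solution for decompacted thickness, and an Airy-isostasy correction for the sediment load. The sketch below illustrates those relations with made-up layer depths and a single lithology; it is not BasinVis and ignores palaeo-water depth and eustatic corrections.

      # Decompaction with an exponential porosity-depth law and Airy backstripping of
      # the remaining column (no palaeo-water depth or eustatic terms). Layer depths
      # and lithology parameters are illustrative; this is not BasinVis.
      import numpy as np

      phi0, c = 0.63, 0.51e-3                          # surface porosity, compaction coeff (1/m)
      rho_w, rho_g, rho_m = 1000.0, 2720.0, 3300.0     # water, grain, mantle densities (kg/m3)

      def decompact(y1, y2, new_top):
          """Thickness of a unit now between depths y1 and y2 when its top moves to new_top."""
          solid = (y2 - y1) - phi0 / c * (np.exp(-c * y1) - np.exp(-c * y2))
          y2_new = new_top + solid                     # first guess: no pore space
          for _ in range(50):                          # fixed-point iteration
              y2_new = new_top + solid + phi0 / c * (np.exp(-c * new_top) - np.exp(-c * y2_new))
          return y2_new - new_top

      tops = np.array([0.0, 500.0, 1200.0, 2000.0])    # present unit boundaries (m), youngest first

      for strip in range(1, len(tops) - 1):            # progressively strip the youngest units
          new_top = 0.0
          for y1, y2 in zip(tops[strip:-1], tops[strip + 1:]):
              new_top += decompact(y1, y2, new_top)
          total = new_top
          phi_bar = phi0 / (c * total) * (1.0 - np.exp(-c * total))    # mean porosity, top at 0 m
          rho_bulk = phi_bar * rho_w + (1.0 - phi_bar) * rho_g
          tectonic = total * (rho_m - rho_bulk) / (rho_m - rho_w)      # Airy backstripping
          print(f"{strip} unit(s) removed: decompacted column {total:7.1f} m, "
                f"tectonic subsidence {tectonic:7.1f} m")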

  3. Web-Based Mapping Puts the World at Your Fingertips

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's award-winning Earth Resources Laboratory Applications Software (ELAS) package was developed at Stennis Space Center. Since 1978, ELAS has been used worldwide for processing satellite and airborne sensor imagery data of the Earth's surface into readable and usable information. DATASTAR Inc., of Picayune, Mississippi, has used ELAS software in the DATASTAR Image Processing Exploitation (DIPEx) desktop and Internet image processing, analysis, and manipulation software. The new DIPEx Version III includes significant upgrades and improvements compared to its esteemed predecessor. A true World Wide Web application, this product evolved with worldwide geospatial dimensionality and numerous other improvements that seamlessly support the World Wide Web version.

  4. Computer program documentation for the dynamic analysis of a noncontacting mechanical face seal

    NASA Technical Reports Server (NTRS)

    Auer, B. M.; Etsion, I.

    1980-01-01

    A computer program is presented which achieves a numerical solution for the equations of motion of a noncontacting mechanical face seal. The flexibly-mounted primary seal ring motion is expressed by a set of second order differential equations for three degrees of freedom. These equations are reduced to a set of first order equations and the GEAR software package is used to solve the set of first order equations. Program input includes seal design parameters and seal operating conditions. Output from the program includes velocities and displacements of the seal ring about the axis of an inertial reference system. One example problem is described.
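
    The reduction described above is the standard one: a second-order system M q'' + C q' + K q = f(t) is rewritten as twice as many first-order equations and handed to a stiff integrator. The sketch below shows that reduction for an illustrative three-degree-of-freedom system, with SciPy's BDF method standing in for the GEAR package; the matrices and forcing are not the seal model.

      # Reduction of a second-order system M q'' + C q' + K q = f(t) to first-order
      # form y' = g(t, y) with y = [q, q'], integrated with a stiff BDF method
      # (SciPy standing in for the GEAR package). Matrices and forcing are
      # illustrative only, not the seal model from the report.
      import numpy as np
      from scipy.integrate import solve_ivp

      M = np.diag([1.0, 0.05, 0.05])                    # axial mass, two tilt inertias
      C = np.diag([50.0, 2.0, 2.0])                     # damping (illustrative)
      K = np.array([[4.0e4, 0.0,   0.0],
                    [0.0,   1.5e3, 3.0e2],
                    [0.0,  -3.0e2, 1.5e3]])             # stiffness with tilt coupling
      M_inv = np.linalg.inv(M)

      def forcing(t):
          return np.array([10.0 * np.sin(200.0 * t), 0.0, 0.0])   # runout-type excitation

      def rhs(t, y):
          q, qdot = y[:3], y[3:]
          qddot = M_inv @ (forcing(t) - C @ qdot - K @ q)
          return np.concatenate([qdot, qddot])

      sol = solve_ivp(rhs, (0.0, 0.5), np.zeros(6), method="BDF",
                      rtol=1e-8, atol=1e-10)
      print("displacements at t = 0.5 s:", sol.y[:3, -1])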

  5. A comparison of InVivoStat with other statistical software packages for analysis of data generated from animal experiments.

    PubMed

    Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T

    2012-08-01

    InVivoStat is a free-to-use statistical software package for analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analysis of three experiments conducted using InVivoStat with other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides evidence of further validation of InVivoStat and should strengthen users' confidence in this new software package.

  6. Gene- and pathway-based association tests for multiple traits with GWAS summary statistics.

    PubMed

    Kwak, Il-Youp; Pan, Wei

    2017-01-01

    To identify novel genetic variants associated with complex traits and to shed new insights on underlying biology, in addition to the most popular single SNP-single trait association analysis, it would be useful to explore multiple correlated (intermediate) traits at the gene- or pathway-level by mining existing single GWAS or meta-analyzed GWAS data. For this purpose, we present an adaptive gene-based test and a pathway-based test for association analysis of multiple traits with GWAS summary statistics. The proposed tests are adaptive at both the SNP- and trait-levels; that is, they account for possibly varying association patterns (e.g. signal sparsity levels) across SNPs and traits, thus maintaining high power across a wide range of situations. Furthermore, the proposed methods are general: they can be applied to mixed types of traits, and to Z-statistics or P-values as summary statistics obtained from either a single GWAS or a meta-analysis of multiple GWAS. Our numerical studies with simulated and real data demonstrated the promising performance of the proposed methods. The methods are implemented in R package aSPU, freely and publicly available at: https://cran.r-project.org/web/packages/aSPU/. Contact: weip@biostat.umn.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Reproducibility of neuroimaging analyses across operating systems

    PubMed Central

    Glatard, Tristan; Lewis, Lindsay B.; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C.

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed. PMID:25964757

  8. Reproducibility of neuroimaging analyses across operating systems.

    PubMed

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
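
    A toy illustration of the precision issue highlighted above: in single precision, the same reduction can give visibly different answers depending on evaluation order, while the double-precision discrepancy is orders of magnitude smaller. The pipelines in the paper differ through system math libraries rather than an explicit reordering, so this only conveys the scale of the effect.

      # A toy demonstration of order-dependent floating-point summation: the same
      # data summed in two orders. Differences are typically visible in float32 and
      # far smaller in float64; the exact values depend on the NumPy version and
      # platform, which is the paper's point.
      import numpy as np

      rng = np.random.default_rng(42)
      x64 = rng.standard_normal(1_000_000) * 1.0e3
      x32 = x64.astype(np.float32)

      for x, label in ((x32, "float32"), (x64, "float64")):
          forward = float(x.sum())                             # one summation order
          shuffled = float(x[rng.permutation(x.size)].sum())   # another order
          print(f"{label}: |forward - shuffled| = {abs(forward - shuffled):.3e}")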

  9. Implications of Sea Level Rise on Coastal Flood Hazards

    NASA Astrophysics Data System (ADS)

    Roeber, V.; Li, N.; Cheung, K.; Lane, P.; Evans, R. L.; Donnelly, J. P.; Ashton, A. D.

    2012-12-01

    Recent global and local projections suggest the sea level will be on the order of 1 m or higher than the current level by the end of the century. Coastal communities and ecosystems in low-lying areas are vulnerable to impacts resulting from hurricane or large swell events in combination with sea-level rise. This study presents the implementation and results of an integrated numerical modeling package to delineate coastal inundation due to storm landfalls at future sea levels. The modeling package utilizes a suite of numerical models to capture both large-scale phenomena in the open ocean and small-scale processes in coastal areas. It contains four components to simulate (1) meteorological conditions, (2) astronomical tides and surge, (3) wave generation, propagation, and nearshore transformation, and (4) surf-zone processes and inundation onto dry land associated with a storm event. Important aspects of this package are the two-way coupling of a spectral wave model and a storm surge model as well as a detailed representation of surf and swash zone dynamics by a higher-order Boussinesq-type wave model. The package was validated with field data from Hurricane Ivan of 2004 on the US Gulf coast and applied to tropical and extratropical storm scenarios respectively at Eglin, Florida and Camp Lejeune, North Carolina. The results show a nonlinear increase of storm surge level and nearshore wave energy with a rising sea level. The exacerbated flood hazard can have major consequences for coastal communities with respect to erosion and damage to infrastructure.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spotz, William F.

    PyTrilinos is a set of Python interfaces to compiled Trilinos packages. This collection supports serial and parallel dense linear algebra, serial and parallel sparse linear algebra, direct and iterative linear solution techniques, algebraic and multilevel preconditioners, nonlinear solvers and continuation algorithms, eigensolvers and partitioning algorithms. Also included are a variety of related utility functions and classes, including distributed I/O, coloring algorithms and matrix generation. PyTrilinos vector objects are compatible with the popular NumPy Python package. As a Python front end to compiled libraries, PyTrilinos takes advantage of the flexibility and ease of use of Python, and the efficiency of the underlying C++, C and Fortran numerical kernels. This paper covers recent, previously unpublished advances in the PyTrilinos package.

  11. Numerical simulation of heat fluxes in a two-temperature plasma at shock tube walls

    NASA Astrophysics Data System (ADS)

    Kuznetsov, E. A.; Poniaev, S. A.

    2015-12-01

    Numerical simulation of a two-temperature three-component Xenon plasma flow is presented. A solver based on the OpenFOAM CFD software package is developed. The heat flux at the shock tube end wall is calculated and compared with experimental data. It is shown that the heat flux due to electrons can be as high as 14% of the total heat flux.

  12. Zγγγ → 0 Processes in SANC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.ru

    2013-11-15

    We describe the analytic and numerical evaluation of the γγ → γZ process cross section and the Z → γγγ decay rate within the SANC system multi-channel approach at the one-loop accuracy level, with all masses taken into account. The corresponding package for numeric calculations is presented. To check the correctness of the results, we compare them with other independent calculations.

  13. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    NASA Astrophysics Data System (ADS)

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.
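
    The equation these packages solve can be illustrated in its simplest setting: the linearized one-dimensional Poisson-Boltzmann equation for the potential near a charged plane, discretized with finite differences and compared against the analytic Debye-screening solution. This is only a sketch of the equation type; MPBEC and APBS solve the full nonlinear three-dimensional problem around real biomolecules.

      # Finite-difference solution of the linearized 1-D Poisson-Boltzmann equation
      # d2(psi)/dx2 = kappa^2 * psi with psi(0) = psi0 and psi(L) = 0, compared with
      # the analytic Debye-screening solution psi0*exp(-kappa*x). Parameter values
      # are illustrative (roughly 0.1 M 1:1 electrolyte in water).
      import numpy as np

      kappa = 1.0 / 0.96e-9                 # inverse Debye length (1/m)
      psi0 = 25.7e-3                        # surface potential (V)
      L, n = 10.0e-9, 400                   # domain length (m) and interior nodes
      x = np.linspace(0.0, L, n + 2)
      h = x[1] - x[0]

      # Interior equations: psi[i-1] - (2 + (kappa*h)^2) psi[i] + psi[i+1] = 0
      A = (np.diag(np.full(n, -2.0 - (kappa * h) ** 2))
           + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
      b = np.zeros(n)
      b[0] = -psi0                          # left Dirichlet value moved to the RHS

      psi = np.concatenate([[psi0], np.linalg.solve(A, b), [0.0]])
      exact = psi0 * np.exp(-kappa * x)
      print(f"max |numerical - analytic| = {np.max(np.abs(psi - exact)):.2e} V")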

  14. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations

    PubMed Central

    Vergara-Perez, Sandra; Marucho, Marcelo

    2015-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules. PMID:26924848

  15. MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations.

    PubMed

    Vergara-Perez, Sandra; Marucho, Marcelo

    2016-01-01

    One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually take several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and setup the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users pre- and post-analysis of structural and electrical properties of biomolecules.

  16. New Mexico Play Fairway Analysis: Particle Tracking ArcGIS Map Packages

    DOE Data Explorer

    Jeff Pepin

    2015-11-15

    These are map packages used to visualize geochemical particle-tracking analysis results in ArcGIS. It includes individual map packages for several regions of New Mexico including: Acoma, Rincon, Gila, Las Cruces, Socorro and Truth or Consequences.

  17. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. The largest package requires 205K of main memory on a DEC VAX running VMS. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  18. MEG and EEG data analysis with MNE-Python.

    PubMed

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-12-26

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne.
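
    A minimal MNE-Python pipeline in the spirit of the paper (read, filter, epoch, average) might look like the sketch below. It follows the package's published tutorials for the bundled sample dataset and assumes a reasonably recent MNE version; it is not code from the paper.

      # A minimal MNE-Python pipeline following the package's tutorials for the
      # bundled sample dataset (downloaded on first use). Event IDs and channel
      # names are those of that dataset; this is an illustration, not the paper's code.
      import os.path as op
      import mne

      data_path = mne.datasets.sample.data_path()
      raw_fname = op.join(str(data_path), "MEG", "sample", "sample_audvis_raw.fif")

      raw = mne.io.read_raw_fif(raw_fname, preload=True)
      raw.filter(l_freq=1.0, h_freq=40.0)                       # band-pass filter

      events = mne.find_events(raw, stim_channel="STI 014")     # stimulus triggers
      epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                          tmin=-0.2, tmax=0.5, baseline=(None, 0), preload=True)
      evoked = epochs.average()                                 # evoked response
      print(evoked)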

  19. MEG and EEG data analysis with MNE-Python

    PubMed Central

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A.; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne. PMID:24431986

  20. PresenceAbsence: An R package for presence absence analysis

    Treesearch

    Elizabeth A. Freeman; Gretchen Moisen

    2008-01-01

    The PresenceAbsence package for R provides a set of functions useful when evaluating the results of presence-absence analysis, for example, models of species distribution or the analysis of diagnostic tests. The package provides a toolkit for selecting the optimal threshold for translating a probability surface into presence-absence maps specifically tailored to their...

  1. Waste Package Component Design Methodology Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested to various levels of detail in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational requirements of the YMP. Four waste package configurations have been selected to illustrate the application of the methodology during the licensing process. These four configurations are the 21-pressurized water reactor absorber plate waste package (21-PWRAP), the 44-boiling water reactor waste package (44-BWR), the 5 defense high-level radioactive waste (HLW) DOE spent nuclear fuel (SNF) codisposal short waste package (5-DHLWDOE SNF Short), and the naval canistered SNF long waste package (Naval SNF Long). Design work for the other six waste packages will be completed at a later date using the same design methodology. These include the 24-boiling water reactor waste package (24-BWR), the 21-pressurized water reactor control rod waste package (21-PWRCR), the 12-pressurized water reactor waste package (12-PWR), the 5 defense HLW DOE SNF codisposal long waste package (5-DHLWDOE SNF Long), the 2 defense HLW DOE SNF codisposal waste package (2-MC012-DHLW), and the naval canistered SNF short waste package (Naval SNF Short). This report is only part of the complete design description. Other reports related to the design include the design reports, the waste package system description documents, manufacturing specifications, and numerous documents for the many detailed calculations. The relationships between this report and other design documents are shown in Figure 1.

  2. MODEL 9977 B(M)F-96 SAFETY ANALYSIS REPORT FOR PACKAGING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, G.; Blanton, P.; Eberl, K.

    2006-05-18

    This Safety Analysis Report for Packaging (SARP) documents the analysis and testing performed on and for the 9977 Shipping Package, referred to as the General Purpose Fissile Package (GPFP). The performance evaluation presented in this SARP documents the compliance of the 9977 package with the regulatory safety requirements for Type B packages. Per 10 CFR 71.59, for the 9977 packages evaluated in this SARP, the value of "N" is 50, and the Transport Index based on nuclear criticality control is 1.0. The 9977 package is designed with a high degree of single containment. The 9977 complies with 10 CFR 71 (2002), Department of Energy (DOE) Order 460.1B, DOE Order 460.2, and 10 CFR 20 (2003) for As Low As Reasonably Achievable (ALARA) principles. The 9977 also satisfies the requirements of the Regulations for the Safe Transport of Radioactive Material, 1996 Edition (Revised), IAEA Safety Standards, Safety Series No. TS-R-1 (ST-1, Rev.), International Atomic Energy Agency, Vienna, Austria (2000). The 9977 package is designed, analyzed, and fabricated in accordance with Section III of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel (B&PV) Code, 1992 edition.

  3. An alternative to FASTSIM for tangential solution of the wheel-rail contact

    NASA Astrophysics Data System (ADS)

    Sichani, Matin Sh.; Enblom, Roger; Berg, Mats

    2016-06-01

    In most rail vehicle dynamics simulation packages, the tangential solution of the wheel-rail contact is obtained by means of Kalker's FASTSIM algorithm. While 5-25% error is expected for creep force estimation, the errors in the shear stress distribution, needed for wheel-rail damage analysis, may rise above 30% due to the parabolic traction bound. Therefore, a novel algorithm named FaStrip is proposed as an alternative to FASTSIM. It is based on the strip theory, which extends the two-dimensional rolling contact solution to three-dimensional contacts. To form FaStrip, the original strip theory is amended to obtain accurate estimations for any contact ellipse size, and it is combined with a numerical algorithm to handle spin. The comparison between the two algorithms shows that using FaStrip improves the accuracy of the estimated shear stress distribution and the creep force estimation in all studied cases. In combined lateral creepage and spin cases, for instance, the error in force estimation reduces from 18% to less than 2%. The estimation of the slip velocities in the slip zone, needed for wear analysis, is also studied. Since FaStrip is as fast as FASTSIM, it can serve as an alternative for the tangential solution of the wheel-rail contact in simulation packages.
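    For readers unfamiliar with FASTSIM-style algorithms, the following sketch marches over the contact ellipse, accumulates elastic shear from a single longitudinal creepage, and caps it at a parabolic traction bound. It is a deliberately reduced illustration (no lateral creepage or spin, a single assumed flexibility L, hypothetical contact parameters), not FaStrip or Kalker's production code.

    ```python
    # Minimal FASTSIM-style sketch for longitudinal creepage only, with a
    # parabolic traction bound. All parameter values are hypothetical.
    import numpy as np

    def fastsim_like(a, b, mu, p0, creep_x, L, nx=50, ny=40):
        """March over the contact ellipse from leading to trailing edge,
        accumulating elastic shear and capping it at the traction bound."""
        Fx = 0.0
        for y in np.linspace(-b, b, ny):
            half = a * np.sqrt(max(0.0, 1.0 - (y / b) ** 2))   # strip half-length
            if half == 0.0:
                continue
            dx = 2 * half / nx
            px = 0.0                                 # tangential stress along the strip
            for i in range(nx):
                x = half - (i + 0.5) * dx            # march from leading edge (x = +half)
                px += creep_x * dx / L               # elastic increment from rigid slip
                bound = max(0.0, mu * p0 * (1 - (x / a) ** 2 - (y / b) ** 2))
                if abs(px) > bound:                  # slip zone: stress saturates
                    px = np.sign(px) * bound
                Fx += px * dx * (2 * b / ny)
        return Fx

    # hypothetical contact: 6 x 4 mm semi-axes, friction 0.3, peak pressure 1 GPa
    print(fastsim_like(a=6e-3, b=4e-3, mu=0.3, p0=1e9, creep_x=1e-3, L=5e-14))
    ```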

  4. Documentation for the MODFLOW 6 Groundwater Flow Model

    USGS Publications Warehouse

    Langevin, Christian D.; Hughes, Joseph D.; Banta, Edward R.; Niswonger, Richard G.; Panday, Sorab; Provost, Alden M.

    2017-08-10

    This report documents the Groundwater Flow (GWF) Model for a new version of MODFLOW called MODFLOW 6. The GWF Model for MODFLOW 6 is based on a generalized control-volume finite-difference approach in which a cell can be hydraulically connected to any number of surrounding cells. Users can define the model grid using one of three discretization packages, including (1) a structured discretization package for defining regular MODFLOW grids consisting of layers, rows, and columns, (2) a discretization by vertices package for defining layered unstructured grids consisting of layers and cells, and (3) a general unstructured discretization package for defining flexible grids comprised of cells and their connection properties. For layered grids, a new capability is available for removing thin cells and vertically connecting cells overlying and underlying the thin cells. For complex problems involving water-table conditions, an optional Newton-Raphson formulation, based on the formulations in MODFLOW-NWT and MODFLOW-USG, can be activated. Use of the Newton-Raphson formulation will often improve model convergence and allow solutions to be obtained for difficult problems that cannot be solved using the traditional wetting and drying approach. The GWF Model is divided into “packages,” as was done in previous MODFLOW versions. A package is the part of the model that deals with a single aspect of simulation. Packages included with the GWF Model include those related to internal calculations of groundwater flow (discretization, initial conditions, hydraulic conductance, and storage), stress packages (constant heads, wells, recharge, rivers, general head boundaries, drains, and evapotranspiration), and advanced stress packages (streamflow routing, lakes, multi-aquifer wells, and unsaturated zone flow). An additional package is also available for moving water available in one package into the individual features of the advanced stress packages. The GWF Model also has packages for obtaining and controlling output from the model. This report includes detailed explanations of physical and mathematical concepts on which the GWF Model and its packages are based. Like its predecessors, MODFLOW 6 is based on a highly modular structure; however, this structure has been extended into an object-oriented framework. The framework includes a robust and generalized numerical solution object, which can be used to solve many different types of models. The numerical solution object has several different matrix preconditioning options as well as several methods for solving the linear system of equations. In this new framework, the GWF Model itself is an object as are each of the GWF Model packages. A benefit of the object-oriented structure is that multiple objects of the same type can be used in a single simulation. Thus, a single forward run with MODFLOW 6 may contain multiple GWF Models. GWF Models can be hydraulically connected using GWF-GWF Exchange objects. Connecting GWF models in different ways permits the user to utilize a local grid refinement strategy consisting of parent and child models or to couple adjacent GWF Models. An advantage of the approach implemented in MODFLOW 6 is that multiple models and their exchanges can be incorporated into a single numerical solution object. With this design, models can be tightly coupled at the matrix level.
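    The conductance-based, control-volume finite-difference idea underlying the GWF Model can be shown in a few lines. The sketch below assembles and solves a one-dimensional steady-state flow problem with constant-head boundaries; grid size and hydraulic properties are hypothetical, and the code is not part of MODFLOW 6.

    ```python
    # Toy control-volume finite-difference groundwater flow in 1-D, steady state,
    # with fixed heads at both ends. All property values are hypothetical.
    import numpy as np

    n = 10                      # number of cells
    dx = 100.0                  # cell size (m)
    K = 10.0                    # hydraulic conductivity (m/d)
    thickness = 20.0            # aquifer thickness (m)
    cond = K * thickness / dx   # inter-cell conductance per unit width

    A = np.zeros((n, n))
    b = np.zeros(n)
    h_left, h_right = 100.0, 95.0

    for i in range(n):
        if i > 0:
            A[i, i] -= cond; A[i, i - 1] += cond
        else:
            A[i, i] -= 2 * cond; b[i] -= 2 * cond * h_left    # constant-head boundary
        if i < n - 1:
            A[i, i] -= cond; A[i, i + 1] += cond
        else:
            A[i, i] -= 2 * cond; b[i] -= 2 * cond * h_right

    heads = np.linalg.solve(A, b)
    print(heads)   # grades linearly from 100 m to 95 m
    ```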

  5. On the limits of numerical astronomical solutions used in paleoclimate studies

    NASA Astrophysics Data System (ADS)

    Zeebe, Richard E.

    2017-04-01

    Numerical solutions of the equations of the Solar System estimate Earth's orbital parameters in the past and represent the backbone of cyclostratigraphy and astrochronology, now widely applied in geology and paleoclimatology. Given one numerical realization of a Solar System model (i.e., obtained using one code or integrator package), various parameters determine the properties of the solution and usually limit its validity to a certain time period. Such limitations are denoted here as "internal" and include limitations due to (i) the underlying physics/physical model and (ii) numerics. The physics include initial coordinates and velocities of Solar System bodies, treatment of the Moon and asteroids, the Sun's quadrupole moment, and the intrinsic dynamics of the Solar System itself, i.e., its chaotic nature. Numerical issues include solver algorithm, numerical accuracy (e.g., time step), and round-off errors. At present, internal limitations seem to restrict the validity of astronomical solutions to perhaps the past 50 or 60 myr. However, little is currently known about "external" limitations, that is, how do different numerical realizations compare, say, between different investigators using different codes and integrators? Hitherto only two solutions for Earth's eccentricity appear to be used in paleoclimate studies, provided by two different groups that integrated the full Solar System equations over the past >100 myr (Laskar and coworkers and Varadi et al. 2003). In this contribution, I will present results from new Solar System integrations for Earth's eccentricity obtained using the integrator package HNBody (Rauch and Hamilton 2002). I will discuss the various internal limitations listed above within the framework of the present simulations. I will also compare the results to the existing solutions, the details of which are still being sorted out as several simulations are still running at the time of writing.
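    To make the numerical issues (solver algorithm, time step, round-off) concrete, here is a minimal symplectic (leapfrog) integration of a single Sun-Earth pair. It is a toy, not a Solar System integrator such as HNBody; units and the time step are chosen only for illustration.

    ```python
    # Leapfrog (kick-drift-kick) integration of a two-body orbit.
    # Units: AU, years, solar masses; only the Sun-Earth pair is included.
    import numpy as np

    G = 4 * np.pi ** 2          # gravitational constant in AU^3 / (Msun * yr^2)
    m_sun = 1.0

    def accel(r):
        """Acceleration of the planet due to the Sun at the origin."""
        return -G * m_sun * r / np.linalg.norm(r) ** 3

    r = np.array([1.0, 0.0])            # 1 AU
    v = np.array([0.0, 2 * np.pi])      # circular orbital speed ~ 2*pi AU/yr
    dt = 1e-3                           # time step in years

    v += 0.5 * dt * accel(r)            # initial half kick
    for _ in range(int(100 / dt)):      # integrate 100 years
        r += dt * v                     # drift
        v += dt * accel(r)              # kick
    v -= 0.5 * dt * accel(r)            # synchronize velocity with position

    print("radius after 100 yr:", np.linalg.norm(r))   # stays close to 1 AU
    ```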

  6. Draco,Version 6.x.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Kelly; Budge, Kent; Lowrie, Rob

    2016-03-03

    Draco is an object-oriented component library geared towards numerically intensive, radiation (particle) transport applications built for parallel computing hardware. It consists of semi-independent packages and a robust build system. The packages in Draco provide a set of components that can be used by multiple clients to build transport codes. The build system can also be extracted for use in clients. The software includes smart pointers, Design-by-Contract assertions, a unit test framework, wrapped MPI functions, a file parser, unstructured mesh data structures, a random number generator, root finders, and an angular quadrature component.

  7. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

    Hardware and software systems developed to further research and education in interactive computer graphics were described, as well as several of the ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN 4 subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  8. Impact of Cloud Analysis on Numerical Weather Prediction in the Galician Region of Spain.

    NASA Astrophysics Data System (ADS)

    Souto, M. J.; Balseiro, C. F.; Pérez-Muñuzuri, V.; Xue, M.; Brewster, K.

    2003-01-01

    The Advanced Regional Prediction System (ARPS) is applied to operational numerical weather prediction in Galicia, northwest Spain. The model is run daily for 72-h forecasts at a 10-km horizontal spacing. Located on the northwest coast of Spain and influenced by the Atlantic weather systems, Galicia has a high percentage (nearly 50%) of rainy days per year. For these reasons, the precipitation processes and the initialization of moisture and cloud fields are very important. Even though the ARPS model has a sophisticated data analysis system (ADAS) that includes a 3D cloud analysis package, because of operational constraints, the current forecast starts from the 12-h forecast of the National Centers for Environmental Prediction Aviation Model (AVN). Still, procedures from the ADAS cloud analysis are being used to construct the cloud fields based on AVN data and then are applied to initialize the microphysical variables in ARPS. Comparisons of the ARPS predictions with local observations show that ARPS can predict very well both the daily total precipitation and its spatial distribution. ARPS also shows skill in predicting heavy rains and high winds, as observed during November 2000, and especially in the prediction of the 5 November 2000 storm that caused widespread wind and rain damage in Galicia. It is demonstrated that the cloud analysis contributes to the success of the precipitation forecasts.

  9. Encasement and subsidence of salt minibasins: observations from the SE Precaspian Basin and numerical modeling.

    NASA Astrophysics Data System (ADS)

    Fernandez, Naiara; Duffy, Oliver B.; Hudec, Michael R.; Jackson, Christopher A.-L.; Dooley, Tim P.; Jackson, Martin P. A.; Burg, George

    2017-04-01

    The SE Precaspian Basin is characterized by an assemblage of Upper Permian to Triassic minibasins. A recently acquired borehole-constrained 3D reflection dataset reveals the existence of abundant intrasalt reflection packages lying in between the Permo-Triassic minibasins. We propose that most of the mapped intrasalt reflection packages in the study area are minibasins originally deposited on top of salt that were later incorporated into salt by encasement processes. This makes the SE Precaspian Basin a new example of a salt province populated by encased minibasins, which until now had been mainly described from the Gulf of Mexico. Identifying salt-encased sediment packages in the study area has been crucial, not only because they provide a new exploration target, but also because they can play a key role on improving seismic imaging of adjacent or deeper stratigraphic sections. Another remarkable feature observed in the seismic dataset is the widespread occurrence of distinct seismic sequences in the Permo-Triassic minibasins. Bowl- and wedge-shaped seismic sequences define discrete periods of vertical and asymmetric minibasin subsidence. In the absence of shortening, the bowl-to-wedge transition is typically associated with the timing of basal welding and subsequent rotation of the minibasins. Timing of minibasin welding has important implications when addressing the likelihood of suprasalt reservoir charging. We performed a set of 2D numerical simulations aimed at investigating what drives the tilting of minibasins and how it relates to welding. A key observation from the numerical models is that the bowl-to-wedge transition can predate the time of basal welding.

  10. DESIGN ANALYSIS FOR THE DEFENSE HIGH-LEVEL WASTE DISPOSAL CONTAINER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. Radulescu; J.S. Tang

    The purpose of the "Design Analysis for the Defense High-Level Waste Disposal Container" analysis is to technically define the defense high-level waste (DHLW) disposal container/waste package using the Waste Package Department's (WPD) design methods, as documented in "Waste Package Design Methodology Report" (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000a). The DHLW disposal container is intended for disposal of commercial high-level waste (HLW) and DHLW (including immobilized plutonium waste forms), placed within disposable canisters. The U.S. Department of Energy (DOE)-managed spent nuclear fuel (SNF) in disposable canisters may also be placed in a DHLW disposal container along with HLW forms. The objective of this analysis is to demonstrate that the DHLW disposal container/waste package satisfies the project requirements, as embodied in the Defense High Level Waste Disposal Container System Description Document (SDD) (CRWMS M&O 1999a), and additional criteria, as identified in the Waste Package Design Sensitivity Report (CRWMS M&O 2000b, Table 4). The analysis briefly describes the analytical methods appropriate for the design of the DHLW disposal container/waste package and summarizes the results of the calculations that illustrate the analytical methods. However, the analysis is limited to the calculations selected for the DHLW disposal container in support of the Site Recommendation (SR) (CRWMS M&O 2000b, Section 7). The scope of this analysis is restricted to the design of the codisposal waste package of the Savannah River Site (SRS) DHLW glass canisters and the Training, Research, Isotopes General Atomics (TRIGA) SNF loaded in a short 18-in.-outer diameter (OD) DOE standardized SNF canister. This waste package is representative of the waste packages that consist of the DHLW disposal container, the DHLW/HLW glass canisters, and the DOE-managed SNF in disposable canisters. The intended use of this analysis is to support Site Recommendation reports and to assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the Development Plan "Design Analysis for the Defense High-Level Waste Disposal Container" (CRWMS M&O 2000c) with no deviations from the plan.

  11. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  12. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  13. Concurrent Cuba

    NASA Astrophysics Data System (ADS)

    Hahn, T.

    2016-10-01

    The parallel version of the multidimensional numerical integration package Cuba is presented and achievable speed-ups discussed. The parallelization is based on the fork/wait POSIX functions, needs no extra software installed, imposes almost no constraints on the integrand function, and works largely automatically.
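    A hedged illustration of the same parallelization idea in Python: the integrand is sampled in independent worker processes (created by fork on POSIX systems via multiprocessing) and the partial Monte Carlo estimates are averaged. The integrand and sample counts are hypothetical; this is not the Cuba library interface.

    ```python
    # Fork-based parallel Monte Carlo integration of a 3-D test integrand.
    import multiprocessing as mp
    import numpy as np

    def integrand(x):
        """A 3-dimensional test integrand on the unit cube."""
        return np.exp(-np.sum(x ** 2, axis=1))

    def partial_estimate(args):
        seed, n = args
        rng = np.random.default_rng(seed)
        x = rng.random((n, 3))
        return integrand(x).mean()

    if __name__ == "__main__":
        n_workers, n_per_worker = 4, 250_000
        with mp.Pool(n_workers) as pool:
            parts = pool.map(partial_estimate,
                             [(s, n_per_worker) for s in range(n_workers)])
        print("integral estimate:", np.mean(parts))
    ```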

  14. Environmental assessment of packaging: Sense and sensibility

    NASA Astrophysics Data System (ADS)

    Kooijman, Jan M.

    1993-09-01

    The functions of packaging are derived from product requirements; thus, for insight into the environmental effects of packaging, the actual combination of product and package has to be evaluated along the production and distribution system. This extension to all related environmental aspects adds realism to the environmental analysis and provides guidance for design, while preventing an overly detailed investigation of parts of the production system. This approach is contrary to current environmental studies, where packaging is always treated as an independent object, neglecting the more important environmental effects of the product that are influenced by packaging. The general analysis and quantification stages for this approach are described, and the currently available methods for the assessment of environmental effects are reviewed. To limit the workload involved in an environmental assessment, a step-by-step analysis and the use of feedback are recommended. First, the dominant environmental effects of a particular product and its production and distribution are estimated. Then, on the basis of these preliminary results, the appropriate system boundaries are chosen and the need for further or more detailed environmental analysis is determined. For typical food and drink applications, the effect of different system boundaries on the outcome of environmental assessments and the advantage of the step-by-step analysis of the food supply system are shown. It appears that, depending on the consumer group, different advice for the reduction of environmental effects has to be given. Furthermore, because of the interrelated environmental effects of the food supply system, the continuing quest for more detailed and accurate analysis of the package components is not necessary for improved management of the environmental effects of packaging.

  15. Numerical model of solar dynamic radiator for parametric analysis

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer L.

    1989-01-01

    Growth power requirements for Space Station Freedom will be met through the addition of 25 kW solar dynamic (SD) power modules. The SD module rejects waste heat from the power conversion cycle to space through a pumped-loop, multi-panel, deployable radiator. The baseline radiator configuration was defined during the Space Station conceptual design phase and is a function of the state point and heat rejection requirements of the power conversion unit. Requirements determined by the overall station design, such as mass, system redundancy, micrometeoroid and space debris impact survivability, launch packaging, costs, and thermal and structural interaction with other station components, have also been design drivers for the radiator configuration. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates the heat transfer and fluid flow performance of the radiator and calculates area, mass, and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations. A brief description and discussion of the numerical model, its capabilities and limitations, and the results of the parametric studies performed are presented.
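    A much-reduced sketch of the kind of calculation such a radiator model performs: coolant flows along the panel and rejects heat to space by radiation from successive segments. All fluid, panel, and environmental values below are hypothetical.

    ```python
    # Marching heat-rejection estimate for a pumped-loop radiator panel.
    sigma = 5.670e-8        # Stefan-Boltzmann constant, W/m^2/K^4
    eps = 0.85              # panel emissivity (assumed)
    T_sink = 250.0          # effective sink temperature, K (assumed)
    mdot, cp = 0.5, 2000.0  # coolant mass flow (kg/s) and specific heat (J/kg/K)
    T_in = 390.0            # coolant inlet temperature, K
    area_total, n_seg = 40.0, 100

    T = T_in
    Q_rejected = 0.0
    dA = area_total / n_seg
    for _ in range(n_seg):
        q = eps * sigma * dA * (T ** 4 - T_sink ** 4)   # segment heat rejection
        T -= q / (mdot * cp)                            # coolant temperature drop
        Q_rejected += q

    print(f"outlet temperature: {T:.1f} K, heat rejected: {Q_rejected/1000:.1f} kW")
    ```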

  16. Safety analysis report -- Packages LP-50 tritium package (Packaging of fissile and other radioactive materials)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gates, A.A.; McCarthy, P.G.; Edl, J.W.

    1975-05-01

    Elemental tritium is shipped at low pressure in a stainless steel container (LP-50) surrounded by an aluminum vessel and Celotex insulation at least 4 in. thick in a steel drum. Each package contains a large quantity (greater than a Type A quantity) of nonfissile material, as defined in AECM 0529. This report provides the details of the safety analysis performed for this type of container.

  17. Comparison of requirements and capabilities of major multipurpose software packages.

    PubMed

    Igo, Robert P; Schnell, Audrey H

    2012-01-01

    The aim of this chapter is to introduce the reader to commonly used software packages and illustrate their input requirements, analysis options, strengths, and limitations. We focus on packages that perform more than one function and include a program for quality control, linkage, and association analyses. Additional inclusion criteria were (1) programs that are free to academic users and (2) currently supported, maintained, and developed. Using those criteria, we chose to review three programs: Statistical Analysis for Genetic Epidemiology (S.A.G.E.), PLINK, and Merlin. We will describe the required input format and analysis options. We will not go into detail about every possible program in the packages, but we will give an overview of the packages' requirements and capabilities.

  18. Lunar surface structural concepts and construction studies

    NASA Technical Reports Server (NTRS)

    Mikulas, Martin

    1991-01-01

    The topics are presented in viewgraph form and include the following: lunar surface structures construction research areas; lunar crane related disciplines; shortcomings of typical mobile crane in lunar base applications; candidate crane cable suspension systems; NIST six-cable suspension crane; numerical example of natural frequency; the incorporation of two new features for improved performance of the counter-balanced actively-controlled lunar crane; lunar crane pendulum mechanics; simulation results; 1/6 scale lunar crane testbed using GE robot for global manipulation; basic deployable truss approaches; bi-pantograph elevator platform; comparison of elevator platforms; perspective of bi-pantograph beam; bi-pantograph synchronously deployable tower/beam; lunar module off-loading concept; module off-loader concept packaged; starburst deployable precision reflector; 3-ring reflector deployment scheme; cross-section of packaged starburst reflector; and focal point and thickness packaging considerations.

  19. Potential migration of buoyant LNAPL from intermediate level waste (ILW) emplaced in a geological disposal facility (GDF) for U.K. radioactive waste.

    PubMed

    Benbow, Steven J; Rivett, Michael O; Chittenden, Neil; Herbert, Alan W; Watson, Sarah; Williams, Steve J; Norris, Simon

    2014-10-15

    A safety case for the disposal of Intermediate Level (radioactive) Waste (ILW) in a deep geological disposal facility (GDF) requires consideration of the potential for waste-derived light non-aqueous phase liquid (LNAPL) to migrate under positive buoyancy from disposed waste packages. Were entrainment of waste-derived radionuclides in LNAPL to occur, such migration could result in a shorter overall travel time to environmental or human receptors than radionuclide migration solely associated with the movement of groundwater. This paper provides a contribution to the assessment of this issue through multiphase-flow numerical modelling underpinned by a review of the UK's ILW inventory and literature to define the nature of the associated ILW LNAPL source term. Examination has been at the waste package-local GDF environment scale to determine whether proposed disposal of ILW would lead to significant likelihood of LNAPL migration, both from waste packages and from a GDF vault into the local host rock. Our review and numerical modelling support the proposition that the release of a discrete free-phase LNAPL from ILW would not present a significant challenge to the safety case, even with conservative approximations. 'As-disposed' LNAPL emplaced with the waste is not expected to pose a significant issue. 'Secondary LNAPL' generated in situ within the disposed ILW, arising from the decomposition of plastics, in particular PVC (polyvinyl chloride), could form the predominant LNAPL source term. Released high molecular weight phthalate plasticizers are judged to be the primary LNAPL potentially generated. These are expected to have low buoyancy-based mobility due to their very low density contrast with water and high viscosity. Due to the inherent uncertainties, significant conservatisms were adopted within the numerical modelling approach, including: the simulation of a wastestream (2D03) with a deliberately high organic material (PVC) content within an annular grouted waste package vulnerable to LNAPL release; upper bound inventory estimates of LNAPLs; incorporating the lack of any hydraulic resistance of the package vent; the lack of any degradation of dissolved LNAPL; and, significantly, the small threshold displacement pressure assumed at which LNAPL is able to enter initially water-saturated pores. Initial scoping calculations on the latter suggested that the rate at which LNAPL is able to migrate from a waste package is likely to be very small and insignificant for likely representative displacement pressure data: this represents a key result. Adopting a conservative displacement pressure, however, allowed the effect of other features and processes in the system to be assessed. High LNAPL viscosity together with low density contrast with water reduces LNAPL migration potential. Migration to the host rock is less likely if waste package vent fluxes are small, solubility limits are high and path lengths through the backfill are short. The capacity of the system to dissolve all of the free LNAPL will, however, depend on groundwater availability. Even with the conservatisms invoked, the overall conclusion of model simulations of intact and compromised (cracked or corroded) waste packages, for a range of realistic ILW LNAPL scenarios, is that it is unlikely that significant LNAPL would be able to migrate from the waste packages and even more unlikely that it would be sufficiently persistent to reach the host rock immediately beyond the GDF.
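    The buoyancy argument can be checked with a back-of-envelope calculation: LNAPL enters water-saturated pores only if its buoyant driving pressure exceeds the threshold displacement (entry) pressure. The numbers below are hypothetical but reflect the small density contrast noted for phthalate plasticizers.

    ```python
    # Buoyant driving pressure of a connected LNAPL column vs. candidate
    # displacement (entry) pressures. All values are hypothetical.
    rho_water = 1000.0      # kg/m^3
    rho_lnapl = 980.0       # kg/m^3 (small density contrast, as for phthalates)
    g = 9.81                # m/s^2
    column_height = 0.5     # m of connected LNAPL beneath a barrier

    buoyant_pressure = (rho_water - rho_lnapl) * g * column_height   # Pa

    for p_entry in (50.0, 500.0, 5000.0):      # candidate entry pressures, Pa
        migrates = buoyant_pressure > p_entry
        print(f"entry pressure {p_entry:7.1f} Pa -> "
              f"buoyant pressure {buoyant_pressure:6.1f} Pa, migration: {migrates}")
    ```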

  20. How Many Peripheral Solder Joints in a Surface Mounted Design Experience Inelastic Strains?

    NASA Astrophysics Data System (ADS)

    Suhir, E.; Yi, S.; Ghaffarian, R.

    2017-03-01

    It has been established that it is the peripheral solder joints that are the most vulnerable in the ball-grid-array (BGA) and column-grid-array (CGA) designs and most often fail. As far as the long-term reliability of a soldered microelectronics assembly as a whole is concerned, it makes a difference, if just one or more peripheral joints experience inelastic strains. It is clear that the low cycle fatigue lifetime of the solder system is inversely proportional to the number of joints that simultaneously experience inelastic strains. A simple and physically meaningful analytical expression (formula) is obtained for the prediction, at the design stage, of the number of such joints, if any, for the given effective thermal expansion (contraction) mismatch of the package and PCB; materials and geometrical characteristics of the package/PCB assembly; package size; and, of course, the level of the yield stress in the solder material. The suggested formula can be used to determine if the inelastic strains in the solder material could be avoided by the proper selection of the above characteristics and, if not, how many peripheral joints are expected to simultaneously experience inelastic strains. The general concept is illustrated by a numerical example carried out for a typical BGA package. The suggested analytical model (formula) is applicable to any soldered microelectronics assembly. The roles of other important factors, such as, e.g., solder material anisotropy, grain size, and their random orientation within a joint, are viewed in this analysis as less important factors than the level of the interfacial stress. The roles of these factors will be accounted for in future work and considered, in addition to the location of the joint, in a more complicated, more sophisticated, and more comprehensive reliability/fatigue model.
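    The following sketch conveys the general idea of counting joints that go inelastic, using the classical distance-from-neutral-point shear-strain estimate; it is not the specific analytical formula derived in the paper, and all package and material values are hypothetical.

    ```python
    # Count of BGA joints whose thermally induced shear strain exceeds an
    # assumed yield strain; the strain grows with distance from the package centre.
    import numpy as np

    alpha_pkg, alpha_pcb = 7e-6, 17e-6     # CTEs, 1/K (assumed)
    dT = 80.0                              # temperature swing, K
    pitch, n = 1.0e-3, 32                  # ball pitch (m), balls per side
    h = 0.5e-3                             # joint (standoff) height, m
    gamma_yield = 0.03                     # assumed yield shear strain (illustrative)

    # joint coordinates on a full square grid, measured from the package centre
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    X, Y = np.meshgrid(coords, coords)
    dnp = np.sqrt(X ** 2 + Y ** 2)                      # distance from neutral point

    gamma = abs(alpha_pcb - alpha_pkg) * dT * dnp / h   # imposed shear strain
    n_inelastic = int(np.sum(gamma > gamma_yield))
    print(f"{n_inelastic} of {n * n} joints exceed the yield strain")
    ```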

  1. Thermal management of LEDs: package to system

    NASA Astrophysics Data System (ADS)

    Arik, Mehmet; Becker, Charles A.; Weaver, Stanton E.; Petroski, James

    2004-01-01

    Light emitting diodes (LEDs) have historically been used as indicators and produced low amounts of heat. The introduction of high-brightness LEDs with white light and monochromatic colors has led to a movement towards general illumination. The increased electrical currents used to drive the LEDs have focused more attention on the thermal paths in the development of LED power packaging. The luminous efficiency of LEDs is soon expected to reach over 80 lumens/W, which is approximately 6 times the efficiency of a conventional incandescent tungsten bulb. Thermal management for solid-state lighting applications is a key design consideration at both the package and system levels, and package-level and system-level thermal management are discussed in separate sections. The effect of the chip package on junction-to-board thermal resistance was compared for SiC and sapphire chips. The higher thermal conductivity of the SiC chip provided about 2 times better thermal performance, which only an under-filled sapphire chip package can approach. System-level thermal management was then studied using established numerical models for a conceptual solid-state lighting system: a conceptual LED illumination system was chosen and CFD models were created to determine the availability and limitations of passive air cooling.
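    Package-level comparisons of this kind often reduce to a series thermal-resistance network. The sketch below estimates junction temperature for two hypothetical chip options; the resistance values are illustrative and are not data from the paper.

    ```python
    # Series thermal-resistance estimate of LED junction temperature.
    # All resistance values (K/W) and the power level are hypothetical.
    def junction_temp(power_w, r_chip, r_attach, r_board, t_ambient=25.0):
        """Junction temperature for a series thermal path."""
        return t_ambient + power_w * (r_chip + r_attach + r_board)

    power = 3.0   # W of dissipated heat
    for name, r_chip in (("SiC chip", 2.0), ("sapphire chip", 4.0)):
        tj = junction_temp(power, r_chip, r_attach=3.0, r_board=8.0)
        print(f"{name}: junction temperature ~ {tj:.1f} C")
    ```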

  2. Numerical modeling and experimental validation of the acoustic transmission of aircraft's double-wall structures including sound package

    NASA Astrophysics Data System (ADS)

    Rhazi, Dilal

    In the field of aeronautics, reducing the harmful effects of noise constitutes a major concern at the international level and justifies the call for further research, particularly in Canada, where aeronautics is a key economic sector operating in a context of global competition. An aircraft sidewall structure is usually of double-wall construction, with a curved, ribbed metallic skin and a lightweight composite or sandwich trim separated by a cavity filled with a noise control treatment. The latter is of great importance in the transport industry and continues to be of interest in many engineering applications. However, the insertion loss of a noise control treatment depends on the excitation of the supporting structure. In particular, turbulent boundary layer excitation is of interest to several industries. This excitation is difficult to simulate in laboratory conditions, given the prohibitive costs and difficulties associated with wind tunnel and in-flight tests. Numerical simulation is the only practical way to predict the response to such excitations and to analyze the effects of design changes on that response. Other kinds of excitation encountered in industry are the monopole, rain-on-the-roof, and diffuse acoustic field excitations. Deterministic methods can calculate the spectral response of the system at each point. The best known are numerical methods such as the finite element and boundary element methods. These methods generally apply to the low-frequency range, where the modal behavior of the structure dominates. However, the upper frequency limit of these methods cannot be defined in a strict way, because it is related to the available computing capacity and to the nature of the mechanical system studied. With these challenges in mind, and given the limitations of the main numerical codes on the market, manufacturers have expressed the need for simple models immediately available as early as the preliminary design stage. This thesis represents an attempt to address this need. A numerical tool based on two approaches (wave and modal) is developed. It allows fast computation of the vibroacoustic response of multilayer structures over the full frequency spectrum and for various kinds of excitations (monopole, rain on the roof, diffuse acoustic field, turbulent boundary layer). A comparison between results obtained by the developed model, experimental tests, and the finite element method is given and discussed. The results are very promising with respect to the potential of such a model for industrial use as a prediction tool, and even for design. The code can also be integrated within an SEA (Statistical Energy Analysis) strategy in order to model a full vehicle, in particular by computing the insertion loss and the equivalent damping added by the sound package. Keywords: Transfer Matrix Method, Wave Approach, Turbulent Boundary Layer, Rain on the Roof, Monopole, Insertion Loss, Double-wall, Sound Package.
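    As a small numerical companion to the double-wall discussion, the sketch below evaluates the normal-incidence mass law for a single panel and the mass-air-mass resonance of a double wall with an air cavity. Panel surface densities and cavity depth are hypothetical; this is not the transfer-matrix tool developed in the thesis.

    ```python
    # Normal-incidence mass law and mass-air-mass resonance for a double wall.
    # All panel and cavity values are hypothetical.
    import math

    rho0, c0 = 1.21, 343.0           # air density (kg/m^3) and speed of sound (m/s)
    m1, m2 = 4.0, 2.0                # surface densities of skin and trim, kg/m^2
    d = 0.08                         # cavity depth, m

    # mass-air-mass resonance of the double wall
    f0 = (1 / (2 * math.pi)) * math.sqrt(rho0 * c0 ** 2 / d * (1 / m1 + 1 / m2))

    def mass_law_tl(m, f):
        """Normal-incidence mass-law transmission loss of a single panel, dB."""
        return 20 * math.log10(math.pi * f * m / (rho0 * c0))

    print(f"mass-air-mass resonance: {f0:.0f} Hz")
    for f in (250, 1000, 4000):
        print(f"{f} Hz: single-skin mass law TL ~ {mass_law_tl(m1, f):.1f} dB")
    ```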

  3. Temperature distribution of thick thermoset composites

    NASA Astrophysics Data System (ADS)

    Guo, Zhan-Sheng; Du, Shanyi; Zhang, Boming

    2004-05-01

    The development of the temperature distribution in thick polymeric matrix laminates during an autoclave vacuum bag process was measured and compared with numerically calculated results. The finite element formulation of the transient heat transfer problem was carried out for polymeric matrix composite materials from the heat transfer differential equations, including internal heat generation produced by exothermic chemical reactions. Software based on a general finite element software package was developed for numerical simulation of the entire composite process. From the experimental and numerical results, it was found that the measured temperature profiles were in good agreement with the numerical ones, and that conventional cure cycles recommended by prepreg manufacturers for thin laminates should be modified to prevent temperature overshoot.
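    The mechanism behind the temperature overshoot can be sketched with a one-dimensional explicit finite-difference model: conduction through the laminate thickness plus an internal exothermic source driven by first-order cure kinetics. The kinetics and material constants below are hypothetical, not the prepreg data of the study.

    ```python
    # 1-D explicit finite-difference conduction with a cure exotherm.
    import numpy as np

    L, n = 0.03, 61                          # laminate thickness (m), grid points
    dx = L / (n - 1)
    k, rho, cp = 0.5, 1500.0, 1200.0         # conductivity, density, specific heat (assumed)
    H = 3.0e5                                # total heat of reaction, J/kg (assumed)
    k_cure = 5e-4                            # first-order cure rate constant, 1/s (assumed)
    diff = k / (rho * cp)                    # thermal diffusivity
    dt = 0.4 * dx ** 2 / diff                # stable explicit time step

    T = np.full(n, 300.0)                    # temperature field, K
    a = np.zeros(n)                          # degree of cure
    T_tool, overshoot = 300.0, 0.0

    for _ in range(20000):
        T_tool = min(T_tool + 0.02 * dt, 450.0)          # simple ramp-and-hold cure cycle
        da = k_cure * (1.0 - a) * dt                     # cure advance this step
        lap = np.zeros(n)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
        T = T + diff * dt * lap + H * da / cp            # conduction + exothermic source
        a += da
        T[0] = T[-1] = T_tool                            # surfaces follow the tool/bag
        overshoot = max(overshoot, T[n // 2] - T_tool)   # mid-plane exotherm overshoot

    print(f"peak mid-plane overshoot above the surface: {overshoot:.1f} K")
    ```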

  4. Tight-binding model for materials at mesoscale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tai, Yuan-Yen; Choi, Hongchul; Zhu, Wei

    2016-12-21

    TBM3 is an open source package for computational simulations of quantum materials at multiple scales in length and time. The project originated to investigate multiferroic behavior in transition-metal oxide heterostructures. The framework has also been designed to study emergent phenomena in other quantum materials such as two-dimensional transition-metal dichalcogenides, graphene, topological insulators, and skyrmions in materials. In the long term, we will enable the package for transport and time-resolved phenomena. TBM3 is currently a C++ based numerical tool package and framework for the design and construction of any kind of lattice structure with multi-orbital and spin degrees of freedom. A Fortran-based portion of the package will be added in the near future. The design of TBM3 is a highly flexible and reusable framework, and the tight-binding parameters can be modeled or informed by DFT calculations. It is currently GPU-enabled, and CPU-enabled MPI support will be added in the future.
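    As a minimal picture of what a tight-binding framework generalizes, the sketch below builds and diagonalizes the Hamiltonian of a single-orbital one-dimensional chain with periodic boundary conditions. It is not TBM3 code; parameters are hypothetical.

    ```python
    # Single-orbital 1-D tight-binding chain with nearest-neighbour hopping.
    import numpy as np

    n_sites, t_hop, onsite = 20, 1.0, 0.0

    H = np.zeros((n_sites, n_sites))
    for i in range(n_sites):
        H[i, i] = onsite
        j = (i + 1) % n_sites            # periodic boundary conditions
        H[i, j] = H[j, i] = -t_hop       # nearest-neighbour hopping

    energies = np.linalg.eigvalsh(H)
    print("band minimum/maximum:", energies[0], energies[-1])   # ~ -2t ... +2t
    ```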

  5. Basic analysis of reflectometry data software package for the analysis of multilayered structures according to reflectometry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.

    2012-01-15

    The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.

  6. Quantification of free convection for embarked QFN64 electronic package: An experimental and numerical survey

    NASA Astrophysics Data System (ADS)

    Baïri, A.

    2017-08-01

    Embarked Quad Flat Non-lead (QFN) electronic devices are equipped with a significant number of sensors used for flight parameter measurement purposes. Their accuracy directly depends on the thermal state of the package. Flight safety therefore depends on the reliability of these QFNs, whose junction temperature must remain as low as possible during operation. The QFN64 is favored for these applications. In the operating power range considered here (0.01-0.1 W), the study shows that radiative heat transfer is negligible with respect to natural convective exchanges. It is therefore essential to quantify the convective heat transfer coefficient on the different areas of the package so that the arrangement is properly dimensioned. This is the objective of this work. The device is soldered on a PCB which may be inclined with respect to the horizontal plane by an angle ranging from 0° to 90°. The results of the numerical approach are confirmed by thermal and electrical measurements carried out on prototypes for all power-inclination angle combinations. The correlations proposed here help determine the natural convective heat transfer coefficient in any area of the assembly. This work allowed thermal characterization and certification of a new QFN64 package equipped with sensors designed for aeronautics, which is currently in the industrialization process.
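    An order-of-magnitude feel for the convective coefficient involved can be obtained from a textbook natural-convection correlation (Churchill-Chu for a vertical plate), shown below with assumed air properties and a QFN-sized characteristic length. These are not the package-specific correlations proposed in the paper.

    ```python
    # Churchill-Chu vertical-plate natural convection estimate (air at ~300 K).
    # The characteristic length and temperature difference are assumptions.
    g, beta = 9.81, 1 / 300.0            # gravity, thermal expansion of air (1/K)
    nu, alpha, k_air, Pr = 1.6e-5, 2.2e-5, 0.026, 0.71
    L = 0.009                            # characteristic length ~ QFN64 side, m (assumed)
    dT = 30.0                            # surface-to-ambient temperature difference, K

    Ra = g * beta * dT * L ** 3 / (nu * alpha)
    Nu = (0.825 + 0.387 * Ra ** (1 / 6) /
          (1 + (0.492 / Pr) ** (9 / 16)) ** (8 / 27)) ** 2
    h = Nu * k_air / L
    print(f"Ra = {Ra:.2e}, Nu = {Nu:.2f}, h = {h:.1f} W/m^2/K")
    ```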

  7. Modeling viscoelastic deformation of the earth due to surface loading by commercial finite element package - ABAQUS

    NASA Astrophysics Data System (ADS)

    Kit Wong, Ching; Wu, Patrick

    2017-04-01

    Wu (2004) developed a transformation scheme to model viscoelastic deformation due to glacial loading with the commercial finite element package ABAQUS. Benchmark tests confirmed that this method works extremely well for incompressible earth models. Bängtsson & Lund (2008), however, showed that the transformation scheme leads to incorrect results if compressible material parameters are used. Their study implies that Wu's method of stress transformation is inadequate for modelling the load-induced deformation of a compressible earth within the framework of ABAQUS. In light of this, numerical experiments are carried out to determine whether other methods exist that serve this purpose. None of the tested methods is satisfactory, as the results failed to converge through iterations except at the elastic limit. The tested methods will be outlined and the results will be presented. Possible reasons for the failure will also be discussed. Bängtsson, E., & Lund, B. (2008). A comparison between two solution techniques to solve the equations of glacially induced deformation of an elastic Earth. International Journal for Numerical Methods in Engineering, 75(4), 479-502. Wu, P. (2004). Using commercial finite element packages for the study of earth deformations, sea levels and the state of stress. Geophysical Journal International, 158(2), 401-408.

  8. tran-SAS v1.0: a numerical model to compute catchment-scale hydrologic transport using StorAge Selection functions

    NASA Astrophysics Data System (ADS)

    Benettin, Paolo; Bertuzzo, Enrico

    2018-04-01

    This paper presents the tran-SAS package, which includes a set of codes to model solute transport and water residence times through a hydrological system. The model is based on a catchment-scale approach that aims at reproducing the integrated response of the system at one of its outlets. The codes are implemented in MATLAB and are meant to be easy to edit, so that users with minimal programming knowledge can adapt them to the desired application. The problem of large-scale solute transport has both theoretical and practical implications. On the one side, the ability to represent the ensemble of water flow trajectories through a heterogeneous system helps unraveling streamflow generation processes and allows us to make inferences on plant-water interactions. On the other side, transport models are a practical tool that can be used to estimate the persistence of solutes in the environment. The core of the package is based on the implementation of an age master equation (ME), which is solved using general StorAge Selection (SAS) functions. The age ME is first converted into a set of ordinary differential equations, each addressing the transport of an individual precipitation input through the catchment, and then it is discretized using an explicit numerical scheme. Results show that the implementation is efficient and allows the model to run in short times. The numerical accuracy is critically evaluated and it is shown to be satisfactory in most cases of hydrologic interest. Additionally, a higher-order implementation is provided within the package to evaluate and, if necessary, to improve the numerical accuracy of the results. The codes can be used to model streamflow age and solute concentration, but a number of additional outputs can be obtained by editing the codes to further advance the ability to understand and model catchment transport processes.
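    The role of a SAS function can be illustrated with the simplest case, a uniform (well-mixed) selection in which every parcel in storage contributes to discharge in proportion to its stored volume. The toy parcel-tracking sketch below uses hypothetical fluxes and is far simpler than the ME solver implemented in tran-SAS.

    ```python
    # Parcel-tracking illustration of a uniform (well-mixed) SAS function.
    # Fluxes and storage are hypothetical.
    import numpy as np

    n_steps = 365
    S0 = 1000.0                           # initial storage, mm
    rng = np.random.default_rng(1)
    P = rng.exponential(3.0, n_steps)     # daily precipitation, mm
    Q = np.full(n_steps, 2.5)             # daily discharge, mm

    volumes = [S0]                        # parcel volumes in storage
    ages = [0.0]                          # corresponding parcel ages (days)
    mean_age_Q = np.zeros(n_steps)

    for t in range(n_steps):
        volumes.append(P[t]); ages.append(0.0)        # new precipitation parcel
        total = sum(volumes)
        frac = min(Q[t] / total, 1.0)                 # same fraction removed from each parcel
        outflow = [v * frac for v in volumes]
        mean_age_Q[t] = sum(o * a for o, a in zip(outflow, ages)) / sum(outflow)
        volumes = [v - o for v, o in zip(volumes, outflow)]
        ages = [a + 1.0 for a in ages]                # water left in storage gets older

    print(f"mean discharge age after 1 year: {mean_age_Q[-1]:.0f} days")
    ```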

  9. Excore Modeling with VERAShift

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.

    It is important to be able to accurately predict the neutron flux outside the immediate reactor core for a variety of safety and material analyses. Monte Carlo radiation transport calculations are required to produce the high fidelity excore responses. Under this milestone, VERA (specifically the VERAShift package) has been extended to perform excore calculations by running radiation transport calculations with Shift. This package couples VERA-CS with Shift to perform excore tallies for multiple state points concurrently, with each component capable of parallel execution on independent domains. Specifically, this package performs fluence calculations in the core barrel and vessel, or performs the requested tallies in any user-defined excore regions. VERAShift takes advantage of the general geometry package in Shift. This gives VERAShift the flexibility to explicitly model features outside the core barrel, including detailed vessel models, detectors, and power plant details. A very limited set of experimental and numerical benchmarks is available for excore simulation comparison. The Consortium for Advanced Simulation of Light Water Reactors (CASL) has developed a set of excore benchmark problems to include as part of the VERA-CS verification and validation (V&V) problems. The excore capability in VERAShift has been tested on small representative assembly problems, multi-assembly problems, and quarter-core problems. VERAView has also been extended to visualize the vessel fluence results from VERAShift. Preliminary vessel fluence results for quarter-core multistate calculations look very promising. Further development is needed to determine the details relevant to excore simulations. Validation of VERA for fluence and excore detectors still needs to be performed against experimental and numerical results.

  10. PharmacoGx: an R package for analysis of large pharmacogenomic datasets.

    PubMed

    Smirnov, Petr; Safikhani, Zhaleh; El-Hachem, Nehme; Wang, Dong; She, Adrian; Olsen, Catharina; Freeman, Mark; Selby, Heather; Gendoo, Deena M A; Grossmann, Patrick; Beck, Andrew H; Aerts, Hugo J W L; Lupien, Mathieu; Goldenberg, Anna; Haibe-Kains, Benjamin

    2016-04-15

    Pharmacogenomics holds great promise for the development of biomarkers of drug response and the design of new therapeutic options, which are key challenges in precision medicine. However, such data are scattered and lack standards for efficient access and analysis, consequently preventing the realization of the full potential of pharmacogenomics. To address these issues, we implemented PharmacoGx, an easy-to-use, open source package for integrative analysis of multiple pharmacogenomic datasets. We demonstrate the utility of our package in comparing large drug sensitivity datasets, such as the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia. Moreover, we show how to use our package to easily perform Connectivity Map analysis. With increasing availability of drug-related data, our package will open new avenues of research for meta-analysis of pharmacogenomic data. PharmacoGx is implemented in R and can be easily installed on any system. The package is available from CRAN and its source code is available from GitHub.

  11. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    PubMed

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  12. Communicating spatial uncertainty to non-experts using R

    NASA Astrophysics Data System (ADS)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information for various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package has implemented Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information both statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey on a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R package included a collation of the plotting functions that were evaluated in the survey. The implementation of static visualisations was done via calls to the 'ggplot2' package. This allowed the user to provide control over the content, legend, colours, axes and titles. The interactive methods were implemented using the 'shiny' package allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open source software.
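    The Monte Carlo propagation step that produces such ensembles can be sketched briefly: perturb the uncertain input, run the model on each realization, and summarize the ensemble. The "model" and error magnitudes below are hypothetical, and the sketch is in Python rather than the R package described.

    ```python
    # Monte Carlo uncertainty propagation through a toy spatial model.
    import numpy as np

    rng = np.random.default_rng(3)
    dem = np.outer(np.linspace(0, 50, 100), np.ones(100))   # synthetic 100x100 DEM (m)
    sigma_dem = 2.0                                          # assumed elevation error (m)

    def model(surface):
        """Toy model output: local relief (max - min over the grid)."""
        return surface.max() - surface.min()

    samples = [model(dem + rng.normal(0.0, sigma_dem, dem.shape)) for _ in range(500)]
    print(f"relief: mean {np.mean(samples):.1f} m, sd {np.std(samples):.1f} m, "
          f"95% interval ({np.percentile(samples, 2.5):.1f}, "
          f"{np.percentile(samples, 97.5):.1f}) m")
    ```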

  13. Slope Reinforcement with the Utilization of the Coal Waste Anthropogenic Material

    NASA Astrophysics Data System (ADS)

    Gwóźdź-Lasoń, Monika

    2017-10-01

    The protection of the environment, including waste management, is one of the pillars of European policy. The application presented in this paper aims to show a trans-disciplinary way to design geotechnical constructions, using slope stability analysis as the example. The author presents numerous modelling patterns of earth retaining walls used as slope stabilization systems. The paper constitutes an attempt to summarise and generalise earlier research that involved FEM numerical procedures and the Z_Soil package. The design of anthropogenic soil used as a material for reinforced earth retaining walls is not only of commercial but also of environmental importance, consistent with the concept of sustainable development and the need to redevelop brownfield sites. The paper shows conceptual and empirical modelling approaches to slope stability systems used in anthropogenic soil formations such as heaps resulting from mining, with a special focus on urban areas of southern Poland, and discusses perspectives for the application of anthropogenic materials in geotechnical engineering.
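    For orientation, a minimal numerical example of slope stability is the classical infinite-slope factor of safety, sketched below with hypothetical soil parameters. The paper itself relies on FEM (Z_Soil) models of reinforced earth retaining walls, not this closed-form check.

    ```python
    # Classical infinite-slope factor-of-safety check with hypothetical parameters.
    import math

    def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, u=0.0):
        """Factor of safety of an infinite slope: cohesion c (kPa), friction angle
        phi, unit weight gamma (kN/m^3), slip depth z (m), slope angle beta,
        and pore pressure u (kPa) on the slip plane."""
        beta = math.radians(beta_deg)
        phi = math.radians(phi_deg)
        resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
        driving = gamma * z * math.sin(beta) * math.cos(beta)
        return resisting / driving

    # hypothetical coal-waste fill: c = 10 kPa, phi = 30 deg, gamma = 18 kN/m^3
    for beta in (15, 25, 35):
        print(f"slope angle {beta} deg: FS = {infinite_slope_fs(10, 30, 18, 4.0, beta):.2f}")
    ```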

  14. Analysis of the dynamic behavior of structures using the high-rate GNSS-PPP method combined with a wavelet-neural model: Numerical simulation and experimental tests

    NASA Astrophysics Data System (ADS)

    Kaloop, Mosbeh R.; Yigit, Cemal O.; Hu, Jong W.

    2018-03-01

    Recently, the high-rate global navigation satellite system precise point positioning (GNSS-PPP) technique has been used to detect the dynamic behavior of structures. This study aimed to increase the accuracy of extracting the oscillation properties of structural movements based on the high-rate (10 Hz) GNSS-PPP monitoring technique. A model based on the combination of wavelet packet transform (WPT) de-noising and neural network (NN) prediction was proposed to improve the detection of the dynamic behavior of structures with the GNSS-PPP method. A complicated numerical simulation involving highly noisy data and 13 experimental cases with different loads were utilized to confirm the efficiency of the proposed model design and the monitoring technique in detecting the dynamic behavior of structures. The results revealed that, when combined with the proposed model, the GNSS-PPP method can be used to accurately detect the dynamic behavior of engineering structures as an alternative to the relative GNSS method.
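    A stand-in for the oscillation-property extraction step, assuming a synthetic 10 Hz displacement record: the sketch below recovers the dominant structural frequency with a plain FFT. It is not the WPT de-noising plus neural network model proposed in the paper.

    ```python
    # Dominant oscillation frequency of a noisy 10 Hz displacement record.
    # The signal, noise level, and structural frequency are synthetic assumptions.
    import numpy as np

    fs = 10.0                                  # GNSS-PPP sampling rate, Hz
    t = np.arange(0, 120, 1 / fs)              # 2 minutes of data
    f_true = 0.8                               # assumed structural frequency, Hz
    rng = np.random.default_rng(4)
    disp = 5.0 * np.sin(2 * np.pi * f_true * t) + rng.normal(0, 3.0, t.size)  # mm

    spectrum = np.abs(np.fft.rfft(disp - disp.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.2f} Hz (true {f_true} Hz)")
    ```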

  15. INFFTM: Fast evaluation of 3d Fourier series in MATLAB with an application to quantum vortex reconnections

    NASA Astrophysics Data System (ADS)

    Caliari, Marco; Zuccher, Simone

    2017-04-01

    Although Fourier series approximation is ubiquitous in computational physics owing to the Fast Fourier Transform (FFT) algorithm, efficient techniques for the fast evaluation of a three-dimensional truncated Fourier series at a set of arbitrary points are quite rare, especially in MATLAB language. Here we employ the Nonequispaced Fast Fourier Transform (NFFT, by J. Keiner, S. Kunis, and D. Potts), a C library designed for this purpose, and provide a Matlab® and GNU Octave interface that makes NFFT easily available to the Numerical Analysis community. We test the effectiveness of our package in the framework of quantum vortex reconnections, where pseudospectral Fourier methods are commonly used and local high resolution is required in the post-processing stage. We show that the efficient evaluation of a truncated Fourier series at arbitrary points provides excellent results at a computational cost much smaller than carrying out a numerical simulation of the problem on a sufficiently fine regular grid that can reproduce comparable details of the reconnecting vortices.
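    For reference, the computation that NFFT-based tools accelerate is the direct evaluation of a truncated Fourier series at arbitrary points, which costs O(N^3) per point in 3-D. A naive NumPy version with random placeholder coefficients is sketched below; it is not the INFFTM/NFFT interface itself.

    ```python
    # Direct evaluation of a truncated 3-D Fourier series at arbitrary points.
    # Coefficients and evaluation points are random placeholders.
    import numpy as np

    rng = np.random.default_rng(2)
    N = 8                                            # modes per dimension: -N/2 .. N/2-1
    k = np.arange(-N // 2, N // 2)
    coeffs = rng.standard_normal((N, N, N)) + 1j * rng.standard_normal((N, N, N))
    points = rng.random((50, 3))                     # arbitrary points in [0, 1)^3

    def eval_series(points, coeffs, k):
        """Sum_k c_k exp(2*pi*i*(k1 x + k2 y + k3 z)) at each point."""
        vals = np.zeros(len(points), dtype=complex)
        for idx, (x, y, z) in enumerate(points):
            ex = np.exp(2j * np.pi * k * x)
            ey = np.exp(2j * np.pi * k * y)
            ez = np.exp(2j * np.pi * k * z)
            vals[idx] = np.einsum("abc,a,b,c->", coeffs, ex, ey, ez)
        return vals

    print(eval_series(points, coeffs, k)[:3])
    ```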

  16. Numerical Simulation and Chaotic Analysis of an Aluminum Holding Furnace

    NASA Astrophysics Data System (ADS)

    Wang, Ji-min; Zhou, Yuan-yuan; Lan, Shen; Chen, Tao; Li, Jie; Yan, Hong-jie; Zhou, Jie-min; Tian, Rui-jiao; Tu, Yan-wu; Li, Wen-ke

    2014-12-01

    To achieve high heat efficiency, low pollutant emission, and a homogeneous melt temperature during the thermal processing of secondary aluminum, and taking into account the features of the aluminum alloying process, a CFD process model was developed and integrated with a heat load and aluminum temperature control model. This paper presents a numerical simulation of aluminum holding furnaces using customized code based on the FLUENT package. The thermal behavior of aluminum holding furnaces was investigated by probing the main physical fields, such as flue gas temperature, velocity, and concentration, and the combustion instability of the aluminum holding process was characterized using chaos theory. The results show that the aluminum temperature uniformity coefficient first decreases during the heating phase, then alternately increases and decreases during the holding phase, and finally rises during the standing phase. The correlation dimension drops with fuel velocity, and the maximal Lyapunov exponent reaches a maximum when the air-fuel ratio is close to 1. A clear understanding of each phase of the aluminum holding process helps in finding new technologies, retrofitting furnace designs, and optimizing parameter combinations.

  17. Numerical investigation of heat transfer in annulus laminar flow of multi tubes-in-tube helical coil

    NASA Astrophysics Data System (ADS)

    Nada, S. A.; Elattar, H. F.; Fouda, A.; Refaey, H. A.

    2018-03-01

    In the present study, a CFD analysis using the ANSYS-FLUENT 14.5 CFD package is used to investigate the heat transfer characteristics of laminar flow in the annulus formed by a multi-tubes-in-tube helically coiled heat exchanger. The numerical results are validated by comparison with previous experimental data, and fair agreement was found. The influences of design and operating parameters such as heat flux, Reynolds number, and annulus geometry on the heat transfer characteristics are investigated. Annuli with different numbers of inner tubes, specifically 1, 2, 3, 4, and 5 tubes, are tested. The results showed that, for all the annuli studied, the heat flux has no effect on the Nusselt number or the compactness parameter. The annulus formed by five inner tubes showed the best heat transfer performance and compactness parameter. A correlation for predicting the Nusselt number in terms of the Reynolds number and the number of inner tubes is presented.

  18. Negative autoregulation matches production and demand in synthetic transcriptional networks.

    PubMed

    Franco, Elisa; Giordano, Giulia; Forsberg, Per-Ola; Murray, Richard M

    2014-08-15

    We propose a negative feedback architecture that regulates the activity of artificial genes, or "genelets", to meet the downstream demand for their outputs, achieving robustness with respect to uncertain open-loop output production rates. In particular, we consider the case where the outputs of two genelets interact to form a single assembled product. We show with analysis and experiments that negative autoregulation matches the production and demand of the outputs: the magnitude of the regulatory signal is proportional to the "error" between the circuit output concentration and its actual demand. This two-device system is experimentally implemented using in vitro transcriptional networks, where reactions are systematically designed by optimizing nucleic acid sequences with publicly available software packages. We build a predictive ordinary differential equation (ODE) model that captures the dynamics of the system and can be used to numerically assess the scalability of this architecture to larger sets of interconnected genes. Finally, with numerical simulations we contrast our negative autoregulation scheme with a cross-activation architecture, which is less scalable and results in slower response times.
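
    The predictive ODE model referred to above is not reproduced here, but its general shape can be sketched: two production rates under negative autoregulation, with the outputs consumed by assembly into a single product. All rate constants and the specific Hill-type feedback form in the snippet below are hypothetical, chosen only to illustrate how such a model is integrated with SciPy.

```python
# Hypothetical two-species ODE sketch of negative autoregulation matching production
# to demand; rate constants and the feedback form are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

beta1, beta2 = 2.0, 1.0    # open-loop production rates (uncertain in practice)
k_assembly = 0.5           # rate at which the two outputs assemble into a product
K = 0.2                    # feedback sensitivity
delta = 0.05               # dilution/degradation

def rhs(t, y):
    x1, x2 = y                                  # free (unassembled) output concentrations
    assembly = k_assembly * x1 * x2             # demand: outputs consumed by assembly
    # Negative autoregulation: each production rate is repressed by its own excess.
    prod1 = beta1 / (1.0 + (x1 / K) ** 2)
    prod2 = beta2 / (1.0 + (x2 / K) ** 2)
    return [prod1 - assembly - delta * x1,
            prod2 - assembly - delta * x2]

sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.0], dense_output=True)
x1_end, x2_end = sol.y[:, -1]
print(f"steady-state free outputs: x1={x1_end:.3f}, x2={x2_end:.3f}")
```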

  19. Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis

    PubMed Central

    2011-01-01

    Background A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. Results The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on three-dimensional T1-weighted MRI scans of 10 subjects. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. Conclusions With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites. PMID:21266047

  20. Lin4Neuro: a customized Linux distribution ready for neuroimaging analysis.

    PubMed

    Nemoto, Kiyotaka; Dan, Ippeita; Rorden, Christopher; Ohnishi, Takashi; Tsuzuki, Daisuke; Okamoto, Masako; Yamashita, Fumio; Asada, Takashi

    2011-01-25

    A variety of neuroimaging software packages have been released from various laboratories worldwide, and many researchers use these packages in combination. Though most of these software packages are freely available, some people find them difficult to install and configure because they are mostly based on UNIX-like operating systems. We developed a live USB-bootable Linux package named "Lin4Neuro." This system includes popular neuroimaging analysis tools. The user interface is customized so that even Windows users can use it intuitively. The boot time of this system was only around 40 seconds. We performed a benchmark test of inhomogeneity correction on three-dimensional T1-weighted MRI scans of 10 subjects. The processing speed of USB-booted Lin4Neuro was as fast as that of the package installed on the hard disk drive. We also installed Lin4Neuro on a virtualization software package that emulates the Linux environment on a Windows-based operating system. Although the processing speed was slower than that under other conditions, it remained comparable. With Lin4Neuro in one's hand, one can access neuroimaging software packages easily, and immediately focus on analyzing data. Lin4Neuro can be a good primer for beginners of neuroimaging analysis or students who are interested in neuroimaging analysis. It also provides a practical means of sharing analysis environments across sites.

  1. Analytical validation of an explicit finite element model of a rolling element bearing with a localised line spall

    NASA Astrophysics Data System (ADS)

    Singh, Sarabjeet; Howard, Carl Q.; Hansen, Colin H.; Köpke, Uwe G.

    2018-03-01

    In this paper, the numerically modelled vibration response of a rolling element bearing with a localised outer raceway line spall is presented. The results were obtained from a finite element (FE) model of the defective bearing solved using an explicit dynamics FE software package, LS-DYNA. Time domain vibration signals of the bearing obtained directly from the FE modelling were processed further to estimate time-frequency and frequency domain results, such as spectrogram and power spectrum, using standard signal processing techniques pertinent to the vibration-based monitoring of rolling element bearings. A logical approach to analysing the numerically modelled results was developed with the aim of presenting the analytical validation of the modelled results. While the time and frequency domain analyses of the results show that the FE model generates accurate bearing kinematics and defect frequencies, the time-frequency analysis highlights the simulation of distinct low- and high-frequency characteristic vibration signals associated with the unloading and reloading of the rolling elements as they move in and out of the defect, respectively. Favourable agreement of the numerical and analytical results demonstrates the validation of the results from the explicit FE modelling of the bearing.
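
    The post-processing chain mentioned above (power spectrum and spectrogram estimated from the time-domain signal) uses standard routines; a minimal SciPy sketch on a synthetic impulsive bearing-like signal is given below. The sampling rate, defect frequency, resonance, and signal model are illustrative assumptions, not values from the FE model.

```python
# Standard vibration post-processing (power spectrum and spectrogram) applied to a
# synthetic impulsive signal; parameters are illustrative, not taken from the FE model.
import numpy as np
from scipy import signal

fs = 48_000                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 1.0, 1 / fs)
bpfo = 87.0                                   # hypothetical outer-race defect frequency (Hz)
resonance = 3_000.0                           # hypothetical structural resonance (Hz)

# Train of exponentially decaying bursts at the defect frequency, riding on noise.
x = np.zeros_like(t)
for t0 in np.arange(0, 1.0, 1 / bpfo):
    mask = t >= t0
    x[mask] += np.exp(-800 * (t[mask] - t0)) * np.sin(2 * np.pi * resonance * (t[mask] - t0))
x += 0.05 * np.random.randn(t.size)

# Power spectrum (Welch) and spectrogram, as used for bearing diagnostics.
f_psd, pxx = signal.welch(x, fs=fs, nperseg=8192)
f_spec, t_spec, sxx = signal.spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
print("PSD peak near resonance (Hz):", f_psd[np.argmax(pxx)])
```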

  2. Numerical Simulation of the Layer-by-Layer Destruction of Cylindrical Shells Under Explosive Loading

    NASA Astrophysics Data System (ADS)

    Abrosimov, N. A.; Novoseltseva, N. A.

    2015-09-01

    A technique of numerical analysis of the influence of reinforcement structure on the nature of the dynamic response and the process of layer-by-layer destruction of layered fiberglass cylindrical shells under an axisymmetric internal explosive loading is elaborated. The kinematic model of deformation of the laminate package is based on a nonclassical theory of shells. The geometric dependences are based on simple quadratic relations of the nonlinear theory of elasticity. The relationship between the stress and strain tensors is established using Hooke's law for orthotropic bodies, accounting for the degradation of the stiffness characteristics of the multilayer composite due to local destruction of some of its elementary layers. An energetically consistent system of dynamic equations for composite cylindrical shells is obtained by minimizing the functional of total energy of the shell as a three-dimensional body. The numerical method for solving the formulated initial boundary-value problem is based on an explicit variational-difference scheme. Results confirming the reliability of the method used to analyze the influence of reinforcement structure on the character of destruction and the bearing capacity of pulse-loaded cylindrical shells are presented.

  3. Effect of Various Packaging Methods on Small-Scale Hanwoo (Korean Native Cattle) during Refrigerated Storage

    PubMed Central

    Yu, Hwan Hee; Song, Myung Wook; Kim, Tae-Kyung; Choi, Yun-Sang; Cho, Gyu Yong; Lee, Na-Kyoung; Paik, Hyun-Dong

    2018-01-01

    The objective of this study was to compare the physicochemical, microbiological, and sensory characteristics of Hanwoo eye of round under various packaging methods [wrapped packaging (WP), modified atmosphere packaging (MAP), vacuum packaging (VP) with three different vacuum films, and vacuum skin packaging (VSP)] at a small scale. Packaged Hanwoo beef samples were stored in refrigerated conditions (4±1°C) for 28 days. Packaged beef was sampled on days 0, 7, 14, 21, and 28. Physicochemical [pH, surface color, thiobarbituric acid reactive substances (TBARS), and volatile basic nitrogen (VBN) values], microbiological, and sensory analyses of packaged beef samples were performed. VP and VSP samples showed low TBARS and VBN values, and pH and surface color did not change substantially during the 28-day period. For VSP, total viable bacteria, psychrotrophic bacteria, lactic acid bacteria, and coliform counts were lower than those for other packaging systems. Salmonella spp. and Escherichia coli O157:H7 were not detected in any packaged beef samples. A sensory analysis showed that the scores for appearance, flavor, color, and overall acceptability did not change significantly until day 7. Overall, VSP was effective for Hanwoo packaging, with significantly higher a* values, better physicochemical stability, and improved microbial safety (p<0.05). PMID:29805283

  4. Multiphysical simulation analysis of the dislocation structure in germanium single crystals

    NASA Astrophysics Data System (ADS)

    Podkopaev, O. I.; Artemyev, V. V.; Smirnov, A. D.; Mamedov, V. M.; Sid'ko, A. P.; Kalaev, V. V.; Kravtsova, E. D.; Shimanskii, A. F.

    2016-09-01

    Growing high-quality germanium crystals is one of the most important problems in the crystal growth industry. The dislocation density is an important parameter of the quality of single crystals. The dislocation densities in germanium crystals 100 mm in diameter, which have various shapes of the side surface and are grown by the Czochralski technique, are experimentally measured. The crystal growth is numerically simulated using heat-transfer and hydrodynamics models and the Alexander-Haasen dislocation model implemented in the CGSim software package. A comparison of the experimental and calculated dislocation densities shows that the dislocation model can be applied to study lattice defects in germanium crystals and to improve their quality.

  5. A Description and Analysis of the German Packaging Take-Back System

    ERIC Educational Resources Information Center

    Nakajima, Nina; Vanderburg, Willem H.

    2006-01-01

    The German packaging ordinance is an example of legislated extended producer responsibility (also known as product take-back). Consumers can leave packaging with retailers, and packagers are required to pay for their recycling and disposal. It can be considered to be successful in reducing waste, spurring the redesign of packaging to be more…

  6. DESIGN ANALYSIS FOR THE NAVAL SNF WASTE PACKAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Mitchell

    2000-05-31

    The purpose of this analysis is to demonstrate the design of the naval spent nuclear fuel (SNF) waste package (WP) using the Waste Package Department's (WPD) design methodologies and processes described in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000b). The calculations that support the design of the naval SNF WP will be discussed; however, only a sub-set of such analyses will be presented and shall be limited to those identified in the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The objective of this analysis is to describe the naval SNF WP design method and to show that the design of the naval SNF WP complies with the ''Naval Spent Nuclear Fuel Disposal Container System Description Document'' (CRWMS M&O 1999a) and Interface Control Document (ICD) criteria for Site Recommendation. Additional criteria for the design of the naval SNF WP have been outlined in Section 6.2 of the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The scope of this analysis is restricted to the design of the naval long WP containing one naval long SNF canister. This WP is representative of the WPs that will contain both naval short SNF and naval long SNF canisters. The following items are included in the scope of this analysis: (1) Providing a general description of the applicable design criteria; (2) Describing the design methodology to be used; (3) Presenting the design of the naval SNF waste package; and (4) Showing compliance with all applicable design criteria. The intended use of this analysis is to support Site Recommendation reports and assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the technical product development plan (TPDP) ''Design Analysis for the Naval SNF Waste Package'' (CRWMS M&O 2000a).

  7. Numerical Simulation of Film Cooling with a Coolant Supplied Through Holes in a Trench

    NASA Astrophysics Data System (ADS)

    Khalatov, A. A.; Panchenko, N. A.; Borisov, I. I.; Severina, V. V.

    2017-05-01

    The results of numerical simulation and experimental investigation of the efficiency of film cooling behind a row of holes in a trench in the range of blowing ratio variation 0.5 ≤ m ≤ 2.0 are presented. This scheme is of practical interest for use in the systems of cooling the blades of high-temperature gas turbines. Comparative analysis has shown that the efficiency of the trench scheme substantially exceeds the efficiency of the traditional scheme. The commercial package ANSYS CFX 14 was used in the Computational Fluid Dynamics (CFD) modeling of film cooling. It is shown that the best agreement between predicted and experimental data is provided by the use of the SST model of turbulence. Analysis of the physical picture of flow has shown that the higher efficiency of film cooling with secondary air supply to the trench is mainly due to the preliminary spreading of the coolant in the trench, a decrease in the intensity and scale of the vortex pair structure, the absence of coolant film separation from the plate surface, and a more uniform transverse distribution of the coolant film.

  8. A continuous flow microfluidic calorimeter: 3-D numerical modeling with aqueous reactants.

    PubMed

    Sen, Mehmet A; Kowalski, Gregory J; Fiering, Jason; Larson, Dale

    2015-03-10

    A computational analysis of the reacting flow field, species diffusion and heat transfer processes with thermal boundary layer effects in a microchannel reactor with a coflow configuration was performed. Two parallel adjacent streams of aqueous reactants flow along a wide, shallow, enclosed channel in contact with a substrate, which is affixed to a temperature controlled plate. The Fluent computational fluid dynamics package solved the Navier-Stokes, mass transport and energy equations. The energy model, including the enthalpy of reaction as a nonuniform heat source, was validated by calculating the energy balance at several control volumes in the microchannel. Analysis reveals that the temperature is nearly uniform across the channel thickness, in the direction normal to the substrate surface; hence, measurements made by sensors at or near the surface are representative of the average temperature. Additionally, modeling the channel with a glass substrate and a silicone cover shows that heat transfer is predominantly due to the glass substrate. Finally, using the numerical results, we suggest that a microcalorimeter could be based on this configuration, and that temperature sensors such as optical nanohole array sensors could have sufficient spatial resolution to determine enthalpy of reaction.

  9. A continuous flow microfluidic calorimeter: 3-D numerical modeling with aqueous reactants

    PubMed Central

    Sen, Mehmet A.; Kowalski, Gregory J.; Fiering, Jason; Larson, Dale

    2015-01-01

    A computational analysis of the reacting flow field, species diffusion and heat transfer processes with thermal boundary layer effects in a microchannel reactor with a coflow configuration was performed. Two parallel adjacent streams of aqueous reactants flow along a wide, shallow, enclosed channel in contact with a substrate, which is affixed to a temperature controlled plate. The Fluent computational fluid dynamics package solved the Navier–Stokes, mass transport and energy equations. The energy model, including the enthalpy of reaction as a nonuniform heat source, was validated by calculating the energy balance at several control volumes in the microchannel. Analysis reveals that the temperature is nearly uniform across the channel thickness, in the direction normal to the substrate surface; hence, measurements made by sensors at or near the surface are representative of the average temperature. Additionally, modeling the channel with a glass substrate and a silicone cover shows that heat transfer is predominantly due to the glass substrate. Finally, using the numerical results, we suggest that a microcalorimeter could be based on this configuration, and that temperature sensors such as optical nanohole array sensors could have sufficient spatial resolution to determine enthalpy of reaction. PMID:25937678

  10. Pse-Analysis: a python package for DNA/RNA and protein/peptide sequence analysis based on pseudo components and kernel methods.

    PubMed

    Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen

    2017-02-21

    To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The powerful package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluation of prediction quality. All the work a user needs to do is to input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor, followed by yielding the predicted results for the submitted query samples. All the aforementioned tedious jobs can be automatically done by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about 6-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be directly run on Windows, Linux, and Unix.
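
    The five procedures listed above correspond to a generic supervised-learning workflow. The sketch below is not the Pse-Analysis API; it reproduces the same five steps with scikit-learn on toy numeric features, purely to make the pipeline concrete.

```python
# Generic analogue of the five Pse-Analysis steps using scikit-learn on toy data:
# (1) feature extraction, (2) parameter selection, (3) training,
# (4) cross-validation, (5) evaluation. This is NOT the Pse-Analysis API.
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score, train_test_split
from sklearn.svm import SVC

# (1) "Feature extraction": here just toy pseudo-component-like vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # toy benchmark labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# (2) Optimal parameter selection over an RBF-kernel SVM grid.
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}, cv=5)
grid.fit(X_train, y_train)                        # (3) model training

# (4) Cross-validation of the selected model.
cv_scores = cross_val_score(grid.best_estimator_, X_train, y_train, cv=5)

# (5) Evaluation of prediction quality on held-out samples.
print("best params:", grid.best_params_)
print("CV accuracy: %.3f, test accuracy: %.3f"
      % (cv_scores.mean(), grid.best_estimator_.score(X_test, y_test)))
```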

  11. Safety analysis report for packaging (onsite) multicanister overpack cask

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, W.S.

    1997-07-14

    This safety analysis report for packaging (SARP) documents the safety of shipments of irradiated fuel elements in the Multicanister Overpack (MCO) and MCO Cask for a highway route controlled quantity, Type B fissile package. This SARP evaluates the package during transfers of (1) water-filled MCOs from the K Basins to the Cold Vacuum Drying Facility (CVDF) and (2) sealed and cold vacuum dried MCOs from the CVDF in the 100 K Area to the Canister Storage Building in the 200 East Area.

  12. Chip-scale thermal management of high-brightness LED packages

    NASA Astrophysics Data System (ADS)

    Arik, Mehmet; Weaver, Stanton

    2004-10-01

    The efficiency and reliability of solid-state lighting devices strongly depend on successful thermal management. Light-emitting diodes (LEDs) are a strong candidate for next-generation general illumination applications. LEDs are making great strides in terms of lumen performance and reliability; however, the barrier to widespread use in general illumination remains the cost, or $/lumen. LED packaging designers are pushing LED performance to its limits. This results in increased drive currents, and thus the need for packaging designs with lower thermal resistance. As the power density continues to rise, the integrity of the package's electrical and thermal interconnect becomes extremely important. Experimental results with high-brightness LED packages show that chip attachment defects can cause significant thermal gradients across the LED chips, leading to premature failures. A numerical study was also carried out with parametric models to understand the variation of the chip active-layer temperature profile due to bump defects. Finite element techniques were utilized to evaluate the effects of localized hot spots at the chip active layer. The importance of "zero defects" in one of the more popular interconnect schemes, the "epi-down" soldered flip-chip configuration, is investigated and demonstrated.

  13. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used to represent motion between two coordinate frames). Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
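
    The quaternion products mentioned above (provided through infix operators in the Ada package) are the standard Hamilton product; a small NumPy reference version is given below. The (w, x, y, z) component ordering is an assumption of this illustration, not necessarily the convention used by the package.

```python
# Hamilton product of two quaternions stored as four-component vectors (w, x, y, z);
# a generic reference implementation, not a transcription of the Ada package.
import numpy as np

def quat_mul(q, r):
    """Return the Hamilton product q*r for quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

# Example: composing two 90-degree rotations, about x first and then z.
qx = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4), 0.0, 0.0])
qz = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(quat_mul(qz, qx))        # unit quaternion of the combined rotation
```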

  14. IRISpy: Analyzing IRIS Data in Python

    NASA Astrophysics Data System (ADS)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine scale pointing corrections. IRISpy’s code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy’s functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  15. MWASTools: an R/bioconductor package for metabolome-wide association studies.

    PubMed

    Rodriguez-Martinez, Andrea; Posma, Joram M; Ayala, Rafael; Neves, Ana L; Anwar, Maryam; Petretto, Enrico; Emanueli, Costanza; Gauguier, Dominique; Nicholson, Jeremy K; Dumas, Marc-Emmanuel

    2018-03-01

    MWASTools is an R package designed to provide an integrated pipeline to analyse metabonomic data in large-scale epidemiological studies. Key functionalities of our package include: quality control analysis; metabolome-wide association analysis using various models (partial correlations, generalized linear models); visualization of statistical outcomes; metabolite assignment using statistical total correlation spectroscopy (STOCSY); and biological interpretation of metabolome-wide association study results. The MWASTools R package is implemented in R (version >= 3.4) and is available from Bioconductor: https://bioconductor.org/packages/MWASTools/. m.dumas@imperial.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
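
    The association step (one model per metabolite, with confounder adjustment and multiple-testing correction) can be sketched compactly. The code below is a conceptual Python analogue using statsmodels on simulated data; it is not the MWASTools R API, and the covariate and effect sizes are made up for illustration.

```python
# Conceptual metabolome-wide association scan: one linear model per metabolite,
# adjusted for a covariate, with Benjamini-Hochberg correction. A Python analogue
# for illustration only, not the MWASTools R/Bioconductor API.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_samples, n_metabolites = 300, 50
metabolites = rng.normal(size=(n_samples, n_metabolites))
age = rng.uniform(30, 70, n_samples)                       # confounder
outcome = 0.4 * metabolites[:, 0] + 0.01 * age + rng.normal(size=n_samples)

pvals = []
for j in range(n_metabolites):
    X = sm.add_constant(np.column_stack([metabolites[:, j], age]))
    fit = sm.OLS(outcome, X).fit()
    pvals.append(fit.pvalues[1])                           # p-value of the metabolite term

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print("metabolites passing FDR 5%:", np.where(reject)[0])
```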

  16. DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS

    EPA Science Inventory

    The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...

  17. Buying CAM.

    ERIC Educational Resources Information Center

    Meloy, Jim; And Others

    1990-01-01

    The relationship between computer-aided design (CAD), computer-aided manufacturing (CAM), and computer numerical control (CNC) computer applications is described. Tips for helping educate the CAM buyer on what to look for and what to avoid when searching for the most appropriate instructional CAM package are provided. (KR)

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sepke, Scott M.

    In this note, the laser focal plane intensity profile for a beam modeled using the 3D ray trace package in HYDRA is determined. First, the analytical model is developed, followed by a practical numerical model for evaluating the resulting computationally intensive normalization factor for all possible input parameters.

  19. EARL: Exoplanet Analytic Reflected Lightcurves package

    NASA Astrophysics Data System (ADS)

    Haggard, Hal M.; Cowan, Nicolas B.

    2018-05-01

    EARL (Exoplanet Analytic Reflected Lightcurves) computes the analytic form of a reflected lightcurve, given a spherical harmonic decomposition of the planet albedo map and the viewing and orbital geometries. The EARL Mathematica notebook allows rapid computation of reflected lightcurves, thus making lightcurve numerical experiments accessible.
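
    For the simplest case of a uniform-albedo planet, the reflected lightcurve reduces to the classical Lambert phase function, which is the kind of closed-form result EARL generalises to arbitrary spherical-harmonic albedo maps. The NumPy sketch below evaluates that uniform case over one edge-on orbit; the albedo and radius-to-distance ratio are arbitrary illustrative values, and the code is independent of the EARL notebook.

```python
# Reflected lightcurve of a uniform-albedo Lambertian sphere over one edge-on orbit.
# Illustrative values only; EARL handles general spherical-harmonic albedo maps
# analytically in Mathematica.
import numpy as np

def lambert_phase(alpha):
    """Classical Lambert phase function for phase angle alpha (radians)."""
    return (np.sin(alpha) + (np.pi - alpha) * np.cos(alpha)) / np.pi

spherical_albedo = 0.3                              # hypothetical Bond albedo
geometric_albedo = 2.0 / 3.0 * spherical_albedo     # Lambert-sphere relation
r_over_d = 1.0e-4                                   # planet radius / orbital distance (hypothetical)

# Orbital phase measured from inferior conjunction; edge-on geometry.
orbital_phase = np.linspace(0.0, 2.0 * np.pi, 500)
alpha = np.abs(orbital_phase - np.pi)               # phase angle: 0 at full phase, pi at new phase
flux_ratio = geometric_albedo * r_over_d**2 * lambert_phase(alpha)

print("peak planet-to-star flux ratio: %.2e" % flux_ratio.max())
```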

  20. Structural constraints in the packaging of bluetongue virus genomic segments

    PubMed Central

    Burkhardt, Christiane; Sung, Po-Yu; Celma, Cristina C.

    2014-01-01

    The mechanism used by bluetongue virus (BTV) to ensure the sorting and packaging of its 10 genomic segments is still poorly understood. In this study, we investigated the packaging constraints for two BTV genomic segments from two different serotypes. Segment 4 (S4) of BTV serotype 9 was mutated sequentially and packaging of mutant ssRNAs was investigated by two newly developed RNA packaging assay systems, one in vivo and the other in vitro. Modelling of the mutated ssRNA followed by biochemical data analysis suggested that a conformational motif formed by interaction of the 5′ and 3′ ends of the molecule was necessary and sufficient for packaging. A similar structural signal was also identified in S8 of BTV serotype 1. Furthermore, the same conformational analysis of secondary structures for positive-sense ssRNAs was used to generate a chimeric segment that maintained the putative packaging motif but contained unrelated internal sequences. This chimeric segment was packaged successfully, confirming that the motif identified directs the correct packaging of the segment. PMID:24980574

  1. SpeCond: a method to detect condition-specific gene expression

    PubMed Central

    2011-01-01

    Transcriptomic studies routinely measure expression levels across numerous conditions. These datasets allow identification of genes that are specifically expressed in a small number of conditions. However, there are currently no statistically robust methods for identifying such genes. Here we present SpeCond, a method to detect condition-specific genes that outperforms alternative approaches. We apply the method to a dataset of 32 human tissues to determine 2,673 specifically expressed genes. An implementation of SpeCond is freely available as a Bioconductor package at http://www.bioconductor.org/packages/release/bioc/html/SpeCond.html. PMID:22008066

  2. New features in McStas, version 1.5

    NASA Astrophysics Data System (ADS)

    Åstrand, P.-O.; Lefmann, K.; Farhi, E.; Nielsen, K.; Skårup, P.

    The neutron ray-tracing simulation package McStas has attracted numerous users, and the development of the package continues with version 1.5 released at the ICNS 2001 conference. New features include: support for neutron polarisation, labelling of neutrons, realistic source and sample components, and an interface to the Risø instrument-control software TASCOM. We give a general introduction to McStas and present the latest developments. In particular, we give an example of how the neutron-label option has been used to locate the origin of a spurious side-peak, observed in an experiment with RITA-1 at Risø.

  3. Visualization of electronic density

    DOE PAGES

    Grosso, Bastien; Cooper, Valentino R.; Pine, Polina; ...

    2015-04-22

    An atom’s volume depends on its electronic density. Although this density can only be evaluated exactly for hydrogen-like atoms, there are many excellent numerical algorithms and packages to calculate it for other materials. 3D visualization of charge density is challenging, especially when several molecular/atomic levels are intertwined in space. We explore several approaches to 3D charge density visualization, including the extension of an anaglyphic stereo visualization application based on the AViz package to larger structures such as nanotubes. We will describe motivations and potential applications of these tools for answering interesting questions about nanotube properties.
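
    The hydrogen-like case mentioned above, where the density is known exactly, makes a convenient test object for any charge-density visualization pipeline: the ground-state density is |psi_1s|^2 = Z^3 exp(-2Zr/a0)/(pi a0^3). The short script below tabulates it on a 3-D grid in a form a volume viewer can ingest; the grid size, box extent, and output file name are arbitrary choices, and the script is not tied to AViz.

```python
# Exact ground-state electron density of a hydrogen-like atom sampled on a 3-D grid;
# grid size, box extent and output file are arbitrary illustrative choices.
import numpy as np

Z = 1                                     # nuclear charge (hydrogen)
a0 = 1.0                                  # Bohr radius in atomic units
n = 64                                    # grid points per axis
axis = np.linspace(-8.0, 8.0, n)          # box of +/- 8 Bohr radii
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
r = np.sqrt(x**2 + y**2 + z**2)

density = Z**3 / (np.pi * a0**3) * np.exp(-2.0 * Z * r / a0)    # |psi_1s|^2

# Sanity check: the density should integrate to one electron over all space.
dV = (axis[1] - axis[0]) ** 3
print("integrated charge ~", density.sum() * dV)    # close to 1, up to grid error
np.save("h_1s_density.npy", density)                # cube ready for a 3-D volume viewer
```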

  4. Novel Techniques for Millimeter-Wave Packages

    NASA Technical Reports Server (NTRS)

    Herman, Martin I.; Lee, Karen A.; Kolawa, Elzbieta A.; Lowry, Lynn E.; Tulintseff, Ann N.

    1995-01-01

    A new millimeter-wave package architecture with supporting electrical, mechanical and material science experiment and analysis is presented. This package is well suited for discrete devices, monolithic microwave integrated circuits (MMIC's) and multichip module (MCM) applications. It has low-loss, wide-band RF transitions, which are necessary to overcome manufacturing tolerances and lead to a lower per-unit cost. Potential applications of this new packaging architecture that go beyond the standard requirements of device protection include integration of antennas, compatibility with photonic networks, and direct transitions to waveguide systems. Techniques for electromagnetic analysis, thermal control and hermetic sealing were explored. Three dimensional electromagnetic analysis was performed using a finite difference time-domain (FDTD) algorithm and experimentally verified for millimeter-wave package input and output transitions. New multi-material system concepts (AlN, Cu, and diamond thin films) which allow excellent surface finishes to be achieved with enhanced thermal management have been investigated. A new approach utilizing block copolymer coatings was employed to hermetically seal packages, which met MIL STD-883.

  5. Three-Dimensional Field Solutions for Multi-Pole Cylindrical Halbach Arrays in an Axial Orientation

    NASA Technical Reports Server (NTRS)

    Thompson, William K.

    2006-01-01

    This article presents three-dimensional B field solutions for the cylindrical Halbach array in an axial orientation. This arrangement has applications in the design of axial motors and passive axial magnetic bearings and couplers. The analytical model described here assumes ideal magnets with fixed and uniform magnetization. The field component functions are expressed as sums of 2-D definite integrals that are easily computed by a number of mathematical analysis software packages. The analysis is verified with sample calculations and the results are compared to equivalent results from traditional finite-element analysis (FEA). The field solutions are then approximated for use in flux linkage and induced EMF calculations in nearby stator windings by expressing the field variation with angular displacement as a pure sinusoidal function whose amplitude depends on radial and axial position. The primary advantage of numerical implementation of the analytical approach presented in the article is that it lends itself more readily to parametric analysis and design tradeoffs than traditional FEA models.

  6. Evaluation of strategies to communicate harmful and potentially harmful constituent (HPHC) information through cigarette package inserts: a discrete choice experiment.

    PubMed

    Salloum, Ramzi G; Louviere, Jordan J; Getz, Kayla R; Islam, Farahnaz; Anshari, Dien; Cho, Yoojin; O'Connor, Richard J; Hammond, David; Thrasher, James F

    2017-07-13

    The US Food and Drug Administration (FDA) has regulatory authority to use inserts to communicate with consumers about harmful and potentially harmful constituents (HPHCs) in tobacco products; however, little is known about the most effective manner for presenting HPHC information. In a discrete choice experiment, participants evaluated eight choice sets, each of which showed two cigarette packages from four different brands and tar levels (high vs low), accompanied by an insert that included between-subject manipulations (ie, listing of HPHCs vs grouping by disease outcome and numeric values ascribed to HPHCs vs no numbers) and within-subject manipulations (ie, 1 of 4 warning topics; statement linking an HPHC with disease vs statement with no HPHC link). For each choice set, participants were asked: (1) which package is more harmful and (2) which motivates them to not smoke; each with a 'no difference' option. Alternative-specific logit models regressed choice on attribute levels. 1212 participants were recruited from an online consumer panel (725 18-29-year-old smokers and susceptible non-smokers and 487 30-64-year-old smokers). Participants were more likely to endorse high-tar products as more harmful than low-tar products, with a greater effect when numeric HPHC information was present. Compared with a simple warning statement, the statement linking HPHCs with disease encouraged quit motivation. Numeric HPHC information on inserts appears to produce misunderstandings that some cigarettes are less harmful than others. Furthermore, brief narratives that link HPHCs to smoking-related disease may promote cessation versus communications that do not explicitly link HPHCs to disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. Algebraic grid generation for coolant passages of turbine blades with serpentine channels and pin fins

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Roelke, R. J.; Steinthorsson, E.

    1991-01-01

    In order to study numerically details of the flow and heat transfer within coolant passages of turbine blades, a method must first be developed to generate grid systems within the very complicated geometries involved. In this study, a grid generation package was developed that is capable of generating the required grid systems. The package developed is based on an algebraic grid generation technique that permits the user considerable control over how grid points are to be distributed in a very explicit way. These controls include orthogonality of grid lines next to boundary surfaces and ability to cluster about arbitrary points, lines, and surfaces. This paper describes that grid generation package and shows how it can be used to generate grid systems within complicated-shaped coolant passages via an example.
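
    Algebraic grid generation of the kind described rests on transfinite interpolation between boundary curves plus stretching functions for clustering. The fragment below shows the basic 2-D linear transfinite interpolation step with a simple one-sided clustering function; it is a textbook illustration with made-up boundary curves, not the coolant-passage package itself.

```python
# 2-D linear transfinite interpolation (TFI) between four boundary curves, with a
# tanh stretching function that clusters grid lines near the upper wall. Textbook
# algebraic grid generation; not the coolant-passage package from the abstract.
import numpy as np

ni, nj, beta = 41, 21, 2.0
s = np.linspace(0.0, 1.0, ni)[:, None]                    # streamwise parameter
t = np.linspace(0.0, 1.0, nj)[None, :]
t = 1.0 - np.tanh(beta * (1.0 - t)) / np.tanh(beta)       # cluster nodes near t = 1

# Boundary curves (corner-consistent): curved bottom/top walls, straight ends.
xb, yb = s, 0.2 * np.sin(np.pi * s)                       # bottom wall
xt, yt = s, 1.0 + 0.1 * np.sin(np.pi * s)                 # top wall
xl, yl = np.zeros_like(t), t                              # left end (x = 0)
xr, yr = np.ones_like(t), t                               # right end (x = 1)

# Linear TFI: blend the four edges, then subtract the doubly-counted corner terms.
x = ((1 - t) * xb + t * xt + (1 - s) * xl + s * xr
     - (s * (1 - t) * 1.0 + s * t * 1.0))                 # corners (1,0) and (1,1) have x = 1
y = ((1 - t) * yb + t * yt + (1 - s) * yl + s * yr
     - ((1 - s) * t * 1.0 + s * t * 1.0))                 # corners (0,1) and (1,1) have y = 1

print(x.shape, y.shape)        # (41, 21) arrays of structured-grid node coordinates
```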

  8. User manual for Blossom statistical package for R

    USGS Publications Warehouse

    Talbert, Marian; Cade, Brian S.

    2005-01-01

    Blossom is an R package with functions for making statistical comparisons with distance-function based permutation tests developed by P.W. Mielke, Jr. and colleagues at Colorado State University (Mielke and Berry, 2001) and for testing parameters estimated in linear models with permutation procedures developed by B. S. Cade and colleagues at the Fort Collins Science Center, U.S. Geological Survey. This manual is intended to provide identical documentation of the statistical methods and interpretations as the manual by Cade and Richards (2005) does for the original Fortran program, but with changes made with respect to command inputs and outputs to reflect the new implementation as a package for R (R Development Core Team, 2012). This implementation in R has allowed for numerous improvements not supported by the Cade and Richards (2005) Fortran implementation, including use of categorical predictor variables in most routines.
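
    The permutation-testing idea behind Blossom can be illustrated, in spirit, by a simple two-group test on the difference of means: the observed statistic is compared against its distribution under random relabelling. The snippet below is only a conceptual Python illustration; the Blossom R package implements much richer distance-function (MRPP-type) and regression-parameter statistics.

```python
# Conceptual two-sample permutation test on the difference of means; the Blossom R
# package implements richer distance-function statistics, this shows only the idea.
import numpy as np

rng = np.random.default_rng(42)
group_a = rng.normal(0.0, 1.0, 25)
group_b = rng.normal(0.6, 1.0, 25)            # shifted mean, so the test should reject

observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])

n_perm = 10_000
count = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)
    stat = perm[:25].mean() - perm[25:].mean()
    if abs(stat) >= abs(observed):
        count += 1

p_value = (count + 1) / (n_perm + 1)          # add-one correction keeps the test valid
print(f"observed difference {observed:.3f}, permutation p-value {p_value:.4f}")
```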

  9. HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics

    NASA Astrophysics Data System (ADS)

    Wiebusch, Martin

    2015-10-01

    This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.

  10. KAPPA -- Kernel Application Package

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm J.; Berry, David. S.

    KAPPA is an applications package comprising about 180 general-purpose commands for image processing, data visualisation, and manipulation of the standard Starlink data format---the NDF. It is intended to work in conjunction with Starlink's various specialised packages. In addition to the NDF, KAPPA can also process data in other formats by using the `on-the-fly' conversion scheme. Many commands can process data arrays of arbitrary dimension, and others work on both spectra and images. KAPPA operates from both the UNIX C-shell and the ICL command language. This document describes how to use KAPPA and its features. There is some description of techniques too, including a section on writing scripts. This document includes several tutorials and is illustrated with numerous examples. The bulk of this document comprises detailed descriptions of each command as well as classified and alphabetical summaries.

  11. Application of Quality by Design (QbD) Principles to Extractables/Leachables Assessment. Establishing a Design Space for Terminally Sterilized Aqueous Drug Products Stored in a Plastic Packaging System.

    PubMed

    Jenke, Dennis

    2010-01-01

    The concept of quality by design (QbD) reflects the current global regulatory thinking related to pharmaceutical products. A cornerstone of the QbD paradigm is the concept of a design space, where the design space is a multidimensional combination of input variables and process parameters that have been demonstrated to provide the assurance of product quality. If a design space can be established for a pharmaceutical process or product, then operation within the design space confirms that the product or process output possesses the required quality attributes. This concept of design space can be applied to the safety (leachables) assessment of drug products manufactured and stored in packaging systems. Critical variables in such a design space would include those variables that affect the interaction of the drug product and its packaging, including (a) composition of the drug product, (b) composition of the packaging system, (c) configuration of the packaging system, and (d) the conditions of contact. This paper proposes and justifies such a leachables design space for aqueous drug products packaged in a specific plastic packaging system. Such a design space has the following boundaries: Aqueous drug products with a pH in the range of 2 to 8 and that contain no polarity-impacting agents such as organic solubilizers and stabilizers (addressing variable a). Packaging systems manufactured from materials that meet the system's existing material specifications (addressing variable b). Nominal fill volumes from 50 to 1000 mL (addressing variable c). Products subjected to terminal sterilization and then stored at room temperature for a period of up to 24 months (addressing variable d). The ramification of such a design space is that any drug product that falls within these boundaries is deemed to be compatible with the packaging system, from the perspective of safety, without the requirement of supporting drug product testing. When drug products are packaged in plastic container systems, substances may leach from the container and accumulate in the product. It is necessary that the drug product's vendor demonstrate that any such leaching does not occur to the extent that the leached substances adversely affect the product's safety and/or efficacy. One method for accomplishing this objective is via analysis of the drug product to identify and quantify the leached substances. When a particular packaging system is utilized for multiple drug products, one reaches the point, after testing numerous drug products, where the leaching properties of the packaging system are well known and readily predictable. In such a case, testing of additional products in the same packaging system produces no new information and thus becomes redundant and unnecessary. The quality by design (QbD) principle can be simply stated as follows: once a system has been tested to the extent that the test results are predictable, further testing can be replaced by establishing that the system was operating within a defined design space. The purpose of this paper is to demonstrate the application of QbD principles to a packaging system that has been utilized with over 12 parenteral drug products. The paper concludes that the leachables profile of all drug products that fit a certain description (the design space) is known and predictable.

  12. dartr: An r package to facilitate analysis of SNP data generated from reduced representation genome sequencing.

    PubMed

    Gruber, Bernd; Unmack, Peter J; Berry, Oliver F; Georges, Arthur

    2018-05-01

    Although vast technological advances have been made and genetic software packages are growing in number, it is not a trivial task to analyse SNP data. We announce a new r package, dartr, enabling the analysis of single nucleotide polymorphism data for population genomic and phylogenomic applications. dartr provides user-friendly functions for data quality control and marker selection, and permits rigorous evaluations of conformation to Hardy-Weinberg equilibrium, gametic-phase disequilibrium and neutrality. The package reports standard descriptive statistics, permits exploration of patterns in the data through principal components analysis and conducts standard F-statistics, as well as basic phylogenetic analyses, population assignment, isolation by distance, and exports data to a variety of commonly used downstream applications (e.g., newhybrids, faststructure and phylogeny applications) outside of the r environment. The package serves two main purposes: first, a user-friendly approach to lower the hurdle to analyse such data; therefore, the package comes with a detailed tutorial targeted at the r beginner to allow data analysis without requiring deep knowledge of r. Second, we use a single, well-established format, genlight from the adegenet package, as input for all our functions to avoid data reformatting. By strictly using the genlight format, we hope to facilitate this format as the de facto standard of future software developments and hence reduce the format jungle of genetic data sets. The dartr package is available via the r CRAN network and GitHub. © 2017 John Wiley & Sons Ltd.
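
    The quality-control and exploratory steps that dartr wraps (locus filtering followed by principal components analysis) can be mimicked outside R. The snippet below filters a toy 0/1/2 genotype matrix by call rate and minor-allele frequency and then computes PC scores; it is a conceptual Python analogue of the genlight-based workflow, not the dartr API, and the thresholds are illustrative.

```python
# Conceptual analogue of a dartr-style workflow on a toy 0/1/2 genotype matrix:
# filter loci by call rate and minor-allele frequency, then run a PCA.
# This is NOT the dartr R API, only the same idea in Python.
import numpy as np

rng = np.random.default_rng(7)
n_ind, n_loci = 100, 500
geno = rng.integers(0, 3, size=(n_ind, n_loci)).astype(float)   # 0/1/2 allele counts
geno[rng.random(geno.shape) < 0.05] = np.nan                     # 5% missing calls

call_rate = 1.0 - np.isnan(geno).mean(axis=0)
maf = np.nanmean(geno, axis=0) / 2.0
maf = np.minimum(maf, 1.0 - maf)
keep = (call_rate > 0.95) & (maf > 0.05)                         # QC thresholds (illustrative)
filtered = geno[:, keep]

# Mean-impute remaining missing calls, centre, and take the leading principal components.
col_mean = np.nanmean(filtered, axis=0)
filled = np.where(np.isnan(filtered), col_mean, filtered) - col_mean
u, sing, vt = np.linalg.svd(filled, full_matrices=False)
scores = u[:, :2] * sing[:2]                                     # individual PC scores
print("loci kept:", keep.sum(), "| PC score matrix:", scores.shape,
      "| PC1 variance share: %.2f" % (sing[0] ** 2 / (sing ** 2).sum()))
```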

  13. Toward the greening of nuclear energy: A content analysis of nuclear energy frames from 1991 to 2008

    NASA Astrophysics Data System (ADS)

    Miller, Sonya R.

    Framing theory has emerged as one of the predominant theories employed in mass communications research in the 21st century. Frames are identified as interpretive packages for content where some issue attributes are highlighted over other attributes. While framing effects studies appear plentiful, longitudinal studies assessing trends in dominant framing packages and story elements for an issue appear to be less understood. Through content analysis, this study examines dominant frame packages, story elements, headline tone, story tone, stereotypes, and source attribution for nuclear energy from 1991-2008 in the New York Times, USA Today, the Wall Street Journal, and the Washington Post. Unlike many content analysis studies, this study compares intercoder reliability among three indices: percentage agreement, proportional reduction of loss, and Scott's Pi. The newspapers represented in this study possess a commonality in the types of dominant frame packages employed. Significant dominant frame packages among the four newspapers include human/health, proliferation, procedural, and marketplace. While the procedural frame package was more likely to appear prior to the 1997 Kyoto Protocol, the proliferation frame package was more likely to appear after the Kyoto Protocol. Over time, the sustainable frame package demonstrated increased significance. This study is part of the growing literature regarding the function of frames over time.

  14. Kranc: a Mathematica package to generate numerical codes for tensorial evolution equations

    NASA Astrophysics Data System (ADS)

    Husa, Sascha; Hinder, Ian; Lechner, Christiane

    2006-06-01

    We present a suite of Mathematica-based computer-algebra packages, termed "Kranc", which comprise a toolbox to convert certain (tensorial) systems of partial differential evolution equations to parallelized C or Fortran code for solving initial boundary value problems. Kranc can be used as a "rapid prototyping" system for physicists or mathematicians handling very complicated systems of partial differential equations, but through integration into the Cactus computational toolkit we can also produce efficient parallelized production codes. Our work is motivated by the field of numerical relativity, where Kranc is used as a research tool by the authors. In this paper we describe the design and implementation of both the Mathematica packages and the resulting code, we discuss some example applications, and provide results on the performance of an example numerical code for the Einstein equations. Program summary: Title of program: Kranc. Catalogue identifier: ADXS_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXS_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Distribution format: tar.gz. Computer for which the program is designed and others on which it has been tested: general computers which run Mathematica (for code generation) and Cactus (for numerical simulations), tested under Linux. Programming language used: Mathematica, C, Fortran 90. Memory required to execute with typical data: this depends on the number of variables and grid size; the included ADM example requires 4308 KB. Has the code been vectorized or parallelized: the code is parallelized based on the Cactus framework. Number of bytes in distributed program, including test data, etc.: 1 578 142. Number of lines in distributed program, including test data, etc.: 11 711. Nature of physical problem: solution of partial differential equations in three space dimensions, which are formulated as an initial value problem. In particular, the program is geared towards handling very complex tensorial equations as they appear, e.g., in numerical relativity. The worked out examples comprise the Klein-Gordon equations, the Maxwell equations, and the ADM formulation of the Einstein equations. Method of solution: the method of numerical solution is finite differencing and method of lines time integration; the numerical code is generated through a high level Mathematica interface. Restrictions on the complexity of the program: typical numerical relativity applications will contain up to several dozen evolution variables and thousands of source terms; Cactus applications have shown scaling up to several thousand processors and grid sizes exceeding 500³. Typical running time: this depends on the number of variables and the grid size; the included ADM example takes approximately 100 seconds on a 1600 MHz Intel Pentium M processor. Unusual features of the program: based on Mathematica and Cactus.
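
    The numerical scheme named in the summary, finite differencing in space with method-of-lines time integration, can be shown at toy scale. The fragment below hand-codes for the 1-D wave equation what Kranc generates automatically for much larger tensorial systems: second-order centred differences and classical RK4 stepping. It is an illustration of the scheme only, not Kranc-generated code.

```python
# Toy illustration of the scheme used by Kranc-generated codes: finite differencing
# in space plus method-of-lines time integration (RK4), here for the 1-D wave
# equation u_tt = u_xx written as a first-order system. Hand-written, not Kranc output.
import numpy as np

nx, L = 201, 1.0
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.5 * dx                                  # CFL-limited time step

u = np.exp(-200 * (x - 0.5) ** 2)              # initial Gaussian displacement
v = np.zeros_like(u)                           # initial velocity

def rhs(state):
    u, v = state
    uxx = np.zeros_like(u)
    uxx[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # centred second derivative
    return np.array([v, uxx])                  # boundary points left untouched (fixed ends)

state = np.array([u, v])
for _ in range(800):                           # classical RK4 method-of-lines stepping
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    state = state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

print("max |u| after evolution:", np.abs(state[0]).max())
```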

  15. Trends in microbial control techniques for poultry products.

    PubMed

    Silva, Filomena; Domingues, Fernanda C; Nerín, Cristina

    2018-03-04

    Fresh poultry meat and poultry products are highly perishable foods and high potential sources of human infection due to the presence of several foodborne pathogens. Focusing on the microbial control of poultry products, the food industry generally implements numerous preventive measures based on the Hazard Analysis and Critical Control Points (HACCP) food safety management system certification together with technological steps, such as refrigeration coupled to modified atmosphere packaging that are able to control identified potential microbial hazards during food processing. However, in recent years, to meet the demand of consumers for minimally processed, high-quality, and additive-free foods, technologies are emerging associated with nonthermal microbial inactivation, such as high hydrostatic pressure, irradiation, and natural alternatives, such as biopreservation or the incorporation of natural preservatives in packaging materials. These technologies are discussed throughout this article, emphasizing their pros and cons regarding the control of poultry microbiota and their effects on poultry sensory properties. The discussion for each of the preservation techniques mentioned will be provided with as much detail as the data and studies provided in the literature for poultry meat and products allow. These new approaches, on their own, have proved to be effective against a wide range of microorganisms in poultry meat. However, since some of these emergent technologies still do not have full consumer's acceptability and, taking into consideration the hurdle technology concept for poultry processing, it is suggested that they will be used as combined treatments or, more frequently, in combination with modified atmosphere packaging.

  16. An intuitive Python interface for Bioconductor libraries demonstrates the utility of language translators.

    PubMed

    Gautier, Laurent

    2010-12-21

    Computer languages can be domain-related, and in the case of multidisciplinary projects, knowledge of several languages will be needed in order to quickly implement ideas. Moreover, each computer language has its relative strong points, making some languages better suited than others for a given task to be implemented. The Bioconductor project, based on the R language, has become a reference for the numerical processing and statistical analysis of data coming from high-throughput biological assays, providing a rich selection of methods and algorithms to the research community. At the same time, Python has matured as a rich and reliable language for the agile development of prototypes or final implementations, as well as for handling large data sets. The data structures and functions from Bioconductor can be exposed to Python as a regular library. This allows a fully transparent and native use of Bioconductor from Python, without one having to know the R language and with only a small community of translators required to know both. To demonstrate this, we have implemented such Python representations for key infrastructure packages in Bioconductor, letting a Python programmer handle annotation data, microarray data, and next-generation sequencing data. Bioconductor is now not solely reserved for R users. Building a Python application using Bioconductor functionality can be done just as if Bioconductor were a Python package. Moreover, similar principles can be applied to other languages and libraries. Our Python package is available at: http://pypi.python.org/pypi/rpy2-bioconductor-extensions/.
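
    The language-translator approach described above is what the rpy2 bridge provides: an R package is imported into Python and its functions are called natively. The minimal example below uses the 'stats' package that ships with base R so that it runs without Bioconductor installed; Bioconductor libraries are imported through the same importr mechanism once available.

```python
# Minimal rpy2 usage: expose an R package to Python and call one of its functions.
# 'stats' ships with base R; Bioconductor packages are imported the same way once installed.
import rpy2.robjects as ro
from rpy2.robjects.packages import importr

stats = importr("stats")                       # the R package becomes a Python namespace

x = ro.FloatVector([1.2, 2.4, 3.1, 4.8, 5.0])
y = ro.FloatVector([1.0, 2.0, 2.9, 4.1, 5.2])

# Call R's cor.test from Python (dots in R names are exposed as underscores).
result = stats.cor_test(x, y)
print("correlation estimate:", result.rx2("estimate")[0])
print("p-value:", result.rx2("p.value")[0])
```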

  17. Achieving better cooling of turbine blades using numerical simulation methods

    NASA Astrophysics Data System (ADS)

    Inozemtsev, A. A.; Tikhonov, A. S.; Sendyurev, C. I.; Samokhvalov, N. Yu.

    2013-02-01

    A new design of the first-stage nozzle vane for the turbine of a prospective gas-turbine engine is considered. The blade's thermal state is numerically simulated in a conjugate formulation using the ANSYS CFX 13.0 software package. Critical locations in the blade design are determined from the distribution of heat fluxes, and measures aimed at achieving more efficient cooling are analyzed. A substantially lower (by 50-100°C) maximum metal temperature has been achieved as a result of this work.

  18. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main stages in the development of numerical tools for simulating the deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for the construction of difference grids to the 3D case is shown. A 3D realization of the discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is demonstrated.

  19. Numerical simulation of polyester coextrusion: Influence of the thermal parameters and the die geometry on interfacial instabilities

    NASA Astrophysics Data System (ADS)

    Mahdaoui, O.; Agassant, J.-F.; Laure, P.; Valette, R.; Silva, L.

    2007-04-01

    The polymer coextrusion process is a new method of sheet metal lining. It allows lacquers used for steel protection in the food packaging industry to be replaced. The coextrusion process may exhibit flow instabilities at the interface between the two polymer layers. The objective of this study is to check the influence of processing and rheology parameters on these instabilities. Finite element numerical simulations of coextrusion make it possible to investigate various stable and unstable flow configurations.

  20. Models for twistable elastic polymers in Brownian dynamics, and their implementation for LAMMPS.

    PubMed

    Brackley, C A; Morozov, A N; Marenduzzo, D

    2014-04-07

    An elastic rod model for semi-flexible polymers is presented. Theory for a continuum rod is reviewed, and it is shown that a popular discretised model used in numerical simulations gives the correct continuum limit. Correlation functions relating to both bending and twisting of the rod are derived for both continuous and discrete cases, and results are compared with numerical simulations. Finally, two possible implementations of the discretised model in the multi-purpose molecular dynamics software package LAMMPS are described.
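
    The bending correlation function mentioned above can be estimated directly from simulated bead positions. The Python sketch below computes the tangent-tangent correlation of a discretised rod, which for a semi-flexible polymer is expected to decay roughly as exp(-k a / l_p) with bond length a and persistence length l_p; it is a generic post-processing illustration, not code from the paper or from LAMMPS.

        import numpy as np

        def tangent_correlation(positions, max_lag):
            """<t(i) . t(i+k)> for a discretised rod given an (N, 3) array of bead positions."""
            bonds = np.diff(positions, axis=0)
            t = bonds / np.linalg.norm(bonds, axis=1, keepdims=True)
            corr = []
            for k in range(max_lag):
                dots = np.sum(t[: len(t) - k] * t[k:], axis=1)
                corr.append(dots.mean())
            return np.array(corr)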

  1. Validation of a RANS transition model using a high-order weighted compact nonlinear scheme

    NASA Astrophysics Data System (ADS)

    Tu, GuoHua; Deng, XiaoGang; Mao, MeiLiang

    2013-04-01

    A modified transition model is given based on the shear stress transport (SST) turbulence model and an intermittency transport equation. The energy gradient term in the original model is replaced by the flow strain rate to save computational cost. The model employs local variables only, so it can be conveniently implemented in modern computational fluid dynamics codes. The fifth-order weighted compact nonlinear scheme and the fourth-order staggered scheme are applied to discretize the governing equations for the purpose of minimizing discretization errors, so as to mitigate the confusion between numerical errors and transition model errors. The high-order package is compared with a second-order TVD method on simulating the transitional flow of a flat plate. Numerical results indicate that the high-order package gives better grid convergence properties than the second-order method. Validation of the transition model is performed for transitional flows ranging from low speed to hypersonic speed.

  2. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software for numerical inversion of the Laplace transform based on a Fourier series method is developed to meet the need of solving intensive computational problems involving oscillatory water level responses to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementation of MPI techniques with a distributed memory architecture speeds up the processing and improves the efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience in using MPI but who wish to get off to a quick start in parallel computing.
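
    For readers unfamiliar with the underlying numerical method, the sketch below shows a serial Python version of a Fourier-series type Laplace inversion (Dubner-Abate/Crump form). It is a generic illustration of this algorithm family, not a port of TOMS Algorithm 796 or of the Fortran-MPI code; the choices of the period T and the shift a are simple heuristics.

        import numpy as np

        def invert_laplace_fourier(F, t, n_terms=2000):
            """Approximate f(t) from its Laplace transform F(s) by a Fourier series."""
            T = 2.0 * t              # half-period of the underlying Fourier series
            a = 8.0 / T              # contour shift; controls the truncation error
            k = np.arange(1, n_terms + 1)
            s = a + 1j * k * np.pi / T
            series = np.sum(np.real(F(s) * np.exp(1j * k * np.pi * t / T)))
            return np.exp(a * t) / T * (0.5 * np.real(F(a)) + series)

        # Check against a known pair: F(s) = 1/(s + 1)  <=>  f(t) = exp(-t).
        print(invert_laplace_fourier(lambda s: 1.0 / (s + 1.0), t=1.0))  # ~0.3679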

  3. Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery.

    PubMed

    Liu, Han; Wang, Lie; Zhao, Tuo

    2015-08-01

    We propose a calibrated multivariate regression method named CMR for fitting high dimensional multivariate regression models. Compared with existing methods, CMR calibrates regularization for each regression task with respect to its noise level so that it simultaneously attains improved finite-sample performance and tuning insensitiveness. Theoretically, we provide sufficient conditions under which CMR achieves the optimal rate of convergence in parameter estimation. Computationally, we propose an efficient smoothed proximal gradient algorithm with a worst-case numerical rate of convergence O(1/ε), where ε is a pre-specified accuracy of the objective function value. We conduct thorough numerical simulations to illustrate that CMR consistently outperforms other high dimensional multivariate regression methods. We also apply CMR to solve a brain activity prediction problem and find that it is as competitive as a handcrafted model created by human experts. The R package camel implementing the proposed method is available on the Comprehensive R Archive Network http://cran.r-project.org/web/packages/camel/.
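
    To make the algorithmic idea concrete, here is a toy Python sketch of a proximal gradient iteration for multivariate regression with an entrywise L1 penalty. It only illustrates the general class of algorithm referred to above; it is not the smoothed proximal gradient method implemented in the camel package, and the penalty, step size and iteration count are placeholders.

        import numpy as np

        def soft_threshold(z, tau):
            return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

        def proximal_gradient(X, Y, lam, step, n_iter=500):
            """Minimize ||X B - Y||_F^2 / (2 n) + lam * ||B||_1 by proximal gradient steps."""
            B = np.zeros((X.shape[1], Y.shape[1]))
            for _ in range(n_iter):
                grad = X.T @ (X @ B - Y) / X.shape[0]
                B = soft_threshold(B - step * grad, step * lam)
            return B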

  4. Common Data Models and Efficient Reproducible Workflows for Distributed Ocean Model Skill Assessment

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Snowden, D. P.; Howlett, E.; Fernandes, F. A.

    2014-12-01

    Model skill assessment requires discovery, access, analysis, and visualization of information from both sensors and models, and traditionally has been possible only for a few experts. The US Integrated Ocean Observing System (US-IOOS) consists of 17 Federal Agencies and 11 Regional Associations that produce data from various sensors and numerical models; exactly the information required for model skill assessment. US-IOOS is seeking to develop documented skill assessment workflows that are standardized, efficient, and reproducible so that a much wider community can participate in the use and assessment of model results. Standardization requires common data models for observational and model data. US-IOOS relies on the CF Conventions for observations and structured grid data, and on the UGRID Conventions for unstructured (e.g. triangular) grid data. This allows applications to obtain only the data they require in a uniform and parsimonious way using web services: OPeNDAP for model output and OGC Sensor Observation Service (SOS) for observed data. Reproducibility is enabled with IPython Notebooks shared on GitHub (http://github.com/ioos). These capture the entire skill assessment workflow, including user input, search, access, analysis, and visualization, ensuring that workflows are self-documenting and reproducible by anyone, using free software. The Python packages that implement these common data models and are required to run the workflows (pyugrid, pyoos, and the British Met Office Iris package) are available on GitHub and on Binstar.org so that users can run scenarios using the free Anaconda Python distribution. Hosted services such as Wakari enable anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. We are also experimenting with Wakari Enterprise, which allows multi-user access from a web browser to an IPython Server running where large quantities of model output reside, increasing the efficiency. The open development and distribution of these workflows, and the software on which they depend, is an educational resource for those new to the field and a center of focus where practitioners can contribute new software and ideas.
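
    The data-access step in such a workflow can be reproduced with a few lines of Python. The sketch below opens a model output file over OPeNDAP with the netCDF4 library; the URL and the variable name are placeholders rather than real US-IOOS endpoints.

        from netCDF4 import Dataset

        url = "http://example.org/thredds/dodsC/model_output.nc"  # hypothetical OPeNDAP endpoint
        ds = Dataset(url)                      # netCDF4 opens OPeNDAP URLs transparently
        print(ds.variables.keys())             # inspect the CF-described variables
        zeta = ds.variables["zeta"][0, :]      # read one time slice of an assumed variable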

  5. Computer investigations of the turbulent flow around a NACA2415 airfoil wind turbine

    NASA Astrophysics Data System (ADS)

    Driss, Zied; Chelbi, Tarek; Abid, Mohamed Salah

    2015-12-01

    In this work, computer investigations are carried out to study the flow field developing around a NACA2415 airfoil wind turbine. The Navier-Stokes equations in conjunction with the standard k-ɛ turbulence model are considered. These equations are solved numerically to determine the local characteristics of the flow. The models tested are implemented in the software "SolidWorks Flow Simulation", which uses a finite volume scheme. The numerical results are validated against experiments conducted in an open wind tunnel. This will help improve the aerodynamic efficiency in the design of packaged installations of the NACA2415 airfoil type wind turbine.

  6. Numerical simulation of vessel dynamics in manoeuvrability and seakeeping problems

    NASA Astrophysics Data System (ADS)

    Blishchik, A. E.; Taranov, A. E.

    2018-05-01

    This paper deals with some examples of numerical modelling of ship dynamics problems and comparison of the results with corresponding experimental data. Two kinds of simulation were considered: the self-propelled turning motion of the crude carrier KVLCC2 and the motions of the container carrier S 175 under wave loading. Mesh generation and calculation were performed in the STAR-CCM+ package. The URANS equations were used as the governing system, closed by the k-ω SST turbulence model. The vessel had several degrees of freedom, depending on the task. Based on the results of this research, conclusions were drawn concerning the applicability of the numerical methods used.

  7. Finite element analysis of the design and manufacture of thin-walled pressure vessels used as aerosol cans

    NASA Astrophysics Data System (ADS)

    Abdussalam, Ragba Mohamed

    Thin-walled cylinders are used extensively in the food packaging and cosmetics industries. The cost of material is a major contributor to the overall cost and so improvements in design and manufacturing processes are always being sought. Shape optimisation provides one method for such improvements. Aluminium aerosol cans are a particular form of thin-walled cylinder with a complex shape consisting of truncated cone top, parallel cylindrical section and inverted dome base. They are manufactured in one piece by a reverse-extrusion process, which produces a vessel with a variable thickness from 0.31 mm in the cylinder up to 1.31 mm in the base for a 53 mm diameter can. During manufacture, packaging and charging, they are subjected to pressure, axial and radial loads and design calculations are generally outside the British and American pressure vessel codes. 'Design-by-test' appears to be the favoured approach. However, a more rigorous approach is needed in order to optimise the designs. Finite element analysis (FEA) is a powerful tool for predicting stress, strain and displacement behaviour of components and structures. FEA is also used extensively to model manufacturing processes. In this study, elastic and elastic-plastic FEA has been used to develop a thorough understanding of the mechanisms of yielding, 'dome reversal' (an inherent safety feature, where the base suffers elastic-plastic buckling at a pressure below the burst pressure) and collapse due to internal pressure loading and how these are affected by geometry. It has also been used to study the buckling behaviour under compressive axial loading. Furthermore, numerical simulations of the extrusion process (in order to investigate the effects of tool geometry, friction coefficient and boundary conditions) have been undertaken. Experimental verification of the buckling and collapse behaviours has also been carried out and there is reasonable agreement between the experimental data and the numerical predictions.

  8. Residual gas analysis device

    DOEpatents

    Thornberg, Steven M [Peralta, NM

    2012-07-31

    A system is provided for testing the hermeticity of a package, such as a microelectromechanical systems package containing a sealed gas volume, with a sampling device that has the capability to isolate the package and breach the gas seal connected to a pulse valve that can controllably transmit small volumes down to 2 nanoliters to a gas chamber for analysis using gas chromatography/mass spectroscopy diagnostics.

  9. Unlocking the potential of publicly available microarray data using inSilicoDb and inSilicoMerging R/Bioconductor packages.

    PubMed

    Taminau, Jonatan; Meganck, Stijn; Lazar, Cosmin; Steenhoff, David; Coletta, Alain; Molter, Colin; Duque, Robin; de Schaetzen, Virginie; Weiss Solís, David Y; Bersini, Hugues; Nowé, Ann

    2012-12-24

    With an abundant amount of microarray gene expression data sets available through public repositories, new possibilities lie in combining multiple existing data sets. In this new context, analysis itself is no longer the problem, but retrieving and consistently integrating all this data before delivering it to the wide variety of existing analysis tools becomes the new bottleneck. We present the newly released inSilicoMerging R/Bioconductor package which, together with the earlier released inSilicoDb R/Bioconductor package, allows consistent retrieval, integration and analysis of publicly available microarray gene expression data sets. Inside the inSilicoMerging package a set of five visual and six quantitative validation measures are available as well. By providing (i) access to uniformly curated and preprocessed data, (ii) a collection of techniques to remove the batch effects between data sets from different sources, and (iii) several validation tools enabling the inspection of the integration process, these packages enable researchers to fully explore the potential of combining gene expression data for downstream analysis. The power of using both packages is demonstrated by programmatically retrieving and integrating gene expression studies from the InSilico DB repository [https://insilicodb.org/app/].

  10. Transverse emittance and phase space program developed for use at the Fermilab A0 Photoinjector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman-Keup, R.; Johnson, A.S.; Lumpkin, A.H.

    2011-03-01

    The Fermilab A0 Photoinjector is a 16 MeV high intensity, high brightness electron linac developed for advanced accelerator R&D. One of the key parameters for the electron beam is the transverse beam emittance. Here we report on a newly developed MATLAB based GUI program used for transverse emittance measurements using the multi-slit technique. This program combines the image acquisition and post-processing tools for determining the transverse phase space parameters with uncertainties. An integral part of accelerator research is a measurement of the beam phase space. Measurements of the transverse phase space can be accomplished by a variety of methods including multiple screens separated by drift spaces, or by sampling phase space via pepper pots or slits. In any case, the measurement of the phase space parameters, in particular the emittance, can be drastically simplified and sped up by automating the measurement in an intuitive fashion utilizing a graphical interface. At the A0 Photoinjector (A0PI), the control system is DOOCS, which originated at DESY. In addition, there is a library for interfacing to MATLAB, a graphically capable numerical analysis package sold by The Mathworks. It is this graphical package which was chosen as the basis for a graphical phase space measurement system due to its combination of analysis and display capabilities.
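
    The quantity extracted by such a program is, at its core, the statistical (rms) emittance of the reconstructed phase-space distribution. The following Python sketch shows that calculation for sampled position/divergence pairs; it is a textbook formula given for orientation, not the MATLAB GUI code described above.

        import numpy as np

        def rms_emittance(x, xp, weights=None):
            """sqrt(<x^2><x'^2> - <x x'>^2) for centred coordinates x, xp."""
            w = np.ones_like(x) if weights is None else weights
            x = x - np.average(x, weights=w)
            xp = xp - np.average(xp, weights=w)
            x2 = np.average(x ** 2, weights=w)
            xp2 = np.average(xp ** 2, weights=w)
            xxp = np.average(x * xp, weights=w)
            return np.sqrt(x2 * xp2 - xxp ** 2)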

  11. Which community care for patients with schizophrenic disorders? Packages of care provided by Departments of Mental Health in Lombardy (Italy).

    PubMed

    Lora, Antonio; Cosentino, Ugo; Gandini, Anna; Zocchetti, Carlo

    2007-01-01

    The treatment of schizophrenic disorders is the most important challenge for community care. The analysis focuses on packages of care provided to 23,602 patients with an ICD-10 diagnosis of schizophrenic disorder treated in 2001 by the Departments of Mental Health in Lombardy, Italy. Packages of care refer to the mix of treatments provided to each patient during the year in different settings. Direct costs of the packages were calculated. Linear Discriminant Analysis (LDA) was used to link socio-demographic and diagnostic sub-groups of the patients to packages of care. People with schizophrenic disorders received relatively few distinct care packages: only four packages each involved more than 5% of patients. Two thirds of the patients received only care provided by Community Mental Health Centres (CMHCs). In the other two packages with a share over 5%, the activity was provided by CMHCs jointly with General Hospitals or Day Care Facilities. Complex care packages were rare (only 6%). Both the intensity and the variety of care provided by CMHCs increased with the complexity of the care packages. In Lombardy more than half of the resources were spent on schizophrenia. The range of the costs per package was very wide. LDA failed to link characteristics of the patients to packages of care. Care packages are useful tools for understanding better how a mental health system works and how resources have been spent, and for pointing out problems in the quality of care.

  12. The gputools package enables GPU computing in R.

    PubMed

    Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan

    2010-01-01

    By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools. More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu

  13. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist handles the interruption manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, it fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  14. Differential maneuvering simulator data reduction and analysis software

    NASA Technical Reports Server (NTRS)

    Beasley, G. P.; Sigman, R. S.

    1972-01-01

    A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.

  15. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance. PMID:28580909

  16. Time And Temperature Dependent Micromechanical Properties Of Solder Joints For 3D-Package Integration

    NASA Astrophysics Data System (ADS)

    Roellig, Mike; Meier, Karsten; Metasch, Rene

    2010-11-01

    The recent development of 3D-integrated electronic packages is characterized by the need to increase the diversity of functions and to miniaturize. Currently many 3D-integration concepts are being developed and all of them demand new materials, new designs and new processing technologies. The combination of simulation and experimental investigation is becoming increasingly accepted since simulations help to shorten the R&D cycle time and reduce costs. Numerical calculations like the Finite-Element-Method are strong tools to calculate stress conditions in electronic packages resulting from thermal strains due to the manufacturing process and environmental loads. It is essential for the application of numerical calculations that the material data are accurate and sufficiently describe the physical behaviour. The machine developed allows the measurement of time- and temperature-dependent micromechanical properties of solder joints. Solder joints, which are used to mechanically and electrically connect different packages, are physically measured as they leave the process. This allows accounting for process influences, which may change material properties. Additionally, joint sizes and metallurgical interactions between solder and under bump metallization can be taken into account by this particular measurement. The measurement allows the determination of material properties within a temperature range of 20 °C-200 °C. Further, the time dependent creep deformation can be measured within a strain-rate range of 10^-3 1/s to 10^-8 1/s. Solder alloys based on Sn-Ag/Sn-Ag-Cu with additional impurities and joint sizes down to Ø 200 μm were investigated. To complete the material characterization process, the material model coefficients were extracted by FEM simulation to increase the accuracy of the data.
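
    The kind of creep data described above is commonly reduced to a steady-state creep law. As an illustration only, the Python sketch below fits a Norton-type law, strain rate = A * sigma^n * exp(-Q/(R T)), to synthetic measurements in logarithmic form; neither the functional form nor the numbers are taken from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # J/(mol K)

        def log_norton(X, logA, n, Q):
            sigma, T = X               # stress (MPa) and absolute temperature (K)
            return logA + n * np.log(sigma) - Q / (R * T)

        sigma = np.array([10.0, 20.0, 30.0, 20.0, 30.0])
        T = np.array([298.0, 298.0, 298.0, 398.0, 398.0])
        rate = np.array([1e-8, 3e-7, 2e-6, 5e-6, 4e-5])   # hypothetical creep strain rates, 1/s

        (logA, n, Q), _ = curve_fit(log_norton, (sigma, T), np.log(rate), p0=(-20.0, 3.0, 4e4))
        print("A =", np.exp(logA), "n =", n, "Q =", Q)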

  17. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance.

  18. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    NASA Astrophysics Data System (ADS)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil producing companies are concerned with increasing the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature these can be salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large-size matrices (up to hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of the TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, as well as in some inverse tasks such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.

  19. The Role of Packaging in Solid Waste Management 1966 to 1976.

    ERIC Educational Resources Information Center

    Darnay, Arsen; Franklin, William E.

    The goals of waste processors and packagers obviously differ: the packaging industry seeks durable container material that will be unimpaired by external factors. Until recently, no systematic analysis of the relationship between packaging and solid waste disposal had been undertaken. This three-part document defines these interactions, and the…

  20. As-built design specification for segment map (Sgmap) program

    NASA Technical Reports Server (NTRS)

    Tompkins, M. A. (Principal Investigator)

    1981-01-01

    The segment map program (SGMAP), which is part of the CLASFYT package, is described in detail. This program is designed to output symbolic maps or numerical dumps from LANDSAT cluster/classification files or aircraft ground truth/processed ground truth files which are in 'universal' format.

  1. An atmospheric dispersion index for prescribed burning

    Treesearch

    Leonidas G. Lavdas

    1986-01-01

    A numerical index that estimates the atmosphere's capacity to disperse smoke from prescribed burning is described. The physical assumptions and mathematical development of the index are given in detail. A preliminary interpretation of dispersion index values is offered. A FORTRAN subroutine package for computing the index is included.

  2. Cyrface: An interface from Cytoscape to R that provides a user interface to R packages.

    PubMed

    Gonçalves, Emanuel; Mirlach, Franz; Saez-Rodriguez, Julio

    2013-01-01

    There is an increasing number of software packages to analyse biological experimental data in the R environment. In particular, Bioconductor, a repository of curated R packages, is one of the most comprehensive resources for bioinformatics and biostatistics. The use of these packages is increasing, but it requires a basic understanding of the R language, as well as the syntax of the specific package used. The availability of user graphical interfaces for these packages would decrease the learning curve and broaden their application. Here, we present a Cytoscape app termed Cyrface that allows Cytoscape apps to connect to any function and package developed in R. Cyrface can be used to run R packages from within the Cytoscape environment making use of a graphical user interface. Moreover, it can link R packages with the capabilities of Cytoscape and its apps, in particular network visualization and analysis. Cyrface's utility has been demonstrated for two Bioconductor packages ( CellNOptR and DrugVsDisease), and here we further illustrate its usage by implementing a workflow of data analysis and visualization. Download links, installation instructions and user guides can be accessed from the Cyrface's homepage ( http://www.ebi.ac.uk/saezrodriguez/cyrface/) and from the Cytoscape app store ( http://apps.cytoscape.org/apps/cyrface).

  3. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined the Statistical Package for the Social Sciences (SPSS) and the Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  4. 49 CFR 109.9 - Transportation for examination and analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    (a) An agent may direct a package to be transported to a facility for examination and analysis... the package conforms to subchapter C of this chapter; (2) Conflicting information concerning the...

  5. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  6. QmeQ 1.0: An open-source Python package for calculations of transport through quantum dot devices

    NASA Astrophysics Data System (ADS)

    Kiršanskas, Gediminas; Pedersen, Jonas Nyvold; Karlström, Olov; Leijnse, Martin; Wacker, Andreas

    2017-12-01

    QmeQ is an open-source Python package for numerical modeling of transport through quantum dot devices with strong electron-electron interactions using various approximate master equation approaches. The package provides a framework for calculating stationary particle or energy currents driven by differences in chemical potentials or temperatures between the leads which are tunnel coupled to the quantum dots. The electronic structures of the quantum dots are described by their single-particle states and the Coulomb matrix elements between the states. When transport is treated perturbatively to lowest order in the tunneling couplings, the possible approaches are Pauli (classical), first-order Redfield, and first-order von Neumann master equations, and a particular form of the Lindblad equation. When all processes involving two-particle excitations in the leads are of interest, the second-order von Neumann approach can be applied. All these approaches are implemented in QmeQ. We here give an overview of the basic structure of the package, give examples of transport calculations, and outline the range of applicability of the different approximate approaches.
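
    The simplest of the approaches listed above, the Pauli master equation, can be written down in closed form for a single spinless level coupled to two leads. The Python sketch below computes the stationary occupation and current in that textbook limit; it is meant only to illustrate the physics of the lowest-order approach and does not use or reproduce QmeQ's API.

        import numpy as np

        def single_level_current(eps, gammaL, gammaR, muL, muR, kT):
            """Sequential-tunnelling current (units of e) through one level."""
            fermi = lambda mu: 1.0 / (np.exp((eps - mu) / kT) + 1.0)
            fL, fR = fermi(muL), fermi(muR)
            rate_in = gammaL * fL + gammaR * fR
            rate_out = gammaL * (1.0 - fL) + gammaR * (1.0 - fR)
            p1 = rate_in / (rate_in + rate_out)      # stationary occupation of the level
            return gammaL * (fL * (1.0 - p1) - (1.0 - fL) * p1)

        print(single_level_current(eps=0.0, gammaL=0.1, gammaR=0.1, muL=0.5, muR=-0.5, kT=0.1))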

  7. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    ...reported resilience. The Hayes macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of...

  8. Utility of coupling nonlinear optimization methods with numerical modeling software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, M.J.

    1996-08-05

    Results of using GLO (Global Local Optimizer), a general purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and nonlinear optimization software modules, GLOBAL & LOCAL. GLO is designed for controlling and easy coupling to any scientific software application. GLO runs the optimization module and the scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application over and over until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model is presented (Taylor cylinder impact test).
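
    The iterative loop described above (propose parameters, write the input file, run the application, extract the objective) can be sketched in a few lines of Python. Everything here is a placeholder: the file names, the input format, the simulation executable and the target value are invented, and SciPy's Nelder-Mead stands in for the GLOBAL/LOCAL modules.

        import subprocess
        from scipy.optimize import minimize

        def objective(params):
            with open("model.in", "w") as f:                       # GLO-PUT analogue
                f.write(f"yield_stress {params[0]}\nhardening {params[1]}\n")
            subprocess.run(["./simulation", "model.in"], check=True)
            with open("model.out") as f:                           # GLO-GET analogue
                predicted = float(f.read())
            target = 12.7        # e.g. a measured final length from a Taylor cylinder test
            return (predicted - target) ** 2

        result = minimize(objective, x0=[300.0, 0.1], method="Nelder-Mead")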

  9. A practical tool for monitoring the performance of measuring systems in a laboratory network: report of an ACB Working Group.

    PubMed

    Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra

    2017-11-01

    Background: A logical consequence of the introduction of robotics and high-capacity analysers has been a consolidation of laboratories into larger units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods: A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Input of data was made possible by a 'data loader' by which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results: A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented from the analysis of variance components and from the Levey-Jennings and Youden plots. Conclusions: This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to have a continuously updated overview of the performance. This software package has been made available on the ACB website.
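
    For readers unfamiliar with the plots mentioned above, a Levey-Jennings chart is simply the run-ordered IQC results plotted against the assigned mean and SD-based control limits. The Python/matplotlib sketch below draws one for synthetic data; it is an illustration of the chart type, not the spreadsheet program described in the report.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        results = 5.0 + 0.2 * rng.standard_normal(40)   # synthetic daily IQC results
        mean, sd = 5.0, 0.2                             # assigned target and SD

        plt.plot(results, "o-")
        plt.axhline(mean)
        for k in (1, 2, 3):
            plt.axhline(mean + k * sd, linestyle="--")
            plt.axhline(mean - k * sd, linestyle="--")
        plt.xlabel("Run number")
        plt.ylabel("IQC result")
        plt.show()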

  10. TTLEM: Open access tool for building numerically accurate landscape evolution models in MATLAB

    NASA Astrophysics Data System (ADS)

    Campforts, Benjamin; Schwanghart, Wolfgang; Govers, Gerard

    2017-04-01

    Despite a growing interest in LEMs, the accuracy assessment of the numerical methods they are based on has received little attention. Here, we present TTLEM, an open access landscape evolution package designed for developing and testing scenarios and hypotheses. TTLEM uses a higher order flux-limiting finite-volume method to simulate river incision and tectonic displacement. We show that this scheme significantly influences the evolution of simulated landscapes and the spatial and temporal variability of erosion rates. Moreover, it allows the simulation of lateral tectonic displacement on a fixed grid. Through the use of a simple GUI the software produces visual output of the evolving landscape during the model run. In this contribution, we illustrate numerical landscape evolution through a set of movies spanning different spatial and temporal scales. We focus on the erosional domain and use both spatially constant and variable input values for uplift, lateral tectonic shortening, erodibility and precipitation. Moreover, we illustrate the relevance of a stochastic approach for realistic hillslope response modelling. TTLEM is a fully open source software package, written in MATLAB and based on the TopoToolbox platform (topotoolbox.wordpress.com). Installation instructions can be found on this website and in the dedicated GitHub repository.
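
    A common formulation of the river-incision component in such models is the stream-power law, dz/dt = U - K A^m S^n. The Python sketch below applies a simple explicit update of that law along a 1D profile; it is a toy illustration of the governing equation only, and deliberately not the higher-order flux-limiting finite-volume scheme that TTLEM actually uses.

        import numpy as np

        def incise_profile(z, area, dx, dt, U=1e-3, K=1e-5, m=0.5, n=1.0):
            """One explicit step of dz/dt = U - K A^m S^n; the last node is fixed base level."""
            S = np.maximum(-np.diff(z) / dx, 0.0)     # downstream slope, node i to i+1
            dz = np.zeros_like(z)
            dz[:-1] = (U - K * area[:-1] ** m * S ** n) * dt
            return z + dz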

  11. Technical Review Report for the Model 9978-96 Package Safety Analysis Report for Packaging (S-SARP-G-00002, Revision 1, March 2009)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, M

    2009-03-06

    This Technical Review Report (TRR) documents the review, performed by Lawrence Livermore National Laboratory (LLNL) Staff, at the request of the Department of Energy (DOE), on the 'Safety Analysis Report for Packaging (SARP), Model 9978 B(M)F-96', Revision 1, March 2009 (S-SARP-G-00002). The Model 9978 Package complies with 10 CFR 71, and with 'Regulations for the Safe Transport of Radioactive Material-1996 Edition (As Amended, 2000)-Safety Requirements', International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1. The Model 9978 Packaging is designed, analyzed, fabricated, and tested in accordance with Section III of the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME B&PVC). The review presented in this TRR was performed using the methods outlined in Revision 3 of the DOE's 'Packaging Review Guide (PRG) for Reviewing Safety Analysis Reports for Packages'. The format of the SARP follows that specified in Revision 2 of the Nuclear Regulatory Commission's Regulatory Guide 7.9, i.e., 'Standard Format and Content of Part 71 Applications for Approval of Packages for Radioactive Material'. Although the two documents are similar in their content, they are not identical. Formatting differences have been noted in this TRR, where appropriate. The Model 9978 Packaging is a single containment package, using a 5-inch containment vessel (5CV). It uses a nominal 35-gallon drum package design. In comparison, the Model 9977 Packaging uses a 6-inch containment vessel (6CV). The Model 9977 and Model 9978 Packagings were developed concurrently, and they were referred to as the General Purpose Fissile Material Package, Version 1 (GPFP). Both packagings use General Plastics FR-3716 polyurethane foam as insulation and as impact limiters. The 5CV is used as the Primary Containment Vessel (PCV) in the Model 9975-96 Packaging. The Model 9975-96 Packaging also has the 6CV as its Secondary Containment Vessel (SCV). In comparison, the Model 9975 Packagings use Celotex™ for insulation and as impact limiters. To provide a historical perspective, it is noted that the Model 9975-96 Packaging is a 35-gallon drum package design that has evolved from a family of packages designed by DOE contractors at the Savannah River Site. Earlier package designs, i.e., the Model 9965, the Model 9966, the Model 9967, and the Model 9968 Packagings, were originally designed and certified in the early 1980s. In the 1990s, updated package designs that incorporated design features consistent with the then-newer safety requirements were proposed. The updated package designs at the time were the Model 9972, the Model 9973, the Model 9974, and the Model 9975 Packagings, respectively. The Model 9975 Package was certified by the Packaging Certification Program, under the Office of Safety Management and Operations. The Model 9978 Package has six Content Envelopes: C.1 (²³⁸Pu Heat Sources), C.2 (Pu/U Metals), C.3 (Pu/U Oxides, Reserved), C.4 (U Metal or Alloy), C.5 (U Compounds), and C.6 (Samples and Sources). Per 10 CFR 71.59 (Code of Federal Regulations), the value of N is 50 for the Model 9978 Package, leading to a Criticality Safety Index (CSI) of 1.0. The Transport Index (TI), based on dose rate, is calculated to be a maximum of 4.1.

  12. Rule-based optimization and multicriteria decision support for packaging a truck chassis

    NASA Astrophysics Data System (ADS)

    Berger, Martin; Lindroth, Peter; Welke, Richard

    2017-06-01

    Trucks are highly individualized products in which exchangeable parts are flexibly combined to suit different customer requirements, leading to great complexity in product development. Therefore, an optimization approach based on constraint programming is proposed for automatically packaging parts of a truck chassis by following packaging rules expressed as constraints. A multicriteria decision support system is developed in which a database of truck layouts is computed and can then be navigated interactively. The work has been performed in cooperation with Volvo Group Trucks Technology (GTT), from which specific rules have been used. Several scenarios are described where the methods developed can be successfully applied and lead to less time-consuming manual work, fewer mistakes, and greater flexibility in configuring trucks. A numerical evaluation is also presented showing the efficiency and practical relevance of the methods, which are implemented in a software tool.
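
    To convey the flavour of rule-based packaging, the toy Python sketch below enumerates feasible positions for two components subject to a few made-up packaging rules. The slots and rules are invented for illustration; the actual system uses a constraint-programming solver over Volvo GTT's rule base rather than brute-force enumeration.

        from itertools import product

        slots = range(6)  # candidate mounting positions along the frame rail

        def feasible(tank, battery_box):
            return (tank != battery_box                # no overlap
                    and abs(tank - battery_box) >= 2   # clearance rule
                    and tank >= 3)                     # fuel tank behind the cab

        layouts = [(t, b) for t, b in product(slots, slots) if feasible(t, b)]
        print(len(layouts), "feasible layouts, e.g.", layouts[0])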

  13. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing for the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
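
    The memory-mapping strategy described above can be illustrated with plain numpy: the matrix lives in a file on disk and only the blocks being accessed are pulled into RAM. This is a generic sketch of the principle, not the packages' actual C++ backend or file format.

        import numpy as np

        n, m = 10_000, 100_000
        G = np.memmap("genotypes.bin", dtype=np.int8, mode="w+", shape=(n, m))

        # Column-wise allele frequencies computed block by block,
        # without ever loading the full matrix into memory.
        block = 1_000
        freqs = np.concatenate([G[:, j:j + block].mean(axis=0) / 2.0
                                for j in range(0, m, block)])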

  14. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing.

    PubMed

    Zackay, Arie; Steinhoff, Christine

    2010-12-15

    Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them, and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only package specific to DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.

  15. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing

    PubMed Central

    2010-01-01

    Background Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. Findings MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them, and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only package specific to DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174

  16. Estimating soil hydraulic parameters from transient flow experiments in a centrifuge using parameter optimization technique

    USGS Publications Warehouse

    Šimůnek, Jirka; Nimmo, John R.

    2005-01-01

    A modified version of the Hydrus software package that can directly or inversely simulate water flow in a transient centrifugal field is presented. The inverse solver for parameter estimation of the soil hydraulic parameters is then applied to multirotation transient flow experiments in a centrifuge. Using time‐variable water contents measured at a sequence of several rotation speeds, soil hydraulic properties were successfully estimated by numerical inversion of transient experiments. The inverse method was then evaluated by comparing estimated soil hydraulic properties with those determined independently using an equilibrium analysis. The optimized soil hydraulic properties compared well with those determined using equilibrium analysis and steady state experiment. Multirotation experiments in a centrifuge not only offer significant time savings by accelerating time but also provide significantly more information for the parameter estimation procedure compared to multistep outflow experiments in a gravitational field.
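
    The inverse-estimation idea can be illustrated on a much simpler sub-problem: fitting van Genuchten retention parameters to water contents observed at a few suctions. The Python sketch below does this with a least-squares fit; the retention function and the data are generic placeholders, whereas Hydrus inverts the full transient flow problem in the centrifugal field.

        import numpy as np
        from scipy.optimize import curve_fit

        def van_genuchten(h, theta_r, theta_s, alpha, n):
            return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h)) ** n) ** (1.0 - 1.0 / n)

        h = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])    # suction (cm), synthetic
        theta = np.array([0.40, 0.35, 0.25, 0.17, 0.12])    # observed water contents, synthetic

        popt, _ = curve_fit(van_genuchten, h, theta, p0=(0.05, 0.42, 0.02, 1.5))
        print("theta_r, theta_s, alpha, n =", popt)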

  17. MEGNO-analysis of light pressure influence on orbital evolution of object in GEO. (Russian Title: MEGNO-анализ влияния светового давления на орбитальную эволюцию объектов зоны ГЕО)

    NASA Astrophysics Data System (ADS)

    Aleksandrova, A. G.; Bordovitsyna, T. V.; Chuvashov, I. N.

    2011-07-01

    In the present work, results of investigations of the effect of radiation pressure on the orbital evolution of objects in the GEO region are presented. A MEGNO analysis of orbits in the GEO region has been performed for different values of the sail parameter (area-to-mass ratio). The averaged MEGNO parameter has been used as the main indicator of chaotic or stable motion. The results have been obtained using the software package "Numerical model of the motion of artificial satellite systems", implemented on the "Skiff Cyberia" cluster.

  18. Calculation of coal resources using ARC/INFO and Earth Vision; methodology for the National Coal Resource Assessment

    USGS Publications Warehouse

    Roberts, L.N.; Biewick, L.R.

    1999-01-01

    This report documents a comparison of two methods of resource calculation that are being used in the National Coal Resource Assessment project of the U.S. Geological Survey (USGS). Tewalt (1998) discusses the history of using computer software packages such as GARNET (Graphic Analysis of Resources using Numerical Evaluation Techniques), GRASS (Geographic Resource Analysis Support System), and the vector-based geographic information system (GIS) ARC/INFO (ESRI, 1998) to calculate coal resources within the USGS. The study discussed here compares resource calculations using ARC/INFO* (ESRI, 1998) and EarthVision (EV)* (Dynamic Graphics, Inc., 1997) for the coal-bearing John Henry Member of the Straight Cliffs Formation of Late Cretaceous age in the Kaiparowits Plateau of southern Utah. Coal resource estimates in the Kaiparowits Plateau using ARC/INFO are reported in Hettinger and others (1996).

  19. Study of the penetration of a plate made of titanium alloy VT6 with a steel ball

    NASA Astrophysics Data System (ADS)

    Buzyurkin, A. E.

    2018-03-01

    The purpose of this work is the development and verification of mathematical relationships, adapted to the LS-DYNA finite element analysis package, that describe the deformation and failure of a titanium plate under high-speed impact. Using data from experiments on the interaction of a steel ball with a titanium plate made of VT6 alloy, the available constants necessary for describing the behavior of the material with the Johnson-Cook relationships were verified, as were the parameters of the fracture model used in the numerical modeling of the collision process. An analysis of experimental data on the interaction of a spherical impactor with a plate showed that the constants accepted for VT6 alloy, in a first approximation for the deformation hardening in the Johnson-Cook model, overestimate the residual velocities of the impactor when the plate is perforated.
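
    For orientation, the Johnson-Cook flow stress referred to above has the form sigma = (A + B eps^n)(1 + C ln(eps_dot/eps0))(1 - T*^m). The Python sketch below evaluates it with constants that are frequently quoted in the literature for Ti-6Al-4V; they are shown only as an example and are not the verified VT6 constants obtained in this study.

        import numpy as np

        def johnson_cook_stress(eps_p, eps_rate, T, A=1098.0, B=1092.0, n=0.93,
                                C=0.014, m=1.1, eps0=1.0, T_room=293.0, T_melt=1878.0):
            """Johnson-Cook flow stress in MPa (illustrative constants, not this study's)."""
            T_star = (T - T_room) / (T_melt - T_room)
            return (A + B * eps_p ** n) * (1.0 + C * np.log(eps_rate / eps0)) * (1.0 - T_star ** m)

        print(johnson_cook_stress(eps_p=0.1, eps_rate=1e3, T=600.0))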

  20. Automotive Exterior Noise Optimization Using Grey Relational Analysis Coupled with Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Shuming; Wang, Dengfeng; Liu, Bo

    This paper investigates the optimization design of the thickness of the sound package applied to a passenger automobile. The major performance indexes selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package, and the corresponding parameters of the sound package are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. Because the process fundamentally involves multiple performance characteristics, grey relational analysis, which uses the grey relational grade as the performance index, is employed to determine the optimal combination of layer thicknesses for the designed sound package. Additionally, in order to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to represent their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of thicknesses for each layer of the sound package material. Therefore, the presented method can be an effective tool to improve vehicle exterior noise and lower the weight of the sound package. It will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China, Changan Automobile in China, etc.
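
    As a rough illustration of the method described above, the following Python sketch computes grey relational coefficients and a weighted grey relational grade for a small design matrix; the function name, the toy response values, and the equal default weighting (which PCA-derived weights would replace) are assumptions for illustration only.

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5, weights=None):
    """Grey relational grade for a matrix of experiments x responses.

    responses        : (n_runs, n_responses) array
    larger_is_better : bool per response (True -> maximize, False -> minimize)
    zeta             : distinguishing coefficient, conventionally 0.5
    weights          : per-response weights (e.g. from PCA); equal if None
    """
    x = np.asarray(responses, dtype=float)
    # Step 1: normalize each response to [0, 1] (grey relational generating).
    norm = np.empty_like(x)
    for j, larger in enumerate(larger_is_better):
        lo, hi = x[:, j].min(), x[:, j].max()
        norm[:, j] = (x[:, j] - lo) / (hi - lo) if larger else (hi - x[:, j]) / (hi - lo)
    # Step 2: deviation from the ideal (all-ones) reference sequence.
    delta = np.abs(1.0 - norm)
    # Step 3: grey relational coefficients.
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    # Step 4: weighted average -> grey relational grade per run.
    if weights is None:
        weights = np.full(x.shape[1], 1.0 / x.shape[1])
    return coeff @ np.asarray(weights)

# Toy usage: 4 runs, responses = [exterior SPL (minimize), package mass (minimize)]
runs = np.array([[68.0, 4.1], [66.5, 4.6], [67.2, 3.9], [65.9, 5.0]])
print(grey_relational_grade(runs, larger_is_better=[False, False]))
```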

  1. OPTIMASS: a package for the minimization of kinematic mass functions with constraints

    NASA Astrophysics Data System (ADS)

    Cho, Won Sang; Gainer, James S.; Kim, Doojin; Lim, Sung Hak; Matchev, Konstantin T.; Moortgat, Filip; Pape, Luc; Park, Myeonghun

    2016-01-01

    Reconstructed mass variables, such as M2, M2C, MT*, and MT2W, play an essential role in searches for new physics at hadron colliders. The calculation of these variables generally involves constrained minimization in a large parameter space, which is numerically challenging. We provide a C++ code, Optimass, which interfaces with the Minuit library to perform this constrained minimization using the Augmented Lagrangian Method. The code can be applied to arbitrarily general event topologies, thus allowing the user to significantly extend the existing set of kinematic variables. We describe this code, explain its physics motivation, and demonstrate its use in the analysis of the fully leptonic decay of pair-produced top quarks using M2 variables.
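
    The following is a minimal Python sketch of the Augmented Lagrangian Method itself (not the Optimass/Minuit implementation): the constrained problem is replaced by a sequence of unconstrained minimizations of the augmented Lagrangian, with multiplier updates between them. The helper name and the toy problem are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, c, x0, mu=10.0, n_outer=20, tol=1e-8):
    """Minimize f(x) subject to c(x) = 0 (vector-valued) via the
    Augmented Lagrangian Method: repeatedly minimize
        L_A(x) = f(x) + lam.c(x) + (mu/2)*||c(x)||^2
    and update the multipliers lam <- lam + mu*c(x)."""
    x = np.asarray(x0, dtype=float)
    lam = np.zeros_like(np.atleast_1d(c(x)))
    for _ in range(n_outer):
        def L_A(x):
            cv = np.atleast_1d(c(x))
            return f(x) + lam @ cv + 0.5 * mu * cv @ cv
        x = minimize(L_A, x, method="BFGS").x
        cv = np.atleast_1d(c(x))
        if np.linalg.norm(cv) < tol:
            break
        lam = lam + mu * cv
    return x

# Toy usage: minimize (x-2)^2 + (y-1)^2 subject to x + y = 1 (solution: (1.0, 0.0))
f = lambda v: (v[0] - 2.0)**2 + (v[1] - 1.0)**2
c = lambda v: np.array([v[0] + v[1] - 1.0])
print(augmented_lagrangian(f, c, x0=[0.0, 0.0]))
```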

  2. Parallel 3D Finite Element Numerical Modelling of DC Electron Guns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prudencio, E.; Candel, A.; Ge, L.

    2008-02-04

    In this paper we present Gun3P, a parallel 3D finite element application that the Advanced Computations Department at the Stanford Linear Accelerator Center is developing for the analysis of beam formation in DC guns and beam transport in klystrons. Gun3P is targeted specifically at complex geometries that cannot be described by 2D models and cannot be easily handled by finite difference discretizations. Its parallel capability allows simulations with more accuracy and less processing time than packages currently available. We present simulation results for the L-band Sheet Beam Klystron DC gun, in which case Gun3P is able to reduce simulation time from days to some hours.

  3. Pharmacovigilance in Space: Stability Payload Compliance Procedures

    NASA Technical Reports Server (NTRS)

    Daniels, Vernie R.; Putcha, Lakshmi

    2007-01-01

    Pharmacovigilance is the science of, and activities relating to, the detection, assessment, understanding, and prevention of drug-related problems. Over the last decade, pharmacovigilance activities have contributed to the development of numerous technological and conventional advances focused on medication safety and regulatory intervention. The topics discussed include: 1) Proactive Pharmacovigilance; 2) A New Frontier; 3) Research Activities; 4) Project Purpose; 5) Methods; 6) Flight Stability Kit Components; 7) Experimental Conditions; 8) Research Project Logistics; 9) Research Plan; 10) Pharmaceutical Stability Research Project Pharmacovigilance Aspects; 11) Security / Control; 12) Packaging/Containment Actions; 13) Shelf-Life Assessments; 14) Stability Assessment Parameters; 15) Chemical Content Analysis; 16) Preliminary Results; 17) Temperature/Humidity; 18) Changes in Physical and Chemical Assessment Parameters; 19) Observations; and 20) Conclusions.

  4. pyCTQW: A continuous-time quantum walk simulator on distributed memory computers

    NASA Astrophysics Data System (ADS)

    Izaac, Josh A.; Wang, Jingbo B.

    2015-01-01

    In the general field of quantum information and computation, quantum walks are playing an increasingly important role in constructing physical models and quantum algorithms. We have recently developed a distributed memory software package pyCTQW, with an object-oriented Python interface, that allows efficient simulation of large multi-particle CTQW (continuous-time quantum walk)-based systems. In this paper, we present an introduction to the Python and Fortran interfaces of pyCTQW, discuss various numerical methods of calculating the matrix exponential, and demonstrate the performance behavior of pyCTQW on a distributed memory cluster. In particular, the Chebyshev and Krylov-subspace methods for calculating the quantum walk propagation are provided, as well as methods for visualization and data analysis.
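
    Independently of the pyCTQW interfaces, the core computation the abstract refers to is the action of the matrix exponential on a state vector, |psi(t)> = exp(-iHt)|psi(0)>. A minimal SciPy sketch of that propagation on a cycle graph is shown below (this is a generic illustration using scipy.sparse.linalg.expm_multiply, not the pyCTQW API).

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import expm_multiply

# CTQW on a cycle of N sites: H is the adjacency matrix (nearest-neighbour hops)
# and the walker evolves as |psi(t)> = exp(-i H t) |psi(0)>.
N = 64
ones = np.ones(N - 1)
H = diags([ones, ones], [-1, 1], format="lil").astype(complex)
H[0, N - 1] = H[N - 1, 0] = 1.0           # periodic boundary closes the cycle
H = H.tocsr()

psi0 = np.zeros(N, dtype=complex)
psi0[0] = 1.0                              # walker starts on site 0

t = 5.0
psi_t = expm_multiply(-1j * t * H, psi0)   # Krylov-style action of the matrix exponential

prob = np.abs(psi_t) ** 2                  # position distribution at time t
print(prob.sum())                          # unitarity check: should be ~1.0
```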

  5. The numerical study of the coextrusion process of polymer melts in the cable head

    NASA Astrophysics Data System (ADS)

    Kozitsyna, M. V.; Trufanova, N. M.

    2017-06-01

    The process of coextrusion consists of the simultaneous creation of all necessary insulating layers of different polymers in the channel of a special forming tool. The main focus of this study is the analysis of the influence of technological, geometrical, and rheological characteristics on the layer thicknesses. Three cable-head geometries are considered in both three-dimensional and two-dimensional representations. The mathematical models of separate and joint flow of polymer melts have been implemented by the finite element method in the Ansys software package. The velocity, temperature, and pressure fields in the cross-sections of the channel and along its length have been obtained. The influence of these characteristics on the thickness of the insulation layers has been identified.

  6. Machining Chatter Analysis for High Speed Milling Operations

    NASA Astrophysics Data System (ADS)

    Sekar, M.; Kantharaj, I.; Amit Siddhappa, Savale

    2017-10-01

    Chatter in high speed milling is characterized by time delay differential equations (DDE). Since closed-form solutions exist only for simple cases, the governing non-linear DDEs of chatter problems are solved by various numerical methods. Custom codes to solve DDEs are tedious to build and implement, and are rarely error-free and robust. On the other hand, software packages provide solutions to DDEs, but they are not straightforward to apply. In this paper an easy way to solve the DDE of chatter in milling is proposed and implemented with MATLAB. A time-domain solution permits the study and modeling of the non-linear effects of chatter vibration with ease. Time domain results are presented for various stable and unstable conditions of cut and compared with stability lobe diagrams.
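
    The paper's implementation is in MATLAB; as a language-agnostic illustration of the time-domain idea, the sketch below integrates a generic delay differential equation x'(t) = f(t, x(t), x(t - tau)) with a fixed-step Euler scheme and an explicit history buffer. The toy delayed oscillator and its constants are assumptions, not the authors' milling model.

```python
import numpy as np

def integrate_dde(f, history, tau, t_end, dt=1e-4):
    """Fixed-step explicit Euler integration of x'(t) = f(t, x(t), x(t - tau)).

    history : callable returning the (vector) state for t <= 0
    Returns the time grid and the trajectory, shape (n_steps + 1, dim)."""
    n_delay = int(round(tau / dt))
    n_steps = int(round(t_end / dt))
    dim = np.atleast_1d(history(0.0)).size
    # One buffer holds the prescribed history followed by the computed solution,
    # so the delayed state x(t - tau) is always a simple index lookup.
    padded = np.zeros((n_delay + n_steps + 1, dim))
    for k in range(n_delay + 1):
        padded[k] = np.atleast_1d(history((k - n_delay) * dt))
    for i in range(n_steps):
        t = i * dt
        current, delayed = padded[n_delay + i], padded[i]
        padded[n_delay + i + 1] = current + dt * np.asarray(f(t, current, delayed))
    return np.linspace(0.0, t_end, n_steps + 1), padded[n_delay:]

# Toy delayed oscillator x'' + 2*zeta*wn*x' + wn**2*x = k*(x(t - tau) - x(t)),
# written as a first-order system; all constants are illustrative only.
wn, zeta, k, tau = 50.0, 0.02, 400.0, 0.05
f = lambda t, y, yd: np.array([y[1], k * (yd[0] - y[0]) - 2*zeta*wn*y[1] - wn**2*y[0]])
times, traj = integrate_dde(f, history=lambda t: np.array([1e-3, 0.0]), tau=tau, t_end=2.0)
print(traj[-1])
```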

  7. A MODFLOW Infiltration Device Package for Simulating Storm Water Infiltration.

    PubMed

    Jeppesen, Jan; Christensen, Steen

    2015-01-01

    This article describes a MODFLOW Infiltration Device (INFD) Package that can simulate infiltration devices and their two-way interaction with groundwater. The INFD Package relies on a water balance including inflow of storm water, leakage-like seepage through the device faces, overflow, and change in storage. The water balance for the device can be simulated in multiple INFD time steps within a single MODFLOW time step, and infiltration from the device can be routed through the unsaturated zone to the groundwater table. A benchmark test shows that the INFD Package's analytical solution for stage computes exact results for transient behavior. To achieve similar accuracy by the numerical solution of the MODFLOW Surface-Water Routing (SWR1) Process requires many small time steps. Furthermore, the INFD Package includes an improved representation of flow through the INFD sides that results in lower infiltration rates than simulated by SWR1. The INFD Package is also demonstrated in a transient simulation of a hypothetical catchment where two devices interact differently with groundwater. This simulation demonstrates that device and groundwater interaction depends on the thickness of the unsaturated zone because a shallow groundwater table (a likely result from storm water infiltration itself) may occupy retention volume, whereas a thick unsaturated zone may cause a phase shift and a change of amplitude in groundwater table response to a change of infiltration. We thus find that the INFD Package accommodates the simulation of infiltration devices and groundwater in an integrated manner on small as well as large spatial and temporal scales. © 2014, National Ground Water Association.
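
    A highly simplified sketch of the kind of water balance described above (inflow, stage-proportional leakage, overflow above a maximum stage, and storage change, advanced in sub-steps) is given below in Python; it is not the INFD source code, and all device parameters are hypothetical.

```python
def infd_step(stage, inflow, dt, area=10.0, c_leak=2e-4, stage_max=1.5):
    """One explicit sub-step of a simple infiltration-device water balance.

    stage     : current water level in the device [m]
    inflow    : storm-water inflow [m3/s]
    area      : device plan area [m2]            (illustrative value)
    c_leak    : leakage conductance [m2/s]       (illustrative value)
    stage_max : overflow level [m]
    Returns (new_stage, leakage_volume, overflow_volume)."""
    leakage = c_leak * stage                      # leakage-like seepage, ~ proportional to stage
    stage_new = stage + dt * (inflow - leakage) / area
    overflow = 0.0
    if stage_new > stage_max:                     # excess volume spills as overflow
        overflow = (stage_new - stage_max) * area
        stage_new = stage_max
    return stage_new, leakage * dt, overflow

# Route a 2-hour storm hydrograph through the device in 60 s sub-steps.
stage = 0.0
for minute in range(120):
    q_in = 0.02 if 10 <= minute < 40 else 0.0     # hypothetical inflow pulse [m3/s]
    stage, _, _ = infd_step(stage, q_in, dt=60.0)
print(f"stage after storm: {stage:.3f} m")
```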

  8. A New Image Processing and GIS Package

    NASA Technical Reports Server (NTRS)

    Rickman, D.; Luvall, J. C.; Cheng, T.

    1998-01-01

    The image processing and GIS package ELAS was developed during the 1980's by NASA. It proved to be a popular, influential, and powerful package for the manipulation of digital imagery. Before the advent of PCs it was used by hundreds of institutions, mostly schools. It is the unquestioned, direct progenitor of two commercial GIS remote sensing packages, ERDAS and MapX, and influenced others, such as PCI. Its power was demonstrated by its use for work far beyond its original purpose, having been applied to several different types of medical imagery, photomicrographs of rock, images of turtle flippers, and numerous other esoteric imagery. Although development largely stopped in the early 1990's, the package still offers as much or more power and flexibility than any other roughly comparable package, public or commercial. It is a huge body of code, representing more than a decade of work by full-time, professional programmers. The current versions all have several deficiencies compared to current software standards and usage, notably the strictly command line interface. In order to support their research needs, the authors are in the process of fundamentally changing ELAS, and in the process greatly increasing its power, utility, and ease of use. The new software is called ELAS II. This paper discusses the design of ELAS II.

  9. Influence of the stretch wrapping process on the mechanical behavior of a stretch film

    NASA Astrophysics Data System (ADS)

    Klein, Daniel; Stommel, Markus; Zimmer, Johannes

    2018-05-01

    Lightweight construction is an ongoing task in packaging development. Consequently, the stability of packages during transport is gaining importance. This study contributes to the optimization of lightweight packaging concepts regarding their stability. A very widespread packaging concept is the distribution of goods on a pallet, where a polyethylene (PE) stretch film stabilizes the lightweight structure during shipment. Usually, a stretch wrapping machine applies this stretch film to the pallet. The objective of this study is to support packaging development with a method that predicts the result of the wrapping process, based on the mechanical characterization of the stretch film. This result is defined not only by the amount of stretch film, its spatial distribution on the pallet, and its internal stresses that result in a containment force; this contribution also considers the influence of the deformation history of the stretch film during the wrapping process. By focusing on similarities of stretch wrappers rather than on differences, the influence of generalized process parameters on stretch film mechanics, and thereby on pallet stability, can be determined experimentally. For practical use, the predictive method is condensed into an analytic model of the wrapping process that can be verified experimentally. This paves the way for experimental and numerical approaches to the optimization of pallet stability.

  10. InterFace: A software package for face image warping, averaging, and principal components analysis.

    PubMed

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
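
    The PCA "face space" mentioned above can be illustrated with a few lines of NumPy: flatten the aligned images, centre them on the average face, and take the leading singular vectors as components. This generic sketch is not the InterFace implementation, and the array shapes and names are assumptions.

```python
import numpy as np

def face_pca(images, n_components=20):
    """PCA 'face space' from aligned face images.

    images : (n_faces, height, width) array of shape-normalized grayscale faces
    Returns (mean_face, components, coordinates)."""
    n, h, w = images.shape
    X = images.reshape(n, h * w).astype(float)
    mean_face = X.mean(axis=0)
    Xc = X - mean_face                       # centre on the average face
    # Thin SVD: rows of Vt are the eigenfaces, ordered by explained variance.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    coords = Xc @ components.T               # each face's position in face space
    return mean_face.reshape(h, w), components.reshape(-1, h, w), coords

# Toy usage with random "images"; real use assumes warped/aligned photographs.
rng = np.random.default_rng(0)
faces = rng.random((50, 64, 48))
mean_face, eigenfaces, coords = face_pca(faces, n_components=10)
print(eigenfaces.shape, coords.shape)        # (10, 64, 48) (50, 10)
```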

  11. Selection of software for mechanical engineering undergraduates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheah, C. T.; Yin, C. S.; Halim, T.

    A major problem with the undergraduate mechanical course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  12. Safety analysis report for packaging, Oak Ridge Y-12 Plant, model DC-1 package with HEU oxide contents. Change pages for Rev.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This Safety Analysis Report for Packaging for the Oak Ridge Y-12 Plant for the Model DC-1 package with highly enriched uranium (HEU) oxide contents has been prepared in accordance with governing regulations from the Nuclear Regulatory Commission and the Department of Transportation and orders from the Department of Energy. The fundamental safety requirements addressed by these regulations and orders pertain to the containment of radioactive material, radiation shielding, and nuclear subcriticality. This report demonstrates how these requirements are met.

  13. Permeability of model porous medium formed by random discs

    NASA Astrophysics Data System (ADS)

    Gubaidullin, A. A.; Gubkin, A. S.; Igoshin, D. E.; Ignatev, P. A.

    2018-03-01

    A two-dimensional model of a porous medium with a skeleton of randomly located overlapping discs is proposed. The geometry and computational grid are built in the open package Salome. The flow of a Newtonian liquid in the longitudinal and transverse directions is calculated and its flow rate is determined. The numerical solution of the Navier-Stokes equations for a given pressure drop at the boundaries of the domain is carried out in the open package OpenFOAM. The calculated flow rate is used to determine the permeability coefficient on the basis of Darcy's law. To evaluate the representativeness of the computational domain, the permeability coefficients in the longitudinal and transverse directions are compared.
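
    The final step described above, recovering the permeability from the computed flow rate via Darcy's law, reduces to a one-line formula; a small Python helper is shown below with hypothetical input values of the kind a CFD run would supply.

```python
def darcy_permeability(q_volumetric, mu, length, area, dp):
    """Permeability k [m^2] from Darcy's law, k = Q * mu * L / (A * dP).

    q_volumetric : volumetric flow rate through the sample [m^3/s]
    mu           : dynamic viscosity [Pa s]
    length       : sample length along the flow direction [m]
    area         : cross-sectional area normal to the flow [m^2]
    dp           : pressure drop across the sample [Pa]
    """
    return q_volumetric * mu * length / (area * dp)

# Hypothetical numbers of the kind an OpenFOAM run would provide:
print(darcy_permeability(q_volumetric=2.0e-9, mu=1.0e-3,
                         length=1.0e-3, area=1.0e-6, dp=10.0))  # ~2e-10 m^2
```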

  14. Image analysis to evaluate the browning degree of banana (Musa spp.) peel.

    PubMed

    Cho, Jeong-Seok; Lee, Hyeon-Jeong; Park, Jung-Hoon; Sung, Jun-Hyung; Choi, Ji-Young; Moon, Kwang-Deog

    2016-03-01

    Image analysis was applied to examine banana peel browning. The banana samples were divided into 3 treatment groups: no treatment and normal packaging (Cont); CO2 gas exchange packaging (CO); normal packaging with an ethylene generator (ET). We confirmed that the browning of banana peels developed more quickly in the CO group than in the other groups based on a sensory test and enzyme assay. The G (green) and CIE L*, a*, and b* values obtained from the image analysis sharply increased or decreased in the CO group. These colour values showed high correlation coefficients (>0.9) with the sensory test results. CIE L*a*b* values measured with a colorimeter also showed high correlation coefficients, but comparatively lower than those of image analysis. Based on this analysis, browning of the banana occurred more quickly under CO2 gas exchange packaging, and image analysis can be used to evaluate the browning of banana peels. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    PubMed

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R and integrating results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analysis can be saved, shared and reproduced. PIVOT will allow researchers with broad background to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  16. Geometry + Technology = Proof

    ERIC Educational Resources Information Center

    Lyublinskaya, Irina; Funsch, Dan

    2012-01-01

    Several interactive geometry software packages are available today to secondary school teachers. An example is The Geometer's Sketchpad[R] (GSP), also known as Dynamic Geometry[R] software, developed by Key Curriculum Press. This numeric based technology has been widely adopted in the last twenty years, and a vast amount of creativity has been…

  17. A MATLAB-Aided Method for Teaching Calculus-Based Business Mathematics

    ERIC Educational Resources Information Center

    Liang, Jiajuan; Pan, William S. Y.

    2009-01-01

    MATLAB is a powerful package for numerical computation. MATLAB contains a rich pool of mathematical functions and provides flexible plotting functions for illustrating mathematical solutions. The course of calculus-based business mathematics consists of two major topics: 1) derivative and its applications in business; and 2) integration and its…

  18. Extend Instruction outside the Classroom: Take Advantage of Your Learning Management System

    ERIC Educational Resources Information Center

    Jensen, Lauren A.

    2010-01-01

    Numerous institutions of higher education have implemented a learning management system (LMS) or are considering doing so. This web-based software package provides self-service and quick (often personalized) access to content in a dynamic environment. Learning management systems support administrative, reporting, and documentation activities. LMSs…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and chi-squared independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.

  20. iGC-an integrated analysis package of gene expression and copy number alteration.

    PubMed

    Lai, Yi-Pin; Wang, Liang-Bo; Wang, Wei-An; Lai, Liang-Chuan; Tsai, Mong-Hsun; Lu, Tzu-Pin; Chuang, Eric Y

    2017-01-14

    With the advancement in high-throughput technologies, researchers can simultaneously investigate gene expression and copy number alteration (CNA) data from individual patients at a lower cost. Traditional analysis methods analyze each type of data individually and integrate their results using Venn diagrams. Challenges arise, however, when the results are irreproducible and inconsistent across multiple platforms. To address these issues, one possible approach is to concurrently analyze both gene expression profiling and CNAs in the same individual. We have developed an open-source R/Bioconductor package (iGC). Multiple input formats are supported and users can define their own criteria for identifying differentially expressed genes driven by CNAs. The analysis of two real microarray datasets demonstrated that the CNA-driven genes identified by the iGC package showed significantly higher Pearson correlation coefficients with their gene expression levels and copy numbers than those genes located in a genomic region with CNA. Compared with the Venn diagram approach, the iGC package showed better performance. The iGC package is effective and useful for identifying CNA-driven genes. By simultaneously considering both comparative genomic and transcriptomic data, it can provide better understanding of biological and medical questions. The iGC package's source code and manual are freely available at https://www.bioconductor.org/packages/release/bioc/html/iGC.html .

  1. Variations in algorithm implementation among quantitative texture analysis software packages

    NASA Astrophysics Data System (ADS)

    Foy, Joseph J.; Mitta, Prerana; Nowosatka, Lauren R.; Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.; Al-Hallaq, Hania; Armato, Samuel G.

    2018-02-01

    Open-source texture analysis software allows for the advancement of radiomics research. Variations in texture features, however, result from discrepancies in algorithm implementation. Anatomically matched regions of interest (ROIs) that captured normal breast parenchyma were placed in the magnetic resonance images (MRI) of 20 patients at two time points. Six first-order features and six gray-level co-occurrence matrix (GLCM) features were calculated for each ROI using four texture analysis packages. Features were extracted using package-specific default GLCM parameters and using GLCM parameters modified to yield the greatest consistency among packages. Relative change in the value of each feature between time points was calculated for each ROI. Distributions of relative feature value differences were compared across packages. Absolute agreement among feature values was quantified by the intra-class correlation coefficient. Among first-order features, significant differences were found for max, range, and mean, and only kurtosis showed poor agreement. All six second-order features showed significant differences using package-specific default GLCM parameters, and five second-order features showed poor agreement; with modified GLCM parameters, no significant differences among second-order features were found, and all second-order features showed poor agreement. While relative texture change discrepancies existed across packages, these differences were not significant when consistent parameters were used.
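
    To make the role of the GLCM parameters concrete, the sketch below computes a few first-order and GLCM features for one ROI with scikit-image, fixing the grey-level count, distance, angle, symmetry, and normalization explicitly; these are exactly the settings whose package-specific defaults caused the discrepancies reported above. The function names follow recent scikit-image releases (graycomatrix/graycoprops; older releases spell them greycomatrix/greycoprops), and the feature subset is an illustrative assumption.

```python
import numpy as np
from scipy.stats import kurtosis
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

def roi_texture_features(roi, levels=64, distance=1, angle=0.0):
    """First-order and GLCM texture features for one ROI (2-D integer array).

    The GLCM parameters (levels, distance, angle, symmetry, normalization)
    must be fixed explicitly to obtain comparable values across packages."""
    first_order = {
        "mean": float(roi.mean()), "max": float(roi.max()),
        "range": float(roi.max() - roi.min()), "kurtosis": float(kurtosis(roi, axis=None)),
    }
    # Rescale intensities to the chosen number of grey levels before the GLCM.
    quantized = np.floor((roi - roi.min()) / (np.ptp(roi) + 1e-12) * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(quantized, distances=[distance], angles=[angle],
                        levels=levels, symmetric=True, normed=True)
    second_order = {prop: float(graycoprops(glcm, prop)[0, 0])
                    for prop in ("contrast", "homogeneity", "energy", "correlation")}
    return {**first_order, **second_order}

roi = (np.random.default_rng(1).random((32, 32)) * 255).astype(np.uint16)
print(roi_texture_features(roi))
```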

  2. Statistical principle and methodology in the NISAN system.

    PubMed Central

    Asano, C

    1979-01-01

    The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both statistical situations, confirmatory analysis and exploratory analysis, and is designed to capture statistical wisdom and help senior statisticians choose an optimal process of statistical analysis. PMID:540594

  3. PyPathway: Python Package for Biological Network Analysis and Visualization.

    PubMed

    Xu, Yang; Luo, Xiao-Chun

    2018-05-01

    Life science studies represent one of the biggest generators of large data sets, mainly because of rapid sequencing technological advances. Biological networks, including interaction networks and human-curated pathways, are essential to understand these high-throughput data sets. Biological network analysis offers a method to explore systematically not only the molecular complexity of a particular disease but also the molecular relationships among apparently distinct phenotypes. Currently, several packages for the Python community have been developed, such as BioPython and Goatools. However, tools to perform comprehensive network analysis and visualization are still needed. Here, we have developed PyPathway, an extensible free and open source Python package for functional enrichment analysis, network modeling, and network visualization. The network process module supports various interaction network and pathway databases such as Reactome, WikiPathway, STRING, and BioGRID. The network analysis module implements overrepresentation analysis, gene set enrichment analysis, network-based enrichment, and de novo network modeling. Finally, the visualization and data publishing modules enable users to share their analysis by using an easy web application. For package availability, see the first Reference.
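
    The overrepresentation analysis mentioned above typically reduces to a one-sided hypergeometric test per pathway. The sketch below shows that test with SciPy; it is a generic illustration with hypothetical counts, not PyPathway's own interface.

```python
from scipy.stats import hypergeom

def overrepresentation_p(hits, study_size, pathway_size, background_size):
    """One-sided hypergeometric p-value for pathway over-representation.

    hits            : study genes that fall in the pathway
    study_size      : number of genes in the study (e.g. differentially expressed) list
    pathway_size    : number of background genes annotated to the pathway
    background_size : total number of annotated background genes
    """
    # P(X >= hits) when drawing study_size genes from the background.
    return hypergeom.sf(hits - 1, background_size, pathway_size, study_size)

# Hypothetical counts: 12 of 300 study genes hit a 150-gene pathway out of 20000.
print(overrepresentation_p(hits=12, study_size=300, pathway_size=150, background_size=20000))
```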

  4. Quantification of root gravitropic response using a constant stimulus feedback system.

    PubMed

    Wolverton, Chris

    2015-01-01

    Numerous software packages now exist for quantifying root growth responses, most of which analyze a time resolved sequence of images ex post facto. However, few allow for the real-time analysis of growth responses. The system in routine use in our lab allows for real-time growth analysis and couples this to positional feedback to control the stimulus experienced by the responding root. This combination allows us to overcome one of the confounding variables in studies of root gravity response. Seedlings are grown on standard petri plates attached to a vertical rotating stage and imaged using infrared illumination. The angle of a particular region of the root is determined by image analysis, compared to the prescribed angle, and any corrections in positioning are made by controlling a stepper motor. The system allows for the long-term stimulation of a root at a constant angle and yields insights into the gravity perception and transduction machinery not possible with other approaches.

  5. Effect analysis of design variables on the disc in a double-eccentric butterfly valve.

    PubMed

    Kang, Sangmo; Kim, Da-Eun; Kim, Kuk-Kyeom; Kim, Jun-Oh

    2014-01-01

    We have performed a shape optimization of the disc in an industrial double-eccentric butterfly valve using the effect analysis of design variables to enhance the valve performance. For the optimization, we select three performance quantities such as pressure drop, maximum stress, and mass (weight) as the responses and three dimensions regarding the disc shape as the design variables. Subsequently, we compose a layout of orthogonal array (L16) by performing numerical simulations on the flow and structure using a commercial package, ANSYS v13.0, and then make an effect analysis of the design variables on the responses using the design of experiments. Finally, we formulate a multiobjective function consisting of the three responses and then propose an optimal combination of the design variables to maximize the valve performance. Simulation results show that the disc thickness makes the most significant effect on the performance and the optimal design provides better performance than the initial design.

  6. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
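
    A minimal sketch of this kind of workflow is shown below: open a multi-file daily dataset lazily with XArray and Dask chunks, express an annual indicator, and only then trigger the distributed computation. The file pattern, variable name (tasmax), and threshold are assumptions for illustration.

```python
import xarray as xr

# Lazily open a (hypothetical) set of daily downscaled files; dask chunks keep
# memory bounded and let the computation run in parallel across workers.
ds = xr.open_mfdataset("tasmax_day_*.nc", combine="by_coords",
                       chunks={"time": 365}, parallel=True)

# Example indicator: number of days per year above 35 degC (308.15 K) at each grid cell.
hot_days = (ds["tasmax"] > 308.15).groupby("time.year").sum("time")

# Nothing has been read yet; .compute() triggers the distributed evaluation.
result = hot_days.compute()
print(result)
```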

  7. Tests of a Semi-Analytical Case 1 and Gelbstoff Case 2 SeaWiFS Algorithm with a Global Data Set

    NASA Technical Reports Server (NTRS)

    Carder, Kendall L.; Hawes, Steve K.; Lee, Zhongping

    1997-01-01

    A semi-analytical algorithm was tested with a total of 733 points of either unpackaged or packaged-pigment data, with corresponding algorithm parameters for each data type. The 'unpackaged' type consisted of data sets that were generally consistent with the Case 1 CZCS algorithm and other well-calibrated data sets. The 'packaged' type consisted of data sets apparently containing somewhat more packaged pigments, requiring modification of the absorption parameters of the model consistent with the CalCOFI study area. This resulted in two equally divided data sets. A more thorough scrutiny of these and other data sets using a semi-analytical model requires improved knowledge of the phytoplankton and gelbstoff of the specific environment studied. Since the semi-analytical algorithm is dependent upon four spectral channels including the 412 nm channel, while most other algorithms are not, a means of testing data sets for consistency was sought. A numerical filter was developed to classify data sets into the above classes. The filter uses reflectance ratios, which can be determined from space. The sensitivity of such numerical filters to measurement errors resulting from atmospheric correction and sensor noise requires further study. The semi-analytical algorithm performed superbly on each of the data sets after classification, resulting in RMS1 errors of 0.107 and 0.121, respectively, for the unpackaged and packaged data-set classes, with little bias and slopes near 1.0. In combination, the RMS1 performance was 0.114. While these numbers appear rather sterling, one must bear in mind what mis-classification does to the results. Using an average or compromise parameterization on the modified global data set yielded an RMS1 error of 0.171, while using the unpackaged parameterization on the global evaluation data set yielded an RMS1 error of 0.284. So, without classification, the algorithm performs better globally using the average parameters than it does using the unpackaged parameters. Finally, the effects of even more extreme pigment packaging must be examined in order to improve algorithm performance at high latitudes. Note, however, that the North Sea and Mississippi River plume studies contributed data to the packaged and unpackaged classes, respectively, with little effect on algorithm performance. This suggests that gelbstoff-rich Case 2 waters do not seriously degrade performance of the semi-analytical algorithm.

  8. User's guide to the Variably Saturated Flow (VSF) process to MODFLOW

    USGS Publications Warehouse

    Thoms, R. Brad; Johnson, Richard L.; Healy, Richard W.

    2006-01-01

    A new process for simulating three-dimensional (3-D) variably saturated flow (VSF) using Richards' equation has been added to the 3-D modular finite-difference ground-water model MODFLOW. Five new packages are presented here as part of the VSF Process--the Richards' Equation Flow (REF1) Package, the Seepage Face (SPF1) Package, the Surface Ponding (PND1) Package, the Surface Evaporation (SEV1) Package, and the Root Zone Evapotranspiration (RZE1) Package. Additionally, a new Adaptive Time-Stepping (ATS1) Package is presented for use by both the Ground-Water Flow (GWF) Process and VSF. The VSF Process allows simulation of flow in unsaturated media above the ground-water zone and facilitates modeling of ground-water/surface-water interactions. Model performance is evaluated by comparison to an analytical solution for one-dimensional (1-D) constant-head infiltration (Dirichlet boundary condition), field experimental data for a 1-D constant-head infiltration, laboratory experimental data for two-dimensional (2-D) constant-flux infiltration (Neumann boundary condition), laboratory experimental data for 2-D transient drainage through a seepage face, and numerical model results (VS2DT) of a 2-D flow-path simulation using realistic surface boundary conditions. A hypothetical 3-D example case also is presented to demonstrate the new capability using periodic boundary conditions (for example, daily precipitation) and varied surface topography over a larger spatial scale (0.133 square kilometer). The new model capabilities retain the modular structure of the MODFLOW code and preserve MODFLOW's existing capabilities as well as compatibility with commercial pre-/post-processors. The overall success of the VSF Process in simulating mixed boundary conditions and variable soil types demonstrates its utility for future hydrologic investigations. This report presents a new flow package implementing the governing equations for variably saturated ground-water flow, four new boundary condition packages unique to unsaturated flow, the Adaptive Time-Stepping Package for use with both the GWF Process and the new VSF Process, detailed descriptions of the input and output files for each package, and six simulation examples verifying model performance.
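
    For reference, the governing equation implemented by the Richards' Equation Flow (REF1) Package is, in its standard one-dimensional mixed form (stated here from textbook convention rather than quoted from the report):

```latex
% Mixed form of Richards' equation for one-dimensional, vertical, variably
% saturated flow: theta = water content, h = pressure head, K(h) = hydraulic
% conductivity, z = elevation (positive upward), S = source/sink term.
\[
  \frac{\partial \theta(h)}{\partial t}
  = \frac{\partial}{\partial z}\!\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right] - S
\]
```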

  9. Estimation of water diffusion coefficient into polycarbonate at different temperatures using numerical simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasirabadi, P. Shojaee; Jabbari, M.; Hattel, J. H.

    2016-06-08

    Nowadays, many electronic systems are exposed to harsh conditions of relative humidity and temperature. Mass transport properties of electronic packaging materials are needed in order to investigate the influence of moisture and temperature on reliability of electronic devices. Polycarbonate (PC) is widely used in the electronics industry. Thus, in this work the water diffusion coefficient into PC is investigated. Furthermore, numerical methods used for estimation of the diffusion coefficient and their assumptions are discussed. 1D and 3D numerical solutions are compared and based on this, it is shown how the estimated value can be different depending on the choice of dimensionality in the model.
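
    A minimal forward model of the kind used in such estimations is one-dimensional Fickian diffusion into a plate with saturated surfaces; the Python sketch below integrates it with an explicit finite-difference scheme and returns the relative mass uptake. All values are order-of-magnitude placeholders, not the fitted polycarbonate data.

```python
import numpy as np

def moisture_uptake_1d(D, thickness, c_sat, t_end, nx=101):
    """Explicit finite-difference solution of 1-D Fickian diffusion into a plate.

    D         : diffusion coefficient [m^2/s]
    thickness : plate thickness [m]; both faces held at the saturation level
    c_sat     : equilibrium (saturation) concentration
    Returns the relative mass uptake M(t_end)/M_infinity."""
    dx = thickness / (nx - 1)
    dt = 0.4 * dx**2 / D                      # respect the explicit stability limit
    c = np.zeros(nx)
    c[0] = c[-1] = c_sat                      # surfaces instantly at saturation
    for _ in range(int(t_end / dt)):
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    absorbed = np.sum(0.5 * (c[1:] + c[:-1])) * dx    # trapezoidal mass integral
    return absorbed / (c_sat * thickness)

# Placeholder values (order of magnitude only, not fitted polycarbonate data):
print(moisture_uptake_1d(D=5e-13, thickness=2e-3, c_sat=1.0, t_end=7 * 24 * 3600.0))
```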

  10. Reliable numerical computation in an optimal output-feedback design

    NASA Technical Reports Server (NTRS)

    Vansteenwyk, Brett; Ly, Uy-Loi

    1991-01-01

    A reliable algorithm is presented for the evaluation of a quadratic performance index and its gradients with respect to the controller design parameters. The algorithm is a part of a design algorithm for optimal linear dynamic output-feedback controller that minimizes a finite-time quadratic performance index. The numerical scheme is particularly robust when it is applied to the control-law synthesis for systems with densely packed modes and where there is a high likelihood of encountering degeneracies in the closed-loop eigensystem. This approach through the use of an accurate Pade series approximation does not require the closed-loop system matrix to be diagonalizable. The algorithm was included in a control design package for optimal robust low-order controllers. Usefulness of the proposed numerical algorithm was demonstrated using numerous practical design cases where degeneracies occur frequently in the closed-loop system under an arbitrary controller design initialization and during the numerical search.

  11. An experimental analysis of the effectiveness and sustainability of a Chinese tutoring package.

    PubMed

    Wu, Hang; Miller, L Keith

    2012-01-01

    This experiment evaluated the effects of training tutors to use an instructional package to teach pronunciation and translation of the Chinese language. Tutors' correct use of the package increased from 68% of trials to 92% after training, and student correct pronunciation increased from 45% to 90%, with similar effects for translation. Continued use of the package, high social validity, and extended follow-up suggest that use of the package may be sustainable.

  12. AN EXPERIMENTAL ANALYSIS OF THE EFFECTIVENESS AND SUSTAINABILITY OF A CHINESE TUTORING PACKAGE

    PubMed Central

    Wu, Hang; Miller, L. Keith

    2012-01-01

    This experiment evaluated the effects of training tutors to use an instructional package to teach pronunciation and translation of the Chinese language. Tutors' correct use of the package increased from 68% of trials to 92% after training, and student correct pronunciation increased from 45% to 90%, with similar effects for translation. Continued use of the package, high social validity, and extended follow-up suggest that use of the package may be sustainable. PMID:22403470

  13. Study of Convection Heat Transfer in a Very High Temperature Reactor Flow Channel: Numerical and Experimental Results

    DOE PAGES

    Valentin, Francisco I.; Artoun, Narbeh; Anderson, Ryan; ...

    2016-12-01

    Very High Temperature Reactors (VHTRs) are one of the Generation IV gas-cooled reactor models proposed for implementation in next generation nuclear power plants. A high temperature/pressure test facility for forced and natural circulation experiments has been constructed. This test facility consists of a single flow channel in a 2.7 m (9') long graphite column equipped with four 2.3 kW heaters. Extensive 3D numerical modeling provides a detailed analysis of the thermal-hydraulic behavior under steady-state, transient, and accident scenarios. In addition, forced/mixed convection experiments with air, nitrogen and helium were conducted for inlet Reynolds numbers from 500 to 70,000. Our numerical results were validated with forced convection data, displaying maximum percentage errors under 15%, using the commercial finite element package COMSOL Multiphysics. Based on this agreement, important information can be extracted from the model with regard to the modified radial velocity and gas property profiles. Our work also examines flow laminarization for a full range of Reynolds numbers including laminar, transition and turbulent flow under forced convection, and its impact on heat transfer under various scenarios, to examine the thermal-hydraulic phenomena that could occur during both normal operation and accident conditions.

  14. A mathematical model for describing the mechanical behaviour of root canal instruments.

    PubMed

    Zhang, E W; Cheung, G S P; Zheng, Y F

    2011-01-01

    The purpose of this study was to establish a general mathematical model for describing the mechanical behaviour of root canal instruments by combining a theoretical analytical approach with a numerical finite-element method. Mathematical formulas representing the longitudinal (taper, helical angle and pitch) and cross-sectional configurations and area, the bending and torsional inertia, the curvature of the boundary point and the (geometry of) loading condition were derived. Torsional and bending stresses and the resultant deformation were expressed mathematically as a function of these geometric parameters, the modulus of elasticity of the material and the applied load. As illustrations, three brands of NiTi endodontic files of different cross-sectional configurations (ProTaper, Hero 642, and Mani NRT) were analysed under pure torsion and pure bending by entering the model into a finite-element analysis package (ANSYS). Numerical results confirmed that mathematical models are a feasible method to analyse the mechanical properties and predict the stress and deformation of root canal instruments during root canal preparation. Mathematical and numerical models can be a suitable way to examine mechanical behaviour as a criterion for instrument design and to predict the stress and strain experienced by endodontic instruments during root canal preparation. © 2010 International Endodontic Journal.

  15. Multivariate matching pursuit in optimal Gabor dictionaries: theory and software with interface for EEG/MEG via Svarog

    PubMed Central

    2013-01-01

    Background Matching pursuit algorithm (MP), especially with recent multivariate extensions, offers unique advantages in analysis of EEG and MEG. Methods We propose a novel construction of an optimal Gabor dictionary, based upon the metrics introduced in this paper. We implement this construction in a freely available software for MP decomposition of multivariate time series, with a user friendly interface via the Svarog package (Signal Viewer, Analyzer and Recorder On GPL, http://braintech.pl/svarog), and provide a hands-on introduction to its application to EEG. Finally, we describe numerical and mathematical optimizations used in this implementation. Results Optimal Gabor dictionaries, based on the metric introduced in this paper, for the first time allowed for a priori assessment of maximum one-step error of the MP algorithm. Variants of multivariate MP, implemented in the accompanying software, are organized according to the mathematical properties of the algorithms, relevant in the light of EEG/MEG analysis. Some of these variants have been successfully applied to both multichannel and multitrial EEG and MEG in previous studies, improving preprocessing for EEG/MEG inverse solutions and parameterization of evoked potentials in single trials; we mention also ongoing work and possible novel applications. Conclusions Mathematical results presented in this paper improve our understanding of the basics of the MP algorithm. Simple introduction of its properties and advantages, together with the accompanying stable and user-friendly Open Source software package, pave the way for a widespread and reproducible analysis of multivariate EEG and MEG time series and novel applications, while retaining a high degree of compatibility with the traditional, visual analysis of EEG. PMID:24059247
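
    Stripped of the optimal-dictionary construction and the multivariate extensions, a single matching pursuit iteration is just a greedy projection-and-subtraction step. The Python sketch below shows that bare algorithm over a generic unit-norm dictionary (a random stand-in, not an optimal Gabor dictionary, and not the Svarog implementation).

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=10):
    """Greedy MP decomposition: at each step pick the atom with the largest
    inner product with the residual, subtract its contribution, repeat.

    dictionary : (n_dict, n_samples) array of unit-norm atoms
    Returns a list of (atom_index, amplitude) and the final residual."""
    residual = signal.astype(float).copy()
    chosen = []
    for _ in range(n_atoms):
        products = dictionary @ residual          # correlations with all atoms
        k = int(np.argmax(np.abs(products)))
        amp = products[k]
        residual -= amp * dictionary[k]           # one-step update of the residual
        chosen.append((k, float(amp)))
    return chosen, residual

# Toy usage: random unit-norm dictionary standing in for a Gabor dictionary.
rng = np.random.default_rng(2)
D = rng.standard_normal((256, 512))
D /= np.linalg.norm(D, axis=1, keepdims=True)
x = 3.0 * D[17] - 1.5 * D[101] + 0.05 * rng.standard_normal(512)
atoms, res = matching_pursuit(x, D, n_atoms=5)
print(atoms[:2], float(np.linalg.norm(res)))
```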

  16. Numerical simulation of two-dimensional Rayleigh-Benard convection

    NASA Astrophysics Data System (ADS)

    Grigoriev, Vasiliy V.; Zakharov, Petr E.

    2017-11-01

    This paper considers Rayleigh-Benard convection (natural convection). This is a flow that forms in a viscous medium heated from below and cooled from above; as a result, vortices (convective cells) are formed. The process is described by a system of nonlinear differential equations in the Oberbeck-Boussinesq approximation. The Rayleigh number and the Prandtl number are taken as the governing parameters characterizing the convection states. The problem is solved using the finite element method with the computational package FEniCS. Numerical results for different Rayleigh numbers are obtained, and the integral characteristic (Nusselt number) is studied as a function of the Rayleigh number.
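
    For reference, one common nondimensional form of the Oberbeck-Boussinesq system governed by the Rayleigh and Prandtl numbers is shown below (scaling conventions vary, so this is a textbook statement rather than the paper's exact formulation):

```latex
% One common nondimensional form of the Oberbeck-Boussinesq system
% (velocity u, pressure p, temperature deviation T, vertical unit vector e_z):
\[
\begin{aligned}
  \nabla \cdot \mathbf{u} &= 0, \\
  \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
    &= -\nabla p + \Pr\,\nabla^{2}\mathbf{u} + \mathrm{Ra}\,\Pr\,T\,\mathbf{e}_{z}, \\
  \frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T &= \nabla^{2} T .
\end{aligned}
\]
```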

  17. Numerical solutions for patterns statistics on Markov chains.

    PubMed

    Nuel, Gregory

    2006-01-01

    We propose here a review of the methods available to compute pattern statistics on text generated by a Markov source. Theoretical, but also numerical aspects are detailed for a wide range of techniques (exact, Gaussian, large deviations, binomial and compound Poisson). The SPatt package (Statistics for Pattern, free software available at http://stat.genopole.cnrs.fr/spatt) implementing all these methods is then used to compare all these approaches in terms of computational time and reliability in the most complete pattern statistics benchmark available at the present time.

  18. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples

    DTIC Science & Technology

    2007-10-01

    [Abstract not available in this record; the retrieved text consists of bibliographic fragments citing works on complex principal component analysis (Journal of Climate and Applied Meteorology 23: 1660-1673, 1984), Hotelling (1933), and Von Storch and Navarra (1995, Analysis of climate variability: Applications of statistical techniques), together with the report designation ERDC TN-SWWRP-07-9 (October 2007) and a repetition of the report title.]

  19. Determining Permissible Oxygen and Water Vapor Transmission Rate for Non-Retort Military Ration Packaging

    DTIC Science & Technology

    2011-11-01

    [Abstract not available in this record; the retrieved text consists of report fragments: the title, the authors (Danielle Froio, Alan Wright, Nicole Favreau, and Sarah ...), report keywords (retort, storage, shelf life, retort pouches, sensory analysis, oxygen, crackers, packaging), and a figure caption describing MRE packaging: (a) MRE retort pouch quad-laminate structure; (b) MRE non-retort pouch tri-laminate structure.]

  20. Stochastic approach for radionuclides quantification

    NASA Astrophysics Data System (ADS)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration with a standard to quantify the activity of nuclear materials by determining a calibration coefficient are of no use on non-reproducible, complex, and singular nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. The method combines a global stochastic approach that uses, among others, surrogate models to simulate the gamma attenuation behaviour, a Bayesian approach that considers conditional probability densities of the problem inputs, and Markov Chain Monte Carlo (MCMC) algorithms that solve the inverse problem, taking as inputs the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standards of different matrix, composition, and source configuration, in terms of actinide masses, locations, and distributions. Activity uncertainties are taken into account by this adjustment methodology.
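
    As a toy illustration of the Bayesian/MCMC ingredient described above, the Python sketch below runs a random-walk Metropolis chain over a single unknown activity, with a Gaussian likelihood around a placeholder linear surrogate for detection efficiency and attenuation; the surrogate, the prior, and all numbers are assumptions, far simpler than the CEA methodology.

```python
import numpy as np

def metropolis_activity(measured_rate, sigma, surrogate, n_samples=20000, step=0.05, seed=0):
    """Sample the posterior p(activity | measured_rate) with a random-walk
    Metropolis chain, assuming a Gaussian likelihood around the surrogate
    prediction and a flat positive prior on the activity."""
    rng = np.random.default_rng(seed)
    log_post = lambda a: (-0.5 * ((measured_rate - surrogate(a)) / sigma) ** 2
                          if a > 0 else -np.inf)
    a, chain = 1.0, []
    lp = log_post(a)
    for _ in range(n_samples):
        a_new = a + step * rng.standard_normal()
        lp_new = log_post(a_new)
        if np.log(rng.random()) < lp_new - lp:    # Metropolis acceptance rule
            a, lp = a_new, lp_new
        chain.append(a)
    return np.array(chain[n_samples // 2:])       # discard burn-in half

# Placeholder linear surrogate: detected rate = (efficiency x attenuation) * activity.
surrogate = lambda activity: 0.5 * activity
posterior = metropolis_activity(measured_rate=0.35, sigma=0.02, surrogate=surrogate)
print(posterior.mean(), posterior.std())
```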

  1. AOP: An R Package For Sufficient Causal Analysis in Pathway ...

    EPA Pesticide Factsheets

    Summary: How can I quickly find the key events in a pathway that I need to monitor to predict that a/an beneficial/adverse event/outcome will occur? This is a key question when using signaling pathways for drug/chemical screening in pharmacology, toxicology and risk assessment. By identifying these sufficient causal key events, we have fewer events to monitor for a pathway, thereby decreasing assay costs and time, while maximizing the value of the information. I have developed the "aop" package, which uses backdoor analysis of causal networks to identify these minimal sets of key events that are sufficient for making causal predictions. Availability and Implementation: The source and binary are available online through the Bioconductor project (http://www.bioconductor.org/) as an R package titled "aop". The R/Bioconductor package runs within the R statistical environment. The package has functions that can take pathways (as directed graphs) formatted as a Cytoscape JSON file as input, or pathways can be represented as directed graphs using the R/Bioconductor "graph" package. The "aop" package has functions that can perform backdoor analysis to identify the minimal set of key events for making causal predictions. Contact: burgoon.lyle@epa.gov. This paper describes an R/Bioconductor package that was developed to facilitate the identification of key events within an AOP that are the minimal set of sufficient key events that need to be tested/monitored.

  2. Software design for analysis of multichannel intracardial and body surface electrocardiograms.

    PubMed

    Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A

    2002-11-01

    Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.

  3. Analysis of reference transactions using packaged computer programs.

    PubMed

    Calabretta, N; Ross, R

    1984-01-01

    Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.

  4. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on the least square algorithm procedure coupled with the finite difference method adhering to the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and/or ABAQUS.

  5. Acoustic wave propagation in a temporal evolving shear-layer for low-Mach number perturbations

    NASA Astrophysics Data System (ADS)

    Hau, Jan-Niklas; Müller, Björn

    2018-01-01

    We study wave packets with the small perturbation/gradient Mach number interacting with a smooth shear-layer in the linear regime of small amplitude perturbations. In particular, we investigate the temporal evolution of wave packets in shear-layers with locally curved regions of variable size using non-modal linear analysis and direct numerical simulations of the two-dimensional gas-dynamical equations. Depending on the wavenumber of the initially imposed wave packet, three different types of behavior are observed: (i) The wave packet passes through the shear-layer and constantly transfers energy back to the mean flow. (ii) It is turned around (or reflected) within the sheared region and extracts energy from the base flow. (iii) It is split into two oppositely propagating packages when reaching the upper boundary of the linearly sheared region. The conducted direct numerical simulations confirm that non-modal linear stability analysis is able to predict the wave packet dynamics, even in the presence of non-linearly sheared regions. In the light of existing studies in this area, we conclude that the sheared regions are responsible for the highly directed propagation of linearly generated acoustic waves when there is a dominating source, as it is the case for jet flows.

  6. Modelling of radiation field around spent fuel container.

    PubMed

    Kryuchkov, E F; Opalovsky, V A; Tikhomirov, G V

    2005-01-01

    Operation of nuclear reactors leads to the production of spent nuclear fuel (SNF). There are two basic strategies of SNF management: ultimate disposal of SNF in geological formations and recycle or repeated utilisation of reprocessed SNF. In both options, there is an urgent necessity to study radiation properties of SNF. Information about SNF radiation properties is required at all stages of SNF management. In order to reach more effective utilisation of nuclear materials, new fuel cycles are under development based on uranium-plutonium, uranium-thorium and some other types of nuclear fuel. These promising types of nuclear fuel are characterised by quite different radiation properties at all the stages of nuclear fuel cycle (NFC) listed above. So, comparative analysis is required for radiation properties of different nuclear fuel types at different NFC stages. The results presented here were obtained from the numerical analysis of the radiation field around transport containers of different SNF types and in SNF storage. The calculations are carried out with the application of the computer code packages SCALE-4.3 and MCNP-4C. Comparison of the dose parameters obtained for different models of the transport container with experimental data allowed us to make certain conclusions about the errors of numerical results caused by the approximate geometrical description of the transport container.

  7. Propensity Score Analysis in R: A Software Review

    ERIC Educational Resources Information Center

    Keller, Bryan; Tipton, Elizabeth

    2016-01-01

    In this article, we review four software packages for implementing propensity score analysis in R: "Matching, MatchIt, PSAgraphics," and "twang." After briefly discussing essential elements for propensity score analysis, we apply each package to a data set from the Early Childhood Longitudinal Study in order to estimate the…
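
    The reviewed packages are R libraries whose interfaces are not reproduced here. As a language-neutral sketch of the step they all automate (estimating propensity scores with logistic regression and forming inverse-probability weights), the Python example below uses scikit-learn on synthetic data; the variables are invented and not drawn from the Early Childhood Longitudinal Study.

      # Hedged sketch of the core propensity-score step: estimate P(treatment | covariates)
      # with logistic regression, then form inverse-probability weights.
      # Data and variable names are synthetic, not from the ECLS data set.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 3))                               # covariates
      treat = (X @ np.array([0.8, -0.5, 0.3]) + rng.normal(size=500) > 0).astype(int)

      ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
      weights = np.where(treat == 1, 1.0 / ps, 1.0 / (1.0 - ps))  # ATE-style IPW weights
      print(weights[:5])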

  8. Environmental Assessment of Packaging: The Consumer Point of View

    PubMed

    Van Dam YK

    1996-09-01

    When marketing environmentally responsible packaged products, the producer is confronted with consumer beliefs concerning the environmental friendliness of packaging materials. When making environmentally conscious packaging decisions, these consumer beliefs should be taken into account alongside the technical guidelines. Dutch consumer perceptions of the environmental friendliness of packaged products are reported and compared with the results of a life-cycle assessment. It is shown that consumers judge environmental friendliness mainly from material and returnability. Furthermore, the consumer perception of the environmental friendliness of packaging material is based on the postconsumption waste, whereas the environmental effects of production are ignored. Implications for packaging policy and for environmental policy are deduced from these consumer beliefs. KEY WORDS: Consumer behavior; Environment; Food; Packaging; Perception; Waste

  9. Analysis of counting data: Development of the SATLAS Python package

    NASA Astrophysics Data System (ADS)

    Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.

    2018-01-01

    For the analysis of low-statistics counting experiments, a traditional nonlinear least squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithm(s). In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms which are suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
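
    The SATLAS interface itself is not reproduced here. As a generic, hedged illustration of why the cost function matters for low-count data (the issue the package addresses), the sketch below compares a naive chi-square estimate with a Poisson maximum-likelihood estimate of a constant count rate using SciPy; the simulated counts and the constant-rate model are assumptions for illustration only.

      # Hedged, generic sketch (not the SATLAS API): for low-count Poisson data, compare a
      # naive chi-square estimate with a Poisson maximum-likelihood estimate of a constant rate.
      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(1)
      counts = rng.poisson(lam=2.0, size=50)            # low-statistics bins

      def chi2(mu):
          # Chi-square with sigma_i^2 = max(counts_i, 1): a common but biased choice
          return np.sum((counts - mu) ** 2 / np.maximum(counts, 1))

      def nll(mu):
          # Negative Poisson log-likelihood: the appropriate cost for counting data
          return np.sum(mu - counts * np.log(mu))

      mu_chi2 = minimize_scalar(chi2, bounds=(0.1, 10), method="bounded").x
      mu_mle = minimize_scalar(nll, bounds=(0.1, 10), method="bounded").x
      print(f"chi-square estimate: {mu_chi2:.3f}, Poisson MLE: {mu_mle:.3f}")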

  10. scraps: An open-source Python-based analysis package for analyzing and plotting superconducting resonator data

    DOE PAGES

    Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn; ...

    2016-11-07

    We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of itsmore » utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.« less

  11. ANALYSIS/PLOT: a graphics package for use with the SORT/ANALYSIS data bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sady, C.A.

    1983-08-01

    This report describes a graphics package that is used with the SORT/ANALYSIS data bases. The data listed by the SORT/ANALYSIS program can be presented in pie, bar, line, or Gantt chart form. Instructions for the use of the plotting program and descriptions of the subroutines are given in the report.

  12. Dualities in the analysis of phage DNA packaging motors

    PubMed Central

    Serwer, Philip; Jiang, Wen

    2012-01-01

    The DNA packaging motors of double-stranded DNA phages are models for analysis of all multi-molecular motors and for analysis of several fundamental aspects of biology, including early evolution, relationship of in vivo to in vitro biochemistry and targets for anti-virals. Work on phage DNA packaging motors both has produced and is producing dualities in the interpretation of data obtained by use of both traditional techniques and the more recently developed procedures of single-molecule analysis. The dualities include (1) reductive vs. accretive evolution, (2) rotation vs. stasis of sub-assemblies of the motor, (3) thermal ratcheting vs. power stroking in generating force, (4) complete motor vs. spark plug role for the packaging ATPase, (5) use of previously isolated vs. new intermediates for analysis of the intermediate states of the motor and (6) a motor with one cycle vs. a motor with two cycles. We provide background for these dualities, some of which are under-emphasized in the literature. We suggest directions for future research. PMID:23532204

  13. Dynamic Electrothermal Model of a Sputtered Thermopile Thermal Radiation Detector for Earth Radiation Budget Applications

    NASA Technical Reports Server (NTRS)

    Weckmann, Stephanie

    1997-01-01

    The Clouds and the Earth's Radiant Energy System (CERES) is a program sponsored by the National Aeronautics and Space Administration (NASA) aimed at evaluating the global energy balance. Current scanning radiometers used for CERES consist of thin-film thermistor bolometers viewing the Earth through a Cassegrain telescope. The Thermal Radiation Group, a laboratory in the Department of Mechanical Engineering at Virginia Polytechnic Institute and State University, is currently studying a new sensor concept to replace the current bolometer: a thermopile thermal radiation detector. This next-generation detector would consist of a thermal sensor array made of thermocouple junction pairs, or thermopiles. The objective of the current research is to perform a thermal analysis of the thermopile. Numerical thermal models are particularly suited to solve problems for which temperature is the dominant mechanism of the operation of the device (through the thermoelectric effect), as well as for complex geometries composed of numerous different materials. Feasibility and design specifications are studied by developing a dynamic electrothermal model of the thermopile using the finite element method. A commercial finite element-modeling package, ALGOR, is used.

  14. An intuitive Python interface for Bioconductor libraries demonstrates the utility of language translators

    PubMed Central

    2010-01-01

    Background Computer languages can be domain-related, and in the case of multidisciplinary projects, knowledge of several languages will be needed in order to implement ideas quickly. Moreover, each computer language has relative strong points, making some languages better suited than others for a given task to be implemented. The Bioconductor project, based on the R language, has become a reference for the numerical processing and statistical analysis of data coming from high-throughput biological assays, providing a rich selection of methods and algorithms to the research community. At the same time, Python has matured as a rich and reliable language for the agile development of prototypes or final implementations, as well as for handling large data sets. Results The data structures and functions from Bioconductor can be exposed to Python as a regular library. This allows a fully transparent and native use of Bioconductor from Python, without one having to know the R language and with only a small community of translators required to know both. To demonstrate this, we have implemented such Python representations for key infrastructure packages in Bioconductor, letting a Python programmer handle annotation data, microarray data, and next-generation sequencing data. Conclusions Bioconductor is now not reserved solely to R users. Building a Python application using Bioconductor functionality can be done just as if Bioconductor were a Python package. Moreover, similar principles can be applied to other languages and libraries. Our Python package is available at: http://pypi.python.org/pypi/rpy2-bioconductor-extensions/ PMID:21210978
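
    The rpy2-bioconductor-extensions layer described in the paper is not shown verbatim. Below is a minimal sketch of the general mechanism it builds on, exposing an R/Bioconductor package to Python through rpy2's importr; it assumes a local R installation with the Bioconductor Biobase package available.

      # Minimal sketch of the mechanism the paper builds on: calling into an R/Bioconductor
      # package from Python via rpy2. Assumes R plus Bioconductor's Biobase are installed;
      # this is not the rpy2-bioconductor-extensions API itself.
      from rpy2 import robjects
      from rpy2.robjects.packages import importr

      base = importr("base")          # base R
      biobase = importr("Biobase")    # Bioconductor infrastructure package (assumed installed)

      # Create a small matrix in R and inspect it from Python
      robjects.r('m <- matrix(rnorm(20), nrow = 4)')
      m = robjects.r['m']
      print(base.dim(m))              # the R "dim" function called from Python -> 4 5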

  15. Safety analysis report for the SR-101 inert reservoir package

    DOT National Transportation Integrated Search

    1998-11-01

    Department of Energy (DOE) AL Weapons Surety Division (WSD) requires the SR-101 Inert Reservoir Package to meet applicable hazardous material transportation requirements. This Safety Analysis Report (SAR) is based on requirements in place at the ...

  16. WGCNA: an R package for weighted correlation network analysis.

    PubMed

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
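
    WGCNA itself is an R package; as a plain Python illustration of two quantities it computes, a soft-threshold weighted adjacency and a module eigengene (the first principal component of a module's expression), the sketch below uses NumPy on synthetic data and is not the WGCNA API.

      # Conceptual Python sketch of two WGCNA quantities (the package itself is R):
      # soft-threshold adjacency a_ij = |cor(x_i, x_j)|^beta and a module eigengene
      # (first principal component of the module's standardized expression).
      import numpy as np

      rng = np.random.default_rng(3)
      expr = rng.normal(size=(30, 100))          # 30 samples x 100 genes (synthetic)

      beta = 6                                   # soft-thresholding power (assumed)
      adjacency = np.abs(np.corrcoef(expr, rowvar=False)) ** beta   # 100 x 100 weighted network

      module_genes = np.arange(20)               # pretend these genes form one module
      X = expr[:, module_genes]
      X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each gene
      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      eigengene = U[:, 0] * s[0]                 # module eigengene: one value per sample
      print(adjacency.shape, eigengene.shape)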

  17. WGCNA: an R package for weighted correlation network analysis

    PubMed Central

    Langfelder, Peter; Horvath, Steve

    2008-01-01

    Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008

  18. GOplot: an R package for visually combining expression data with functional analysis.

    PubMed

    Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes

    2015-09-01

    Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides a deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate the findings. The R package GOplot is available via CRAN-The Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at: https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. Verbal versus Numerical Probabilities: Does Format Presentation of Probabilistic Information regarding Breast Cancer Screening Affect Women's Comprehension?

    ERIC Educational Resources Information Center

    Vahabi, Mandana

    2010-01-01

    Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…

  20. 7 CFR 51.1310 - Sizing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... package. The number of pears in the box shall not vary more than 3 from the number indicated on the box. (b) When the numerical count is marked on western standard pear boxes the pears shall not vary more than three-eighths inch in their transverse diameter for counts 120 or less; one-fourth inch for counts...

  1. 7 CFR 51.1310 - Sizing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... package. The number of pears in the box shall not vary more than 3 from the number indicated on the box. (b) When the numerical count is marked on western standard pear boxes the pears shall not vary more than three-eighths inch in their transverse diameter for counts 120 or less; one-fourth inch for counts...

  2. 7 CFR 51.1269 - Sizing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... indicated on the package. The number of pears in the box shall not vary more than 3 from the number indicated on the box. (b) When the numerical count is marked on western standard pear boxes the pears shall not vary more than three-eighths inch in their transverse diameter for counts 120 or less; one-fourth...

  3. 7 CFR 51.1269 - Sizing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... indicated on the package. The number of pears in the box shall not vary more than 3 from the number indicated on the box. (b) When the numerical count is marked on western standard pear boxes the pears shall not vary more than three-eighths inch in their transverse diameter for counts 120 or less; one-fourth...

  4. Educating for Ecological Sustainability: Montessori Education Leads the Way

    ERIC Educational Resources Information Center

    Sutton, Ann

    2009-01-01

    These days, the word "green," and the more comprehensive term "sustainability," surface in numerous arenas, whether it be exhortations to recycle more, employ compact fluorescent lightbulbs, use less hot water, avoid products with excess packaging, adjust thermostats, plant trees, turn off electronic devices when not in use, or buy organic and…

  5. A Numerical Modeling Framework for Cohesive Sediment Transport Driven by Waves and Tidal Currents

    DTIC Science & Technology

    2012-09-30

    for sediment transport. The successful extension to multiple dimensions benefits from an open-source CFD package, OpenFOAM (www.openfoam.org). This...linz.at/Drupal/), which couples the fluid solver OpenFOAM with the Discrete Element Model (DEM) solver LIGGGHTS (an improved LAMMPS for granular flow

  6. Final Report - Subcontract B623760

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bank, R.

    2017-11-17

    During my visit to LLNL during July 17-27, 2017, I worked on linear system solvers. The two-level hierarchical solver that initiated our study was developed to solve linear systems arising from hp-adaptive finite element calculations, and is implemented in the PLTMG software package, version 12. This preconditioner typically requires 3-20% of the space used by the stiffness matrix for higher order elements. It has multigrid-like convergence rates for a wide variety of PDEs (self-adjoint positive definite elliptic equations, convection-dominated convection-diffusion equations, and highly indefinite Helmholtz equations, among others). The convergence rate is not independent of the polynomial degree p as p → ∞, but remains strong for p ≤ 9, which is the highest polynomial degree allowed in PLTMG, due to limitations of the numerical quadrature rules implemented in the software package. A more complete description of the method and some numerical experiments illustrating its effectiveness appear in. Like traditional geometric multilevel methods, this scheme relies on knowledge of the underlying finite element space in order to construct the smoother and the coarse grid correction.

  7. GPU-accelerated Red Blood Cells Simulations with Transport Dissipative Particle Dynamics.

    PubMed

    Blumers, Ansel L; Tang, Yu-Hang; Li, Zhen; Li, Xuejin; Karniadakis, George E

    2017-08-01

    Mesoscopic numerical simulations provide a unique approach for the quantification of the chemical influences on red blood cell functionalities. The transport Dissipative Particle Dynamics (tDPD) method can lead to such effective multiscale simulations due to its ability to simultaneously capture mesoscopic advection, diffusion, and reaction. In this paper, we present a GPU-accelerated red blood cell simulation package based on a tDPD adaptation of our red blood cell model, which can correctly recover the cell membrane viscosity, elasticity, bending stiffness, and cross-membrane chemical transport. The package essentially processes all computational workloads in parallel on the GPU, and it incorporates multi-stream scheduling and non-blocking MPI communications to improve inter-node scalability. Our code is validated for accuracy and compared against the CPU counterpart for speed. Strong scaling and weak scaling are also presented to characterize scalability. We observe a speedup of 10.1 on one GPU over all 16 cores within a single node, and a weak scaling efficiency of 91% across 256 nodes. The program enables quick-turnaround and high-throughput numerical simulations for investigating chemically driven red blood cell phenomena and disorders.

  8. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.

  9. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.

  10. Large space telescope, phase A. Volume 4: Scientific instrument package

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and characteristics of the scientific instrument package for the Large Space Telescope are discussed. The subjects include: (1) general scientific objectives, (2) package system analysis, (3) scientific instrumentation, (4) imaging photoelectric sensors, (5) environmental considerations, and (6) reliability and maintainability.

  11. The Analysis of the Regression-Discontinuity Design in R

    ERIC Educational Resources Information Center

    Thoemmes, Felix; Liao, Wang; Jin, Ze

    2017-01-01

    This article describes the analysis of regression-discontinuity designs (RDDs) using the R packages rdd, rdrobust, and rddtools. We discuss similarities and differences between these packages and provide directions on how to use them effectively. We use real data from the Carolina Abecedarian Project to show how an analysis of an RDD can be…

  12. Examining the role of unmeasured confounding in mediation analysis with genetic and genomic applications.

    PubMed

    Lutz, Sharon M; Thwing, Annie; Schmiege, Sarah; Kroehl, Miranda; Baker, Christopher D; Starling, Anne P; Hokanson, John E; Ghosh, Debashis

    2017-07-19

    In mediation analysis, if unmeasured confounding is present, the direct and mediated effects may be over- or underestimated. Most methods for the sensitivity analysis of unmeasured confounding in mediation have focused on the mediator-outcome relationship. The Umediation R package enables the user to simulate unmeasured confounding of the exposure-mediator, exposure-outcome, and mediator-outcome relationships in order to see how the results of the mediation analysis would change in the presence of unmeasured confounding. We apply the Umediation package to the Genetic Epidemiology of Chronic Obstructive Pulmonary Disease (COPDGene) study to examine the role of unmeasured confounding due to population stratification on the effect of a single nucleotide polymorphism (SNP) in the CHRNA5/3/B4 locus on pulmonary function decline as mediated by cigarette smoking. Umediation is a flexible R package that examines the role of unmeasured confounding in mediation analysis allowing for normally distributed or Bernoulli distributed exposures, outcomes, mediators, measured confounders, and unmeasured confounders. Umediation also accommodates multiple measured confounders, multiple unmeasured confounders, and allows for a mediator-exposure interaction on the outcome. Umediation is available as an R package at https://github.com/SharonLutz/Umediation. A tutorial on how to install and use the Umediation package is available in Additional file 1.

  13. Thermal Analysis of a Nuclear Waste Repository in Argillite Host Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Gomez, S. P.; Matteo, E. N.

    2017-12-01

    Disposal of high-level nuclear waste in a geological repository requires analysis of heat distribution as a result of decay heat. Such an analysis supports the design of the repository layout, defining the repository footprint as well as providing information of importance to the overall design. The analysis is also used in the study of potential migration of radionuclides to the accessible environment. In this study, thermal analysis for high-level waste and spent nuclear fuel in a generic repository in argillite host rock is presented. The thermal analysis utilized both semi-analytical and numerical modeling in the near field of a repository. The semi-analytical method looks at heat transport by conduction in the repository and surroundings. The results of the simulation method are temperature histories at selected radial distances from the waste package. A 3-D thermal-hydrologic numerical model was also developed to study fluid and heat distribution in the near field. The thermal analysis assumed a generic geological repository at 500 m depth. For the semi-analytical method, a backfilled closed repository was assumed with basic design and material properties. For the thermal-hydrologic numerical method, a repository layout with disposal in horizontal boreholes was assumed. The 3-D modeling domain covers a limited portion of the repository footprint to enable a detailed thermal analysis. A highly refined unstructured mesh was used with increased discretization near heat sources and at intersections of different materials. All simulations considered different parameter values for properties of components of the engineered barrier system (i.e. buffer, disturbed rock zone and the host rock), and different surface storage times. Results of the different modeling cases are presented and include temperature and fluid flow profiles in the near field at different simulation times. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND2017-8295 A.
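
    The record does not give the semi-analytical formulas used. As a hedged sketch of one standard building block for this kind of near-field analysis, the example below evaluates the infinite line heat-source conduction solution dT(r, t) = q'/(4*pi*k) * E1(r^2/(4*alpha*t)) for a constant heat load; the property values and the non-decaying load are assumptions, not values from the study.

      # Hedged sketch of a common semi-analytical building block for near-field thermal
      # analysis: the infinite line heat-source solution. All parameter values are assumed.
      import numpy as np
      from scipy.special import exp1

      k = 2.0               # thermal conductivity of the host rock, W/(m K)   (assumed)
      rho_c = 2.4e6         # volumetric heat capacity, J/(m^3 K)              (assumed)
      alpha = k / rho_c     # thermal diffusivity, m^2/s
      q_line = 100.0        # heat load per unit length of waste package, W/m  (assumed, constant)

      r = np.array([1.0, 5.0, 10.0])          # radial distances from the package, m
      t = 50.0 * 365.25 * 24 * 3600           # 50 years, in seconds

      dT = q_line / (4 * np.pi * k) * exp1(r**2 / (4 * alpha * t))
      print(dict(zip(r, np.round(dT, 2))))    # temperature rise at each radius, K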

  14. The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 2: Office Procedures

    Treesearch

    Richard M. Cissel; Thomas A. Black; Kimberly A. T. Schreuders; Ajay Prasad; Charles H. Luce; David G. Tarboton; Nathan A. Nelson

    2012-01-01

    An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data analysis and process of a...

  15. Using Python Packages in 6D (Py)Ferret: EOF Analysis, OPeNDAP Sequence Data

    NASA Astrophysics Data System (ADS)

    Smith, K. M.; Manke, A.; Hankin, S. C.

    2012-12-01

    PyFerret was designed to provide the easy methods of data access, analysis, and display found in Ferret, within the simple yet powerful Python scripting/programming language. This has enabled PyFerret to take advantage of a large and expanding collection of third-party scientific Python modules. Furthermore, ensemble and forecast axes have been added to Ferret and PyFerret for creating and working with collections of related data in Ferret's delayed-evaluation and minimal-data-access mode of operation. These axes simplify processing and visualization of these collections of related data. As one example, an empirical orthogonal function (EOF) analysis Python module was developed, taking advantage of the linear algebra module and other standard functionality in NumPy for efficient numerical array processing. This EOF analysis module is used in a Ferret function to provide an ensemble of levels of data explained by each EOF and Time Amplitude Function (TAF) product. Another example makes use of the PyDAP Python module to provide OPeNDAP sequence data for use in Ferret with the minimal data access characteristic of Ferret.
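
    The exact interface of the EOF module is not shown in the record. A minimal NumPy sketch of the underlying computation, EOFs and time amplitude functions obtained from the SVD of the time-by-space anomaly matrix, is given below with synthetic data; it is not PyFerret's own interface.

      # Minimal sketch of the EOF computation described above (not PyFerret's interface):
      # EOFs and time amplitude functions (TAFs) from the SVD of the anomaly matrix.
      import numpy as np

      rng = np.random.default_rng(4)
      field = rng.normal(size=(120, 500))       # 120 time steps x 500 grid points (synthetic)

      anom = field - field.mean(axis=0)         # remove the time mean at each grid point
      U, s, Vt = np.linalg.svd(anom, full_matrices=False)

      eofs = Vt                                 # spatial patterns (one EOF per row)
      tafs = U * s                              # time amplitude functions (one column per EOF)
      explained = s**2 / np.sum(s**2)           # fraction of variance explained by each EOF
      print(explained[:5])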

  16. ASAP- ARTIFICIAL SATELLITE ANALYSIS PROGRAM

    NASA Technical Reports Server (NTRS)

    Kwok, J.

    1994-01-01

    The Artificial Satellite Analysis Program (ASAP) is a general orbit prediction program which incorporates sufficient orbit modeling accuracy for mission design, maneuver analysis, and mission planning. ASAP is suitable for studying planetary orbit missions with spacecraft trajectories of reconnaissance (flyby) and exploratory (mapping) nature. Sample data is included for a geosynchronous station drift cycle study, a Venus radar mapping strategy, a frozen orbit about Mars, and a repeat ground trace orbit. ASAP uses Cowell's method in the numerical integration of the equations of motion. The orbital mechanics calculation contains perturbations due to non-sphericity (up to a 40 X 40 field) of the planet, lunar and solar effects, and drag and solar radiation pressure. An 8th order Runge-Kutta integration scheme with variable step size control is used for efficient propagation. The input includes the classical osculating elements, orbital elements of the sun relative to the planet, reference time and dates, drag coefficient, gravitational constants, and planet radius, rotation rate, etc. The printed output contains Cartesian coordinates, velocity, equinoctial elements, and classical elements for each time step or event step. At each step, selected output is added to a plot file. The ASAP package includes a program for sorting this plot file. LOTUS 1-2-3 is used in the supplied examples to graph the results, but any graphics software package could be used to process the plot file. ASAP is not written to be mission-specific. Instead, it is intended to be used for most planetary orbiting missions. As a consequence, the user has to have some basic understanding of orbital mechanics to provide the correct input and interpret the subsequent output. ASAP is written in FORTRAN 77 for batch execution and has been implemented on an IBM PC compatible computer operating under MS-DOS. The ASAP package requires a math coprocessor and a minimum of 256K RAM. This program was last updated in 1988 with version 2.03. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Lotus and 1-2-3 are registered trademarks of Lotus Development Corporation.
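
    ASAP itself is FORTRAN and models many perturbations; as a hedged, stripped-down illustration of the Cowell-style approach it uses (direct numerical integration of the equations of motion with a Runge-Kutta scheme), the sketch below propagates a two-body orbit with SciPy's adaptive integrator rather than ASAP's 8th-order method.

      # Hedged, stripped-down Cowell-style propagation (two-body only; ASAP adds
      # non-spherical gravity, third-body, drag, and radiation-pressure perturbations).
      import numpy as np
      from scipy.integrate import solve_ivp

      MU_EARTH = 398600.4418          # km^3/s^2

      def two_body(t, y):
          r = y[:3]
          a = -MU_EARTH * r / np.linalg.norm(r) ** 3
          return np.concatenate([y[3:], a])

      # Roughly geostationary-altitude circular orbit (illustrative initial state)
      r0 = np.array([42164.0, 0.0, 0.0])
      v0 = np.array([0.0, np.sqrt(MU_EARTH / 42164.0), 0.0])
      sol = solve_ivp(two_body, (0.0, 86400.0), np.concatenate([r0, v0]),
                      rtol=1e-9, atol=1e-9)
      print(sol.y[:3, -1])            # position after one day, km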

  17. psygenet2r: a R/Bioconductor package for the analysis of psychiatric disease genes.

    PubMed

    Gutiérrez-Sacristán, Alba; Hernández-Ferrer, Carles; González, Juan R; Furlong, Laura I

    2017-12-15

    Psychiatric disorders have a great impact on morbidity and mortality. Genotype-phenotype resources for psychiatric diseases are key to enabling the translation of research findings into better care of patients. PsyGeNET is a knowledge resource on psychiatric diseases and their genes, developed by text mining and curated by domain experts. We present psygenet2r, an R package that contains a variety of functions for leveraging the PsyGeNET database and facilitating its analysis and interpretation. The package offers different types of queries to the database along with a variety of analysis and visualization tools, including the study of the anatomical structures in which the genes are expressed and gaining insight into genes' molecular functions. Psygenet2r is especially suited for network medicine analysis of psychiatric disorders. The package is implemented in R and is available under the MIT license from Bioconductor (http://bioconductor.org/packages/release/bioc/html/psygenet2r.html). Contact: juanr.gonzalez@isglobal.org or laura.furlong@upf.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  18. Fragman: an R package for fragment analysis.

    PubMed

    Covarrubias-Pazaran, Giovanny; Diaz-Garcia, Luis; Schlautman, Brandon; Salazar, Walter; Zalapa, Juan

    2016-04-21

    Determination of microsatellite lengths or other DNA fragment types is an important initial component of many genetic studies such as mutation detection, linkage and quantitative trait loci (QTL) mapping, genetic diversity, pedigree analysis, and detection of heterozygosity. A handful of commercial and freely available software programs exist for fragment analysis; however, most of them are platform dependent and lack high-throughput applicability. We present the R package Fragman to serve as a freely available and platform independent resource for automatic scoring of DNA fragment lengths in diversity panels and biparental populations. The program analyzes DNA fragment lengths generated on Applied Biosystems® (ABI) instruments, either manually or automatically, by providing panels or bins. The package contains additional tools for converting the allele calls to GenAlEx, JoinMap® and OneMap software formats mainly used for genetic diversity and generating linkage maps in plant and animal populations. Easy plotting functions and multiplexing-friendly capabilities are some of the strengths of this R package. Fragment analysis using a unique set of cranberry (Vaccinium macrocarpon) genotypes based on microsatellite markers is used to highlight the capabilities of Fragman. Fragman is a valuable new tool for genetic analysis. The package produces equivalent results to other popular software for fragment analysis while possessing unique advantages and the possibility of automation for high-throughput experiments by exploiting the power of R.

  19. TESSIM: a simulator for the Athena-X-IFU

    NASA Astrophysics Data System (ADS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.

    2016-07-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  20. TESSIM: A Simulator for the Athena-X-IFU

    NASA Technical Reports Server (NTRS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.; hide

    2016-01-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  1. Lenstronomy: Multi-purpose gravitational lens modeling software package

    NASA Astrophysics Data System (ADS)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.

  2. Thyra Abstract Interface Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A.

    2005-09-01

    Thyra primarily defines a set of abstract C++ class interfaces needed for the development of abstract numerical algorithms (ANAs) such as iterative linear solvers and transient solvers, all the way up to optimization. At the foundation of these interfaces are abstract C++ classes for vectors, vector spaces, linear operators and multi-vectors. Also included in the Thyra package is C++ code for creating concrete vector, vector space, linear operator, and multi-vector subclasses as well as other utilities to aid in the development of ANAs. Currently, very general and efficient concrete subclass implementations exist for serial and SPMD in-core vectors and multi-vectors. Code also currently exists for testing objects and providing composite objects such as product vectors.

  3. Microelectromechanical systems(MEMS): Launching Research Concepts into the Marketplace

    NASA Astrophysics Data System (ADS)

    Arney, Susanne

    1999-04-01

    More than a decade following the demonstration of the first spinning micromotors and microgears, the field of microelectromechanical systems (MEMS) has burgeoned on a worldwide basis. Integrated circuit design, fabrication, and packaging techniques have provided the foundation for the growth of an increasingly mature MEMS infrastructure which spans numerous topics of research as well as industrial application. The remarkable proliferation of MEMS concepts into such contrasting arenas of application as automotive sensors, biology, optical and wireless telecommunications, displays, printing, and physics experiments will be described. Challenges to commercialization of research prototypes will be discussed with emphasis on the development of design, fabrication, packaging, reliability and standards which fundamentally enable the application of MEMS to a highly diversified marketplace.

  4. XFEL OSCILLATOR SIMULATION INCLUDING ANGLE-DEPENDENT CRYSTAL REFLECTIVITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William; Lindberg, Ryan; Kim, K-J

    The oscillator package within the GINGER FEL simulation code has now been extended to include angle-dependent reflectivity properties of Bragg crystals. Previously, the package was modified to include frequency-dependent reflectivity in order to model x-ray FEL oscillators from start-up from shot noise through to saturation. We present a summary of the algorithms used for modeling the crystal reflectivity and radiation propagation outside the undulator, discussing various numerical issues relevant to the domain of high Fresnel number and efficient Hankel transforms. We give some sample XFEL-O simulation results obtained with the angle-dependent reflectivity model, with particular attention directed to the longitudinal and transverse coherence of the radiation output.

  5. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANN, F.M.

    Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.

  6. INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT

    EPA Science Inventory

    SAS is an EPA standard data and statistical analysis software package, while ORACLE is EPA's standard database management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....

  7. Neural Network Prototyping Package Within IRAF

    NASA Technical Reports Server (NTRS)

    Bazell, David

    1997-01-01

    The purpose of this contract was to develop a neural network package within the IRAF environment to allow users to easily understand and use different neural network algorithms for the analysis of astronomical data. The package was developed for use within IRAF to allow portability to different computing environments and to provide a familiar and easy-to-use interface to the routines. In addition to developing the software and supporting documentation, we planned to use the system for the analysis of several sample problems to prove its viability and usefulness.

  8. An Interactive Computer Aided Design and Analysis Package.

    DTIC Science & Technology

    1986-03-01

    AD-A167 114: An Interactive Computer Aided Design and Analysis Package. Thesis by Terrence L. Ewald, Naval Postgraduate School, Monterey, California, March 1986 (unclassified).

  9. The LARSYS educational package: Instructor's notes. [instructional materials for training people to analyze remotely sensed data

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Davis, S. M.

    1974-01-01

    Materials are presented for assisting instructors in teaching the LARSYS Educational Package, which is a set of instructional materials to train people to analyze remotely sensed multispectral data. The seven units of the package are described. These units are: quantitative remote sensing, overview of the LARSYS software system, the 2780 remote terminal, demonstration of LARSYS on the 2780 remote terminal, exercises, guide to multispectral data analysis, and a case study using LARSYS for analysis of LANDSAT data.

  10. Long duration exposure facility solar illumination data package

    NASA Technical Reports Server (NTRS)

    Berrios, William M.; Sampair, Thomas

    1990-01-01

    A post-flight solar illumination data package was created by the LDEF thermal analysis data group in support of the LDEF science office data group. The data presented were prepared with the Thermal Radiation Analysis System (TRASYS) program. Ground tracking data were used to calculate daily orbital beta angles for the calculation of resultant fluxes. This data package will be useful in the calculation of solar illumination fluence for a variety of beta-angle orbital conditions encountered during the LDEF mission.

  11. Nonlinear micromechanics-based finite element analysis of the interfacial behaviour of FRP-strengthened reinforced concrete beams

    NASA Astrophysics Data System (ADS)

    Abd El Baky, Hussien

    This research work is devoted to theoretical and numerical studies on the flexural behaviour of FRP-strengthened concrete beams. The objectives of this research are to extend and generalize the results of simple experiments, to recommend new design guidelines based on accurate numerical tools, and to enhance our comprehension of the bond performance of such beams. These numerical tools can be exploited to bridge the existing gaps in the development of analysis and modelling approaches that can predict the behaviour of FRP-strengthened concrete beams. The research effort here begins with the formulation of a concrete model and development of FRP/concrete interface constitutive laws, followed by finite element simulations for beams strengthened in flexure. Finally, a statistical analysis is carried out taking advantage of the aforesaid numerical tools to propose design guidelines. In this dissertation, an alternative incremental formulation of the M4 microplane model is proposed to overcome the computational complexities associated with the original formulation. Through a number of numerical applications, this incremental formulation is shown to be equivalent to the original M4 model. To assess the computational efficiency of the incremental formulation, the "arc-length" numerical technique is also considered and implemented in the original Bazant et al. [2000] M4 formulation. Finally, the M4 microplane concrete model is coded in FORTRAN and implemented as a user-defined subroutine into the commercial software package ADINA, Version 8.4. Then this subroutine is used with the finite element package to analyze various applications involving FRP strengthening. In the first application, a nonlinear micromechanics-based finite element analysis is performed to investigate the interfacial behaviour of FRP/concrete joints subjected to direct shear loadings. The intention of this part is to develop a reliable bond-slip model for the FRP/concrete interface. The bond-slip relation is developed considering the interaction between the interfacial normal and shear stress components along the bonded length. A new approach is proposed to describe the entire tau-s relationship based on three separate models. The first model captures the shear response of an orthotropic FRP laminate. The second model simulates the shear characteristics of an adhesive layer, while the third model represents the shear nonlinearity of a thin layer inside the concrete, referred to as the interfacial layer. The proposed bond-slip model reflects the geometrical and material characteristics of the FRP, concrete, and adhesive layers. Two-dimensional and three-dimensional nonlinear displacement-controlled finite element (FE) models are then developed to investigate the flexural and FRP/concrete interfacial responses of FRP-strengthened reinforced concrete beams. The three-dimensional finite element model is created to accommodate cases of beams having FRP anchorage systems. Discrete interface elements are proposed and used to simulate the FRP/concrete interfacial behaviour before and after cracking. The FE models are capable of simulating the various failure modes, including debonding of the FRP either at the plate end or at intermediate cracks. Particular attention is focused on the effect of crack initiation and propagation on the interfacial behaviour.
This study leads to an accurate and refined interpretation of the plate-end and intermediate crack debonding failure mechanisms for FRP-strengthened beams with and without FRP anchorage systems. Finally, the FE models are used to conduct a parametric study to generalize the findings of the FE analysis. The variables under investigation include two material characteristics; namely, the concrete compressive strength and axial stiffness of the FRP laminates as well as three geometric properties; namely, the steel reinforcement ratio, the beam span length and the beam depth. The parametric study is followed by a statistical analysis for 43 strengthened beams involving the five aforementioned variables. The response surface methodology (RSM) technique is employed to optimize the accuracy of the statistical models while minimizing the numbers of finite element runs. In particular, a face-centred design (FCD) is applied to evaluate the influence of the critical variables on the debonding load and debonding strain limits in the FRP laminates. Based on these statistical models, a nonlinear statistical regression analysis is used to propose design guidelines for the FRP flexural strengthening of reinforced concrete beams. (Abstract shortened by UMI.)
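
    The abstract does not reproduce the run matrix, but the arithmetic is consistent with a standard face-centred central composite design: for the five variables listed above, 2^5 factorial corners, 2x5 axial (face) points, and one centre point give exactly 43 runs. The Python sketch below generates the coded levels of such a design; it is a generic illustration of the face-centred design idea, not the authors' actual design, variable ranges, or software.
    ```python
    from itertools import product

    import numpy as np

    def face_centred_design(k):
        """Coded (-1, 0, +1) run matrix of a face-centred central composite design."""
        corners = np.array(list(product([-1.0, 1.0], repeat=k)))  # 2^k factorial points
        axial = np.vstack([s * np.eye(k)[i] for i in range(k) for s in (-1.0, 1.0)])  # 2k face points
        centre = np.zeros((1, k))  # single centre point
        return np.vstack([corners, axial, centre])

    design = face_centred_design(5)
    print(design.shape)  # (43, 5): consistent with the 43 strengthened beams analysed above
    ```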

  12. Numerical Approach to Modeling and Characterization of Refractive Index Changes for a Long-Period Fiber Grating Fabricated by Femtosecond Laser

    PubMed Central

    Saad, Akram; Cho, Yonghyun; Ahmed, Farid; Jun, Martin Byung-Guk

    2016-01-01

    A 3D finite element model constructed to predict the intensity-dependent refractive index profile induced by femtosecond laser radiation is presented. A fiber core irradiated by a pulsed laser is modeled as a cylinder subject to predefined boundary conditions using the COMSOL 5.2 Multiphysics commercial package. The numerically obtained refractive index change is used to numerically design and experimentally fabricate a long-period fiber grating (LPFG) in a pure-silica-core single-mode fiber employing identical laser conditions. To reduce the high computational requirements, the beam envelope method is utilized in the aforementioned numerical models. The number of periods, grating length, and grating period considered in this work are numerically quantified. The numerically obtained spectral growth of the modeled LPFG seems to be consistent with the transmission spectrum of the experimentally fabricated LPFG. The sensing capabilities of the modeled LPFG are tested by varying the refractive index of the surrounding medium. The numerically obtained spectrum corresponding to the varied refractive index shows good agreement with the experimental findings. PMID:28774060
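
    For orientation, the attenuation bands of an LPFG appear where the textbook phase-matching condition lambda_res = (n_core - n_clad_m) * period is satisfied, with n_core and n_clad_m the effective indices of the core mode and the m-th cladding mode. The short Python sketch below evaluates this standard relation with purely illustrative values; the indices, period, and index-change profile used in the paper are not reproduced here.
    ```python
    # Textbook LPFG phase-matching condition (not taken from the paper):
    #   lambda_res = (n_eff_core - n_eff_cladding_mode) * grating_period
    n_eff_core = 1.4460   # illustrative effective index of the core mode (assumed)
    n_eff_clad = 1.4425   # illustrative effective index of one cladding mode (assumed)
    period_um = 400.0     # illustrative grating period in micrometres (assumed)

    resonance_um = (n_eff_core - n_eff_clad) * period_um
    print(f"Resonant wavelength ~ {resonance_um:.2f} um")  # ~1.40 um for these numbers
    ```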

  13. Numerical Approach to Modeling and Characterization of Refractive Index Changes for a Long-Period Fiber Grating Fabricated by Femtosecond Laser.

    PubMed

    Saad, Akram; Cho, Yonghyun; Ahmed, Farid; Jun, Martin Byung-Guk

    2016-11-21

    A 3D finite element model constructed to predict the intensity-dependent refractive index profile induced by femtosecond laser radiation is presented. A fiber core irradiated by a pulsed laser is modeled as a cylinder subject to predefined boundary conditions using the COMSOL 5.2 Multiphysics commercial package. The numerically obtained refractive index change is used to numerically design and experimentally fabricate a long-period fiber grating (LPFG) in a pure-silica-core single-mode fiber employing identical laser conditions. To reduce the high computational requirements, the beam envelope method is utilized in the aforementioned numerical models. The number of periods, grating length, and grating period considered in this work are numerically quantified. The numerically obtained spectral growth of the modeled LPFG seems to be consistent with the transmission spectrum of the experimentally fabricated LPFG. The sensing capabilities of the modeled LPFG are tested by varying the refractive index of the surrounding medium. The numerically obtained spectrum corresponding to the varied refractive index shows good agreement with the experimental findings.

  14. Clustering of Variables for Mixed Data

    NASA Astrophysics Data System (ADS)

    Saracco, J.; Chavent, M.

    2016-05-01

    This chapter presents clustering of variables, whose aim is to lump together strongly related variables. The proposed approach works on a mixed data set, i.e. a data set which contains both numerical and categorical variables. Two algorithms for clustering variables are described: a hierarchical clustering and a k-means type clustering. A brief description of the PCAmix method (a principal component analysis for mixed data) is provided, since the calculation of the synthetic variables summarizing the obtained clusters of variables is based on this multivariate method. Finally, the R packages ClustOfVar and PCAmixdata are illustrated on real mixed data. The PCAmix and ClustOfVar approaches are first used for dimension reduction (step 1) before a standard clustering method is applied in step 2 to obtain groups of individuals.
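
    As a rough illustration of the idea (grouping variables rather than observations), the Python sketch below clusters purely numerical variables hierarchically using a 1 - |correlation| dissimilarity. It is only a simplified analogue: the ClustOfVar/PCAmix machinery, which also handles categorical variables through correlation ratios and builds synthetic variables via PCAmix, is not reproduced.
    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 6))
    x[:, 1] = x[:, 0] + 0.1 * rng.normal(size=200)  # variables 0 and 1 strongly related
    x[:, 4] = x[:, 3] + 0.1 * rng.normal(size=200)  # variables 3 and 4 strongly related

    dissimilarity = 1.0 - np.abs(np.corrcoef(x, rowvar=False))  # 1 - |r| between variables
    tree = linkage(squareform(dissimilarity, checks=False), method="average")
    print(fcluster(tree, t=3, criterion="maxclust"))  # cluster label for each of the 6 variables
    ```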

  15. Understanding Stellar Light Spatial Inhomogeneities and Time Variability

    NASA Technical Reports Server (NTRS)

    Uitenbroek, Han; Sasselov, Dimitar D.

    2000-01-01

    We would like to take this opportunity to thank NASA for supporting our efforts to construct tools to analyze the spectra of spatially inhomogeneous and temporally varying stellar atmospheres. This financial support has allowed us to develop a versatile radiative transfer code that can be used for many different applications. To accompany this numerical code, we have written a point-and-click analysis package in IDL that can be used to look extensively at the generated output data. Below we describe the most recent results obtained with our transfer code and list papers that have appeared with these results. Although we have not been able to produce as many time-dependent calculations as we had hoped (mainly because of programmatic reasons; Sasselov took another position halfway through the grant), we believe we have

  16. Modelling, simulation and computer-aided design (CAD) of gyrotrons for novel applications in the high-power terahertz science and technologies

    NASA Astrophysics Data System (ADS)

    Sabchevski, S.; Idehara, T.; Damyanova, M.; Zhelyazkov, I.; Balabanova, E.; Vasileva, E.

    2018-03-01

    Gyrotrons are the most powerful sources of CW coherent radiation in the sub-THz and THz frequency bands. In recent years, they have demonstrated a remarkable potential for bridging the so-called THz gap in the electromagnetic spectrum and have opened the way to many novel applications of terahertz waves. Among them are various advanced spectroscopic techniques (e.g., ESR and DNP-NMR), plasma physics and fusion research, materials processing and characterization, imaging and inspection, new medical technologies and biological studies. In this paper, we review briefly the current status of the research in this broad field and present our problem-oriented software packages developed recently for numerical analysis, computer-aided design (CAD) and optimization of gyrotrons.

  17. Numerical study of the influence of rheological parameters on stratified flow characteristics in cable dies

    NASA Astrophysics Data System (ADS)

    Kozitsyna, M. V.; Trufanova, N. M.

    2017-01-01

    Today, coextrusion is the most advanced process in the production of cables with cross-linked polyethylene insulation composed of two or more polymeric layers. Since the covering technology applies all necessary layers simultaneously (two semiconducting shields, one over the conductor and one over the insulation, together with the insulation itself), the main focus of this study is the analysis of how significantly various factors influence the characteristics of the stratified flows. This paper considers the flow of two abnormally viscous (non-Newtonian) liquids in the cable head. The problem is solved in a three-dimensional formulation by applying the finite element method in the Ansys software package. The influence is estimated by varying the rheological properties of the materials used to form layers of the required thickness.

  18. OPTIMASS: A package for the minimization of kinematic mass functions with constraints

    DOE PAGES

    Cho, Won Sang; Gainer, James S.; Kim, Doojin; ...

    2016-01-07

    Reconstructed mass variables, such as M2, M2C, M*T, and MT2W, play an essential role in searches for new physics at hadron colliders. The calculation of these variables generally involves constrained minimization in a large parameter space, which is numerically challenging. We provide a C++ code, Optimass, which interfaces with the Minuit library to perform this constrained minimization using the Augmented Lagrangian Method. The code can be applied to arbitrarily general event topologies, thus allowing the user to significantly extend the existing set of kinematic variables. Here, we describe this code, explain its physics motivation, and demonstrate its use in the analysis of the fully leptonic decay of pair-produced top quarks using M2 variables.
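
    The augmented Lagrangian idea itself is compact: each equality constraint g(x) = 0 is folded into the objective as lam*g(x) + (mu/2)*g(x)^2, the resulting unconstrained subproblem is minimized, and the multiplier is updated as lam <- lam + mu*g(x). The Python sketch below applies this recipe to a toy two-variable problem; it is not Optimass itself and uses neither Minuit nor any collider kinematics.
    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy problem: minimize f(x, y) = (x - 2)^2 + (y - 1)^2 subject to g(x, y) = x + y - 1 = 0.
    f = lambda p: (p[0] - 2.0) ** 2 + (p[1] - 1.0) ** 2
    g = lambda p: p[0] + p[1] - 1.0

    lam, mu = 0.0, 10.0      # multiplier estimate and penalty weight
    x = np.zeros(2)          # starting point
    for _ in range(20):
        aug = lambda p: f(p) + lam * g(p) + 0.5 * mu * g(p) ** 2  # augmented Lagrangian
        x = minimize(aug, x, method="BFGS").x                     # unconstrained subproblem
        lam += mu * g(x)                                          # multiplier update
    print(x)                 # approaches the constrained optimum (1, 0)
    ```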

  19. An attempt to make a reliable assessment of the wet steam flow field in the de Laval nozzle

    NASA Astrophysics Data System (ADS)

    Dykas, Sławomir; Majkut, Mirosław; Smołka, Krystian; Strozik, Michał

    2018-02-01

    This paper presents the results of research on wet steam flow with spontaneous condensation in the de Laval nozzle. A comparison is made between the numerical modelling results obtained, for two cases of boundary conditions, with an in-house CFD code and with the Ansys CFX commercial package. The numerical modelling results are compared to the results of experimental testing carried out in an in-house laboratory steam tunnel. The differences between the numerical results produced by the two codes in terms of the location and intensity of steam condensation point to the difficulty of correctly modelling this type of flow and emphasize the need for further studies in this field.

  20. PLATSIM: An efficient linear simulation and analysis package for large-order flexible systems

    NASA Technical Reports Server (NTRS)

    Maghami, Periman; Kenny, Sean P.; Giesy, Daniel P.

    1995-01-01

    PLATSIM is a software package designed to provide efficient time and frequency domain analysis of large-order generic space platforms implemented with any linear time-invariant control system. Time domain analysis provides simulations of the overall spacecraft response levels due to either onboard or external disturbances. The time domain results can then be processed by the jitter analysis module to assess the spacecraft's pointing performance in a computationally efficient manner. The resulting jitter analysis algorithms have produced an increase in speed of several orders of magnitude over the brute force approach of sweeping minima and maxima. Frequency domain analysis produces frequency response functions for uncontrolled and controlled platform configurations. The latter represents an enabling technology for large-order flexible systems. PLATSIM uses a sparse matrix formulation for the spacecraft dynamics model which makes both the time and frequency domain operations quite efficient, particularly when a large number of modes are required to capture the true dynamics of the spacecraft. The package is written in MATLAB script language. A graphical user interface (GUI) is included in the PLATSIM software package. This GUI uses MATLAB's Handle graphics to provide a convenient way for setting simulation and analysis parameters.
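
    At each frequency, a frequency response function of a linear time-invariant model amounts to one linear solve, H(jw) = C (jwI - A)^(-1) B plus any direct feedthrough, and exploiting sparsity in A is what keeps this tractable for large-order models. The Python sketch below shows that pattern with scipy.sparse on a synthetic tridiagonal state matrix; it illustrates the sparse-solve idea only and is unrelated to PLATSIM's MATLAB implementation or its jitter analysis.
    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 1000                                      # number of states (illustrative)
    main = -np.linspace(0.1, 10.0, n)             # stable diagonal dynamics
    off = 0.05 * np.ones(n - 1)                   # weak coupling between neighbouring states
    A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")
    B = np.zeros(n)
    B[0] = 1.0                                    # a single input drives the first state
    C = np.zeros(n)
    C[0], C[n // 2] = 1.0, 0.5                    # the output mixes two states

    freqs = np.logspace(-2, 2, 50)                # rad/s
    I = sp.identity(n, format="csc")
    H = np.array([C @ spla.spsolve(1j * w * I - A, B) for w in freqs])
    print(np.abs(H)[:3])                          # FRF magnitude at the first few frequencies
    ```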

  1. An Examination of the Effects of a Video-Based Training Package on Professional Staff's Implementation of a Brief Functional Analysis and Data Analysis

    ERIC Educational Resources Information Center

    Fleming, Courtney V.

    2011-01-01

    Minimal research has investigated training packages used to teach professional staff how to implement functional analysis procedures and to interpret data gathered during functional analysis. The current investigation used video-based training with role-play and feedback to teach six professionals in a clinical setting to implement procedures of a…

  2. Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.

    ERIC Educational Resources Information Center

    Al-Salam, Nabeel; And Others

    The third of three volumes evaluating the first year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects detailed. The…

  3. GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, W.N.

    1998-03-01

    This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.

  4. GDCRNATools: an R/Bioconductor package for integrative analysis of lncRNA, miRNA, and mRNA data in GDC.

    PubMed

    Li, Ruidong; Qu, Han; Wang, Shibo; Wei, Julong; Zhang, Le; Ma, Renyuan; Lu, Jianming; Zhu, Jianguo; Zhong, Wei-De; Jia, Zhenyu

    2018-03-02

    The large-scale multidimensional omics data in the Genomic Data Commons (GDC) provides opportunities to investigate the crosstalk among different RNA species and their regulatory mechanisms in cancers. Easy-to-use bioinformatics pipelines are needed to facilitate such studies. We have developed a user-friendly R/Bioconductor package, named GDCRNATools, for downloading, organizing, and analyzing RNA data in GDC with an emphasis on deciphering the lncRNA-mRNA related competing endogenous RNAs (ceRNAs) regulatory network in cancers. Many widely used bioinformatics tools and databases are utilized in our package. Users can easily pack preferred downstream analysis pipelines or integrate their own pipelines into the workflow. Interactive shiny web apps built in GDCRNATools greatly improve visualization of results from the analysis. GDCRNATools is an R/Bioconductor package that is freely available at Bioconductor (http://bioconductor.org/packages/devel/bioc/html/GDCRNATools.html). Detailed instructions, manual and example code are also available in Github (https://github.com/Jialab-UCR/GDCRNATools). arthur.jia@ucr.edu or zhongwd2009@live.cn or doctorzhujianguo@163.com.

  5. Impact of policy changes on infant feeding decisions among low-income women participating in the Special Supplemental Nutrition Program for Women, Infants, and Children.

    PubMed

    Whaley, Shannon E; Koleilat, Maria; Whaley, Mike; Gomez, Judy; Meehan, Karen; Saluja, Kiran

    2012-12-01

    We present infant feeding data before and after the 2009 Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) food package change that supported and incentivized breastfeeding. We describe the key role of California WIC staff in supporting these policy changes. We analyzed WIC data on more than 180,000 infants in Southern California. We employed the analysis of variance and Tukey (honestly significant difference) tests to compare issuance rates of postpartum and infant food packages before and after the changes. We used analysis of covariance to adjust for poverty status changes as a potential confounder. Issuance rates of the "fully breastfeeding" package at infant WIC enrollment increased by 86% with the package changes. Rates also increased significantly for 2- and 6-month-old infants. Issuance rates of packages that included formula decreased significantly. All outcomes remained highly significant in the adjusted model. Policy changes, training of front-line WIC staff, and participant education influenced issuance rates of WIC food packages. In California, the issuance rates of packages that include formula have significantly decreased and the rate for those that include no formula has significantly increased.

  6. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    PubMed

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
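
    The bookkeeping behind the signal-extraction step is simple to state: average the pixels inside each ROI mask for every frame, then normalize against a baseline. The numpy sketch below does exactly that on synthetic data; it is a generic illustration of the task, not SIMA's API or its motion-correction and segmentation algorithms.
    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    movie = rng.poisson(lam=20, size=(500, 64, 64)).astype(float)  # frames x rows x cols (synthetic)

    roi = np.zeros((64, 64), dtype=bool)
    roi[20:30, 40:48] = True                  # a rectangular region of interest

    trace = movie[:, roi].mean(axis=1)        # mean fluorescence inside the ROI, per frame
    baseline = np.percentile(trace, 10)       # crude baseline estimate
    dff = (trace - baseline) / baseline       # delta-F/F signal
    print(dff[:5])
    ```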

  7. Space shuttle/food system. Volume 2, Appendix C: Food cooling techniques analysis. Appendix D: Package and stowage: Alternate concepts analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The relative penalties associated with various techniques for providing an onboard cold environment for storage of perishable food items, and for the development of packaging and vehicle stowage parameters were investigated in terms of the overall food system design analysis of space shuttle. The degrees of capability for maintaining both a 40 F to 45 F refrigerated temperature and a 0 F and 20 F frozen environment were assessed for the following cooling techniques: (1) phase change (heat sink) concept; (2) thermoelectric concept; (3) vapor cycle concept; and (4) expendable ammonia concept. The parameters considered in the analysis were weight, volume, and spacecraft power restrictions. Data were also produced for packaging and vehicle stowage parameters which are compatible with vehicle weight and volume specifications. Certain assumptions were made for food packaging sizes based on previously generated space shuttle menus. The results of the study are shown, along with the range of meal choices considered.

  8. MODFLOW-2000, the U.S. Geological Survey modular ground-water model : user guide to the LMT6 package, the linkage with MT3DMS for multi-species mass transport modeling

    USGS Publications Warehouse

    Zheng, Chunmiao; Hill, Mary Catherine; Hsieh, Paul A.

    2001-01-01

    MODFLOW-2000, the newest version of MODFLOW, is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium using a finite-difference method. MT3DMS, the successor to MT3D, is a computer program for modeling multi-species solute transport in three-dimensional ground-water systems using multiple solution techniques, including the finite-difference method, the method of characteristics (MOC), and the total-variation-diminishing (TVD) method. This report documents a new version of the Link-MT3DMS Package, which enables MODFLOW-2000 to produce the information needed by MT3DMS, and also discusses new visualization software for MT3DMS. Unlike the Link-MT3D Packages that coordinated previous versions of MODFLOW and MT3D, the new Link-MT3DMS Package requires an input file that, among other things, provides enhanced support for additional MODFLOW sink/source packages and allows list-directed (free) format for the flow-transport link file produced by the flow model. The report contains four parts: (a) documentation of the Link-MT3DMS Package Version 6 for MODFLOW-2000; (b) discussion of several issues related to simulation setup and input data preparation for running MT3DMS with MODFLOW-2000; (c) description of two test example problems, with comparison to results obtained using another MODFLOW-based transport program; and (d) overview of post-simulation visualization and animation using the U.S. Geological Survey's Model Viewer.

  9. Migration and sensory properties of plastics-based nets used as food-contacting materials under ambient and high temperature heating conditions.

    PubMed

    Kontominas, M G; Goulas, A E; Badeka, A V; Nerantzaki, A

    2006-06-01

    Overall migration from a wide range of commercial plastics-based netting materials destined to be used as either meat or vegetable packaging materials into the fatty food simulant isooctane or the aqueous simulant distilled water, respectively, was studied. In addition, sensory tests of representative netting materials were carried out in bottled water in order to investigate possible development of off-odour/taste and discoloration in this food simulant as a result of migration from the netting material. Sensory tests were supplemented by determination of the volatile compounds' profile in table water exposed to the netting materials using SPME-GC/MS. Test conditions for packaging material/food simulant contact and method of overall migration analysis were according to European Union Directives 90/128 (EEC, 1990) and 2002/72 (EEC, 2002). The results showed that for both PET and polyethylene-based netting materials, overall migration values into distilled water ranged between 11.5 and 48.5 mg l(-1), well below the upper limit (60 mg l(-1)) for overall migration values from plastics-packaging materials set by the European Union. The overall migration values from netting materials into isooctane ranged between 38.0 and 624.0 mg l(-1), both below and above the European Union upper limit for migration. Sensory tests involving contact of representative samples with table water under refluxing (100 degrees C/4 h) conditions showed a number of the netting materials produced both off-odour and/or taste as well as discoloration of the food simulant rendering such materials unfit for the packaging of foodstuffs in applications involving heating at elevated temperatures. GC/MS analysis showed the presence of numerous volatile compounds being produced after netting materials/water contact under refluxing conditions. Although it is extremely difficult to establish a clear correlation between sensory off-odour development and GC/MS volatile compounds' profile, it may be postulated that plastics oxidation products such as hexanal, heptanal, octanal and 2,6 di-tert-butylquinone may contribute to off-odour development using commercially bottled table water as a food simulant. Likewise, compounds such as carbon disulfide, [1,1'-biphenyl]-2-ol and propanoic acid, 2 methyl 1-(1,1-dimethyl)-2-methyl-1,3-propanediyl ester probably originating from cotton and rubber components of netting materials may also contribute to off-odour/taste development.

  10. Electro-Microfluidic Packaging

    NASA Astrophysics Data System (ADS)

    Benavides, G. L.; Galambos, P. C.

    2002-06-01

    There are many examples of electro-microfluidic products that require cost effective packaging solutions. Industry has responded to a demand for products such as drop ejectors, chemical sensors, and biological sensors. Drop ejectors have consumer applications such as ink jet printing and scientific applications such as patterning self-assembled monolayers or ejecting picoliters of expensive analytes/reagents for chemical analysis. Drop ejectors can be used to perform chemical analysis, combinatorial chemistry, drug manufacture, drug discovery, drug delivery, and DNA sequencing. Chemical and biological micro-sensors can sniff the ambient environment for traces of dangerous materials such as explosives, toxins, or pathogens. Other biological sensors can be used to improve world health by providing timely diagnostics and applying corrective measures to the human body. Electro-microfluidic packaging can easily represent over fifty percent of the product cost and, as with Integrated Circuits (IC), the industry should evolve to standard packaging solutions. Standard packaging schemes will minimize cost and bring products to market sooner.

  11. Versatile Software Package For Near Real-Time Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.

    1998-01-01

    This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.

  12. Material flow analysis for an industry - A case study in packaging

    USGS Publications Warehouse

    Amey, E.B.; Sandgren, K.

    1996-01-01

    The basic materials used in packaging are glass, metals (primarily aluminum and steel), an ever-growing range of plastics, paper and paperboard, wood, textiles for bags, and miscellaneous other materials (such as glues, inks, and other supplies). They are fabricated into rigid, semi-rigid, or flexible containers. The most common forms of these containers include cans, drums, bottles, cartons, boxes, bags, pouches, and wraps. Packaging products are, for the most part, low cost, bulky products that are manufactured close to their customers. There is virtually no import or export of packaging products. A material flow analysis can be developed that looks at all inputs to an industrial sector, inventories the losses in processing, and tracks the fate of the material after its useful life. An example is presented that identifies the material inputs to the packaging industry, and addresses the ultimate fate of the materials used. ?? 1996 International Association for Mathematical Geology.

  13. Predicting visual attention to nutrition information on food products: the influence of motivation and ability.

    PubMed

    Turner, Monique Mitchell; Skubisz, Christine; Pandya, Sejal Patel; Silverman, Meryl; Austin, Lucinda L

    2014-09-01

    Obesity is linked to numerous diseases including heart disease, diabetes, and cancer. To address this issue, food and beverage manufacturers as well as health organizations have developed nutrition symbols and logos to be placed on the front of food packages to guide consumers to more healthful food choices. In 2010, the U.S. Food and Drug Administration requested information on the extent to which consumers notice, use, and understand front-of-package nutrition symbols. In response, this study used eye-tracking technology to explore the degree to which people pay visual attention to the information contained in food nutrition labels and front-of-package nutrition symbols. Results indicate that people with motivation to shop for healthful foods spent significantly more time looking at all available nutrition information compared to people with motivation to shop for products on the basis of taste. Implications of these results for message design, food labeling, and public policy are discussed.

  14. Manipulating Environmental Time Series in Python/Numpy: the Scikits.Timeseries Package and its Applications.

    NASA Astrophysics Data System (ADS)

    Gerard-Marchant, P. G.

    2008-12-01

    Numpy is a free, open-source C/Python interface designed for the fast and convenient manipulation of multidimensional numerical arrays. The base object, ndarray, can also be easily extended to define new objects meeting specific needs. Thanks to its simplicity, efficiency and modularity, numpy and its companion library Scipy have become increasingly popular in the scientific community over the last few years, with applications ranging from astronomy and engineering to finance and statistics. Its capacity to handle missing values is particularly appealing when analyzing environmental time series, where irregular data sampling might be an issue. After reviewing the main characteristics of numpy objects and the mechanism of subclassing, we will present the scikits.timeseries package, developed to manipulate single- and multi-variable arrays indexed in time. We will illustrate some typical applications of this package by introducing climpy, a set of extensions designed to help analyze the impacts of climate variability on environmental data such as precipitation or streamflow.
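
    The missing-value handling referred to here rests on numpy's masked arrays, which scikits.timeseries builds upon. The snippet below is a minimal illustration of that underlying mechanism only; the scikits.timeseries date-handling classes themselves are not shown.
    ```python
    import numpy as np
    import numpy.ma as ma

    precip = np.array([12.0, 0.0, -999.0, 3.5, -999.0, 7.2])  # -999 marks missing gauge readings
    series = ma.masked_values(precip, -999.0)                 # mask the missing-value code

    print(series.mean())       # statistics ignore the masked entries
    print(series.filled(0.0))  # or fill the gaps explicitly for downstream tools
    ```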

  15. ParallelStructure: A R Package to Distribute Parallel Runs of the Population Genetics Program STRUCTURE on Multi-Core Computers

    PubMed Central

    Besnier, Francois; Glover, Kevin A.

    2013-01-01

    This software package provides an R-based framework to make use of multi-core computers when running analyses in the population genetics program STRUCTURE. It is especially aimed at users of STRUCTURE who deal with numerous and repeated data analyses, and who could take advantage of an efficient script to automatically distribute STRUCTURE jobs among multiple processors. It also includes additional functions to divide analyses among combinations of populations within a single data set without the need to manually produce multiple projects, as is currently the case in STRUCTURE. The package consists of two main functions: MPI_structure() and parallel_structure(), as well as an example data file. We compared the computing-time performance for this example data set on two computer architectures and showed that the use of the present functions can result in several-fold improvements in terms of computation time. ParallelStructure is freely available at https://r-forge.r-project.org/projects/parallstructure/. PMID:23923012
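
    The underlying pattern, farming independent runs out to worker processes, is the same in any language. The Python sketch below shows it with the standard multiprocessing module; run_job is a hypothetical stand-in for launching a single STRUCTURE-style analysis, since the real STRUCTURE command line is not reproduced here.
    ```python
    from multiprocessing import Pool

    def run_job(k):
        """Hypothetical wrapper: perform one STRUCTURE-like run for a given K and report back."""
        # A real wrapper would launch the external program here (e.g., via subprocess);
        # the exact STRUCTURE invocation is not reproduced, so this stub only returns a label.
        return f"finished run with K={k}"

    if __name__ == "__main__":
        with Pool(processes=4) as pool:               # four worker processes
            results = pool.map(run_job, range(1, 9))  # eight independent runs, K = 1..8
        print(results)
    ```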

  16. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.

  17. VS2DI: Model use, calibration, and validation

    USGS Publications Warehouse

    Healy, Richard W.; Essaid, Hedeff I.

    2012-01-01

    VS2DI is a software package for simulating water, solute, and heat transport through soils or other porous media under conditions of variable saturation. The package contains a graphical preprocessor for constructing simulations, a postprocessor for displaying simulation results, and numerical models that solve for flow and solute transport (VS2DT) and flow and heat transport (VS2DH). Flow is described by the Richards equation, and solute and heat transport are described by advection-dispersion equations; the finite-difference method is used to solve these equations. Problems can be simulated in one, two, or three (assuming radial symmetry) dimensions. This article provides an overview of calibration techniques that have been used with VS2DI; included is a detailed description of calibration procedures used in simulating the interaction between groundwater and a stream fed by drainage from agricultural fields in central Indiana. Brief descriptions of VS2DI and the various types of problems that have been addressed with the software package are also presented.

  18. Antioxidant films based on cross-linked methyl cellulose and native Chilean berry for food packaging applications.

    PubMed

    López de Dicastillo, Carol; Rodríguez, Francisco; Guarda, Abel; Galotto, Maria José

    2016-01-20

    The development of antioxidant and antimicrobial active food packaging materials based on biodegradable polymers and natural plant extracts has numerous advantages, such as the reduction of synthetic additives in food, the reduction of plastic waste, and the protection of food against microorganisms and oxidation reactions. In this way, active films based on methylcellulose (MC) and maqui (Aristotelia chilensis) berry fruit extract, as a source of antioxidant agents, were studied. On the other hand, due to the high water affinity of MC, this polymer was first cross-linked with glutaraldehyde (GA) at different concentrations. The results showed that the addition of GA decreased the water solubility, swelling, and water vapor permeability of the MC films, and that the release of antioxidant substances from the active materials increased with the concentration of GA. The natural extract and the active cross-linked films were characterized in order to obtain the optimal formulation with the highest antioxidant activity and the best physical properties for a later active food packaging application. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. A Systematic Review and Meta-Analysis of Fecal Contamination and Inadequate Treatment of Packaged Water

    PubMed Central

    Williams, Ashley R.; Bain, Robert E. S.; Fisher, Michael B.; Cronk, Ryan; Kelly, Emma R.; Bartram, Jamie

    2015-01-01

    Background: Packaged water products provide an increasingly important source of water for consumption. However, recent studies raise concerns over their safety. Objectives: To assess the microbial safety of packaged water, examine differences between regions, country incomes, packaged water types, and compare packaged water with other water sources. Methods: We performed a systematic review and meta-analysis. Articles published in English, French, Portuguese, Spanish and Turkish, with no date restrictions were identified from online databases and two previous reviews. Studies published before April 2014 that assessed packaged water for the presence of Escherichia coli, thermotolerant or total coliforms were included provided they tested at least ten samples or brands. Results: A total of 170 studies were included in the review. The majority of studies did not detect fecal indicator bacteria in packaged water (78/141). Compared to packaged water from upper-middle and high-income countries, packaged water from low and lower-middle-income countries was 4.6 (95% CI: 2.6–8.1) and 13.6 (95% CI: 6.9–26.7) times more likely to contain fecal indicator bacteria and total coliforms, respectively. Compared to all other packaged water types, water from small bottles was less likely to be contaminated with fecal indicator bacteria (OR = 0.32, 95%CI: 0.17–0.58) and total coliforms (OR = 0.10, 95%CI: 0.05, 0.22). Packaged water was less likely to contain fecal indicator bacteria (OR = 0.35, 95%CI: 0.20, 0.62) compared to other water sources used for consumption. Conclusions: Policymakers and regulators should recognize the potential benefits of packaged water in providing safer water for consumption at and away from home, especially for those who are otherwise unlikely to gain access to a reliable, safe water supply in the near future. To improve the quality of packaged water products they should be integrated into regulatory and monitoring frameworks. PMID:26505745

  20. A Systematic Review and Meta-Analysis of Fecal Contamination and Inadequate Treatment of Packaged Water.

    PubMed

    Williams, Ashley R; Bain, Robert E S; Fisher, Michael B; Cronk, Ryan; Kelly, Emma R; Bartram, Jamie

    2015-01-01

    Packaged water products provide an increasingly important source of water for consumption. However, recent studies raise concerns over their safety. To assess the microbial safety of packaged water, examine differences between regions, country incomes, packaged water types, and compare packaged water with other water sources. We performed a systematic review and meta-analysis. Articles published in English, French, Portuguese, Spanish and Turkish, with no date restrictions were identified from online databases and two previous reviews. Studies published before April 2014 that assessed packaged water for the presence of Escherichia coli, thermotolerant or total coliforms were included provided they tested at least ten samples or brands. A total of 170 studies were included in the review. The majority of studies did not detect fecal indicator bacteria in packaged water (78/141). Compared to packaged water from upper-middle and high-income countries, packaged water from low and lower-middle-income countries was 4.6 (95% CI: 2.6-8.1) and 13.6 (95% CI: 6.9-26.7) times more likely to contain fecal indicator bacteria and total coliforms, respectively. Compared to all other packaged water types, water from small bottles was less likely to be contaminated with fecal indicator bacteria (OR = 0.32, 95%CI: 0.17-0.58) and total coliforms (OR = 0.10, 95%CI: 0.05, 0.22). Packaged water was less likely to contain fecal indicator bacteria (OR = 0.35, 95%CI: 0.20, 0.62) compared to other water sources used for consumption. Policymakers and regulators should recognize the potential benefits of packaged water in providing safer water for consumption at and away from home, especially for those who are otherwise unlikely to gain access to a reliable, safe water supply in the near future. To improve the quality of packaged water products they should be integrated into regulatory and monitoring frameworks.

  1. Description of the NCAR Community Climate Model (CCM3). Technical note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiehl, J.T.; Hack, J.J.; Bonan, G.B.

    This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components, and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).

  2. Numerical modeling of interaction of the aircraft engine with concrete protective structures

    NASA Astrophysics Data System (ADS)

    Radchenko, P. A.; Batuev, S. P.; Radchenko, A. V.; Plevkov, V. S.

    2018-01-01

    The paper presents the results of numerical modeling of the interaction of a Boeing 747 aircraft engine with the protective shell of a nuclear power station. The protective shell is represented as a reinforced concrete structure with a complex reinforcement scheme. The engine is simulated by a cylindrical projectile made of a titanium alloy. The impact velocity is 180 m/s. The simulation is three-dimensional and is solved by the finite element method using the author's own software package EFES. Fracture and fragmentation of the materials are considered in the calculations. The software package is assessed for use in the calculation of problems involving multiple contacts.

  3. An improved cylindrical FDTD method and its application to field-tissue interaction study in MRI.

    PubMed

    Chi, Jieru; Liu, Feng; Xia, Ling; Shao, Tingting; Mason, David G; Crozier, Stuart

    2010-01-01

    This paper presents a three-dimensional finite-difference time-domain (FDTD) scheme in cylindrical coordinates with an improved algorithm for accommodating the numerical singularity associated with the polar axis. The regularization of this singularity problem is entirely based on Ampere's law. The proposed algorithm has been detailed and verified against a problem with a known solution obtained from a commercial electromagnetic simulation package. The numerical scheme is also illustrated by modeling high-frequency RF field-human body interactions in MRI. The results demonstrate the accuracy and capability of the proposed algorithm.
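
    The cylindrical-axis treatment is specific to the paper, but the leapfrog structure common to all FDTD schemes is easy to see in one dimension. The sketch below is a minimal 1D Cartesian Yee update in normalized units; it is an illustration of the FDTD idea only, not the authors' cylindrical algorithm or its Ampere's-law axis correction.
    ```python
    import numpy as np

    nz, nt = 400, 600
    dz = 1.0
    dt = 0.5 * dz                  # Courant-stable step in normalized units (c = eps0 = mu0 = 1)
    ez = np.zeros(nz)              # electric field samples
    hy = np.zeros(nz - 1)          # magnetic field samples, staggered half a cell from ez

    for n in range(nt):
        hy += dt / dz * (ez[1:] - ez[:-1])         # update H from the spatial difference of E
        ez[1:-1] += dt / dz * (hy[1:] - hy[:-1])   # update E from the spatial difference of H
        ez[50] += np.exp(-((n - 60) / 15.0) ** 2)  # soft Gaussian source at one grid point

    print(float(ez.max()))         # a pulse has been launched and propagates along the grid
    ```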

  4. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open-source software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform called R. R is a software platform based on the S-language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls, or developed in object-oriented mode. R comes with a base set of routines, and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. On CRAN (Comprehensive R Archive Network, http://www.r-project.org/) the currently available packages related to seismic analysis are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  5. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.

    PubMed

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt

    2017-01-24

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solutions which accommodated both high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at PrimerSuite website (www.primer-suite.com).

  6. Modeling Hybridization Kinetics of Gene Probes in a DNA Biochip Using FEMLAB

    PubMed Central

    Munir, Ahsan; Waseem, Hassan; Williams, Maggie R.; Stedtfeld, Robert D.; Gulari, Erdogan; Tiedje, James M.; Hashsham, Syed A.

    2017-01-01

    Microfluidic DNA biochips capable of detecting specific DNA sequences are useful in medical diagnostics, drug discovery, food safety monitoring and agriculture. They are used as miniaturized platforms for analysis of nucleic acids-based biomarkers. Binding kinetics between immobilized single stranded DNA on the surface and its complementary strand present in the sample are of interest. To achieve optimal sensitivity with minimum sample size and rapid hybridization, the ability to predict the kinetics of hybridization based on the thermodynamic characteristics of the probe is crucial. In this study, a computer aided numerical model for the design and optimization of a flow-through biochip was developed using a packaged finite element software tool (FEMLAB, included in COMSOL Multiphysics) to simulate the transport of DNA through a microfluidic chamber to the reaction surface. The model accounts for fluid flow, convection and diffusion in the channel and on the reaction surface. Concentration, association rate constant, dissociation rate constant, recirculation flow rate, and temperature were key parameters affecting the rate of hybridization. The model predicted the kinetic profile and signal intensities of eighteen 20-mer probes targeting vancomycin resistance genes (VRGs). Predicted signal intensities and hybridization kinetics strongly correlated with experimental data in the biochip (R2 = 0.8131). PMID:28555058
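
    When transport to the surface is not limiting, the surface reaction in such models reduces to Langmuir-type kinetics, dB/dt = ka*C*(Bmax - B) - kd*B, which already shows how the association and dissociation rate constants and the target concentration shape the hybridization curve. The Python sketch below integrates that reduced equation with illustrative rate constants; the full convection-diffusion transport model solved in FEMLAB/COMSOL is not reproduced.
    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    k_a, k_d = 1.0e5, 1.0e-3  # association (1/(M*s)) and dissociation (1/s) rates, illustrative
    C = 1.0e-9                # bulk target concentration (M), assumed well mixed
    B_max = 1.0               # probe surface density, normalized

    def kinetics(t, B):
        return k_a * C * (B_max - B) - k_d * B  # Langmuir adsorption/desorption balance

    sol = solve_ivp(kinetics, (0.0, 3600.0), [0.0], t_eval=np.linspace(0.0, 3600.0, 7))
    print(sol.y[0])           # bound fraction approaching k_a*C / (k_a*C + k_d) ~ 0.09
    ```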

  7. Modeling Hybridization Kinetics of Gene Probes in a DNA Biochip Using FEMLAB.

    PubMed

    Munir, Ahsan; Waseem, Hassan; Williams, Maggie R; Stedtfeld, Robert D; Gulari, Erdogan; Tiedje, James M; Hashsham, Syed A

    2017-05-29

    Microfluidic DNA biochips capable of detecting specific DNA sequences are useful in medical diagnostics, drug discovery, food safety monitoring and agriculture. They are used as miniaturized platforms for analysis of nucleic acids-based biomarkers. Binding kinetics between immobilized single stranded DNA on the surface and its complementary strand present in the sample are of interest. To achieve optimal sensitivity with minimum sample size and rapid hybridization, the ability to predict the kinetics of hybridization based on the thermodynamic characteristics of the probe is crucial. In this study, a computer aided numerical model for the design and optimization of a flow-through biochip was developed using a packaged finite element software tool (FEMLAB, included in COMSOL Multiphysics) to simulate the transport of DNA through a microfluidic chamber to the reaction surface. The model accounts for fluid flow, convection and diffusion in the channel and on the reaction surface. Concentration, association rate constant, dissociation rate constant, recirculation flow rate, and temperature were key parameters affecting the rate of hybridization. The model predicted the kinetic profile and signal intensities of eighteen 20-mer probes targeting vancomycin resistance genes (VRGs). Predicted signal intensities and hybridization kinetics strongly correlated with experimental data in the biochip (R² = 0.8131).

  8. Scilab software package for the study of dynamical systems

    NASA Astrophysics Data System (ADS)

    Bordeianu, C. C.; Beşliu, C.; Jipa, Al.; Felea, D.; Grossu, I. V.

    2008-05-01

    This work presents a new software package for the study of chaotic flows and maps. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamical systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability for users to insert their own ODEs.
    Program summary
    Program title: Chaos
    Catalogue identifier: AEAP_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 885
    No. of bytes in distributed program, including test data, etc.: 5925
    Distribution format: tar.gz
    Programming language: Scilab 3.1.1
    Computer: PC-compatible running Scilab on MS Windows or Linux
    Operating system: Windows XP, Linux
    RAM: below 100 Megabytes
    Classification: 6.2
    Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODEs).
    Solution method: Numerical solving of ordinary differential equations. The chaotic behavior of the nonlinear dynamical system is analyzed using Poincaré sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies.
    Restrictions: The package routines are normally able to handle ODE systems of high orders (up to order twelve and possibly higher), depending on the nature of the problem.
    Running time: 10 to 20 seconds for problems that do not involve Lyapunov exponent calculation; 60 to 1000 seconds for problems that involve high-order ODEs and Lyapunov exponent calculation.
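
    Of the diagnostics listed, the Lyapunov exponent is the easiest to reproduce in a few lines. The Python sketch below estimates it for the logistic map by averaging the log of the map derivative along an orbit; it mirrors the kind of calculation the package performs but is not a translation of its Scilab code, which targets ODE systems.
    ```python
    import numpy as np

    def lyapunov_logistic(r, n_iter=100_000, n_transient=1_000):
        """Largest Lyapunov exponent of x -> r*x*(1-x): average of log|map derivative|."""
        x = 0.4
        for _ in range(n_transient):   # discard the transient
            x = r * x * (1.0 - x)
        total = 0.0
        for _ in range(n_iter):
            x = r * x * (1.0 - x)
            total += np.log(abs(r * (1.0 - 2.0 * x)))
        return total / n_iter

    print(lyapunov_logistic(3.2))   # negative: periodic regime
    print(lyapunov_logistic(4.0))   # close to ln 2 > 0: chaotic regime
    ```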

  9. An R package for the design, analysis and operation of reservoir systems

    NASA Astrophysics Data System (ADS)

    Turner, Sean; Ng, Jia Yi; Galelli, Stefano

    2016-04-01

    We present a new R package - named "reservoir" - which has been designed for rapid and easy routing of runoff through storage. The package comprises well-established tools for capacity design (e.g., the sequent peak algorithm), performance analysis (storage-yield-reliability and reliability-resilience-vulnerability analysis) and release policy optimization (Stochastic Dynamic Programming). Operating rules can be optimized for water supply, flood control and amenity objectives, as well as for maximum hydropower production. Storage-depth-area relationships are in-built, allowing users to incorporate evaporation from the reservoir surface. We demonstrate the capabilities of the software for global studies using thousands of reservoirs from the Global Reservoir and Dam (GRanD) database fed by historical monthly inflow time series from a 0.5 degree gridded global runoff dataset. The package is freely available through the Comprehensive R Archive Network (CRAN).
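
    The sequent peak calculation mentioned above is short enough to sketch directly: track the running storage deficit against a constant draft and take its maximum as the required active storage. The Python version below uses made-up inflows and a single pass over the record; the package's own implementation, its cyclic variants, and its other tools are not reproduced.
    ```python
    import numpy as np

    inflow = np.array([4.0, 6.0, 1.0, 2.0, 7.0, 9.0, 3.0, 1.0, 2.0, 8.0])  # volume per period (made up)
    demand = 4.0                                                           # constant draft per period

    deficit, capacity = 0.0, 0.0
    for q in inflow:
        deficit = max(0.0, deficit + demand - q)  # cumulative shortfall, reset when storage refills
        capacity = max(capacity, deficit)         # largest shortfall = required active storage

    print(capacity)  # required reservoir capacity for these flows, in the same volume units
    ```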

  10. Probabilistic Analyses of Waste Package Quantities Impacted by Potential Igneous Disruption at Yucca Mountain

    NASA Astrophysics Data System (ADS)

    Wallace, M. G.; Iuzzolina, H.

    2005-12-01

    A probabilistic analysis was conducted to estimate ranges for the numbers of waste packages that could be damaged in a potential future igneous event through a repository at Yucca Mountain. The analysis includes disruption from an intrusive igneous event and from an extrusive volcanic event. This analysis supports the evaluation of the potential consequences of future igneous activity as part of the total system performance assessment for the license application for the Yucca Mountain Project (YMP). The first scenario, igneous intrusion, investigated the case where one or more igneous dikes intersect the repository. A swarm of dikes was characterized by distributions of length, width, azimuth, and number of dikes and the spacings between them. Through the use, in part, of a Latin hypercube simulator and a modified video game engine, mathematical relationships were built between those parameters and the number of waste packages hit. Corresponding cumulative distribution function curves (CDFs) for the number of waste packages hit under several different scenarios were calculated. Variations in dike thickness ranges, as well as in repository magma bulkhead positions, were examined through sensitivity studies. It was assumed that all waste packages in an emplacement drift would be impacted if that drift was intersected by a dike. Over 10,000 individual simulations were performed. Based on these calculations, out of a total of over 11,000 planned waste packages distributed over an area of approximately 5.5 km², the median number of waste packages impacted was roughly 1/10 of the total. Individual cases ranged from 0 waste packages to the entire inventory being impacted. The igneous intrusion analysis involved an explicit characterization of dike-drift intersections, built upon various distributions that reflect the uncertainties associated with the inputs. The second igneous scenario, volcanic eruption (eruptive conduits), considered the effects of conduits formed in association with a volcanic eruption through the repository. Mathematical relations were built between the resulting conduit areas and the fraction of the repository area occupied by waste packages. This relation was used in conjunction with a joint distribution incorporating variability in eruptive conduit diameters and in the number of eruptive conduits that could intersect the repository.
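
    Latin hypercube sampling itself is straightforward to illustrate: stratify each input's range, sample once per stratum, and shuffle the pairings. The sketch below does this for two of the dike parameters mentioned (length and azimuth) using scipy.stats.qmc, available in SciPy 1.7 and later; the ranges are hypothetical and none of the repository-intersection geometry is modelled.
    ```python
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=2, seed=0)  # two uncertain inputs: dike length and azimuth
    unit = sampler.random(n=1000)              # stratified samples on the unit square

    # Hypothetical ranges: dike length 0.5-6 km, azimuth 0-360 degrees.
    samples = qmc.scale(unit, l_bounds=[0.5, 0.0], u_bounds=[6.0, 360.0])
    print(samples[:3])
    ```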

  11. 16 CFR 500.7 - Net quantity of contents, method of expression.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    REGULATIONS UNDER SECTION 4 OF THE FAIR PACKAGING AND LABELING ACT, § 500.7 Net quantity of contents, method of expression. The net quantity of contents shall be expressed in terms of weight or mass, measure, numerical...

  12. Evaluation of rotor axial vibrations in a turbo pump unit equipped with an automatic unloading machine

    NASA Astrophysics Data System (ADS)

    Martsynkovskyy, V. A.; Deineka, A.; Kovalenko, V.

    2017-08-01

    The article examines the forced axial vibrations of a rotor equipped with an automatic unloading machine in an oxidizer pump. A feature of the design is the use of slotted throttles with mutually inverse throttling in the automatic unloading system. Their conductivity is determined through a numerical experiment in the ANSYS CFX software package.

  13. Numerical Sedimentation Study of Shoaling on the Ohio River near Mound City, Illinois

    DTIC Science & Technology

    2015-08-01

    from Lock and Dam 53 to just south of Cairo, IL. The water surface profile data on the Ohio River were collected using an Applanix POS_MV system...User Service (OPUS). The Applanix software package “POSPAC” was used to generate solution files by applying corrections from the base station data

  14. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  15. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  16. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  17. 16 CFR 503.4 - Net quantity of contents, numerical count.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... terms of count only, or in terms of count and weight, volume, area, or dimension, the regulations are... provide a net quantity statement to specify weight, volume, area, or dimensions when such are required. For example, the synthetic sponge which is packaged, requires dimensions such as “5 in. × 3 in. × 1 in...

  18. puma: a Bioconductor package for propagating uncertainty in microarray analysis.

    PubMed

    Pearson, Richard D; Liu, Xuejun; Sanguinetti, Guido; Milo, Marta; Lawrence, Neil D; Rattray, Magnus

    2009-07-09

    Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses, it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also been very costly to compute and, in the case of differential expression detection, have been limited in the experimental designs to which they can be applied. puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself and in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised, leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods. puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.

  19. Interfaces between statistical analysis packages and the ESRI geographic information system

    NASA Technical Reports Server (NTRS)

    Masuoka, E.

    1980-01-01

    Interfaces between ESRI's geographic information system (GIS) data files and real valued data files written to facilitate statistical analysis and display of spatially referenced multivariable data are described. An example of data analysis which utilized the GIS and the statistical analysis system is presented to illustrate the utility of combining the analytic capability of a statistical package with the data management and display features of the GIS.

  20. TYPE A FISSILE PACKAGING FOR AIR TRANSPORT PROJECT OVERVIEW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eberl, K.; Blanton, P.

    2013-10-11

    This paper presents the project status of the Model 9980, a new Type A fissile packaging for use in air transport. The Savannah River National Laboratory (SRNL) developed this new packaging to be a lightweight (<150 lb), drum-style package and prepared a Safety Analysis Report for Packaging (SARP) for submission to DOE/EM. The package design incorporates unique features and engineered materials specifically designed to minimize packaging weight and to comply with 10 CFR 71 requirements. Prototypes were fabricated and tested to evaluate the design when subjected to Normal Conditions of Transport (NCT) and Hypothetical Accident Conditions (HAC). An overview of the design details, results of the regulatory testing, and lessons learned from the prototype fabrication for the 9980 will be presented.

  1. Experimental and Numerical Study on the Strength of Aluminum Extrusion Welding.

    PubMed

    Bingöl, Sedat; Bozacı, Atilla

    2015-07-17

    The quality of extrusion welding in extruded hollow shapes is influenced significantly by the pressure and effective stress under which the material is joined inside the welding chamber. However, extrusion welding was not accounted for in the past by the developers of finite element software packages. In this study, the strength of a hollow extrusion profile with a seam weld produced at different ram speeds was investigated experimentally and numerically. The experiments were performed on an extruded hollow aluminum profile from which tensile test specimens could be obtained in the seam weld region, both parallel and perpendicular to the extrusion direction. A new numerical modeling approach, recently proposed in the literature, was used for the numerical analyses. The simulations performed at different ram speeds were compared with the experimental results, and good agreement was obtained.

  2. Advanced rotorcraft control using parameter optimization

    NASA Technical Reports Server (NTRS)

    Vansteenwyk, Brett; Ly, Uy-Loi

    1991-01-01

    A reliable algorithm for the evaluation of a quadratic performance index and its gradients with respect to the controller design parameters is presented. The algorithm is part of a design procedure for an optimal linear dynamic output feedback controller that minimizes a finite-time quadratic performance index. The numerical scheme is particularly robust when applied to control law synthesis for systems with densely packed modes, where there is a high likelihood of encountering degeneracies in the closed-loop eigensystem. Through the use of an accurate Padé series approximation, the approach does not require the closed-loop system matrix to be diagonalizable. The algorithm has been included in a control design package for optimal robust low-order controllers. The usefulness of the proposed numerical algorithm has been demonstrated on numerous practical design cases in which degeneracies occur frequently in the closed-loop system under arbitrary controller design initialization and during the numerical search.
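
    The paper's Padé-based evaluation scheme is not reproduced in the abstract; the sketch below only shows the quantity being evaluated, a finite-time quadratic index J = integral of x'Qx over [0, T] for a closed-loop system x' = A_cl x, computed here by plain numerical integration as a baseline. The matrices, weights and horizon are hypothetical placeholders.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical closed-loop matrix and weighting (placeholders, not from the paper).
        A_cl = np.array([[0.0, 1.0],
                         [-4.0, -0.4]])      # lightly damped closed-loop dynamics
        Q = np.eye(2)
        x0 = np.array([1.0, 0.0])
        T = 10.0

        def rhs(t, z):
            # Augmented state z = [x, J], with dJ/dt = x' Q x.
            x = z[:2]
            return np.concatenate([A_cl @ x, [x @ Q @ x]])

        sol = solve_ivp(rhs, (0.0, T), np.concatenate([x0, [0.0]]), rtol=1e-9, atol=1e-12)
        J = sol.y[2, -1]
        print(f"finite-time quadratic index J(T={T}) = {J:.6f}")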

  3. Real-time inverse kinematics for the upper limb: a model-based algorithm using segment orientations.

    PubMed

    Borbély, Bence J; Szolgay, Péter

    2017-01-17

    Model-based analysis of human upper limb movements has key importance in understanding the motor control processes of our nervous system. Various simulation software packages have been developed over the years to perform model-based analysis. These packages provide computationally intensive (and therefore off-line) solutions to calculate the anatomical joint angles from motion-captured raw measurement data (a process also referred to as inverse kinematics). In addition, recent developments in inertial motion sensing technology show that it may replace large, immobile and expensive optical systems with small, mobile and cheaper solutions in cases where a laboratory-free measurement setup is needed. The objective of the presented work is to extend the workflow of measurement and analysis of human arm movements with an algorithm that allows accurate and real-time estimation of anatomical joint angles for a widely used OpenSim upper limb kinematic model when inertial sensors are used for movement recording. The internal structure of the selected upper limb model is analyzed and used as the underlying platform for the development of the proposed algorithm. Based on this structure, a prototype marker set is constructed that facilitates the reconstruction of model-based joint angles using orientation data directly available from inertial measurement systems. The mathematical formulation of the reconstruction algorithm is presented along with the validation of the algorithm on various platforms, including embedded environments. Execution performance tables of the proposed algorithm show significant improvement on all tested platforms. Compared to OpenSim's Inverse Kinematics tool, a 50-15,000x speedup is achieved while maintaining numerical accuracy. The proposed algorithm is capable of real-time reconstruction of standardized anatomical joint angles even in embedded environments, establishing a new way for complex applications to take advantage of accurate and fast model-based inverse kinematics calculations.
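
    The model-specific algorithm and the OpenSim marker set are not given in the abstract; the sketch below only illustrates the basic building block it relies on: given two segment orientations from inertial sensors, form the relative rotation and decompose it into joint angles. The segment names, the ZXY sequence and the input values are illustrative choices, not the paper's conventions.

        import numpy as np
        from scipy.spatial.transform import Rotation as R

        def joint_angles(parent_R, child_R, sequence="ZXY"):
            """Relative orientation of a child segment with respect to its parent,
            decomposed into an Euler-angle sequence (degrees)."""
            relative = parent_R.inv() * child_R
            return relative.as_euler(sequence, degrees=True)

        # Hypothetical IMU-style inputs: orientations of upper arm and forearm in a common frame.
        upper_arm = R.from_euler("ZXY", [10.0, 5.0, 0.0], degrees=True)
        forearm   = R.from_euler("ZXY", [10.0, 5.0, 60.0], degrees=True)   # roughly 60 deg of "flexion"

        print("joint angles (deg):", joint_angles(upper_arm, forearm))

    A real-time implementation repeats this per sample and per joint, with sensor-to-segment calibration applied before the relative rotation is formed.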

  4. A Methodology for Flight-Time Identification of Helicopter-Slung Load Frequency Response Characteristics Using CIFER

    NASA Technical Reports Server (NTRS)

    Sahai, Ranjana; Pierce, Larry; Cicolani, Luigi; Tischler, Mark

    1998-01-01

    Helicopter slung load operations are common in both military and civil contexts. The slung load adds load rigid body modes, sling stretching, and load aerodynamics to the system dynamics, which can degrade system stability and handling qualities, and reduce the operating envelope of the combined system below that of the helicopter alone. Further, the effects of the load on system dynamics vary significantly among the large range of loads, slings, and flight conditions that a utility helicopter will encounter in its operating life. In this context, military helicopters and loads are often qualified for slung load operations via flight tests which can be time consuming and expensive. One way to reduce the cost and time required to carry out these tests and generate quantitative data more readily is to provide an efficient method for analysis during the flight, so that numerous test points can be evaluated in a single flight test, with evaluations performed in near real time following each test point and prior to clearing the aircraft to the next point. Methodology for this was implemented at Ames and demonstrated in slung load flight tests in 1997 and was improved for additional flight tests in 1999. The parameters of interest for the slung load tests are aircraft handling qualities parameters (bandwidth and phase delay), stability margins (gain and phase margin), and load pendulum roots (damping and natural frequency). A procedure for the identification of these parameters from frequency sweep data was defined using the CIFER software package. CIFER is a comprehensive interactive package of utilities for frequency domain analysis previously developed at Ames for aeronautical flight test applications. It has been widely used in the US on a variety of aircraft, including some primitive flight time analysis applications.
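
    CIFER's algorithms are proprietary to that package and are not described in the abstract; the sketch below only illustrates the kind of quantities involved, estimating a frequency response from input/output sweep data with Welch cross-spectra and reading off a crude gain-crossover phase margin. The chirp input and the second-order plant standing in for the aircraft response are made up for illustration.

        import numpy as np
        from scipy.signal import chirp, csd, welch, lsim, TransferFunction

        fs = 100.0
        t = np.arange(0, 120, 1 / fs)

        # Hypothetical frequency-sweep input and a second-order response in place of flight data.
        u = chirp(t, f0=0.05, f1=2.0, t1=t[-1], method="linear")
        plant = TransferFunction([25.0], [1.0, 3.0, 25.0])    # wn = 5 rad/s, zeta = 0.3
        _, y, _ = lsim(plant, u, t)

        # Frequency-response estimate H = Suy / Suu via Welch-averaged spectra.
        f, Suu = welch(u, fs=fs, nperseg=2048)
        _, Suy = csd(u, y, fs=fs, nperseg=2048)
        band = (f > 0.2) & (f < 3.0)                           # stay inside the swept band
        H, f = Suy[band] / Suu[band], f[band]

        mag_db = 20 * np.log10(np.abs(H))
        phase_deg = np.unwrap(np.angle(H)) * 180 / np.pi

        # Crude phase margin: phase at the first downward 0 dB crossing, if any.
        idx = np.where((mag_db[:-1] > 0) & (mag_db[1:] <= 0))[0]
        if idx.size:
            pm = 180.0 + phase_deg[idx[0]]
            print(f"gain crossover near {f[idx[0]]:.2f} Hz, phase margin ~ {pm:.1f} deg")
        else:
            print("no 0 dB crossing found in the analysed band")

    Bandwidth, phase delay and pendulum-mode damping would be read from the same estimated response in an analogous way.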

  5. Review of European microgravity measurements

    NASA Technical Reports Server (NTRS)

    Hamacher, Hans

    1994-01-01

    European efforts to characterize the microgravity (µg) environment within a space laboratory began in the late seventies with the design of the First Spacelab Mission SL-1. Its Material Science Double Rack was the first payload element to carry its own tri-axial acceleration package. Even though incapable of any frequency analysis, the data provided a wealth of novel information for optimal experiment and hardware design and operations for missions to come. Theoretical investigations under ESA contract demonstrated the significance of detailed knowledge of micro-g data for a thorough experiment analysis. They especially revealed the high sensitivity of numerous phenomena to low frequency acceleration. Accordingly, the payloads of the Spacelab missions D-1 and D-2 were furnished with state-of-the-art detection systems to ensure frequency analysis between 0.1 and 100 Hz. The Microgravity Measurement Assembly (MMA) of D-2 was a centralized system comprising fixed installed as well as mobile tri-axial packages with real-time data processing and transmission to ground. ESA's free flyer EURECA carried a system for continuous measurement over the entire mission. All EURECA subsystems and experimental facilities had to meet tough requirements defining the upper acceleration limits. In a French/Russian cooperation, CNES developed a microgravity detection system for analyzing the Mir space station micro-g environment for the first time. An approach to get access to low frequency acceleration between 0 and 0.02 Hz will be realized by QSAM (Quasi-steady Acceleration Measurement) on IML-2, complementary to the NASA Spacelab Acceleration Measurement System (SAMS). A second flight of QSAM is planned for the Russian free flyer FOTON.

  6. Numerical investigation of the thermal behavior of heated natural composite materials

    NASA Astrophysics Data System (ADS)

    Qasim, S. M.; Mohammed, F. Abbas; Hashim, R.

    2015-11-01

    In the present work, a numerical investigation was carried out of laminar natural convection heat transfer from a natural composite material (NCM). Three types of natural materials, seed dates, egg shells, and feathers, are mixed separately with polyester resin at different volume fractions (10%, 20%, and 30%) and heated with different heat fluxes (1078 W/m2, 928 W/m2, 750 W/m2, 608 W/m2, and 457 W/m2) in vertical, inclined, and horizontal positions. The continuity and Navier-Stokes equations are solved numerically in three dimensions using the commercial ANSYS FLUENT 12.1 software package. The numerical results showed that the temperature distribution was affected for all material types at a volume fraction of 30% and a heat flux of 1078 W/m2, for the different positions, and that the plumes and temperature behavior are influenced by the air and the distance from the heat source. The numerical results showed acceptable agreement with previous experimental results.

  7. Structure, Assembly, and DNA Packaging of the Bacteriophage T4 Head

    PubMed Central

    Black, Lindsay W.; Rao, Venigalla B.

    2014-01-01

    The bacteriophage T4 head is an elongated icosahedron packed with 172 kb of linear double-stranded DNA and numerous proteins. The capsid is built from three essential proteins: gp23*, which forms the hexagonal capsid lattice; gp24*, which forms pentamers at 11 of the 12 vertices; and gp20, which forms the unique dodecameric portal vertex through which DNA enters during packaging and exits during infection. Intensive work over more than half a century has led to a deep understanding of the phage T4 head. The atomic structure of gp24 has been determined. A structural model built for gp23 using its similarity to gp24 showed that the phage T4 major capsid protein has the same fold as numerous other icosahedral bacteriophages. However, phage T4 displays an unusual membrane and portal initiated assembly of a shape determining self-sufficient scaffolding core. Folding of gp23 requires the assistance of two chaperones, the Escherichia coli chaperone GroEL acting with the phage-coded gp23-specific cochaperone, gp31. The capsid also contains two nonessential outer capsid proteins, Hoc and Soc, which decorate the capsid surface. Through binding to adjacent gp23 subunits, Soc reinforces the capsid structure. Hoc and Soc have been used extensively in bipartite peptide display libraries and to display pathogen antigens, including those from human immunodeficiency virus (HIV), Neisseria meningitides, Bacillus anthracis, and foot and mouth disease virus. The structure of Ip1*, one of a number of multiple (>100) copy proteins packed and injected with DNA from the full head, shows it to be an inhibitor of one specific restriction endonuclease specifically targeting glycosylated hydroxymethyl cytosine DNA. Extensive mutagenesis, combined with atomic structures of the DNA packaging/terminase proteins gp16 and gp17, elucidated the ATPase and nuclease functional motifs involved in DNA translocation and headful DNA cutting. The cryoelectron microscopy structure of the T4 packaging machine showed a pentameric motor assembled with gp17 subunits on the portal vertex. Single molecule optical tweezers and fluorescence studies showed that the T4 motor packages DNA at the highest rate known and can package multiple segments. Förster resonance energy transfer–fluorescence correlation spectroscopy studies indicate that DNA gets compressed in the stalled motor and that the terminase-to-portal distance changes during translocation. Current evidence suggests a linear two-component (large terminase plus portal) translocation motor in which electrostatic forces generated by ATP hydrolysis drive DNA translocation by alternating the motor between tensed and relaxed states. PMID:22420853

  8. Dynamic Modeling Using MCSim and R (SOT 2016 Biological Modeling Webinar Series)

    EPA Science Inventory

    MCSim is a stand-alone software package for simulating and analyzing dynamic models, with a focus on Bayesian analysis using Markov Chain Monte Carlo. While it is an extremely powerful package, it is somewhat inflexible, and offers only a limited range of analysis options, with n...

  9. Technical Review Report for the Mound 1KW Package Safety Analysis Report for Packaging Waiver for the Use of Modified Primary Containment Vessel (PCV)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, M; Hafner, R

    2008-05-05

    This Technical Review Report (TRR) documents the review, performed by the Lawrence Livermore National Laboratory (LLNL) staff, at the request of the U.S. Department of Energy (DOE), on the Waiver for the Use of Modified Primary Containment Vessels (PCV). The waiver is to be used to support a limited number of shipments of fuel for the Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) Project in support of the National Aeronautics and Space Administration's (NASA's) Mars Science Laboratory (MSL) mission. Under the waiver, an inventory of existing national security PCVs will be converted to standard PCVs. Both types of PCVs are currently approved for use by the Office of Nuclear Energy. LLNL has previously reviewed the national security PCVs under Mound 1KW Package Safety Analysis Report for Packaging, Addendum No. 1, Revision c, dated June 2007 (Addendum 1). The safety analysis of the package is documented in the Safety Analysis Report for Packaging (SARP) for the Mound 1KW Package (i.e., the Mound 1KW SARP, or the SARP), where the standard PCVs have been reviewed by LLNL. The Mound 1KW Package is certified by DOE Certificate of Compliance (CoC) number USA/9516/B(U)F-85 for the transportation of Type B quantities of plutonium heat source material. The waiver requests an exemption, claiming safety equivalent to the requirements specified in 10 CFR 71.12, Specific Exemptions, and will lead to a letter amendment to the CoC. Under the waiver, the Office of Radioisotope Power Systems, NE-34, is seeking an exemption from 10 CFR 71.19(d)(1), Previously Approved Package,[5] which states: '(d) NRC will approve modifications to the design and authorized contents of a Type B package, or a fissile material package, previously approved by NRC, provided--(1) The modifications of a Type B package are not significant with respect to the design, operating characteristics, or safe performance of the containment system, when the package is subjected to the tests specified in § 71.71 and 71.73.' The LLNL staff had previously reviewed a request from Idaho National Laboratory (INL) to reconfigure national security PCVs to standard PCVs. With a nominal 50% reduction in both the height and the volume, the LLNL staff initially deemed the modifications to be significant, which would not be allowed under the provisions of 10 CFR 71.19(d)(1)--see above. As a follow-up, the DOE requested additional clarification from the Nuclear Regulatory Commission (NRC). The NRC concluded that the reconfiguration would be a new fabrication, and that an exemption to the regulations would be required to allow its use, as per the requirements specified in 10 CFR 71.19(c)(1), Previously Approved Package: '(c) A Type B(U) package, a Type B(M) package, or a fissile material package previously approved by the NRC with the designation '-85' in the identification number of the NRC CoC, may be used under the general license of § 71.17 with the following additional conditions: (1) Fabrication of the package must be satisfactorily completed by December 31, 2006, as demonstrated by application of its model number in accordance with 71.85(c).' Although the preferred approach toward the resolution of this issue would be for the applicant to submit an updated SARP, the applicant has stated that the process of updating the Model Mound 1KW Package SARP is a work in progress, but that the updated SARP is not yet ready for submittal.
The applicant has to provide a submittal proving that the package meets the '-96' requirements of International Atomic Energy Agency (IAEA) Safety Standards Series No. TS-R-1 in order to fabricate approved packagings after December 31, 2006. The applicant has further stated that all other packaging features, as described in the currently approved Model Mound 1KW Package SARP, remain unchanged. This report documents the LLNL review of the waiver request. The specific review for each SARP chapter is documented.

  10. MeV+R: using MeV as a graphical user interface for Bioconductor applications in microarray analysis

    PubMed Central

    Chu, Vu T; Gottardo, Raphael; Raftery, Adrian E; Bumgarner, Roger E; Yeung, Ka Yee

    2008-01-01

    We present MeV+R, an integration of the Java MultiExperiment Viewer program with Bioconductor packages. This integration of MultiExperiment Viewer and R is easily extensible to other R packages and provides users with point-and-click access to traditionally command-line-driven tools written in R. We demonstrate the ability to use MultiExperiment Viewer as a graphical user interface for Bioconductor applications in microarray data analysis by incorporating three Bioconductor packages, RAMA, BRIDGE and iterativeBMA. PMID:18652698

  11. Electromagnetic interference reduction using electromagnetic bandgap structures in packages, enclosures, cavities, and antennas

    NASA Astrophysics Data System (ADS)

    Mohajer Iravani, Baharak

    Electromagnetic interference (EMI) is a source of noise problems in electronic devices. The EMI is attributed to coupling between sources of radiation and components placed in the same medium, such as a package or chassis. This coupling can occur either through conducting currents or through radiation. The radiation of electromagnetic (EM) fields is supported by surface currents. Thus, minimizing these surface currents is considered a major and critical step in suppressing EMI. In this work, we present novel strategies to confine surface currents in different applications including packages, enclosures, cavities, and antennas. The efficiency of present methods of EM noise suppression is limited by various drawbacks. For example, the traditional use of lossy materials and absorbers suffers from considerable disadvantages, including mechanical and thermal reliability leading to limited lifetime, cost, volume, and weight. In this work, we consider the use of Electromagnetic Band Gap (EBG) structures. These structures are suitable for suppressing surface currents within a frequency band denoted as the bandgap. Their design is straightforward, they are inexpensive to implement, and they do not suffer from the limitations of the previous methods. A new method of EM noise suppression in enclosures and cavity-backed antennas using mushroom-type EBG structures is introduced. The effectiveness of the EBG as an EMI suppressor is demonstrated using numerical simulations and experimental measurements. To allow integration of EBGs in printed circuit boards and packages, novel miniaturized simple planar EBG structures based on the use of high-k dielectric material (εr > 100) are proposed. The design consists of meander lines and patches. The inductive meander lines serve to provide current continuity bridges between the capacitive patches. The high-k dielectric material increases the effective capacitive load substantially in comparison to commonly used material with a much lower dielectric constant. Meander lines can increase the effective inductive load, which pushes down the lower edge of the bandgap, thus resulting in a wider bandgap. Simulation results are included to show that the proposed EBG structures provide a very wide bandgap (~10 GHz) covering multiple harmonics of currently available microprocessors. To speed up the design procedure, a model based on a combination of lumped elements and transmission lines is proposed. The derived model accurately predicts the starting edge of the bandgap. This result is verified with full-wave analysis. Finally, another novel compact wideband mushroom-type EBG structure using magneto-dielectric materials is designed. Numerical simulations show that the proposed EBG structure provides an in-phase reflection bandgap which is several times greater than the one obtained from a conventional EBG operating at the same frequency, while its cell size is smaller. This type of EBG structure can be used efficiently as a ground plane for low-profile wideband antennas.
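
    The lumped-element/transmission-line model mentioned in the abstract is not given there. As a minimal stand-in, the sketch below uses the common parallel-LC estimate f ≈ 1/(2π√(LC)) for a mushroom/patch EBG cell, showing qualitatively why increasing the meander-line inductance or the high-k patch capacitance pushes the band edge down. The geometry and inductance values are hypothetical.

        import numpy as np

        eps0 = 8.854e-12  # F/m

        def patch_capacitance(eps_r, patch_area_m2, gap_to_plane_m):
            """Parallel-plate estimate of the patch-to-ground capacitance."""
            return eps0 * eps_r * patch_area_m2 / gap_to_plane_m

        def band_edge_hz(L_h, C_f):
            """Parallel-LC resonance, often used as a first estimate of the EBG band edge."""
            return 1.0 / (2.0 * np.pi * np.sqrt(L_h * C_f))

        # Hypothetical 2 mm x 2 mm patch on a 0.2 mm high-k substrate (eps_r = 100).
        C = patch_capacitance(eps_r=100, patch_area_m2=(2e-3) ** 2, gap_to_plane_m=0.2e-3)

        for L_nH in (0.5, 2.0, 8.0):   # plain via inductance vs. added meander-line inductance
            f = band_edge_hz(L_nH * 1e-9, C)
            print(f"L = {L_nH:4.1f} nH  ->  estimated band edge ~ {f / 1e9:.2f} GHz")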

  12. Drought: A comprehensive R package for drought monitoring, prediction and analysis

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang

    2015-04-01

    Drought may impose serious challenges on human societies and ecosystems. Because of its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades to monitor particular aspects of drought conditions, along with multivariate drought indices that characterize drought from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical to drought preparedness plans that aim to reduce potential drought impacts. In addition, drought analysis to quantify the risk associated with drought properties provides useful information for operational drought management. Drought monitoring, prediction and risk analysis are thus important components of drought modeling and assessment. In this study, a comprehensive R package, "drought", is developed to aid drought monitoring, prediction and risk analysis (available from R-Forge and CRAN soon). The drought monitoring component of the package computes a suite of univariate and multivariate drought indices that integrate drought information from various sources such as precipitation, temperature, soil moisture, and runoff. The drought prediction/forecasting component provides statistical drought predictions to enhance drought early warning for decision making. Analysis of drought properties such as duration and severity is also provided in the package for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment by researchers and practitioners.
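
    The package's own index calculations are not given in the abstract; the Python sketch below only illustrates the general recipe behind one widely used univariate indicator of the standardized-precipitation type: accumulate precipitation over a time scale, fit a distribution, and map probabilities to standard-normal quantiles. It deliberately skips month-by-month climatologies and zero-precipitation handling, and the synthetic data are placeholders.

        import numpy as np
        from scipy import stats

        def spi(precip, scale=3):
            """Simplified Standardized-Precipitation-type index:
            accumulate precipitation over `scale` months, fit a gamma distribution
            to the accumulations, and map their probabilities to standard-normal quantiles."""
            p = np.convolve(precip, np.ones(scale), mode="valid")   # running k-month totals
            shape, loc, sc = stats.gamma.fit(p, floc=0)             # gamma fit with loc fixed at 0
            cdf = stats.gamma.cdf(p, shape, loc=loc, scale=sc)
            return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))     # standardized index

        rng = np.random.default_rng(0)
        monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=240)  # 20 years of synthetic data, mm

        index = spi(monthly_precip, scale=3)
        print("driest 3-month period, index =", round(float(index.min()), 2))
        print("fraction of months below -1:", round(float((index < -1).mean()), 2))

    Duration and severity statistics for risk analysis follow by tracking runs of the index below a chosen threshold.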

  13. Influence of the vertical mixing parameterization on the modeling results of the Arctic Ocean hydrology

    NASA Astrophysics Data System (ADS)

    Iakshina, D. F.; Golubeva, E. N.

    2017-11-01

    The vertical distribution of hydrological characteristics in the upper ocean layer is formed mostly under the influence of turbulent and convective mixing, processes that are not resolved by the system of equations for the large-scale ocean. It is therefore necessary to include additional parameterizations of these processes in numerical models. In this paper we carry out a comparative analysis of different vertical mixing parameterizations in simulations of the climatic variability of Arctic water and sea ice circulation. The 3D regional numerical model for the Arctic and North Atlantic developed at the ICMMG SB RAS (Institute of Computational Mathematics and Mathematical Geophysics of the Siberian Branch of the Russian Academy of Sciences) and the GOTM package (General Ocean Turbulence Model, http://www.gotm.net/) were used as the numerical tools. NCEP/NCAR reanalysis data were used to determine the surface fluxes related to ice and ocean. The following turbulence closure schemes were used for the vertical mixing parameterizations: 1) an integration scheme based on the Richardson criterion (RI); 2) a second-order TKE scheme with Canuto-A coefficients (CANUTO); 3) a first-order TKE scheme with the coefficients of Schumann and Gerz (TKE-1); 4) the KPP scheme (KPP). In addition, we investigated some important characteristics of the Arctic Ocean state, including the intensity of Atlantic water inflow, the ice cover state and the freshwater content in the Beaufort Sea.
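
    The closure constants used in the model runs are not given in the abstract; the sketch below only shows the general shape of a Richardson-number-dependent vertical viscosity/diffusivity in the spirit of the Pacanowski and Philander formulation, one common way to implement a scheme of type (1). The constants and the vertical profile are illustrative, not the study's configuration.

        import numpy as np

        def richardson_number(dudz, dvdz, N2):
            """Gradient Richardson number Ri = N^2 / (du/dz^2 + dv/dz^2)."""
            shear2 = dudz**2 + dvdz**2
            return N2 / np.maximum(shear2, 1e-12)

        def pp_mixing(Ri, nu0=5e-3, alpha=5.0, n=2, nu_b=1e-4, kappa_b=1e-5):
            """Richardson-number-dependent viscosity and diffusivity (illustrative constants)."""
            Ri = np.maximum(Ri, 0.0)                       # unstable cases not treated here
            nu = nu0 / (1.0 + alpha * Ri) ** n + nu_b      # vertical viscosity, m^2/s
            kappa = nu / (1.0 + alpha * Ri) + kappa_b      # vertical diffusivity, m^2/s
            return nu, kappa

        # Illustrative profile: strong shear and weak stratification near the surface.
        z = np.linspace(0, -200, 21)                       # depth, m
        dudz = 0.01 * np.exp(z / 50.0)                     # 1/s
        N2 = 1e-5 * (1.0 - np.exp(z / 30.0)) + 1e-7        # 1/s^2
        Ri = richardson_number(dudz, 0.0, N2)
        nu, kappa = pp_mixing(Ri)

        for depth, r, k in zip(z[::5], Ri[::5], kappa[::5]):
            print(f"z = {depth:6.1f} m   Ri = {r:8.3f}   kappa = {k:.2e} m^2/s")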

  14. A combined study of heat and mass transfer in an infant incubator with an overhead screen.

    PubMed

    Ginalski, Maciej K; Nowak, Andrzej J; Wrobel, Luiz C

    2007-06-01

    The main objective of this study is to investigate the major physical processes taking place inside an infant incubator, before and after modifications have been made to its interior chamber. The modification involves the addition of an overhead screen to decrease radiation heat losses from the infant placed inside the incubator. The present study investigates the effect of these modifications on the convective heat flux from the infant's body to the surrounding environment inside the incubator. A combined analysis of airflow and heat transfer due to conduction, convection, radiation and evaporation has been performed, in order to calculate the temperature and velocity fields inside the incubator before and after the design modification. Due to the geometrical complexity of the model, computer-aided design (CAD) applications were used to generate a computer-based model. All numerical calculations have been performed using the commercial computational fluid dynamics (CFD) package FLUENT, together with in-house routines used for managing purposes and user-defined functions (UDFs) which extend the basic solver capabilities. Numerical calculations have been performed for three different air inlet temperatures: 32, 34 and 36 degrees C. The study shows a decrease of the radiative and convective heat losses when the overhead screen is present. The results obtained were numerically verified as well as compared with results available in the literature from investigations of dry heat losses from infant manikins.
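
    The CFD model cannot be reconstructed from the abstract; the short sketch below only illustrates the order-of-magnitude bookkeeping behind the reported effect, comparing the radiative loss from the infant to a cooler canopy with the loss to a warmer overhead screen, alongside a convective term. All temperatures, areas and coefficients are hypothetical values chosen for illustration.

        # Order-of-magnitude estimate of an infant's dry heat losses (hypothetical values).
        SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

        A = 0.20             # effective radiating/convecting body area, m^2
        eps = 0.95           # skin emissivity
        h_c = 3.0            # convective coefficient, W/(m^2 K)

        T_skin = 36.0 + 273.15
        T_air = 34.0 + 273.15
        T_wall_cold = 28.0 + 273.15      # incubator canopy without overhead screen
        T_wall_screen = 33.0 + 273.15    # overhead screen closer to air temperature

        def dry_losses(T_surround):
            q_rad = eps * SIGMA * A * (T_skin**4 - T_surround**4)
            q_conv = h_c * A * (T_skin - T_air)
            return q_rad, q_conv

        for label, Tw in (("no screen", T_wall_cold), ("with screen", T_wall_screen)):
            q_rad, q_conv = dry_losses(Tw)
            print(f"{label:12s}: radiation {q_rad:4.1f} W, convection {q_conv:4.1f} W")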

  15. Emerging therapeutic delivery capabilities and challenges utilizing enzyme/protein packaged bacterial vesicles.

    PubMed

    Alves, Nathan J; Turner, Kendrick B; Medintz, Igor L; Walper, Scott A

    2015-07-01

    Nanoparticle-based therapeutics are poised to play a critical role in treating disease. These complex multifunctional drug delivery vehicles provide for the passive and active targeted delivery of numerous small-molecule, peptide and protein-derived pharmaceuticals. This article first discusses some of the current state-of-the-art nanoparticle classes (dendrimers, lipid-based, polymeric and inorganic), highlighting the benefits and drawbacks associated with their implementation. We then discuss an emerging class of nanoparticle therapeutics, bacterial outer membrane vesicles, that can provide many of the nanoparticle benefits while simplifying assembly. Through molecular biology techniques, outer membrane vesicle hijacking potentially allows for stringent control over nanoparticle production, enabling targeted, protein-packaged nanoparticles to be fully synthesized by bacteria.

  16. Comparison of PASCAL and FORTRAN for solving problems in the physical sciences

    NASA Technical Reports Server (NTRS)

    Watson, V. R.

    1981-01-01

    The paper compares PASCAL and FORTRAN for problem solving in the physical sciences, prompted by requests NASA has received to make PASCAL available on the Numerical Aerodynamic Simulator (scheduled to be operational in 1986). PASCAL's disadvantages include the lack of scientific utility procedures equivalent to the IBM scientific subroutine package or the IMSL package, which are available in FORTRAN. Its advantages include well-organized code that is easy to read and maintain, range checking to prevent errors, and a broad selection of data types. It is concluded that FORTRAN may be the better language, although ADA (patterned after PASCAL) may surpass FORTRAN due to its ability to add complex and vector math and to specify the precision and range of variables.

  17. Introduction to the Practice of Statistics David Moore Introduction to the Practice of Statistics and George McCabe WH. Freeman 850 £39.99 071676282X 071676282X [Formula: see text].

    PubMed

    2005-10-01

    This is a very well-written and beautifully presented book. It is North American in origin and, while it will be invaluable for teachers of statistics to nurses and other healthcare professionals, it is probably not suitable for many pre- or post-registration students in health in the UK. The material is quite advanced and, while well illustrated and supplied with numerous examples for students, it takes a fairly mathematical approach in places. Nevertheless, the book has much to commend it, including a CD-ROM package containing tutorials, a statistical package, solutions to the exercises in the text and case studies.

  18. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    PubMed

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis (MFA-ILE) in isotope labeling experiments. The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be speeded up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when it is compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
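
    The MFA balance equations themselves are not given in the abstract, so the toy example below only contrasts the direct and adjoint ways of obtaining the gradient of a residual sum of squares when the model state solves a parameter-dependent linear system A(theta) x = b, which is the structure of stationary balance equations. For a scalar objective, the adjoint route needs a single extra linear solve instead of one per parameter. The small system, measurement operator and data are made up for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 6, 3                       # state size, number of "flux" parameters

        # Toy stationary balance: A(theta) x = b, with A depending linearly on theta.
        A0 = np.eye(n) * 2.0 + 0.1 * rng.standard_normal((n, n))
        dA = [0.05 * rng.standard_normal((n, n)) for _ in range(p)]   # dA/dtheta_i
        b = rng.standard_normal(n)
        C = rng.standard_normal((2, n))   # measurement operator
        y = rng.standard_normal(2)        # "measured labeling" data

        def assemble(theta):
            return A0 + sum(t * dAi for t, dAi in zip(theta, dA))

        def rss_and_grad_adjoint(theta):
            A = assemble(theta)
            x = np.linalg.solve(A, b)
            r = C @ x - y
            lam = np.linalg.solve(A.T, 2.0 * C.T @ r)                 # one adjoint solve
            grad = np.array([-(lam @ (dAi @ x)) for dAi in dA])       # dJ/dtheta_i = -lam' (dA_i) x
            return r @ r, grad

        theta = np.array([0.3, -0.2, 0.1])
        J, g_adj = rss_and_grad_adjoint(theta)

        # Finite-difference check of the adjoint gradient.
        eps = 1e-6
        g_fd = np.array([
            (rss_and_grad_adjoint(theta + eps * np.eye(p)[i])[0] - J) / eps
            for i in range(p)
        ])
        print("J =", round(float(J), 4))
        print("adjoint gradient  :", np.round(g_adj, 5))
        print("finite differences:", np.round(g_fd, 5))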

  19. Thermal management methods for compact high power LED arrays

    NASA Astrophysics Data System (ADS)

    Christensen, Adam; Ha, Minseok; Graham, Samuel

    2007-09-01

    The package- and system-level temperature distributions of a high power (>1 W) light emitting diode (LED) array have been investigated using numerical heat flow models. For this analysis, a thermal resistor network model was combined with a 3D finite element submodel of an LED structure to predict system- and die-level temperatures. The impact of LED array density, LED power density, and active versus passive cooling methods on device operation was calculated. To help understand the role of various thermal resistances in cooling such compact arrays, the thermal resistance network was analyzed to estimate the contributions from materials as well as from active and passive cooling schemes. Thermal stresses and residual stresses in the die are also calculated based on power dissipation and convection heat transfer coefficients. Results show that the thermal stress in the GaN layer is compressive, which can affect the band gap and performance of the LEDs.
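
    The paper's network topology and resistance values are not given in the abstract; the sketch below only shows the kind of bookkeeping a thermal resistor network performs, evaluating a one-dimensional junction-to-ambient chain and the effect of swapping the heat-sink resistance. All resistance values and labels are hypothetical.

        # Series thermal-resistance chain from LED junction to ambient (hypothetical values).
        resistances_K_per_W = {
            "junction-to-slug": 8.0,
            "die-attach": 2.5,
            "slug-to-board (MCPCB)": 3.0,
            "board-to-heat-sink (TIM)": 1.5,
            "heat-sink-to-ambient (passive)": 12.0,
        }

        def junction_temperature(power_w, t_ambient_c, chain):
            """Junction temperature for a series chain: Tj = Ta + P * sum(R_i)."""
            r_total = sum(chain.values())
            return t_ambient_c + power_w * r_total, r_total

        for p_w in (1.0, 3.0, 5.0):
            tj, r_tot = junction_temperature(p_w, t_ambient_c=25.0, chain=resistances_K_per_W)
            print(f"P = {p_w:.0f} W, R_total = {r_tot:.1f} K/W -> Tj = {tj:.1f} C")

        # Replacing the passive sink with a hypothetical fan-cooled one shows the lever arm
        # that active cooling provides in compact arrays.
        active = {**resistances_K_per_W, "heat-sink-to-ambient (passive)": 3.0}
        tj, _ = junction_temperature(3.0, 25.0, active)
        print(f"with active cooling at 3 W: Tj = {tj:.1f} C")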

  20. Implementation of interconnect simulation tools in spice

    NASA Technical Reports Server (NTRS)

    Satsangi, H.; Schutt-Aine, J. E.

    1993-01-01

    Accurate computer simulation of high-speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical (lumped-parameter) circuit analysis algorithms are needed for the circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines, which requires electromagnetic field analysis. An approach to writing a multimode simulator is to take an existing software package that performs either lumped-parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work, a traditionally lumped-parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different modeling approach. Modifying SPICE3E2 or any other large software package is not a trivial task; an understanding of the programming conventions used, the simulation software, and the simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented, and the installations of the first two provide a foundation for installation of the lossy line, which is the third device. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
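
    The SPICE installation details are specific to the thesis and are not shown here; the sketch below only illustrates the modeling idea the abstract refers to, approximating a lossy transmission line by a ladder of lumped RLGC segments and time-stepping the resulting equations, which is the sort of behavior a lossy-line device model must reproduce. The per-unit-length parameters and terminations are illustrative, not taken from the thesis.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative per-unit-length parameters of a lossy line (not from the thesis).
        R_pul, L_pul = 5.0, 350e-9        # ohm/m, H/m
        G_pul, C_pul = 1e-4, 140e-12      # S/m, F/m
        length, n_seg = 0.3, 60           # 30 cm line, 60 lumped segments
        Rs, Rl, Vs = 50.0, 50.0, 1.0      # source/load resistance, step amplitude

        dx = length / n_seg
        Rseg, Lseg = R_pul * dx, L_pul * dx
        Gseg, Cseg = G_pul * dx, C_pul * dx

        def rhs(t, state):
            i = state[:n_seg]            # series inductor currents
            v = state[n_seg:]            # node (capacitor) voltages
            v_in = Vs - Rs * i[0]        # driving node behind the source resistance
            v_left = np.concatenate(([v_in], v[:-1]))
            didt = (v_left - v - Rseg * i) / Lseg
            i_out = np.concatenate((i[1:], [v[-1] / Rl]))   # last node feeds the load
            dvdt = (i - i_out - Gseg * v) / Cseg
            return np.concatenate((didt, dvdt))

        t_end = 10e-9
        sol = solve_ivp(rhs, (0, t_end), np.zeros(2 * n_seg), max_step=t_end / 4000)
        delay_est = np.sqrt(L_pul * C_pul) * length
        print(f"ideal one-way delay ~ {delay_est * 1e9:.2f} ns")
        print(f"load voltage at t = {t_end * 1e9:.0f} ns: {sol.y[-1, -1]:.3f} V")

    With matched 50-ohm terminations the load voltage settles near half the step amplitude, reduced slightly by the series loss, which is the qualitative check one would also apply to a SPICE lossy-line device.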
